Loss functions for classification: Difference between revisions

put in the logistic sigmoid to make the equivalence between "logistic loss" and "cross entropy" more explicit.
Drevicko (talk | contribs)
m Cross entropy loss (Log Loss): use e^{-x} to be consistent with other sections
Line 107:
where we introduced the logistic sigmoid:
 
:<math>\sigma(\vec{x}) = \frac{1}{1+e^{-f(\vec{x})}}</math>
 
It is easy to check that the [[logistic loss]] (above) and binary cross entropy are in fact the same loss, up to the multiplicative constant <math>1/\ln 2</math> (from the change of logarithm base).
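The equivalence can be verified numerically. The sketch below assumes the article's conventions: labels <math>y \in \{-1, +1\}</math>, the base-2 logistic loss <math>\log_2(1 + e^{-yf(\vec{x})})</math>, and binary cross entropy in nats with target <math>t = (1+y)/2</math>; the function names are illustrative, not from any library.

```python
import math

def sigmoid(z):
    # logistic sigmoid: 1 / (1 + e^{-z})
    return 1.0 / (1.0 + math.exp(-z))

def logistic_loss(f, y):
    # base-2 logistic loss for a score f and label y in {-1, +1}
    return math.log2(1.0 + math.exp(-y * f))

def cross_entropy(f, y):
    # binary cross entropy (in nats) with target t = (1+y)/2 in {0, 1}
    t = (1 + y) / 2
    p = sigmoid(f)
    return -(t * math.log(p) + (1 - t) * math.log(1.0 - p))

# cross entropy equals ln(2) times the base-2 logistic loss
for f in (-3.0, -0.5, 0.0, 1.3, 2.7):
    for y in (-1, 1):
        assert abs(cross_entropy(f, y) - math.log(2) * logistic_loss(f, y)) < 1e-12
```

For <math>y = +1</math> the cross entropy reduces to <math>-\ln\sigma(f) = \ln(1 + e^{-f})</math>, which is <math>\ln 2</math> times the base-2 logistic loss; the case <math>y = -1</math> is symmetric.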