where we introduced the logistic sigmoid:
:<math>\sigma(\vec{x}) = \frac{1}{1+e^{-\vec{x}}}.</math>
It's easy to check that the [[logistic loss]] (above) and the binary cross entropy are in fact the same (up to a multiplicative constant <math>1/\ln 2</math>).
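As a quick sketch of this check (assuming the label convention <math>y \in \{-1, 1\}</math> and a predictor <math>f(\vec{x})</math>, as in the logistic loss above), write the cross-entropy target as <math>p = (1+y)/2 \in \{0, 1\}</math> and the predicted probability as <math>q = \sigma(f(\vec{x}))</math>. Since <math>1 - \sigma(t) = \sigma(-t)</math>, the binary cross entropy (measured in bits) reduces in both cases to
:<math>-\log_2 \sigma\big(y f(\vec{x})\big) = \frac{1}{\ln 2} \ln\left(1 + e^{-y f(\vec{x})}\right),</math>
which is the logistic loss up to the factor <math>1/\ln 2</math>.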