Loss functions for classification

This function is undefined when <math>p(1\mid x)=1</math> or <math>p(1\mid x)=0</math> (tending toward ∞ and −∞ respectively), but it traces a smooth curve that grows as <math>p(1\mid x)</math> increases and equals 0 when <math>p(1\mid x)=0.5</math>.<ref name="mitlec" />
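The properties listed above (undefined at the endpoints, zero at 0.5, monotonically increasing) match the log-odds, or logit, function <math>\ln\bigl(p/(1-p)\bigr)</math>; assuming that is the function in question, a minimal sketch:

```python
import math

def logit(p):
    """Log-odds of a probability p in (0, 1).
    Undefined (math domain error) at p = 0 and p = 1."""
    return math.log(p / (1 - p))

print(logit(0.5))  # → 0.0, the curve crosses zero at p = 0.5
print(logit(0.9))  # positive, and grows without bound as p -> 1
print(logit(0.1))  # negative, mirror image of logit(0.9)
```

Note the antisymmetry `logit(p) == -logit(1 - p)`, which reflects the ∞/−∞ behavior at the two endpoints.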
 
== Cross entropy loss (Log Loss) ==
{{main|Cross entropy}}
Using the alternative label convention <math>t=(1+y)/2</math> so that <math>t \in \{0,1\}</math>, the cross entropy loss is defined as