:<math>\phi(v)=C[f^{-1}(v)]+(1-f^{-1}(v))C'[f^{-1}(v)] = 2\sqrt{(\frac{e^{2v}}{1+e^{2v}})(1-\frac{e^{2v}}{1+e^{2v}})}+(1-\frac{e^{2v}}{1+e^{2v}})(\frac{1-\frac{2e^{2v}}{1+e^{2v}}}{\sqrt{\frac{e^{2v}}{1+e^{2v}}(1-\frac{e^{2v}}{1+e^{2v}})}}) = e^{-v}</math>
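The final equality can be checked directly. Substituting <math>\eta = f^{-1}(v) = \tfrac{e^{2v}}{1+e^{2v}}</math>, so that <math>1-\eta = \tfrac{1}{1+e^{2v}}</math> and <math>\sqrt{\eta(1-\eta)} = \tfrac{e^{v}}{1+e^{2v}}</math>, the two terms simplify to

:<math>\phi(v)=\frac{2e^{v}}{1+e^{2v}}+\frac{e^{-v}-e^{v}}{1+e^{2v}}=\frac{e^{v}+e^{-v}}{1+e^{2v}}=\frac{e^{-v}\left(e^{2v}+1\right)}{1+e^{2v}}=e^{-v}.</math>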
The exponential loss is convex and grows exponentially for negative argument values, which makes it more sensitive to outliers than losses that grow only linearly. The exponential loss is used in the [[AdaBoost|AdaBoost algorithm]].
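This sensitivity can be illustrated numerically. The sketch below (the function name is illustrative, not from the source) evaluates the exponential loss <math>V(f(x),y)=e^{-yf(x)}</math> for labels <math>y \in \{-1,+1\}</math>:

```python
import math

def exponential_loss(y, fx):
    """Exponential loss V(f(x), y) = exp(-y * f(x)) for labels y in {-1, +1}."""
    return math.exp(-y * fx)

# Correctly classified point with a comfortable margin: small loss.
print(exponential_loss(1, 2.0))   # e^{-2} ≈ 0.135
# Badly misclassified outlier: loss blows up exponentially with the margin.
print(exponential_loss(1, -4.0))  # e^{4} ≈ 54.6
```

A single mislabeled point with a large negative margin can thus dominate the total empirical risk, which is why exponential-loss methods such as AdaBoost are often described as sensitive to label noise.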
== Hinge loss ==