Activation function: Difference between revisions

Citation bot: Removed URL that duplicated identifier.
Line 104:
| <math>(-1,1)</math>
| <math>C^\infty</math>
|-
|Softsign
|
|<math>\frac{x}{1+|x|}</math>
|<math>\frac{1}{(1+|x|)^2}</math>
|<math>(-1, 1)</math>
|<math>C^1</math>
|-
| [[Rectifier (neural networks)|Rectified linear unit]] (ReLU)<ref>{{Citation|last1=Nair|first1=Vinod|last2=Hinton|first2=Geoffrey E.|date=2010|contribution=Rectified Linear Units Improve Restricted Boltzmann Machines|contribution-url=http://dl.acm.org/citation.cfm?id=3104322.3104425|title=27th International Conference on International Conference on Machine Learning|series=ICML'10|location=USA|publisher=Omnipress|pages=807–814|isbn=9781605589077}}</ref>
Line 155 ⟶ 162:
\end{cases}</math>
|-
| Scaled exponential linear unit (SELU)<ref>{{Cite journal |last1=Klambauer |first1=Günter |last2=Unterthiner |first2=Thomas |last3=Mayr |first3=Andreas |last4=Hochreiter |first4=Sepp |date=2017-06-08 |title=Self-Normalizing Neural Networks |journal=Advances in Neural Information Processing Systems |volume=30 |issue=2017 |arxiv=1706.02515 }}</ref>
| [[File:Activation selu.png]]
| <math>\lambda \begin{cases}
\alpha(e^x - 1) & \text{if } x < 0 \\
x & \text{if } x \ge 0
\end{cases}</math> with parameters <math>\lambda = 1.0507</math> and <math>\alpha = 1.67326</math>
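
A minimal NumPy sketch of the activations appearing in these hunks (softsign, ReLU, and SELU), for illustration only; the SELU constants below are the fixed-point values <math>\lambda \approx 1.0507</math> and <math>\alpha \approx 1.67326</math> derived in the cited Klambauer et al. paper:

<syntaxhighlight lang="python">
import numpy as np

# Fixed-point constants for SELU (Klambauer et al., 2017).
SELU_LAMBDA = 1.0507
SELU_ALPHA = 1.67326

def softsign(x):
    """x / (1 + |x|): odd, bounded in (-1, 1), C^1 smooth."""
    return x / (1.0 + np.abs(x))

def relu(x):
    """max(0, x): identity for positive inputs, zero otherwise."""
    return np.maximum(0.0, x)

def selu(x):
    """lambda * (alpha * (e^x - 1) if x < 0, else x)."""
    return SELU_LAMBDA * np.where(x < 0.0, SELU_ALPHA * np.expm1(x), x)

# Example: evaluate each activation on a small grid of inputs.
xs = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(softsign(xs))
print(relu(xs))
print(selu(xs))
</syntaxhighlight>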