Activation function: Difference between revisions

m Added links between established Wikipedia pages.
Line 131:
| <math>C^\infty</math>
|-
| [[Softplus]]<ref>{{Cite web|url=http://proceedings.mlr.press/v15/glorot11a/glorot11a.pdf|title=Deep sparse rectifier neural networks|last1=Glorot|first1=Xavier|last2=Bordes|first2=Antoine|date=2011|website=International Conference on Artificial Intelligence and Statistics|last3=Bengio|first3=Yoshua}}</ref>
| [[File:Activation softplus.svg]]
| <math>\ln\left(1 + e^x\right)</math>
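A minimal NumPy sketch of the softplus formula above, for illustration only (the function name and the overflow-avoiding identity are editorial choices, not part of the article):

<syntaxhighlight lang="python">
import numpy as np

def softplus(x):
    """ln(1 + e^x), evaluated stably.

    Uses the identity ln(1 + e^x) = max(x, 0) + ln(1 + e^(-|x|))
    so that np.exp never overflows for large positive x.
    """
    x = np.asarray(x, dtype=float)
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

# Softplus is a smooth approximation of the rectifier (ReLU):
print(softplus([-5.0, 0.0, 5.0]))  # approx. [0.0067, 0.6931, 5.0067]
</syntaxhighlight>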
Line 138:
| <math>C^\infty</math>
|-
| [[Rectifier (neural networks)#ELU|Exponential linear unit (ELU)]]<ref>{{Cite arXiv|last1=Clevert|first1=Djork-Arné|last2=Unterthiner|first2=Thomas|last3=Hochreiter|first3=Sepp|date=2015-11-23|title=Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)|eprint=1511.07289 |class=cs.LG}}</ref>
| [[File:Activation elu.svg]]
| <math>\begin{cases}
\alpha\left(e^x - 1\right) & \text{if } x \le 0 \\
x & \text{if } x > 0
\end{cases}</math>
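Assuming the standard piecewise ELU definition reconstructed above, a short NumPy sketch (the function name and the default α = 1.0 are illustrative assumptions, not taken from the table):

<syntaxhighlight lang="python">
import numpy as np

def elu(x, alpha=1.0):
    """alpha * (e^x - 1) for x <= 0, and x for x > 0."""
    x = np.asarray(x, dtype=float)
    # Clamp the argument of expm1 to x <= 0: np.where evaluates both
    # branches, so this keeps the unused branch from overflowing.
    return np.where(x > 0, x, alpha * np.expm1(np.minimum(x, 0.0)))

print(elu([-2.0, 0.0, 2.0]))  # approx. [-0.8647, 0.0, 2.0]
</syntaxhighlight>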
Line 213:
| <math>C^0</math>
|-
| [[Rectifier (neural networks)#SiLU|Sigmoid linear unit]] (SiLU,<ref name="ReferenceA" /> Sigmoid shrinkage,<ref name="refssbs1">
{{Citation
|first1=Abdourrahmane M.|last1=Atto|first2=Dominique|last2=Pastor|first3=Grégoire|last3=Mercier