Activation function

| <math>[-0.278\ldots, \infty)</math>
| <math>C^\infty</math>
|-
|Exponential Linear Sigmoid SquasHing (ELiSH)<ref>{{cite arxiv |last=Basirat |first=Mina |last2=Roth |first2=Peter M. |title=The Quest for the Golden Activation Function |date=2018-08-02 |arxiv=1808.00783}}</ref>
|
|<math>\begin{cases}
\frac{x}{1+e^{-x}} & \text{if } x \geq 0 \\
\frac{e^x - 1}{1 + e^{-x}} & \text{if } x < 0
\end{cases}
</math>
|
|
|
|-
| [[Gaussian function|Gaussian]]
|}

== References ==
{{reflist|30em}}
 
== Further reading ==
* {{cite arxiv |last=Nwankpa |first=Chigozie |last2=Ijomah |first2=Winifred |last3=Gachagan |first3=Anthony |last4=Marshall |first4=Stephen |title=Activation Functions: Comparison of trends in Practice and Research for Deep Learning |date=2018-11-08 |arxiv=1811.03378}}
* {{cite journal |last=Dubey |first=Shiv Ram |last2=Singh |first2=Satish Kumar |last3=Chaudhuri |first3=Bidyut Baran |year=2022 |title=Activation functions in deep learning: A comprehensive survey and benchmark |journal=Neurocomputing |publisher=Elsevier BV |volume=503 |pages=92–108 |doi=10.1016/j.neucom.2022.06.111 |issn=0925-2312 |doi-access=free}}
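
The piecewise ELiSH definition in the table above translates directly into code. The following NumPy sketch is illustrative only: the function name, the numerically stable sigmoid formulation, and the test values are implementation choices and are not taken from the cited paper.

<syntaxhighlight lang="python">
import numpy as np

def elish(x):
    # ELiSH (Basirat & Roth, 2018):
    #   x * sigmoid(x)            for x >= 0
    #   (exp(x) - 1) * sigmoid(x) for x <  0
    x = np.asarray(x, dtype=float)
    z = np.exp(-np.abs(x))                   # always in (0, 1]; never overflows
    sig = np.where(x >= 0,
                   1.0 / (1.0 + z),          # sigmoid(x) for x >= 0
                   z / (1.0 + z))            # sigmoid(x) for x <  0
    return np.where(x >= 0, x * sig, np.expm1(x) * sig)  # expm1(x) = exp(x) - 1

print(elish([-3.0, 0.0, 3.0]))  # approx. [-0.0451  0.      2.8577]
</syntaxhighlight>

For non-negative inputs ELiSH coincides with the SiLU/Swish function listed earlier in the table, so the two branches join continuously at <math>x = 0</math>.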
{{Differentiable computing}}