== Table of activation functions ==
| <math>(-\lambda\alpha,\infty)</math>
| <math>C^0</math>
|-
| Leaky rectified linear unit (Leaky ReLU)<ref>{{cite journal |last1=Maas |first1=Andrew L. |last2=Hannun |first2=Awni Y. |last3=Ng |first3=Andrew Y. |s2cid=16489696 |title=Rectifier nonlinearities improve neural network acoustic models |journal=Proc. ICML |date=June 2013 |volume=30 |issue=1}}</ref>
| <math>C^0</math>
|-
| Parametric rectified linear unit (PReLU)<ref>{{Cite arXiv |last1=He |first1=Kaiming |last2=Zhang |first2=Xiangyu |last3=Ren |first3=Shaoqing |last4=Sun |first4=Jian |date=2015-02-06 |title=Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification |eprint=1502.01852 |class=cs.CV}}</ref>
| [[File:Activation prelu.svg]]
| <math>\begin{cases}
    \alpha x & \text{if } x < 0 \\
    x & \text{if } x \geq 0
\end{cases}</math>
|