| <math>(-\infty,\infty)</math>
| <math>C^0</math>
|-
| Rectified Parametric Sigmoid Units (flexible, 4 parameters)
| [[File:RePSU.svg|thumb|Rectified Parametric Sigmoid Units]]
| <math> \alpha (2x {1}_{ \{ x \geqslant \lambda \} } - g_{\lambda, \sigma, \mu, \beta}(x)) + (1-\alpha) g_{\lambda, \sigma, \mu, \beta}(x)</math>
where
<math>g_{\lambda, \sigma, \mu, \beta}(x) = \frac{ (x - \lambda) {1}_{ \{ x \geqslant \lambda \} } }{ 1 + e^{- \sgn(x-\mu) \left( \frac{\vert x-\mu \vert}{\sigma} \right)^\beta } } </math>
<ref name=refrepsu1>
{{Citation
|first1=Abdourrahmane M.|last1=Atto|first2=Sylvie|last2=Galichet|first3=Dominique|last3=Pastor|first4=Nicolas|last4=Méger
|title = On joint parameterizations of linear and nonlinear functionals in neural networks
|journal = Neural Networks
|date = 2023
|doi = 10.1016/j.neunet.2022.12.019
}}</ref>
| <math> - </math>
| <math>(-\infty,+\infty)</math>
| <math>C^0</math>
| {{No}}
| {{No}}
| {{No}}
|-
| Sigmoid linear unit (SiLU,<ref name="ReferenceA" /> Sigmoid shrinkage,<ref name=refssbs1>
|
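The RePSU formula above combines a rectified linear term with the sigmoid-shrinkage function <math>g_{\lambda, \sigma, \mu, \beta}</math> through the mixing weight <math>\alpha</math>. A minimal NumPy sketch of that definition, with parameter names chosen here for illustration (the paper itself prescribes no reference implementation), could read:

```python
import numpy as np

def g(x, lam, sigma, mu, beta):
    # Sigmoid-shrinkage component g_{lambda, sigma, mu, beta}(x):
    # the rectified term (x - lambda) * 1{x >= lambda}, divided by a
    # generalized sigmoid centred at mu with scale sigma and exponent beta.
    num = (x - lam) * (x >= lam)
    den = 1.0 + np.exp(-np.sign(x - mu) * (np.abs(x - mu) / sigma) ** beta)
    return num / den

def repsu(x, alpha, lam, sigma, mu, beta):
    # RePSU: alpha * (2x * 1{x >= lambda} - g(x)) + (1 - alpha) * g(x),
    # a weighted combination of a rectified linear part and g.
    gx = g(x, lam, sigma, mu, beta)
    return alpha * (2.0 * x * (x >= lam) - gx) + (1.0 - alpha) * gx
```

With <math>\alpha = 0</math> the function reduces to the shrinkage term <math>g</math> alone, while <math>\alpha = 1</math> yields the purely rectified branch <math>2x\,{1}_{\{x \geqslant \lambda\}} - g(x)</math>, which is how the four parameters trade off between ReLU-like and sigmoid-shrinkage behaviour.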