Hecht-Nielsen<ref>{{Cite journal |last=Hecht-Nielsen |first=Robert |date=1987 |title=Kolmogorov's mapping neural network existence theorem |url=https://cir.nii.ac.jp/crid/1572543025788928512 |journal=Proceedings of International Conference on Neural Networks, 1987 |volume=3 |pages=11–13}}</ref> showed that a three-layer neural network can approximate any continuous multivariate function. This was extended to the discontinuous case by Ismailov.<ref>{{Cite journal |last1=Ismailov |first1=Vugar E. |date=July 2023 |title=A three layer neural network can represent any multivariate function |journal=Journal of Mathematical Analysis and Applications |volume=523 |issue=1 |pages=127096 |arxiv=2012.03016 |doi=10.1016/j.jmaa.2023.127096 |s2cid=265100963}}</ref>
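Hecht-Nielsen's argument rests on the Kolmogorov–Arnold representation theorem, which, in one common form, expresses any continuous function on the unit cube through compositions and sums of continuous single-variable functions (the exact inner functions <math>\phi_{q,p}</math> and outer functions <math>\Phi_q</math> depend on the construction):

<math display="block">f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)</math>

Read as a network, the inner sums form a hidden layer of <math>2n+1</math> units and the outer sum a second layer, giving the three-layer structure referred to above.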
=== Variants ===
The theorem has been extended to cover discontinuous activation functions,<ref name="leshno" /> noncompact domains,<ref name="kidger" /><ref>{{Cite journal |last1=van Nuland |first1=Teun |year=2024 |title=Noncompact uniform universal approximation |url=https://doi.org/10.1016/j.neunet.2024.106181 |journal=Neural Networks |volume=173 |doi=10.1016/j.neunet.2024.106181 |pmid=38412737 |arxiv=2308.03812}}</ref> certifiable networks,<ref>{{cite conference |last1=Baader |first1=Maximilian |last2=Mirman |first2=Matthew |last3=Vechev |first3=Martin |date=2020 |title=Universal Approximation with Certified Networks |url=https://openreview.net/forum?id=B1gX8kBtPr |conference=ICLR}}</ref> random neural networks,<ref>{{Cite journal |last1=Gelenbe |first1=Erol |last2=Mao |first2=Zhi Hong |last3=Li |first3=Yan D. |year=1999 |title=Function approximation with spiked random networks |url=https://zenodo.org/record/6817275 |journal=IEEE Transactions on Neural Networks |volume=10 |issue=1 |pages=3–9 |doi=10.1109/72.737488 |pmid=18252498}}</ref> and alternative network architectures and topologies.<ref name="kidger" /><ref>{{Cite conference |last1=Lin |first1=Hongzhou |last2=Jegelka |first2=Stefanie |date=2018 |title=ResNet with one-neuron hidden layers is a Universal Approximator |url=https://papers.nips.cc/paper/7855-resnet-with-one-neuron-hidden-layers-is-a-universal-approximator |publisher=Curran Associates |volume=30 |pages=6169–6178 |journal=Advances in Neural Information Processing Systems}}</ref>