Universal approximation theorem

Hecht-Nielsen<ref>{{Cite journal |last=Hecht-Nielsen |first=Robert |date=1987 |title=Kolmogorov's mapping neural network existence theorem |url=https://cir.nii.ac.jp/crid/1572543025788928512 |journal=Proceedings of International Conference on Neural Networks, 1987 |volume=3 |pages=11–13}}</ref> showed that a three-layer neural network can approximate any continuous multivariate function. This was extended to the discontinuous case by Ismailov.<ref>{{cite journal |last1=Ismailov |first1=Vugar E. |date=July 2023 |title=A three layer neural network can represent any multivariate function |journal=Journal of Mathematical Analysis and Applications |volume=523 |issue=1 |pages=127096 |arxiv=2012.03016 |doi=10.1016/j.jmaa.2023.127096 |s2cid=265100963}}</ref>
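For illustration, Hecht-Nielsen's construction builds on the Kolmogorov–Arnold representation theorem, which writes any continuous function <math>f\colon[0,1]^n\to\mathbb{R}</math> as a superposition of continuous univariate functions:
<math display="block">f(x_1,\dots,x_n)=\sum_{q=0}^{2n}\Phi_q\!\left(\sum_{p=1}^{n}\phi_{q,p}(x_p)\right).</math>
The nested sums map directly onto an input layer, a hidden layer of <math>2n+1</math> units, and an output layer of such a three-layer network.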
 
<ref>{{Citationcite arxiv |last1=Liu |first1=Ziming |title=KAN: Kolmogorov-Arnold Networks |date=2024-05-24 |url=http://arxiv.org/abs/2404.19756 |access-date=2024-06-03 |arxiv=2404.19756 |last2=Wang |first2=Yixuan |last3=Vaidya |first3=Sachin |last4=Ruehle |first4=Fabian |last5=Halverson |first5=James |last6=Soljačić |first6=Marin |last7=Hou |first7=Thomas Y. |last8=Tegmark |first8=Max}}</ref> shows a practical application.
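As a rough sketch, and not the architecture or code of the cited work, the snippet below stacks two layers of learnable univariate edge functions (plain cubic polynomials standing in for the splines used in practice), mirroring the nested sums of the representation above:
<syntaxhighlight lang="python">
import numpy as np

# Hedged illustration only: a Kolmogorov-Arnold-style layer places a learnable
# univariate function on every edge and sums the results at each output node.

def edge_functions(x, coeffs):
    """Apply one univariate cubic polynomial per (output, input) edge.

    x      -- input vector, shape (n_in,)
    coeffs -- polynomial coefficients, shape (n_out, n_in, 4)
    Returns the layer output, shape (n_out,).
    """
    basis = np.stack([np.ones_like(x), x, x ** 2, x ** 3])  # shape (4, n_in)
    # Contract coefficients with the basis and sum over the input index.
    return np.einsum('oik,ki->o', coeffs, basis)

rng = np.random.default_rng(0)
n = 3                                                        # input variables
x = rng.uniform(size=n)
inner = edge_functions(x, rng.normal(size=(2 * n + 1, n, 4)))    # inner sums
y = edge_functions(inner, rng.normal(size=(1, 2 * n + 1, 4)))    # outer sum
print(y)
</syntaxhighlight>
Training would fit the coefficient tensors to data; the fixed cubic basis is used here only to keep the sketch self-contained.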
 
=== Variants ===