Talk:Universal approximation theorem

 
::::::::: OK, sounds good. [[User:Koertefa|''<span style="color:#2F4F4F">'''K'''<span style="color:Teal">œrte</span>'''F'''</span><span style="color:Teal">a</span>'']] [[User talk:Koertefa#top|<span style="color:#2F4F4F">'''{'''<i style="color:Teal">ταλκ</i>'''}'''</span>]] 19:22, 6 July 2020 (UTC)
 
 
== Arbitrary Width Case ==
I am wondering whether the theorem as presented here is correct. In http://www2.math.technion.ac.il/~pinkus/papers/acta.pdf, the neural network has only one node in the output layer (so it maps to R), whereas here the number of output nodes is arbitrary. Is this correct?
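
For comparison, here is a rough paraphrase of the two formulations being discussed (my wording, not the exact statement in either source). Pinkus's survey states the density result for scalar-valued networks,

<math display="block">
\sup_{x \in K} \left| f(x) - \sum_{i=1}^{N} a_i \, \sigma(w_i^\top x + b_i) \right| < \varepsilon,
\qquad f \colon K \subset \mathbb{R}^n \to \mathbb{R},\quad a_i, b_i \in \mathbb{R},\quad w_i \in \mathbb{R}^n,
</math>

while the article states it with vector-valued output weights,

<math display="block">
\sup_{x \in K} \left\| f(x) - \sum_{i=1}^{N} a_i \, \sigma(w_i^\top x + b_i) \right\| < \varepsilon,
\qquad f \colon K \subset \mathbb{R}^n \to \mathbb{R}^m,\quad a_i \in \mathbb{R}^m,\quad b_i \in \mathbb{R},\quad w_i \in \mathbb{R}^n.
</math>

If I understand it correctly, the vector-valued version would follow from the scalar one by approximating each output coordinate separately and stacking the resulting networks (at the cost of a larger hidden layer), but someone more familiar with the literature should confirm whether the article's statement is a faithful generalization.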