Universal approximation theorems are existence theorems: they simply state that there ''exists'' such a sequence <math>\phi_1, \phi_2, \dots \to f</math>, and do not provide any way to actually find one. Nor do they guarantee that any particular method, such as [[backpropagation]], will find such a sequence: any method for searching the space of neural networks, including backpropagation, may or may not find a converging sequence (for example, backpropagation may get stuck in a local optimum).
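For example, in the classical arbitrary-width formulation, the convergence <math>\phi_n \to f</math> is understood in the uniform norm on a compact set <math>K \subset \mathbb{R}^n</math>: the theorem asserts only that for every <math>\varepsilon > 0</math> there is some network <math>\phi</math> with
:<math>\sup_{x \in K} |\phi(x) - f(x)| < \varepsilon,</math>
without giving any construction of <math>\phi</math> from <math>f</math> and <math>\varepsilon</math>.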
== Setup ==