Lastly, we remark that the above theory can be further extended to spaces of functions taking values in function spaces, although obtaining kernels for these spaces is a more difficult task.<ref>Rosasco</ref>
== Connection between RKHSs and the ReLU function ==
The [[Rectifier (neural networks)|ReLU function]] is commonly defined as <math>f(x)=\max \{0, x\}</math> and is a mainstay in the architecture of neural networks where it is used as an activation function. One can construct a ReLU-like nonlinear function using the theory of reproducing kernel Hilbert spaces. Below, we derive this construction and show how it implies the representation power of neural networks with ReLU activations.
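The flavor of this construction can be sketched numerically. Assuming the RKHS is taken to be a Sobolev-type space of absolutely continuous functions vanishing at the origin, whose reproducing kernel is <math>K(x,y)=\min\{x,y\}</math> (an assumption for illustration; the choice of space is not fixed by the passage above), a difference of two kernel sections already yields a ReLU-like piecewise-linear ramp:

```python
import numpy as np

def relu(x):
    # the standard ReLU activation: max{0, x}
    return np.maximum(0.0, x)

def k_min(x, y):
    # assumed reproducing kernel K(x, y) = min(x, y) of a Sobolev-type
    # space of absolutely continuous functions with f(0) = 0
    return np.minimum(x, y)

def relu_like(x, a, b):
    # difference of kernel sections, rescaled: equals 0 for x <= a,
    # rises linearly as (x - a)/(b - a) on [a, b], and is 1 for x >= b,
    # i.e. a clipped ReLU-shaped nonlinearity built from kernel evaluations
    return (k_min(x, b) - k_min(x, a)) / (b - a)
```

For example, with `a = 1` and `b = 2`, `relu_like` is zero on <math>(-\infty,1]</math>, linear on <math>[1,2]</math>, and constant beyond, mirroring the hinge shape of `relu`; this is only a sketch of the idea, not the full derivation.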