====Pseudoinverse solution for the linear weights====
 
After the centers <math>c_i</math> have been fixed, the weights that minimize the error at the output can be computed with a linear [[pseudoinverse]] solution:
:<math>\mathbf{w} = \mathbf{G}^+ \mathbf{b}</math>,
where the entries of ''G'' are the values of the radial basis functions evaluated at the data points <math>x_j</math>: <math>g_{ji} = \rho(||x_j-c_i||)</math>.
 
The existence of this linear solution means that, unlike multi-layer perceptron (MLP) networks, RBF networks have an explicit minimizer for the linear weights (when the centers are fixed). If ''G'' is rank-deficient, the minimizer is not unique — there is an affine space of equally good weight vectors — and the pseudoinverse selects the minimum-norm solution among them.
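As a minimal sketch of this solution (assuming NumPy, a Gaussian basis, and made-up one-dimensional training data with a hypothetical width parameter <code>beta</code>), the weights can be computed as:

```python
import numpy as np

# Hypothetical 1-D training set: inputs x_j and targets b_j (an assumption,
# any data would do).
x = np.linspace(0.0, 1.0, 20)          # training inputs x_j
b = np.sin(2 * np.pi * x)              # target outputs b_j
c = np.linspace(0.0, 1.0, 5)           # fixed centers c_i
beta = 10.0                            # assumed width parameter

def rho(r):
    # Gaussian radial basis function (one common choice of rho)
    return np.exp(-beta * r**2)

# Design matrix G with entries g_{ji} = rho(||x_j - c_i||)
G = rho(np.abs(x[:, None] - c[None, :]))

# Linear weights via the Moore-Penrose pseudoinverse: w = G^+ b.
# When G is rank-deficient, this picks the minimum-norm minimizer out of
# the affine space of equally good solutions.
w = np.linalg.pinv(G) @ b

residual = np.linalg.norm(G @ w - b)   # least-squares fitting error
```

Because <code>w</code> solves a linear least-squares problem, no weight vector can achieve a smaller residual than the pseudoinverse solution.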
 
====Gradient descent training of the linear weights====