Radial basis function network

====Gradient descent training of the linear weights====
 
Another possible training algorithm is [[gradient descent]]. In gradient descent training, the weights are adjusted at each time step by moving them in the direction opposite to the gradient of the objective function:
 
:<math> \mathbf{w}(t+1) = \mathbf{w}(t) - \nu \frac {d} {d\mathbf{w}} H_t(\mathbf{w}) </math>
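The update above can be sketched in code. This is a minimal illustration, not the article's reference implementation: it assumes Gaussian basis functions with fixed, evenly spaced centers and a shared width parameter <math>\beta</math>, and takes the objective <math>H_t(\mathbf{w})</math> to be the instantaneous squared error <math>(y_t - f(\mathbf{x}_t))^2</math>, whose gradient gives the weight update <math>\mathbf{w} \leftarrow \mathbf{w} + 2\nu\,(y_t - f(\mathbf{x}_t))\,\boldsymbol{\rho}(\mathbf{x}_t)</math>. The names <code>rho</code>, <code>train</code>, and all parameter values are illustrative choices.

```python
import numpy as np

def rho(x, centers, beta):
    """Vector of Gaussian basis-function activations for a scalar input x.
    (Illustrative: assumes Gaussian kernels with a shared width beta.)"""
    return np.exp(-beta * (x - centers) ** 2)

def train(xs, ys, centers, beta=4.0, nu=0.05, epochs=200, seed=0):
    """Gradient descent on the linear output weights only (centers fixed).

    For each sample, H_t(w) = (y_t - w . rho(x_t))^2, so
    dH_t/dw = -2 (y_t - w . rho(x_t)) rho(x_t), and the update
    w(t+1) = w(t) - nu * dH_t/dw becomes the line marked below.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=len(centers))
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            phi = rho(x, centers, beta)
            err = y - w @ phi          # y_t - f(x_t)
            w += 2 * nu * err * phi    # w(t+1) = w(t) - nu * dH_t/dw
    return w

# Example: fit sin(x) on [0, 2*pi] with 10 evenly spaced centers.
xs = np.linspace(0, 2 * np.pi, 50)
ys = np.sin(xs)
centers = np.linspace(0, 2 * np.pi, 10)
w = train(xs, ys, centers)
preds = np.array([w @ rho(x, centers, 4.0) for x in xs])
```

Because only the linear output weights are trained, the per-sample loss is quadratic in <math>\mathbf{w}</math> and this reduces to the classic LMS rule, which converges for a sufficiently small learning rate <math>\nu</math>.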