Backpropagation training algorithms fall into three categories:
* [[Gradient descent|steepest descent]] (with variable [[learning rate]] and momentum, resilient backpropagation);
* quasi-Newton ([[Broyden–Fletcher–Goldfarb–Shanno algorithm|Broyden–Fletcher–Goldfarb–Shanno]], [[Secant method|one-step secant]]);
* [[Levenberg–Marquardt algorithm|Levenberg–Marquardt]] and [[Conjugate gradient method|conjugate gradient]] (Fletcher–Reeves update, Polak–Ribière update, Powell–Beale restart, scaled conjugate gradient).<ref>{{cite conference|author1=M. Forouzanfar|author2=H. R. Dajani|author3=V. Z. Groza|author4=M. Bolic|author5=S. Rajan|name-list-style=amp|date=July 2010|title=Comparison of Feed-Forward Neural Network Training Algorithms for Oscillometric Blood Pressure Estimation|url=https://www.researchgate.net/publication/224173336|conference=4th Int. Workshop Soft Computing Applications|___location=Arad, Romania|publisher=IEEE}}</ref>
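The first of these categories can be sketched in a few lines. The following is a minimal illustrative example, not code from any of the cited works: it trains a single sigmoid unit by steepest descent with a momentum term, the simplest member of the first family above. The learning rate, momentum coefficient, and toy dataset are all arbitrary assumptions chosen for the sketch.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.5, momentum=0.9, epochs=2000):
    """Steepest descent with momentum for y = sigmoid(w*x + b)."""
    w, b = 0.0, 0.0
    vw, vb = 0.0, 0.0                      # momentum "velocity" terms
    for _ in range(epochs):
        gw = gb = 0.0
        for x, t in data:                  # accumulate batch gradient of squared error
            y = sigmoid(w * x + b)
            d = (y - t) * y * (1.0 - y)    # chain rule: dE/dz for one sample
            gw += d * x
            gb += d
        vw = momentum * vw - lr * gw       # blend previous step into the new one
        vb = momentum * vb - lr * gb
        w += vw
        b += vb
    return w, b

# Toy task: output near 0 for negative inputs, near 1 for positive inputs.
data = [(-2, 0), (-1, 0), (1, 1), (2, 1)]
w, b = train(data)
```

The momentum term reuses a fraction of the previous weight update, which damps oscillation and speeds convergence along shallow directions; quasi-Newton and conjugate-gradient methods instead exploit (approximate) curvature information to choose better search directions.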