Proximal gradient method

where <math>f_1, f_2, \ldots, f_n</math> are convex functions <math>f_i: \mathbb{R}^N \rightarrow \mathbb{R}</math>,
some of which may be non-differentiable. This rules out conventional smooth optimization techniques such as
the [http://en.wikipedia.org/wiki/Gradient_descent steepest descent method] and the [http://en.wikipedia.org/wiki/Conjugate_gradient_method conjugate gradient method]. There is, however, a specific class of algorithms that can solve the above optimization problem. These methods proceed by splitting,
in that the functions <math>f_1, \ldots, f_n</math> are used individually so as to yield an easily implementable algorithm.
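
As a minimal illustration of this splitting idea, the sketch below applies a proximal gradient iteration to the special case of two functions: a smooth least-squares term, used only through its gradient, and a non-smooth <math>\ell_1</math> term, used only through its proximity operator (soft-thresholding). The data, regularization weight, and step size are illustrative assumptions, not part of the method itself.

<syntaxhighlight lang="python">
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, n_iter=200):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by alternating a gradient
    # step on the smooth term with a proximal step on the non-smooth term.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                          # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)   # proximal step on the l1 part
    return x

# Illustrative data; the step size 1 / ||A||_2^2 (the inverse Lipschitz constant
# of the gradient of the smooth part) is a standard safe choice.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
x_hat = proximal_gradient(A, b, lam=0.1, step=1.0 / np.linalg.norm(A, 2) ** 2)
</syntaxhighlight>

Each iteration touches the smooth term only through its gradient and the non-smooth term only through its proximity operator, which is the sense in which the functions are used individually.
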
They are called proximal because each non-smooth function among <math>f_1, \ldots, f_n</math> is involved via its proximity