'''Proximal gradient methods''' are a generalized form of projection used to solve non-differentiable [[convex optimization]] problems.
Many interesting problems can be formulated as convex optimization problems of the form
<math>
\min_{x \in \mathbb{R}^N} \sum_{i=1}^n f_i(x),
</math>
where <math>f_i,\ i = 1, \dots, n</math> are [[convex functions]] <math>f_i : \mathbb{R}^N \rightarrow \mathbb{R}</math>, some of which may be non-differentiable. Non-differentiability rules out conventional smooth optimization techniques such as the [[Gradient descent|steepest descent method]] and the [[conjugate gradient method]], but proximal gradient methods can be used instead. These methods proceed by splitting, in that the functions <math>f_1, \dots, f_n</math> are used individually so as to yield an easily [[wikt:implementable|implementable]] algorithm.
They are called [[proximal]] because each non-[[smooth function|smooth]] function among <math>f_1, \dots, f_n</math> is involved via its [[proximal operator|proximity operator]].
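As an illustration of the splitting idea, the following sketch (not part of the article; names and problem data are illustrative) applies a basic proximal gradient iteration to a lasso-type problem, where <math>f_1(x) = \tfrac{1}{2}\|Ax - b\|_2^2</math> is smooth and handled by a gradient step, while the non-smooth <math>f_2(x) = \lambda\|x\|_1</math> is handled via its proximity operator, which in this case is soft-thresholding.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximity operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, step, iters=500):
    # Minimize (1/2)||Ax - b||^2 + lam * ||x||_1.
    # Each iteration: gradient step on the smooth term,
    # then the proximity operator of the non-smooth term.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)  # proximal step
    return x

# Small demo with a sparse ground truth (illustrative data).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[1, 4]] = [2.0, -1.5]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
x_hat = proximal_gradient_lasso(A, b, lam=0.1, step=step)
```

With the step size set to the reciprocal of the gradient's Lipschitz constant, each iteration is guaranteed not to increase the objective, and the soft-thresholding step drives small coefficients exactly to zero, recovering the sparsity of the solution.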