{{more footnotes|date=November 2013}}
'''Proximal gradient methods''' are a generalized form of projection used to solve non-differentiable [[convex optimization]] problems.
Many interesting problems can be formulated as convex optimization problems of the form
:<math>
\min_{x \in \mathbb{R}^N} \sum_{i=1}^n f_i(x)
</math>
where <math>f_i : \mathbb{R}^N \rightarrow \mathbb{R},\ i = 1, \dots, n</math>, are [[convex functions]], some of which may be non-differentiable. Non-differentiability rules out conventional smooth optimization techniques such as the [[Gradient descent|steepest descent method]] and the [[conjugate gradient method]]; proximal gradient methods are designed precisely for this setting.
They are called [[proximal]] because each non-[[smooth function]] among <math>f_1, \dots, f_n</math> is involved via its proximity
operator. The iterative shrinkage-thresholding algorithm,<ref>
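As an illustration of how a proximity operator drives such an iteration, here is a minimal sketch (not from the article) of the iterative shrinkage-thresholding idea applied to the lasso problem <math>\min_x \tfrac{1}{2}\|Ax - b\|^2 + \lambda \|x\|_1</math>; the function names, step-size choice, and problem data are assumptions for the example, not a fixed API:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1: shrinks each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=200):
    # Proximal gradient (ISTA) sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Step size 1/L, where L = ||A||_2^2 is a Lipschitz constant of the
    # gradient of the smooth part.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                   # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)  # prox step on the nonsmooth term
    return x
```

The smooth term is handled by an ordinary gradient step, while the non-differentiable <math>\ell_1</math> term enters only through its proximity operator (here, componentwise soft-thresholding).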