{{more footnotes|date=November 2013}}
'''Proximal gradient methods''' are a generalized form of projection used to solve non-differentiable [[convex optimization]] problems. Many interesting problems can be formulated as convex optimization problems of form
:<math>
\operatorname{min}\limits_{x \in \mathbb{R}^N} \sum_{i=1}^n f_i(x)
</math>
where <math>f_1, f_2, \dots, f_n</math> are [[convex functions]] defined from <math>f_i: \mathbb{R}^N \rightarrow \mathbb{R}</math>.
When some of the functions are non-differentiable, conventional smooth optimization techniques such as the
[[Gradient descent|steepest descent method]] and the [[conjugate gradient method]] cannot be applied. There is, however, a specific class of [[algorithms]] that can solve such problems. These methods proceed by splitting,
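As an illustration of the splitting idea, the following is a minimal Python sketch (not part of the formal development above) of the proximal gradient iteration applied to the lasso problem, where the objective splits into a smooth least-squares term and a non-smooth <math>\ell_1</math> term whose proximal operator is soft-thresholding. The function names and the fixed step size are illustrative choices, not a standard API.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrinks each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, step, iters=500):
    # Minimize (1/2)||Ax - b||^2 + lam * ||x||_1 by splitting the objective
    # into a smooth part f(x) = (1/2)||Ax - b||^2 (handled by a gradient step)
    # and a non-smooth part lam * ||x||_1 (handled by its proximal operator).
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)            # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

A typical step-size choice is <math>1/L</math>, where <math>L</math> is the Lipschitz constant of the gradient of the smooth part (here the largest eigenvalue of <math>A^\mathsf{T} A</math>).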