Proximal gradient method
where some of the functions are non-differentiable; this rules out conventional smooth optimization techniques such as the [[Gradient descent|steepest descent method]] and the [[conjugate gradient method]]. There is, however, a specific class of [[algorithms]] that can solve such problems. These methods proceed by splitting,
in that the functions <math>f_1, \dots, f_n</math> are used individually so as to yield an easily [[wikt:implementable|implementable]] algorithm.
They are called [[proximal]] because each non-smooth function among <math>f_1, \dots, f_n</math> is involved via its proximity
operator. The iterative shrinkage-thresholding algorithm, [[Landweber iteration|projected Landweber]], projected
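A minimal sketch of one such splitting iteration is the iterative shrinkage-thresholding algorithm (ISTA) applied to <math>\ell_1</math>-regularized least squares: each iteration takes a gradient step on the smooth least-squares term, then applies the proximity operator of the non-smooth <math>\ell_1</math> term (soft-thresholding). The problem data, step size, and iteration count below are illustrative assumptions, not part of the article.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1: shrinks each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, step, n_iter=500):
    """Proximal gradient (ISTA) sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Per iteration: a gradient step on the smooth term, followed by the
    proximity operator (soft-thresholding) of the non-smooth term.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Illustrative example: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[[1, 5]] = [3.0, -2.0]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2                # 1/L, L = ||A||_2^2 (Lipschitz constant)
x_hat = ista(A, b, lam=0.1, step=step)
```

The step size <math>1/L</math>, with <math>L</math> the Lipschitz constant of the gradient of the smooth term, is the standard choice guaranteeing convergence of this scheme.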