Proximal gradient methods offer a general framework for solving regularization problems from statistical learning theory with penalties that are tailored to a specific problem application.<ref name=combettes>{{cite journal|last=Combettes|first=Patrick L.|coauthors=Wajs, Valérie R.|title=Signal Recovery by Proximal Forward-Backward Splitting|journal=Multiscale Model. Simul.|year=2005|volume=4|issue=4|pages=1168–1200|url=http://epubs.siam.org/doi/abs/10.1137/050626090}}</ref><ref name=structSparse>{{cite journal|last=Mosci|first=S.|coauthors=Rosasco, L., Santoro, M., Verri, A., and Villa, S.|title=Solving Structured Sparsity Regularization with Proximal Methods|journal=Machine Learning and Knowledge Discovery in Databases|year=2010|volume=6322|pages=418–433}}</ref> Such customized penalties can help to induce certain structure in problem solutions, such as ''sparsity'' (in the case of [[#Lasso_regularization|lasso]]) or ''group structure'' (in the case of [[#Exploiting_group_structure|group lasso]]).
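For instance, the lasso penalty <math>\lambda\|w\|_1</math> induces sparsity because its proximity operator is elementwise soft-thresholding, which sets small coefficients exactly to zero. The following minimal sketch (the helper names <code>ista</code> and <code>soft_threshold</code> are illustrative, not from any standard library) applies the basic proximal gradient iteration, known as ISTA, to the lasso problem <math>\min_w \tfrac{1}{2}\|Xw - y\|_2^2 + \lambda\|w\|_1</math>, assuming a fixed step size equal to the reciprocal of the gradient's Lipschitz constant:

<syntaxhighlight lang="python">
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t*||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    # Proximal gradient (ISTA) for min_w 0.5*||Xw - y||^2 + lam*||w||_1.
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)            # gradient of the smooth data-fit term
        w = soft_threshold(w - step * grad, step * lam)
    return w
</syntaxhighlight>

Each iteration takes a gradient step on the smooth data-fit term and then applies the penalty's proximity operator; the soft-thresholding step is what produces the induced sparsity.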
==Relevant background==
In certain situations it may be easier to compute the proximity operator for the conjugate <math>\varphi^*</math> than for the function <math>\varphi</math> itself; in such cases the Moreau decomposition can be applied, as in the sketch below. This is the case for [[#Exploiting_group_structure|group lasso]].
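For example, when <math>\varphi</math> is a norm scaled by <math>\lambda</math>, its conjugate <math>\varphi^*</math> is the indicator function of the dual-norm ball of radius <math>\lambda</math>, whose proximity operator is simply the projection onto that ball. The sketch below (illustrative helper names, assuming the Euclidean norm as for a single group in group lasso) recovers <math>\operatorname{prox}_{\varphi}</math> from this projection via the Moreau decomposition <math>v = \operatorname{prox}_{\varphi}(v) + \operatorname{prox}_{\varphi^*}(v)</math>:

<syntaxhighlight lang="python">
import numpy as np

def project_l2_ball(v, radius):
    # Prox of the conjugate: Euclidean projection onto {x : ||x||_2 <= radius}.
    norm = np.linalg.norm(v)
    return v if norm <= radius else (radius / norm) * v

def prox_scaled_l2_norm(v, lam):
    # Moreau decomposition: prox_phi(v) = v - prox_{phi*}(v)
    # for phi(x) = lam*||x||_2, whose conjugate is the indicator
    # of the Euclidean ball of radius lam.
    return v - project_l2_ball(v, lam)

v = np.array([3.0, 4.0])              # ||v||_2 = 5
print(prox_scaled_l2_norm(v, 2.0))    # block soft-thresholding: [1.8, 2.4]
</syntaxhighlight>

The result matches the closed-form block soft-thresholding <math>\left(1 - \lambda/\|v\|_2\right)_+ v</math>, which is exactly the groupwise update used in group lasso.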
==Lasso regularization==