:<math>\min_{w\in\mathbb{R}^d} \frac{1}{n}\sum_{i=1}^n (y_i- \langle w,x_i\rangle)^2+ \lambda \|w\|_1, \quad \text{ where } x_i\in \mathbb{R}^d\text{ and } y_i\in\mathbb{R}.</math>
Proximal gradient methods offer a general framework for solving regularization problems from statistical learning theory, with penalties tailored to the specific application.<ref name=combettes>{{cite journal|last=Combettes|first=Patrick L.|
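For the lasso objective above, the proximal gradient step reduces to a gradient step on the smooth least-squares term followed by componentwise soft-thresholding, the proximal operator of the <math>\ell_1</math> penalty. A minimal sketch (the function names <code>soft_threshold</code> and <code>ista_lasso</code> are illustrative, not from any particular library):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: componentwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(X, y, lam, step=None, iters=500):
    # Proximal gradient (ISTA) for (1/n)*sum_i (y_i - <w, x_i>)^2 + lam*||w||_1.
    n, d = X.shape
    if step is None:
        # Step size 1/L, where L = (2/n) * ||X||_2^2 is the Lipschitz
        # constant of the gradient of the least-squares term.
        step = n / (2.0 * np.linalg.norm(X, 2) ** 2)
    w = np.zeros(d)
    for _ in range(iters):
        grad = (2.0 / n) * X.T @ (X @ w - y)       # gradient step on smooth part
        w = soft_threshold(w - step * grad, step * lam)  # prox step on penalty
    return w
```

With a small regularization parameter and noiseless data, the iterates converge to (approximately) the sparse generating vector.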
== Relevant background ==
=== Group lasso ===
Group lasso is a generalization of the [[#Lasso regularization|lasso method]] when features are grouped into disjoint blocks.<ref name=groupLasso>{{cite journal|last=Yuan|first=M.|
:<math>R(w) =\sum_{g=1}^G \|w_g\|_2,</math>
where <math>w_g</math> denotes the subvector of <math>w</math> restricted to group <math>g</math>, and <math>G</math> is the number of groups.
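The proximal operator of this penalty acts blockwise: each group's subvector is either scaled toward zero or set to zero entirely, which is what produces group-level sparsity. A minimal sketch (the function name and the representation of groups as a list of index arrays are assumptions for illustration):

```python
import numpy as np

def group_soft_threshold(w, groups, t):
    # Proximal operator of t * sum_g ||w_g||_2 (block soft-thresholding).
    # `groups` is a list of index arrays forming a disjoint partition of
    # the coordinates (an assumption of the group lasso setup).
    out = np.zeros_like(w)
    for idx in groups:
        norm = np.linalg.norm(w[idx])
        if norm > t:
            # Shrink the whole block radially toward the origin.
            out[idx] = (1.0 - t / norm) * w[idx]
        # Blocks with ||w_g||_2 <= t are set to zero exactly.
    return out
```

Note that, unlike the componentwise soft-thresholding of the plain lasso, a group is zeroed only when its entire Euclidean norm falls below the threshold.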