Why constrained regression and regularized regression are equivalent

Problem 1 (regularized):

  $\min_{\beta} ~f_\alpha(\beta):=\frac{1}{2}\Vert y-X\beta\Vert^2 +\alpha\Vert \beta\Vert$

Problem 2 (constrained):

  $\min_{\beta} ~\frac{1}{2}\Vert y-X\beta\Vert^2 \quad \text{s.t.}~\Vert \beta\Vert-c\leq 0$

Lagrangian of problem 2 (with multiplier $\lambda \geq 0$):

$\mathcal{L}(\beta,\lambda)=\frac{1}{2}\Vert y-X\beta\Vert^2+\lambda (\Vert \beta\Vert-c)$

The KKT conditions for problem 2 give:

stationarity (inner minimization of the dual): $\beta^*=\arg\min_{\beta}~\mathcal{L}(\beta,\lambda)=\arg\min_{\beta}~\frac{1}{2}\Vert y-X\beta\Vert^2+\lambda (\Vert \beta\Vert-c)$

complementary slackness: $\lambda^*(\Vert \beta^*\Vert-c)=0$
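For completeness, the full KKT system for problem 2 can be written out (this expansion is standard convex-optimization material, not stated in the original note; $\partial\Vert\beta\Vert$ denotes the subdifferential, needed because the norm may be non-smooth at $0$):

$$\begin{aligned}
&\text{stationarity:} && 0 \in -X^\top(y-X\beta^*) + \lambda^*\,\partial\Vert\beta^*\Vert \\
&\text{primal feasibility:} && \Vert\beta^*\Vert - c \leq 0 \\
&\text{dual feasibility:} && \lambda^* \geq 0 \\
&\text{complementary slackness:} && \lambda^*(\Vert\beta^*\Vert - c) = 0
\end{aligned}$$

The two conditions highlighted above (stationarity and complementary slackness) are the ones that do the work in the argument below; primal and dual feasibility will hold by construction.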

For problem 1, the minimizer satisfies

$\beta^*=\arg\min_{\beta} ~f_\alpha(\beta)=\arg\min_{\beta}~\frac{1}{2}\Vert y-X\beta\Vert^2 +\alpha\Vert \beta\Vert$

Now set $\lambda = \alpha$ and $c=\Vert \beta^*\Vert$. With this choice, $\mathcal{L}(\beta,\alpha)=f_\alpha(\beta)-\alpha c$ differs from $f_\alpha(\beta)$ only by the constant $\alpha c$, so $\beta^*$ also minimizes the Lagrangian and stationarity holds; and since $\Vert \beta^*\Vert-c=0$, complementary slackness holds as well. Both KKT conditions are met, so $\beta^*$ is also a solution of problem 2 with radius $c=\Vert\beta^*\Vert$: every regularized solution is a constrained solution for a matching radius, which is the sense in which the two problems are equivalent.
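The equivalence can be checked numerically. For a runnable sketch it is easiest to use the squared penalty $\frac{\alpha}{2}\Vert\beta\Vert^2$ (ridge regression) in place of $\alpha\Vert\beta\Vert$, because the penalized problem then has a closed-form solution; the same KKT argument applies. Everything below (data, sizes, iteration counts) is illustrative, not from the original note:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

# Penalized (ridge) problem: min 0.5*||y - X b||^2 + 0.5*alpha*||b||^2,
# solved in closed form from the stationarity condition (X^T X + alpha I) b = X^T y.
alpha = 2.0
beta_pen = np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)

# Constrained problem with the radius chosen as the norm of the penalized solution.
c = np.linalg.norm(beta_pen)

def project(b, radius):
    """Euclidean projection onto the ball ||b|| <= radius."""
    nb = np.linalg.norm(b)
    return b if nb <= radius else b * (radius / nb)

# Projected gradient descent on 0.5*||y - X b||^2  s.t.  ||b|| <= c.
beta_con = np.zeros(p)
step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L, L = largest eigenvalue of X^T X
for _ in range(5000):
    grad = X.T @ (X @ beta_con - y)
    beta_con = project(beta_con - step * grad, c)

print("max abs difference:", np.abs(beta_pen - beta_con).max())
```

The two solvers land on the same vector: the constrained problem with $c=\Vert\beta_{\text{pen}}\Vert$ reproduces the penalized solution, and the constraint is active at the optimum, exactly as complementary slackness predicts for $\lambda^* = \alpha > 0$.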
