exercise:D0e0218f32

From Stochiki
Latest revision as of 18:27, 25 June 2023

[math] \require{textmacros} \def \bbeta {\bf \beta} \def\fat#1{\mbox{\boldmath$#1$}} \def\reminder#1{\marginpar{\rule[0pt]{1mm}{11pt}}\textbf{#1}} \def\SSigma{\bf \Sigma} \def\ttheta{\bf \theta} \def\aalpha{\bf \alpha} \def\ddelta{\bf \delta} \def\eeta{\bf \eta} \def\llambda{\bf \lambda} \def\ggamma{\bf \gamma} \def\nnu{\bf \nu} \def\vvarepsilon{\bf \varepsilon} \def\mmu{\bf \mu} \def\nnu{\bf \nu} \def\ttau{\bf \tau} \def\SSigma{\bf \Sigma} \def\TTheta{\bf \Theta} \def\XXi{\bf \Xi} \def\PPi{\bf \Pi} \def\GGamma{\bf \Gamma} \def\DDelta{\bf \Delta} \def\ssigma{\bf \sigma} \def\UUpsilon{\bf \Upsilon} \def\PPsi{\bf \Psi} \def\PPhi{\bf \Phi} \def\LLambda{\bf \Lambda} \def\OOmega{\bf \Omega} [/math]

Consider the linear regression model [math]\mathbf{Y} = \mathbf{X} \bbeta + \vvarepsilon[/math] with [math]\vvarepsilon \sim \mathcal{N}(\mathbf{0}_{n}, \sigma^2 \mathbf{I}_{nn})[/math] and an [math]n \times 2[/math]-dimensional design matrix with zero-centered, standardized but collinear columns, i.e.:

[[math]] \begin{eqnarray*} \mathbf{X}^{\top} \mathbf{X} & = & \left( \begin{array}{ll} 1 & \rho \\ \rho & 1 \end{array} \right) \end{eqnarray*} [[/math]]
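As a quick numerical aside (not part of the exercise), a design matrix with exactly this Gram structure can be constructed by orthonormalizing a random [math]n \times 2[/math] matrix and multiplying by a Cholesky factor of the target matrix; the values of `n` and `rho` below are illustrative, and column centering is omitted since only the Gram matrix enters the derivation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho = 100, 0.5

# Target Gram matrix X^T X: unit diagonal, off-diagonal rho.
Sigma = np.array([[1.0, rho], [rho, 1.0]])

# Orthonormalize a random n x 2 matrix (Q.T @ Q == I_2), then impose
# the Gram structure via a Cholesky factor L with L @ L.T == Sigma.
Q, _ = np.linalg.qr(rng.standard_normal((n, 2)))
L = np.linalg.cholesky(Sigma)   # lower triangular
X = Q @ L.T                     # so X.T @ X == L @ Q.T @ Q @ L.T == Sigma

print(np.allclose(X.T @ X, Sigma))  # True
```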

with [math]\rho \in (-1, 1)[/math]. Then, an analytic expression for the lasso regression estimator exists. Show that:

[[math]] \begin{eqnarray*} \hat{\beta}_j (\lambda_1) & = & \left\{ \begin{array}{ll} \mbox{sgn}(\hat{\beta}_j) [| \hat{\beta}_j | - \tfrac{1}{2} \lambda_1 (1+\rho)^{-1}]_+ & \mbox{ if } \, \mbox{sgn}[\hat{\beta}_1 (\lambda_1)] = \mbox{sgn}[\hat{\beta}_2 (\lambda_1)], \\ & \hat{\beta}_1 (\lambda_1) \not= 0 \not= \hat{\beta}_2 (\lambda_1), \\ \mbox{sgn}(\hat{\beta}_j) [| \hat{\beta}_j | - \tfrac{1}{2} \lambda_1 (1-\rho)^{-1}]_+ & \mbox{ if } \, \mbox{sgn}[\hat{\beta}_1 (\lambda_1)] \not= \mbox{sgn}[\hat{\beta}_2 (\lambda_1)], \\ & \hat{\beta}_1 (\lambda_1) \not= 0 \not= \hat{\beta}_2 (\lambda_1), \\ \left\{ \begin{array}{lcl} 0 & \mbox{ if } & j \not= \arg \max_{j'} \{ | \hat{\beta}_{j'}^{\mbox{{\tiny (ols)}}} | \} \\ \mbox{sgn}(\tilde{\beta}_j) ( | \tilde{\beta}_j | - \tfrac{1}{2} \lambda_1)_+ & \mbox{ if } & j = \arg \max_{j'} \{ | \hat{\beta}_{j'}^{\mbox{{\tiny (ols)}}} | \} \end{array} \right. & \mbox{ otherwise, } \end{array} \right. \end{eqnarray*} [[/math]]

where [math]\hat{\beta}_j[/math] denotes the [math]j[/math]-th element of the OLS estimator [math]\hat{\bbeta} = (\mathbf{X}^{\top} \mathbf{X})^{-1} \mathbf{X}^{\top} \mathbf{Y}[/math] and [math]\tilde{\beta}_j = (\mathbf{X}_{\ast,j}^{\top} \mathbf{X}_{\ast,j})^{-1} \mathbf{X}_{\ast,j}^{\top} \mathbf{Y}[/math] is the marginal least squares estimator obtained by regressing [math]\mathbf{Y}[/math] on the [math]j[/math]-th column of [math]\mathbf{X}[/math] alone.
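The equal-sign branch of the claimed expression can be checked numerically. The sketch below (illustrative values, not part of the exercise) minimizes [math]\| \mathbf{Y} - \mathbf{X} \bbeta \|_2^2 + \lambda_1 \| \bbeta \|_1[/math] by coordinate descent, working directly with [math]\mathbf{X}^{\top} \mathbf{X}[/math] and [math]\mathbf{X}^{\top} \mathbf{Y}[/math], and compares the result with the closed form [math]\hat{\beta}_j - \tfrac{1}{2} \lambda_1 (1+\rho)^{-1}[/math] for a case where both lasso estimates come out positive.

```python
import numpy as np

def soft(a, t):
    """Soft-thresholding operator sgn(a) * (|a| - t)_+."""
    return np.sign(a) * max(abs(a) - t, 0.0)

rho, lam = 0.5, 0.3                       # illustrative values
Sigma = np.array([[1.0, rho], [rho, 1.0]])
beta_ols = np.array([1.0, 0.6])           # chosen OLS solution (both positive)
z = Sigma @ beta_ols                      # z = X^T Y, so Sigma^{-1} z = beta_ols

# Coordinate descent for ||Y - X beta||^2 + lam * ||beta||_1:
# with X^T X as above, the update for beta_j is soft(z_j - rho*beta_k, lam/2).
beta = np.zeros(2)
for _ in range(200):
    beta[0] = soft(z[0] - rho * beta[1], lam / 2)
    beta[1] = soft(z[1] - rho * beta[0], lam / 2)

# Closed form for the equal-sign, both-nonzero branch (sgn terms positive here).
beta_exact = beta_ols - 0.5 * lam / (1 + rho)

print(beta, beta_exact)   # both approximately [0.9, 0.5]
```

Since both coordinates stay positive, the fixed point of the coordinate updates solves [math]\bbeta + \rho \bbeta_{\mbox{swap}} = \mathbf{z} - \tfrac{1}{2} \lambda_1 \mathbf{1}[/math], i.e. [math]\bbeta = \hat{\bbeta} - \tfrac{1}{2} \lambda_1 (1+\rho)^{-1} \mathbf{1}[/math], which is exactly the first branch.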