exercise:206e6bd1d6

The iteratively reweighted least squares (IRLS) algorithm for the numerical evaluation of the ridge logistic regression estimator requires the inversion of a <math>p \times p</math>-dimensional matrix at each iteration. In Section [[guide:7d6298862b#sect.ridgeEfficientCalculation|Computationally efficient evaluation]], the singular value decomposition (SVD) of the design matrix is exploited to avoid the inversion of such a matrix in the numerical evaluation of the ridge regression estimator. Use this trick to show that the computational burden of the IRLS algorithm may be reduced to a single SVD prior to the iterations plus the inversion of an <math>n \times n</math>-dimensional matrix at each iteration (as is done in <ref name="Eilers2001">Eilers, P. H. C., Boer, J. M., van Ommen, G.-J., and van Houwelingen, H. C. (2001). Classification of microarray data with penalized logistic regression. In ''Microarrays: Optical technologies and informatics'', volume 4266, pages 187–198.</ref>).
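
A minimal numerical sketch of the intended reduction is given below; the function name, the stopping rule, and the guard on the weights are illustrative choices, not prescribed by the exercise. One route to the claimed reduction is the push-through identity <math>(\mathbf{X}^{\top} \mathbf{W} \mathbf{X} + \lambda \mathbf{I}_{p})^{-1} \mathbf{X}^{\top} \mathbf{W} \mathbf{z} = \mathbf{X}^{\top} \mathbf{W}^{1/2} (\mathbf{W}^{1/2} \mathbf{X} \mathbf{X}^{\top} \mathbf{W}^{1/2} + \lambda \mathbf{I}_{n})^{-1} \mathbf{W}^{1/2} \mathbf{z}</math>, with <math>\mathbf{W}</math> and <math>\mathbf{z}</math> the usual IRLS weight matrix and working response. The matrix <math>\mathbf{X} \mathbf{X}^{\top}</math> is obtained once from the SVD of the design matrix, so that each iteration only involves an <math>n \times n</math> system:

<syntaxhighlight lang="python">
import numpy as np

def irls_ridge_logistic_svd(X, y, lam, n_iter=50, tol=1e-8):
    """Sketch of IRLS for ridge logistic regression with one SVD up front
    and only an n x n linear system per iteration.
    X: n x p design matrix, y: 0/1 response, lam: penalty parameter."""
    n, p = X.shape
    # One SVD prior to the iterations: X = U diag(d) V^T, so X X^T = U diag(d^2) U^T.
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    XXt = (U * d**2) @ U.T                            # n x n, computed once
    eta = np.zeros(n)                                 # linear predictor X beta
    for _ in range(n_iter):
        prob = 1.0 / (1.0 + np.exp(-eta))             # fitted success probabilities
        w = np.maximum(prob * (1.0 - prob), 1e-10)    # IRLS weights, guarded away from zero
        z = eta + (y - prob) / w                      # working response
        sw = np.sqrt(w)
        # Solve the n x n system from the push-through identity instead of
        # inverting the p x p matrix X^T W X + lam I_p.
        M = sw[:, None] * XXt * sw[None, :] + lam * np.eye(n)
        a = np.linalg.solve(M, sw * z)
        eta_new = XXt @ (sw * a)                      # updated linear predictor X beta
        if np.max(np.abs(eta_new - eta)) < tol:
            eta = eta_new
            break
        eta = eta_new
    # Recover the p-dimensional estimate once, from the stored SVD factors:
    # beta = X^T W^{1/2} a = V diag(d) U^T (W^{1/2} a).
    beta = Vt.T @ (d * (U.T @ (sw * a)))
    return beta
</syntaxhighlight>

When <math>p \gg n</math>, the per-iteration cost thus drops from roughly the order of <math>p^{3}</math> to the order of <math>n^{3}</math>, with the SVD incurred only once before the iterations.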
