R. Mazumder, J. Friedman, T. Hastie (2011)
SparseNet: Coordinate Descent With Nonconvex Penalties. Journal of the American Statistical Association, 106
J. Friedman, T. Hastie, Holger Hofling, R. Tibshirani (2007)
Pathwise Coordinate Optimization. The Annals of Applied Statistics, 1
Cun-Hui Zhang (2010)
Nearly unbiased variable selection under minimax concave penalty. Annals of Statistics, 38
H. Zou, T. Hastie (2005)
Regularization and variable selection via the elastic net. J. R. Statist. Soc. B, 67
E. Greenshtein, Y. Ritov (2004)
Persistence in high-dimensional linear predictor selection and the virtue of overparametrization. Bernoulli, 10
Wenjiang Fu (1998)
Penalized Regressions: The Bridge versus the Lasso. Journal of Computational and Graphical Statistics, 7
M. Osborne, B. Presnell, B. Turlach (2000)
On the LASSO and its Dual. Journal of Computational and Graphical Statistics, 9
H. Zou, T. Hastie (2005)
Addendum: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 67
E. Candès, T. Tao (2009)
The Power of Convex Relaxation: Near-Optimal Matrix Completion. IEEE Transactions on Information Theory, 56
T. Hastie, R. Tibshirani, J. Friedman (2001)
The Elements of Statistical Learning
L. Breiman (1995)
Better subset selection using the non-negative garotte
E. J. Candès (2006)
Compressive Sampling
S. Chen, D. Donoho, M. Saunders (1998)
Atomic Decomposition by Basis Pursuit. SIAM J. Sci. Comput., 20
N. Meinshausen, P. Bühlmann (2010)
Stability selection. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 72
D. Witten, R. Tibshirani (2009)
Covariance-regularized regression and classification for high dimensional problems. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 71
P. Zhao, Bin Yu (2006)
On Model Selection Consistency of Lasso. J. Mach. Learn. Res., 7
These resampling techniques are feasible since computation is efficient: as pointed out by Rob, (block) co-ordinatewise algorithms are often extremely fast.
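To illustrate why such co-ordinatewise algorithms are fast, here is a minimal pure-Python sketch of cyclic coordinate descent for the lasso: each coordinate update is a closed-form soft-thresholding step. The function names and the toy orthogonal design below are illustrative assumptions of mine, not material from the paper or its discussion:

```python
def soft_threshold(z, lam):
    """Soft-thresholding operator: sign(z) * max(|z| - lam, 0)."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso_cd(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1.

    X is a list of rows; predictors are assumed centred/standardised.
    A serious implementation (e.g. glmnet) maintains residuals
    incrementally rather than recomputing them as done here.
    """
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_sweeps):
        for j in range(p):
            # inner product of column j with the partial residual
            # (response minus the fit from all other coordinates)
            rho = sum(
                X[i][j] * (y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j))
                for i in range(n)
            ) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n  # column scale
            b[j] = soft_threshold(rho, lam) / z
    return b

# Tiny orthogonal design: each coordinate update is then exact.
X = [[1, 0], [0, 1], [-1, 0], [0, -1]]
y = [2, 1, -2, -1]
print(lasso_cd(X, y, lam=0.6))  # second coefficient is shrunk to exactly 0
```

On an orthogonal design a single sweep already solves the problem; in general, repeated sweeps converge to the lasso solution, as shown by Tseng (2001) in the reference below.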
(2007)
Bayesian relaxation: boosting, the Lasso, and other Lα norms
J. Schelldorfer, P. Bühlmann, S. van de Geer (2010)
Estimation for High-Dimensional Linear Mixed-Effects Models Using ℓ1-Penalization. Scandinavian Journal of Statistics, 38
R. Tibshirani, Jonathan Taylor (2010)
The solution path of the generalized lasso. Annals of Statistics, 39
S. Sardy, P. Tseng (2004)
On the Statistical Analysis of Smoothing by Maximizing Dirty Markov Random Field Posterior Distributions. Journal of the American Statistical Association, 99
G. Allen, R. Tibshirani (2009)
Transposable Regularized Covariance Models with an Application to Missing Data Imputation. The Annals of Applied Statistics, 4
S. van de Geer, P. Bühlmann (2009)
On the conditions used to prove oracle results for the Lasso. Electronic Journal of Statistics, 3
D. Witten, R. Tibshirani, T. Hastie (2009)
A penalized matrix decomposition, with applications to sparse principal components and canonical correlation analysis. Biostatistics, 10
Trevor Park, G. Casella (2008)
The Bayesian Lasso. Journal of the American Statistical Association, 103
M. Yuan, Yi Lin (2007)
Model selection and estimation in the Gaussian graphical model. Biometrika, 94
R. Tibshirani, H. Höfling, R. Tibshirani (2011)
Nearly-Isotonic Regression. Technometrics, 53
P. Tseng (2001)
Convergence of a block coordinate descent method for nondifferentiable minimization. J. Optimzn Theor. Appl., 109
H. Zou, Runze Li (2008)
One-step Sparse Estimates in Nonconcave Penalized Likelihood Models. Annals of Statistics, 36
M. Clyde, E. George (2004)
Model Uncertainty
T. Wu, K. Lange (2008)
Coordinate descent algorithms for lasso penalized regression. Ann. Appl. Statist., 2
H. Zou (2006)
The Adaptive Lasso and Its Oracle Properties. Journal of the American Statistical Association, 101
D. Donoho, I. Johnstone (1994)
Ideal spatial adaptation by wavelet shrinkage. Biometrika, 81
R. Tibshirani, M. Saunders, Saharon Rosset, Ji Zhu, K. Knight (2005)
Sparsity and smoothness via the fused lasso. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 67
Nicolai Meinshausen (2007)
Relaxed Lasso. Comput. Stat. Data Anal., 52
H. Bondell, A. Krishna, Sujit Ghosh (2010)
Joint Variable Selection for Fixed and Random Effects in Linear Mixed-Effects Models. Biometrics, 66
N. Meinshausen, L. Meier, P. Bühlmann (2008)
p-Values for High-Dimensional Regression. Journal of the American Statistical Association, 104
P. Bickel, Y. Ritov, A. Tsybakov (2008)
Simultaneous Analysis of Lasso and Dantzig Selector. Annals of Statistics, 37
F. Bunea, A. Tsybakov, M. Wegkamp (2007)
Sparsity oracle inequalities for the Lasso. arXiv: Statistics Theory
D. Donoho, Michael Elad, V. Temlyakov (2006)
Stable recovery of sparse overcomplete representations in the presence of noise. IEEE Transactions on Information Theory, 52
Jianqing Fan, Runze Li (2001)
Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties. Journal of the American Statistical Association, 96
E. Candès, T. Tao (2007)
The Dantzig selector: statistical estimation when p is much larger than n. Ann. Statist., 35
N. Städler, P. Bühlmann, S. van de Geer (2010)
ℓ1-penalization for mixture regression models. TEST, 19
R. Mazumder, T. Hastie, R. Tibshirani (2010)
Spectral Regularization Algorithms for Learning Large Incomplete Matrices. Journal of Machine Learning Research, 11
P. Bühlmann, S. van de Geer (2011)
Statistics for High-Dimensional Data: Methods, Theory and Applications
J. Friedman, T. Hastie, R. Tibshirani (2010)
Regularization Paths for Generalized Linear Models via Coordinate Descent. Journal of Statistical Software, 33
I. Jolliffe, N. Trendafilov, Mudassir Uddin (2003)
A Modified Principal Component Technique Based on the LASSO. Journal of Computational and Graphical Statistics, 12
Edward George, R. McCulloch (1993)
Variable selection via Gibbs sampling. Journal of the American Statistical Association, 88
R. Tibshirani (1996)
Regression Shrinkage and Selection via the Lasso. Journal of the Royal Statistical Society, Series B, 58
T. Hastie, C. Mallows (1993)
Discussion of 'A Statistical View of Some Chemometrics Regression Tools'. Technometrics, 35
M. Yuan, Yi Lin (2006)
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 68
(2009)
Information theory tutorial: compressed sensing
B. Efron (1979)
Bootstrap Methods: Another Look at the Jackknife. Annals of Statistics, 7
N. Städler, P. Bühlmann (2009)
Missing values: sparse inverse covariance estimation and an extension to sparse regression. Statistics and Computing, 22
P. Green (1995)
Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82
S. van de Geer (2008)
High-Dimensional Generalized Linear Models and the Lasso. Annals of Statistics, 36
L. Meier, S. van de Geer, P. Bühlmann (2008)
The group lasso for logistic regression. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 70
E. Candès, M. Wakin, Stephen Boyd (2007)
Enhancing Sparsity by Reweighted ℓ1 Minimization. Journal of Fourier Analysis and Applications, 14
S. Sardy, A. Bruce, P. Tseng (2000)
Block Coordinate Relaxation Methods for Nonparametric Wavelet Denoising. Journal of Computational and Graphical Statistics, 9
Adam Rothman, E. Levina, Ji Zhu (2010)
Discussion of 'Stability selection' by N. Meinshausen and P. Bühlmann. Journal of the Royal Statistical Society, Series B, 72
Y. Benjamini, Y. Hochberg (1995)
Controlling the false discovery rate: a practical and powerful approach to multiple testing. Journal of the Royal Statistical Society, Series B, 57
H. Brunk, R. Barlow, D. Bartholomew, J. Bremner (1973)
Statistical inference under order restrictions: the theory and application of isotonic regression. International Statistical Review, 41
Abbas Khalili, Jiahua Chen (2007)
Variable Selection in Finite Mixture of Regression Models. Journal of the American Statistical Association, 102
A. Genkin, D. Lewis, D. Madigan (2007)
Large-Scale Bayesian Logistic Regression for Text Categorization. Technometrics, 49
Finally, for recovering the active set S0, one requires that P(Ŝ = S0) is large, tending to 1 as p, n → ∞.
B. Efron, T. Hastie, I. Johnstone, R. Tibshirani (2004)
Least Angle Regression. Annals of Statistics, 32
N. Meinshausen, Peter Buhlmann (2006)
High-dimensional graphs and variable selection with the Lasso. Annals of Statistics, 34
I. Frank, J. Friedman (1993)
A Statistical View of Some Chemometrics Regression Tools. Technometrics, 35
P. Tseng, S. Yun (2008)
A coordinate gradient descent method for nonsmooth separable minimization. Mathematical Programming, 117
Summary. In the paper I give a brief review of the basic idea and some history and then discuss some developments since the original paper on regression shrinkage and selection via the lasso.
Journal of the Royal Statistical Society: Series B (Statistical Methodology) – Wiley
Published: Jun 1, 2011