Statistics and Its Interface
Volume 6 (2013)
High-dimensional regression and classification under a class of convex loss functions
Pages: 285–299
The weighted $L_1$ penalty has been used to refine the traditional Lasso in the linear regression model under quadratic loss. We make use of this penalty to investigate high-dimensional regression and classification under a wide class of convex loss functions. We show that, when the dimension grows nearly exponentially with the sample size, the penalized estimator possesses the oracle property for suitable weights, and its induced classifier is consistent with the optimal Bayes rule. Moreover, we propose two methods, called componentwise regression (CR) and penalized componentwise regression (PCR), for estimating the weights. Both theory and simulation studies provide supporting evidence for the advantage of PCR over CR in high-dimensional regression and classification. The effectiveness of the proposed method is illustrated using real data sets.
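To make the weighted $L_1$ idea concrete, the sketch below fits a weighted-Lasso estimator under quadratic loss by coordinate descent, with weights taken as inverse componentwise (marginal) regression coefficients in the spirit of CR. This is a minimal illustration, not the paper's estimator or tuning procedure; the solver, weight formula, and penalty level $\lambda$ here are assumptions for demonstration.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the absolute value: sign(z) * max(|z| - t, 0)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_lasso(X, y, weights, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - X b||^2 + lam * sum_j w_j |b_j|.

    A plain illustrative solver (an assumption, not the paper's algorithm).
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n       # per-coordinate curvature
    resid = y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            resid += X[:, j] * beta[j]      # add back coordinate j's contribution
            rho = X[:, j] @ resid / n       # partial correlation with residual
            # Each coordinate gets its own threshold lam * w_j
            beta[j] = soft_threshold(rho, lam * weights[j]) / col_sq[j]
            resid -= X[:, j] * beta[j]
    return beta
```

With CR-style weights $w_j = 1/|\hat\beta_j^{\mathrm{CR}}|$, where $\hat\beta_j^{\mathrm{CR}} = x_j^\top y / x_j^\top x_j$ is the marginal regression coefficient, noise coordinates receive large penalties and are shrunk to zero, while strong signals are penalized only lightly.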
Keywords: convex loss, high-dimensional model, optimal Bayes rule, oracle property, weighted $L_1$ penalty
Published 10 May 2013