Statistics and Its Interface

Volume 6 (2013)

Number 2

Sparse bridge estimation with a diverging number of parameters

Pages: 231 – 242



Hosik Choi (Department of Informational Statistics and Institute of Basic Science, Hoseo University, Chungnam, Korea)

Yongdai Kim (Department of Statistics, Seoul National University, Seoul, Korea)

Sunghoon Kwon (School of Statistics, University of Minnesota, Minneapolis, Minn., U.S.A.)


The Bridge estimator with $ℓ^ν_ν$-penalty for some $ν \gt 0$ is a popular choice in penalized linear regression models. It is known that, when $ν ≤ 1$, the Bridge estimator produces sparse models, which allows us to control the model complexity. However, when $ν = 1$, the Bridge estimator can fail to identify the correct model, since consistent model selection requires strong sufficient conditions that rarely hold in general, and when $ν \gt 1$, it achieves no sparsity in parameter estimation. In this paper, we propose the sparse Bridge estimator, developed to find the correct sparse version of the Bridge estimator when $ν≥1$. Theoretically, the sparse Bridge estimator is asymptotically equivalent to the oracle Bridge estimator when the number of predictive variables diverges to infinity while remaining smaller than the sample size. Here, the oracle Bridge estimator is an ideal Bridge estimator obtained by deleting all irrelevant predictive variables in advance. Hence, the sparse Bridge estimator naturally inherits the properties of the Bridge estimator without losing correct model identification asymptotically. Numerical studies show that the sparse Bridge estimator can outperform other penalized estimators in finite samples.
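As a point of reference for the penalty discussed in the abstract, the following is a minimal illustrative sketch (not the authors' method) of fitting a Bridge-penalized linear regression by direct numerical minimization. The function name `bridge_fit` and the parameters `lam` (penalty weight) and `nu` (penalty exponent $ν$) are assumed names for this example; for $ν \gt 1$ the penalty is differentiable, so a generic smooth optimizer suffices.

```python
import numpy as np
from scipy.optimize import minimize

def bridge_fit(X, y, lam=0.1, nu=1.5):
    """Illustrative Bridge estimator: minimize
        ||y - X b||^2 / (2n) + lam * sum_j |b_j|^nu.
    For nu > 1 the objective is smooth, so BFGS is applicable.
    """
    n, p = X.shape

    def objective(b):
        resid = y - X @ b
        # least-squares loss plus the l_nu^nu bridge penalty
        return resid @ resid / (2 * n) + lam * np.sum(np.abs(b) ** nu)

    res = minimize(objective, np.zeros(p), method="BFGS")
    return res.x

# Toy usage: recover a sparse coefficient vector from simulated data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
beta = np.array([2.0, 0.0, -1.5, 0.0, 1.0])
y = X @ beta + 0.1 * rng.normal(size=200)
b_hat = bridge_fit(X, y, lam=0.1, nu=1.5)
```

Note that with $ν \gt 1$ the estimate `b_hat` is shrunk toward zero but not exactly sparse, which is precisely the limitation the sparse Bridge estimator is designed to address.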


Keywords: bridge, diverging number of parameters, lasso, regression, ridge, variable selection

2010 Mathematics Subject Classification

Primary 62J05. Secondary 62J07.

Published 10 May 2013