
On the prediction performance of the Lasso

Author

Listed:
  • Arnak Dalalyan
  • Mohamed Hebiri

    (LAMA - Laboratoire d'Analyse et de Mathématiques Appliquées - UPEM - Université Paris-Est Marne-la-Vallée - BEZOUT - Fédération de Recherche Bézout - CNRS - Centre National de la Recherche Scientifique - UPEC UP12 - Université Paris-Est Créteil Val-de-Marne - Paris 12 - CNRS - Centre National de la Recherche Scientifique)

  • Johannes C. Lederer

    (Seminar für Statistik - ETH Zürich - Eidgenössische Technische Hochschule - Swiss Federal Institute of Technology [Zürich])

Abstract
Although the Lasso has been extensively studied, the relationship between its prediction performance and the correlations of the covariates is not fully understood. In this paper, we give new insights into this relationship in the context of multiple linear regression. We show, in particular, that the incorporation of a simple correlation measure into the tuning parameter leads to a nearly optimal prediction performance of the Lasso even for highly correlated covariates. However, we also reveal that for moderately correlated covariates, the prediction performance of the Lasso can be mediocre irrespective of the choice of the tuning parameter. To illustrate our approach with an important application, we deduce nearly optimal rates for the least-squares estimator with total variation penalty.
(This abstract was borrowed from another version of this item.)
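
As a concrete illustration of how a correlation summary can enter the tuning parameter, the sketch below (Python, using numpy and scikit-learn) fits the Lasso with a regularization level scaled by the largest absolute off-diagonal entry of the empirical correlation matrix. The baseline level sigma*sqrt(2*log(p)/n) is the standard universal-type choice; the adjustment rule dividing it by sqrt(1 + max off-diagonal correlation) is a hypothetical stand-in for exposition, not the calibration derived in the paper.

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)

    # Simulate a sparse linear model with strongly correlated covariates.
    n, p, rho = 100, 200, 0.9
    cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)   # equicorrelated design
    X = rng.multivariate_normal(np.zeros(p), cov, size=n)
    beta = np.zeros(p)
    beta[:5] = 1.0                                        # 5 active covariates
    sigma = 0.5
    y = X @ beta + rng.normal(scale=sigma, size=n)

    # Universal-type baseline tuning level: sigma * sqrt(2 log(p) / n).
    lambda_base = sigma * np.sqrt(2 * np.log(p) / n)

    # Hypothetical correlation adjustment (illustrative only): shrink the
    # tuning parameter when covariates are highly correlated, summarising
    # correlation by the largest absolute off-diagonal entry of the
    # empirical correlation matrix.
    corr = np.corrcoef(X, rowvar=False)
    max_offdiag = np.max(np.abs(corr - np.eye(p)))
    lambda_adj = lambda_base / np.sqrt(1.0 + max_offdiag)

    # scikit-learn's Lasso minimises (1/(2n))||y - X b||^2 + alpha * ||b||_1.
    fit = Lasso(alpha=lambda_adj, max_iter=50_000).fit(X, y)

    # In-sample prediction error ||X(beta_hat - beta)||^2 / n.
    pred_error = np.mean((X @ (fit.coef_ - beta)) ** 2)
    print(f"Prediction error with adjusted lambda: {pred_error:.4f}")

The simulated design uses pairwise correlation 0.9, the highly correlated regime in which the abstract reports that a correlation-aware choice of the tuning parameter preserves near-optimal prediction. The total variation penalty mentioned in the last sentence of the abstract is, in its standard one-dimensional form, the sum of absolute successive differences |theta_{i+1} - theta_i|, so the corresponding penalized least-squares estimator can essentially be rewritten as a Lasso after a linear change of variables.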

Suggested Citation

  • Arnak Dalalyan & Mohamed Hebiri & Johannes C. Lederer, 2017. "On the prediction performance of the Lasso," Post-Print halshs-02599138, HAL.
  • Handle: RePEc:hal:journl:halshs-02599138
    DOI: 10.3150/15-BEJ756

    Download full text from publisher

    To our knowledge, this item is not available for download. To check whether it is available, there are three options:
    1. Check below whether another version of this item is available online.
    2. Check on the provider's web page whether it is in fact available.
    3. Search for a similarly titled item that may be available.

    Other versions of this item:

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Pierre Bellec & Alexandre Tsybakov, 2015. "Sharp oracle bounds for monotone and convex regression through aggregation," Working Papers 2015-04, Center for Research in Economics and Statistics.
    2. Laura Freijeiro‐González & Manuel Febrero‐Bande & Wenceslao González‐Manteiga, 2022. "A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates," International Statistical Review, International Statistical Institute, vol. 90(1), pages 118-145, April.
    3. Alexandre Belloni & Mingli Chen & Oscar Hernan Madrid Padilla & Zixuan Wang, 2019. "High Dimensional Latent Panel Quantile Regression with an Application to Asset Pricing," Papers 1912.02151, arXiv.org, revised Aug 2022.
    4. Jacob Bien & Irina Gaynanova & Johannes Lederer & Christian L. Müller, 2019. "Prediction error bounds for linear regression with the TREX," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 28(2), pages 451-474, June.
    5. Gold, David & Lederer, Johannes & Tao, Jing, 2020. "Inference for high-dimensional instrumental variables regression," Journal of Econometrics, Elsevier, vol. 217(1), pages 79-111.
    6. Pawan Gupta & Marianna Pensky, 2018. "Solution of Linear Ill-Posed Problems Using Random Dictionaries," Sankhya B: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 80(1), pages 178-193, May.
    7. Sheng Xu & Zhou Fan, 2021. "Iterative Alpha Expansion for estimating gradient‐sparse signals from linear measurements," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(2), pages 271-292, April.
    8. Wanling Xie & Hu Yang, 2023. "Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 107(3), pages 469-507, September.
    9. Tanin Sirimongkolkasem & Reza Drikvandi, 2019. "On Regularisation Methods for Analysis of High Dimensional Data," Annals of Data Science, Springer, vol. 6(4), pages 737-763, December.
    10. Tung Duy Luu & Jalal Fadili & Christophe Chesneau, 2020. "Sharp oracle inequalities for low-complexity priors," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(2), pages 353-397, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hal:journl:halshs-02599138. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: CCSD. General contact details of provider: https://hal.archives-ouvertes.fr/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.