
Printed from https://ideas.repec.org/a/bla/anzsta/v46y2004i2p257-274.html

Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence

Author

  • Joseph E. Cavanaugh
Abstract
Model selection criteria are frequently developed by constructing estimators of discrepancy measures that assess the disparity between the 'true' model and a fitted approximating model. The Akaike information criterion (AIC) and its variants result from utilizing Kullback's directed divergence as the targeted discrepancy. The directed divergence is an asymmetric measure of separation between two statistical models, meaning that an alternative directed divergence can be obtained by reversing the roles of the two models in the definition of the measure. The sum of the two directed divergences is Kullback's symmetric divergence. In the framework of linear models, a comparison of the two directed divergences reveals an important distinction between the measures. When used to evaluate fitted approximating models that are improperly specified, the directed divergence which serves as the basis for AIC is more sensitive towards detecting overfitted models, whereas its counterpart is more sensitive towards detecting underfitted models. Since the symmetric divergence combines the information in both measures, it functions as a gauge of model disparity which is arguably more balanced than either of its individual components. With this motivation, the paper proposes a new class of criteria for linear model selection based on targeting the symmetric divergence. The criteria can be regarded as analogues of AIC and two of its variants: 'corrected' AIC or AICc and 'modified' AIC or MAIC. The paper examines the selection tendencies of the new criteria in a simulation study and the results indicate that they perform favourably when compared to their AIC analogues.
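As a rough illustration of the comparison the paper studies, the sketch below fits polynomial regressions of increasing order and scores each fit with AIC and with a symmetric-divergence analogue. The form KIC = −2 log L + 3k used here is the large-sample criterion from Cavanaugh's earlier work on the symmetric divergence; the corrected small-sample variants proposed in this paper differ, and all helper names are illustrative rather than taken from the paper.

```python
# Sketch: comparing AIC with a symmetric-divergence analogue (KIC) for
# polynomial order selection. KIC = -2 log L + 3k is an assumed large-sample
# form; the paper's corrected small-sample criteria are not reproduced here.
import numpy as np

def gaussian_loglik(resid, n):
    """Maximised Gaussian log-likelihood with the MLE of the error variance."""
    sigma2 = np.sum(resid ** 2) / n
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)

def aic(loglik, k):
    return -2.0 * loglik + 2 * k      # penalty from the directed divergence

def kic(loglik, k):
    return -2.0 * loglik + 3 * k      # heavier penalty from the symmetric divergence

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(-2.0, 2.0, n)
y = 1.0 + 0.5 * x - 0.3 * x ** 2 + rng.normal(0.0, 0.5, n)  # true order = 2

scores = {}
for order in range(1, 7):
    X = np.vander(x, order + 1)                    # columns x^order ... x^0
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares fit
    ll = gaussian_loglik(y - X @ beta, n)
    k = order + 2                                  # coefficients + error variance
    scores[order] = (aic(ll, k), kic(ll, k))

best_aic = min(scores, key=lambda o: scores[o][0])
best_kic = min(scores, key=lambda o: scores[o][1])
print(f"AIC picks order {best_aic}, KIC picks order {best_kic}")
```

Because the symmetric-divergence criterion charges three units per estimated parameter rather than two, it guards more strongly against overfitting, in line with the abstract's observation that the symmetric divergence combines sensitivity to both overfitted and underfitted models.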

Suggested Citation

  • Joseph E. Cavanaugh, 2004. "Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence," Australian & New Zealand Journal of Statistics, Australian Statistical Publishing Association Inc., vol. 46(2), pages 257-274, June.
  • Handle: RePEc:bla:anzsta:v:46:y:2004:i:2:p:257-274
    DOI: 10.1111/j.1467-842X.2004.00328.x

    Download full text from publisher

    File URL: https://doi.org/10.1111/j.1467-842X.2004.00328.x
    Download Restriction: no

    File URL: https://libkey.io/10.1111/j.1467-842X.2004.00328.x?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Thomas Gkelsinis & Alex Karagrigoriou, 2020. "Theoretical Aspects on Measures of Directed Information with Simulations," Mathematics, MDPI, vol. 8(4), pages 1-13, April.
    2. Filia Vonta & Kyriacos Mattheou & Alex Karagrigoriou, 2012. "On Properties of the (Φ, a)-Power Divergence Family with Applications in Goodness of Fit Tests," Methodology and Computing in Applied Probability, Springer, vol. 14(2), pages 335-356, June.
    3. Hafidi, Bezza, 2006. "A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling," Statistics & Probability Letters, Elsevier, vol. 76(15), pages 1647-1654, September.
    4. Carvajal-Rodríguez, A., 2020. "Multi-model inference of non-random mating from an information theoretic approach," Theoretical Population Biology, Elsevier, vol. 131(C), pages 38-53.
    5. Thomas Gkelsinis & Alex Karagrigoriou & Vlad Stefan Barbu, 2022. "Statistical inference based on weighted divergence measures with simulations and applications," Statistical Papers, Springer, vol. 63(5), pages 1511-1536, October.
    6. Nicolas Depraetere & Martina Vandebroek, 2014. "Order selection in finite mixtures of linear regressions," Statistical Papers, Springer, vol. 55(3), pages 871-911, August.
    7. Marhuenda, Yolanda & Morales, Domingo & del Carmen Pardo, María, 2014. "Information criteria for Fay–Herriot model selection," Computational Statistics & Data Analysis, Elsevier, vol. 70(C), pages 268-280.
    8. G. Avlogiaris & A. C. Micheas & K. Zografos, 2019. "A Criterion for Local Model Selection," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 406-444, December.
