
Printed from https://ideas.repec.org/a/oup/econjl/v134y2023i657p193-219..html

The Null Result Penalty

Authors

Listed:
  • Felix Chopra
  • Ingar Haaland
  • Christopher Roth
  • Andreas Stegmann
Abstract
We examine how the evaluation of research studies in economics depends on whether a study yielded a null result. Studies with null results are perceived to be less publishable, of lower quality, less important, and less precisely estimated than studies with large and statistically significant results, even when holding constant all other study features, including the sample size and the precision of the estimates. The null result penalty is of similar magnitude among PhD students and journal editors. The penalty is larger when experts predict a large effect and when statistical uncertainty is communicated with p-values rather than standard errors. Our findings highlight the value of pre-results review.

Suggested Citation

  • Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2023. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
  • Handle: RePEc:oup:econjl:v:134:y:2023:i:657:p:193-219.

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/ej/uead060
    Download Restriction: Access to full text is restricted to subscribers.

As access to this document is restricted, you may want to look for a different version of it below.


    References listed on IDEAS

1. Peter Andre & Carlo Pizzinelli & Christopher Roth & Johannes Wohlfart, 2022. "Subjective Models of the Macroeconomy: Evidence From Experts and Representative Samples," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 89(6), pages 2958-2991.
    2. Jonas Hjort & Diana Moreira & Gautam Rao & Juan Francisco Santini, 2021. "How Research Affects Policy: Experimental Evidence from 2,150 Brazilian Municipalities," American Economic Review, American Economic Association, vol. 111(5), pages 1442-1480, May.
    3. Alberto Abadie, 2020. "Statistical Nonsignificance in Empirical Economics," American Economic Review: Insights, American Economic Association, vol. 2(2), pages 193-208, June.
    4. Stefano DellaVigna & Devin Pope, 2018. "Predicting Experimental Results: Who Knows What?," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2410-2456.
    5. David Card & Stefano DellaVigna & Patricia Funk & Nagore Iriberri, 2020. "Are Referees and Editors in Economics Gender Neutral?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 135(1), pages 269-327.
6. Peter Andre & Armin Falk, 2021. "What's Worth Knowing? Economists' Opinions about Economics," ECONtribute Discussion Papers Series 102, University of Bonn and University of Cologne, Germany.
    7. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    8. Peter Andre & Ingar Haaland & Christopher Roth & Johannes Wohlfart, 2021. "Narratives about the Macroeconomy," CEBI working paper series 21-18, University of Copenhagen. Department of Economics. The Center for Economic Behavior and Inequality (CEBI).
    9. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    10. Bogdanoski, Aleksandar & Foster, Andrew & Karlan, Dean & Miguel, Edward, 2020. "Pre-results Review at the Journal of Development Economics: Lessons learned," MetaArXiv 5yacr, Center for Open Science.
11. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    12. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    13. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
14. Anna, Petrenko, 2016. "Labelling of Finished Products as a Component of Information Support for the Marketing Activities of Enterprises in the Vegetable Product Subcomplex," Agricultural and Resource Economics: International Scientific E-Journal, Agricultural and Resource Economics: International Scientific E-Journal, vol. 2(1), March.
    15. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    16. Jonathan de Quidt & Johannes Haushofer & Christopher Roth, 2018. "Measuring and Bounding Experimenter Demand," American Economic Review, American Economic Association, vol. 108(11), pages 3266-3302, November.
    17. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    18. David Card & Stefano DellaVigna, 2013. "Nine Facts about Top Journals in Economics," Journal of Economic Literature, American Economic Association, vol. 51(1), pages 144-161, March.
    19. Kasy, Maximilian, 2019. "Selective publication of findings: Why does it matter, and what should we do about it?," MetaArXiv xwngs, Center for Open Science.
    20. Daniel J. Benjamin & Sebastian A. Brown & Jesse M. Shapiro, 2013. "Who Is ‘Behavioral’? Cognitive Ability And Anomalous Preferences," Journal of the European Economic Association, European Economic Association, vol. 11(6), pages 1231-1255, December.
    21. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    22. Berinsky, Adam J. & Druckman, James N. & Yamamoto, Teppei, 2021. "Publication Biases in Replication Studies," Political Analysis, Cambridge University Press, vol. 29(3), pages 370-384, July.

    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Patrick Dylong & Paul Setzepfand & Silke Uebelmesser, 2023. "Priming Attitudes Towards Immigrants: Implications for Migration Research and Survey Design," CESifo Working Paper Series 10306, CESifo.
    2. Alexander L. Brown & Taisuke Imai & Ferdinand M. Vieider & Colin F. Camerer, 2024. "Meta-analysis of Empirical Estimates of Loss Aversion," Journal of Economic Literature, American Economic Association, vol. 62(2), pages 485-516, June.
    3. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    4. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    5. Garg, Prashant & Fetzer, Thiemo, 2024. "Causal Claims in Economics," I4R Discussion Paper Series 183, The Institute for Replication (I4R).
    6. Abel Brodeur, Nikolai M. Cook, Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    7. Fang, Ximeng & Innocenti, Stefania, 2023. "Increasing the acceptability of carbon taxation: The role of social norms and economic reasoning," INET Oxford Working Papers 2023-25, Institute for New Economic Thinking at the Oxford Martin School, University of Oxford.
    8. Burro, Giovanni & Castagnetti, Alessandro, 2022. "Will I tell you that you are smart (dumb)? Deceiving Others about their IQ or about a Random Draw," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    3. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    4. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    5. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    6. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    7. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
8. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    9. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    10. Garg, Prashant & Fetzer, Thiemo, 2024. "Causal Claims in Economics," OSF Preprints u4vgs, Center for Open Science.
    11. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    12. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    13. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    14. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    15. Uwe Hassler & Marc‐Oliver Pohle, 2022. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," International Statistical Review, International Statistical Institute, vol. 90(2), pages 397-410, August.
    16. Patrick Vu, 2022. "Can the Replication Rate Tell Us About Publication Bias?," Papers 2206.15023, arXiv.org, revised Jul 2022.
    17. Roman Horvath & Ali Elminejad & Tomas Havranek, 2020. "Publication and Identification Biases in Measuring the Intertemporal Substitution of Labor Supply," Working Papers IES 2020/32, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    18. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    19. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    20. Alexander L. Brown & Taisuke Imai & Ferdinand M. Vieider & Colin F. Camerer, 2024. "Meta-analysis of Empirical Estimates of Loss Aversion," Journal of Economic Literature, American Economic Association, vol. 62(2), pages 485-516, June.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:oup:econjl:v:134:y:2023:i:657:p:193-219.. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Oxford University Press or the person in charge (email available below). General contact details of provider: https://edirc.repec.org/data/resssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.