
A framework for open policy analysis

Authors
  • Fernando Hoces de la Guardia
  • Sean Grant
  • Edward Miguel
Abstract
The evidence-based policy movement promotes the use of empirical evidence to inform policy decision-making. While several social science disciplines are undergoing a ‘credibility revolution’ focused on openness and replication, policy analysis has yet to systematically embrace transparency and reproducibility. We argue that policy analysis should adopt the open research practices increasingly espoused in related disciplines to advance the credibility of evidence-based policy making. We first discuss the importance of evidence-based policy in an era of increasing disagreement about facts, analysis, and expertise. We then present a novel framework for ‘open’ policy analysis (OPA) and describe how to achieve it, drawing on examples of recent policy analyses that have incorporated open research practices such as transparent reporting, open data, and code sharing. We conclude with recommendations on how key stakeholders in evidence-based policy can make OPA the norm and thus safeguard trust in using empirical evidence to inform important public policy decisions.
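
As an illustration of what the practices named in the abstract can look like in code, the sketch below is a minimal, hypothetical example; it is not taken from the paper or its materials. Explicitly sourced inputs stand in for open data, a scripted calculation for code sharing, and an output that lists its own assumptions for transparent reporting. All names and numbers are placeholders.

```python
# Hypothetical sketch of an "open" policy analysis workflow.
# None of these figures or sources come from the paper; they are placeholders.

from dataclasses import dataclass

@dataclass
class Assumption:
    name: str
    value: float
    source: str  # citation or justification, reported alongside the result

# Open data: inputs are explicit and traceable, not hard-coded mid-analysis.
ASSUMPTIONS = [
    Assumption("wage_elasticity", -0.1, "placeholder: analyst judgment"),
    Assumption("affected_workers", 500_000, "placeholder: survey estimate"),
]

def policy_estimate(assumptions: list[Assumption]) -> float:
    """Open analysis: every step from inputs to the headline number is code."""
    params = {a.name: a.value for a in assumptions}
    # Illustrative calculation only: employment change = elasticity * workers
    return params["wage_elasticity"] * params["affected_workers"]

if __name__ == "__main__":
    # Transparent reporting: the output lists the assumptions behind it,
    # so a reader can vary any input and re-run the analysis.
    estimate = policy_estimate(ASSUMPTIONS)
    print(f"Headline estimate: {estimate:,.0f} jobs")
    for a in ASSUMPTIONS:
        print(f"  assumption {a.name} = {a.value} ({a.source})")
```

Publishing such a script together with its inputs would let any reader reproduce the headline number, or vary an assumption and re-run the analysis.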

Suggested Citation

  • Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
  • Handle: RePEc:oup:scippl:v:48:y:2021:i:2:p:154-163.

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/scipol/scaa067
    Download Restriction: Access to full text is restricted to subscribers.

    As access to this document is restricted, you may want to search for a different version of it.



    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Alex Eble & Peter Boone & Diana Elbourne, 2017. "On Minimizing the Risk of Bias in Randomized Controlled Trials in Economics," The World Bank Economic Review, World Bank, vol. 31(3), pages 687-707.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    5. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    6. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    7. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    8. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    9. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    10. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    11. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    12. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    13. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    14. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    15. Guido W. Imbens, 2020. "Potential Outcome and Directed Acyclic Graph Approaches to Causality: Relevance for Empirical Practice in Economics," Journal of Economic Literature, American Economic Association, vol. 58(4), pages 1129-1179, December.
    16. Brodeur, Abel & Esterling, Kevin & Ankel-Peters, Jörg & Bueno, Natália S & Desposato, Scott & Dreber, Anna & Genovese, Federica & Green, Donald P & Hepplewhite, Matthew & de la Guardia, Fernando Hoces, 2024. "Promoting Reproducibility and Replicability in Political Science," Department of Economics, Working Paper Series qt23n3n3dg, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    17. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2024. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    18. Fišar, Miloš & Greiner, Ben & Huber, Christoph & Katok, Elena & Ozkes, Ali & Management Science Reproducibility Collaboration, 2023. "Reproducibility in Management Science," Department for Strategy and Innovation Working Paper Series 03/2023, WU Vienna University of Economics and Business.
    19. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    20. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
