
Printed from https://ideas.repec.org/a/eee/ecolet/v167y2018icp131-135.html

Randomization bias in field trials to evaluate targeting methods

Author

  • Potash, Eric

Abstract
This paper studies the evaluation of methods for targeting the allocation of limited resources to a high-risk subpopulation. We consider a randomized controlled trial to measure the difference in efficiency between two targeting methods and show that it is biased. An alternative, survey-based design is shown to be unbiased. Both designs are simulated for the evaluation of a policy to target lead hazard investigations using a predictive model. Based on our findings, we advised the Chicago Department of Public Health to use the survey design for their field trial. Our work anticipates further developments in economics that will be important as predictive modeling becomes an increasingly common policy tool.
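As a rough, purely illustrative sketch of the evaluation problem the abstract describes (not the paper's model, data, or either of its trial designs), the Python snippet below builds a synthetic population, treats two hypothetical targeting methods as noisy risk scores, and compares each method's full-rollout "efficiency" (hazard rate among its top-K targets) with an estimate obtained by investigating only a random sample of homes. All names, sizes, and distributions are assumptions made for the sketch.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic population; every size and distribution here is an illustrative assumption.
    N = 20_000        # homes in the population
    K = 1_000         # investigation budget if a single method were rolled out
    n_survey = 2_000  # homes investigated under a survey-style evaluation

    risk = rng.beta(2, 8, size=N)       # latent probability that a home has a hazard
    hazard = rng.binomial(1, risk)      # realized hazard indicator

    # Two hypothetical targeting methods: risk scores of differing accuracy.
    score_a = risk + rng.normal(0.0, 0.05, size=N)   # e.g., a predictive model
    score_b = risk + rng.normal(0.0, 0.15, size=N)   # e.g., a simpler heuristic

    def top_k(scores, k):
        """Indices of the k highest-scoring units."""
        return np.argsort(scores)[-k:]

    # "Efficiency" here: hazard rate among the homes a method would target
    # if it ranked the full population and spent the whole budget.
    eff_a = hazard[top_k(score_a, K)].mean()
    eff_b = hazard[top_k(score_b, K)].mean()

    # Survey-style evaluation: investigate a simple random sample of homes and
    # estimate each method's efficiency from the sampled homes that fall inside
    # its (known in advance) top-K target set.
    sample = rng.choice(N, size=n_survey, replace=False)
    in_a = np.isin(sample, top_k(score_a, K))
    in_b = np.isin(sample, top_k(score_b, K))
    est_a = hazard[sample[in_a]].mean()
    est_b = hazard[sample[in_b]].mean()

    print(f"full-rollout difference in efficiency: {eff_a - eff_b:+.3f}")
    print(f"survey-based estimate of the same gap: {est_a - est_b:+.3f}")

In this toy setup the survey-style estimate tracks the full-rollout difference because each sampled home's outcome is observed regardless of which method, if either, would have targeted it; whether this corresponds to the paper's proposed survey design should be checked against the full text.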

Suggested Citation

  • Potash, Eric, 2018. "Randomization bias in field trials to evaluate targeting methods," Economics Letters, Elsevier, vol. 167(C), pages 131-135.
  • Handle: RePEc:eee:ecolet:v:167:y:2018:i:c:p:131-135
    DOI: 10.1016/j.econlet.2018.03.012

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0165176518301034
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.econlet.2018.03.012?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Jon Kleinberg & Jens Ludwig & Sendhil Mullainathan & Ziad Obermeyer, 2015. "Prediction Policy Problems," American Economic Review, American Economic Association, vol. 105(5), pages 491-495, May.
    2. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    3. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    4. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    5. Angus Deaton & Nancy Cartwright, 2016. "Understanding and Misunderstanding Randomized Controlled Trials," Working Papers august_25.pdf, Princeton University, Woodrow Wilson School of Public and International Affairs, Research Program in Development Studies.
    6. Dana Chandler & Steven D. Levitt & John A. List, 2011. "Predicting and Preventing Shootings among At-Risk Youth," American Economic Review, American Economic Association, vol. 101(3), pages 288-292, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Alexander Ruder, 2019. "What Works at Scale? A Framework to Scale Up Workforce Development Programs," FRB Atlanta Community and Economic Development Discussion Paper 2019-1, Federal Reserve Bank of Atlanta.
    2. Margaret Dalziel, 2018. "Why are there (almost) no randomised controlled trial-based evaluations of business support programmes?," Palgrave Communications, Palgrave Macmillan, vol. 4(1), pages 1-9, December.
    3. Donald Moynihan, 2018. "A great schism approaching? Towards a micro and macro public administration," Journal of Behavioral Public Administration, Center for Experimental and Behavioral Public Administration, vol. 1(1).
    4. Christopher J. Ruhm, 2019. "Shackling the Identification Police?," Southern Economic Journal, John Wiley & Sons, vol. 85(4), pages 1016-1026, April.
    5. Vellore Arthi & James Fenske, 2018. "Polygamy and child mortality: Historical and modern evidence from Nigeria’s Igbo," Review of Economics of the Household, Springer, vol. 16(1), pages 97-141, March.
    6. Andreas C Drichoutis & Rodolfo M Nayga, 2020. "Economic Rationality under Cognitive Load," The Economic Journal, Royal Economic Society, vol. 130(632), pages 2382-2409.
    7. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    8. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    9. Vicky Chemutai & Hubert Escaith, 2017. "Measuring World Trade Organization (WTO) Accession Commitments and their Economic Effects," Journal of International Commerce, Economics and Policy (JICEP), World Scientific Publishing Co. Pte. Ltd., vol. 8(02), pages 1-27, June.
    10. Ashkan Pakseresht & Brandon R McFadden & Carl Johan Lagerkvist, 2017. "Consumer acceptance of food biotechnology based on policy context and upstream acceptance: evidence from an artefactual field experiment," European Review of Agricultural Economics, Oxford University Press and the European Agricultural and Applied Economics Publications Foundation, vol. 44(5), pages 757-780.
    11. Sutherland, Alex & Ariel, Barak & Farrar, William & De Anda, Randy, 2017. "Post-experimental follow-ups—Fade-out versus persistence effects: The Rialto police body-worn camera experiment four years on," Journal of Criminal Justice, Elsevier, vol. 53(C), pages 110-116.
    12. Hanushek, Eric A., 2021. "Addressing cross-national generalizability in educational impact evaluation," International Journal of Educational Development, Elsevier, vol. 80(C).
    13. Robin Maialeh, 2019. "Generalization of results and neoclassical rationality: unresolved controversies of behavioural economics methodology," Quality & Quantity: International Journal of Methodology, Springer, vol. 53(4), pages 1743-1761, July.
    14. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    15. Yonatan Eyal, 2020. "Self-Assessment Variables as a Source of Information in the Evaluation of Intervention Programs: A Theoretical and Methodological Framework," SAGE Open, , vol. 10(1), pages 21582440198, January.
    16. Boris Salazar-Trujillo & Daniel Otero Robles, 2019. "La revolución empírica en economía" [The empirical revolution in economics], Apuntes del Cenes, Universidad Pedagógica y Tecnológica de Colombia, vol. 38(68), pages 15-48, July.
    17. Gustavo Canavire-Bacarreza & Luis Castro Peñarrieta & Darwin Ugarte Ontiveros, 2021. "Outliers in Semi-Parametric Estimation of Treatment Effects," Econometrics, MDPI, vol. 9(2), pages 1-32, April.
    18. Anthony Yezer & Yishen Liu, 2017. "Can Differences Deceive? The Case of “Foreclosure Externalities”," Working Papers 2017-29, The George Washington University, Institute for International Economic Policy.
    19. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    20. Huber, Martin & Steinmayr, Andreas, 2017. "A Framework for Separating Individual Treatment Effects From Spillover, Interaction, and General Equilibrium Effects," Rationality and Competition Discussion Paper Series 21, CRC TRR 190 Rationality and Competition.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:ecolet:v:167:y:2018:i:c:p:131-135. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/ecolet .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.