Abstract
Support for evidence-based policy making, and for the microeconometric evaluation methods needed to uncover causal effects, has grown over the last two decades. Today there is a growing appetite for credible and transparent evidence on whether a policy intervention achieves its expected outcomes. While data and methodological innovations have driven progress in the field of impact evaluation, progress has also been facilitated by a growing institutional commitment to evaluation in many countries. Despite this progress, no agreement has yet been reached in policy circles as to whether high standards of rigour in impact evaluation are needed in all situations. This is partly why support for evidence-based policy making is not as widespread in policy settings as in academic circles. This short article discusses this question. First, it reviews the progress achieved thus far in the impact evaluation literature, with a focus on labour economics. It then examines the obstacles that the impact evaluation profession needs to overcome for evaluation techniques to be used even more widely and to keep shaping the design of labour market policies. I focus on the two biggest challenges I have observed when discussing the implementation of impact evaluations with policy makers: first, impact evaluation is hard to implement, since data and techniques are not accessible to everyone and their implementation is time-consuming and often costly; second, and more importantly, it is not always clear to policy makers how to use the results of impact evaluations, making their benefits less evident.
I am grateful to Hilary Hoynes, Miguel Ángel Malo, Domenico Tabasso and the editors of this book for their comments.
Notes
- 1. Own calculations based on the data presented in Panhans and Singleton (2017), which was kindly provided by the authors. The data include articles published in 11 all-field journals plus the top four economics journals.
- 2. List and Rasul (2011) trace the first use of field experiments in labour economics to two historical examples: first, the Hawthorne plant experiment in the 1920s, which varied the amount of light for different groups in the workplace to assess the effect on the productivity of female assemblers; and second, the large-scale social experiment carried out by the US government (and led by Heather Ross) starting in 1968, which explored the behavioural effects of negative income taxation.
- 3.
References
Abadie, A., & Cattaneo, M. D. (2018). Econometric Methods for Program Evaluation. Annual Review of Economics, 10(1), 465–503. https://doi.org/10.1146/annurev-economics-080217-053402
Angrist, J. D. (1990). Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records. The American Economic Review, 80(3), 313–336.
Angrist, J. D., & Krueger, A. B. (1991). Does Compulsory School Attendance Affect Schooling and Earnings? The Quarterly Journal of Economics, 106(4), 979–1014. https://doi.org/10.2307/2937954
Angrist, J. D., & Krueger, A. B. (1999). Chapter 23—Empirical Strategies in Labor Economics. In O. C. Ashenfelter & D. Card (Eds.), Handbook of Labor Economics (pp. 1277–1366). Elsevier. https://doi.org/10.1016/S1573-4463(99)03004-7
Angrist, J. D., & Pischke, J.-S. (2010). The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics. Journal of Economic Perspectives, 24(2), 3–30. https://doi.org/10.1257/jep.24.2.3
Angrist, J. D., & Pischke, J.-S. (2014). Mastering ’Metrics: The Path from Cause to Effect. Princeton University Press.
Ashenfelter, O. (1987). The Case for Evaluating Training Programs with Randomized Trials. Economics of Education Review, 6(4), 333–338. https://doi.org/10.1016/0272-7757(87)90016-1
Ashenfelter, O., & Card, D. (1985). Using the Longitudinal Structure of Earnings to Estimate the Effect of Training Programs. The Review of Economics and Statistics, 67(4), 648–660. https://doi.org/10.2307/1924810
Ashenfelter, O., Layard, R., & Card, D. E. (1986). Handbook of Labor Economics (Vols. 1–2). Elsevier Science Pub. Co.
Ashenfelter, O., Layard, R., & Card, D. E. (1999). Handbook of Labor Economics (Vol. 3). Elsevier Science Pub. Co.
Ashenfelter, O., Layard, R., & Card, D. E. (2011). Handbook of Labor Economics (Vol. 4). Elsevier Science Pub. Co.
Athey, S., & Imbens, G. W. (2017). The State of Applied Econometrics: Causality and Policy Evaluation. Journal of Economic Perspectives, 31(2), 3–32. https://doi.org/10.1257/jep.31.2.3
Card, D., Kluve, J., & Weber, A. (2018). What Works? A Meta Analysis of Recent Active Labor Market Program Evaluations. Journal of the European Economic Association, 16(3), 894–931.
Cattaneo, M. D., Idrobo, N., & Titiunik, R. (2018). A Practical Introduction to Regression Discontinuity Designs (Vol. II). Cambridge University Press.
Christensen, G., Freese, J., & Miguel, E. (2019). Transparent and Reproducible Social Science Research: How to Do Open Science. University of California Press.
Christensen, G., & Miguel, E. (2018). Transparency, Reproducibility, and the Credibility of Economics Research. Journal of Economic Literature, 56(3), 920–980. https://doi.org/10.1257/jel.20171350
Clemens, M. A. (2017). The Meaning of Failed Replications: A Review and Proposal. Journal of Economic Surveys, 31(1), 326–342. https://doi.org/10.1111/joes.12139
Clemens, M. A., & Demombynes, G. (2011). When does rigorous impact evaluation make a difference? The case of the Millennium Villages. Journal of Development Effectiveness, 3(3), 305–339. https://doi.org/10.1080/19439342.2011.587017
Escudero, V., Kluve, J., López Mourelo, E., & Pignatti, C. (2019). Active Labour Market Programmes in Latin America and the Caribbean: Evidence from a Meta Analysis. The Journal of Development Studies, 55(12), 2644–2661.
Fougère, D., & Jacquemet, N. (2020). Policy Evaluation Using Causal Inference Methods (IZA Discussion Paper No. 12922). IZA—Institute of Labor Economics. Retrieved July 21, 2021, from https://www.iza.org/publications/dp/12922/policy-evaluation-using-causal-inference-methods
Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B., & Vermeersch, C. M. J. (2016). Impact Evaluation in Practice (2nd ed.). International Bank for Reconstruction and Development/The World Bank. https://openknowledge.worldbank.org/handle/10986/25030
Greenberg, D. H., Michalopoulos, C., & Robins, P. K. (2003). A Meta-Analysis of Government-Sponsored Training Programs. ILR Review, 57(1), 31–53.
Gruber, J. (1994). The Incidence of Mandated Maternity Benefits. The American Economic Review, 84(3), 622–641.
Heckman, J. J., LaLonde, R. J., & Smith, J. A. (1999). The Economics and Econometrics of Active Labor Market Programs. In O. Ashenfelter & D. Card (Eds.), Handbook of Labor Economics (pp. 1865–2097). Elsevier. Retrieved February 18, 2018, from https://experts.umich.edu/en/publications/chapter-31-the-economics-and-econometrics-of-active-labor-market
Hoces de la Guardia, F., Grant, S., & Miguel, E. (2021). A Framework for Open Policy Analysis. Science and Public Policy, 48(2), 154–163. https://doi.org/10.1093/scipol/scaa067
Kluve, J. (2010). The Effectiveness of European Active Labor Market Programs. Labour Economics, 17(6), 904–918. https://doi.org/10.1016/j.labeco.2010.02.004
Kluve, J., Puerto, S., Robalino, D., Romero, J. M., Rother, F., Stöterau, J., Weidenkaff, F., & Witte, M. (2019). Do Youth Employment Programs Improve Labor Market Outcomes? A Quantitative Review. World Development, 114, 237–253.
LaLonde, R. J. (1986). Evaluating the Econometric Evaluations of Training Programs with Experimental Data. The American Economic Review, 76(4), 604–620.
Leamer, E. E. (1983). Let’s Take the Con Out of Econometrics. The American Economic Review, 73(1), 31–43.
List, J. A., & Rasul, I. (2011). Chapter 2—Field Experiments in Labor Economics. In O. Ashenfelter & D. Card (Eds.), Handbook of Labor Economics (pp. 103–228). Elsevier. https://doi.org/10.1016/S0169-7218(11)00408-4
Meyer, B. D. (1995). Natural and Quasi-Experiments in Economics. Journal of Business & Economic Statistics, 13(2), 151–161. https://doi.org/10.2307/1392369
Miguel, E. (2021). Evidence on Research Transparency in Economics. Journal of Economic Perspectives, 35(3), 193–214. https://doi.org/10.1257/jep.35.3.193
Moffitt, R. A. (1999). Chapter 24—New Developments in Econometric Methods for Labor Market Analysis. In O. C. Ashenfelter & D. Card (Eds.), Handbook of Labor Economics (pp. 1367–1397). Elsevier. https://doi.org/10.1016/S1573-4463(99)03005-9
Panhans, M. T., & Singleton, J. D. (2017). The Empirical Economist’s Toolkit: From Models to Methods. History of Political Economy, 49(Supplement), 127–157. https://doi.org/10.1215/00182702-4166299
Solon, G. (1985). Work Incentive Effects of Taxing Unemployment Benefits. Econometrica, 53(2), 295–306. https://doi.org/10.2307/1911237
Stock, J. H. (2010). The Other Transformation in Econometric Practice: Robust Tools for Inference. Journal of Economic Perspectives, 24(2), 83–94. https://doi.org/10.1257/jep.24.2.83
Vooren, M., Haelermans, C., Groot, W., & van den Brink, H. M. (2019). The Effectiveness of Active Labor Market Policies: A Meta-Analysis. Journal of Economic Surveys, 33(1), 125–149. https://doi.org/10.1111/joes.12269
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Escudero, V. (2022). How Impact Evaluation Is Shaping the Design of Labour Market Policies. In: Goulart, P., Ramos, R., Ferrittu, G. (eds) Global Labour in Distress, Volume II. Palgrave Readers in Economics. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-89265-4_26
DOI: https://doi.org/10.1007/978-3-030-89265-4_26
Publisher Name: Palgrave Macmillan, Cham
Print ISBN: 978-3-030-89264-7
Online ISBN: 978-3-030-89265-4
eBook Packages: Economics and Finance (R0)