Measuring the Impact of Educational Interventions: A Quantitative Approach

Chapter in: Advancing Surgical Education

Part of the book series: Innovation and Change in Professional Education (ICPE, volume 17)

Overview

This chapter will discuss impact evaluation, an important method of measuring the effectiveness of an educational intervention. This form of evaluation represents a subset of program evaluation and focuses on the outcomes and consequential events related to an educational intervention. In doing so, it incorporates several different quantitative methods and is typically reserved for stable, long-standing educational programs and curricula. Many of these methods are also used in program evaluation as a whole and in surgical research; readers are directed to Chaps. 23 (“Demystifying Program Evaluation for Surgical Education”, Battista et al.) and 30 (“Researching in Surgical Education: An Orientation”, Ajjawi and McIllhenny) for more information on these subjects. In addition to providing a working definition of impact evaluation, this chapter defines key concepts related to its successful use and delineates the most useful quantitative methods to employ.
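The chapter's own worked examples sit behind the access wall, so the following is only an illustrative sketch of the kind of quantitative comparison impact evaluation commonly draws on: contrasting hypothetical assessment scores recorded before and after an intervention with an independent-samples t-test and Cohen's d. The data, the choice of SciPy, and the pre/post design are assumptions made for illustration, not material taken from the chapter.

    # Illustrative only: a simple pre-/post-intervention comparison of
    # hypothetical assessment scores (not data or methods from the chapter).
    from statistics import mean, stdev
    from scipy import stats  # SciPy assumed available; any stats package would do

    pre_scores = [62, 71, 58, 66, 70, 64, 69, 61]    # hypothetical pre-intervention cohort
    post_scores = [74, 80, 69, 77, 83, 72, 79, 75]   # hypothetical post-intervention cohort

    # Independent-samples t-test for a difference in mean scores
    t_stat, p_value = stats.ttest_ind(post_scores, pre_scores)

    # Cohen's d as a standardised effect size, using the pooled standard deviation
    pooled_sd = (((len(pre_scores) - 1) * stdev(pre_scores) ** 2 +
                  (len(post_scores) - 1) * stdev(post_scores) ** 2) /
                 (len(pre_scores) + len(post_scores) - 2)) ** 0.5
    cohens_d = (mean(post_scores) - mean(pre_scores)) / pooled_sd

    print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")

In practice the chapter's discussion of validity, reliability and quasi-experimental design governs whether such a comparison is meaningful; the snippet only shows the mechanics of one common analysis.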

Author information

Correspondence to Jenepher A. Martin.

Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Martin, J.A. (2019). Measuring the Impact of Educational Interventions: A Quantitative Approach. In: Nestel, D., Dalrymple, K., Paige, J., Aggarwal, R. (eds) Advancing Surgical Education. Innovation and Change in Professional Education, vol 17. Springer, Singapore. https://doi.org/10.1007/978-981-13-3128-2_34

  • DOI: https://doi.org/10.1007/978-981-13-3128-2_34

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-3127-5

  • Online ISBN: 978-981-13-3128-2

  • eBook Packages: Education (R0)
