Abstract
Background
Panel management (PM) curricula in internal medicine (IM) residency programs often assign performance measures which may not address the varied interests or needs of resident-learners.
Aim
To evaluate a self-directed learning (SDL)–based PM curriculum.
Setting
University-based primary care practice in Burlington, Vermont.
Participants
Thirty-five internal medicine residents participated.
Program Description
Residents completed a PM curriculum that integrated SDL, electronic health record (EHR)–driven performance feedback, mentorship, and autonomy to set learning and patient care goals.
Program Evaluation
Pre/post-curricular surveys assessed EHR tool acceptability, weekly curricular surveys and post-curricular focus groups assessed resident perceptions and goals, and an interrupted time series analysis of care gap closure rates compared the pre-intervention and intervention periods. A majority of residents (28–32; 80–91%) completed the surveys and focus groups. Residents found the EHR tools acceptable and valued protected time, mentorship, and autonomy to set goals. A total of 13,313 patient visits were analyzed. There was no significant difference in care gap closure rates between the pre-intervention period and the first intervention period (p=0.44).
Discussion
A longitudinal PM curriculum that incorporated SDL and goal setting with EHR-driven performance feedback was well received by residents; however, it did not significantly impact the rate of care gap closure.
INTRODUCTION
Panel management (PM) is a proactive, team-based approach to population health (PH) that systematically addresses gaps in care1,2,3,4 and often utilizes electronic health record (EHR) technologies to provide performance feedback in the form of clinic measures or quality metrics.5 Internal medicine (IM) residents in graduate medical education (GME) are expected to become competent in utilizing EHR data and performing PM, and to gain skills that meet the core competency of practice-based learning and improvement (PBLI) set by the Accreditation Council for Graduate Medical Education (ACGME).6,7 The ACGME defines PBLI as “one of the defining characteristics of being a physician,” with emphasis on personal responsibility for lifelong learning and continuous self-evaluation to improve patient care.6 Self-directed learning (SDL) is therefore an integral skill for physicians to develop: it is the process by which individuals take the initiative, with or without help, to identify learning needs, formulate goals, identify and utilize resources, apply learned knowledge, and evaluate learning outcomes.8,9,10,11,12
Previous PM curricula have been associated with modest improvements in quality metrics and confidence in PM skills13,14,15 and have highlighted the importance of self-reflection in closing performance gaps.16,17,18 Longitudinal EHR-driven feedback has been shown to help residents develop competency in accepting and applying performance feedback data, reflecting the growing role of technology in GME and patient care.19 Residents are often provided performance data through panel registries or chart audits that lack regular-interval feedback.15,17,18,20,21,22,23 Usually there is a set topic of interest tied to a clinic performance measure, which may not address residents’ varied interests or needs across postgraduate years.13,14,15,18,24,25
We sought to create a PM curriculum that integrates SDL and interval performance feedback using EHR-based tools while also allowing learner autonomy to set learning and patient care goals to address concepts of PM and PH. The purpose of this paper is to describe the development, implementation, and evaluation of our curriculum.
SETTING AND PARTICIPANTS
The PM curriculum was a requirement of the University of Vermont Medical Center IM residency program at the mixed faculty-resident practice in Burlington, VT. The curriculum was implemented from July 2019 through mid-June 2020. The resident clinic consists of 35 residents (27 categorical and 8 primary care residents) divided into 5 cohorts that rotate through clinic every fifth week (4+1 block schedule). Each resident has their own panel of approximately 80–100 patients.
PROGRAM DESCRIPTION
We structured the curriculum based on the theoretical model proposed by Sawatsky et al. for SDL in the GME setting, which recommends a “trigger” to help uncover a knowledge gap, then forming learning objectives, utilizing resources to gain knowledge, applying knowledge, and self-reflecting on learning outcomes.9
Each clinic week, residents participated in a 30-min precepted session with faculty in which they received a care gap report to act as the “trigger” to review and reflect on their performance and uncover knowledge gaps in learning and patient care. Care gap reports consisted of the percentage of “care gaps,” or overdue health maintenance items, that were closed the previous continuity clinic week. Residents also had access to individualized dashboards, which displayed clinic performance measures for their patient panels. Residents formulated learning objectives by reviewing their data and setting goals for the week. Residents were given autonomy to choose a care gap goal, such as diabetes management (e.g., foot exams, A1c checks), and a learning goal (e.g., how to manage hyperglycemia). Residents had 1 half-day per clinic week to enact their plan of self-study, explore educational resources, identify patients overdue for care, and coordinate patient outreach with support staff.
PROGRAM EVALUATION
Program evaluation included pre- and post-curricular surveys, weekly clinic surveys during the curriculum, and post-curriculum focus groups. The surveys and interview guide were developed by the authors. Pre- and post-surveys occurred between July 2019 and June 2020, were adapted to assess ease of use of the dashboard and EHR tools,26 and consisted of multiple-choice questions that utilized a 5-point Likert scale (strongly agree to strongly disagree). The weekly clinic surveys occurred from August 2019 to early June 2020, and consisted of multiple-choice questions and open-ended responses to assess care gap and learning goals, successes and barriers, educational tools utilized, and confidence in PM skills. All survey data were managed using REDCap electronic data capture tools hosted at the University of Vermont.27 Focus groups capturing resident experiences and perceptions were conducted during April and May 2020 and were moderated by an experienced facilitator (L.A.H., K.N.H., or A.G.K.) with an interview guide.
Care gap data were collected from the EHR; the primary outcome of interest was the rate of care gap closure, used to assess resident behavior change and curriculum outcomes. Given the substantial impact of the COVID-19 pandemic on preventive care, three time periods were evaluated: the pre-intervention period (July 2018–June 2019), the first intervention period (July 2019–March 2020), and the second intervention period (COVID-19 period; April through early June 2020).
STATA 16.1 (Stata Corporation, College Station, TX) was used for all quantitative analyses, with p<0.05 required for statistical significance. Agreement between pre- and post-survey responses was compared using McNemar’s test. Weekly clinic surveys were analyzed descriptively. The care gap closure rate was analyzed using an interrupted time series analysis during the time periods. The focus group transcripts were evaluated using qualitative content analysis through an iterative process until major themes were identified.
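The authors performed the interrupted time series analysis in Stata. As a rough illustration of what such an analysis estimates, the following is a minimal segmented-regression sketch on synthetic monthly closure rates (all numbers invented for demonstration; this is not the authors' analysis code, and a full analysis would also model autocorrelation and both break points):

```python
# Illustrative sketch only: segmented regression for an interrupted
# time series, fit by ordinary least squares with NumPy.
import numpy as np

def segmented_regression(rates, break_idx):
    """Fit rate ~ b0 + b1*t + b2*post + b3*(t - break_idx)*post.

    b2 estimates the level change and b3 the slope change at the
    start of the (hypothetical) intervention period.
    """
    t = np.arange(len(rates), dtype=float)
    post = (t >= break_idx).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, (t - break_idx) * post])
    coef, *_ = np.linalg.lstsq(X, np.asarray(rates, dtype=float), rcond=None)
    return coef  # [intercept, pre-slope, level change, slope change]

# Synthetic data: a flat 10% closure rate for 12 pre-intervention
# months, then a gentle upward trend after the break at month 12.
pre = [10.0] * 12
post = [10.0 + 0.5 * m for m in range(1, 10)]
b0, b1, b2, b3 = segmented_regression(pre + post, break_idx=12)
print(b2, b3)  # level change and slope change at the break
```

In a real analysis, a nonsignificant level- and slope-change estimate at the first break (as in our p=0.44 comparison) would indicate no detectable shift in closure rates when the curriculum began.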
According to the policy defining activities which constitute research at the University of Vermont and University of Vermont Medical Center, this work met criteria for improvement activities exempt from ethics review.
RESULTS
Of the 35 residents who participated in the curriculum, 91% (32) completed the pre- and post-surveys, 80% (28) completed at least six of the eight weekly clinic surveys, and 83% (29) participated in the focus groups. Data from weeks 7–8 were excluded due to disruption from the pandemic; therefore, analysis included residents who completed at least six surveys (n=168 surveys).
Pre- and post-survey responses showed that post-curriculum, residents were more likely to report that the dashboard was useful in their job, easy to use, clear and understandable, and likely benefited patients’ health overall (p<0.05). Weekly clinic surveys revealed the common successes and barriers (Table 1). Of the 168 surveys, the most frequently reported care gap goal was social determinants of health screening at 11% (19), followed by behavioral health screening at 8% (14) and advance directive completion at 8% (14). Focus groups revealed that residents valued protected time, dashboard data, perceived high-yield resources, peer and faculty mentorship, autonomy to set learning and care gap goals, and an overall increased sense of panel ownership. However, residents reported that goal setting felt arbitrary when it was not pertinent to patients seen during a given clinic week, and they did not feel accountable for patients outside of their panel. Residents felt that lack of continuity with their patient panel was a barrier to PM and patient care. Many residents found that the care gap reports were difficult to interpret and did not represent their performance or efforts.
There were 13,313 patient visits available for analysis, including 7395 visits in the pre-intervention period, 4697 visits in the first intervention period, and 1221 visits in the COVID-19 intervention period. The percentage of care gaps closed per month ranged from 3 to 21%. Overall, there was statistically significant variation in the care gap closure rate across the three time periods (p<0.001) (Fig. 1). There was no significant difference in the rates between the pre-intervention period and the first intervention period (p=0.44). The rate of care gap closure was significantly higher in the COVID-19 period than in the first intervention period (p<0.001). Across all time periods, fall risk screening and behavioral health screening were among the top five care gaps closed.
DISCUSSION
We developed a longitudinal PM curriculum that incorporated EHR-driven performance feedback and SDL concepts to address PBLI required by the ACGME. Resident experiences based on surveys and focus group data showed that EHR tools, individualized goal setting, protected time, and mentorship were well-received and important for resident engagement and patient ownership, but performance feedback needs to be accurate, timely, and easy to interpret for resident acceptability.
To our knowledge, ours is the only PM curriculum in the literature to give residents autonomy in choosing care gap metrics and learning goals. Previous curricula have allowed autonomy in goal setting specific to performance or allowed residents to come to a consensus as a group, but the quality metrics were still assigned.19,28 Focus groups revealed residents felt empowered and appreciated this autonomy, but they found it problematic when goals were not relevant to patient visits. Previous work has suggested using resident-sensitive metrics, or metrics that are actionable and appropriate for resident panel populations and align with educational goals.19,29,30,31 Continuous review of whether metrics are appropriate for residents may be needed. It is also plausible that more time would increase the likelihood of residents having pertinent patient visits, but future work is needed to address how to better align learner needs with patient care.
Most residents did not find the care gap reports to be a helpful tool, as they did not feel the reports were representative of their perceived performance or efforts. Residents elaborated that care gaps were not always completed for reasons outside of their control, such as EHR tools not capturing completion, continuity issues, competing patient priorities, and time constraints. The lack of context in the reports also proved problematic, since residents could not easily reflect on the data and attribute it to specific encounters. These limitations of EHR metrics and barriers to PM and patient care are not specific to GME and remain prevalent in real-world primary care. Residents may benefit from more guidance, since a previous curriculum found that faculty-guided interpretation helped residents gain competency in receiving, interpreting, and applying performance feedback by addressing accuracy concerns and preventing defensive reactions.19 A common challenge for educators is balancing the tension between assessment and feedback to promote learning and growth;32 future iterations of this curriculum may benefit from continuing to shift resident perceptions of EHR-driven performance feedback toward a formative tool rather than a summative assessment.
Despite an increased sense of patient ownership, multiple residents voiced the concern that performance metrics should not be based on patients outside of their panel. We would argue that in primary care, both residents and attendings may see colleagues’ patients from time to time outside of their panel, and the bigger perspective of PH does not draw lines at individual panels. This suggests a need for more emphasis and education around concepts of PH and a team-based approach to patient care.
Resident behavior change and patient outcomes were assessed via the care gap closure rate, which trended toward improvement but did not reach statistical significance in the pre-pandemic period. The intervention period may have been too short to capture significant change. Unsurprisingly, care gap closure percentages fell at the onset of the pandemic. The subsequent rise in the care gap completion rate across this period was robust at a time when patient access was still limited for most preventive care and visits were primarily telehealth based. This finding, in conjunction with behavioral health and social determinants of health screening ranking among the top care gaps completed during that period, may reflect residents’ awareness of and interest in continuing to address preventive health needs that were particularly pertinent and accessible amidst the challenges of the pandemic, as well as improved proficiency in utilizing the EHR to deliver care.
There are several limitations to note. The surveys and interview guide were not tested for validity evidence or piloted. Data were self-reported and therefore subject to recall bias. Like most studies in GME, we had a limited assessment of behavioral change and patient-level outcomes.33 This curriculum may only be feasible in programs with an X+Y model and protected time for PM. The findings of this single-institution evaluation may not be generalizable to other residencies.
More work is needed to investigate how PM curricula can not only prepare residents for future practice but also impact patient outcomes.
Abbreviations
- PH: Population health
- PM: Panel management
- EHR: Electronic health record
- ACGME: Accreditation Council for Graduate Medical Education
- PBLI: Practice-based learning and improvement
- IM: Internal medicine
- GME: Graduate medical education
- SDL: Self-directed learning
References
Berwick DM, Nolan TW, Whittington J. The Triple Aim: Care, Health, and Cost. Health Aff (Millwood). 2008;27(3):759-769. doi:https://doi.org/10.1377/hlthaff.27.3.759
Bodenheimer T, Wagner EH, Grumbach K. Improving Primary Care for Patients with Chronic Illness. JAMA. 2002;288(14):1775-1779. doi:https://doi.org/10.1001/jama.288.14.1775
Bodenheimer T, Wagner EH, Grumbach K. Improving Primary Care for Patients with Chronic Illness: the Chronic Care Model, Part 2. JAMA. 2002;288(15):1909-1914. doi:https://doi.org/10.1001/jama.288.15.1909
Neuwirth EE, Schmittdiel JA, Tallman K, Bellows J. Understanding Panel Management: a Comparative Study of an Emerging Approach to Population Care. Perm J. 2007;11(3):12-20. doi:https://doi.org/10.7812/tpp/07-040
Jones SS, Rudin RS, Perry T, Shekelle PG. Health Information Technology: an Updated Systematic Review with a Focus on Meaningful Use. Ann Intern Med. 2014;160(1):48-54. doi:https://doi.org/10.7326/M13-1531
[ACGME] Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in internal medicine. https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/CPRResidency2019.pdf. Published July 1, 2009. Updated June 9, 2019. Accessed December 9, 2019.
[ACGME] Accreditation Council for Graduate Medical Education. Supplemental Guide: Internal Medicine. https://www.acgme.org/Portals/0/PDFs/Milestones/InternalMedicineSupplementalGuide.pdf?ver=2020-12-02-124831-067 Published November 1, 2020. Accessed April 3, 2021.
Knowles M. Self-directed learning: a guide for learners and teachers. Chicago, IL: Associated Press; 1975.
Sawatsky AP, Ratelle JT, Bonnes SL, et al. A Model of Self-directed Learning in Internal Medicine Residency: a Qualitative Study Using Grounded Theory. BMC Med Educ. 2017;17:31. doi:https://doi.org/10.1186/s12909-017-0869-4
Knowles MS, Holton III EF, Swanson RA. The adult learner: the definitive classic in adult education and human resource development. 6th ed. Boston, MA: Elsevier/Butterworth Heinemann; 2005.
Murad MH, Coto-Yglesias F, Varkey P, Prokop LJ, Murad AL. The Effectiveness of Self-directed Learning in Health Professions Education: a Systematic Review. Med Educ. 2010;44(11):1057-1068. doi:https://doi.org/10.1111/j.1365-2923.2010.03750.x
Brydges R, Butler D. A reflective analysis of medical education research on self-regulation in learning and practice. Medical Education. 2012;46:71-79.
Salem JK, Jones RR, Sweet DB, Hasan S, Torregosa-Arcay H, Clough L. Improving Care in a Resident Practice for Patients with Diabetes [published correction appears in J Grad Med Educ. 2011 Sep;3(3):446]. J Grad Med Educ. 2011;3(2):196-202. https://doi.org/10.4300/JGME-D-10-00113.1
Janson SL, Cooke M, McGrath KW, Kroon LA, Robinson S, Baron RB. Improving Chronic Care of Type 2 Diabetes Using Teams of Interprofessional Learners. Acad Med. 2009;84(11):1540-1548. doi:https://doi.org/10.1097/ACM.0b013e3181bb2845
Holmboe ES, Prince L, Green M. Teaching and Improving Quality of Care in a Primary Care Internal Medicine Residency Clinic. Acad Med. 2005;80(6):571-577. doi:https://doi.org/10.1097/00001888-200506000-00012
Duffy FD, Holmboe ES. Self-assessment in Lifelong Learning and Improving Performance in Practice: Physician Know Thyself. JAMA. 2006;296(9):1137-1139. doi:https://doi.org/10.1001/jama.296.9.1137
Hildebrand C, Trowbridge E, Roach MA, Sullivan AG, Broman AT, Vogelman B. Resident Self-assessment and Self-reflection: University of Wisconsin-Madison’s Five-Year Study. J Gen Intern Med. 2009;24(3):361-365. doi:https://doi.org/10.1007/s11606-009-0904-1
Hadley Strout EK, Landrey AR, MacLean CD, Sobel HG. Internal Medicine Resident Experiences with a 5-Month Ambulatory Panel Management Curriculum. J Grad Med Educ. 2018;10(5):559-565. doi:https://doi.org/10.4300/JGME-D-18-00172.1
Haynes C, Yamamoto M, Dashiell-Earp C, Gunawardena D, Gupta R, Simon W. Continuity Clinic Practice Feedback Curriculum for Residents: a Model for Ambulatory Education. J Grad Med Educ. 2019;11(2):189-195. doi:https://doi.org/10.4300/JGME-D-18-00714.1
Carney PA, Eiff MP, Green LA, et al. Transforming primary Care Residency Training: a Collaborative Faculty Development Initiative Among Family Medicine, Internal Medicine, and Pediatric Residencies. Acad Med. 2015;90(8):1054-1060. doi:https://doi.org/10.1097/ACM.0000000000000701
Holmboe E, Scranton R, Sumption K, Hawkins R. Effect of Medical Record Audit and Feedback on Residents’ Compliance with Preventive Health Care Guidelines. Acad Med. 1998;73(8):901-903. doi:https://doi.org/10.1097/00001888-199808000-00016
Kogan JR, Reynolds EE, Shea JA. Effectiveness of Report Cards Based on Chart Audits of Residents’ Adherence to Practice Guidelines on Practice Performance: a Randomized Controlled Trial. Teach Learn Med. 2003;15(1):25-30. doi:https://doi.org/10.1207/S15328015TLM1501_06
Kern DE, Harris WL, Boekeloo BO, Barker LR, Hogeland P. Use of an Outpatient Medical Record Audit to Achieve Educational Objectives: Changes in Residents’ Performances Over Six Years. J Gen Intern Med. 1990;5(3):218-224. doi:https://doi.org/10.1007/BF02600538
Boggan JC, Swaminathan A, Thomas S, Simel DL, Zaas AK, Bae JG. Improving Timely Resident Follow-Up and Communication of Results in Ambulatory Clinics Utilizing a Web-Based Audit and Feedback Module. J Grad Med Educ. 2017;9(2):195-200. doi:https://doi.org/10.4300/JGME-D-16-00460.1
Simon SR, Soumerai SB. Failure of Internet-Based Audit and Feedback to Improve Quality of Care Delivered by Primary Care Residents. Int J Qual Health Care. 2005;17(5):427-431. doi:https://doi.org/10.1093/intqhc/mzi044
Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly. 1989;13(3):319-340.
Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research Electronic Data Capture (REDCap) – a Metadata-driven Methodology and Workflow Process for Providing Translational Research Informatics Support. J Biomed Inform. 2009;42(2):377-381.
Warm EJ, Jacobs N, Neville J, Schauer D. Defense of the Measures: a Tool for Engaging Integrated Care Teams in Outcomes Measurement. J Grad Med Educ. 2014;6(2):370-372. doi:https://doi.org/10.4300/JGME-D-14-00066.1
Epstein JA, Noronha C, Berkenblit G. Smarter Screen Time: Integrating Clinical Dashboards Into Graduate Medical Education. J Grad Med Educ. 2020;12(1):19-24. doi:https://doi.org/10.4300/JGME-D-19-00584.1
Rosenbluth G, Tong MS, Condor Montes SY, Boscardin C. Trainee and Program Director Perspectives on Meaningful Patient Attribution and Clinical Outcomes Data. J Grad Med Educ. 2020;12(3):295-302. doi:https://doi.org/10.4300/JGME-D-19-00730.1
Schumacher DJ, Holmboe ES, van der Vleuten C, Busari JO, Carraccio C. Developing Resident-Sensitive Quality Measures: a Model from Pediatric Emergency Medicine. Acad Med. 2018;93(7):1071-1078. doi:https://doi.org/10.1097/ACM.0000000000002093
Watling CJ, Ginsburg S. Assessment, Feedback and the Alchemy of Learning. Med Educ. 2019;53(1):76-85. doi:https://doi.org/10.1111/medu.13645
Coyle A, Helenius I, Cruz CM, et al. A Decade of Teaching and Learning in Internal Medicine Ambulatory Education: a Scoping Review. J Grad Med Educ. 2019;11(2):132-142. doi:https://doi.org/10.4300/JGME-D-18-00596.1
Acknowledgements
Contributors
The authors wish to acknowledge the following contributors from the Robert Larner M.D. College of Medicine at the University of Vermont: Kathryn N. Huggett, Ph.D. Director, The Teaching Academy and Assistant Dean for Medical Education, Leigh Ann Holterman, Ph.D., Director of Curricular Evaluation and Assessment, The Teaching Academy, and the research librarians available at the Dana Medical Library.
Author information
Contributions
The original idea was conceived by EKHS and EAW with support from the coauthors and mentors AGK and HGS. The curriculum was created and developed in equal parts by EKHS and EAW. The pre/post-survey was developed by AGK and HGS with input and review by EKHS and EAW. EKHS and EAW developed the weekly surveys with review and revisions by AGK and HGS. The manuscript was written by EKHS and revised and edited by all coauthors. Survey data descriptive analysis was done by EKHS and checked by BJT, with the weekly survey data analysis being heavily revised. All statistical and quantitative data analyses were done by BJT. The interview guide protocol was developed by EKHS and revised by EAW, AGK, and HGS.
Ethics declarations
Conflict of Interest
The authors declare that they do not have a conflict of interest.
Disclaimers
The views expressed in the submitted are the authors’ own and not an official position of the institution.
Presentations
Society of General Internal Medicine (SGIM) New England Regional Meeting, Boston, MA, November 2, 2019
The American College of Physicians (ACP) Annual Meeting Virtual Poster Session, April 23–25, 2020
Society of General Internal Medicine (SGIM) Annual Meeting Virtual Poster Session, May 6–9, 2020
Department of Medicine Quality Showcase Virtual Poster Session, Burlington, VT, May 8, 2020
Department of Medicine Quality Showcase Virtual Oral Presentation, Burlington, VT, May 7, 2021
Supplementary Information
ESM 1 (PDF 185 kb)
Cite this article
Hadley Strout, E.K., Wahlberg, E.A., Kennedy, A.G. et al. A Mixed-Methods Program Evaluation of a Self-directed Learning Panel Management Curriculum in an Internal Medicine Residency Clinic. J GEN INTERN MED 37, 2246–2250 (2022). https://doi.org/10.1007/s11606-022-07507-3