[The following notes were generated by Andrea Belovich, PhD.]
The fourth session of the Winter 2021 IAMSE Web Seminar Series, “USMLE Step-1 is Going to Pass/Fail, Now what do we do?”, was presented on January 28, 2021 by Drs. Bruce Morgenstern and Brenda Roman. Dr. Morgenstern is the Vice Dean for Academic and Clinical Affairs at the Roseman University College of Medicine and the immediate past President of the Alliance for Clinical Education. Dr. Roman is the Associate Dean of Medical Education and a Professor of Psychiatry at the Wright State University Boonshoft School of Medicine, and the current President of the Alliance for Clinical Education. In this webinar, “Step 1 Going Pass-Fail: Are We Just Kicking the Can Down the Road?,” the speakers discussed issues with the current residency application process that will remain to be solved after USMLE Step 1 becomes pass/fail (P/F). The audience also engaged in a rich discussion about how Undergraduate Medical Education (UME) can help address these issues.
Dr. Morgenstern began the webinar with an overview of the original purpose of the USMLE Step series exams by quoting the USMLE’s mission statement, “To provide to licensing authorities meaningful information from assessments of physician characteristics including medical knowledge, skills, values, and attitudes that are important to the provision of safe and effective patient care” [1]. He contrasted this with the Step exams’ off-label use to distinguish residency applicants from one another and questioned how well exams consisting of multiple-choice questions can assess abstract qualities such as values and skills. Given these limitations and the recent discontinuation of the Step 2 Clinical Skills (CS) exam, Dr. Morgenstern raised the question of how the USMLE will continue to move forward in advancing its mission statement.
Dr. Morgenstern then pointed out that medical education, in general, faces a similar challenge. The ultimate goal of medical education is, of course, to produce “good” physicians. However, since the field lacks an operational definition of what a “good” physician is, medical education struggles to articulate how and when a candidate is determined to be one. Competency-based medical education has evolved to meet this challenge, although Dr. Morgenstern added the caveat that “competency” itself may not be a sufficient descriptor, given the “bare minimum” stigma often attached to the word. Beyond clinical knowledge, the areas commonly used to measure competence are also difficult to assess, including clinical skills, professional identity development, professionalism, values and attitudes, health systems science-related curricular content, and growth mindset. For example, while Dr. Morgenstern believes the ability to communicate in a culturally humble way is paramount, assessing that skill is far more difficult, especially when the focus on USMLE Step 1 has created a parallel curriculum in medical schools.
The driving factor behind this focus on USMLE Step 1 is the residency “match frenzy,” which has encouraged program directors to find ways to screen and rank applicants. Citing a collection of Bryan Carmody’s research [2], Dr. Morgenstern acknowledged that Step 1 and Step 2 Clinical Knowledge (CK) scores predict passage of standardized specialty boards, but noted that, in a more global context, Step 1 scores do not correlate with other measures of overall success in residency. Despite the data, perception is reality, so the practice of relying upon Step 1 scores to develop interviewee lists for residency slots has been generally accepted by both students and program directors, causing significant stress for students [3, 4, 5].
In addition to the overwhelming number of applications per residency slot, a significant contributor to the reliance on ranking tools is that residency program directors lack the staffing needed to conduct holistic application review. Because program directors feel pressure to select applicants who are likely to pass specialty boards, they prefer metrics that allow them to rank applicants meaningfully. Subjective measures, such as the Medical Student Performance Evaluation (MSPE), tend to lack consistency in clinical grading and suffer from “modifier inflation” in their descriptions of medical students. Even the term “excellent” is no longer the highest superlative, since medical students as a group are generally very intelligent and capable individuals (reminiscent of the Lake Wobegon effect). Taken together, this leads to a general lack of trust between UME and Graduate Medical Education (GME) regarding the MSPE.
Dr. Morgenstern concluded his portion of the webinar by positing that the current residency application process itself may not be fair. ERAS itself contributes greatly to the overwhelming number of applications, and the National Resident Matching Program algorithm is proprietary, so it is not possible to verify whether or how it favors applicants or programs, to say nothing of diversity, equity, and inclusion concerns. Finally, he emphasized that the underlying issues with the residency match will remain after USMLE Step 1 transitions to P/F: going forward, how can program directors screen applicants and conduct holistic reviews? Lake Wobegon effect notwithstanding, 50% of all applicants will perpetually be “below average.” To protect these students, Dr. Morgenstern emphasized that it is incumbent upon UME to develop tools intentionally designed to identify “good” residents, rather than relying on the minimum-competency licensure exams previously co-opted for the match process.
Dr. Roman then continued the webinar by expanding upon problems that will remain inherent to the residency match process despite the USMLE Step 1 P/F transition. In particular, the change does not address student stress; rather, that stress will probably shift from Step 1 to Step 2 CK, as program directors are likely to focus more on Step 2 CK (a stronger predictor of clinical performance than Step 1 [6]). This shift is likely to affect the clerkship years and may detract from patient care and clinical learning, especially in light of the discontinuation of Step 2 CS. Medical schools and programs are still evaluated by match rate, and the job performance of Student Affairs Deans is tied to successful matches. Students with well-explained gaps in their education may still be viewed negatively by program directors for requiring more time to complete medical school. Students and programs will still try to game the system. Despite these issues, almost all students successfully match to a residency program.
A brief overview of the history of the match process from its inception in 1920 shows that these underlying themes have plagued the transition from UME to GME for an entire century. Dr. Roman then asked the audience for their input on a series of open-ended and polling questions to help identify solutions to these long-standing challenges.
When asked, “What characteristics of a future ‘good’ clinician can the pre-clerkship educators identify about their students?” audience answers included “professionalism, teamwork, emotional intelligence, diligence, maturity, hard-working, and curious.” These responses underscored the collective observation that the system has become over-reliant on measuring knowledge by numbers.
Following up on this question, Dr. Roman asked, “What can the pre-clerkship educators document in a narrative format about students that would be helpful for consideration by program directors?” Audience answers included, “Professionalism, ability to think critically, curiosity, grit, and determination,” which described resident characteristics seen as desirable by program directors.
Dr. Roman then asked the audience whether educators should “agree to a common/standardized vocabulary in describing medical students in evaluations and letters of recommendation?” and “Should other measures be used in the application process regarding fit to the medical specialty, like ‘dexterity assessments’ for skills-based specialties or communication assessments for ‘people-centered specialties’?” The audience responded overwhelmingly positively to both questions.
Finally, Dr. Roman polled the audience with the following question: “Since data regarding ‘predicting success in residency’ is not robust, a radical solution would be for students to identify a specialty of choice, geographic preference and a few key attributes about what they would like in a program (academic medical center versus community-based program, research opportunities) and do away with interviews…?” The audience was more evenly split in their response to this question.
As the webinar drew to a conclusion, Dr. Roman reminded the audience that medical schools can improve the residency application process, and the trust between UME and GME, by developing better ways to identify and address professionalism issues. This may include involving basic science faculty, who can detect concerning patterns early, and distinguishing (and documenting) professionalism concerns that can be remediated from those that reflect a long-standing pattern of behavior.
Finally, Dr. Roman left the audience with a provocative proposal intended to spark thought and discussion. With the disclaimer that the idea in its current form was not intended for serious consideration, Dr. Roman suggested the following:
- Programs would define the optimal fit for residents in their programs
- Applicants would define their own characteristics
- The match algorithm would use no rank lists, but would instead prioritize compatibility between applicant and program (a toy sketch of such a match follows this list)
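To make the thought experiment concrete, the following Python sketch illustrates one way a compatibility-driven match could work: both sides describe themselves with the same attributes (specialty of choice, geographic preference, academic versus community setting, and research opportunities, echoing the poll question above), and slots are filled in descending order of compatibility score. Every name, attribute, and weight here is invented for illustration; this is neither the NRMP algorithm nor a concrete proposal from the webinar.

```python
# A toy, hypothetical sketch of a compatibility-based match: neither side
# submits a rank list; instead, applicants and programs describe themselves
# with shared attributes, and an algorithm fills slots in order of
# descending compatibility. All names, attributes, and weights are invented.

from itertools import product


def compatibility(applicant: dict, program: dict) -> float:
    """Score agreement on a few shared attributes (all hypothetical)."""
    score = 0.0
    if applicant["specialty"] == program["specialty"]:
        score += 3.0  # specialty of choice dominates
    if applicant["region"] == program["region"]:
        score += 2.0  # geographic preference
    if applicant["setting"] == program["setting"]:
        score += 1.0  # academic medical center vs. community-based
    if applicant["wants_research"] and program["has_research"]:
        score += 1.0  # research opportunities
    return score


def compatibility_match(applicants, programs):
    """Fill slots greedily by descending compatibility; no rank lists."""
    pairs = sorted(
        product(applicants, programs),
        key=lambda ap: compatibility(*ap),
        reverse=True,
    )
    placements = {}
    open_slots = {p["name"]: p["slots"] for p in programs}
    for applicant, program in pairs:
        if applicant["name"] in placements or open_slots[program["name"]] == 0:
            continue  # applicant already placed, or program is full
        placements[applicant["name"]] = program["name"]
        open_slots[program["name"]] -= 1
    return placements


applicants = [
    {"name": "A1", "specialty": "psychiatry", "region": "midwest",
     "setting": "academic", "wants_research": True},
    {"name": "A2", "specialty": "surgery", "region": "west",
     "setting": "community", "wants_research": False},
]
programs = [
    {"name": "P1", "specialty": "psychiatry", "region": "midwest",
     "setting": "academic", "has_research": True, "slots": 1},
    {"name": "P2", "specialty": "surgery", "region": "west",
     "setting": "community", "has_research": False, "slots": 1},
]

print(compatibility_match(applicants, programs))  # {'A1': 'P1', 'A2': 'P2'}
```

Even this toy version makes the open design questions visible: someone must choose the attributes and their weights, and ties in compatibility still have to be broken somehow.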
Dr. Roman additionally suggested that medicine should ask media publications to stop ranking schools and hospitals, as these rankings create a false reality and may drive students to apply to so many residency programs. Ultimately, it is important for UME and GME to work together more closely to avoid simply moving the stress from Step 1 to Step 2 CK, and to involve clinicians and basic medical science faculty in helping program directors assess applicants.
References:
1. United States Medical Licensing Examination. USMLE Mission Statement. https://www.usmle.org/about/
2. Carmody B. The Mythology of USMLE Step 1 Scores and Board Certification. The Sheriff of Sodium. March 5, 2019. https://thesherriffofsodium.com/2019/03/05/the-mythology-of-usmle-step-1-scores-and-board-certification
3. Beck Dallaghan. Medical School Resourcing of USMLE Step 1 Preparation: Questioning the Validity of Step 1. Med Sci Educ. 2019;29:1141-1145.
4. Carmody B. On Residency Selection and the Quantitative Fallacy. J Grad Med Educ. 2019;11(4):420-421.
5. Carmody B. Medical Student Attitudes Toward USMLE Step 1 and Health Systems Science: A Multi-Institutional Survey. Teach Learn Med. December 8, 2020.
6. Sharma A, Schauer DP, Kelleher M, Kinnear B, Sall D, Warm E. USMLE Step 2 CK: Best Predictor of Multimodal Performance in an Internal Medicine Residency. J Grad Med Educ. 2019;11(4):412-419.