[The following notes were generated by Sandra Haudek, PhD.]
The Winter 2022 IAMSE Webinar Seminar Series, titled "How Science Educators Still Matter. Leveraging The Basic Sciences for Student Success," ended with its fifth seminar on Thursday, February 3, 2022, titled "Rethinking Assessment Strategies in the Basic Sciences as Step 1 Goes Pass/Fail." This seminar was presented by Dr. David Harris, Associate Professor of Physiology at the University of Central Florida (UCF), College of Medicine. Dr. Harris discussed alternative strategies to assess basic science content, specifically the new assessment strategies recently implemented at his institution.
Dr. Harris began by describing UCF's current medical school curriculum (120 students per class). The first year focuses on the normal basic sciences; the second year builds on this foundation with the pathology, pathophysiology, pharmacology, and management of the different organ systems. Over the first 7 months, two courses cover most of the traditional basic sciences. He stated that the changes described here targeted the assessment strategies used during this first 7-month period. He cautioned that these changes were still a work in progress, as the team continuously adapts them in response to feedback. He also mentioned up front that a particular challenge they faced was UCF's use of letter grades.
Dr. Harris cited external resources frequently used by students during their early study of the basic sciences, such as Anki, Aquifer, and Boards & Beyond. While many of these resources are well constructed, with practice questions, feedback, and links to reliable sources, he stated that students struggle to incorporate them effectively into their studies. Nevertheless, and likely driven further by the COVID-19 pandemic, students have accelerated their use of such outside resources. The role of faculty has therefore also changed, from traditional textbook teaching to include other teaching modalities. Citing work by Simpson (Simpson, JGME 2018), Dr. Harris listed the expected roles of a medical educator in 2025 and proposed that, because of the drastic changes in pedagogy induced by COVID-19, these role changes will happen more quickly than originally expected.
Dr. Harris then specifically discussed the role of the medical educator as "assessor," drawing on his own experience at UCF. He reviewed that most assessments used to test basic science content have been multiple-choice questions (MCQs). However, he argued that, while MCQs work well to assess content knowledge, they do not assess how learners apply what they have memorized. He emphasized other skills and attributes needed to be a "good doctor," such as critical thinking, good communication skills, and life-long learning. In addition, he noted that students with disabilities and specific learning difficulties should not be disadvantaged by the assessment method (disability inclusion). Dr. Harris specifically outlined how faculty will have to adapt to the roles of "assessor" and "coach" to provide feedback on skills that cannot be fostered by outside resources or tested with MCQs.
Dr. Harris continued by stating the goals that guided UCF's changes in assessment strategies: (1) USMLE Step 1 going pass/fail, (2) an emphasis on communication, critical thinking, and self-regulation, (3) limiting the use of MCQs, and (4) providing opportunities for cognitive integration early in the curriculum. With these goals in mind, UCF focused its efforts on four areas: (1) concept mapping, (2) high fidelity patient simulations, (3) team-based learning sessions, and (4) case-based learning. At this point, the fourth area has not yet been implemented, since it requires standardizing how case presentations are used (for delivering new content, highlighting applications, or reviewing old content), and it was therefore not discussed in this seminar.
Dr. Harris explained how UCF integrated concept mapping exercises at various points (once a month) during the first 7 months of the preclinical curriculum. Some were used as formative assessments; others were designed as summative assessments. For each exercise, students received the following: the overall objective of the activity (the "terminal" objective); several specific objectives ("enabling" objectives), each of which had to be addressed; a set of instructions, including rules for which resources could be used; a clinical vignette; a list of facts memorized from textbooks (e.g., Ohm's Law); a starter map; tools to build and change the map; and the rubric by which they would be assessed (numeric grading). Students worked in groups of six, applying their knowledge of foundational content and using available technology; faculty acted as facilitators. The final concept maps (created with the "Cmap" software) were due within 2 hours. Dr. Harris noted that several of these steps were adjusted over time. Grading and assessment were performed by both content and non-content experts, which created several challenges: non-content experts needed too long to finish grading, contributed to high variability in grading, and were unable to provide in-depth content feedback (solved by involving only content experts and using a speed-grading feature); and graders were unable to comment directly in the Cmap file (solved by converting the files to PDF). Other challenges pertained to the grading mechanisms, faculty time, and reducing student stress.
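As an aside, and purely as an illustration not drawn from the seminar itself, the kind of memorized relationship students might be asked to map onto a clinical vignette can be made concrete with Ohm's Law and its standard hemodynamic analog:

\[ V = I \times R \qquad \longleftrightarrow \qquad \Delta P = Q \times R \]

where \(\Delta P\) is the pressure gradient across a vascular bed (e.g., mean arterial pressure minus right atrial pressure), \(Q\) is blood flow (cardiac output for the systemic circulation), and \(R\) is vascular resistance. In a concept map, a student would link these memorized variables to the findings in the vignette, for example, reasoning how a fall in resistance changes pressure at a fixed flow.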
Dr. Harris then discussed the "high fidelity patient simulations" in his basic science course. The goal was to highlight important physiology concepts underlying various diseases. He explained that students started in the classroom with an introduction (10 min), then entered the simulation center (20 min), and then returned to the classroom for debriefing (30 min). To increase engagement, students had to answer (individually or in teams) specific essay questions, on which they received feedback from faculty. Students were then asked to reflect on their simulation experience, and faculty again provided narrative feedback (to foster self-regulation skills). Dr. Harris stated that the advantages (e.g., engaging, clinical application) and disadvantages (e.g., resource- and time-intensive) of this process were still under debate.
Lastly, Dr. Harris addressed team-based learning. After briefly explaining the general process of team-based learning, he pointed out the changes UCF made: all parts were made summative, MCQs were replaced with other question types, the grading scale was modified, and peer evaluation was added.
Dr. Harris concluded with his take-home messages: (1) Modify what you have or what is available to you. (2) Recognize the "cultural" shift in student and faculty roles. (3) Focus on what you want students to be able to do and what you value, and measure those.
The presentation lasted about 40 minutes and a rich discussion followed. Among other topics, questions from the audience addressed: How do you prepare students to do concept mapping and to use the software? What was the students' response to the mapping exercises? How (and why) can students be limited in their use of resources? How large is the inter-rater variability? How are faculty recruited and trained? Can more information be given on the grading scale? Do student groups rotate?