Year Three of the Objective Structured Clinical Examination (OSCE) in the Internal Medicine Residency Program brings more than a few changes. The first and most important is that this innovative exercise has now been given the stamp of peer review. After presenting their poster at the Alliance for Academic Internal Medicine meeting last spring, Program Director Manish Suneja, MD; Director of Curriculum Jane Rowat, MS; and Clinical Assistant Professor Sheena CarlLee, MD, published their article on the OSCE in the Journal of Graduate Medical Education.
Trainees arrive at Iowa from many different medical schools, which may have placed varying levels of emphasis on skill development. This creates a gap between the expectations of residency program directors and the actual skills of entering interns. The main goal of the OSCE exercise during orientation is to assess interns’ baseline clinical skills and to provide just-in-time formative feedback on these critical skills. In addition, the exercise allows the program to address identified gaps through curricular change and individualized feedback. As the article explains, the most valuable part of this exercise for both interns and faculty is the opportunity to interact directly with faculty members after each station.
In its first year, the OSCE surveyed the abilities of the 33 incoming Internal Medicine residents with about 20 observers. This year, with the addition of interns from residency programs in the Departments of Family Medicine and of Anesthesia, about 30 evaluators covered the 55 interns assessed across a morning and an afternoon session. To preserve reliability, the overall structure of the sessions and the skills assessed remained the same as in previous years:
- Gather a history and perform a physical examination.
- Provide an oral presentation of a clinical encounter.
- Give or receive a patient handover to transition care responsibility.
- Handle call from nursing.
- Obtain informed consent for tests or procedures.
- Interpret EKG and radiographs.
In five different ten- to twenty-minute scenarios, set in half a dozen clinical observation rooms, interns demonstrated their skills, often interacting with simulated patients, while faculty observers watched via closed-circuit video and took notes for later conversation.
While the interns assigned to the morning session waited in one room, Suneja welcomed the educator-observers in another room down the hall. He welcomed the addition of four nurses to the evaluation team, adding to the range of perspectives and the verisimilitude of the experience. He also reminded the observers, as he would the interns themselves, that the OSCE is meant to be a formative assessment, not an intimidating exam. The results, he emphasized, are meant to increase interns’ comfort level on day one of residency.
After that it was just a matter of everyone taking their places. With the actors posing as patients in the exam rooms and proctors standing by to lend a hand, faculty observers sat down at their screens ready to take notes.
Suneja also pointed out that the Chief Residents play a critical role in the process. Former Chiefs, now faculty or fellows, were on hand as observers, as were current Chiefs and a different rising Chief in each of the morning and afternoon sessions. “Because they have so much contact with the interns’ training, it is important for the Chiefs to be involved,” Suneja said.
Over the next couple of hours, the interns rotated to the next station every twenty minutes to begin a new scenario. At the end of their five sessions, the interns returned to the conference room to complete an evaluation of their own, providing valuable feedback to the evaluators on what they had just experienced.
Then it was time for a well-earned lunch break.
The same routine was run again in the afternoon for the other half of the interns. The benefit this year, Suneja said, was that almost every one of the morning faculty observers was also working in the afternoon, reducing the amount of training time.
After the final session, the interns again recorded their evaluations while faculty raters returned to their conference room for a debrief about what they felt was working and what needed improvement. Organizers will take all this feedback and continue to search for ways to make this unique program even more beneficial.
Rowat, Suneja, and the rest of the Multidisciplinary EPA team would like to thank everyone who helped make this day possible. Special thanks to Ellen Franklin, director of Clinical Skills Assessment at the Carver College of Medicine, and her team for providing the material resources that made the OSCE possible. OSCE Day was also made possible by a grant from the GME Innovation Fund.
Cassie Drees Duncan