Ensuring the competence and capability of newly graduated doctors is critical to patient safety. Objective Structured Clinical Exams (OSCEs) are used to assess the competence of medical students throughout training and prior to graduation. These involve medical students rotating around a series of timed, simulated clinical tasks, and being observed by a qualified examiner on each. Issues with the fairness of OSCEs can reduce their ability to detect struggling or failing trainees, which can then challenge the safety of graduation decisions. As a result, enhancing examiners’ standardisation and increasing their confidence to identify weak or failing performances is vital to ensuring that OSCEs effectively guide trainees’ performance whilst ensuring safe standards of clinical care.
This project aims to further develop a novel intervention called "Video-Based Examiner Benchmarking" (VBB) that addresses this problem. VBB uses station-specific video examples of performances (i.e., showing the same clinical task the examiner will examine) to calibrate examiners before they begin examining. It has been developed by our team over the last two years. VBB has a strong underpinning theoretical basis in cognitive research, which showed that examiners' judgements can be clearly influenced by the standard of recently observed performances.
Accordingly, it is expected that providing examiners with the same station-specific example performances before they start examining (along with an agreed score and an explanation of that score for each performance) will help to calibrate their judgements, making examiners more consistent and more confident in detecting sub-standard performances.
This study will provide evidence of the effectiveness of VBB on examiners' consistency and confidence, and on their preparedness to fail weak performances. This evidence is expected to support the adoption of VBB as recognised good practice and its subsequent use to enhance assessment.