Validity evidence for a patient note scoring rubric based on the new patient note format of the United States Medical Licensing Examination

Acad Med. 2013 Oct;88(10):1552-7. doi: 10.1097/ACM.0b013e3182a34b1e.

Abstract

Purpose: This study examines validity evidence for the Patient Note Scoring Rubric, which was developed for a local graduation competency exam (GCE) to assess patient notes written in the new United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills format. The rubric was designed to measure three dimensions: Documentation, Justified Differential Diagnosis (DDX), and Workup.

Method: Analyses used GCE data from 170 fourth-year medical students who completed five standardized patient (SP) cases in May 2012. Five physician raters each scored all responses for one case. Internal structure was examined through correlations between dimensions and between cases, and through a generalizability study. Relations to other variables were examined by correlating patient note scores with SP encounter scores. Consequences were assessed by comparing pass-fail rates under the rubric with those under the previous global rating. Response process was examined using rater feedback.
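
The abstract does not detail the generalizability analysis; the sketch below is a minimal illustration, not the authors' code, assuming a one-facet person × case G-study with the reported 170 students and 5 cases and using synthetic scores in place of the actual GCE data. It estimates variance components from expected mean squares and computes the phi (absolute) coefficient.

    import numpy as np

    # Synthetic data standing in for the actual GCE note scores (hypothetical).
    rng = np.random.default_rng(0)
    n_p, n_c = 170, 5                      # 170 students x 5 SP cases
    ability = rng.normal(0, 5, n_p)        # person (student) effect
    difficulty = rng.normal(0, 3, n_c)     # case effect
    scores = (70 + ability[:, None] + difficulty[None, :]
              + rng.normal(0, 6, (n_p, n_c)))  # residual person x case noise

    grand = scores.mean()
    p_means = scores.mean(axis=1)          # student means across cases
    c_means = scores.mean(axis=0)          # case means across students

    # Mean squares for a two-way design with one observation per cell.
    ms_p = n_c * ((p_means - grand) ** 2).sum() / (n_p - 1)
    ms_c = n_p * ((c_means - grand) ** 2).sum() / (n_c - 1)
    resid = scores - p_means[:, None] - c_means[None, :] + grand
    ms_pc = (resid ** 2).sum() / ((n_p - 1) * (n_c - 1))

    # Variance components solved from the expected mean squares.
    var_pc = ms_pc
    var_p = max((ms_p - ms_pc) / n_c, 0.0)
    var_c = max((ms_c - ms_pc) / n_p, 0.0)

    # Phi: person variance over person variance plus absolute error variance.
    phi = var_p / (var_p + (var_c + var_pc) / n_c)
    print(f"var_p={var_p:.2f} var_c={var_c:.2f} var_pc={var_pc:.2f} phi={phi:.2f}")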

Results: Correlations between scores on different dimensions ranged from 0.33 to 0.44. Score reliability, estimated with the phi coefficient, was 0.43 for the five-case exam; 15 cases would be required to reach a phi coefficient of 0.70. Evidence of case specificity was found. Documentation scores correlated moderately with SP data-gathering scores (r = 0.47, P < .001). There was no meaningful change in pass-fail rates. Raters' feedback indicated that they needed more training to score the DDX and Workup dimensions.
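
The 15-case figure is consistent with a standard decision-study (D-study) projection. The sketch below back-calculates illustrative variance components from the reported phi of 0.43 for five cases (a normalization, not values taken from the study) and extrapolates phi to longer exams.

    def phi_for(n_cases: int, var_p: float, var_abs_err: float) -> float:
        """Phi when absolute error variance is averaged over n_cases."""
        return var_p / (var_p + var_abs_err / n_cases)

    # Normalize so that phi(5) = 0.43: person variance 0.43, five-case
    # absolute error 0.57, hence per-case absolute error 5 * 0.57.
    var_p, var_abs_err = 0.43, 5 * 0.57

    for n in (5, 10, 15, 20):
        print(n, round(phi_for(n, var_p, var_abs_err), 2))
    # 5 -> 0.43, 10 -> 0.60, 15 -> 0.69 (~0.70 as reported), 20 -> 0.75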

Conclusions: There is initial validity evidence for use of this rubric to score local clinical exams that are based on the new USMLE patient note format.

MeSH terms

  • Clinical Competence / standards*
  • Diagnosis, Differential
  • Documentation
  • Education, Medical, Undergraduate / standards*
  • Educational Measurement / methods*
  • Feedback
  • Humans
  • Licensure, Medical*
  • Reproducibility of Results
  • Schools, Medical
  • United States