Trifactor Models for Multiple-Ratings Data

Multivariate Behav Res. 2019 May-Jun;54(3):360-381. doi: 10.1080/00273171.2018.1530091. Epub 2019 Mar 28.

Abstract

In this study, we extend and assess the trifactor model for multiple-ratings data, in which two different raters give independent scores to the same responses (e.g., GRE essays or a subset of PISA constructed-response items). The trifactor model was extended to accommodate a cross-classified data structure (e.g., items and raters) rather than a strictly hierarchical one. We present two sets of simulations designed to reflect the incompleteness and imbalance of real-world assessments, investigating the effects of the rate of missingness in the data and of ignoring differences among raters. The use of the trifactor model is also illustrated with an empirical analysis of data from a well-known international large-scale assessment.
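For orientation, a minimal sketch of a cross-classified trifactor measurement equation is given below; the notation is assumed for illustration and is not necessarily the authors' parameterization:

\[
y_{jir} \;=\; \nu_{ir} \;+\; \lambda_{ir}\,\eta_{j} \;+\; \gamma_{ir}\,\xi_{jr} \;+\; \delta_{ir}\,u_{ji} \;+\; \varepsilon_{jir},
\]

where \( y_{jir} \) is the score assigned by rater \( r \) to examinee \( j \) on item \( i \), \( \eta_{j} \) is a general factor, \( \xi_{jr} \) is a rater-specific (perspective) factor, and \( u_{ji} \) is an item-specific factor. The cross-classification arises because each response is indexed by both item and rater, rather than raters being nested within items (or vice versa).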

Keywords: Programme for International Student Assessment (PISA); Trifactor model; multiple-ratings data; rater effects; validity.

MeSH terms

  • Data Interpretation, Statistical*
  • Humans
  • Models, Psychological*
  • Psychometrics*