Machine learning models are often trained on sociodemographic features to predict mental health outcomes. Biases in the collection of race-related data can limit the development of useful and fair models. To assess the current state of such data in mental health research, we conducted a rapid review guided by Critical Race Theory. Our findings reveal limitations in how race and ethnicity are measured and reported, which may lead to models that amplify health inequities.
Keywords: Continental Population Groups; Machine learning; Mental Health.