Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcomes

J Med Syst. 2021 Jan 4;45(1):5. doi: 10.1007/s10916-020-01701-8.

Abstract

Deep neural network models are emerging as an important method in healthcare delivery, following their recent success in other domains such as image recognition. Because of their multiple non-linear internal transformations, deep neural networks are viewed by many as black boxes. For practical use, deep learning models require explanations that are intuitive to clinicians. In this study, we developed a deep neural network model to predict outcomes following major cardiovascular procedures, using a temporal image representation of past medical history as input. We created a novel explanation for the model's predictions by defining impact scores that associate clinical observations with the outcome. For comparison, a logistic regression model was fitted to the same dataset. We compared the impact scores and log odds ratios by calculating three types of correlations, which provided a partial validation of the impact scores. The deep neural network model achieved an area under the receiver operating characteristic curve (AUC) of 0.787, compared to 0.746 for the logistic regression model. Moderate correlations were found between the impact scores and the log odds ratios. Impact scores generated by the explanation algorithm have the potential to shed light on the "black box" deep neural network model and could facilitate its adoption by clinicians.
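
The abstract does not define the impact scores or name the three correlation measures, so the sketch below is only an illustration of the general comparison workflow, not the authors' algorithm. It assumes occlusion-style impact scores (the mean change in predicted risk when a clinical observation is removed), uses Pearson, Spearman, and Kendall correlations as the three correlation types, and relies on hypothetical names such as dnn_predict_proba, X, and y.

```python
# Illustrative sketch only: impact scores approximated by occlusion, compared
# against logistic-regression log odds ratios. Names and the scoring rule are
# assumptions, not taken from the paper.
import numpy as np
from scipy.stats import pearsonr, spearmanr, kendalltau
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def impact_scores(dnn_predict_proba, X):
    """Assumed impact score: mean drop in predicted risk when a binary
    clinical observation is zeroed out, one feature at a time."""
    base = dnn_predict_proba(X)
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        X_occluded = X.copy()
        X_occluded[:, j] = 0  # remove the observation
        scores[j] = np.mean(base - dnn_predict_proba(X_occluded))
    return scores

def compare_with_logistic_regression(dnn_predict_proba, X, y):
    """Fit a logistic regression on the same data, then report three
    correlations between impact scores and log odds ratios, plus AUCs."""
    lr = LogisticRegression(max_iter=1000).fit(X, y)
    log_odds = lr.coef_.ravel()  # coefficients are log odds ratios
    impacts = impact_scores(dnn_predict_proba, X)
    return {
        "pearson": pearsonr(impacts, log_odds)[0],
        "spearman": spearmanr(impacts, log_odds)[0],
        "kendall": kendalltau(impacts, log_odds)[0],
        "auc_dnn": roc_auc_score(y, dnn_predict_proba(X)),
        "auc_lr": roc_auc_score(y, lr.predict_proba(X)[:, 1]),
    }
```

In this sketch, a strong positive correlation across the three measures would indicate that the occlusion-style impact scores rank features similarly to the logistic regression coefficients, which is the kind of partial validation the abstract describes.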

Keywords: Clinical outcome; Deep neural network; Machine learning; Predictive model.

MeSH terms

  • Algorithms*
  • Humans
  • Logistic Models
  • Neural Networks, Computer*
  • ROC Curve