Automatic breast ultrasound (ABUS) tumor segmentation based on global and local feature fusion

Phys Med Biol. 2024 May 30;69(11). doi: 10.1088/1361-6560/ad4d53.

Abstract

Accurate segmentation of tumor regions in automated breast ultrasound (ABUS) images is of paramount importance in computer-aided diagnosis systems. However, the inherent diversity of tumors and imaging interference pose great challenges to ABUS tumor segmentation. In this paper, we propose a global and local feature interaction model combined with graph fusion (GLGM) for 3D ABUS tumor segmentation. In GLGM, we construct a dual-branch encoder-decoder in which both local and global features can be extracted. In addition, a global and local feature fusion module is designed, which performs interaction at the deepest semantic level to facilitate information exchange between local and global features. Furthermore, to improve segmentation performance for small tumors, a graph convolution-based shallow feature fusion module is designed. It exploits shallow features to enhance the representation of small tumors in both the local and global domains. The proposed method is evaluated on a private ABUS dataset and a public ABUS dataset. In the private ABUS dataset, small tumors (volume smaller than 1 cm³) account for over 50% of the cases. Experimental results show that the proposed GLGM model outperforms several state-of-the-art segmentation models in 3D ABUS tumor segmentation, particularly in segmenting small tumors.
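
The following is a minimal, hypothetical sketch (not the authors' code) of the dual-branch idea described above: a convolutional branch for local features, a lightweight transformer branch for global context, and a fusion step at the deepest level. All layer names, channel sizes, and the concatenation-based fusion are illustrative assumptions.

# Illustrative sketch only; assumes a CNN local branch, a transformer global
# branch, and concatenation + 1x1x1 convolution as a stand-in for the paper's
# global-local fusion module.
import torch
import torch.nn as nn

class LocalBranch(nn.Module):
    """Convolutional branch: extracts local features from a 3D ABUS patch."""
    def __init__(self, in_ch=1, ch=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_ch, ch, 3, stride=2, padding=1), nn.InstanceNorm3d(ch), nn.ReLU(),
            nn.Conv3d(ch, ch, 3, stride=2, padding=1), nn.InstanceNorm3d(ch), nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)  # (B, ch, D/4, H/4, W/4)

class GlobalBranch(nn.Module):
    """Transformer branch: models long-range context over downsampled tokens."""
    def __init__(self, in_ch=1, ch=32):
        super().__init__()
        self.embed = nn.Conv3d(in_ch, ch, kernel_size=4, stride=4)  # patch embedding
        self.attn = nn.TransformerEncoderLayer(d_model=ch, nhead=4, batch_first=True)
    def forward(self, x):
        t = self.embed(x)                      # (B, ch, D/4, H/4, W/4)
        B, C, D, H, W = t.shape
        tokens = t.flatten(2).transpose(1, 2)  # (B, D*H*W, ch)
        tokens = self.attn(tokens)             # self-attention over all positions
        return tokens.transpose(1, 2).reshape(B, C, D, H, W)

class GlobalLocalFusion(nn.Module):
    """Fuses the two branches at the deepest level (concatenation + 1x1x1 conv)."""
    def __init__(self, ch=32):
        super().__init__()
        self.fuse = nn.Conv3d(2 * ch, ch, kernel_size=1)
    def forward(self, local_f, global_f):
        return self.fuse(torch.cat([local_f, global_f], dim=1))

# Usage on a toy 3D patch
x = torch.randn(1, 1, 64, 64, 64)
fused = GlobalLocalFusion()(LocalBranch()(x), GlobalBranch()(x))
print(fused.shape)  # torch.Size([1, 32, 16, 16, 16])

A decoder would then upsample the fused features back to the input resolution; the paper's graph convolution-based shallow feature fusion, which further strengthens small-tumor features, is not shown here.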

Keywords: automated breast ultrasound (ABUS); graph convolution; transformer; tumor segmentation.

MeSH terms

  • Automation
  • Breast Neoplasms* / diagnostic imaging
  • Humans
  • Image Processing, Computer-Assisted* / methods
  • Imaging, Three-Dimensional / methods
  • Ultrasonography, Mammary* / methods