ABUSDet: A novel 2.5D deep learning model for automated breast ultrasound tumor detection

Xudong Song, Xiaoyang Lu, Gengfa Fang, Xiangjian He, Xiaochen Fan, Le Cai, Wenjing Jia, Zumin Wang

Research output: Journal Publication › Article › peer-review

1 Citation (Scopus)

Abstract

Automated Breast Ultrasound (ABUS) is an advanced breast tumor screening modality that produces hundreds of 2D slices in each scan. Reviewing this large number of slices places a significant burden on physicians. This paper proposes a novel 2.5D tumor detection model, named "ABUSDet," to assist physicians by automatically reviewing ABUS images and predicting the locations of breast tumors. At the core of this approach, a sequence of data blocks partitioned from a pre-processed 3D volume is fed to a 2.5D tumor detection model, which outputs a sequence of 2D tumor candidates. An aggregation module then clusters these 2D candidates to produce the final 3D coordinates of the tumors. To further improve the accuracy of the model, a novel training mechanism for deep learning models, called "Deliberate Training," is proposed. The proposed model is trained and tested on a dataset of 87 patients with 235 ABUS volumes. It achieves sensitivities of 77.94%, 75.49%, and 65.19% at 3, 2, and 1 false positives per volume, respectively. Compared with 2D and 3D object detection models, the proposed ABUSDet model achieves the highest sensitivity with relatively low false-positive rates.
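The abstract does not detail the aggregation module. As a rough illustration only (the greedy grouping rule, class names, and thresholds below are assumptions for this sketch, not the authors' published method), per-slice 2D candidates could be clustered into 3D tumor coordinates along the slice axis like this:

```python
from dataclasses import dataclass

@dataclass
class Candidate2D:
    z: int        # slice index within the ABUS volume
    x: float      # in-plane center (lateral) of the 2D detection
    y: float      # in-plane center (axial) of the 2D detection
    score: float  # detector confidence

def aggregate_candidates(cands, max_gap=1, max_dist=10.0):
    """Hypothetical sketch: greedily group 2D candidates into 3D detections.

    Candidates whose slice indices differ by at most `max_gap` and whose
    in-plane centers lie within `max_dist` pixels of the cluster's latest
    member are merged; each cluster yields one score-weighted 3D center.
    """
    clusters = []
    for c in sorted(cands, key=lambda c: c.z):
        for cl in clusters:
            last = cl[-1]
            dist = ((c.x - last.x) ** 2 + (c.y - last.y) ** 2) ** 0.5
            if (c.z - last.z) <= max_gap and dist <= max_dist:
                cl.append(c)
                break
        else:
            clusters.append([c])  # no nearby cluster: start a new tumor candidate
    results = []
    for cl in clusters:
        w = sum(c.score for c in cl)
        results.append((
            sum(c.x * c.score for c in cl) / w,
            sum(c.y * c.score for c in cl) / w,
            sum(c.z * c.score for c in cl) / w,
        ))
    return results
```

For example, two detections on adjacent slices with nearby centers would collapse into a single 3D coordinate, while a detection on a distant slice would form its own cluster.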

Original language: English
Pages (from-to): 26255-26269
Number of pages: 15
Journal: Applied Intelligence
Volume: 53
Issue number: 21
DOIs
Publication status: Published - Nov 2023

Keywords

  • 2.5D tumor detection
  • Automated breast ultrasound (ABUS)
  • Breast cancer
  • Deliberate training

ASJC Scopus subject areas

  • Artificial Intelligence
