Automatic fibroatheroma identification in intravascular optical coherence tomography volumes

Qifeng Yan, Mengdi Xu, Damon Wing Kee Wong, Akira Taruya, Atsushi Tanaka, Jiang Liu, Philip Wong, Jun Cheng

Research output: Journal Publication › Article › peer-review


Coronary heart disease is the most common type of heart disease leading to heart attacks. The identification of vulnerable plaques, especially the thin-cap fibroatheroma (TCFA), is crucial to the diagnosis of coronary artery disease. Intravascular optical coherence tomography (IVOCT), an emerging imaging modality, has proven useful for identifying vulnerable plaques. In this work, we propose an approach to automatically identify volumes containing fibroatheroma frames. In the proposed method, we first detect the lumen using a graph-search based method on unfolded images. Then a region of interest starting from the lumen boundary is cropped for feature extraction. We explore three texture features for fibroatheroma identification: Local Binary Patterns (LBP), Haar-like features, and Histograms of Oriented Gradients (HOG). To reduce the computational cost, a bag-of-words (BoW) approach is used in the feature extraction. Finally, support vector machines are trained to distinguish volumes containing fibroatheroma frames from those without. A dataset of 41 volumes collected from 41 different subjects is used. Experimental results show a sensitivity of 0.88 and a specificity of 0.94, demonstrating the effectiveness of the proposed method.
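The feature-extraction and classification stages described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: it uses a basic 8-neighbour LBP, a k-means codebook for the bag-of-words step, and an RBF SVM on synthetic stand-in textures. Parameters such as the patch size and vocabulary size `k` are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def lbp_codes(img):
    """Basic 8-neighbour Local Binary Pattern codes for interior pixels."""
    c = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.int32)
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes += (nb >= c).astype(np.int32) << bit  # one bit per neighbour
    return codes

def patch_histograms(img, patch=8):
    """Split the LBP code map into patches; one 256-bin histogram per patch."""
    codes = lbp_codes(img)
    h, w = codes.shape
    descs = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            hist, _ = np.histogram(codes[y:y + patch, x:x + patch],
                                   bins=256, range=(0, 256))
            descs.append(hist.astype(float))
    return np.array(descs)

def bow_features(images, k=8, seed=0):
    """Quantise patch descriptors with k-means; each image -> k-bin codeword histogram."""
    all_desc = np.vstack([patch_histograms(im) for im in images])
    km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(all_desc)
    feats = []
    for im in images:
        words = km.predict(patch_histograms(im))
        hist = np.bincount(words, minlength=k).astype(float)
        feats.append(hist / hist.sum())  # normalised codeword frequencies
    return np.array(feats), km

# Synthetic stand-in data (the real input would be ROIs cropped from the
# lumen boundary in unfolded IVOCT frames): smooth vs noisy texture classes.
rng = np.random.default_rng(0)
smooth = [rng.normal(0, 1, (32, 32)).cumsum(axis=1) for _ in range(10)]
noisy = [rng.normal(0, 1, (32, 32)) for _ in range(10)]
images = smooth + noisy
labels = np.array([0] * 10 + [1] * 10)

X, km = bow_features(images, k=8)
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```

The BoW step is what keeps the computation manageable: instead of feeding every per-patch histogram to the SVM, each image is reduced to a fixed-length histogram over a small learned vocabulary of texture "words".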

Original language: English
Pages (from-to): 15477-15483
Number of pages: 7
Journal: Journal of Ambient Intelligence and Humanized Computing
Issue number: 11
Publication status: Published - 19 Oct 2019
Externally published: Yes


Keywords

  • Image processing
  • Optical coherence tomography

ASJC Scopus subject areas

  • General Computer Science
