COMPARISON OF BREAST MRI TUMOR CLASSIFICATION USING HUMAN-ENGINEERED RADIOMICS, TRANSFER LEARNING FROM DEEP CONVOLUTIONAL NEURAL NETWORKS, AND FUSION METHODS

Abstract

This project proposes breast cancer identification using a digital image processing approach. Digital image-based signatures of breast tumors may ultimately contribute to the design of patient-specific breast cancer diagnostics and treatments. Beyond traditional human-engineered computer vision methods, tumor classification methods using transfer learning from deep convolutional neural networks (CNNs) are actively under development. This article first discusses our progress in using CNN-based transfer learning to characterize breast tumors for various diagnostic, prognostic, and predictive image-based tasks across multiple imaging modalities, including mammography, digital breast tomosynthesis, ultrasound (US), and magnetic resonance imaging (MRI), in comparison with both human-engineered feature-based radiomics and fusion classifiers created by combining such features. Second, a new study is presented that reports a comprehensive comparison of the classification performance of human-engineered radiomic features, CNN transfer-learning features, and fusion classifiers for breast lesions imaged with MRI. These studies demonstrate the utility of transfer learning for computer-aided diagnosis and highlight the synergistic improvement in classification performance achieved by fusion classifiers. The project is implemented in MATLAB.
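To make the fusion idea concrete, the sketch below shows one common way to combine the two feature families the abstract describes: train a separate classifier on human-engineered radiomic features and on CNN transfer-learning features, then fuse at the decision level by averaging their posterior probabilities. All feature counts, sample sizes, and the logistic-regression trainer are illustrative assumptions (the abstract states the project itself is implemented in MATLAB); this Python/NumPy version is only a minimal sketch of the technique, not the study's pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two feature families: 30 human-engineered
# radiomic features and 512 CNN transfer-learning features (e.g., pooled
# activations from a pretrained network) for 200 lesions, with binary
# benign/malignant labels. Sizes are assumptions, not values from the study.
n = 200
radiomic = rng.normal(size=(n, 30))
cnn_feats = rng.normal(size=(n, 512))
labels = rng.integers(0, 2, size=n).astype(float)

def train_logreg(X, y, lr=0.1, steps=500):
    """Minimal logistic-regression trainer via batch gradient descent."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))          # sigmoid posteriors
        w -= lr * Xb.T @ (p - y) / len(y)          # gradient step
    return w

def predict_proba(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

# One classifier per feature family...
w_rad = train_logreg(radiomic, labels)
w_cnn = train_logreg(cnn_feats, labels)

# ...then decision-level fusion: average the two posterior probabilities
# into a single malignancy score per lesion.
p_fused = 0.5 * (predict_proba(w_rad, radiomic) +
                 predict_proba(w_cnn, cnn_feats))
```

Feature-level fusion (concatenating both feature vectors before training a single classifier) is the other standard variant; averaging posteriors, as shown, keeps each feature family's classifier independent and is simple to calibrate.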
