
DBT_DCNN: a new convolutional neural network for mass detection in digital breast tomosynthesis / Paternò, Gianfranco; Bianchini, Matteo; Cardarelli, Paolo; Taibi, Angelo; Ricciardi, Roberta; Russo, Paolo; Mettivier, Giovanni. - In: HEALTH AND TECHNOLOGY. - ISSN 2190-7188. - (2026). [10.1007/s12553-025-01039-6]

DBT_DCNN: a new convolutional neural network for mass detection in digital breast tomosynthesis

Russo, Paolo;Mettivier, Giovanni
2026

Abstract

Purpose: We developed DeepLook, a CAD system based on a convolutional neural network (CNN) for the analysis of Digital Breast Tomosynthesis (DBT) images, aimed at classifying and locating possible breast mass lesions.

Methods: The CAD applies a pre-processing algorithm that removes the pectoral muscle and the skin before the lesion classification step, which relies on a CNN architecture. The performance of the architecture was evaluated on a dataset from Duke University, one of the largest databases publicly available online.

Results: DeepLook can classify individual DBT images into two classes, healthy (negative) and presence of a mass (positive), or into three classes: healthy, presence of a malignant mass, or presence of a benign mass. On the binary classification of the test data from the most suitable dataset considered, accuracy and sensitivity reached 82% and 68%, respectively, with an AUC of 0.90.

Conclusions: We developed DeepLook, a CAD system based on a CNN for the analysis of DBT images, able to classify and locate possible breast mass lesions. The reported performance was obtained with a CNN architecture significantly simpler than those proposed in the literature so far, and it is promising for future clinical implementation in routine DBT diagnosis, suggesting to radiologists which slices deserve closer attention. As a further step, a statistical study of the mass locations in the images generated by the Grad-CAM algorithm is under development.
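The localization step mentioned in the conclusions relies on the Grad-CAM algorithm, which weights the last convolutional layer's feature maps by the spatially averaged gradients of the class score and keeps only the positive evidence. The following is a minimal NumPy sketch of that weighting scheme, not the authors' implementation; the synthetic activations and gradients are placeholders for illustration only.

```python
import numpy as np

def grad_cam_heatmap(feature_maps: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Compute a Grad-CAM localization map.

    feature_maps: (K, H, W) activations of the last convolutional layer.
    gradients:    (K, H, W) gradients of the class score w.r.t. those activations.
    """
    # Channel weights alpha_k: global average of the gradients per channel.
    weights = gradients.mean(axis=(1, 2))              # shape (K,)
    # Weighted sum of the feature maps, then ReLU to keep positive evidence.
    cam = np.tensordot(weights, feature_maps, axes=1)  # shape (H, W)
    cam = np.maximum(cam, 0.0)
    # Normalize to [0, 1] so the map can be overlaid on the DBT slice.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Toy example with synthetic activations and gradients.
rng = np.random.default_rng(0)
fmaps = rng.random((8, 16, 16))
grads = rng.standard_normal((8, 16, 16))
heatmap = grad_cam_heatmap(fmaps, grads)
print(heatmap.shape)
```

In a real pipeline the two input arrays would come from a forward and a backward pass through the trained CNN on a DBT slice; the resulting heatmap highlights the regions that drove the positive-class score, which is what the statistical study of mass locations would analyze.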
Files in this item:

File: DeepLook s12553-025-01039-6.pdf (open access)
Type: Publisher's version (PDF)
License: Private/restricted access
Size: 1.34 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11588/1025614
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: 0