# Review — Breast Lesion Classification in Ultrasound Images Using Deep Convolutional Neural Network

**A LeNet-Like CNN is Proposed**

Breast Lesion Classification in Ultrasound Images Using Deep Convolutional Neural Network, Zeimarani ACCESS’20, by Federal University of Amazonas, Fundação Centro de Controle de Oncologia do Estado do Amazonas (FCECON), and Federal University of Rio de Janeiro, 2020 ACCESS, Over 20 Citations (Sik-Ho Tsang @ Medium)

Medical Imaging, Medical Image Analysis, Image Classification

- The **dataset** used in this work consists of a limited number of cases, **641 in total**, histopathologically categorized (**413 benign and 228 malignant lesions**).
- First, due to the limited amount of training data, **a custom-built CNN with a few hidden layers** is used and **regularization techniques** are applied to improve performance.
- Second, **transfer learning** is used and some **pre-trained models** are adapted to the dataset.

# Outline

1. **Pre-Processing**
2. **CNN Model Architecture**
3. **Results**

# 1. Pre-Processing

## 1.1. Resizing

- The **first** step adjusts the image size to the CNN input size of 224×224 pixels. **Bilinear interpolation** is performed on the original image, of size 159×182 pixels, obtaining an intermediate image of size 201×224 pixels.
- After **zero-padding** the vertical dimension, a final image size of **224×224 pixels** is obtained.
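The two resizing steps can be sketched as follows; the bilinear routine and the even split of padding rows are illustrative assumptions, since the paper's exact interpolation implementation is not restated here:

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Minimal bilinear interpolation for a 2-D (grayscale) array."""
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)   # map output rows to input coords
    xs = np.linspace(0, in_w - 1, out_w)   # map output cols to input coords
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def pad_vertical(img, size=224):
    """Zero-pad the vertical dimension up to `size` rows (assumed even split)."""
    pad = size - img.shape[0]
    return np.pad(img, ((pad // 2, pad - pad // 2), (0, 0)))

img = np.random.rand(159, 182)             # original ultrasound image size
resized = bilinear_resize(img, 201, 224)   # intermediate 201x224 image
final = pad_vertical(resized)              # final 224x224 network input
```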

## 1.2. Class Balancing

- The original image database has **413 benign images** and **228 malignant images**.
- **185 malignant images** were chosen randomly and, after applying image **flips** to these randomly chosen images, the **total number of malignant** cases was increased to **413**.
- Therefore, the **final image dataset** comprised **826 images (413 benign and 413 malignant)**.
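A minimal sketch of the balancing step, with placeholder arrays standing in for real images; horizontal flips are assumed here, as the exact flip directions are not restated:

```python
import numpy as np

rng = np.random.default_rng(0)
benign = [np.zeros((224, 224)) for _ in range(413)]     # placeholder images
malignant = [np.zeros((224, 224)) for _ in range(228)]

# Randomly pick 185 malignant images and append their flipped copies.
idx = rng.choice(len(malignant), size=413 - len(malignant), replace=False)
malignant += [np.fliplr(malignant[i]) for i in idx]

# Balanced dataset: 413 benign + 413 malignant = 826 images.
total = len(benign) + len(malignant)
```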

## 1.3. Normalization

- Zero centering and normalization are performed to obtain **zero mean** and **unit variance**:

$$x' = x - \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad x'' = \frac{x'}{\sigma(x')}$$

- where $x$ represents the **original** image, $x'$ the **zero-centered** image, $N$ is the **number of samples** in the dataset, and $x''$ the **normalized zero-centered image**.
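In code, the two steps amount to subtracting the dataset mean and dividing by the standard deviation; the batch below is random placeholder data:

```python
import numpy as np

x = np.random.rand(10, 224, 224)   # placeholder batch of N = 10 images

mean = x.mean()                    # mean over all N samples
x_centered = x - mean              # x' : zero-centered images
std = x_centered.std()             # standard deviation of x'
x_norm = x_centered / std          # x'': zero mean, unit variance
```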

# 2. CNN Model Architecture

## 2.1. Model Architecture

- **A LeNet-like CNN**, consisting of **four convolutional layers**, is used.
- In the first convolutional layer, 32 filters of size 3×3 are used.
- In the second convolutional layer, 64 filters of size 7×7 are used.
- In the third convolutional layer, 128 filters of size 5×5 are used.
- In the last convolutional layer, 256 filters of size 3×3 are used.
- In all convolutional operations, a stride of 1 and zero-padding of 1 are used. The activation function of all convolutional layers is **ReLU**.
- In between convolutions, after the ReLU, a **2×2 max-pooling layer** is used for dimensionality reduction. **Batch normalization** is applied after each convolutional layer, before the non-linearity.
- The last convolutional layer is followed by **two fully connected layers**. The first fully connected layer is followed by a ReLU and the second by a **Softmax** activation function.
- **Binary logistic regression with cross-entropy loss** is used for the **binary classification**.
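Under the stated stride of 1 and zero-padding of 1, the feature-map sizes through the four conv/pool blocks can be traced with a short script; the resulting 13×13 map feeding the fully connected layers is an inference from these settings, not a figure quoted from the paper:

```python
# Trace feature-map sizes through the four convolutional blocks
# (stride 1, zero-padding 1, each followed by 2x2 max pooling).
def conv_out(size, kernel, stride=1, pad=1):
    return (size + 2 * pad - kernel) // stride + 1

size = 224
for kernel, filters in [(3, 32), (7, 64), (5, 128), (3, 256)]:
    size = conv_out(size, kernel)   # convolution output size
    size //= 2                      # 2x2 max pooling
    print(f"{filters} filters of {kernel}x{kernel} -> {size}x{size}")
```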

## 2.2. Training

- **500 epochs** are used. The **mini-batch size** is **128**.
- For **image augmentation**, various **image reflections, rotations, and translations** were used to generate a new dataset. This new dataset contains **41,630 images**.
- **Dropout** and **L2 regularization** with a fixed regularization factor of 0.05 are applied.
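A sketch of the two regularizers, assuming inverted dropout and an L2 weight penalty with the paper's factor of 0.05; the dropout probability below is an assumption for illustration, as the exact rate is not restated here:

```python
import numpy as np

rng = np.random.default_rng(0)
h = rng.normal(size=(128, 256))   # hypothetical activations (batch of 128)
w = rng.normal(size=(256, 2))     # hypothetical weights of a dense layer

# Inverted dropout: zero units with probability p, rescale the survivors.
p = 0.5                           # assumed rate for illustration
mask = (rng.random(h.shape) >= p) / (1 - p)
h_drop = h * mask

# L2 regularization term added to the cross-entropy loss.
l2_penalty = 0.05 * np.sum(w ** 2)
```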

# 3. Results

## 3.1. Ablations

- Using **SGDM** resulted in a **slight improvement in AUC** value, and it was therefore selected as the candidate optimizer.
- **Image augmentation** associated with appropriate **regularization techniques** increased both accuracy and AUC.

## 3.2. SOTA Comparisons

- Some pre-trained models are fine-tuned, e.g. VGG-19, GoogLeNet and ResNet-50.
- The proposed method, with its simple architecture, obtains a **slightly higher AUC**.
- The CNN has a **much higher AUC** compared with other feature-selection methods.

- A comparison of **hits and errors** is also performed.
- The **null hypothesis** is that there are **no significant statistical differences**. The adopted significance level was 99.0%, with one degree of freedom, resulting in a critical value of *χ*² of *t_c* = 6.63.

- The values of the significance tests shown in the table are all higher than *t_c*, so the **null hypothesis must be rejected**.
- The proposed method **outperformed the radiologists’ evaluations** in terms of accuracy and sensitivity, but falls below the radiologists’ performance regarding **specificity, precision, and false alarm** rate.
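The significance check can be illustrated with a McNemar-style *χ*² statistic on hit/error counts; both the test form and the counts below are assumptions for illustration, since the actual contingency tables are not reproduced here:

```python
# Compare a chi-squared statistic against the critical value
# t_c = 6.63 (df = 1, 99% significance level).
def mcnemar_chi2(b, c):
    """Chi-squared statistic from the two discordant counts b and c."""
    return (b - c) ** 2 / (b + c)

T_C = 6.63
stat = mcnemar_chi2(b=30, c=10)   # made-up counts -> (30-10)**2 / 40 = 10.0
reject_null = stat > T_C          # 10.0 > 6.63: reject the null hypothesis
```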

## Reference

[2020 ACCESS] [Zeimarani ACCESS’20] Breast Lesion Classification in Ultrasound Images Using Deep Convolutional Neural Network

## 4.1. Biomedical Image Classification

**2017** [ChestX-ray8] **2019** [CheXpert] **2020** [VGGNet for COVID-19] [Dermatology] [Deep-COVID] [Zeimarani ACCESS’20] **2021** [CheXternal] [CheXtransfer]