Reading: MultiResUNet — Rethinking U-Net (Biomedical Image Segmentation)

Some Modifications to U-Net; Outperforms U-Net on Five Datasets

Sik-Ho Tsang
4 min read · May 14, 2020
Original U-Net (If interested, please read about U-Net.)

In this story, MultiResUNet, by Samsung and Bangladesh University of Engineering and Technology, is briefly reviewed. By enhancing the U-Net architecture, MultiResUNet outperforms U-Net on five datasets. This is a 2020 paper in Neural Networks, a journal with a high impact factor of 5.785. (Sik-Ho Tsang @ Medium)

Outline

  1. MultiResUNet: Modifications on U-Net
  2. Experimental Results

1. MultiResUNet: Modifications on U-Net

MultiResUNet

1.1. From 2 Conv Layers in U-Net to MultiRes Block in MultiResUNet

Developing the proposed MultiRes block from (a) to (c); (c) is the final MultiRes block
  • The sequence of two convolutional layers at each level of the original U-Net is replaced by the proposed MultiRes block.
  • (a): First, start with a simple Inception-like block that applies 3×3, 5×5 and 7×7 convolutional filters in parallel, to reconcile spatial features from different context sizes.
  • (b): Then, the larger filters are factorized into successions of 3×3 filters.
  • (c): Finally, the MultiRes block is established by gradually increasing the number of filters across the three successive layers and adding a residual connection, along with a 1×1 filter to conserve dimensions.
  • This is similar to the dense block in DenseNet, combined with the residual path that originated in ResNet. A minimal sketch of such a block is given below.
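Below is a minimal PyTorch sketch of such a block. The helper and class names are my own, and the way the W output filters are split across the three 3×3 layers (roughly W/6 : W/3 : W/2, following my reading of the paper) is an assumption rather than the authors' released code.

```python
# A minimal sketch of a MultiRes block, assuming PyTorch.
import torch
import torch.nn as nn

def conv_bn_relu(in_ch, out_ch, k):
    # Conv -> BatchNorm -> ReLU with "same" padding.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, k, padding=k // 2),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class MultiResBlock(nn.Module):
    def __init__(self, in_ch, U, alpha=1.67):
        super().__init__()
        W = int(alpha * U)                       # scaled filter budget (Section 1.3)
        # Split the W filters roughly 1:2:3 across the three successive 3x3 convs
        # (assumed split).
        c1, c2, c3 = int(W / 6), int(W / 3), int(W / 2)
        self.conv3 = conv_bn_relu(in_ch, c1, 3)
        self.conv5 = conv_bn_relu(c1, c2, 3)     # two 3x3 convs ~ a 5x5 view
        self.conv7 = conv_bn_relu(c2, c3, 3)     # three 3x3 convs ~ a 7x7 view
        self.out_channels = c1 + c2 + c3
        # 1x1 conv on the residual path to match the concatenated channel count.
        self.shortcut = conv_bn_relu(in_ch, self.out_channels, 1)

    def forward(self, x):
        a = self.conv3(x)
        b = self.conv5(a)
        c = self.conv7(b)
        out = torch.cat([a, b, c], dim=1)        # Inception-like concatenation
        return out + self.shortcut(x)            # residual addition
```

Concatenating the outputs of the three successive 3×3 stages mimics the parallel 3×3/5×5/7×7 view of block (a), while the 1×1 shortcut keeps the residual addition dimensionally consistent.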

1.2. ResPath in MultiResUNet

ResPath with 1×1 and 3×3 filters
  • The ResPath consists of 3×3 and 1×1 filters, as shown above. The number of 3×3 and 1×1 filters depends on the level inside the network, as shown in the table below.
  • The authors hypothesize that the semantic gap between the encoder and decoder feature maps is likely to decrease at deeper levels, which is why fewer blocks are used along the deeper ResPaths.
  • These additional non-linear operations are expected to reduce the semantic gap between encoder and decoder features, as sketched below.
  • (Yet, according to the figure above, is it still a residual path? lol)
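A minimal sketch of how such a ResPath could look, again assuming PyTorch; the class name and the Conv-BN-ReLU ordering are my assumptions, and `length` is the number of 3×3 blocks along the skip connection, tied to the network level by the table above.

```python
import torch.nn as nn

def _conv(in_ch, out_ch, k):
    # Conv -> BatchNorm -> ReLU, same helper shape as in the MultiRes sketch.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, k, padding=k // 2),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class ResPath(nn.Module):
    """Chain of 3x3 conv blocks, each with a 1x1 residual shortcut,
    applied along the skip connection between encoder and decoder."""
    def __init__(self, in_ch, out_ch, length):
        super().__init__()
        chans = [in_ch] + [out_ch] * length
        self.convs = nn.ModuleList(_conv(c, out_ch, 3) for c in chans[:-1])
        self.shortcuts = nn.ModuleList(_conv(c, out_ch, 1) for c in chans[:-1])

    def forward(self, x):
        for conv, shortcut in zip(self.convs, self.shortcuts):
            x = conv(x) + shortcut(x)
        return x
```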

1.3. Number of feature maps

  • For a fair comparison with U-Net, a similar number of parameters should be maintained between the two models. The number of filters assigned to a MultiRes block is therefore set to W = α × U,
  • where U is the number of filters of the corresponding convolutional layer in U-Net and W is the number of filters of the MultiRes block. α = 1.67 is used, so the filter numbers shown below are already multiplied by α.
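As a quick illustration of this scaling (the W/6 : W/3 : W/2 split across the three successive 3×3 layers is my assumption about the block's internals, consistent with the sketch above):

```python
# Hypothetical filter-count calculation for a few U-Net levels.
alpha = 1.67
for U in (32, 64, 128, 256):
    W = alpha * U
    split = [int(W / 6), int(W / 3), int(W / 2)]
    print(f"U={U:3d}  W={W:6.2f}  3x3 splits={split}  total={sum(split)}")
```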

1.4. Architecture Summary

  • Below are the details of the MultiResUNet architecture:
MultiResUNet Detailed Architecture
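To make the wiring concrete, here is a toy two-level composition of the sketches above. This is a hypothetical illustration only, not the paper's model (the real architecture has four pooling levels, ResPaths of length 4 down to 1, and a 1×1 output convolution); it reuses the MultiResBlock and ResPath classes from the earlier sketches.

```python
import torch
import torch.nn as nn

class TinyMultiResUNet(nn.Module):
    # Toy two-level encoder-decoder for illustration, not the paper's model.
    def __init__(self, in_ch=3, n_classes=1, U=32):
        super().__init__()
        self.enc1 = MultiResBlock(in_ch, U)
        self.pool = nn.MaxPool2d(2)
        self.enc2 = MultiResBlock(self.enc1.out_channels, 2 * U)
        self.respath1 = ResPath(self.enc1.out_channels, U, length=4)
        self.up = nn.ConvTranspose2d(self.enc2.out_channels, U, 2, stride=2)
        self.dec1 = MultiResBlock(2 * U, U)
        self.head = nn.Conv2d(self.dec1.out_channels, n_classes, 1)

    def forward(self, x):
        e1 = self.enc1(x)                        # top-level features
        e2 = self.enc2(self.pool(e1))            # bottleneck features
        skip = self.respath1(e1)                 # processed skip connection
        d1 = self.dec1(torch.cat([self.up(e2), skip], dim=1))
        return torch.sigmoid(self.head(d1))      # binary segmentation map
```

For an input of shape (1, 3, 128, 128), `TinyMultiResUNet()(x)` returns a (1, 1, 128, 128) probability map.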

2. Experimental Results

2.1. Datasets

5 Datasets
  • Five datasets are used, as shown above. The first four contain 2D images, while the last one contains 3D volumes.

2.2. Model Sizes

U-Net & MultiResUNet Models with Similar Number of Parameters
  • As shown above, the numbers of parameters of the U-Net and MultiResUNet models for 2D and 3D data are kept similar for a fair comparison.

2.3. Evaluation Metric

  • The Jaccard Index, i.e. Intersection over Union (IoU), is used as the evaluation metric; see the snippet below.
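For reference, a straightforward NumPy version for binary masks (the small `eps` is my addition to avoid division by zero on empty masks):

```python
import numpy as np

def jaccard_index(pred, target, eps=1e-7):
    """Jaccard Index (IoU) between two binary masks: |A ∩ B| / |A ∪ B|."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return float((intersection + eps) / (union + eps))
```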

2.4. Results

Jaccard Index (%)
  • MultiResUNet obtains a notably higher Jaccard Index than U-Net on all five datasets.

2.5. Qualitative Results

Segmenting images with vague boundaries (Red: U-Net, Blue: MultiResUNet, Green: Ground-Truth)
  • On images with such vague boundaries, MultiResUNet segments with much higher accuracy than U-Net.
Segmenting images with irregularities (Red: U-Net, Blue: MultiResUNet, Green: Ground-Truth)
  • Even with such a variety of shapes, MultiResUNet still segments with high accuracy, whereas U-Net produces many false-positive and false-negative regions.

2.6. Ablation Study

Ablation Study for Different Parts of MultiResUNet
  • With only the ResPath, a small improvement over U-Net is obtained.
  • With only the MultiRes block, a larger improvement over U-Net is achieved.
  • With both, the largest improvement is obtained.

During the days of coronavirus, let me take on the challenge of writing 30 stories again this month. Is it good? This is the 19th story of the month. Thanks for visiting my story!


Written by Sik-Ho Tsang

PhD, Researcher. I share what I learn. :) Linktree: https://linktr.ee/shtsang for Twitter, LinkedIn, etc.
