Brief Review — nnU-Net: A Self-Configuring Method for Deep Learning-based Biomedical Image Segmentation
nnU-Net, Automatic Configuration for Diverse Biomedical Image Segmentation Datasets
nnU-Net: A Self-Configuring Method for Deep Learning-based Biomedical Image Segmentation,
nnU-Net, by German Cancer Research Center, University of Heidelberg, DeepMind, and Heidelberg University Hospital,
2021 Nature Methods, Over 1100 Citations (Sik-Ho Tsang @ Medium)
Medical Imaging, Medical Image Analysis, Image Segmentation, U-Net
- Instead of proposing a new model architecture, nnU-Net is proposed to automatically configure the segmentation framework itself, including preprocessing, network architecture, training, and post-processing for any new task. Hence, nnU-Net means ‘No New Net’.
- nnU-Net surpasses most existing approaches, including highly specialized solutions, on 23 public datasets used in international biomedical segmentation competitions.
- nnU-Net Workflow
- nnU-Net Configuration Hyperparameters
1. nnU-Net Workflow
- As mentioned, nnU-Net is not proposing a new model architecture.
Instead, nnU-Net defines a recipe that systematizes the configuration process at a task-agnostic level and drastically reduces the search space for empirical design choices when given a new task:
- Collect design decisions that do not require adaptation between datasets and identify a robust common configuration (‘fixed parameters’).
- For as many of the remaining decisions as possible, formulate explicit dependencies between specific dataset properties (‘dataset fingerprint’) and design choices (‘pipeline fingerprint’) in the form of heuristic rules to allow for almost-instant adaptation on application (‘rule-based parameters’).
- Learn only the remaining decisions empirically from the data (‘empirical parameters’).
- The above steps are performed for each dataset.
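The three-step recipe can be sketched as a small Python flow (a minimal illustration; all names, values, and scores below are assumptions for exposition, not nnU-Net's actual code):

```python
# Illustrative sketch of nnU-Net's three parameter groups; names/values are assumed.

# 1. Fixed parameters: robust choices that need no adaptation between datasets.
FIXED = {"loss": "dice_and_cross_entropy", "optimizer": "sgd", "epochs": 1000}

def rule_based(fingerprint):
    # 2. Rule-based parameters: heuristic rules map the dataset fingerprint
    #    to pipeline settings almost instantly, without any training.
    configs = ["2d"]
    if fingerprint["is_3d"]:
        configs += ["3d_fullres"]
    return {"target_spacing": fingerprint["median_spacing"],
            "configurations": configs}

def empirical(cv_dice_scores):
    # 3. Empirical parameters: pick the best model or ensemble based on
    #    cross-validation performance on the training data.
    return max(cv_dice_scores, key=cv_dice_scores.get)

# Toy application to a hypothetical 3D dataset.
fingerprint = {"is_3d": True, "median_spacing": (3.0, 1.0, 1.0)}
pipeline = {**FIXED, **rule_based(fingerprint)}
best = empirical({"2d": 0.82, "3d_fullres": 0.88, "ensemble": 0.89})
```

The key point of the design is ordering: everything that can be fixed or derived by a rule is decided without training, so only the final selection step consumes compute.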
2. nnU-Net Configuration Hyperparameters
- Given a new segmentation task, dataset properties are extracted in the form of a ‘dataset fingerprint’ (pink).
- A set of heuristic rules models parameter interdependencies (thin arrows) and operates on this fingerprint to infer the data-dependent ‘rule-based parameters’ (green) of the pipeline.
- These are complemented by ‘fixed parameters’ (blue), which are predefined and do not require adaptation.
- Up to three U-Net configurations (2D, 3D full resolution, and 3D cascade) are trained in a five-fold cross-validation.
- Finally, nnU-Net automatically performs empirical selection of the optimal ensemble of these models and determines whether post-processing is required (‘empirical parameters’, yellow).
- The table above lists explicit values as well as summarized rule formulations for all configured parameters. (Res., resolution.)
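For intuition, a dataset fingerprint of the kind shown in pink could be gathered along these lines (a hedged numpy sketch; the exact fields nnU-Net records, such as foreground intensity statistics per modality, differ from this toy version):

```python
import numpy as np

def extract_fingerprint(images, spacings):
    """Summarize simple dataset statistics: a toy stand-in for nnU-Net's
    dataset fingerprint (image sizes, voxel spacings, intensity stats)."""
    shapes = np.array([img.shape for img in images])
    intensities = np.concatenate([img.ravel() for img in images])
    return {
        "median_shape": np.median(shapes, axis=0).tolist(),
        "median_spacing": np.median(np.array(spacings), axis=0).tolist(),
        "intensity_mean": float(intensities.mean()),
        # Clip range used by CT-style intensity normalization schemes.
        "intensity_clip": (float(np.percentile(intensities, 0.5)),
                           float(np.percentile(intensities, 99.5))),
    }

# Toy usage: two 3D "cases" with different sizes and voxel spacings.
cases = [np.zeros((40, 64, 64)), np.ones((48, 64, 64))]
fp = extract_fingerprint(cases, spacings=[(3.0, 1.0, 1.0), (3.0, 0.8, 0.8)])
```

The heuristic rules (green) then read only this compact summary, never the raw images, which is what makes the adaptation almost instant.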
2.1. Some Configuration Examples
- The U-Net family, which has an encoder-decoder architecture, is used. Specifically, the 2D U-Net, 3D U-Net, and cascaded 3D U-Net are used.
- None of the U-Net configurations makes use of recently proposed architectural variations, such as short residual connections, dense connections, attention mechanisms, squeeze-and-excitation blocks, or dilated convolutions.
- Batch normalization, which is often used to speed up or stabilize training, does not perform well with small batch sizes. Therefore, instance normalization is used for all U-Net models.
- Furthermore, ReLU is replaced with Leaky ReLUs (negative slope, 0.01).
- To improve the stability of training, a minimum batch size of two is required and a large momentum term (0.99) is chosen (fixed parameters). Training runs for 1000 epochs.
- (Only a few examples are shown here. Please check the paper for other details.)
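The normalization and activation choices above can be illustrated with a small numpy sketch. Instance normalization computes statistics per sample and per channel over the spatial axes only, so its behavior does not degrade with the tiny batch sizes that 3D patches force (tensor shapes here are illustrative):

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    # Normalize each sample and each channel over its spatial axes
    # independently; unlike batch norm, no statistics are shared across
    # the batch dimension. x has shape [N, C, H, W] or [N, C, D, H, W].
    axes = tuple(range(2, x.ndim))
    mean = x.mean(axis=axes, keepdims=True)
    var = x.var(axis=axes, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def leaky_relu(x, negative_slope=0.01):
    # nnU-Net's choice: Leaky ReLU with negative slope 0.01 instead of ReLU,
    # so negative pre-activations keep a small gradient.
    return np.where(x >= 0, x, negative_slope * x)
```

Even with a batch size of two, each sample is normalized against its own statistics, which is why the batch-size constraint and the normalization choice fit together.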
- Results achieved by nnU-Net are highlighted in red; competing teams are shown in blue.
- nnU-Net outperforms most specialized deep learning pipelines.
- (Only part of the results is shown here. Please check the paper for other details.)
[2021 Nature Methods] [nnU-Net]
nnU-Net: A Self-Configuring Method for Deep Learning-based Biomedical Image Segmentation
4.2. Biomedical Image Segmentation
2021 [Expanded U-Net] [3-D RU-Net] [nnU-Net]