Brief Review — Self-Supervised Learning Methods for Label-Efficient Dental Caries Classification

Dental Caries Classification Using SimCLR, BYOL, Barlow Twins

Sik-Ho Tsang
3 min read · Dec 5, 2022

Self-Supervised Learning Methods for Label-Efficient Dental Caries Classification, Taleb JDiagnostics’22, by University of Potsdam, Charité — Universitätsmedizin Berlin, Contraste Radiologia Odontológica, Universidade Federal do Rio Grande do Sul — UFRGS, and Icahn School of Medicine at Mount Sinai, 2022 JMDPI Diagnostics (Sik-Ho Tsang @ Medium)
Self-Supervised Learning, Medical Imaging, Medical Image Analysis, Image Classification

Outline

  1. Dataset
  2. SimCLR, BYOL, Barlow Twins
  3. Results

1. Dataset

  • The dataset consisted of 38,094 bitewing radiographs (BWRs) taken between 2018 and 2021.
  • After applying a helper model to crop individual teeth from the BWRs, a dataset of 315,786 tooth images was obtained. Of these, 49.9% are molars, 40.5% premolars, and 9.6% canines and incisors.
  • Tooth-level caries labels were extracted from electronic health records (EHRs) that summarize the patient’s dental status (a minimal wrapper pairing crops with these labels is sketched after this list).
  • A hold-out test set was curated by dental professionals from a random sample of 343 BWRs. It contained 2846 tooth samples with 29.9% caries prevalence (850 positive, 1996 negative), distributed as 49.2% molars, 40.5% premolars, and 10.3% canines and incisors.
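
The paper does not publish its data pipeline, but a minimal PyTorch wrapper for such labeled tooth crops might look as follows. The ToothCropDataset class, the records format, and the grayscale handling are my assumptions for illustration, not details from the paper:

```python
import torch
from torch.utils.data import Dataset
from PIL import Image

class ToothCropDataset(Dataset):
    """Hypothetical wrapper: each sample is one cropped tooth image paired
    with a binary caries label derived from the EHR."""

    def __init__(self, records, transform=None):
        # records: list of (image_path, label) pairs, label in {0, 1}
        self.records = records
        self.transform = transform

    def __len__(self):
        return len(self.records)

    def __getitem__(self, idx):
        path, label = self.records[idx]
        image = Image.open(path).convert("L")  # radiographs are grayscale
        if self.transform is not None:
            image = self.transform(image)
        return image, torch.tensor(label, dtype=torch.long)
```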

2. SimCLR, BYOL, Barlow Twins

  • Two stages are involved: pretraining using self-supervised learning, and fine-tuning for the downstream task.

2.1. Pretraining Using Self-Supervised Learning

(a) SimCLR, (b) BYOL, (c) Barlow Twins
  • (a): SimCLR attracts the two augmented views of each image to each other and repels them from the views of all other images in the batch (minimal loss sketches for all three methods follow this list).
  • (b): In BYOL, the target network’s weights are an exponential moving average of the online network’s weights; only the online network is updated with loss gradients.
  • (c): Barlow Twins computes the cross-correlation matrix between two batches of image views and minimizes its difference from the identity matrix.
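
For (a), here is a minimal PyTorch sketch of SimCLR’s NT-Xent contrastive loss; the function name and the temperature value are illustrative defaults, not the paper’s exact hyperparameters:

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss as used by SimCLR.

    z1, z2: [N, D] projections of two augmented views of the same N images.
    Each view is attracted to its counterpart and repelled from the other 2N-2 views.
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # [2N, D] unit vectors
    sim = z @ z.t() / temperature                       # scaled cosine similarities
    sim.fill_diagonal_(-1e9)                            # a view is never its own negative
    idx = torch.arange(n, device=z.device)
    targets = torch.cat([idx + n, idx])                 # positive of i is i+N, and vice versa
    return F.cross_entropy(sim, targets)
```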
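
For (b), a minimal sketch of the BYOL wiring, assuming a generic encoder/projector/predictor stack; the class layout and the EMA rate tau=0.996 are illustrative defaults taken from the original BYOL paper, not from this one:

```python
import copy
import torch
import torch.nn.functional as F

class BYOL(torch.nn.Module):
    """Minimal BYOL wiring: the online network is trained by gradients,
    the target network is updated only as an exponential moving average (EMA)."""

    def __init__(self, encoder, projector, predictor, tau=0.996):
        super().__init__()
        self.online_encoder, self.online_projector = encoder, projector
        self.predictor = predictor
        # Target network: a detached copy, never seen by the optimizer.
        self.target_encoder = copy.deepcopy(encoder)
        self.target_projector = copy.deepcopy(projector)
        for p in self.target_encoder.parameters():
            p.requires_grad = False
        for p in self.target_projector.parameters():
            p.requires_grad = False
        self.tau = tau

    @torch.no_grad()
    def update_target(self):
        """EMA update: target <- tau * target + (1 - tau) * online."""
        online = list(self.online_encoder.parameters()) + list(self.online_projector.parameters())
        target = list(self.target_encoder.parameters()) + list(self.target_projector.parameters())
        for po, pt in zip(online, target):
            pt.mul_(self.tau).add_(po, alpha=1 - self.tau)

    def loss(self, v1, v2):
        """Regression loss between the online prediction of one view and the
        target projection of the other (BYOL symmetrizes: loss(v1,v2) + loss(v2,v1))."""
        p = F.normalize(self.predictor(self.online_projector(self.online_encoder(v1))), dim=1)
        with torch.no_grad():
            z = F.normalize(self.target_projector(self.target_encoder(v2)), dim=1)
        return (2 - 2 * (p * z).sum(dim=1)).mean()
```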
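
For (c), a minimal sketch of the Barlow Twins objective; the off-diagonal weight lam=5e-3 follows the original Barlow Twins paper and is not necessarily the value used here:

```python
import torch

def barlow_twins_loss(z1, z2, lam=5e-3):
    """Barlow Twins objective: push the cross-correlation matrix of the two
    embedding batches towards the identity matrix.

    z1, z2: [N, D] projections of two augmented views of the same N images.
    """
    n = z1.size(0)
    # Standardize every embedding dimension across the batch.
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-6)
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-6)
    c = (z1.t() @ z2) / n                                        # [D, D] cross-correlation
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()               # invariance term
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()  # redundancy reduction
    return on_diag + lam * off_diag
```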

2.2. Fine-Tuning for Downstream Task

(d) Fine-Tuning for Caries Classification
  • (d): The CNN encoder obtained from pretraining is fine-tuned on input tooth images for caries classification (a minimal sketch follows).
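
A minimal sketch of this fine-tuning stage, assuming a torchvision ResNet-18; the checkpoint path and learning rate are purely illustrative:

```python
import torch
import torchvision

# Build ResNet-18 and load the self-supervised encoder weights
# (the checkpoint path is illustrative; strict=False skips projector-head keys).
model = torchvision.models.resnet18()
model.load_state_dict(torch.load("ssl_pretrained_resnet18.pt"), strict=False)
# Swap in a fresh binary head: caries vs. no caries.
model.fc = torch.nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()

def train_step(images, labels):
    """One supervised fine-tuning step on a batch of cropped tooth images
    (grayscale crops assumed replicated to 3 channels for the ResNet stem)."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```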

3. Results

3.1. Full Training Set

Caries classification results when fine-tuning on the full training set. The best models are highlighted in bold.
  • ResNet-18 is used as backbone.
  • The highest sensitivity, 57.9%, was observed for Barlow Twins, followed by SimCLR and BYOL with 57.2% and 54.6%, respectively.
  • For specificity, all pretrained models perform similarly to the baseline model.

With respect to ROC-AUC, all pretrained models are close (73.3%, 73.0%, and 73.4% for SimCLR, BYOL, and Barlow Twins, respectively) but consistently higher than the baseline model (71.5%).
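
For reference, sensitivity, specificity, and ROC-AUC can be computed as below; the labels and scores are made-up values purely to show the computation, not the paper’s data:

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# Made-up labels and scores purely to show the computation.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # 1 = caries, 0 = healthy
y_score = np.array([0.8, 0.3, 0.6, 0.4, 0.2, 0.5, 0.7, 0.1])
y_pred = (y_score >= 0.5).astype(int)         # threshold at 0.5

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)                  # recall on carious teeth
specificity = tn / (tn + fp)                  # recall on healthy teeth
auc = roc_auc_score(y_true, y_score)          # threshold-free ranking quality
print(f"sensitivity={sensitivity:.3f}, specificity={specificity:.3f}, ROC-AUC={auc:.3f}")
```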

3.2. Increasing the Size of the Training Set

Caries classification results when fine-tuning on varying quantities of labeled samples (reported as #teeth/#BWRs).
Evaluation results for data efficiency, obtained by successively increasing the size of the training set.
  • The baseline model performed worse than the pretrained models across all dataset sizes.

For low-data regimes (≤3K images), the ROC-AUC values were higher for Barlow Twins and BYOL than for SimCLR and the baseline model in the balanced case (50% prevalence).

  • Barlow Twins, in particular, exhibited improved values for this metric in most settings.
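
A sketch of this label-efficiency protocol, assuming hypothetical fine_tune and evaluate helpers that stand in for the paper’s training and evaluation pipeline; the names and subset sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def data_efficiency_curve(train_indices, subset_sizes, fine_tune, evaluate):
    """Fine-tune on random subsets of the labeled training data and report
    ROC-AUC on the fixed hold-out test set for each subset size.

    fine_tune(indices) -> model and evaluate(model) -> ROC-AUC are
    hypothetical helpers, not functions from the paper.
    """
    results = {}
    for size in subset_sizes:
        subset = rng.choice(train_indices, size=size, replace=False)
        results[size] = evaluate(fine_tune(subset))
    return results

# e.g. data_efficiency_curve(all_train_ids, [1_000, 3_000, 10_000], ft, ev)
# (subset sizes are illustrative, not the paper's exact grid)
```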

To the authors’ best knowledge, this paper is the first to showcase the potential of self-supervision in the field of dentistry.

Reference

[2022 JMDPI Diagnostics] [Taleb JDiagnostics’22]
Self-Supervised Learning Methods for Label-Efficient Dental Caries Classification

4.4. Biomedical Image Self-Supervised Learning

2018 [Spitzer MICCAI’18] 2019 [Rubik’s Cube] [Context Restoration] 2020 [ConVIRT] [Rubik’s Cube+] 2021 [MICLe] [MoCo-CXR] [DVME] 2022 [BT-Unet] [Taleb JDiagnostics’22]

==== My Other Previous Paper Readings ====
