Tutorial: Transformer on Google Colab for Machine Translation

A Transformer Tutorial on Google Colab

Sik-Ho Tsang
3 min read · Apr 1, 2022
  • Recently, I tried a Transformer toy example on Google Colab, provided by D2L.ai. I found a Python library issue that needs to be addressed first. This post covers:
  1. Pip Install Issue
  2. Training
  3. Inference with BLEU Scores
  4. Attention Visualization
  • (Note: there may be changes in the future.)

1. Pip Install Issue

numpy version issue on Colab using Transformer from D2L.ai
  • It seems that the D2L.ai Chinese version needs numpy v1.22.2, while Colab does not have this version at the moment. (It may work later on.)
  • Change the GitHub path from the Chinese version to the English version. Specifically, change “zh” to “en”, as follows:
Change “d2l-zh” to “d2l-en”
  • The English version uses numpy v1.21.5, which Colab already has installed. Then run the command (a sketch of the install cell is given at the end of this section).
  • However, the runtime also needs to be restarted for some of the newly installed libraries. Click “RESTART RUNTIME”:
Restart runtime for some newly installed libraries
  • Run the command again. No warnings should appear this time:
All libraries are ready now!
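As a rough sketch, the install cell looks like the following, assuming the notebook installs d2l straight from GitHub (the exact URL suffix, here @release, is an assumption and may differ in the actual notebook):

    # Chinese version: pulls d2l-zh, which pins numpy v1.22.2 (not on Colab)
    # !pip install git+https://github.com/d2l-ai/d2l-zh@release

    # English version: pulls d2l-en, whose numpy v1.21.5 matches Colab's
    !pip install git+https://github.com/d2l-ai/d2l-en@release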

2. Training

Loss curve
  • According to the description in the Colab notebook, the model has only 2 encoder layers and 2 decoder layers, with 4 heads in each multi-head attention.
  • The corpus is also small and the sentences are short, so the training time is quite short as well (see the sketch below).
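For context, here is a minimal sketch of the training setup, assuming the d2l PyTorch package and the hyperparameters from the book’s Transformer chapter; TransformerEncoder and TransformerDecoder are the classes defined earlier in the notebook, and the values below are indicative rather than exact:

    import torch
    from d2l import torch as d2l

    # Small model: 2 encoder layers, 2 decoder layers, 4 attention heads
    num_hiddens, num_layers, dropout, batch_size, num_steps = 32, 2, 0.1, 64, 10
    lr, num_epochs, device = 0.005, 200, d2l.try_gpu()
    ffn_num_input, ffn_num_hiddens, num_heads = 32, 64, 4
    key_size, query_size, value_size = 32, 32, 32
    norm_shape = [32]

    # Small English-French corpus of short sentences shipped with d2l
    train_iter, src_vocab, tgt_vocab = d2l.load_data_nmt(batch_size, num_steps)

    encoder = TransformerEncoder(  # defined earlier in the notebook
        len(src_vocab), key_size, query_size, value_size, num_hiddens,
        norm_shape, ffn_num_input, ffn_num_hiddens, num_heads, num_layers,
        dropout)
    decoder = TransformerDecoder(  # defined earlier in the notebook
        len(tgt_vocab), key_size, query_size, value_size, num_hiddens,
        norm_shape, ffn_num_input, ffn_num_hiddens, num_heads, num_layers,
        dropout)
    net = d2l.EncoderDecoder(encoder, decoder)

    # Training plots the loss curve shown above
    d2l.train_seq2seq(net, train_iter, lr, num_epochs, tgt_vocab, device)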

3. Inference with BLEU Scores

  • English-to-French machine translation is performed, with BLEU scores also reported (see the sketch below).
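A sketch of the inference step, assuming the seq2seq prediction and BLEU utilities from the d2l package; the sentence pairs mirror the chapter’s examples:

    engs = ['go .', 'i lost .', "he's calm .", "i'm home ."]
    fras = ['va !', "j'ai perdu .", 'il est calme .', 'je suis chez moi .']
    for eng, fra in zip(engs, fras):
        # Greedy decoding; also saves decoder attention weights for Section 4
        translation, dec_attention_weight_seq = d2l.predict_seq2seq(
            net, eng, src_vocab, tgt_vocab, num_steps, device, True)
        print(f'{eng} => {translation}, '
              f'bleu {d2l.bleu(translation, fra, k=2):.3f}')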

4. Attention Visualization

Visualization at Encoder
Visualization at Decoder
Visualization at Encoder-Decoder
  • Attention visualizations are also provided, as sketched below.
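Here is a sketch of how the encoder self-attention heatmaps can be produced, assuming the encoder saves its attention weights during the forward pass, as in the chapter’s code:

    # Stack the saved per-layer weights into
    # (num_layers, num_heads, num_queries, num_keys)
    enc_attention_weights = torch.cat(net.encoder.attention_weights, 0).reshape(
        (num_layers, num_heads, -1, num_steps))
    d2l.show_heatmaps(
        enc_attention_weights.cpu(), xlabel='Key positions',
        ylabel='Query positions',
        titles=['Head %d' % i for i in range(1, 5)], figsize=(7, 3.5))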


Written by Sik-Ho Tsang

PhD, Researcher. I share what I learn. :) Linktree: https://linktr.ee/shtsang for Twitter, LinkedIn, etc.
