Tutorial: Transformer on Google Colab for Machine Translation
- Recently, I tried a Transformer toy example on Google Colab, provided by D2L.ai. I ran into a Python library issue that needs to be addressed; this post covers:
- Pip Install Issue
- Model Training
- Inference with BLEU Scores
- Attention Visualization
- (Note: there may be changes in the future.)
1. Pip Install Issue
- It seems that the D2L.ai Chinese version requires numpy v1.22.2, which Colab does not currently provide. (This may change later.)
- Change the GitHub path from the Chinese version to the English version; specifically, change “zh” to “en”, as follows:
- The English version uses numpy v1.21.5, which Colab already ships. Then run the install command.
- However, the runtime also needs to be restarted for some of the newly installed libraries to take effect. Click “RESTART RUNTIME”:
- Run the command again; no warnings should appear:
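After the restart, it can be reassuring to confirm that the installed numpy matches the version the English d2l build expects. The helper below is hypothetical (not part of d2l) and simply compares dotted version strings; the pinned versions come from the tutorial above.

```python
# Hypothetical sanity-check helper, not part of d2l: compare the
# installed numpy version against the version the English d2l build pins.
PINNED = "1.21.5"   # the version the English d2l build expects


def version_tuple(v):
    """Parse a dotted version string such as '1.21.5' into a tuple of ints."""
    return tuple(int(part) for part in v.split(".")[:3])


def matches_pin(installed, pinned=PINNED):
    """True when the installed version equals the pinned one."""
    return version_tuple(installed) == version_tuple(pinned)


print(matches_pin("1.21.5"))   # True: the version Colab currently ships
print(matches_pin("1.22.2"))   # False: the version the Chinese build pins
```

In a Colab cell, you would pass `numpy.__version__` (after `import numpy`) as the installed version.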
2. Model Training
- According to the description in the Colab notebook, the model has only 2 encoder layers and 2 decoder layers, and each multi-head attention layer uses 4 heads.
- Perhaps because the corpus is small and the sentences are short, training is also quite fast.
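To make the "4 heads" setting concrete, here is a shape-level numpy sketch of multi-head attention. The head count comes from the notebook; the model width of 32 and the batch/sequence sizes are assumptions for illustration only.

```python
import numpy as np

# Shape-level sketch of multi-head attention; num_heads=4 comes from the
# notebook, while d_model=32 is an assumed width for illustration.
num_heads, d_model = 4, 32
d_head = d_model // num_heads          # 8 dimensions per head


def softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)


def multi_head_attention(q, k, v):
    """Scaled dot-product attention with the heads folded into the batch axis."""
    def split(x):  # (batch, seq, d_model) -> (batch * heads, seq, d_head)
        b, s, _ = x.shape
        return (x.reshape(b, s, num_heads, d_head)
                 .transpose(0, 2, 1, 3)
                 .reshape(b * num_heads, s, d_head))

    q, k, v = split(q), split(k), split(v)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores) @ v          # (batch * heads, seq, d_head)
    b, s = out.shape[0] // num_heads, out.shape[1]
    return (out.reshape(b, num_heads, s, d_head)
               .transpose(0, 2, 1, 3)
               .reshape(b, s, d_model))


x = np.random.randn(2, 10, d_model)    # batch of 2, sequence length 10
y = multi_head_attention(x, x, x)      # self-attention: q = k = v = x
print(y.shape)                         # (2, 10, 32)
```

Each head attends in an 8-dimensional subspace, and the outputs are concatenated back to the full model width.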
3. Inference with BLEU Scores
- English-to-French machine translation is performed, and BLEU scores are also reported.
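For reference, the D2L book defines a simplified BLEU: a brevity penalty multiplied by n-gram precisions weighted by 0.5**n. The sketch below reimplements that computation from scratch, so details may differ slightly from the library's exact code; it assumes the prediction has at least k tokens.

```python
import collections
import math


# Self-contained reimplementation of the simplified BLEU described in the
# D2L book; may differ slightly from the library's exact code.
def bleu(pred_seq, label_seq, k):
    pred_tokens, label_tokens = pred_seq.split(' '), label_seq.split(' ')
    len_pred, len_label = len(pred_tokens), len(label_tokens)
    score = math.exp(min(0, 1 - len_label / len_pred))   # brevity penalty
    for n in range(1, k + 1):
        num_matches, label_subs = 0, collections.defaultdict(int)
        for i in range(len_label - n + 1):               # count label n-grams
            label_subs[' '.join(label_tokens[i: i + n])] += 1
        for i in range(len_pred - n + 1):                # clipped matches
            if label_subs[' '.join(pred_tokens[i: i + n])] > 0:
                num_matches += 1
                label_subs[' '.join(pred_tokens[i: i + n])] -= 1
        score *= math.pow(num_matches / (len_pred - n + 1), math.pow(0.5, n))
    return score


print(bleu('il est calme .', 'il est calme .', 2))   # identical sequences: 1.0
```

A perfect match scores 1.0; a prediction shorter than the label is penalized by the brevity term, and mismatched n-grams lower the precision factors.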
4. Attention Visualization
- Attention visualizations are also provided.
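What those visualizations show is one weight matrix per head: rows are query positions, columns are key positions, and each row sums to 1. As a dependency-free stand-in for the notebook's heatmaps, the sketch below builds a random weight matrix (hypothetical data, just to show the shape and normalization) and renders it as a coarse ASCII grid.

```python
import numpy as np


def ascii_heatmap(weights, shades=" .:-=+*#%@"):
    """Map each weight in [0, 1] to a character; darker glyph = larger weight."""
    idx = np.minimum((weights * len(shades)).astype(int), len(shades) - 1)
    return "\n".join("".join(shades[j] for j in row) for row in idx)


rng = np.random.default_rng(0)
logits = rng.normal(size=(6, 6))   # one head: 6 query x 6 key positions (random)
weights = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

assert np.allclose(weights.sum(axis=1), 1.0)   # each query's weights sum to 1
print(ascii_heatmap(weights))
```

In the notebook itself, the weights come from the trained model's attention layers and are drawn as proper color heatmaps rather than ASCII.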