Review — Neural Machine Translation by Jointly Learning to Align and Translate

Using an Attention Decoder, the Model Automatically Searches for the Relevant Parts of the Source Sentence at the Encoder for Machine Translation

Attention Decoder/RNNSearch (Figure from https://meetonfriday.com/posts/4f49bf9b/)

Outline

1. Proposed Architecture Using Attention Decoder

Proposed Architecture Using Attention Decoder (Top: Decoder, Bottom: Encoder)
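In the conventional RNN Encoder–Decoder (RNNencdec), the encoder must compress the whole source sentence into a single fixed-length vector, which hurts translation quality as sentences get longer. The proposed RNNsearch instead keeps one annotation per source word and lets the decoder softly search, at every output step, for the source positions most relevant to the next target word. The encoder and decoder are sketched in Sections 2 and 3 below.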

2. Encoder: Bidirectional RNN (BiRNN)

Encoder: Bidirectional RNN (BiRNN)
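The encoder reads the source sentence both left-to-right and right-to-left, so the annotation h_j of word j summarizes both the preceding and the following words. Below is a minimal NumPy sketch of this idea; the plain tanh cell and the parameter names (Wx, Wh, b) are illustrative assumptions, as the paper actually uses gated hidden units.

```python
import numpy as np

def rnn_pass(X, Wx, Wh, b):
    """Run a simple tanh RNN over a sequence X of shape (T, d_in);
    returns the hidden states, shape (T, d_h)."""
    h = np.zeros(Wh.shape[0])
    states = []
    for x_t in X:
        h = np.tanh(x_t @ Wx + h @ Wh + b)
        states.append(h)
    return np.stack(states)

def birnn_annotations(X, fwd_params, bwd_params):
    """Annotation h_j = [forward h_j ; backward h_j] for each source position j."""
    h_fwd = rnn_pass(X, *fwd_params)               # left-to-right pass
    h_bwd = rnn_pass(X[::-1], *bwd_params)[::-1]   # right-to-left pass, realigned
    return np.concatenate([h_fwd, h_bwd], axis=1)  # shape (T, 2 * d_h)

# Example: 6 source positions, 4-dim embeddings, 5-dim hidden states per direction
rng = np.random.default_rng(0)
fwd = (rng.normal(size=(4, 5)), rng.normal(size=(5, 5)), np.zeros(5))
bwd = (rng.normal(size=(4, 5)), rng.normal(size=(5, 5)), np.zeros(5))
H = birnn_annotations(rng.normal(size=(6, 4)), fwd, bwd)   # H.shape == (6, 10)
```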

3. Decoder: Attention Decoder

Decoder: Attention Decoder

3.1. Hidden State si
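In the paper, the decoder state at step i is computed from the previous state, the previously emitted word, and a step-specific context vector:

```latex
s_i = f(s_{i-1},\; y_{i-1},\; c_i)
```

where f is a gated hidden unit (similar to a GRU). Unlike the basic encoder–decoder, a distinct context c_i (Section 3.2) enters every decoding step.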

3.2. Context Vector ci
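The context vector c_i is a weighted sum of all encoder annotations h_j. The weights come from a small alignment model a (a feedforward network trained jointly with the rest of the system) that scores how well the inputs around source position j match the output at step i:

```latex
e_{ij} = a(s_{i-1}, h_j) = v_a^\top \tanh\left( W_a s_{i-1} + U_a h_j \right), \qquad
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k=1}^{T_x} \exp(e_{ik})}, \qquad
c_i = \sum_{j=1}^{T_x} \alpha_{ij} h_j
```

α_ij can be read as the probability that target word y_i is aligned to source word x_j, which is exactly what the alignment maps in Section 4.4 visualize.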

3.3. Target Word yi
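The target word is finally drawn from p(y_i | y_1, …, y_{i−1}, x) = g(y_{i−1}, s_i, c_i), where g in the paper is a maxout layer followed by a softmax. The sketch below strings Sections 3.1–3.3 together into one decoder step; it is a simplified NumPy illustration in which the parameter names (Wa, Ua, va, Ws, Wo) are assumptions, and the gated unit f and maxout output g are replaced by plain tanh and softmax layers.

```python
import numpy as np

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def decoder_step(s_prev, y_prev_emb, H, Wa, Ua, va, Ws, Wo):
    """One attention-decoder step; H holds the encoder annotations (T_x, 2*d_h)."""
    # 3.2: alignment scores e_ij = v_a^T tanh(W_a s_{i-1} + U_a h_j), normalized
    e = np.tanh(s_prev @ Wa + H @ Ua) @ va             # shape (T_x,)
    alpha = softmax(e)                                 # attention weights
    c = alpha @ H                                      # context vector c_i
    # 3.1: state update s_i = f(s_{i-1}, y_{i-1}, c_i), here a plain tanh layer
    s = np.tanh(np.concatenate([s_prev, y_prev_emb, c]) @ Ws)
    # 3.3: p(y_i | y_<i, x) = g(y_{i-1}, s_i, c_i), here a single softmax layer
    p_y = softmax(np.concatenate([y_prev_emb, s, c]) @ Wo)
    return s, c, alpha, p_y
```

Greedy decoding would pick argmax(p_y) as y_i and feed its embedding back in as y_prev_emb at the next step; alpha is the soft alignment plotted in Section 4.4.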

4. Experimental Results

4.1. Dataset
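All models are trained and evaluated on the bilingual, parallel corpora of the WMT'14 English-to-French translation task.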

4.2. Models
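Two model families are compared: the RNN Encoder–Decoder baseline (RNNencdec) and the proposed attention model (RNNsearch). The suffix -30 or -50 denotes the maximum sentence length of the data each model was trained on.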

4.3. BLEU Results

BLEU scores of the trained models computed on the test set (RNNsearch-50* was trained much longer)
The BLEU scores of the generated translations on the test set with respect to the lengths of the sentences

4.4. Qualitative Results

Four sample alignments found by RNNsearch-50. (a) An arbitrary sentence. (b–d) Three randomly selected samples among the sentences without any unknown words and of length between 10 and 20 words from the test set.

4.5. Long Sentences

Input Long Sentences
Output by RNNencdec-50
Output by RNNsearch-50

Reference

[2015 ICLR] [Attention Decoder/RNNSearch]
Neural Machine Translation by Jointly Learning to Align and Translate
