Review — context2vec: Learning Generic Context Embedding with Bidirectional LSTM

Using a Bidirectional LSTM Instead of Word2Vec's Context Averaging

A 2D illustration of context2vec’s embedded space and similarity metrics. Triangles and circles denote sentential context embeddings and target word embeddings, respectively.

Outline

1. CBOW in Word2Vec
2. Bidirectional LSTM in context2vec
3. Experimental Results

1. CBOW in Word2Vec

CBOW in Word2Vec
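
In CBOW, word2vec models the context of a target word as the simple average of the embeddings of the words in a fixed window around it, discarding word order. Below is a minimal sketch of that averaging; the vocabulary, embedding dimension, and window size are illustrative, not taken from the paper:

```python
import numpy as np

# CBOW-style context representation: average the embeddings of the
# words in a fixed window around the target (word order is lost).
# Vocabulary, dimensions, and window size are illustrative only.

rng = np.random.default_rng(0)
vocab = {"john": 0, "submitted": 1, "a": 2, "paper": 3}
emb_dim = 8
W = rng.normal(size=(len(vocab), emb_dim))  # context word embeddings

def cbow_context(token_ids, target_pos, window=2):
    """Average the embeddings of the words within `window` of the target."""
    ctx = [token_ids[i]
           for i in range(max(0, target_pos - window),
                          min(len(token_ids), target_pos + window + 1))
           if i != target_pos]
    return W[ctx].mean(axis=0)

sent = [vocab[w] for w in ["john", "submitted", "a", "paper"]]
print(cbow_context(sent, target_pos=1).shape)  # (8,)
```

This order-insensitive average is exactly what context2vec replaces with a learned, order-aware encoder.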

2. Bidirectional LSTM in context2vec

context2vec architecture
context2vec hyperparameters
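
In context2vec, one LSTM reads the words to the left of the target from left to right, a second LSTM reads the words to its right from right to left, and the two outputs at the target position are concatenated and passed through a multi-layer perceptron, yielding a context embedding that lives in the same space as the target word embeddings. Below is a minimal PyTorch sketch under assumed layer sizes; the class and variable names are mine for illustration, not the authors' code:

```python
import torch
import torch.nn as nn

class Context2VecEncoder(nn.Module):
    """Bidirectional-LSTM context encoder (illustrative sizes and names)."""

    def __init__(self, vocab_size, emb_dim=300, hidden=600):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.fwd = nn.LSTM(emb_dim, hidden, batch_first=True)  # left context, l-to-r
        self.bwd = nn.LSTM(emb_dim, hidden, batch_first=True)  # right context, r-to-l
        self.mlp = nn.Sequential(
            nn.Linear(2 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, emb_dim),
        )

    def forward(self, ids, target_pos):
        # ids: (1, seq_len) token ids; target_pos: index of the held-out target.
        # Sentence-boundary padding is omitted here, so the target is assumed
        # to have at least one word on each side.
        e = self.emb(ids)
        left, _ = self.fwd(e[:, :target_pos, :])               # words before the target
        right, _ = self.bwd(e[:, target_pos + 1:, :].flip(1))  # words after it, reversed
        ctx = torch.cat([left[:, -1, :], right[:, -1, :]], dim=-1)
        return self.mlp(ctx)  # context embedding, same size as word embeddings

enc = Context2VecEncoder(vocab_size=100)
ids = torch.tensor([[7, 23, 5, 61, 2]])
print(enc(ids, target_pos=2).shape)  # torch.Size([1, 300])
```

Training then mirrors word2vec: a negative-sampling objective is applied to the dot product between this context embedding and a separate target word embedding, which is what places contexts and targets in the shared space pictured above.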

3. Experimental Results

3.1. MSCC (Microsoft Sentence Completion Challenge) Corpus Development Set

Development set results (iters+ denotes the best model found when running more training iterations with α = 0.75)
Test set results (c2v denotes context2vec)

3.2. Others

Top-5 closest target words to a few given target words
Closest target words to various sentential contexts
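
Since target words and sentential contexts share one embedding space, tables like these are produced by nearest-neighbour queries: embed the query, then rank all target words by cosine similarity. A small sketch with random stand-in embeddings and an illustrative word list:

```python
import numpy as np

# Rank target words by cosine similarity to a query embedding
# (a context vector or another target word vector). The embeddings
# and word list below are random stand-ins, not trained vectors.

rng = np.random.default_rng(1)
words = ["nice", "pleasant", "lovely", "awful", "paper", "article"]
T = rng.normal(size=(len(words), 300))  # target word embeddings
query = rng.normal(size=300)            # embedded query (context or word)

def top_k(query, T, k=5):
    sims = T @ query / (np.linalg.norm(T, axis=1) * np.linalg.norm(query))
    return [(words[i], float(sims[i])) for i in np.argsort(-sims)[:k]]

for word, sim in top_k(query, T):
    print(f"{word}\t{sim:.3f}")
```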

Reference

[2016 CoNLL] [context2vec]
context2vec: Learning Generic Context Embedding with Bidirectional LSTM
