Review — GloVe: Global Vectors for Word Representation

Learning word representations from global corpus co-occurrence statistics, GloVe outperforms Word2Vec's CBOW on the word analogy task

(Image from https://unsplash.com/photos/bqzLehtF8XE)

Outline

1. The Statistics of Word Occurrences in a Corpus

(Table: Co-occurrence probabilities for target words ice and steam with selected context words, from a 6-billion-token corpus)
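
To make the probability-ratio idea behind that table concrete, here is a minimal Python sketch under toy assumptions (a handful of made-up sentences and a window size of 2, not the paper's 6-billion-token corpus): it counts symmetric-window co-occurrences and prints the ratio P(k|ice) / P(k|steam) for a few probe words.

```python
from collections import defaultdict

# Purely illustrative sketch (toy corpus, not the paper's 6-billion-token corpus):
# count symmetric-window co-occurrences X[w][k] and compare P(k|ice) / P(k|steam).
# As in the paper's table, the ratio is large for words related only to ice
# (here "solid"; infinite because the toy corpus is tiny), small for words
# related only to steam ("gas"), and near 1 for words that discriminate neither ("is").
# The sentences, window size and probe words below are assumptions.

corpus = [
    "ice is a solid and ice is cold".split(),
    "steam is a gas and steam is hot".split(),
    "ice and steam are both forms of water".split(),
]
window = 2

X = defaultdict(lambda: defaultdict(float))
for sentence in corpus:
    for i, w in enumerate(sentence):
        for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
            if j != i:
                X[w][sentence[j]] += 1.0

def cooc_prob(target, context):
    """P(context | target) = X[target][context] / sum_k X[target][k]."""
    total = sum(X[target].values())
    return X[target][context] / total if total else 0.0

for k in ["solid", "gas", "is"]:
    p_ice, p_steam = cooc_prob("ice", k), cooc_prob("steam", k)
    ratio = p_ice / p_steam if p_steam else float("inf")
    print(f"{k:>6}: P(k|ice)={p_ice:.3f}  P(k|steam)={p_steam:.3f}  ratio={ratio:.2f}")
```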

2. GloVe: Global Vectors
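
As a rough sketch of the objective this section reviews: GloVe fits word and context vectors plus biases so that the dot product plus biases approximates log X_ij, weighting each non-zero co-occurrence count with the paper's function f(x) = (x / x_max)^alpha (alpha = 0.75, x_max = 100). In the snippet below, the toy co-occurrence matrix, vector dimensionality, learning rate, and plain gradient descent (the paper trains with AdaGrad) are assumptions for illustration only.

```python
import numpy as np

# Rough sketch of the GloVe objective: a weighted least-squares fit of
# w_i . w~_j + b_i + b~_j to log X_ij over non-zero co-occurrence counts,
# with the paper's weighting f(x) = (x / x_max)^alpha for x < x_max, else 1.
# Toy matrix, sizes, learning rate and plain gradient descent are assumptions.

rng = np.random.default_rng(0)
V, d = 5, 3                                   # toy vocabulary and vector sizes
X = rng.integers(0, 20, size=(V, V)).astype(float)

W  = 0.01 * rng.standard_normal((V, d))       # target word vectors w_i
Wc = 0.01 * rng.standard_normal((V, d))       # context word vectors w~_j
b, bc = np.zeros(V), np.zeros(V)              # biases b_i and b~_j

def f(x, x_max=100.0, alpha=0.75):
    """GloVe weighting: down-weights rare pairs, caps frequent ones."""
    return (x / x_max) ** alpha if x < x_max else 1.0

lr = 0.05
for epoch in range(50):
    loss = 0.0
    for i in range(V):
        for j in range(V):
            if X[i, j] == 0:                  # only non-zero counts enter the sum
                continue
            diff = W[i] @ Wc[j] + b[i] + bc[j] - np.log(X[i, j])
            fij = f(X[i, j])
            loss += fij * diff ** 2
            g = 2.0 * fij * diff              # common scalar factor of the gradients
            dW, dWc = g * Wc[j], g * W[i]
            W[i], Wc[j] = W[i] - lr * dW, Wc[j] - lr * dWc
            b[i] -= lr * g
            bc[j] -= lr * g

print(f"final weighted least-squares loss: {loss:.4f}")
```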

3. Word Analogy Task Results

(Figures from the paper: accuracy (%) on the word analogy task, and vector substructure visualizations for man — woman, company — ceo, city — zip code, and comparative — superlative)
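
Analogy questions of this kind are typically scored with the vector-offset test: answer "a is to b as c is to ?" with the vocabulary word whose vector is closest, by cosine similarity, to w_b - w_a + w_c, excluding a, b, and c. Below is a minimal sketch of that procedure; the tiny hand-made vectors are illustrative stand-ins, not trained GloVe embeddings.

```python
import numpy as np

# Vector-offset analogy test: pick the word closest (cosine similarity)
# to emb[b] - emb[a] + emb[c], excluding the three query words.
# The embeddings below are hand-made toys, not trained GloVe vectors.

emb = {
    "man":   np.array([1.0, 0.0, 0.0]),
    "woman": np.array([1.0, 1.0, 0.0]),
    "king":  np.array([1.0, 0.0, 1.0]),
    "queen": np.array([1.0, 1.0, 1.0]),
    "water": np.array([0.2, 0.1, 0.05]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c):
    """Return the best d for 'a : b :: c : d'."""
    target = emb[b] - emb[a] + emb[c]
    candidates = {w: v for w, v in emb.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(target, candidates[w]))

print(analogy("man", "woman", "king"))   # expected: queen
```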

References

Jeffrey Pennington, Richard Socher, and Christopher D. Manning, "GloVe: Global Vectors for Word Representation," EMNLP 2014.
