
Resources


Papers on word embeddings

Tutorials

Chris McCormick: Word2Vec Tutorial - The Skip-Gram Model

This tutorial covers the skip-gram neural network architecture for Word2Vec. My intention with this tutorial was to skip over the usual introductory and abstract insights about Word2Vec, and get into more of the details. Specifically here I’m diving into the skip-gram neural network model.
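As a quick pointer to what the tutorial is about, here is a minimal sketch (not taken from the tutorial itself) of how skip-gram training pairs are formed: each center word is paired with the words inside a context window around it. The toy sentence and window size are illustrative assumptions.

```python
# Minimal illustration of skip-gram training pairs: each center word is
# paired with every word within `window` positions of it.
# The sentence and window size are toy assumptions for illustration only.
sentence = "the quick brown fox jumps over the lazy dog".split()
window = 2

pairs = []
for i, center in enumerate(sentence):
    for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
        if j != i:
            pairs.append((center, sentence[j]))

print(pairs[:6])
# [('the', 'quick'), ('the', 'brown'), ('quick', 'the'),
#  ('quick', 'brown'), ('quick', 'fox'), ('brown', 'the')]
```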

Chris McCormick: Word2Vec Resources

While researching Word2Vec, I came across a lot of different resources of varying usefulness, so I thought I’d share my collection of links and notes on what they contain.

Radim Rehurek: Deep learning with word2vec and gensim

In short, the spirit of word2vec fits gensim’s tagline of topic modelling for humans, but the actual code doesn’t, tight and beautiful as it is. I therefore decided to reimplement word2vec in gensim, starting with the hierarchical softmax skip-gram model, because that’s the one with the best reported accuracy.

Radim Rehurek: Word2vec Tutorial

I never got round to writing a tutorial on how to use word2vec in gensim. It’s simple enough and the API docs are straightforward, but I know some people prefer more verbose formats. Let this post be a tutorial and a reference example.
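For readers who want a quick reference alongside the tutorial, below is a minimal sketch of the gensim API it walks through. The toy corpus is an assumption, and parameter names have changed across gensim versions (for example, `vector_size` in gensim 4.x was `size` in the releases current when the post was written).

```python
# Minimal gensim word2vec sketch (toy corpus; gensim 4.x parameter names --
# older releases used `size` instead of `vector_size`).
from gensim.models import Word2Vec

sentences = [
    ["human", "interface", "computer"],
    ["survey", "user", "computer", "system", "response", "time"],
    ["graph", "minors", "survey"],
]

# sg=1 selects the skip-gram architecture, hs=1 hierarchical softmax --
# the variant Rehurek mentions implementing first.
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1, hs=1)

vector = model.wv["computer"]                        # learned embedding for a word
similar = model.wv.most_similar("computer", topn=3)  # nearest neighbours in the space
print(similar)
```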
