diff --git a/README.md b/README.md
index e943690..8694b5c 100644
--- a/README.md
+++ b/README.md
@@ -17,3 +17,24 @@
 curl http://127.0.0.1:5000/word2vec/model?word=restaurant
 ```
 Note: The "model" method returns a base64 encoding of the Word2Vec vector.
+
+* Where to get pretrained models
+
+If you do not have domain-specific data to train on, it can be convenient to use a pretrained model.
+Please feel free to submit additions to this list through a pull request.
+
+
+| Model file | Number of dimensions | Corpus (size) | Vocabulary size | Author | Architecture | Training algorithm | Context window (size) | Web page |
+| --- | --- | --- | --- | --- | --- | --- | --- | --- |
+| [Google News](GoogleNews-vectors-negative300.bin.gz) | 300 | Google News (100B) | 3M | Google | word2vec | negative sampling | BoW, ~5 | [http://code.google.com/p/word2vec/] |
+| [Freebase IDs](https://docs.google.com/file/d/0B7XkCwpI5KDYaDBDQm1tZGNDRHc/edit?usp=sharing) | 1000 | Google News (100B) | 1.4M | Google | word2vec, skip-gram | ? | BoW, ~10 | [http://code.google.com/p/word2vec/] |
+| [Freebase names](https://docs.google.com/file/d/0B7XkCwpI5KDYeFdmcVltWkhtbmM/edit?usp=sharing) | 1000 | Google News (100B) | 1.4M | Google | word2vec, skip-gram | ? | BoW, ~10 | [http://code.google.com/p/word2vec/] |
+| [Wikipedia+Gigaword 5](http://www-nlp.stanford.edu/data/glove.6B.50d.txt.gz) | 50 | Wikipedia+Gigaword 5 (6B) | 400,000 | GloVe | GloVe | AdaGrad | 10+10 | [http://nlp.stanford.edu/projects/glove/] |
+| [Wikipedia+Gigaword 5](http://www-nlp.stanford.edu/data/glove.6B.100d.txt.gz) | 100 | Wikipedia+Gigaword 5 (6B) | 400,000 | GloVe | GloVe | AdaGrad | 10+10 | [http://nlp.stanford.edu/projects/glove/] |
+| [Wikipedia+Gigaword 5](http://www-nlp.stanford.edu/data/glove.6B.200d.txt.gz) | 200 | Wikipedia+Gigaword 5 (6B) | 400,000 | GloVe | GloVe | AdaGrad | 10+10 | [http://nlp.stanford.edu/projects/glove/] |
+| [Wikipedia+Gigaword 5](http://www-nlp.stanford.edu/data/glove.6B.300d.txt.gz) | 300 | Wikipedia+Gigaword 5 (6B) | 400,000 | GloVe | GloVe | AdaGrad | 10+10 | [http://nlp.stanford.edu/projects/glove/] |
+| [Common Crawl 42B](http://www-nlp.stanford.edu/data/glove.42B.300d.txt.gz) | 300 | Common Crawl (42B) | ~2M | GloVe | GloVe | AdaGrad | ? | [http://nlp.stanford.edu/projects/glove/] |
+| [Twitter (2B Tweets)](http://www-nlp.stanford.edu/data/glove.twitter.27B.25d.txt.gz) | 25 | Twitter (27B) | ? | GloVe | GloVe | AdaGrad | ? | [http://nlp.stanford.edu/projects/glove/] |
+| [Twitter (2B Tweets)](http://www-nlp.stanford.edu/data/glove.twitter.27B.50d.txt.gz) | 50 | Twitter (27B) | ? | GloVe | GloVe | AdaGrad | ? | [http://nlp.stanford.edu/projects/glove/] |
+| [Twitter (2B Tweets)](http://www-nlp.stanford.edu/data/glove.twitter.27B.100d.txt.gz) | 100 | Twitter (27B) | ? | GloVe | GloVe | AdaGrad | ? | [http://nlp.stanford.edu/projects/glove/] |
+| [Twitter (2B Tweets)](http://www-nlp.stanford.edu/data/glove.twitter.27B.200d.txt.gz) | 200 | Twitter (27B) | ? | GloVe | GloVe | AdaGrad | ? | [http://nlp.stanford.edu/projects/glove/] |
+| [Wikipedia dependency](http://u.cs.biu.ac.il/~yogo/data/syntemb/deps.words.bz2) | 300 | Wikipedia (?) | 174,015 | Levy & Goldberg | word2vec modified | word2vec | syntactic dependencies | [https://levyomer.wordpress.com/2014/04/25/dependency-based-word-embeddings/] |
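+
+As a rough sketch of how the files above can be loaded (using gensim purely as an example library, and assuming a reasonably recent gensim release), the Google News vectors ship in the binary word2vec format, while the GloVe files use a headerless text format that needs a one-off conversion:
+
+```python
+# Illustrative only: the file names are the ones listed in the table above,
+# downloaded and (for GloVe) gunzipped into the working directory.
+from gensim.models import KeyedVectors
+from gensim.scripts.glove2word2vec import glove2word2vec
+
+# Google News vectors are in the binary word2vec format.
+google = KeyedVectors.load_word2vec_format(
+    "GoogleNews-vectors-negative300.bin.gz", binary=True
+)
+print(google.most_similar("restaurant", topn=3))
+
+# GloVe files lack the word2vec header; convert them before loading.
+glove2word2vec("glove.6B.300d.txt", "glove.6B.300d.w2v.txt")
+glove = KeyedVectors.load_word2vec_format("glove.6B.300d.w2v.txt", binary=False)
+```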
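+
+Similarly, the base64 payload returned by the "model" call shown at the top of this file can be decoded client-side. This is only a sketch under assumptions not stated here (that the service listens on 127.0.0.1:5000 as in the curl example, and that the payload is the raw little-endian float32 bytes of the vector):
+
+```python
+import base64
+
+import numpy as np
+import requests
+
+# Hypothetical client for the /word2vec/model endpoint; the host, port and
+# float32 payload layout are assumptions, not documented behaviour.
+resp = requests.get(
+    "http://127.0.0.1:5000/word2vec/model", params={"word": "restaurant"}
+)
+resp.raise_for_status()
+
+vector = np.frombuffer(base64.b64decode(resp.text), dtype=np.float32)
+print(vector.shape)
+```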