  1. How to fetch vectors for a word list with Word2Vec?

    Jul 15, 2015 · I want to create a text file that is essentially a dictionary, with each word being paired with its vector representation through word2vec. I'm assuming the process would be to first train word2vec...

  2. How to use word2vec to calculate the similarity distance by giving 2 ...

    Word2vec is an open source tool from Google for computing distances between words. Given an input word, it outputs a list of words ranked by similarity.

  3. What is the concept of negative-sampling in word2vec? [closed]

    The terminology is borrowed from classification, a common application of neural networks. There you have a bunch of positive and negative examples. With word2vec, for any given word you have a list …

  4. SpaCy: how to load Google news word2vec vectors?

    Asked 8 years, 11 months ago · Modified 6 years, 8 months ago · Viewed 21k times

  5. Classic king - man + woman = queen example with pretrained word ...

    Dec 12, 2022 · I am really desperate, I just cannot reproduce the allegedly classic example of king - man + woman = queen with the word2vec package in R and any (!) pre-trained embedding model (as a …

  6. How to get vector for a sentence from the word2vec of tokens in ...

    Apr 21, 2015 · It is possible, but not from word2vec. The composition of word vectors in order to obtain higher-level representations for sentences (and further for paragraphs and documents) is a really …

  7. How to load a pre-trained Word2vec MODEL File and reuse it?

    Nov 29, 2017 · import gensim # Load pre-trained Word2Vec model. model = gensim.models.Word2Vec.load("modelName.model") Now you can train the model as usual. Also, if …

  8. What's the major difference between glove and word2vec?

    May 10, 2019 · What is the difference between word2vec and GloVe? Are both ways to train a word embedding? If yes, then how can we use both?

  9. What is the ideal "size" of the vector for each word in Word2Vec?

    Jun 21, 2022 · model = gensim.models.Word2Vec.load("w2model.trained") vec = [] finalvecs = [] # tokens is a list of over 1 million rows for token in tokens: for word in token: …

  10. What are the differences between contextual embedding and word ...

    Jun 8, 2020 · Word embeddings and contextual embeddings are slightly different. While both word embeddings and contextual embeddings are obtained from the models using unsupervised learning, …