
Added CBOW method to Torch file #14

Closed
wants to merge 1 commit into from

Conversation

@us (Contributor) commented Feb 20, 2019

Word2Vec has two methods, skip-gram and CBOW (Continuous Bag of Words), and I added the CBOW method.

@us (Contributor, Author) commented Feb 20, 2019

I also changed the file name to Word2Vec-Torch-SkipGram(Softmax).py.

@graykode (Owner)

@us Could you tell me how this differs from my skip-gram code?

@graykode (Owner)

@us OK, I will change my file name, and then I will wait for the CBOW code! Thanks.

@graykode (Owner)

@us See my commit 6a2a47a.
I will close this PR, thanks!

@graykode graykode closed this Feb 20, 2019
@us (Contributor, Author) commented Feb 20, 2019

I just swapped the positions of w and target. CBOW learns to predict a word from its context, whereas the skip-gram model is designed to predict the context from the word.

Here is what I changed:

for w in context:
    skip_grams.append([w, target])

https://stackoverflow.com/questions/38287772/cbow-v-s-skip-gram-why-invert-context-and-target-words
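The swap described above can be sketched as follows. This is a minimal illustration, not the repository's actual code; the sentence, window size, and variable names are made up for the example. The only difference between the two pair lists is the order of the input and the label:

```python
# Contrast how training pairs are built for skip-gram vs. CBOW.
# (Illustrative sketch; not the code from this PR.)

sentence = ["the", "quick", "brown", "fox", "jumps"]
window = 1  # number of context words taken on each side

skip_grams = []  # [center, context]: predict context from the center word
cbow_pairs = []  # [context, center]: predict the center word from context

for i, target in enumerate(sentence):
    # collect context words within the window around position i
    context = sentence[max(0, i - window):i] + sentence[i + 1:i + 1 + window]
    for w in context:
        skip_grams.append([target, w])  # skip-gram: input = target, label = w
        cbow_pairs.append([w, target])  # CBOW: input = w, label = target

print(skip_grams[:2])  # [['the', 'quick'], ['quick', 'the']]
print(cbow_pairs[:2])  # [['quick', 'the'], ['the', 'quick']]
```

As the linked Stack Overflow answer notes, inverting each pair is exactly what turns a skip-gram dataset into a CBOW-style one, which is why the PR's change amounts to writing `[w, target]` instead of `[target, w]`.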

TC-zerol pushed a commit to TC-zerol/nlp-tutorial-cn that referenced this pull request Aug 31, 2019