Backward rnnlm #2436

Merged: 5 commits into kaldi-asr:master on May 24, 2018

Conversation

hainan-xv (Contributor):

lattice rescoring with backward RNNLMs

@@ -0,0 +1,137 @@
#!/bin/bash

Contributor:

thanks! please make it a soft link to tuning/ in case we later want to update it.

# Lattice rescoring
rnnlm/lmrescore_back.sh \
--cmd "$decode_cmd --mem 4G" \
--weight 0.5 --max-ngram-order $ngram_order \
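
For context, a minimal sketch of how this step might sit in the overall rescoring pipeline, assuming rnnlm/lmrescore_back.sh takes the same positional arguments as rnnlm/lmrescore.sh (old lang dir, RNNLM dir, data dir, input lattice dir, output lattice dir). All directory names below are hypothetical placeholders, not taken from this PR:

# Hedged sketch, not code from this PR: forward-pass RNNLM rescoring followed
# by the backward pass added here. Directory names are placeholders, and the
# positional arguments of lmrescore_back.sh are assumed to mirror rnnlm/lmrescore.sh.
# $decode_cmd is assumed to be set by cmd.sh, as in the Kaldi egs setups.
ngram_order=4   # approximation order used when rescoring the lattices

# forward RNNLM rescoring of the first-pass lattices
rnnlm/lmrescore.sh \
  --cmd "$decode_cmd --mem 4G" \
  --weight 0.5 --max-ngram-order $ngram_order \
  data/lang exp/rnnlm data/test exp/tri5/decode_test exp/tri5/decode_test_fwd

# backward RNNLM rescoring on top of the forward-rescored lattices
rnnlm/lmrescore_back.sh \
  --cmd "$decode_cmd --mem 4G" \
  --weight 0.5 --max-ngram-order $ngram_order \
  data/lang exp/rnnlm_backward data/test exp/tri5/decode_test_fwd exp/tri5/decode_test_fwd_bwd
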

Contributor:

If you are using a weight of 0.5 for both the forward and backward passes, I'd be surprised if this weight were optimal, because effectively the weight on the n-gram LM would be zero.


Contributor (Author):

I tuned this a bit and it seems 0.5 is the best weight.

I found another small issue in the code which I should fix before merging this PR. Do you think we should just leave 0.5 as it is now?


Contributor:

Did you try using 0.4 for both forward and backward? It's very surprising if it doesn't want any weight at all on the n-gram LM -- that always helps, as they are so complementary.


Contributor (Author):

Yes. 0.4/0.4/0.2 (forward/backward/n-gram) is also worse than 0.5/0.5/0.
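
To make the three-way split concrete: the numbers are read as forward-RNNLM / backward-RNNLM / n-gram interpolation weights, and they sum to one, so whatever the two RNNLM weights leave over goes to the original n-gram LM. A minimal sketch of that arithmetic (not code from the PR):

# Hedged sketch of the weight arithmetic discussed above, not code from this PR.
# With forward and backward RNNLM weights of 0.5 each, the residual weight on
# the original n-gram LM is 1 - 0.5 - 0.5 = 0, i.e. its score is dropped entirely.
w_fwd=0.5
w_bwd=0.5
w_ngram=$(awk -v f="$w_fwd" -v b="$w_bwd" 'BEGIN { printf "%.1f", 1 - f - b }')
echo "forward=$w_fwd backward=$w_bwd ngram=$w_ngram"   # -> forward=0.5 backward=0.5 ngram=0.0
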

@danpovey danpovey merged commit b1ae952 into kaldi-asr:master May 24, 2018
@hainan-xv hainan-xv deleted the backward_rnnlm branch May 25, 2018 21:59
dpriver pushed a commit to dpriver/kaldi that referenced this pull request Sep 13, 2018
Skaiste pushed a commit to Skaiste/idlak that referenced this pull request Sep 26, 2018