When I try to evaluate my model's text generation using the perplexity metric, the batch_size parameter in perplexity._compute(..) is not sufficient, because the implementation tokenizes the entire set of predictions and moves it to the GPU at once. A simple change that moves tokenization into each batch fixes the issue for me.
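The fix can be sketched as a loop that encodes only one slice of the predictions at a time. The names below (`batched_perplexity`, `tokenize`, `score_batch`) are illustrative stand-ins for the tokenizer and model calls inside `_compute`, not the library's actual API:

```python
import math

def batched_perplexity(predictions, tokenize, score_batch, batch_size=16):
    """Mean perplexity over `predictions`, tokenizing one batch at a time.

    `tokenize(batch)` encodes a list of strings (in the real metric it would
    also move the resulting tensors to the GPU); `score_batch(enc)` returns
    the mean cross-entropy loss for that encoded batch.
    """
    losses = []
    for start in range(0, len(predictions), batch_size):
        batch = predictions[start:start + batch_size]
        # Encode only this slice instead of the full prediction set up front,
        # so device memory holds at most one batch of token tensors.
        enc = tokenize(batch)
        losses.append(score_batch(enc))
    return math.exp(sum(losses) / len(losses))
```

As a simplification, this averages per-batch losses (which slightly weights a short final batch); the real metric averages per-sample perplexities, but the memory-saving structure is the same.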
It should also be possible to pass my own model and tokenizer to the metric, since my model cannot be published on the Hugging Face Hub. I have made these changes to enable my experiments.
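One way to support caller-supplied objects is an optional-argument fallback. The helper name and the `model`/`tokenizer` keyword arguments below are assumptions for illustration, not the metric's current signature:

```python
def resolve_model_and_tokenizer(model_id=None, model=None, tokenizer=None):
    """Return a (model, tokenizer) pair for the metric to use.

    If the caller passes pre-loaded objects, use them directly (e.g. for a
    private model that cannot be pushed to the Hub); otherwise fall back to
    loading from the Hub by `model_id`, as the metric does today.
    """
    if model is not None and tokenizer is not None:
        return model, tokenizer
    # Fallback path: load by Hub id (hypothetical placement of the existing
    # loading logic; imports deferred so the fast path needs no transformers).
    from transformers import AutoModelForCausalLM, AutoTokenizer
    return (AutoModelForCausalLM.from_pretrained(model_id),
            AutoTokenizer.from_pretrained(model_id))
```

With this in place, `_compute` could call the helper once at the top and keep the rest of its logic unchanged for both paths.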
If this sounds good to you, I can open a PR with these changes. I believe this will benefit the developer community.
ChengSashankh changed the title to "Perplexity metric does not apply batching correctly to tokenization" on Apr 14, 2024.