
Support generation config in ORTModel #651

Merged (5 commits) on Jan 25, 2023

Conversation

fxmarty (Contributor) commented on Dec 29, 2022

This PR adds support for generation config in ORTModel, following huggingface/transformers#20388

Note: we should really add nightly tests tracking on transformers/diffusers main.
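The pattern this PR follows, carrying a generation config on the model and gating `generate()` support behind a `can_generate()` method, can be sketched as below. This is a simplified, self-contained mock for illustration, not the actual optimum or transformers implementation; the class names mirror the ones mentioned in the PR, but all attributes and defaults here are assumptions.

```python
# Hypothetical sketch of the generation-config pattern (not the real
# optimum code): a model holds a GenerationConfig, and can_generate()
# tells callers whether generate() is supported.
from dataclasses import dataclass


@dataclass
class GenerationConfig:
    # A few representative generation defaults (assumed values).
    max_length: int = 20
    do_sample: bool = False
    num_beams: int = 1


class ORTModelMock:
    """Stand-in base class: no generate() support by default."""

    def __init__(self):
        self.generation_config = None

    def can_generate(self) -> bool:
        return False


class ORTModelForConditionalGenerationMock(ORTModelMock):
    """Stand-in for a conditional-generation model: generate() is supported."""

    def __init__(self, generation_config=None):
        super().__init__()
        # Fall back to task defaults when no config is provided.
        self.generation_config = generation_config or GenerationConfig()

    def can_generate(self) -> bool:
        return True


model = ORTModelForConditionalGenerationMock()
print(model.can_generate())               # True
print(model.generation_config.num_beams)  # 1
```

A caller can then check `can_generate()` before invoking generation, instead of relying on the class name or catching an attribute error.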

HuggingFaceDocBuilderDev commented on Dec 29, 2022

The documentation is not available anymore as the PR was closed or merged.

regisss (Contributor) commented on Dec 29, 2022

Big +1 for nightly tests aligned with Transformers/Diffusers main branches!

So this PR cannot be merged before the next release of Transformers, right?

fxmarty (Contributor, Author) commented on Dec 29, 2022

Yes, I think we will need to wait for the next release of transformers to merge!

regisss (Contributor) left a review comment

LGTM!

fxmarty (Contributor, Author) commented on Jan 25, 2023

I'll merge despite the red CI for exporters; it is a minor regression tracked in #721.

fxmarty merged commit 50e87a6 into huggingface:main on Jan 25, 2023
fxmarty added a commit that referenced this pull request Jan 25, 2023
* support generation config

* add can_generate method in ORTModelForConditionalGeneration

* trigger actions

* fix typog

* rollback
4 participants