Fix bug with saving model optimized by inference session #16716

Merged

Conversation

kunal-vaishnavi
Contributor

@kunal-vaishnavi kunal-vaishnavi commented Jul 14, 2023

Description

A previous PR (#16531) added a temporary directory for saving the model optimizations after loading a model into an InferenceSession. Many models that have an external data file, however, require the data file to be in the same directory as the ONNX model file. Because the optimized model is saved in the temporary directory while the external data remains in a different directory, loading the model from the temporary directory fails with a FileNotFoundError.

This PR fixes the error by saving the external data file in the same directory as the optimized model.

Motivation and Context

This PR fixes a bug with using a temporary directory while running the optimizer for models that have an external data file.
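
For context, here is a minimal sketch (not code from this PR) of the workflow the fix affects, using the standard onnxruntime Python API; the file names `model.onnx`, `model.onnx.data`, and `model_optimized.onnx` are hypothetical:

```python
import onnxruntime as ort

# Hypothetical file layout: model.onnx references weights stored in an
# external data file (e.g. model.onnx.data) in the same directory.
sess_options = ort.SessionOptions()
sess_options.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_ALL

# Ask ONNX Runtime to serialize the optimized graph to disk. Before this
# fix, the optimized model and its external data could end up in different
# directories, so reloading the optimized model raised FileNotFoundError.
sess_options.optimized_model_filepath = "model_optimized.onnx"

session = ort.InferenceSession("model.onnx", sess_options)
```

With the fix, the external data written for the optimized model is placed in the same directory as the optimized model, so the saved model can be reloaded.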

@kunal-vaishnavi kunal-vaishnavi changed the title Set use of temporary directory for optimizer to optional Fix bug with using temporary directory in optimizer Jul 19, 2023
@kunal-vaishnavi kunal-vaishnavi changed the title Fix bug with using temporary directory in optimizer Fix bug with saving model optimized by inference session Jul 19, 2023
@kunal-vaishnavi kunal-vaishnavi merged commit b7176f9 into microsoft:main Jul 21, 2023
88 of 90 checks passed
kunal-vaishnavi added a commit that referenced this pull request Jul 31, 2023
### Description
This PR adds support for saving model optimizations after loading a
model that contains external data into an `InferenceSession`.

### Motivation and Context
This PR is a follow-up to a [previous
PR](#16716) for saving a
model optimized by an `InferenceSession`.
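
As background (not part of either PR), a model with external data can be produced with the standard `onnx` Python API as sketched below; the file names are hypothetical. The `location` is resolved relative to the `.onnx` file, which is why the data file must sit next to the model when it is loaded:

```python
import onnx

# Hypothetical example of producing a model whose weights live in an
# external data file next to the .onnx file.
model = onnx.load("model.onnx")
onnx.save_model(
    model,
    "model_with_data.onnx",
    save_as_external_data=True,
    all_tensors_to_one_file=True,
    location="model_with_data.onnx.data",  # resolved relative to the model file
    size_threshold=1024,
)
```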
jchen351 pushed a commit that referenced this pull request Aug 12, 2023
jchen351 pushed a commit that referenced this pull request Aug 12, 2023
guotuofeng added a commit to microsoft/Olive that referenced this pull request Aug 25, 2023
## Describe your changes
In onnxruntime, the following PRs make `optimize_model` save external
data in the temporary folder:
  * microsoft/onnxruntime#16531
  * microsoft/onnxruntime#16716
  * microsoft/onnxruntime#16912

So from 1.16.0 onwards, we no longer need to copy the model to a
temporary directory.
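
A rough sketch of such a version gate (illustrative only, not Olive's actual code; assumes the `packaging` package is available):

```python
from packaging import version

import onnxruntime as ort

# Illustrative only: fall back to copying the model into a temporary
# directory only on onnxruntime versions that predate the fixes above.
needs_temp_dir_copy = version.parse(ort.__version__) < version.parse("1.16.0")
```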

## Checklist before requesting a review
- [ ] Add unit tests for this change.
- [ ] Make sure all tests can pass.
- [ ] Update documents if necessary.
- [ ] Format your code by running `pre-commit run --all-files`
- [ ] Is this a user-facing change? If yes, give a description of this
change to be included in the release notes.

## (Optional) Issue link
kleiti pushed a commit to kleiti/onnxruntime that referenced this pull request Mar 22, 2024