TensorFlow Lite model deployment results #611
👋 Hello @jamesleech89, thank you for raising an issue about Ultralytics HUB 🚀! Please visit our HUB Docs to learn more:
If this is a 🐛 Bug Report, please provide screenshots and steps to reproduce your problem to help us get started working on a fix. If this is a ❓ Question, please provide as much information as possible, including dataset, model, and environment details, so that we can provide the most helpful response. We try to respond to all issues as promptly as possible. Thank you for your patience!
@jamesleech89 hello! It sounds like you're seeing discrepancies between your model's predictions in TensorFlow Lite format and its performance via the Ultralytics API and preview tab. This can happen for several reasons, most commonly differences in preprocessing, postprocessing, or the export settings used.
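As an illustration of where postprocessing can diverge: a minimal sketch of decoding a YOLOv5-style raw detection tensor, assuming an output of shape `(N, 5 + num_classes)` with normalised `xywh` coordinates and a 640×640 input. The `decode` helper and its parameters are hypothetical, not the exact Ultralytics pipeline; if your own filtering differs from the combined `objectness × class-score` confidence used here, box counts will differ too.

```python
import numpy as np

def decode(pred, conf_thres=0.25, img_size=640):
    """Filter raw YOLOv5-style rows (x, y, w, h, obj, cls...) by confidence."""
    boxes_xywh = pred[:, :4] * img_size   # TFLite exports often emit normalised coords
    obj = pred[:, 4]
    cls_scores = pred[:, 5:]
    conf = obj * cls_scores.max(1)        # per-row best-class confidence
    cls = cls_scores.argmax(1)
    m = conf >= conf_thres
    x, y, w, h = boxes_xywh[m].T
    # convert centre-width-height to corner (xyxy) format
    xyxy = np.stack([x - w / 2, y - h / 2, x + w / 2, y + h / 2], 1)
    return xyxy, conf[m], cls[m]

# Two hypothetical rows for a 2-class model: one confident, one below threshold
pred = np.array([
    [0.5, 0.5, 0.2, 0.2, 0.9, 0.8, 0.1],
    [0.5, 0.5, 0.2, 0.2, 0.1, 0.5, 0.5],
], dtype=np.float32)
xyxy, conf, cls = decode(pred)
```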
For a detailed guide on exporting models and ensuring consistency across different platforms, please refer to the Ultralytics HUB Docs. If the issue persists, consider providing more details about the preprocessing and postprocessing steps, along with any specific settings used during the TensorFlow Lite model export. This will help in diagnosing the issue more accurately. 😊
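The preprocessing step is a frequent source of mismatch. Below is a numpy-only sketch of YOLOv5-style letterbox preprocessing, assuming a 640×640 float32 TFLite input; the nearest-neighbour resize stands in for the interpolation Ultralytics actually uses, so treat this as a reference for the overall shape of the step, not a drop-in replacement.

```python
import numpy as np

def letterbox(img, new_size=640, pad_value=114):
    """Resize keeping aspect ratio, then pad to a square canvas.

    Mismatches in this step (stretching instead of padding, wrong pad
    colour, BGR vs RGB) are a common cause of TFLite predictions that
    differ from the hosted model.
    """
    h, w = img.shape[:2]
    scale = min(new_size / h, new_size / w)
    nh, nw = round(h * scale), round(w * scale)
    # nearest-neighbour resize via index sampling (stand-in for real interpolation)
    ys = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    xs = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    resized = img[ys[:, None], xs]
    canvas = np.full((new_size, new_size, img.shape[2]), pad_value, dtype=img.dtype)
    top, left = (new_size - nh) // 2, (new_size - nw) // 2
    canvas[top:top + nh, left:left + nw] = resized
    return canvas, scale, (top, left)

# Typical float32 TFLite input: letterboxed, normalised to [0, 1], batched
img = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
canvas, scale, (top, left) = letterbox(img)
inp = canvas.astype(np.float32)[None] / 255.0
```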
👋 Hello there! We wanted to give you a friendly reminder that this issue has not had any recent activity and may be closed soon, but don't worry - you can always reopen it if needed. If you still have any questions or concerns, please feel free to let us know how we can help. For additional resources and information, please see the links below:
Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed! Thank you for your contributions to YOLO 🚀 and Vision AI ⭐
Search before asking
Question
I have a new YOLOv5 object detection model trained on some new images with two classes. The model performs well in the Ultralytics preview tab and when I call the API for the model. I exported a version of the model in TensorFlow Lite format and applied it to some of the same images that I passed to the preview tab and through the API. Using the same confidence threshold and applying non-maximum suppression with the same IoU threshold gives me different results from those I get through the API and the preview tab. Even if I remove the non-maximum suppression step that I apply on top of the TensorFlow Lite model, I can clearly see that the predictions are different (i.e. additional boxes are predicted for classes that are not predicted at all through the API/preview tab).
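For reference, the NMS I apply on top of the TFLite output is essentially the following kind of greedy suppression (a simplified, class-agnostic sketch; the `nms` helper here is illustrative, not my exact code):

```python
import numpy as np

def nms(boxes, scores, iou_thres=0.45):
    """Greedy non-maximum suppression on xyxy boxes; returns kept indices.

    Note: to mimic per-class NMS, offset each box by class_id * a large
    constant before calling this, so boxes of different classes never
    suppress each other.
    """
    order = scores.argsort()[::-1]          # highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        if rest.size == 0:
            break
        # IoU of the kept box against all remaining candidates
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        order = rest[iou <= iou_thres]      # drop heavily overlapping boxes
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [20, 20, 30, 30]], dtype=np.float32)
scores = np.array([0.9, 0.8, 0.7], dtype=np.float32)
kept = nms(boxes, scores)
```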
Additional
No response