Exporting inference result in real time #434

Closed
1 task done
rumjie opened this issue Oct 11, 2023 · 3 comments
Labels
app (Issue related to Ultralytics HUB App), question (A HUB question that does not involve a bug)

Comments


rumjie commented Oct 11, 2023

Search before asking

Question

Hi, I finally succeeded in training my model and deploying it on the app.

My question is: is it possible to export the inference results from the app, for example as a .txt file?
Or are you planning to add an option for saving detection results?
Since saving every result at 30 Hz would require quite a lot of storage, it would be nice to let the user set how often results are saved.

Additional

No response

rumjie added the question label on Oct 11, 2023
kalenmike added the app label on Oct 11, 2023
@UltralyticsAssistant
Member

@rumjie hello,

Congratulations on successfully training your model and deploying it on the app!

As far as your question is concerned, yes, it is entirely possible to export the results of inference on the app, such as to a .txt file. However, the functionality to do this would need to be implemented within the app, which is outside the scope of the Ultralytics HUB.

The idea of allowing users to control how often results are saved is a good one. It would require an additional setting or control mechanism in the app, which would let you manage the storage consumed by a continuous stream of results.
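
In the meantime, if you run inference with the Python API rather than the app, here is a minimal sketch of the kind of throttled export you describe (assuming the ultralytics pip package; the weights and video file names are placeholders):

```python
# Rough sketch, not an app/HUB feature: save detections to .txt only
# every Nth frame to keep storage manageable at ~30 FPS input.
from ultralytics import YOLO

SAVE_EVERY = 30  # roughly one saved result per second at 30 Hz
model = YOLO("yolov8n.pt")  # placeholder weights; use your trained model

# stream=True yields one Results object per frame as a generator.
for i, result in enumerate(model.predict(source="camera.mp4", stream=True)):
    if i % SAVE_EVERY == 0:
        # Results.save_txt writes one normalized "class x y w h [conf]" line per box.
        result.save_txt(f"detections_{i:06d}.txt", save_conf=True)
```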

Rest assured, the Ultralytics team is constantly working on improvements and new features, and we appreciate community feedback like yours. While I can't provide a timeline for when or if this specific feature will be added, your feedback is invaluable and will certainly be considered during future development.

Remember, you're part of the open-source YOLO and Ultralytics community. Your contributions, suggestions, and questions help make these tools even better.

Thank you for using Ultralytics HUB.


rumjie commented Oct 11, 2023

@UltralyticsAssistant
I appreciate all the explanation and support your team gives to users.

What I understood is that, since Ultralytics HUB uses a cloud API and cloud storage, and the dataset, training process, and model can all be used via the cloud and applied in the mobile app (because everything belongs to my account, I suppose?), it seems like the results inferred by the model could also be extracted and saved to cloud storage. I just want to know whether something like this is planned for future development or not yet considered.
This might come from my lack of understanding or confusion between the app and HUB, so if anything in my reasoning is wrong, please let me know.

Anyway, FYI, the background of my question is this: the YOLO model excels at fast inference, so it can be applied to real-time inference, even on video. We can do that with Python code, yes, but Ultralytics HUB makes it possible without code, and by installing the app on an appropriate OS or device we can apply it to a specific task. A sketch of the code-based workflow I mean is below.
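
For reference, this is just a sketch of that code-based path (assuming the ultralytics pip package and a sample video; file names are placeholders):

```python
# Sketch of code-based real-time inference on a video, for comparison
# with the no-code app workflow.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # placeholder weights

# stream=True processes the video frame by frame instead of all at once,
# which is what makes long real-time runs practical.
for result in model.predict(source="traffic.mp4", stream=True):
    print(f"{len(result.boxes)} objects detected in this frame")
```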

@UltralyticsAssistant
Member

Hello again @rumjie,

You clearly have understood the principle of cloud APIs and storage. Your assumption that all the data, processes, and models related to your account can be accessed and used from a mobile app is correct.

Regarding extracting inference results and saving them to cloud storage, it is technically possible. Still, such functionality is currently outside the scope of features directly offered by the Ultralytics HUB. There are no immediate plans to develop this feature, but your suggestion is valuable and we keep track of all user feedback for future developments.

As for the YOLO model and real-time inference, you're spot on. The YOLO model is very well-suited for fast, real-time inference and this is a strength that we try to leverage fully within the Ultralytics HUB and our various implementations. You can indeed apply it in a variety of tasks, on different devices with various operating systems, all without the necessity to write code.

Please remember, however, that the Ultralytics HUB primarily deals with model training, validation, and development, rather than application implementation. The specifics of how the model is used, including real-time inference and result storage, usually fall to the users of the model or the developers of applications that incorporate it.
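
As an illustration of what that user-side code can look like, here is a small sketch (again assuming the ultralytics Python package; weights, paths, and file names are placeholders) that pulls the raw detection data out of each result so it can be written to whatever storage you choose:

```python
# Illustrative only, not an Ultralytics HUB feature: collect raw detection
# data so it can be sent to any storage backend (file, database, cloud bucket).
import json

from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # placeholder weights

records = []
for frame_idx, result in enumerate(model.predict(source="clip.mp4", stream=True)):
    for box in result.boxes:
        records.append({
            "frame": frame_idx,
            "class_id": int(box.cls),        # predicted class index
            "confidence": float(box.conf),   # detection confidence
            "xyxy": box.xyxy[0].tolist(),    # pixel-space corner coordinates
        })

# `records` is plain Python data, ready for json.dump or an upload API.
with open("detections.json", "w") as f:
    json.dump(records, f)
```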

I hope that this clarifies your point. Feel free to ask if you have further questions. You're contributing a lot to the Ultralytics and YOLO community with your keen questions, so keep them coming! Thank you!

@rumjie rumjie closed this as completed Oct 13, 2023