A fast, easy-to-use, production-ready inference server for computer vision supporting deployment of many popular model architectures and fine-tuned models.
🌎 🚙📚 Predicting travel times and traffic density on a highway in Slovenia
ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
Java version of LangChain
Burn is a comprehensive dynamic deep learning framework built in Rust, with flexibility, compute efficiency, and portability as its primary goals.
Speech-to-text, text-to-speech, and speaker recognition using next-gen Kaldi with onnxruntime, without an Internet connection. Supports embedded systems, Android, iOS, Raspberry Pi, RISC-V, x86_64 servers, websocket server/client, C/C++, Python, Kotlin, C#, Go, NodeJS, Java, Swift, Dart, JavaScript, Flutter
Tiny, no-nonsense, self-contained TensorFlow and ONNX inference
ncnn is a high-performance neural network inference framework optimized for the mobile platform
🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
ONNX Graph ToolBox - Operate on your ONNX model with ease, visualize ONNX LLM models containing thousands of nodes.
World's first open source real-time translation app for Android.
Open standard for machine learning interoperability
Convert TensorFlow, Keras, TensorFlow.js, and TFLite models to ONNX
YOLOX is a high-performance anchor-free YOLO, exceeding YOLOv3–v5, with MegEngine, ONNX, TensorRT, ncnn, and OpenVINO supported. Documentation: https://yolox.readthedocs.io/
Scorpion Anti-malware official repository
Effortless data labeling with AI support from Segment Anything and other awesome models.
ONNX Model Exporter for PaddlePaddle