ONNX on Ubuntu 24.04
by kCloudHub LLC
Version 1.20.0 + Free Support on Ubuntu 24.04
ONNX (Open Neural Network Exchange) is an open standard format designed for representing machine learning models. ONNX Runtime is a high-performance inference engine optimized for deep learning and classical ML models, enabling fast, portable, and scalable model execution across CPUs, GPUs, and specialized AI accelerators.
This image provides ONNX Runtime preinstalled on Ubuntu 24.04, offering a clean, production-ready environment for developers, data scientists, and AI engineers deploying ML models in the cloud.
Features of ONNX Runtime:
- High-performance inference engine optimized for CPU and GPU workloads.
- Supports models exported from industry-leading frameworks including PyTorch, TensorFlow, and scikit-learn.
- Cross-platform model portability with the open ONNX model format.
- Accelerated execution through hardware backends such as CUDA, TensorRT, ROCm, and OpenVINO.
- Low latency and reduced compute overhead for production AI workloads.
- API support for Python, C++, C#, Java, Go, and more.
To check the installed ONNX Runtime version, run:
python3 -c "import onnxruntime as ort; print(ort.__version__)"
Disclaimer: ONNX and ONNX Runtime are open-source projects governed by the LF AI & Data Foundation. They are provided under their respective open-source licenses without warranty. Users are responsible for validating compatibility and performance for their specific deployment environments.