Open Source ONNX Runtime

Posted on 04 December 2018

ONNX Runtime, a high-performance inference engine for machine learning models in the Open Neural Network Exchange (ONNX) format, is now open source. ONNX Runtime is compatible with ONNX 1.2 and ships as Python packages that support both CPU and GPU inferencing. With the release of the open source ONNX Runtime project, developers can customise and integrate the ONNX inference engine into their existing infrastructure directly from the source code, as well as compile and build it on a variety of operating systems.

What is the ONNX format? 

Open Neural Network Exchange (ONNX), which Microsoft co-developed to make AI more accessible and valuable to all, is the basis of an open ecosystem for interoperability and innovation in AI. An open format for representing machine-learning models, ONNX enables AI developers to choose the right framework for their task and hardware vendors to streamline optimisations.

To learn more about ONNX Runtime, read the blog.