{"id":1919,"date":"2018-12-04T00:00:00","date_gmt":"2018-12-04T00:00:00","guid":{"rendered":"https:\/\/azure.microsoft.com\/blog\/onnx-runtime-is-now-open-source"},"modified":"2023-05-11T15:35:39","modified_gmt":"2023-05-11T22:35:39","slug":"onnx-runtime-is-now-open-source","status":"publish","type":"post","link":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-is-now-open-source\/","title":{"rendered":"ONNX Runtime is now open source"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" align=\"left\" alt=\"ONNX Runtime Logo\" height=\"139\" src=\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/12\/738e9a46-5e21-4251-b8bf-992529c25468.webp\" title=\"ONNX Runtime Logo\" width=\"247\">Today we are announcing we have open sourced Open Neural Network Exchange\t(ONNX) Runtime\ton\t<a href=\"https:\/\/github.com\/microsoft\/onnxruntime\" target=\"_blank\" rel=\"noopener\">GitHub<\/a>. ONNX Runtime is a high-performance inference engine for machine learning models in the ONNX format on Linux, Windows, and Mac.<\/p>\n<p><a href=\"https:\/\/onnx.ai\/\" target=\"_blank\" rel=\"noopener\">ONNX<\/a>\tis an open format for deep learning and traditional machine learning models that Microsoft co-developed with Facebook and AWS. The ONNX format is the basis of an open ecosystem that makes AI more accessible and valuable to all: developers can choose the right framework for their task, framework authors can focus on innovative enhancements, and hardware vendors can streamline optimizations for neural network computations.\t<\/p>\n<p>Microsoft has been conducting research in AI for more than two decades and incorporating machine learning and deep neural networks in a plethora of products and services. With teams using many different training frameworks and targeting different deployment options, there was a real need to unify these scattered solutions to make it quick and simple to operationalize models. 
ONNX Runtime provides that solution. It gives data scientists the flexibility to train and tune models in the framework of their choice and to productionize those models with high performance in products spanning both cloud and edge.

## Why use ONNX Runtime

ONNX Runtime is the first publicly available inference engine with full support for ONNX 1.2 and higher, including the ONNX-ML profile. This means it advances directly alongside the ONNX standard to support an evolving set of AI models and technological breakthroughs.

At Microsoft, teams are using ONNX Runtime to improve scoring latency and efficiency for many of the models used in core scenarios in Bing Search, Bing Ads, Office productivity services, and more. For models we've converted to ONNX, we've seen average performance improve by 2x compared to scoring in their existing solutions. ONNX Runtime is also incorporated into other Microsoft offerings, including Windows ML and ML.NET.

ONNX Runtime is lightweight and modular in design, with the CPU build only a few megabytes in size. The [extensible architecture](https://github.com/Microsoft/onnxruntime/blob/master/docs/HighLevelDesign.md) enables optimizers and hardware accelerators to provide low-latency, high-efficiency computation by registering as "execution providers." The result is smoother end-to-end user experiences with lower perceived latency, as well as cost savings from decreased machine utilization and higher throughput.
## Deep support from industry partners

Leading companies in the ONNX community are actively working, or planning, to integrate their technology with ONNX Runtime. This enables them to support the full ONNX specification while achieving the best performance.

Microsoft and Intel are working together to integrate the nGraph Compiler as an execution provider for ONNX Runtime. The nGraph Compiler accelerates both existing and upcoming hardware targets by applying device-independent and device-specific optimizations. Using the nGraph Compiler for CPU inference achieves up to a [45x performance boost](https://ai.intel.com/ngraph-compiler-stack-beta-release/) compared to native frameworks.

NVIDIA is helping integrate TensorRT with ONNX Runtime to offer an easy workflow for deploying a rapidly growing set of models and apps on NVIDIA GPUs while achieving the best possible performance. NVIDIA TensorRT includes a high-performance inference optimizer and runtime that delivers dramatically higher throughput at minimal latency across applications such as recommenders, natural language processing, and image/video processing.

Qualcomm, another [early advocate](https://developer.qualcomm.com/blog/run-your-onnx-ai-models-faster-snapdragon) of ONNX, has also expressed support for ONNX Runtime. "The introduction of ONNX Runtime is a positive next step in further driving framework interoperability, standardization, and performance optimization across multiple device categories, and we expect developers to welcome support for ONNX Runtime on Snapdragon mobile platforms," says Gary Brotman, senior director of AI product management at Qualcomm Technologies, Inc.

After recently joining ONNX, leading IoT chip maker NXP also announced support for ONNX Runtime. "When it comes to choosing from among the many machine learning frameworks, we want our customers to have maximum flexibility and freedom," says Markus Levy, head of the AI Technology Center at NXP. "We're happy to bring the ONNX benefits to our customer community of ML developers by supporting the ONNX Runtime released by Microsoft in our platform."
In addition to hardware partners, framework provider Preferred Networks is also leveraging ONNX Runtime. "Preferred Networks, in addition to developing the deep learning framework [Chainer](https://chainer.org/), has created [Menoh](https://github.com/pfnet-research/menoh), an ONNX inference engine wrapper library for multiple programming languages," says Toru Nishikawa, President and CEO of Preferred Networks, Inc. "Menoh will use ONNX Runtime as its main backend, and Chainer currently uses ONNX Runtime to test its ONNX export features. Preferred Networks is delighted that Microsoft has made ONNX Runtime open source and looks forward to working on ONNX with Microsoft in the future."

## How to use ONNX Runtime

First, you'll need an ONNX model. Don't have an ONNX model? No problem. The beauty of ONNX is the framework interoperability enabled through a [multitude of tools](https://github.com/onnx/tutorials):

- You can get pretrained versions of popular models like ResNet and TinyYOLO directly from the [ONNX Model Zoo](https://github.com/onnx/models).
- You can create your own customized computer vision models using the [Azure Custom Vision Cognitive Service](https://docs.microsoft.com/en-us/azure/cognitive-services/Custom-Vision-Service/home).
- If you already have models in TensorFlow, Keras, Scikit-Learn, or CoreML format, you can convert them using our open source converters ([ONNXMLTools](https://pypi.org/project/onnxmltools/) and [TF2ONNX](https://pypi.org/project/tf2onnx/)).
- You can train new models using the Azure Machine Learning service and [save them into ONNX format](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-build-deploy-onnx).
To use ONNX Runtime, just install the package for your desired platform and language of choice, or create a build from source. ONNX Runtime supports both **CPU** and **GPU** (CUDA), with **Python**, **C#**, and **C** interfaces that are compatible with Linux, Windows, and Mac. Check [GitHub](https://github.com/microsoft/onnxruntime) for installation instructions.

You can integrate ONNX Runtime into your code directly from source or from precompiled binaries, but an easy way to operationalize it is to use Azure Machine Learning to [deploy a service](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-build-deploy-onnx) for your application to call.
## Get involved

The release of ONNX Runtime marks a significant step in our endeavor toward an open and interoperable ecosystem for AI, and we are extremely excited about the enthusiasm and support from the community thus far. We hope this makes it easier to drive product innovation in AI, and we strongly encourage the development community to try it out. We are continuously evolving and improving ONNX Runtime, and we look forward to your feedback and contributions in this very exciting area!

Have feedback or questions? [File an issue](https://github.com/Microsoft/onnxruntime/issues) on [GitHub](https://github.com/microsoft/onnxruntime), and follow us on [Twitter](https://twitter.com/onnxruntime).