{"id":2083,"date":"2018-10-16T00:00:00","date_gmt":"2018-10-16T00:00:00","guid":{"rendered":"https:\/\/azure.microsoft.com\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview"},"modified":"2023-05-11T15:36:31","modified_gmt":"2023-05-11T22:36:31","slug":"onnx-runtime-for-inferencing-machine-learning-models-now-in-preview","status":"publish","type":"post","link":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/","title":{"rendered":"ONNX Runtime for inferencing machine learning models now in preview"},"content":{"rendered":"<p>We are excited to release the preview of ONNX Runtime, a high-performance inference engine for machine learning models in the <a href=\"https:\/\/onnx.ai\/\" target=\"_blank\" rel=\"noopener\">Open Neural Network Exchange (ONNX)<\/a> format. ONNX Runtime is compatible with ONNX version 1.2 and comes in Python packages that support both <a href=\"https:\/\/pypi.org\/project\/onnxruntime\/\" target=\"_blank\" rel=\"noopener\">CPU<\/a> and <a href=\"https:\/\/pypi.org\/project\/onnxruntime-gpu\" target=\"_blank\" rel=\"noopener\">GPU<\/a> to enable inferencing using <a href=\"https:\/\/azure.microsoft.com\/en-us\/blog\/what-s-new-in-azure-machine-learning-service\/\" target=\"_blank\" rel=\"noopener\">Azure Machine Learning service<\/a> and on any Linux machine running Ubuntu 16.04.<\/p>\n<p>ONNX is an open source model format for deep learning and traditional machine learning. Since we launched ONNX in December 2017, it has gained support from more than 20 leading companies in the industry. 
ONNX gives data scientists and developers the freedom to choose the right framework for their task, as well as the confidence to run their models efficiently on a variety of platforms with the hardware of their choice.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" alt=\"ONNX\" height=\"355\" src=\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp\" title=\"ONNX\" width=\"1176\"><\/p>\n<p>The ONNX Runtime inference engine provides comprehensive coverage and support of all operators defined in ONNX. Developed with extensibility and performance in mind, it leverages a variety of custom accelerators based on platform and hardware selection to provide minimal compute latency and resource usage. Given the platform, hardware configuration, and operators defined within a model, ONNX Runtime can utilize the most efficient execution provider to deliver the best overall performance for inferencing.<\/p>\n<p>The pluggable model for execution providers allows ONNX Runtime to rapidly adapt to new software and hardware advancements. The execution provider interface is a standard way for hardware accelerators to expose their capabilities to the ONNX Runtime. We have active collaborations with companies including Intel and NVIDIA to ensure that ONNX Runtime is optimized for compute acceleration on their specialized hardware. 
Examples of these execution providers include Intel&#8217;s MKL-DNN and nGraph, as well as NVIDIA&#8217;s optimized TensorRT.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" alt=\"ONNXModel\" height=\"563\" src=\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/e7373906-1a6d-425c-8c03-da4bffd47fbb.webp\" title=\"ONNXModel\" width=\"1193\"><\/p>\n<p>The release of ONNX Runtime expands upon Microsoft&#8217;s existing support of ONNX, allowing you to run inferencing of ONNX models across a variety of platforms and devices.<\/p>\n<p><strong>Azure:<\/strong> Using the ONNX Runtime Python package, you can deploy an ONNX model to the cloud with Azure Machine Learning as an Azure Container Instance or production-scale Azure Kubernetes Service. Here are some <a href=\"https:\/\/aka.ms\/onnxnotebooks\" target=\"_blank\" rel=\"noopener\">examples<\/a> to get started.<\/p>\n<p><strong>.NET:<\/strong>\u00a0 You can integrate ONNX models into your .NET apps with <a href=\"https:\/\/www.microsoft.com\/net\/apps\/machinelearning-ai\/ml-dotnet\" target=\"_blank\" rel=\"noopener\">ML.NET<\/a>.<\/p>\n<p><strong>Windows Devices:<\/strong> You can run ONNX models on a wide variety of Windows devices using the built-in <a href=\"https:\/\/docs.microsoft.com\/en-us\/windows\/ai\/\" target=\"_blank\" rel=\"noopener\">Windows Machine Learning<\/a> APIs available in the latest Windows 10 October 2018 update.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" alt=\"CreateDeploy\" height=\"613\" src=\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/13189eea-5f0f-4ef9-83d1-92679b3a60f6.webp\" title=\"CreateDeploy\" width=\"1271\"><\/p>\n<h2>Using ONNX<\/h2>\n<h3>Get an ONNX model<\/h3>\n<p>Getting an ONNX model is simple: choose from a selection of popular pre-trained ONNX models in the <a href=\"https:\/\/github.com\/onnx\/models\" target=\"_blank\" rel=\"noopener\">ONNX Model Zoo<\/a>, build your own image classification model using Azure 
Custom Vision service, <a href=\"https:\/\/docs.microsoft.com\/en-us\/windows\/ai\/convert-model-winmltools\" target=\"_blank\" rel=\"noopener\">convert existing models<\/a> from other frameworks to ONNX, or <a href=\"https:\/\/github.com\/Azure\/MachineLearningNotebooks\/tree\/master\/training\" target=\"_blank\" rel=\"noopener\">train a custom model in AzureML<\/a> and save it in the ONNX format.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" alt=\"4Ways\" height=\"616\" src=\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/a4b73ece-3c76-4e4a-a867-d6a8314fdc82.webp\" title=\"4Ways\" width=\"821\"><\/p>\n<h2>Inference with ONNX Runtime<\/h2>\n<p>Once you have a trained model in ONNX format, you&#8217;re ready to feed it through ONNX Runtime for inferencing. The pre-built Python packages include integration with various execution providers, offering low compute latency and efficient resource utilization. The GPU build requires CUDA 9.1.<\/p>\n<p>To start, install the desired package from PyPI in your Python environment (install either the CPU or the GPU package, not both):<\/p>\n<pre>\r\npip install onnxruntime\r\npip install onnxruntime-gpu<\/pre>\n<p>Then, create an inference session to begin working with your model.<\/p>\n<pre>\r\nimport onnxruntime\r\nsession = onnxruntime.InferenceSession(\"your_model.onnx\")<\/pre>\n<p>Finally, run the inference session with your inputs to get the predicted value(s); passing None as the first argument returns all of the model&#8217;s outputs.<\/p>\n<pre>\r\nprediction = session.run(None, {\"input1\": value})<\/pre>\n<p>For more details, refer to the <a href=\"https:\/\/aka.ms\/onnxruntime-python\" target=\"_blank\" rel=\"noopener\">full API documentation<\/a>. 
\t<\/p>\n<p>Now you are ready to <a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/machine-learning\/service\/how-to-build-deploy-onnx\" target=\"_blank\" rel=\"noopener\">deploy your ONNX model<\/a> for your application or service to use.<\/p>\n<h2>Get started today<\/h2>\n<p>As champions of open and interoperable AI, we are actively invested in building products and tooling to help you efficiently deliver new and exciting AI innovation. We are excited for the community to participate and try out ONNX Runtime! Get started today by <a href=\"https:\/\/pypi.org\/project\/onnxruntime\" target=\"_blank\" rel=\"noopener\">installing ONNX Runtime<\/a> and let us know your feedback on the <a href=\"https:\/\/social.msdn.microsoft.com\/Forums\/en-US\/home?forum=AzureMachineLearningService\" target=\"_blank\" rel=\"noopener\">Azure Machine Learning Service Forum<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>We are excited to release the preview of Open Neural Network Exchange (ONNX) Runtime, a high-performance inference engine for machine learning models in the ONNX format.<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"ms_queue_id":[],"ep_exclude_from_search":false,"_classifai_error":"","_classifai_text_to_speech_error":"","_alt_title":"","footnotes":"","msx_community_cta_settings":[]},"categories":[1454],"tags":[],"audience":[3057,3055,3056],"content-type":[],"product":[1493],"tech-community":[],"topic":[],"coauthors":[703],"class_list":["post-2083","post","type-post","status-publish","format-standard","hentry","category-ai-machine-learning","audience-data-professionals","audience-developers","audience-it-implementors","product-azure-machine-learning"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>ONNX Runtime for inferencing machine learning models 
now in preview | Microsoft Azure Blog<\/title>\n<meta name=\"description\" content=\"We are excited to release the preview of Open Neural Network Exchange (ONNX) Runtime, a high-performance inference engine for machine learning models in the ONNX format.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"ONNX Runtime for inferencing machine learning models now in preview | Microsoft Azure Blog\" \/>\n<meta property=\"og:description\" content=\"We are excited to release the preview of Open Neural Network Exchange (ONNX) Runtime, a high-performance inference engine for machine learning models in the ONNX format.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/\" \/>\n<meta property=\"og:site_name\" content=\"Microsoft Azure Blog\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/microsoftazure\" \/>\n<meta property=\"article:published_time\" content=\"2018-10-16T00:00:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-05-11T22:36:31+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp\" \/>\n<meta name=\"author\" content=\"Faith Xu\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@azure\" \/>\n<meta name=\"twitter:site\" content=\"@azure\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Faith Xu\" \/>\n\t<meta 
name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/\"},\"author\":[{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/author\/faith-xu\/\",\"@type\":\"Person\",\"@name\":\"Faith Xu\"}],\"headline\":\"ONNX Runtime for inferencing machine learning models now in preview\",\"datePublished\":\"2018-10-16T00:00:00+00:00\",\"dateModified\":\"2023-05-11T22:36:31+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/\"},\"wordCount\":627,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp\",\"articleSection\":[\"AI + machine learning\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/\",\"url\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/\",\"name\":\"ONNX Runtime for 
inferencing machine learning models now in preview | Microsoft Azure Blog\",\"isPartOf\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp\",\"datePublished\":\"2018-10-16T00:00:00+00:00\",\"dateModified\":\"2023-05-11T22:36:31+00:00\",\"description\":\"We are excited to release the preview of Open Neural Network Exchange (ONNX) Runtime, a high-performance inference engine for machine learning models in the ONNX format.\",\"breadcrumb\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#primaryimage\",\"url\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp\",\"contentUrl\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Blog 
home\",\"item\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"AI + machine learning\",\"item\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/category\/ai-machine-learning\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"ONNX Runtime for inferencing machine learning models now in preview\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/#website\",\"url\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/\",\"name\":\"Microsoft Azure Blog\",\"description\":\"Get the latest Azure news, updates, and announcements from the Azure blog. From product updates to hot topics, hear from the Azure experts.\",\"publisher\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/#organization\",\"name\":\"Microsoft Azure Blog\",\"url\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2024\/06\/microsoft_logo.webp\",\"contentUrl\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2024\/06\/microsoft_logo.webp\",\"width\":512,\"height\":512,\"caption\":\"Microsoft Azure 
Blog\"},\"image\":{\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/microsoftazure\",\"https:\/\/x.com\/azure\",\"https:\/\/www.instagram.com\/microsoftdeveloper\/\",\"https:\/\/www.linkedin.com\/company\/16188386\",\"https:\/\/www.youtube.com\/user\/windowsazure\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/#\/schema\/person\/c702e5edd662b328b49b7e1180cab117\",\"name\":\"shakir\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/secure.gravatar.com\/avatar\/9342c7c05bb16548741bc5cd3a3e3b7ee0c8e746844ad2cc582db5beb5514c6f?s=96&d=mm&r=g7664e653ea371ce16eaf75e9fa8952c4\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/9342c7c05bb16548741bc5cd3a3e3b7ee0c8e746844ad2cc582db5beb5514c6f?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/9342c7c05bb16548741bc5cd3a3e3b7ee0c8e746844ad2cc582db5beb5514c6f?s=96&d=mm&r=g\",\"caption\":\"shakir\"},\"sameAs\":[\"https:\/\/azure.microsoft.com\"],\"url\":\"https:\/\/azure.microsoft.com\/en-us\/blog\/author\/shakir\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"ONNX Runtime for inferencing machine learning models now in preview | Microsoft Azure Blog","description":"We are excited to release the preview of Open Neural Network Exchange (ONNX) Runtime, a high-performance inference engine for machine learning models in the ONNX format.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/","og_locale":"en_US","og_type":"article","og_title":"ONNX Runtime for inferencing machine learning models now in preview | Microsoft Azure Blog","og_description":"We are excited to release the preview of Open Neural Network Exchange (ONNX) Runtime, a high-performance inference engine for machine learning models in the ONNX format.","og_url":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/","og_site_name":"Microsoft Azure Blog","article_publisher":"https:\/\/www.facebook.com\/microsoftazure","article_published_time":"2018-10-16T00:00:00+00:00","article_modified_time":"2023-05-11T22:36:31+00:00","og_image":[{"url":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp","type":"","width":"","height":""}],"author":"Faith Xu","twitter_card":"summary_large_image","twitter_creator":"@azure","twitter_site":"@azure","twitter_misc":{"Written by":"Faith Xu","Est. 
reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#article","isPartOf":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/"},"author":[{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/author\/faith-xu\/","@type":"Person","@name":"Faith Xu"}],"headline":"ONNX Runtime for inferencing machine learning models now in preview","datePublished":"2018-10-16T00:00:00+00:00","dateModified":"2023-05-11T22:36:31+00:00","mainEntityOfPage":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/"},"wordCount":627,"commentCount":0,"publisher":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/#organization"},"image":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#primaryimage"},"thumbnailUrl":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp","articleSection":["AI + machine learning"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/","url":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/","name":"ONNX Runtime for inferencing machine learning models now in preview | Microsoft Azure 
Blog","isPartOf":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#primaryimage"},"image":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#primaryimage"},"thumbnailUrl":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp","datePublished":"2018-10-16T00:00:00+00:00","dateModified":"2023-05-11T22:36:31+00:00","description":"We are excited to release the preview of Open Neural Network Exchange (ONNX) Runtime, a high-performance inference engine for machine learning models in the ONNX format.","breadcrumb":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#primaryimage","url":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp","contentUrl":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2018\/10\/094b1b5f-5e21-483b-927d-f5d76422e9c1.webp"},{"@type":"BreadcrumbList","@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/onnx-runtime-for-inferencing-machine-learning-models-now-in-preview\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Blog home","item":"https:\/\/azure.microsoft.com\/en-us\/blog\/"},{"@type":"ListItem","position":2,"name":"AI + machine 
learning","item":"https:\/\/azure.microsoft.com\/en-us\/blog\/category\/ai-machine-learning\/"},{"@type":"ListItem","position":3,"name":"ONNX Runtime for inferencing machine learning models now in preview"}]},{"@type":"WebSite","@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/#website","url":"https:\/\/azure.microsoft.com\/en-us\/blog\/","name":"Microsoft Azure Blog","description":"Get the latest Azure news, updates, and announcements from the Azure blog. From product updates to hot topics, hear from the Azure experts.","publisher":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/azure.microsoft.com\/en-us\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/#organization","name":"Microsoft Azure Blog","url":"https:\/\/azure.microsoft.com\/en-us\/blog\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/#\/schema\/logo\/image\/","url":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2024\/06\/microsoft_logo.webp","contentUrl":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-content\/uploads\/2024\/06\/microsoft_logo.webp","width":512,"height":512,"caption":"Microsoft Azure 
Blog"},"image":{"@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/microsoftazure","https:\/\/x.com\/azure","https:\/\/www.instagram.com\/microsoftdeveloper\/","https:\/\/www.linkedin.com\/company\/16188386","https:\/\/www.youtube.com\/user\/windowsazure"]},{"@type":"Person","@id":"https:\/\/azure.microsoft.com\/en-us\/blog\/#\/schema\/person\/c702e5edd662b328b49b7e1180cab117","name":"shakir","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/9342c7c05bb16548741bc5cd3a3e3b7ee0c8e746844ad2cc582db5beb5514c6f?s=96&d=mm&r=g7664e653ea371ce16eaf75e9fa8952c4","url":"https:\/\/secure.gravatar.com\/avatar\/9342c7c05bb16548741bc5cd3a3e3b7ee0c8e746844ad2cc582db5beb5514c6f?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/9342c7c05bb16548741bc5cd3a3e3b7ee0c8e746844ad2cc582db5beb5514c6f?s=96&d=mm&r=g","caption":"shakir"},"sameAs":["https:\/\/azure.microsoft.com"],"url":"https:\/\/azure.microsoft.com\/en-us\/blog\/author\/shakir\/"}]}},"msxcm_display_generated_audio":false,"msxcm_animated_featured_image":null,"distributor_meta":false,"distributor_terms":false,"distributor_media":false,"distributor_original_site_name":"Microsoft Azure 
Blog","distributor_original_site_url":"https:\/\/azure.microsoft.com\/en-us\/blog","push-errors":false,"_links":{"self":[{"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/posts\/2083","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/comments?post=2083"}],"version-history":[{"count":0,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/posts\/2083\/revisions"}],"wp:attachment":[{"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/media?parent=2083"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/categories?post=2083"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/tags?post=2083"},{"taxonomy":"audience","embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/audience?post=2083"},{"taxonomy":"content-type","embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/content-type?post=2083"},{"taxonomy":"product","embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/product?post=2083"},{"taxonomy":"tech-community","embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/tech-community?post=2083"},{"taxonomy":"topic","embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/topic?post=2083"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/azure.microsoft.com\/en-us\/blog\/wp-json\/wp\/v2\/coauthors?post=2083"}],"curies":[{"name":"wp","href":"https:\/\/a
pi.w.org\/{rel}","templated":true}]}}