
Microsoft and Qualcomm accelerate AI with Vision AI Developer Kit

Artificial intelligence (AI) workloads can involve megabytes of data and billions of calculations. With advancements in hardware, it is now possible to run time-sensitive AI workloads on the edge while also sending outputs to the cloud for downstream applications. AI scenarios processed on the edge can support important business needs, such as verifying that every person on a construction site is wearing a hardhat, or detecting whether items are out of stock on a store shelf.

The combination of hardware, software, and AI models needed to support these scenarios can be difficult to assemble. To remove this barrier, we announced a developer kit with Qualcomm last year to accelerate AI inferencing at the intelligent edge. Today we’re pleased to share that the Vision AI Developer Kit is now broadly available. The developer kit includes a camera based on Qualcomm’s Vision Intelligence 300 Platform, along with the software needed to develop intelligent edge solutions using Azure IoT Edge and Azure Machine Learning. It supports an end-to-end, Azure-enabled solution with real-time image processing locally on the edge device, and model training and management in Azure. The Vision AI Developer Kit, made by our partner eInfochips, can now be ordered from Arrow Electronics.

Using the Vision AI Developer Kit, you can deploy vision models at the intelligent edge in minutes, regardless of your current machine learning skill level. Below, we detail three options for developers to get started: no-code development with Custom Vision, an Azure Cognitive Service; custom models with Azure Machine Learning; and the fully integrated development environment provided by Visual Studio Code.

Azure Cognitive Services support for no code development

Custom Vision, an Azure Cognitive Service, enables you to build your own computer vision model, even if you’re not a data scientist. It provides a user-friendly interface that walks you through the process of uploading and tagging your images, then training and deploying custom vision models. The Vision AI Developer Kit integration with Custom Vision includes the ability to use Azure IoT Hub to deploy your custom vision model directly to the developer kit. These custom vision models are then accelerated by the camera’s Snapdragon Neural Processing Engine (SNPE), which enables image classification to run quickly even when the device is offline.
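As an illustration, the following Python sketch shows how a Custom Vision model could be trained and exported for an edge device using the Custom Vision training SDK. The endpoint, keys, project and tag names, and the "VAIDK" export platform string are assumptions for this sketch; consult the Custom Vision documentation for the values your resource and SDK version expect.

import time

from azure.cognitiveservices.vision.customvision.training import CustomVisionTrainingClient
from msrest.authentication import ApiKeyCredentials

# Assumed placeholders for your Custom Vision resource.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
credentials = ApiKeyCredentials(in_headers={"Training-key": "<training-key>"})
trainer = CustomVisionTrainingClient(ENDPOINT, credentials)

# Create a project and a tag; uploading labeled images is omitted here
# (see create_images_from_files in the SDK).
project = trainer.create_project("hardhat-detector")
hardhat_tag = trainer.create_tag(project.id, "hardhat")

# Train the project and poll until the iteration completes.
iteration = trainer.train_project(project.id)
while iteration.status != "Completed":
    time.sleep(10)
    iteration = trainer.get_iteration(project.id, iteration.id)

# Export the trained iteration for the developer kit hardware
# ("VAIDK" is the platform identifier assumed in this sketch).
export = trainer.export_iteration(project.id, iteration.id, "VAIDK")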

Azure Machine Learning integration for data scientists

Azure Machine Learning streamlines the building, training, and deployment of machine learning models using tools that meet your needs, including code-first, visual drag and drop, and automated machine learning experiences. The Vision AI Developer Kit enables data scientists to use Azure Machine Learning to build custom models and deploy them to the included camera.

Get started with Azure Machine Learning using reference implementations provided in Jupyter notebooks. These reference implementations walk data scientists through the steps to upload training data to Azure Blob Storage, run a transfer learning experiment, convert the trained model to be compatible with the developer kit platform, and deploy via Azure IoT Edge.
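The following Python sketch (not one of the reference notebooks themselves) outlines that workflow with the Azure Machine Learning SDK: upload local images to the workspace’s default blob datastore, submit a training script as an experiment run, and register the resulting model. The script name, directories, and model paths are hypothetical; the reference notebooks cover the developer kit-specific model conversion and IoT Edge deployment steps.

from azureml.core import Workspace, Experiment, ScriptRunConfig

# Connect to the Azure Machine Learning workspace (reads config.json).
ws = Workspace.from_config()

# Upload local training images to the workspace's default blob datastore.
datastore = ws.get_default_datastore()
datastore.upload(src_dir="./training_images", target_path="hardhat-data", overwrite=True)

# Submit the transfer-learning script as an experiment run.
experiment = Experiment(workspace=ws, name="vaidk-transfer-learning")
config = ScriptRunConfig(source_directory="./scripts", script="train.py",
                         arguments=["--data-path", "hardhat-data"])
run = experiment.submit(config)
run.wait_for_completion(show_output=True)

# Register the trained model so it can be converted and deployed via IoT Edge.
model = run.register_model(model_name="hardhat-classifier", model_path="outputs/model")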

Visual Studio Code integration for developers

Visual Studio Code provides developers with a single development environment to manage their code and access Azure services through extensions. For developers using Visual Studio Code, we have created a GitHub repository that includes sample Python modules, pre-built Azure IoT deployment configurations, and Dockerfiles for container creation and deployment. You can use Visual Studio Code to modify the sample modules, or create your own, and containerize them for deployment on the camera.
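For context, a minimal IoT Edge module in the style of those samples might look like the Python sketch below: it connects using the IoT Edge environment and forwards incoming messages to an output that the deployment manifest can route to IoT Hub. The input and output names and the message-handling logic are assumptions here, not the repository’s exact sample code.

from azure.iot.device import IoTHubModuleClient

def main():
    # Create the module client from the environment variables IoT Edge
    # injects into the module's container at runtime.
    client = IoTHubModuleClient.create_from_edge_environment()
    client.connect()

    def message_handler(message):
        # Add business logic here (e.g., filter or enrich inference results),
        # then forward the message to an output routed to IoT Hub.
        client.send_message_to_output(message, "output1")

    # Invoke the handler for every message arriving on the module's inputs.
    client.on_message_received = message_handler

    try:
        input("Module running. Press Enter (or stop the container) to exit.\n")
    finally:
        client.disconnect()

if __name__ == "__main__":
    main()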

Install the Vision AI DevKit extension for Visual Studio Code to take full advantage of the developer kit as a cloud-managed device. With the extension, you can deploy modules, see messages from the device, manage your Azure IoT Hub, and more, all from within a familiar development environment. You can also use Visual Studio Code to add business logic to your own Azure solutions that consume information from the camera through IoT Hub and transform camera data into normalized data streams using Azure Stream Analytics.
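As a sketch of such downstream business logic, the Python snippet below reads device-to-cloud messages from the IoT Hub’s built-in Event Hubs-compatible endpoint and parses each camera message. The connection string, event hub name, and message format are placeholders; in a production solution, Azure Stream Analytics could perform this kind of transformation instead.

import json
from azure.eventhub import EventHubConsumerClient

# Assumed placeholders for the IoT Hub's Event Hubs-compatible endpoint.
CONN_STR = "<event-hubs-compatible-connection-string>"
EVENTHUB_NAME = "<event-hubs-compatible-name>"

def on_event(partition_context, event):
    # Parse the inference result sent by the camera module and act on it.
    result = json.loads(event.body_as_str())
    print("Detection from device:", result)
    partition_context.update_checkpoint(event)

client = EventHubConsumerClient.from_connection_string(
    CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME)

with client:
    # Read new messages from each partition; blocks until interrupted.
    client.receive(on_event=on_event, starting_position="@latest")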

Next steps

To order your own Vision AI Developer Kit, visit the product page from Arrow. For more information, visit the Vision AI DevKit GitHub.