At the Build conference this year, Microsoft announced many different ways it is bringing artificial intelligence to the edge. Microsoft’s goal is to distribute access to AI across the cloud and the edge so that more devices and people can benefit from machine learning and neural networks. In the past, neural networks required powerful and expensive machines (“heavy edge”) that were limited by power requirements and size, making deployments constrained and costly.
Microsoft is now investing in the development of “light edge” devices. These small, power-efficient devices contain built-in sensors and are designed for disconnected environments. Using Neural Hardware Acceleration chips, these devices are able to run powerful AI algorithms on the edge.
IoT Edge allows you to deploy complex event processing, machine learning, image recognition, and other high-value artificial intelligence without building it in-house. You can run Azure services such as Functions, Stream Analytics, and Machine Learning on-premises, and you can create your own AI modules and make them available to the community.
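To make the idea concrete, here is a minimal sketch (illustrative names only, not the Azure SDK) of what an edge module doing complex event processing might look like: it filters telemetry locally, the way a Stream Analytics job running at the edge would forward only anomalous readings upstream instead of streaming everything to the cloud.

```python
# Illustrative sketch of edge-side event processing -- hypothetical types,
# not the Azure IoT Edge SDK. A real module would receive messages from the
# IoT Edge runtime rather than a Python list.

from dataclasses import dataclass
from typing import List


@dataclass
class Reading:
    device_id: str
    temperature_c: float


def filter_anomalies(readings: List[Reading], threshold_c: float = 75.0) -> List[Reading]:
    """Keep only readings hot enough to be worth sending to the cloud."""
    return [r for r in readings if r.temperature_c > threshold_c]


readings = [
    Reading("pump-1", 68.2),
    Reading("pump-2", 81.7),  # anomalous
    Reading("pump-3", 90.1),  # anomalous
]

# Only the anomalies leave the device; normal telemetry stays local.
to_cloud = filter_anomalies(readings)
```

The design point is bandwidth and latency: the filtering logic lives next to the sensors, so only the interesting events cross the network.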
Easily build AI at the edge with the AI Toolkit for Azure IoT Edge
At Build, Microsoft announced partnerships with both Qualcomm and DJI to provide some of the very first Azure IoT Edge devices. Qualcomm has developed the Qualcomm Snapdragon Neural Processing Engine, which enables on-device execution of neural networks that were trained and developed in Azure and deployed using Azure IoT Edge. The first Azure IoT Edge device containing this chip includes a camera, audio in, and audio out, providing a versatile set of input data for neural networks.
Because this device is Azure IoT Edge enabled, it can be updated and configured via Azure, including having new neural networks pushed to it. It also has all the same capabilities as any IoT Edge device, including transmitting data to Azure IoT Hub. Because the device runs the neural network locally, it can continue operating and making predictions even while offline.
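The offline behavior described above can be sketched as follows. This is an illustration with hypothetical names, not the Azure IoT SDK: the device classifies readings locally regardless of connectivity, buffers results while disconnected, and flushes the backlog to the hub once the connection returns.

```python
# Illustrative sketch of offline-capable edge inference -- hypothetical
# class, not the Azure IoT Edge runtime.

from collections import deque


class EdgeDevice:
    def __init__(self):
        self.online = False
        self._buffer = deque()  # messages held while disconnected
        self.sent = []          # stands in for messages delivered to IoT Hub

    def predict(self, reading: float) -> str:
        # The local model runs regardless of connectivity.
        return "anomaly" if reading > 0.8 else "normal"

    def report(self, reading: float) -> str:
        label = self.predict(reading)
        message = {"reading": reading, "label": label}
        if self.online:
            self.sent.append(message)
        else:
            self._buffer.append(message)  # hold until reconnect
        return label

    def reconnect(self):
        self.online = True
        while self._buffer:               # drain the backlog in order
            self.sent.append(self._buffer.popleft())


device = EdgeDevice()
device.report(0.9)   # offline: still classified, result buffered
device.report(0.2)
device.reconnect()   # buffered messages flushed to the hub
```

The key property is that `predict` never touches the network, so the intelligence keeps working during an outage; only delivery of results is deferred.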
Microsoft also demonstrated a DJI drone running Azure IoT Edge on the drone itself. A preinstalled neural network continuously analyzed imagery from the drone’s camera, on the device, to detect abnormalities in an oil pipeline. In the demonstration, the drone flew along the pipeline and identified problems that needed repair. The video below shows the demonstration.
The most important way that Microsoft brought neural networks to the Intelligent Edge was by adding support for embedding neural networks into the IoT Edge runtime. The runtime runs on either Windows IoT or Linux and can turn any device into an Azure IoT Edge device. That means any device you create can run IoT Edge and bring a neural network to the Intelligent Edge!
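To show why on-device inference is feasible even on small hardware, here is a deliberately tiny, hand-rolled feedforward network scored entirely in local code, with no cloud round trip. The weights are made up for illustration; in the scenario the article describes, a real model would be trained in Azure and pushed to the device by the IoT Edge runtime.

```python
# Illustrative only: a minimal feedforward network evaluated on-device.
# Weights are invented for the example; a deployed model's weights would
# come from training in Azure.

import math


def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))


def forward(inputs, w_hidden, w_out) -> float:
    """One hidden layer, sigmoid activations, scalar output in (0, 1)."""
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))


# 2 inputs -> 2 hidden units -> 1 output score
w_hidden = [[0.5, -0.6], [-0.4, 0.9]]
w_out = [1.2, -1.1]

score = forward([0.8, 0.3], w_hidden, w_out)  # computed locally, no network I/O
```

Inference here is just a handful of multiplies and additions, which is exactly why dedicated neural acceleration hardware like the Snapdragon Neural Processing Engine can run much larger versions of this computation efficiently at the edge.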