
New in Azure Stream Analytics: Geospatial functions, Custom code and lots more!

Posted on February 1, 2017

Principal Program Manager, Azure Stream Analytics

Today, we are pleased to announce the roll-out of several compelling capabilities in Azure Stream Analytics. These include native support for geospatial functions, custom code with JavaScript, low-latency dashboarding with Power BI, and previews of Visual Studio integration and job diagnostic logs. Additionally, effective today, ingress data is no longer throttled.

Native support for Geospatial functions

Starting today, customers can easily build solutions for scenarios such as connected cars, fleet management, and mobile asset tracking using Azure Stream Analytics. Developers can now leverage powerful built-in geospatial functions in their stream-processing logic to define geographical areas; evaluate incoming geospatial data for containment, proximity, and overlap; and generate alerts or kick off the necessary workflows. These geospatial capabilities align with the GeoJSON specification.
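As an illustration, a geofencing query built on these functions might look like the sketch below, which uses the built-in CreatePoint, CreatePolygon, and ST_WITHIN functions. The vehicles input, alerts output, field names, and polygon coordinates here are hypothetical placeholders:

```sql
-- Hypothetical geofencing query: emit an alert event whenever a vehicle's
-- reported position falls inside a predefined polygon (the geofence).
SELECT
    vehicles.VehicleId,
    vehicles.EventTime
INTO alerts
FROM vehicles
WHERE ST_WITHIN(
    CreatePoint(vehicles.Latitude, vehicles.Longitude),
    -- The polygon's first and last points must match to close the ring.
    CreatePolygon(
        CreatePoint(47.60, -122.35),
        CreatePoint(47.60, -122.30),
        CreatePoint(47.65, -122.30),
        CreatePoint(47.65, -122.35),
        CreatePoint(47.60, -122.35))) = 1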

More than 100 customers used these geospatial capabilities during the preview, including NASCAR. Established in 1947, NASCAR has grown to become the premier motorsports organization. Currently, NASCAR sanctions more than 1,200 races in more than 30 U.S. states, Canada, Mexico, and Europe. NASCAR has been a pioneer in using the geospatial capabilities of Azure Stream Analytics.

“We use real-time geospatial analytics with Azure Stream Analytics for analyzing race telemetry during and after the race,” said NASCAR’s Managing Director of Technology Development, Betsy Grider.

Custom code with JavaScript user-defined functions

With Azure Stream Analytics, customers can now combine the power of JavaScript with the simplicity and pervasiveness of SQL. Historically, Azure Stream Analytics has let developers express their real-time query logic using a simple SQL-like language. However, customers have also asked us to support more expressive custom code for advanced scenarios. Today, in our journey to offer richer custom-code support, we are pleased to announce support for user-defined functions (UDFs) written in JavaScript. With this new feature, customers can write their custom code in JavaScript and easily invoke it as part of their real-time stream-processing query.


Invoking JavaScript UDF from a Stream Analytics Query
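For illustration, a minimal UDF might look like the following sketch. The function body is ordinary JavaScript, and the entry point is always a function named main; the conversion shown here is a hypothetical example:

```javascript
// Hypothetical UDF: convert a hexadecimal string (for example, a raw
// sensor reading) into an integer. Stream Analytics passes the query's
// arguments to main and uses its return value as the UDF result.
function main(hexValue) {
    return parseInt(hexValue, 16);
}
```

Once registered in the job under an alias (say, hexToInt), the function can be called inline from the query, for example as UDF.hexToInt(rawReading).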

Visual Studio tools for Azure Stream Analytics

To help maximize end-to-end developer productivity across authoring, testing, and debugging Stream Analytics jobs, we are rolling out a public preview of Azure Stream Analytics tools for Visual Studio. One of the key capabilities is local testing on client machines, enabling a true offline query authoring and testing experience. Additionally, features such as IntelliSense (code completion), syntax highlighting, error markers, and source-control integration are designed to offer a best-in-class developer experience.

Stream Analytics jobs in Visual Studio


Low-latency dashboarding with Power BI

In our quest to continually push the boundaries of performance and latency to serve our customers better, we’ve worked closely with the Power BI engineering team to improve dashboarding experiences for solutions built using Azure Stream Analytics. Azure Stream Analytics jobs can now output to the new Power BI streaming datasets. This enables rich, dynamic visual dashboards with much lower latency than was previously possible.


Dashboards powered by streaming data from Azure Stream Analytics

Job diagnostic logs

Building on a series of ongoing investments designed to improve the self-service troubleshooting experience, today we are announcing the preview of Azure Stream Analytics’ integration with Azure Monitor. This gives customers a systematic way to deal with lost, late, or malformed data, along with efficient mechanisms to investigate errors caused by bad data.

Having immediate access to the actual data that causes errors helps customers quickly address problems. Users will be able to control how the job behaves when errors occur in data, and persist relevant event data and operational metadata (e.g., occurrence time and counts) in Azure Storage or Azure Event Hubs. This data can then be used for offline diagnostics and troubleshooting. Furthermore, data routed to Azure Storage can be analyzed using the rich visualization and analytics capabilities of Azure Log Analytics.

Key examples of data handling errors include: data conversion and serialization errors caused by schema mismatches; incompatible types and constraint violations, such as nullability or duplicates; and truncation of strings or loss of precision during conversion.
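Conversion errors of the first kind can often also be guarded against in the query itself. The sketch below, with hypothetical input, output, and field names, uses TRY_CAST, which returns NULL instead of failing when a value cannot be converted, to filter out malformed readings before they reach downstream processing:

```sql
-- Hypothetical query: keep only events whose Temperature field
-- converts cleanly to a float, instead of erroring on bad data.
SELECT
    DeviceId,
    TRY_CAST(Temperature AS float) AS Temperature
INTO output
FROM input
WHERE TRY_CAST(Temperature AS float) IS NOT NULL
```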


Link to Diagnostics logs on Azure portal

Keep the feedback and ideas coming

The Azure Stream Analytics team is committed to listening to your feedback and letting your voice shape our future investments. We welcome you to join the conversation and make your voice heard via our UserVoice.

Please visit our pricing page to review the latest pricing.