Updated on July 23, 2019: The functionality described in this blog was retired and no longer exists in Application Insights. Alternatively, you can send your custom logs to the Azure Monitor log store, which is Log Analytics. You can query this data from Log Analytics or from your Application Insights resource using cross-resource queries.

You can now apply the powerful Analytics query language to high-volume NoSQL data streams that you import from any source. You can display the results in Power BI or Azure dashboards, and get alerts if specified thresholds are crossed. Until now, Analytics queries could only be applied to the performance and usage telemetry that Azure Application Insights collects from your live web app. Now you can either join imported data with your app telemetry, or run queries over completely separate data. Data can be supplied as either JSON or DSV (delimiter-separated values).

Here are some typical scenarios:

  • Join app telemetry with a lookup table. For example, you could import a table that maps URLs from your website to more readable page titles. In Analytics, you can then create a dashboard chart that shows the ten most popular pages on your website (see the query sketch after this list).
  • Correlate your application telemetry with other sources such as network traffic, server data, or CDN log files.
  • Apply Analytics to a separate data stream. If you have sparse, timestamped streams, you can analyze them with Analytics, often much more efficiently than with SQL. This is the scenario we will focus on in this blog.
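Here is a minimal sketch of the first scenario above. It assumes a hypothetical imported data source called pageTitles with url and title columns; pageViews is standard Application Insights telemetry.

pageViews
| join kind=inner (pageTitles) on url   // match telemetry URLs against the lookup table
| summarize views = count() by title    // count page views per readable title
| top 10 by views desc
| render barchart

The rendered chart could then be pinned to an Azure dashboard or pulled into Power BI, as described above.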

For example, let’s suppose you receive a data feed about flights. You could automate a daily analysis of route popularity and congestion. Analytics can run complex queries, including joins, aggregations, and statistical functions, to extract the necessary results. You can view the results in the range of charts available in Analytics. Or you could have Power BI run the queries each day, plot the results on maps, and present them on a website.
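As a rough sketch of what that daily analysis might look like in Analytics – assuming the imported flight data is named airlineRoutes and has hypothetical Source_airport, Destination_airport, and FlightDate columns:

airlineRoutes
| where FlightDate >= ago(1d)           // only the most recent day of data
| summarize flights = count() by Source_airport, Destination_airport
| top 20 by flights desc                // the 20 busiest routes
| render barchart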

Set up

To analyze your data with Analytics, you need an account in Microsoft Azure.

Sign in to the portal and set up a Storage resource in Azure. This is where you will put your data before it's ingested into Analytics.

Create an Application Insights resource. Then navigate from there to the Analytics page.


Define your data source

Before you can analyze your data, you need to tell Analytics about its format. As mentioned, we'll skip the Application Insights telemetry for now and go straight to defining our own data source.


Adding a new data source opens a wizard where you name the data source and define its schema. You can do that either by providing an explicit schema or by uploading a small sample of your data – the latter is usually easier.

In the flight data example, the files are in CSV format. The sample data file includes headers, and the schema is automatically inferred from it. You get the opportunity to update the inferred data types and field names if necessary.
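If you later want to confirm what Analytics inferred, the getschema operator lists a table's column names and types. A quick check, assuming the data source was named airlineRoutes as in the queries below:

airlineRoutes
| getschema   // returns the column name, ordinal, and data type for each field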

Ingest data

Once you’ve defined a schema, you can upload data files as often as you like. Data files of hundreds of MB are easily handled by Analytics.

To ingest the data, it’s easiest to automate the process with a short script. The script uploads the data to Azure storage, and then notifies Analytics to ingest it. There’s a sample in the import documentation.
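Once the script has notified Analytics and ingestion completes, a simple query confirms the rows actually arrived (again assuming the airlineRoutes data source name):

airlineRoutes
| count   // total number of rows ingested so far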

Run queries

Here’s a query that finds the top 10 destination airports by route count.

airlineRoutes
| summarize count() by Destination_airport 
| top 10 by count_ desc
| render piechart

 

Result: a pie chart of the ten most popular destination airports.

The query language is powerful but easy to learn, and has a piped model in which each operator performs one task – much easier to work with than the nested SELECTs of SQL.
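To make the piped model concrete, here is a small sketch against the same airlineRoutes source that filters, aggregates, and sorts in three chained steps – no nested subqueries needed:

airlineRoutes
| where isnotempty(Destination_airport)               // keep only rows with a destination
| summarize routes = count() by Destination_airport   // aggregate per airport
| order by routes desc                                 // sort by the aggregate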

Join multiple tables

Analytics can ingest multiple sources and your queries can run joins over them.

But wouldn't it be much nicer to see the airports identified by their full names instead of by their codes? Let's add a new data source, airportsData, which maps each airport code to its name and other information. Now we can perform a join on the tables:

airlineRoutes
| summarize routeCount = count()
    by airportCode = Destination_airport // rename the field for the join
| top 10 by routeCount desc
| join kind=inner (airportsData) on airportCode
| project routeCount, airportCode, name // 'name' is the airport-name column in airportsData
| render piechart


Augmenting Application Insights telemetry

Analytics is first and foremost the powerful query tool of Application Insights, which monitors the health and usage of your web applications. One reason to import data into Analytics is to augment that telemetry. For example, to make telemetry reports more readable, query URLs can be translated to page names, as in the lookup-table scenario shown earlier.

Get started today

You can apply Analytics to your own data today. Read the detailed how-to here.

Whether you want to enrich your telemetry or analyze your application's log data, you can easily add a new data source and start ingesting data. With high-volume ingestion, you can now apply the power of the Analytics query language to your own custom data.

As always, feel free to send us your questions or feedback by using one of the following channels:

  • Try Application Analytics
  • Suggest ideas and vote in Application Insights ideas
  • Join the conversation at the Application Insights Community
