
Data Lake Analytics pricing

Distributed analytics service that makes big data easy

Azure Data Lake Analytics is the first cloud serverless, job-based analytics service in which you can easily develop and run massively parallel data transformation and processing programs in U-SQL, R, Python, and .NET over petabytes of data. With no infrastructure to manage, you can process data on demand, scale instantly, and pay only per job.

Explore pricing options

Apply filters to customize pricing options to your needs.

Prices are estimates only and are not intended as actual price quotes. Actual pricing may vary depending on the type of agreement entered with Microsoft, date of purchase, and the currency exchange rate. Prices are calculated based on US dollars and converted using London closing spot rates that are captured in the two business days prior to the last business day of the previous month end. If the two business days prior to the end of the month fall on a bank holiday in major markets, the rate setting day is generally the day immediately preceding the two business days. This rate applies to all transactions during the upcoming month. Sign in to the Azure pricing calculator to see pricing based on your current program/offer with Microsoft. Contact an Azure sales specialist for more information on pricing or to request a price quote. See frequently asked questions about Azure pricing.


Pay-as-You-Go lets you pay by the second with no long-term commitments.

Usage | Price
Analytics Unit | $-/hour

Monthly commitment packages

Monthly commitment packages provide you with a significant discount (up to 74%) compared to Pay-as-You-Go pricing.

Included Analytics Unit Hours | Price/Month | Savings over Pay-As-You-Go | Price Per Analytics Unit | Overage Price Per Analytics Unit
100 | $- | $- | $- | $-
500 | $- | $- | $- | $-
1,000 | $- | $- | $- | $-
5,000 | $- | $- | $- | $-
10,000 | $- | $- | $- | $-
50,000 | $- | $- | $- | $-
100,000 | $- | $- | $- | $-
> 100,000 | Submit support case via "Help + support" in Azure portal

Azure pricing and purchasing options

Connect with us directly

Get a walkthrough of Azure pricing. Understand pricing for your cloud solution, learn about cost optimization, and request a custom proposal.

Talk to a sales specialist

See ways to purchase

Purchase Azure services through the Azure website, a Microsoft representative, or an Azure partner.

Explore your options

Additional resources

Data Lake Analytics

Learn more about Data Lake Analytics features and capabilities.

Pricing calculator

Estimate your expected monthly costs for using any combination of Azure products.


Service Level Agreement

Review the Service Level Agreement for Data Lake Analytics.


Documentation

Review technical tutorials, videos, and more Data Lake Analytics resources.

  • An Azure Data Lake Analytics Unit, or AU, is a unit of computation made available to your U-SQL job. Each AU gives your job access to a set of underlying resources like CPU and memory. Learn more about an AU

  • When you create a job, you must allocate AUs for the job to run. A job passes through four major phases: preparation, queuing, execution, and finalization; it enters execution once the allocated AUs become available. You are billed for the allocated AUs for the duration of the job's execution and finalization phases. Learn more about an AU

  • You should carefully allocate the number of AUs that fits your job's requirements. Increasing the number of AUs makes more compute resources available to your job; however, it does not increase the job's inherent parallelism. Depending on your job's characteristics (e.g., how parallelizable it is and how much data it processes), you may see your jobs run faster with more AUs, or you may allocate more AUs than the job can use. Azure Data Lake Tools for Visual Studio provides several tools that can help you diagnose the performance of your U-SQL jobs and estimate the optimal number of AUs. Learn more about saving money and controlling costs

  • Price is determined by the number of AUs and job length. Let’s assume two cases:

    • Case 1: A job takes three hours to complete with 10 AUs, so the price is calculated as 3*10=30 AU hours. If the job can take advantage of 20 AUs and runs twice as fast, the price would be 1.5*20= 30 AU hours. In this case the price is the same, but latency is improved.
    • Case 2: A job takes five hours to complete with 10 AUs, so the price is calculated as 5*10=50 AU hours. If the job takes 4 hours to complete when using 20 AUs, the price would be 4*20=80 AU hours. In this case, the total cost increases 60%, with your job finishing one hour sooner.
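The arithmetic in the two cases above can be sketched in a few lines of code (an illustrative sketch only, not an official billing API; the `au_hours` helper is hypothetical):

```python
# Illustrative sketch: a job's billable AU-hours are the AUs allocated
# to it multiplied by the hours it spends in the execution and
# finalization phases.

def au_hours(allocated_aus: int, duration_hours: float) -> float:
    """Billable AU-hours for a single job."""
    return allocated_aus * duration_hours

# Case 1: same total cost, better latency.
print(au_hours(10, 3.0))   # 30.0 AU-hours
print(au_hours(20, 1.5))   # 30.0 AU-hours

# Case 2: higher total cost, but the job finishes an hour sooner.
print(au_hours(10, 5.0))   # 50.0 AU-hours
print(au_hours(20, 4.0))   # 80.0 AU-hours
```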
  • Azure Data Lake Storage Gen1 transactions are incurred any time you read or write data to the service. Every time a user, an application, or another Azure service reads or writes data up to 4 MB in size, it's billed as one transaction. For example, if one write operation puts 128 KB of data into Data Lake Storage Gen1, it's billed as one transaction. Transactions are billed in increments of up to 4 MB, so an item larger than 4 MB is billed in multiple increments. For example, if one read operation gets 9 MB of data from Data Lake Storage Gen1, it's billed as three transactions (4 MB + 4 MB + 1 MB).

    Let's see how transactions appear on your bill, based on read operations. Assume a scenario in which your application runs a Data Lake Analytics job for four hours per day, reading 1,000 items per second while the job is running, each item being less than 4 MB. In this scenario, Data Lake Storage Gen1 charges for the read transactions that Data Lake Analytics performs against it. You will be charged the following:

    Item | Usage Volume Per Month | Rate | Monthly Cost
    Read transactions from Data Lake Analytics | 1,000 items/second * 3,600 * 4 * 31 | $- per 10,000 transactions | $-
    Total transactions cost | $-
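The transaction-counting rules above can be sketched as follows (an illustrative calculation only; `transactions_for` is a hypothetical helper, and sizes are treated in binary megabytes):

```python
import math

FOUR_MB = 4 * 1024 * 1024  # one billing increment (treated here as 4 MiB)

def transactions_for(size_bytes: int) -> int:
    """Transactions billed for a single read or write: one per
    increment of up to 4 MB, rounded up."""
    return max(1, math.ceil(size_bytes / FOUR_MB))

# A 128 KB write is one transaction; a 9 MB read is three (4 + 4 + 1 MB).
print(transactions_for(128 * 1024))        # 1
print(transactions_for(9 * 1024 * 1024))   # 3

# Scenario above: 1,000 items/second, 4 hours/day, 31 days,
# each item under 4 MB, so one transaction per item.
monthly_transactions = 1_000 * 3_600 * 4 * 31
print(monthly_transactions)                # 446400000
```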
  • Price is determined by the number of AUs you reserve for the month.

    • A billing cycle is aligned to the calendar month: it always starts on the 1st day of the month and ends on the last day of the month.
    • When you commit to a package for the first time, we pro-rate the monthly price and AU-hours to the days left within that month. For example, if you commit to a 1,000 AU-hour package and there are 10 days left within that month, you immediately get 334 AU-hours (1,000 AU-hours / 30 days in a month x 10 days left) at a price of $- ($- / 31 days in a month x 10 days left). We pro-rate by 30 days for the AU-hours in a package and by 31 days for the price to make sure that the pro-rata is always in your favor.
    • Units in a package reset on the 1st day of the month. For example, if you commit to 100 AU-hours and have 20 AU-hours left at the end of the month, your package resets to 100 AU-hours the next day. There is no roll-over for unused AU-hours.
    • You can choose a new package at any time; the change takes effect on the first day of the next calendar month. For example, if during a month you have a 100 AU-hour package and decide to commit to a 500 AU-hour package, the change applies on the 1st day of the next calendar month. For the current calendar month, you remain on the 100 AU-hour package.
    • We use "seconds" as the unit of measure for the consumption of your commitment package.
    • Once your package is consumed, you will be charged at the overage consumption rate.
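The pro-ration rules above can be sketched as a pair of small functions (an illustrative sketch under the stated assumptions: a fixed 30-day divisor for AU-hours with rounding up in your favor, and a fixed 31-day divisor for price; function names are hypothetical):

```python
import math

def prorated_au_hours(package_au_hours: int, days_left: int) -> int:
    """AU-hours granted when first committing mid-month: prorated over
    a fixed 30-day divisor and rounded up, so rounding favors you."""
    return math.ceil(package_au_hours / 30 * days_left)

def prorated_price(monthly_price: float, days_left: int) -> float:
    """Price prorated over a fixed 31-day divisor; the longer divisor
    again favors you."""
    return monthly_price / 31 * days_left

# The example above: a 1,000 AU-hour package with 10 days left in the
# month immediately grants 334 AU-hours.
print(prorated_au_hours(1_000, 10))  # 334
```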
  • Consumption is determined by the number of AUs and job length. Job length is influenced by the number of AUs assigned to the job as well as by the job's characteristics, such as data size and computational complexity.

    • Case 1: You committed to 100 AU-hours and submit a job that takes 2 hours and 30 minutes to complete with 1 AU, so the consumption is calculated as 2.5*1=2.5 AU-hours. You will have 97.5 AU-hours left in your commitment.
    • Case 2: You committed to 100 AU-hours and have only 1 AU-hour left. You submit a job that takes 2 hours to complete with 2 AUs, so the consumption is calculated as 2*2=4 AU-hours. You will use your remaining AU-hour and be charged for the 3 additional AU-hours at the overage rate (1.5*3 = $-).
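The commitment-plus-overage arithmetic above can be sketched as one function (an illustrative sketch only; `bill_job` is hypothetical, and the overage rate is a placeholder since the actual rate is not shown on this page):

```python
def bill_job(remaining_au_hours: float, allocated_aus: int,
             duration_hours: float, overage_rate: float) -> tuple[float, float]:
    """Deduct a job's AU-hours from the remaining commitment; anything
    beyond the balance is charged at the overage rate.
    Returns (new remaining balance, overage charge)."""
    consumed = allocated_aus * duration_hours
    covered = min(consumed, remaining_au_hours)
    overage = consumed - covered
    return remaining_au_hours - covered, overage * overage_rate

# Case 1: 100 AU-hours committed, 2.5-hour job with 1 AU.
print(bill_job(100, 1, 2.5, overage_rate=1.5))  # (97.5, 0.0)

# Case 2: 1 AU-hour left, 2-hour job with 2 AUs -> 3 AU-hours of overage.
print(bill_job(1, 2, 2.0, overage_rate=1.5))    # (0.0, 4.5)
```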
  • Azure Data Lake Analytics allows you to read and write data from Azure Data Lake Storage Gen1, Azure Blob Storage, and Azure SQL Database. The use of these services by Azure Data Lake Analytics can incur their standard charges (e.g., transactions, outbound data transfers). Please refer to each service's pricing page for details.

    • Job Cancellation:
      Cancellation is always the result of a deliberate customer action or an admin-defined policy. The ADLA service does not autonomously cancel jobs, except when a vertex reaches its execution time limit of 5 hours (there is no time limit for a job, only for a single vertex). When a job is cancelled, you are billed for the duration the job was running.
    • Job Failure:
      Job failures are the result of either a user error or, occasionally, an ADLA service error. The error code from a failed job indicates which. If the error code contains "USER", the failure was the result of a user error, and the service bills you for the duration the job was running. If the error code contains "SYSTEM", the failure was the result of an ADLA service error and you are not billed for the job.
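The cancellation and failure billing rules above can be sketched as a single predicate (an illustrative sketch; the state names and error-code strings below are made up to mirror the description, not actual ADLA values):

```python
def billed_for_run(final_state: str, error_code: str = "") -> bool:
    """Whether a finished job is billed for the time it ran.

    Cancelled jobs and jobs failing with a user error ("USER" in the
    error code) are billed; jobs failing with a service error
    ("SYSTEM" in the error code) are not.
    """
    if final_state == "Cancelled":
        return True
    if final_state == "Failed":
        return "USER" in error_code
    return True  # successfully completed jobs are billed normally

print(billed_for_run("Cancelled"))                        # True
print(billed_for_run("Failed", "USER_EXPRESSION_ERROR"))  # True
print(billed_for_run("Failed", "SYSTEM_INTERNAL_ERROR"))  # False
```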


Get free cloud services and a $200 credit to explore Azure for 30 days.
