Learn what Hadoop components and versions are included in HDInsight.
HDInsight continues to keep pace with Hadoop cluster versions. Get up to speed with Hadoop cluster versions 1.2, 2.2, and 2.4 on HDInsight.
Get an overview of HDInsight components, common terminology, and scenarios, and see resources for HDInsight, Apache Hadoop, and Microsoft Business Intelligence.
Learn how to use the Microsoft HDInsight Emulator for Azure, which provides a local development environment.
Apache HBase is an open source, distributed, large-scale data store that provides low latency for random reads and writes. In this tutorial, learn how to create and query HBase tables with HDInsight.
The four samples included are intended to get you started quickly and to give you an extensible test bed for working through concepts. Create data sets, and observe the effects of data size on jobs.
Follow this end-to-end scenario to learn how to develop and test a word-counting MapReduce job on HDInsight Emulator, and then deploy and run it on HDInsight.
Learn how to develop and test a Hadoop streaming MapReduce program on HDInsight Emulator, and then run it on HDInsight using a PowerShell script.
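The word-counting streaming job described above can be sketched in Python (a minimal illustration, not the tutorial's exact code): Hadoop streaming pipes raw text lines to the mapper on stdin and sorted, tab-separated key/value pairs to the reducer.

```python
# Word-count mapper and reducer in Hadoop-streaming style (illustrative
# sketch; the tutorial's own scripts may differ). Under Hadoop, each phase
# reads stdin and writes stdout, and the shuffle phase sorts by key
# between them.
from itertools import groupby

def mapper(lines):
    """Emit one tab-separated (word, 1) pair per word."""
    for line in lines:
        for word in line.strip().lower().split():
            yield f"{word}\t1"

def reducer(pairs):
    """Sum counts per word; input must be sorted by key, which
    Hadoop's shuffle phase guarantees."""
    split = (p.split("\t") for p in pairs)
    for word, group in groupby(split, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"

if __name__ == "__main__":
    # Simulate the map -> sort -> reduce pipeline on in-memory data.
    sample = ["the quick brown fox", "the lazy dog"]
    for out in reducer(sorted(mapper(sample))):
        print(out)
```

The `sorted()` call stands in for Hadoop's shuffle; on a cluster, the framework performs that step between the two scripts.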
Configure and run jobs on HDInsight clusters using Azure HDInsight PowerShell. Develop applications that manage HDInsight jobs with the Azure HDInsight .NET SDK.
Learn about errors that can occur when using PowerShell to manage HDInsight and the steps for recovering from them.
Learn how to submit MapReduce and Hive jobs using PowerShell and HDInsight .NET SDK.
Learn how to use Azure PowerShell from your workstation to submit a MapReduce program that counts word occurrences in text to an HDInsight cluster.
Use HiveQL to query data in an Apache log4j log file, and report basic statistics.
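The statistic that a HiveQL query like `SELECT level, COUNT(*) ... GROUP BY level` reports can be sketched in plain Python (the whitespace-delimited field layout assumed here is illustrative; real log4j output depends on the configured conversion pattern):

```python
# Count entries per log4j level -- roughly what a HiveQL
# GROUP BY over the log file computes. The parsing here is a
# simplifying assumption, not the tutorial's exact schema.
from collections import Counter

LEVELS = {"TRACE", "DEBUG", "INFO", "WARN", "ERROR", "FATAL"}

def count_log_levels(lines):
    """Return a {level: count} dict for the given log lines."""
    counts = Counter()
    for line in lines:
        # Take the first whitespace-delimited field that is a known level.
        for field in line.split():
            if field in LEVELS:
                counts[field] += 1
                break
    return dict(counts)
```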
Write Pig Latin statements to analyze an Apache log4j log file, and run various queries on the data to generate output.
Both Hive and Pig allow you to create user-defined functions (UDFs) using a variety of programming languages. Learn how to use a Python UDF from Hive and Pig.
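A streaming-style Python UDF of the kind described above follows a simple contract: Hive's `TRANSFORM` and Pig's `STREAM` both pipe tab-delimited rows to the script on stdin and read transformed rows back from stdout. The column layout below is a hypothetical example, not the tutorial's schema.

```python
# Minimal streaming UDF sketch for Hive TRANSFORM / Pig STREAM.
# The first-field-is-a-device-ID schema is a hypothetical example.
import sys

def transform_row(line):
    """Given a tab-delimited row, append an upper-cased copy of the
    first field as a new column."""
    fields = line.rstrip("\n").split("\t")
    fields.append(fields[0].upper())
    return "\t".join(fields)

if __name__ == "__main__":
    # Hive would invoke this as, e.g.:
    #   SELECT TRANSFORM (id, val) USING 'python udf.py' AS (id, val, id_upper) ...
    for row in sys.stdin:
        print(transform_row(row))
```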
Learn how to use Azure PowerShell from a workstation to run Sqoop import and export between an HDInsight cluster and an Azure SQL database.
Learn how to run an Apache Oozie workflow that processes a log4j log file to count the occurrences of each log level type, and then exports the results to an Azure SQL database table.
Learn how to use the Microsoft .NET Library for Avro to serialize objects and other data structures into streams of bytes in order to persist them to memory, a database, or a file.
Learn how to use Hive to analyze Twitter data to find usage frequency of a particular word.
Learn how to use Hive to calculate the average flight delay among airports, and how to use Sqoop to export the results to SQL Database.
Import data from Azure HDInsight into Excel using the Microsoft Hive ODBC Driver.
A key feature of Microsoft’s Big Data solution is solid integration of Apache Hadoop with Microsoft Business Intelligence (BI) components. Learn how to use Power Query to import HDInsight data into Excel.
HDInsight clusters are enhanced to provide the reliability and availability required to manage enterprise workloads.
Learn how to use Azure Management Portal to create an HDInsight cluster, and how to open the administrative tools.
Learn how to manage HDInsight clusters using a local Azure PowerShell console.
Learn how to use the Cross-Platform Command-Line Interface to manage HDInsight clusters.
Use the Apache Ambari APIs for provisioning, managing, and monitoring Hadoop clusters. Ambari has intuitive operator tools and robust APIs that hide the complexity of Hadoop.
Learn how to define workflows and coordinators, and how to trigger Hadoop jobs based on time.
Learn how to provision HDInsight clusters using Azure Management Portal, PowerShell, the command-line interface, and the HDInsight .NET SDK.
Learn how to upload and access data in HDInsight using Azure Storage Explorer, Azure PowerShell, the Hadoop command line, or Sqoop.
Learn how HDInsight works with data that is stored in Azure Blob storage, when to store data in HDFS, and when to store it in Blob storage.