The Azure Quickstart templates are currently available in English

Performs an ETL job using Azure services

Last updated: 04-03-2020

This template provides an example of how to perform analytics on both historic and real-time streaming data stored in Azure Blob Storage. Data from the event hub is consumed by an Azure Stream Analytics job, which transforms it; the output is stored in Azure Blob Storage and visualized in Power BI. Analytics on the historic data stored in Blob storage is performed by Azure Data Lake Analytics, and the movement of the extracted, transformed, and published data, as well as the overall orchestration, is handled by Data Factory. The published data is further visualized in Power BI.
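The real-time path described above centers on a Stream Analytics job that reads from the event hub and writes to Blob Storage and Power BI. The template's actual query is not shown here, but a job of this shape typically looks like the following sketch, where the input/output aliases (`eventhub-input`, `blob-output`, `powerbi-output`) and the event fields (`deviceId`, `temperature`, `eventTime`) are illustrative assumptions, not names from the template:

```sql
-- Hypothetical Stream Analytics query (sketch only; the template's real
-- query and field names may differ). Each statement aggregates events
-- over a 60-second tumbling window and routes the result to one output.
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp() AS windowEnd
INTO [blob-output]
FROM [eventhub-input] TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(second, 60)

SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp() AS windowEnd
INTO [powerbi-output]
FROM [eventhub-input] TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(second, 60)
```

A Stream Analytics job may contain several SELECT…INTO statements like this, so the same windowed aggregate can feed both the archival output and the dashboard output.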

This Azure Resource Manager (ARM) template was created by a member of the community and not by Microsoft. Each ARM template is licensed to you under a license agreement by its owner, not Microsoft. Microsoft is not responsible for ARM templates provided and licensed by community members and does not screen them for security, compatibility, or performance. Community ARM templates are not supported under any Microsoft support program or service, and are made available AS IS without warranty of any kind.

Parameters

| Parameter name | Description |
| --- | --- |
| location | The location in which the resources will be created. Check supported locations. |
| eventHubNamespaceName | Name of the Event Hub namespace. |
| captureTime | The time window, in seconds, for Event Hub capture archival. |
| captureSize | The size window, in bytes, for Event Hub capture. |
| eventhubSku | The messaging tier for the Service Bus namespace. |
| skuCapacity | Messaging units for the premium namespace. |
| isAutoInflateEnabled | Enable or disable Auto-Inflate. |
| messageRetentionInDays | How long, in days, to retain the data in the Event Hub. |
| partitionCount | Number of partitions. |
| captureEncodingFormat | The encoding format Event Hub capture uses to serialize the EventData when archiving to your storage. |
| adlAnalyticsName | The name of the Data Lake Analytics account to create. |
| adlStoreName | The name of the Data Lake Store account to create. |
| vmSize | Size of the VM, e.g. Standard_D1_v2. |
| vm_username | Username for the virtual machine. |
| vm_password | Password for the virtual machine. |
| OptionalWizardInstall | Select whether the VM should be in production or not. |
| dataFactoryName | Name of the data factory. Must be globally unique. |
| appName | Name of the registered Azure Data Lake UI app. Must be globally unique. |
| servicePrincipalId | The ID of the service principal that has permissions to create HDInsight clusters in your subscription. |
| servicePrincipalKey | The access key of the service principal that has permissions to create HDInsight clusters in your subscription. |
| dataLakeAnalyticsLocation | The location in which the resources will be created. Check supported locations. |
| _artifactsLocation | The base URI where the artifacts required by this template are located. |
| _artifactsLocationSasToken | The sasToken required to access _artifactsLocation. When the template is deployed using the accompanying scripts, a sasToken will be automatically generated. Use the defaultValue if the staging location is not secured. |
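Parameters like these are usually supplied through an ARM parameters file rather than on the command line. The sketch below shows what such a file could look like for a few of the parameters in the table; all of the values are illustrative placeholders, not values shipped with the template:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "location": { "value": "eastus" },
    "eventHubNamespaceName": { "value": "myetlns" },
    "adlAnalyticsName": { "value": "myadlanalytics" },
    "adlStoreName": { "value": "myadlstore" },
    "vmSize": { "value": "Standard_D1_v2" },
    "dataFactoryName": { "value": "myetldatafactory" }
  }
}
```

Any parameter omitted from the file falls back to the defaultValue defined in the template, if one exists.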

Use the template

PowerShell

New-AzResourceGroup -Name <resource-group-name> -Location <resource-group-location> #use this command when you need to create a new resource group for your deployment
New-AzResourceGroupDeployment -ResourceGroupName <resource-group-name> -TemplateUri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/yash-datalake/azuredeploy.json
Install and configure Azure PowerShell
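The deployment command above uses the template's default parameter values. To override them, you can point the same cmdlet at a local parameters file; in this sketch, `azuredeploy.parameters.json` and `<resource-group-name>` are placeholders you would replace with your own values:

```powershell
# Sketch: deploy the quickstart template with an explicit parameters file.
# <resource-group-name> and azuredeploy.parameters.json are placeholders.
New-AzResourceGroupDeployment `
  -ResourceGroupName <resource-group-name> `
  -TemplateUri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/yash-datalake/azuredeploy.json `
  -TemplateParameterFile .\azuredeploy.parameters.json
```

You can also override individual parameters inline (for example `-vmSize Standard_D1_v2`), since New-AzResourceGroupDeployment exposes template parameters as dynamic cmdlet parameters.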

Command line

az group create --name <resource-group-name> --location <resource-group-location> #use this command when you need to create a new resource group for your deployment
az group deployment create --resource-group <my-resource-group> --template-uri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/yash-datalake/azuredeploy.json
Install and configure the cross-platform Azure command-line interface