
Performs ETL job using Azure services

By ajos1993
Last updated: 04/03/2020

This template provides an example of how to perform analytics on both historic and real-time streaming data stored in Azure Blob Storage. An Azure Stream Analytics job reads the streaming data from the event hub, transforms it, and writes the output to Azure Blob Storage, where it is visualized in Power BI. Azure Data Lake Analytics runs analytics over the historic data held in Blob storage, while Azure Data Factory orchestrates the movement of the extracted, transformed, and published data. The published data is also visualized in Power BI.

This Azure Resource Manager (ARM) template was created by a member of the community and not by Microsoft. Each ARM template is licensed to you under a license agreement by its owner, not Microsoft. Microsoft is not responsible for ARM templates provided and licensed by community members and does not screen them for security, compatibility, or performance. Community ARM templates are not supported under any Microsoft support program or service, and are made available AS IS without warranty of any kind.

Parameters

Parameter name    Description
location    The location in which the resources will be created. Check supported locations.
eventHubNamespaceName    Name of the Event Hubs namespace.
captureTime    The time window, in seconds, for Event Hubs Capture archival.
captureSize    The size window, in bytes, for Event Hubs Capture.
eventhubSku    The messaging tier for the Service Bus namespace.
skuCapacity    Messaging units for the premium namespace.
isAutoInflateEnabled    Enable or disable Auto-Inflate.
messageRetentionInDays    How long to retain the data in the event hub.
partitionCount    Number of partitions chosen.
captureEncodingFormat    The encoding format Event Hubs Capture uses to serialize the event data when archiving to your storage.
adlAnalyticsName    The name of the Data Lake Analytics account to create.
adlStoreName    The name of the Data Lake Store account to create.
vmSize    Size of the VM, e.g. Standard_D1_v2.
vm_username    Username for the virtual machine.
vm_password    Password for the virtual machine.
OptionalWizardInstall    Select whether the VM should be in production or not.
dataFactoryName    Name of the data factory. Must be globally unique.
appName    Name of the registered Azure Data Lake UI app. Must be globally unique.
servicePrincipalId    The ID of the service principal that has permissions to create HDInsight clusters in your subscription.
servicePrincipalKey    The access key of the service principal that has permissions to create HDInsight clusters in your subscription.
dataLakeAnalyticsLocation    The location in which the resources will be created. Check supported locations.
_artifactsLocation    The base URI where artifacts required by this template are located.
_artifactsLocationSasToken    The SAS token required to access _artifactsLocation. When the template is deployed using the accompanying scripts, a SAS token is generated automatically. Use the default value if the staging location is not secured.
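A subset of these parameters can be supplied through a standard ARM parameters file passed to the deployment commands below. The snippet sketches what such a file might look like; all values here are hypothetical placeholders, and secrets such as vm_password and servicePrincipalKey are better supplied at deployment time than stored in the file.

```shell
# Write a minimal ARM parameters file with placeholder values.
# Every value below is an example only; replace with names that are
# valid and unique in your own subscription.
cat > azuredeploy.parameters.json <<'EOF'
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "location": { "value": "eastus" },
    "eventHubNamespaceName": { "value": "myetlhubns" },
    "adlAnalyticsName": { "value": "myadlanalytics" },
    "adlStoreName": { "value": "myadlstore" },
    "vm_username": { "value": "azureuser" },
    "dataFactoryName": { "value": "myetldatafactory" }
  }
}
EOF
```

The file can then be referenced with `--parameters @azuredeploy.parameters.json` (Azure CLI) or `-TemplateParameterFile azuredeploy.parameters.json` (PowerShell) alongside the template URI.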

Use the template

PowerShell

New-AzResourceGroup -Name <resource-group-name> -Location <resource-group-location> # Use this command when you need to create a new resource group for your deployment
New-AzResourceGroupDeployment -ResourceGroupName <resource-group-name> -TemplateUri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/yash-datalake/azuredeploy.json
Install and configure Azure PowerShell

Command line

az group create --name <resource-group-name> --location <resource-group-location> # Use this command when you need to create a new resource group for your deployment
az group deployment create --resource-group <my-resource-group> --template-uri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/yash-datalake/azuredeploy.json
Install and configure the Azure cross-platform command-line interface