Create a Data Factory, Copy from Blob to SQL with Sproc

This template creates a Data Factory pipeline that copies data from a file in Azure Blob storage into an Azure SQL Database table while invoking a stored procedure (sproc).

Please complete the following steps before deploying the template:

  1. Complete the prerequisites described in the Overview and prerequisites article.
  2. Update the values for the following parameters in the azuredeploy.parameters.json file (a sample parameters file is sketched after these steps).
    1. storageAccountName
    2. storageAccountKey
    3. sqlServerName
    4. sqlDatabaseName
    5. sqlUserId
    6. sqlPassword
  3. Create a stored procedure in your Azure SQL Database. Run the following query to align with the tutorial.
CREATE PROCEDURE spWriteEmployee
AS
BEGIN
	INSERT INTO [dbo].[emp](First, Last)
	VALUES ('Bill', 'Gates')
END
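
The stored procedure writes to the [dbo].[emp] table. If that table does not already exist from the prerequisites, the following is a minimal sketch that is consistent with the INSERT statement above; the varchar(50) column widths are assumptions, so adjust them to whatever the tutorial specifies.

-- Minimal target table for spWriteEmployee.
-- Column names are taken from the INSERT statement above;
-- the varchar(50) widths are assumed, not taken from the tutorial.
CREATE TABLE [dbo].[emp]
(
	[First] varchar(50),
	[Last] varchar(50)
)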

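For reference, the sketch below shows what azuredeploy.parameters.json might look like after step 2. The parameter names come from the list above; the values are placeholders you must replace, and the $schema and contentVersion shown are the standard ARM parameter-file defaults rather than values taken from this template.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "value": "<your-storage-account-name>" },
    "storageAccountKey": { "value": "<your-storage-account-key>" },
    "sqlServerName": { "value": "<your-sql-server-name>" },
    "sqlDatabaseName": { "value": "<your-sql-database-name>" },
    "sqlUserId": { "value": "<your-sql-user-id>" },
    "sqlPassword": { "value": "<your-sql-password>" }
  }
}
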
Deploy To Azure | Deploy To Azure US Gov | Visualize

Deploying sample

You can deploy this sample directly through the Azure Portal or by using the scripts supplied in the root of the repository.

To deploy a sample using the Azure Portal, click the Deploy to Azure button at the top of the article.

To deploy the sample via the command line (using Azure PowerShell or the Azure CLI), you can use the scripts.

Simply execute the script from the root folder and pass in the folder name of the sample (101-data-factory-blob-to-sql-copy-stored-proc). For example:

.\Deploy-AzureResourceGroup.ps1 -ResourceGroupLocation 'eastus' -ArtifactStagingDirectory 101-data-factory-blob-to-sql-copy-stored-proc
azure-group-deploy.sh -a 101-data-factory-blob-to-sql-copy-stored-proc -l eastus

Tags: Microsoft.DataFactory/datafactories, linkedservices, AzureStorage, AzureSqlDatabase, datasets, AzureBlob, TextFormat, AzureSqlTable, dataPipelines, Copy, BlobSource, SqlSink, SqlServerStoredProcedure