
We are happy to announce that we have added support for new data stores to Azure Data Factory. You can now also connect to MySQL, DB2, Sybase, PostgreSQL, and Teradata, and copy data from any of these sources to Azure Blob or Azure SQL Database.

Let’s walk through an example of how to move data from MySQL to Azure Blob. First, add a linked service for MySQL.


The linked service definition for MySQL will look like this:

{
    "name": "MyLinkedService_MySQL",
    "properties":
    {
        "type": "OnPremisesMySqlLinkedService",
        "server": "[ServerName]",
        "database": "[DatabaseName]",
        "username": "[UserName]",
        "password": "[Password]",
        "gatewayName": "[GatewayNameUsedToConnectToThisMySQL]","
        "authenticationType": "[Type of authentication: Basic/Windows/Anonymous]"
    }
}

A description of each field can be found here. Note that to do the same for DB2, Sybase, PostgreSQL, or Teradata, you only need to specify a different ‘type’, which can also be found here; the remaining fields are identical.
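For example, a linked service for PostgreSQL would differ from the MySQL one only in its type. (The type string below, OnPremisesPostgreSqlLinkedService, is an assumption by analogy with the MySQL example; the authoritative values are in the documentation linked above.)

{
    "name": "MyLinkedService_PostgreSQL",
    "properties":
    {
        "type": "OnPremisesPostgreSqlLinkedService",
        "server": "[ServerName]",
        "database": "[DatabaseName]",
        "username": "[UserName]",
        "password": "[Password]",
        "gatewayName": "[GatewayNameUsedToConnectToThisPostgreSQL]",
        "authenticationType": "[Type of authentication: Basic/Windows/Anonymous]"
    }
}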

Next, let’s define the dataset we intend to copy over. It will look like this:

{
    "name": "DataSet_MySQL",
    "properties":
    {
        "location": 
        {
            "type": "RelationalTableLocation",
            "tableName": "northwind_mysql.orders",
            "linkedServiceName": "MyLinkedService_MySQL"
        },
        "availability": 
        {
            "frequency": "Hour",
            "interval": 1,
            "waitOnExternal":
            {
                "retryInterval": "00:01:00",
                "retryTimeout": "00:10:00",
                "maximumRetry": 3
            }
        }
    }
}
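The copy pipeline below also needs an output dataset, DataSet_AzureBlob, describing where the data lands in Blob storage. As a rough sketch (the folder path, file format, and the linked service name MyLinkedService_AzureStorage are hypothetical placeholders), it could look like this:

{
    "name": "DataSet_AzureBlob",
    "properties":
    {
        "location":
        {
            "type": "AzureBlobLocation",
            "folderPath": "[ContainerName]/orders",
            "format":
            {
                "type": "TextFormat",
                "columnDelimiter": ","
            },
            "linkedServiceName": "MyLinkedService_AzureStorage"
        },
        "availability":
        {
            "frequency": "Hour",
            "interval": 1
        }
    }
}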

Now that the datasets are defined, let’s use them in a pipeline to copy the data from MySQL to Azure Blob:

{
    "name": "CopyPipeline_MySQL_To_Blob",
    "properties":
    {
        "activities":
        [
            {
                "name": "MySQLToBlobCopyActivity",
                "inputs": [ {"name": "DataSet_MySQL"} ],
                "outputs": [ {"name": "DataSet_AzureBlob"} ],
                "type": "CopyActivity",
                "transformation":
                {
                    "source":
                    {                               
                        "type": "RelationalSource",
                        "query": "select * from northwind_mysql.orders"
                    },
                    "sink":
                    {
                        "type": "BlobSink",
                        "writeBatchSize": 1000000,
                        "writeBatchTimeout": "01:00:00"
                    }
                },
                "policy":
                {
                    "concurrency": 1,
                    "executionPriorityOrder": "NewestFirst",
                    "retry": 2,
                    "timeout": "01:00:00"
                }
            }
        ]
    }
}
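For the pipeline to run end to end, the hypothetical MyLinkedService_AzureStorage referenced by DataSet_AzureBlob must also exist as an Azure Storage linked service. A minimal sketch (the account name and key are placeholders) would look like this:

{
    "name": "MyLinkedService_AzureStorage",
    "properties":
    {
        "type": "AzureStorageLinkedService",
        "connectionString": "DefaultEndpointsProtocol=https;AccountName=[AccountName];AccountKey=[AccountKey]"
    }
}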


With this quick and simple configuration, you now have a pipeline that copies data from on-premises MySQL to Azure Blob.


The complete copy matrix, along with related information, can be found here.

Note: These additional data stores can currently be used only as the source of a copy activity, not as the destination.

We are continuously adding more data stores to Azure Data Factory. If you need one we don’t currently support, or have any other feedback, visit the Azure Data Factory User Voice and/or Forums and let us know your thoughts. We are eager to hear from you!
