New connectors added to Azure Data Factory empowering richer insights

Published February 4, 2019

Program Manager, Azure Data Factory

Data is essential to your business. The ability to unlock business insights more efficiently can be a key competitive advantage for the enterprise. As data grows in volume, variety, and velocity, organizations need to bring together a continuously increasing set of diverse datasets across silos in order to perform advanced analytics and uncover business opportunities. The first challenge in building such big data analytics solutions is connecting to and extracting data from a broad variety of data stores. Azure Data Factory (ADF) is a fully managed data integration service for analytic workloads in Azure that empowers you to copy data from more than 80 data sources with a simple drag-and-drop experience. With its flexible control flow, rich monitoring, and CI/CD capabilities, you can also operationalize and manage ETL/ELT flows to meet your SLAs.

Today, we are excited to announce the release of a set of new ADF connectors which enable more scenarios and possibilities for your analytic workloads. For example, you can now:

  • Ingest data from Google Cloud Storage into Azure Data Lake Gen2, and process using Azure Databricks jointly with data coming from other sources.
  • Bring data into Azure from any S3-compatible storage service, including ones you may consume from third-party data vendors.
  • Copy data from MongoDB and others to Azure Cosmos DB's API for MongoDB for application consumption.
  • Retrieve data from any RESTful endpoint as an extensible point to reach hundreds of SaaS applications.

For more information, see the following updates on new connectors and additional features for existing connectors.

Connector updates

Azure Cosmos DB's API for MongoDB 

You can now copy data to and from Azure Cosmos DB's API for MongoDB, in addition to the already supported SQL API. For writing into Azure Cosmos DB specifically, the connector sink is built on top of the Azure Cosmos DB bulk executor library to provide the best performance. Learn more about Azure Cosmos DB's API for MongoDB.
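As a sketch, a linked service pointing at Azure Cosmos DB's API for MongoDB can be defined in ADF's JSON authoring format roughly as follows (the name, account, and database values are illustrative placeholders):

```json
{
    "name": "CosmosDbMongoDbApiLinkedService",
    "properties": {
        "type": "CosmosDbMongoDbApi",
        "typeProperties": {
            "connectionString": "mongodb://<account>:<key>@<account>.documents.azure.com:10255/?ssl=true&replicaSet=globaldb",
            "database": "myDatabase"
        }
    }
}
```

A dataset referencing this linked service can then be used as either the source or the sink of a copy activity.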

Amazon S3

ADF now supports custom S3 endpoint configuration in the Amazon S3 connector. With this, you can copy data from any S3-compatible storage provider using the connector and are no longer limited to the official Amazon S3 service. Learn more about the Amazon S3 connector.
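For example, a linked service for an S3-compatible provider might look like the following sketch, where the custom endpoint is supplied via the `serviceUrl` property (endpoint and credentials are placeholders):

```json
{
    "name": "S3CompatibleLinkedService",
    "properties": {
        "type": "AmazonS3",
        "typeProperties": {
            "serviceUrl": "https://<S3-compatible endpoint>",
            "accessKeyId": "<access key id>",
            "secretAccessKey": {
                "type": "SecureString",
                "value": "<secret access key>"
            }
        }
    }
}
```

When `serviceUrl` is omitted, the connector targets the official Amazon S3 service as before.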

Google Cloud Storage

As Google Cloud Storage provides S3-compatible interoperability, you can now copy data from Google Cloud Storage. This leverages the S3 connector with Google Cloud Storage’s corresponding S3 endpoint. Learn more about Google Cloud Storage connector.
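Concretely, this means reusing the Amazon S3 linked service with Google Cloud Storage's S3-compatible endpoint and interoperability (HMAC) keys, along these lines (credentials are placeholders):

```json
{
    "name": "GoogleCloudStorageLinkedService",
    "properties": {
        "type": "AmazonS3",
        "typeProperties": {
            "serviceUrl": "https://storage.googleapis.com",
            "accessKeyId": "<interoperability access key>",
            "secretAccessKey": {
                "type": "SecureString",
                "value": "<interoperability secret>"
            }
        }
    }
}
```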

MongoDB

To address feedback on MongoDB feature coverage, performance, and scalability, ADF has released a new version of the MongoDB connector. It provides comprehensive native MongoDB support, including a generic MongoDB connection string with connection options, native MongoDB queries, extraction of hierarchical data, and more. Learn more about the MongoDB connector.
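A minimal sketch of a linked service using the new connector, which accepts a standard MongoDB connection string (host, credentials, and database below are placeholders):

```json
{
    "name": "MongoDbLinkedService",
    "properties": {
        "type": "MongoDbV2",
        "typeProperties": {
            "connectionString": "mongodb://<user>:<password>@<host>:27017/?authSource=admin",
            "database": "myDatabase"
        }
    }
}
```

Because the connection string is passed through natively, MongoDB connection options (replica sets, TLS settings, and so on) can be expressed directly in it.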

Azure Database for MariaDB

You can now copy data from Azure Database for MariaDB. Learn more about the Azure Database for MariaDB connector.

Generic REST

You can now retrieve data from various RESTful services and apps. ADF has released a more targeted REST connector in addition to the generic HTTP connector. To fulfill the two most common asks we’ve received, this REST connector supports Azure Active Directory (AAD) service principal and Managed Identity for Azure resources (MSI) authentication, as well as pagination rules. Learn more about the REST connector.
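As an illustration, a REST linked service authenticating with an AAD service principal might be sketched as follows, with pagination handled by a rule on the copy source that follows a next-page link in the response (all URLs, property values, and the `$.nextLink` path are illustrative assumptions about the target API):

```json
{
    "name": "RestLinkedService",
    "properties": {
        "type": "RestService",
        "typeProperties": {
            "url": "https://<api base URL>",
            "authenticationType": "AadServicePrincipal",
            "servicePrincipalId": "<application id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<application key>"
            },
            "tenant": "<tenant id>",
            "aadResourceId": "<AAD resource URI of the API>"
        }
    }
}
```

On the copy activity's REST source, a pagination rule such as the following tells ADF how to request subsequent pages:

```json
"paginationRules": {
    "AbsoluteUrl": "$.nextLink"
}
```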

Generic OData

ADF now supports AAD service principal and Managed Identity for Azure resources (MSI) authentication when copying data from an OData endpoint. Learn more about the OData connector.
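A hedged sketch of what an OData linked service with service principal authentication might look like (the endpoint, credentials, and resource URI are placeholders, and exact property names should be confirmed against the connector documentation):

```json
{
    "name": "ODataLinkedService",
    "properties": {
        "type": "OData",
        "typeProperties": {
            "url": "https://<OData service endpoint>",
            "authenticationType": "AadServicePrincipal",
            "servicePrincipalId": "<application id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<application key>"
            },
            "tenant": "<tenant id>",
            "aadResourceId": "<AAD resource URI of the service>"
        }
    }
}
```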

Dynamics AX (preview)

You can now copy data from Dynamics AX using the OData protocol with service principal authentication. This connector also works with Dynamics 365 Finance and Operations (F&O). Learn more about the Dynamics AX connector.
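Putting the above together, a Dynamics AX linked service can be sketched roughly as follows, where the OData endpoint of the instance and the service principal credentials are placeholders:

```json
{
    "name": "DynamicsAXLinkedService",
    "properties": {
        "type": "DynamicsAX",
        "typeProperties": {
            "url": "https://<instance>.cloudax.dynamics.com/data",
            "servicePrincipalId": "<application id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<application key>"
            },
            "tenant": "<tenant id>",
            "aadResourceId": "https://<instance>.cloudax.dynamics.com"
        }
    }
}
```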

We encourage you to give these additions a try and provide us with feedback. We hope you find them helpful in your scenarios. Please post your questions on the Azure Data Factory forum or share your thoughts with us on the Data Factory feedback site.