Azure Data Factory is a globally deployed data movement service in the cloud. Use it to easily ingest data from multiple on-premises and cloud sources. Connect to on-premises sources through the Data Management Gateway, and use Data Factory to move your data where it needs to go. Prepare and partition your data as you ingest it, or apply preprocessing steps.
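As a concrete illustration, an ingestion step is expressed as a Copy activity in a pipeline's JSON definition. The sketch below is a minimal example, assuming an on-premises SQL Server dataset (reached through the gateway) and an Azure Blob output dataset; the names, query, and schedule window are placeholders, not part of any real deployment.

```json
{
  "name": "IngestOnPremSqlToBlob",
  "properties": {
    "description": "Copy a table from on-premises SQL Server (via the gateway) to blob storage.",
    "activities": [
      {
        "name": "CopySqlToBlob",
        "type": "Copy",
        "inputs": [ { "name": "OnPremSqlDataset" } ],
        "outputs": [ { "name": "BlobOutputDataset" } ],
        "typeProperties": {
          "source": { "type": "SqlSource", "sqlReaderQuery": "SELECT * FROM Orders" },
          "sink": { "type": "BlobSink" }
        }
      }
    ],
    "start": "2016-01-01T00:00:00Z",
    "end": "2016-01-02T00:00:00Z"
  }
}
```

The `start` and `end` properties define the active period during which Data Factory schedules slices of the copy.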
Schedule, orchestrate, and manage the data transformation and analysis process. Choose from a wide range of processing services, and compose them into managed data pipelines so you can use the best tool for each job. For example, add a Hadoop processing step for big or semi-structured data, a stored-procedure invocation step for structured data, or a machine-learning step for analytics; or insert your own custom code as a processing step in any pipeline.
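Composing processing services this way means listing multiple activity types in one pipeline definition. The sketch below chains a Hadoop (Hive) step into a stored-procedure step; the linked service names, script path, dataset names, and procedure name are illustrative placeholders.

```json
{
  "name": "TransformAndLoadPipeline",
  "properties": {
    "description": "Shape raw data with Hive, then load it via a stored procedure.",
    "activities": [
      {
        "name": "RunHiveScript",
        "type": "HDInsightHive",
        "linkedServiceName": "HDInsightLinkedService",
        "inputs": [ { "name": "RawDataset" } ],
        "outputs": [ { "name": "ShapedDataset" } ],
        "typeProperties": {
          "scriptPath": "scripts/transform.hql",
          "scriptLinkedService": "StorageLinkedService"
        }
      },
      {
        "name": "LoadShapedData",
        "type": "SqlServerStoredProcedure",
        "linkedServiceName": "SqlLinkedService",
        "inputs": [ { "name": "ShapedDataset" } ],
        "outputs": [ { "name": "LoadedDataset" } ],
        "typeProperties": {
          "storedProcedureName": "usp_LoadShapedData"
        }
      }
    ],
    "start": "2016-01-01T00:00:00Z",
    "end": "2016-01-02T00:00:00Z"
  }
}
```

Because the second activity's input is the first activity's output dataset, Data Factory infers the dependency and runs the steps in order.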
Use data pipelines to transform raw data into finished, shaped data that's ready for consumption by BI tools or applications. Use Data Factory to deliver your valuable data where it needs to go for simple consumption by your on-premises or cloud applications and services.
Visualize, monitor, and manage your entire network of data pipelines at a glance to identify issues and take action. Easily understand when data arrived, where it came from, and how and when it’s ready for processing. Set up alerts to monitor your overall Data Factory service health. Let Data Factory save you time and money by automating your data pipelines with on-demand cloud resource management.