Azure Data Factory copy activity supports resume from last failed run
Updated: 18 January 2020
Azure Data Factory copy activity now supports resuming from the last failed run when you copy files between file-based data stores, including Amazon S3, Google Cloud Storage, Azure Blob Storage, and Azure Data Lake Storage Gen2, among many others. Take advantage of this feature to ingest or migrate large-scale data easily and efficiently, for example from Amazon S3 to Azure Data Lake Storage Gen2. On copy activity retry, or on a manual rerun of the failed activity from the pipeline, the copy activity continues from where the last run failed instead of starting over.
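As a sketch of how this fits into a pipeline, the following copy activity definition copies binary files from Amazon S3 to Azure Data Lake Storage Gen2 with a retry policy, so a transient failure triggers an automatic rerun that picks up from the last failed run. The activity and dataset names (`CopyS3ToAdlsGen2`, `S3BinarySource`, `AdlsGen2BinarySink`) and the retry values are illustrative placeholders, not from the announcement; resume itself needs no extra property.

```json
{
    "name": "CopyS3ToAdlsGen2",
    "type": "Copy",
    "policy": {
        "retry": 2,
        "retryIntervalInSeconds": 30
    },
    "inputs": [
        { "referenceName": "S3BinarySource", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "AdlsGen2BinarySink", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": {
                "type": "AmazonS3ReadSettings",
                "recursive": true
            }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": {
                "type": "AzureBlobFSWriteSettings"
            }
        }
    }
}
```

With `retry` set, a failed run is retried automatically after the interval; you can also rerun the failed activity manually from the pipeline monitoring view, and either path continues from where the previous run stopped.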