New architecture blueprint: HPC and data orchestration using Batch and Data Factory

A few months back we shared with you an Azure architecture blueprint for Large-Scale Computing of Financial Services workloads. It illustrates a real-world scenario that uses Azure Data Factory to manage and monitor complex data flows, while leveraging Azure Big Compute, Big Data, and infrastructure services including Batch, HDInsight, Machine Learning, and other Azure services and solutions.

We have now published a condensed version of that scenario, in the form of an example solution that includes a step-by-step walkthrough, as well as its corresponding architecture diagram.

Azure HPC: New architecture blueprint (architecture diagram)

The purpose of this example solution is to help customers in any industry vertical, running any type of workload, take advantage of the built-in integration between Azure Data Factory and Azure Batch.
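
To give a flavor of that integration, here is a minimal sketch of a Data Factory pipeline definition containing a Custom activity that runs a command on an Azure Batch pool. This is not the blueprint's actual configuration: the pipeline, activity, linked service names, command, and folder path are all placeholders, and the JSON is only assembled locally so you can review it before deploying it with your preferred tool (portal, CLI, or an ARM template).

```python
import json

# Hypothetical linked service names; substitute the Azure Batch and
# Azure Storage linked services defined in your own Data Factory.
BATCH_LINKED_SERVICE = "AzureBatchLinkedService"
STORAGE_LINKED_SERVICE = "AzureStorageLinkedService"

# A Custom activity hands its work off to an Azure Batch pool, so Data Factory
# orchestrates and monitors the run while Batch supplies the compute nodes.
pipeline_definition = {
    "name": "RiskAnalysisPipeline",  # placeholder pipeline name
    "properties": {
        "activities": [
            {
                "name": "RunSimulationOnBatch",  # placeholder activity name
                "type": "Custom",
                "linkedServiceName": {
                    "referenceName": BATCH_LINKED_SERVICE,
                    "type": "LinkedServiceReference",
                },
                "typeProperties": {
                    # Command executed on each Batch compute node (placeholder).
                    "command": "python simulate.py --scenario daily-risk",
                    # Storage linked service and folder holding the application files.
                    "resourceLinkedService": {
                        "referenceName": STORAGE_LINKED_SERVICE,
                        "type": "LinkedServiceReference",
                    },
                    "folderPath": "apps/simulation",
                },
            }
        ]
    },
}

if __name__ == "__main__":
    # Print the JSON so it can be inspected or submitted to Data Factory.
    print(json.dumps(pipeline_definition, indent=2))
```

The Azure Batch linked service referenced above is what points the Custom activity at a specific Batch account and pool; other activities (HDInsight, Machine Learning, data movement) can sit alongside it in the same pipeline, which is the pattern the step-by-step walkthrough builds on.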

Also, if you like these blueprints, you may want to check out the collection of scenario-based Architecture Blueprints.

Finally, and as always, we want to hear from you. Let us know if example solutions like the one mentioned in this post are useful to you, and share ideas with us about what we can offer to help you build your next solution on Azure.