Azure for banking and capital markets

Get high-performance modeling, analytics, and data management at cloud scale

Discover HPC for financial services

Perform complex calculations on huge data volumes using on-demand compute resources. Use cloud-based high-performance computing (HPC) to run large parallel and batch compute jobs, meet future capacity needs, and greatly reduce the time required to calculate valuation and risk.

Learn more about HPC on Azure
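The scatter/gather pattern behind such parallel risk jobs can be sketched with nothing but the Python standard library. This is a local stand-in, not an Azure API: the option parameters, chunk counts, and worker pool below are illustrative, with each "task" pricing one chunk of Monte Carlo paths and the results averaged at the end.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

def price_chunk(seed, n_paths, s0=100.0, k=105.0, r=0.02, sigma=0.2, t=1.0):
    """One grid task: Monte Carlo price of a European call over a chunk of paths."""
    rng = random.Random(seed)  # per-task RNG keeps results deterministic
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    payoff_sum = 0.0
    for _ in range(n_paths):
        s_t = s0 * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        payoff_sum += max(s_t - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n_paths

def run_grid(n_tasks=8, paths_per_task=20_000):
    # Scatter: one pricing task per seed; gather: average the chunk estimates.
    with ThreadPoolExecutor(max_workers=4) as pool:
        estimates = list(pool.map(price_chunk, range(n_tasks),
                                  [paths_per_task] * n_tasks))
    return sum(estimates) / len(estimates)

price = run_grid()
print(f"Estimated option value: {price:.2f}")
```

On a real HPC grid, each chunk would run as an independent job on a cluster node rather than a local thread, which is what makes the workload scale with on-demand capacity.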

Meet global requirements

Choose from the broadest portfolio of security and regulatory compliance certifications in the banking and capital markets industry. Trust the security built into every aspect of the platform, from the physical security of Azure datacenters to ongoing threat analysis.

Learn more about Azure security, privacy, transparency and compliance

Experience speed and scale

Match your computing needs with the right resources at the right time. Gain a market edge by extending your on-premises HPC financial services workloads to the cloud when you need more capacity, or by running all workloads entirely in Azure.

Stick with tools you already know

Use Azure with commercial data connector and scheduling tools like TIBCO DataSynapse GridServer and Univa Grid Engine, or build out your own Windows or Linux clusters to run tools like Torque and SLURM with the help of Microsoft HPC Pack, an HPC compute cluster solution.

Extend your current workloads or deploy in the cloud

Hybrid cloud

Save infrastructure costs by extending your on-premises HPC or grid computing solution to Azure as a hybrid solution. Increase capacity on your existing HPC cluster and only pay for resources when you need them.

Cloud native

Deploy cloud applications on Windows or Linux infrastructure-as-a-service (IaaS) virtual machines. Use your own custom images, and run HPC workloads using your job scheduling solution of choice.

Managing risk at scale

Instead of buying hundreds of servers for its on-premises grid, Mitsubishi UFJ used the deployment, administration, job scheduling, and monitoring tools in HPC Pack to offload its daily risk calculations to Azure. With a dramatic increase in computing power and no increase in staffing, the company saves millions of dollars in servers and datacenter space.

Read the full case study

Browse more case studies

MUFG
"I can now manage 750 machines in Azure on weekdays and a thousand on weekends. Plus an extra 300 production machines on-prem. And that's all done by one person."

-Robert Griffiths, Director

Axioma
"The Microsoft Cloud gives us infinite capacity to handle these large books spanning all the asset classes large financial institutions hold. Leveraging an evergreen cloud platform gives us agility in our development cycle and ultimately improves time to market. As a result, our solutions are able to innovate in sync with our client needs."

-Fabien Couderc, Head of Enterprise Development

Innovating more freely

Using Azure as its risk computation engine allowed Axioma to offer an elastic enterprise-wide risk management system across asset classes. This transformation reduced the company's dependency on large on-premises datacenters and huge development and operations teams, and enabled its teams to innovate in analytics free of infrastructure capacity constraints.

Read the full case study

Browse more case studies

Designing for the hybrid cloud

  1. Use Azure Data Factory to collect financial data from disparate on-premises data sources into Azure Storage or Azure SQL Database.
  2. Connect to commercial data sources using Azure Data Factory and ingest the data into Azure Storage.
  3. Deploy a cluster entirely in the cloud with HPC Pack or with your own grid computing solution. Deploy virtual machines from a growing list of Windows and Linux images on Azure Marketplace.
  4. If you prefer not to manage compute infrastructure, use Azure Batch, a job-scheduling service for running large-scale parallel workloads.
  5. Further aggregate output data in cloud storage with Azure HDInsight or with Hadoop running on Azure IaaS virtual machines.
  6. Create rich predictive models using Azure Machine Learning and operationalize them in your Azure Data Factory data-integration workflow.
  7. Move processed data to a cloud-based or on-premises data mart using Azure Data Factory and consume the data with online solutions like Microsoft Power BI or with client analysis and visualization tools.
  8. Consume actuarial and quantitative analysis results securely by authenticating with corporate account credentials federated with Azure Active Directory.
  9. Compose, schedule, operationalize, manage, and monitor your entire data pipeline in a single interface using Azure Data Factory.
  10. Create private connections with higher security, greater reliability, faster speeds, and lower latencies between Azure datacenters and your on-premises clusters with Azure ExpressRoute.
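Step 4 hands scheduling to Azure Batch, where a submission is essentially a job bound to a compute pool plus a list of command-line tasks. The sketch below builds that payload shape locally; the field names mirror the Azure Batch REST API (jobId, poolInfo, commandLine), but the job and pool IDs, the `run_valuation.py` script, and the portfolio slices are illustrative placeholders, and no request is actually sent.

```python
import json

def make_risk_job(job_id, pool_id, portfolios):
    """Build a Batch-style job and its tasks: one task per portfolio slice.

    Field names follow the Azure Batch REST API shape; all IDs and the
    command line are hypothetical stand-ins for a real risk workload.
    """
    job = {"id": job_id, "poolInfo": {"poolId": pool_id}}
    tasks = [
        {
            "id": f"task-{i:03d}",
            "commandLine": f"python run_valuation.py --portfolio {name}",
        }
        for i, name in enumerate(portfolios)
    ]
    return job, tasks

job, tasks = make_risk_job("nightly-var", "hpc-pool", ["rates", "credit", "fx"])
print(json.dumps(job, indent=2))
print(json.dumps(tasks[0], indent=2))
```

Because the service owns pool provisioning, scaling, and retries, the client's job reduces to describing work in this declarative form, which is what "prefer not to manage compute infrastructure" amounts to in practice.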

See a sample solution for processing large-scale datasets in Azure.

Partners gallery

  • Apex
  • Axioma
  • Axis
  • Excelian
  • Milliman
  • Numerix
  • Oliver Wyman
  • RiskMetrics
  • RMS Service Group
  • SunGard
  • Willis Towers Watson
  • TIBCO
  • Aneo
  • Univa
  • Cycle Computing
  • IBM
  • Endjin

Technical resources