Scale up your deep learning with Batch AI preview
Imagine reducing your training time for an epoch from 30 minutes to 30 seconds, and testing many different hyper-parameter configurations in parallel.
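To make the parallel-sweep idea concrete, here is a minimal local sketch that assumes nothing beyond Python's standard library. The train() function, parameter grid, and scoring rule are hypothetical placeholders, and the code does not use the Batch AI API; it only mirrors the pattern of running one training job per configuration.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def train(learning_rate, batch_size):
    """Hypothetical stand-in for a single training run; returns a mock score."""
    return {"learning_rate": learning_rate,
            "batch_size": batch_size,
            "score": 1.0 / (learning_rate * batch_size)}

if __name__ == "__main__":
    # Every combination of the two hypothetical hyper-parameters.
    grid = list(product([0.1, 0.01, 0.001], [32, 64, 128]))
    learning_rates, batch_sizes = zip(*grid)
    # Run each configuration in its own process, analogous to a cluster
    # scheduler running one training job per configuration.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(train, learning_rates, batch_sizes))
    best = max(results, key=lambda r: r["score"])
    print("best configuration:", best)
```

On a managed service the same fan-out happens across cluster nodes rather than local processes, so the wall-clock cost of the sweep stays close to that of a single run.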
As our customers bring their workloads to the cloud to take advantage of on-demand scale, without the overhead of managing infrastructure, they ask whether Microsoft can support their Linux workloads. The answer is yes.
I’m excited to announce the general availability of Azure Batch, our job scheduling and compute pool management service.
The Azure Big Compute team is attending SC14 this week, the annual conference for high performance computing, networking, storage, and analysis.