Azure has two different deployment models for creating and working with resources: Resource Manager and classic. This article covers using both models, but Microsoft recommends that most new deployments use the Resource Manager model.
There are two levels of load balancing available for Azure infrastructure services:

- **DNS level:** distribution of traffic across cloud services or websites in different datacenters, or across external endpoints, using Traffic Manager.
- **Network level:** distribution of incoming traffic among the virtual machines in a cloud service or virtual network, using the Azure Load Balancer.
Traffic Manager allows you to control the distribution of user traffic to endpoints, which can include cloud services, websites, external sites, and other Traffic Manager profiles. Traffic Manager works by applying an intelligent policy engine to Domain Name System (DNS) queries for the domain names of your Internet resources. Your cloud services or websites can be running in different datacenters across the world.
You must use either REST or Windows PowerShell to configure external endpoints or Traffic Manager profiles as endpoints.
Traffic Manager uses one of three load-balancing methods to distribute traffic:

- **Failover:** Traffic is sent to a primary endpoint. If that endpoint is unavailable, traffic is sent to the next endpoint in a configured order.
- **Round Robin:** Traffic is distributed equally across all healthy endpoints.
- **Performance:** Traffic is sent to the endpoint that is "closest" to the requesting client in terms of the lowest network latency.
For more information, see About Traffic Manager Load Balancing Methods.
The following diagram shows an example of the Round Robin load balancing method for distributing traffic between different cloud services.
The basic process is the following:

1. An Internet client queries DNS for the domain name of your application, and the query is forwarded to Traffic Manager.
2. Traffic Manager applies the configured load-balancing method (Round Robin in this example) to choose an endpoint, and returns the DNS name of the chosen cloud service.
3. The client resolves the DNS name of that cloud service to its IP address and connects to it directly.
For more information, see Traffic Manager.
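As a sketch of the PowerShell approach mentioned above, the following uses the classic Azure Service Management cmdlets to create a Round Robin profile and add two cloud service endpoints. The profile, domain, and cloud service names are hypothetical.

```powershell
# Create a Traffic Manager profile that uses the Round Robin method.
# "myapp.trafficmanager.net" and the monitoring settings are example values.
$profile = New-AzureTrafficManagerProfile -Name "MyProfile" `
    -DomainName "myapp.trafficmanager.net" `
    -LoadBalancingMethod "RoundRobin" `
    -Ttl 30 `
    -MonitorProtocol "Http" -MonitorPort 80 -MonitorRelativePath "/"

# Add two cloud service endpoints (hypothetical names) to the profile.
$profile = Add-AzureTrafficManagerEndpoint -TrafficManagerProfile $profile `
    -DomainName "contoso-us.cloudapp.net" -Type "CloudService" -Status "Enabled"
$profile = Add-AzureTrafficManagerEndpoint -TrafficManagerProfile $profile `
    -DomainName "contoso-eu.cloudapp.net" -Type "CloudService" -Status "Enabled"

# Commit the changes to the profile.
Set-AzureTrafficManagerProfile -TrafficManagerProfile $profile
```

Because Traffic Manager works at the DNS level, the profile's monitoring settings determine which endpoints are considered healthy when queries are answered.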
Virtual machines in the same cloud service or virtual network can communicate with each other directly using their private IP addresses. Computers and services outside the cloud service or virtual network can only communicate with virtual machines in a cloud service or virtual network that have a configured endpoint. An endpoint is a mapping of a public IP address and port to the private IP address and port of a virtual machine or web role within an Azure cloud service.
The Azure Load Balancer randomly distributes a specific type of incoming traffic across multiple virtual machines or services in a configuration known as a load-balanced set. For example, you can spread the load of web request traffic across multiple web servers or web roles.
The following diagram shows a load-balanced endpoint for standard (unencrypted) web traffic that is shared among three virtual machines, all using public and private TCP port 80. These three virtual machines are in a load-balanced set.
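A configuration like the one in the diagram can be sketched with the classic Azure PowerShell cmdlets, which add each virtual machine's port 80 endpoint to a shared load-balanced set. The cloud service name, virtual machine names, and set name are hypothetical.

```powershell
# Add each of three VMs in cloud service "contoso-web" (hypothetical names)
# to a load-balanced set named "WebFarm" on public and private TCP port 80.
"web1", "web2", "web3" | ForEach-Object {
    Get-AzureVM -ServiceName "contoso-web" -Name $_ |
        Add-AzureEndpoint -Name "HttpIn" -Protocol "tcp" `
            -PublicPort 80 -LocalPort 80 `
            -LBSetName "WebFarm" `
            -ProbeProtocol "tcp" -ProbePort 80 |
        Update-AzureVM
}
```

Endpoints that share the same `-LBSetName` value form the load-balanced set, and the probe settings tell the load balancer how to check each virtual machine's health.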
Azure can also load balance within a cloud service or virtual network. This is known as internal load balancing and can be used in the following ways:
Similar to Azure load balancing, internal load balancing is facilitated by configuring an internal load-balanced set.
The following diagram shows an example of an internal load-balanced endpoint for a line of business (LOB) application that is shared among three virtual machines in a cross-premises virtual network.
For the steps to create a load-balanced set, see Configure an internal load-balanced set.
For more information about the internal load balancer, see Internal load balancing.
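An internal load-balanced set like the one in the LOB diagram can be sketched as follows with the classic Azure PowerShell cmdlets: first add an internal load balancer instance to the cloud service, then attach each virtual machine's endpoint to it. The service, subnet, IP address, and port values here are hypothetical.

```powershell
# Add an internal load balancer (hypothetical name "ilb-lob") to cloud service
# "contoso-lob", with a static IP on a subnet of the virtual network.
Add-AzureInternalLoadBalancer -ServiceName "contoso-lob" `
    -InternalLoadBalancerName "ilb-lob" `
    -SubnetName "AppSubnet" -StaticVNetIPAddress "10.0.1.100"

# Add each VM's endpoint (hypothetical port 8080 for the LOB application)
# to an internal load-balanced set behind "ilb-lob".
"lob1", "lob2", "lob3" | ForEach-Object {
    Get-AzureVM -ServiceName "contoso-lob" -Name $_ |
        Add-AzureEndpoint -Name "LobIn" -Protocol "tcp" `
            -PublicPort 8080 -LocalPort 8080 `
            -LBSetName "LobFarm" `
            -ProbeProtocol "tcp" -ProbePort 8080 `
            -InternalLoadBalancerName "ilb-lob" |
        Update-AzureVM
}
```

Because the load balancer's IP address is on the virtual network rather than the public Internet, only clients within the virtual network (including cross-premises clients connected to it) can reach the load-balanced endpoint.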