At PDC last week, we introduced two communication-related capabilities: a) inter-role communication and b) external endpoints on worker roles. These capabilities enable new application patterns in Windows Azure-hosted services.
While loosely coupled communication via Queues remains the preferred method for reliable message processing, roles can now communicate directly using TCP or HTTP connections. In addition, roles are notified as role instances within the deployment are added or removed, enabling elasticity. A common application pattern enabled by this is client-server, where the server could be an application such as a database or a memory cache.
This is implemented via a) a worker role with an InternalEndpoint in the service definition, b) server code that calls RoleEnvironment.CurrentRoleInstance to discover which IP/port to bind to, and c) client code that calls RoleEnvironment.Roles[&lt;TargetRole&gt;] to discover the server endpoints. For more information, refer to the HelloFabric SDK sample. To add elasticity, refer to the documentation for the RoleEnvironment.Changed event to have the clients notified when server instances are added or removed.
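The three steps above can be sketched roughly as follows. This is an illustrative sketch, not a complete role implementation: the endpoint name "ServerEndpoint" and role name "Server" are placeholders, and the API calls assume the Microsoft.WindowsAzure.ServiceRuntime assembly shipped with the SDK.

First, the internal endpoint is declared in ServiceDefinition.csdef:

```xml
<!-- Illustrative fragment; names are placeholders. -->
<WorkerRole name="Server">
  <Endpoints>
    <InternalEndpoint name="ServerEndpoint" protocol="tcp" />
  </Endpoints>
</WorkerRole>
```

Then the server binds to the address the fabric assigned, and clients enumerate the server role's instances to find it:

```csharp
using System.Linq;
using System.Net;
using System.Net.Sockets;
using Microsoft.WindowsAzure.ServiceRuntime;

// Server side: look up the IP/port assigned to this instance and bind to it.
IPEndPoint serverEndpoint =
    RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["ServerEndpoint"].IPEndpoint;
var listener = new TcpListener(serverEndpoint);
listener.Start();

// Client side: enumerate the server role's instances to discover their endpoints.
// Internal endpoints are not load balanced, so the client picks an instance itself.
foreach (RoleInstance instance in RoleEnvironment.Roles["Server"].Instances)
{
    IPEndPoint endpoint = instance.InstanceEndpoints["ServerEndpoint"].IPEndpoint;
    // ... connect to endpoint ...
}

// Elasticity: re-query Roles when instances are added or removed.
RoleEnvironment.Changed += (sender, e) =>
{
    if (e.Changes.OfType<RoleEnvironmentTopologyChange>().Any())
    {
        // Refresh the cached list of server endpoints here.
    }
};
```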
External Endpoints on Worker Roles
Worker roles can now contain external-facing endpoints, or InputEndpoints. You can bind to these endpoints either directly in the worker role or from within a process that you spawn from the worker role. Unlike the InternalEndpoints used by inter-role communication, InputEndpoints are load balanced.
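Declaring an InputEndpoint looks much like declaring an InternalEndpoint, with the addition of the public port the load balancer should expose. A minimal fragment, with placeholder names:

```xml
<!-- Illustrative fragment; "AppServer" and "TcpIn" are placeholders. -->
<WorkerRole name="AppServer">
  <Endpoints>
    <InputEndpoint name="TcpIn" protocol="tcp" port="80" />
  </Endpoints>
</WorkerRole>
```

Here port="80" is the port clients on the Internet connect to via the load balancer, not necessarily the port the instance itself listens on.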
A common application type enabled by this is a self-hosted Internet-exposed service, such as a custom application server.
Note that the port that actually gets assigned to each instance is different from the port exposed via the load balancer. The locally assigned port can be discovered via the InstanceEndpoints collection on the RoleEnvironment.CurrentRoleInstance property.
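In code, this means the worker role should bind to the endpoint it looks up at runtime rather than hard-coding the public port. A sketch, assuming an InputEndpoint named "TcpIn" (an illustrative name) in the service definition:

```csharp
using System.Net;
using System.Net.Sockets;
using Microsoft.WindowsAzure.ServiceRuntime;

// Bind to the locally assigned endpoint, not the public port from the
// service definition; the two generally differ.
IPEndPoint localEndpoint =
    RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["TcpIn"].IPEndpoint;
var listener = new TcpListener(localEndpoint);
listener.Start();
// Traffic arriving at the public port is load balanced across all
// instances of the role and delivered to this local endpoint.
```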