This post describes how to use the Service Bus brokered messaging features in a way to yield best performance. You can find more details in the full article on MSDN.
Use the Service Bus Client Protocol
Service Bus supports two protocols: the Service Bus client protocol and HTTP. The Service Bus client protocol is more efficient because it maintains the connection to the Service Bus service for as long as the messaging factory exists, and it implements batching and prefetching. The Service Bus client protocol is available to .NET applications that use the .NET managed API. Whenever possible, connect to Service Bus via the Service Bus client protocol.
Reuse Factories and Clients
Service Bus client objects, such as QueueClient or MessageSender, are created through a MessagingFactory, which also manages connections internally. When using the Service Bus client protocol, do not close messaging factories or queue, topic, and subscription clients after sending a message and then recreate them when sending the next message. Instead, reuse the factory and clients for multiple operations. Closing a messaging factory deletes the connection to Service Bus, and establishing a connection is an expensive operation.
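As a sketch (the queue name, tokenProvider, and namespaceUri are illustrative), create the factory and client once and reuse them for many operations:

MessagingFactory factory = MessagingFactory.Create(namespaceUri, tokenProvider);
QueueClient client = factory.CreateQueueClient("myQueue");
for (int i = 0; i < 100; i++)
{
    // Reuse the same client and its underlying connection for every send.
    client.Send(new BrokeredMessage("message " + i));
}
// Close the factory only when the application no longer sends or receives.
factory.Close();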
Use Concurrent Operations
Performing an operation (send, receive, delete, etc.) takes a certain amount of time. This time includes the processing of the operation by the Service Bus service as well as the latency of the request and the reply. To increase the number of operations per unit of time, operations should be executed concurrently. This is particularly true if the latency between the client and the datacenter that hosts the Service Bus namespace is large.
Executing multiple operations concurrently can be done in several different ways:
Asynchronous operations. The client pipelines operations by performing asynchronous operations. The next request is started before the previous request completes.
Multiple factories. All clients (senders as well as receivers) that are created by the same factory share one TCP connection. The maximum message throughput is limited by the number of operations that can go through this TCP connection. The throughput that can be obtained with a single factory varies greatly with TCP round-trip times and message size. To obtain higher throughput, create the clients from multiple messaging factories, each of which opens its own connection.
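The first approach can be sketched as follows, assuming a QueueClient named client created as above and a Service Bus SDK version that offers the Task-based SendAsync (older versions expose the equivalent BeginSend/EndSend pattern instead):

List<Task> pendingSends = new List<Task>();
for (int i = 0; i < 100; i++)
{
    // Start the next send before the previous one completes (pipelining).
    pendingSends.Add(client.SendAsync(new BrokeredMessage("message " + i)));
}
// Wait for all pipelined sends to finish.
await Task.WhenAll(pendingSends);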
Use Client-Side Batching
Client-side batching allows a queue/topic client to batch multiple send operations into a single request. It also allows a queue/subscription client to batch multiple Complete requests into a single request. By default, a client uses a batch interval of 20ms. You can change the batch interval by setting MessagingFactorySettings.NetMessagingTransportSettings.BatchFlushInterval before creating the messaging factory. This setting affects all clients that are created by this factory.
MessagingFactorySettings mfs = new MessagingFactorySettings();
mfs.TokenProvider = tokenProvider;
// Flush batched sends/completes every 50ms instead of the default 20ms.
mfs.NetMessagingTransportSettings.BatchFlushInterval = TimeSpan.FromSeconds(0.05);
MessagingFactory messagingFactory = MessagingFactory.Create(namespaceUri, mfs);
For low-throughput, low-latency scenarios, disable batching. To do so, set the batch flush interval to 0. For high-throughput scenarios, increase the batching interval to 50ms. If multiple senders are used, increase the batching interval to 100ms.
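For instance, disabling batching is the same construction with a zero interval:

MessagingFactorySettings mfs = new MessagingFactorySettings();
mfs.TokenProvider = tokenProvider;
// A zero flush interval turns client-side batching off entirely.
mfs.NetMessagingTransportSettings.BatchFlushInterval = TimeSpan.Zero;
MessagingFactory messagingFactory = MessagingFactory.Create(namespaceUri, mfs);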
Batching is only available for asynchronous Send and Complete operations. Synchronous operations are immediately sent to the Service Bus service. Batching does not occur for Peek or Receive operations, nor does batching occur across clients.
Use Batched Store Access
To increase the throughput of a queue/topic/subscription, the Service Bus service batches multiple messages when writing to its internal store. If enabled on a queue or topic, writing messages into the store will be batched. If enabled on a queue or subscription, deleting messages from the store will be batched. Batched store access only affects Send and Complete operations; receive operations are not affected.
When creating a new queue, topic or subscription, batched store access is enabled with a batch interval of 20ms. For low-throughput, low-latency scenarios, disable batched store access by setting QueueDescription.EnableBatchedOperations to false before creating the entity.
QueueDescription qd = new QueueDescription("myQueue");
// Disable batched store access for this queue.
qd.EnableBatchedOperations = false;
QueueDescription q = namespaceManager.CreateQueue(qd);
Use Prefetching
Prefetching causes the queue/subscription client to load additional messages from the service when performing a receive operation. The client stores these messages in a local cache. The QueueClient.PrefetchCount and SubscriptionClient.PrefetchCount values specify the number of messages that can be prefetched. Each client that enables prefetching maintains its own cache. A cache is not shared across clients.
Service Bus locks any prefetched messages so that prefetched messages cannot be received by a different receiver. If the receiver fails to complete the message before the lock expires, the message becomes available to other receivers. The prefetched copy of the message remains in the cache. The receiver will receive an exception when it tries to complete the expired cached copy of the message.
To prevent the consumption of expired messages, the cache size must be smaller than the number of messages that a client can consume within the lock timeout interval. When using the default lock expiration of 60 seconds, a good value for SubscriptionClient.PrefetchCount is 20 times the combined maximum processing rate of all receivers of the factory. If, for example, a factory creates 3 receivers and each receiver can process up to 10 messages per second, the prefetch count should not exceed 20*3*10 = 600.
By default, QueueClient.PrefetchCount is set to 0, which means that no additional messages are fetched from the service. Enable prefetching if receivers consume messages at a high rate. In low-latency scenarios, enable prefetching if a single client consumes messages from the queue or subscription. If multiple clients are used, set the prefetch count to 0. By doing so, the second client can receive the second message while the first client is still processing the first message.
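Following the sizing rule above, a sketch (queue name and rates illustrative) for a factory with 3 receivers, each processing up to 10 messages per second, under the default 60-second lock:

QueueClient receiver = factory.CreateQueueClient("myQueue");
// 20 * (3 receivers * 10 messages/sec) = 600, per the sizing rule above.
receiver.PrefetchCount = 600;
// This receive also fills the local prefetch cache in the background.
BrokeredMessage message = receiver.Receive();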
Read the full article on MSDN.