
Delivering Premium Audio Experiences with Dolby® Digital Plus

Discover how to use the Azure Media Encoder to encode your content with Dolby® Digital Plus multi-channel surround sound and deliver it across multiple platforms, including Smart TVs, Xbox, Windows 8 devices, mobile devices, and more.

With the proliferation of devices that consume media, there is an increasing need for video streaming services to offer superior audio quality and deliver premium content with 5.1 surround sound. You can use Azure Media Services to encode your HD content with Dolby® Digital Plus multi-channel surround sound and deliver it across multiple platforms, including Smart TVs, Xbox, Windows 8 devices, mobile devices, and more.


Dolby® Digital Plus, or Enhanced AC-3 (E-AC-3), is an advanced surround sound audio codec designed for high-quality audio. It is based on core Dolby Digital technologies, an established standard for cinema, broadcast, and home theater surround sound, and is supported in over 2.1 billion products. In this blog, I will describe how you can use Media Services to deliver a premium audio experience with your content using this codec.

Overview

The steps involved for encoding to Dolby® Digital Plus in this example are:

  1. Upload source content to your Azure Media Services account and create an Asset
  2. Build a custom encoding preset, and save it to a file
  3. Submit a Task to encode the above Asset using the custom preset
  4. Publish the output Asset, and create a SAS URL
  5. Demonstrate playback in an application

To get started, let me describe each of these steps and then provide the sample code that I used.

After running through the sample code, you will end up with a URL for a standard MP4 file with H.264 video and Dolby Digital Plus audio. You can then use this URL to play back the stream on a device that supports Dolby Digital Plus decoding. Windows 8 and Xbox both have a built-in Dolby Digital Plus decoder, but Apple devices currently do not support this codec natively. My recommendation would be to follow the guidance on how to use Windows 8 and the Player Framework to test surround sound decoding. If you want to hear the files in their full surround sound glory, you will need a PC that is hooked up to an AV Receiver capable of 5.1 playback.

Uploading the Source Content

To get started, you need to first upload some source content with HD video and multi-channel audio to your Media Service account. The following file formats are recommended:

  • MPEG-2 Transport streams with 5.1 audio encoded with AC-3 (also known as Dolby® Digital)
  • ISO MPEG-4 (MP4) files with 5.1 audio encoded with AAC
  • WMV files with 5.1 audio encoded with WMA Professional

See CreateAssetAndUploadSingleFile() for the sample code to upload the source file.
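Before uploading, you may want to sanity-check that your source uses one of the recommended containers. The following is a minimal sketch; the `SourceFileChecks` class and `HasRecommendedContainer` method are hypothetical names (not part of the Media Services SDK), and the check looks only at the file extension rather than inspecting the audio tracks:

```csharp
using System;
using System.IO;

static class SourceFileChecks
{
    // Hypothetical helper: returns true if the source file uses one of the
    // container formats recommended above for 5.1 surround sound content.
    public static bool HasRecommendedContainer(string path)
    {
        string extension = Path.GetExtension(path);
        return extension.Equals(".ts", StringComparison.OrdinalIgnoreCase)   // MPEG-2 Transport Stream with AC-3
            || extension.Equals(".mp4", StringComparison.OrdinalIgnoreCase)  // ISO MPEG-4 with AAC
            || extension.Equals(".wmv", StringComparison.OrdinalIgnoreCase); // WMV with WMA Professional
    }
}
```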

Build a Custom Preset

If this is your first time creating a custom preset, I would encourage you to review the details on how to create custom presets in my previous post on Advanced Encoding Features.

Next, I will describe a custom preset for transcoding your source into an MP4 file with 720p video encoded at 4.5 Mbps, and 5.1-channel Dolby Digital Plus audio at 512 kbps. This custom preset is based on the “H264 Broadband 720p” preset, with the audio section modified to use the Dolby Digital Plus settings.

Note: for details on how to adjust the audio encoding settings, such as using a bitrate lower than the 512 kbps in this example, see https://msdn.microsoft.com/en-us/library/dn296500.aspx

The completed XML custom preset should be saved to a local file for use in encoding; I used the name “Dolby Audio Preset.xml”. The preset follows the Windows Azure Media Encoder preset schema, with the audio profile replaced by a Dolby Digital Plus profile along the lines sketched below (see the MSDN link above for the full set of elements and attributes):

  <?xml version="1.0" encoding="utf-16"?>
  <Preset Version="4.0">
    <Job />
    <MediaFile>
      <OutputFormat>
        <MP4OutputFormat>
          <VideoProfile>
            <!-- H.264 video settings from the "H264 Broadband 720p" preset,
                 with the bitrate raised to 4500 kbps for the 720p output -->
          </VideoProfile>
          <AudioProfile>
            <DolbyDigitalPlusAudioProfile Condition="SourceContainsAudio">
              <Bitrate>
                <ConstantBitrate Bitrate="512" IsTwoPass="False" BufferWindow="00:00:00" />
              </Bitrate>
            </DolbyDigitalPlusAudioProfile>
          </AudioProfile>
        </MP4OutputFormat>
      </OutputFormat>
    </MediaFile>
  </Preset>

Transcode Your Source Content

You can transcode your source content using the Windows Azure Media Encoder, passing the contents of the custom preset as the configuration string to the Task. See the Transcode() method in my sample code below for the steps involved. The Task results in an output Asset containing an MP4 file with Dolby Digital Plus audio interleaved with H.264 encoded video.

Publish the Output Asset

Once the content has been transcoded, you can create a SAS locator for the output Asset. See CreateSASLocator() in my sample code below for the steps involved. The SAS URI can be handed off to your player application.

The Sample Code

Note that the code in this topic uses the Azure Media Services .NET SDK Extensions, a set of extension methods and helper functions that simplify your code and make it easier to develop with Media Services.

The App.Config file for the sample code looks as follows:

  <?xml version="1.0" encoding="utf-8"?>
  <configuration>
    <appSettings>
      <add key="MediaServicesAccountName" value="YOUR_ACCOUNT_NAME" />
      <add key="MediaServicesAccountKey" value="YOUR_ACCOUNT_KEY" />
      <add key="StorageConnectionString" value="YOUR_STORAGE_CONNECTION_STRING" />
    </appSettings>
  </configuration>

In the above App.Config, replace the placeholder values with your Media Services account name and key (and, if you use it, your storage connection string).

In addition, I used the following sample 5.1 surround sound file of the short film “Silent,” supplied by Dolby.
You can download the source MP4 file here to use in your own testing.


(©Dolby – Silent courtesy of Dolby )

The sample code used is as follows.

using System;
using System.Linq;
using System.Configuration;
using System.IO;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Collections.Generic;
using System.Diagnostics;
using System.Globalization;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.MediaServices.Client;

namespace DeliveringPremiumAudio
{
    /// <summary>
    /// Sample program that encodes a source Asset with Dolby Digital Plus audio and publishes the output
    /// </summary>
    class Program
    {
        // Read values from the App.config file.
        private static readonly string _mediaServicesAccountName =
            ConfigurationManager.AppSettings["MediaServicesAccountName"];
        private static readonly string _mediaServicesAccountKey =
            ConfigurationManager.AppSettings["MediaServicesAccountKey"];
        private static readonly string _storageConnectionString =
            ConfigurationManager.AppSettings["StorageConnectionString"];

        private static CloudMediaContext _context = null;
        private static MediaServicesCredentials _cachedCredentials = null;

        // Pointers to sample file, and the saved custom preset XML
        private static readonly string _sampleFile = @"C:\temp\sintel.wmv";
        private static readonly string _customPreset = @"C:\temp\Dolby Audio Preset.xml";

        static void Main(string[] args)
        {
            try
            {
                // Create and cache the Media Services credentials in a static class variable
                _cachedCredentials = new MediaServicesCredentials(_mediaServicesAccountName, _mediaServicesAccountKey);

                // Use the cached credentials to create the CloudMediaContext
                _context = new CloudMediaContext(_cachedCredentials);

                // Step 1. Upload the sample content and create an Asset
                IAsset inputAsset = CreateAssetAndUploadSingleFile(AssetCreationOptions.None, _sampleFile);

                // Step 2. Load the custom preset into a configuration string
                string configuration = File.ReadAllText(_customPreset);

                // Step 3. Transcode the input
                IAsset outputAsset = Transcode(inputAsset, configuration);

                // Step 4. Create a SAS locator for the output asset and print to console
                if (null != outputAsset) CreateSASLocator(outputAsset);

                // The above method creates a locator valid for 30 days
                // You should consider deleting the locator after the tests are complete
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }

        /// <summary>
        /// This function creates an empty Asset
        /// </summary>
        static private IAsset CreateEmptyAsset(string assetName, AssetCreationOptions assetCreationOptions)
        {
            var asset = _context.Assets.Create(assetName, assetCreationOptions);
            Console.WriteLine("Asset name: " + asset.Name);
            Console.WriteLine("Time created: " + asset.Created.Date.ToString());

            return asset;
        }

        /// <summary>
        /// This function creates an Asset and uploads the input file to it
        /// </summary>
        static public IAsset CreateAssetAndUploadSingleFile(AssetCreationOptions assetCreationOptions, string singleFilePath)
        {
            var fileName = Path.GetFileName(singleFilePath);
            // Create a unique asset name
            var assetName = fileName + DateTime.UtcNow.ToString();
            var asset = CreateEmptyAsset(assetName, assetCreationOptions);

            var assetFile = asset.AssetFiles.Create(fileName);
            Console.WriteLine("Created assetFile {0}", assetFile.Name);

            // In order to upload the file, we need a locator with the appropriate access policy
            var accessPolicy = _context.AccessPolicies.Create(assetName, TimeSpan.FromDays(30),
                                                                AccessPermissions.Write | AccessPermissions.List);
            var locator = _context.Locators.CreateLocator(LocatorType.Sas, asset, accessPolicy);

            assetFile.Upload(singleFilePath);
            Console.WriteLine("Done uploading {0}", assetFile.Name);
            Console.WriteLine("");
            long size = assetFile.ContentFileSize;

            locator.Delete();
            accessPolicy.Delete();

            return asset;
        }

        /// <summary>
        /// This function transcodes the source Asset using the preset provided, and returns the output Asset
        /// </summary>
        static public IAsset Transcode(IAsset sourceAsset, string preset)
        {
            // Declare a new job.
            IJob job = _context.Jobs.Create("Transcoding Job for " + sourceAsset.Name);
            // Get a reference to Windows Azure Media Encoder, and pass to it the name of the 
            // processor to use for the specific task.
            IMediaProcessor processor = GetLatestMediaProcessorByName("Windows Azure Media Encoder");

            // Create a task with the encoding details, using a string preset.
            ITask task = job.Tasks.AddNew("Transcoding Task for " + sourceAsset.Name,
                processor,
                preset,
                Microsoft.WindowsAzure.MediaServices.Client.TaskOptions.None);

            // Specify the source asset to be encoded.
            task.InputAssets.Add(sourceAsset);
            // Add an output asset to contain the results of the job. 
            // This output is specified as AssetCreationOptions.None, which 
            // means the output asset is not encrypted. 
            task.OutputAssets.AddNew("Output asset", AssetCreationOptions.None);

            // Use the following event handler to check job progress.  
            job.StateChanged += new
                    EventHandler<JobStateChangedEventArgs>(StateChanged);

            // Launch the job.
            job.Submit();

            // Check job execution and wait for job to finish. 
            Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
            progressJobTask.Wait();

            // Get an updated job reference.
            job = GetJob(job.Id);

            // If job state is Error the event handling 
            // method for job progress should log errors.  Here we check 
            // for error state and exit if needed.
            if (job.State == JobState.Error)
            {
                Console.WriteLine("Transcode() failed, exiting...");
                return null;
            }

            // Get a reference to the output asset from the job.
            IAsset outAsset = job.OutputMediaAssets[0];

            return outAsset;
        }

        /// <summary>
        /// This function returns a reference to the latest version of the specified media processor
        /// </summary>
        private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
        {
            var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
                ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();

            if (processor == null)
                throw new ArgumentException(string.Format("Unknown media processor: {0}", mediaProcessorName));

            return processor;
        }
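
        /// <summary>
        /// This function returns an updated reference to the Job with the given Id.
        /// Note: GetJob() is called above but was missing from this listing; this
        /// version follows the standard Media Services sample pattern.
        /// </summary>
        private static IJob GetJob(string jobId)
        {
            // Use a LINQ query to fetch the job with the matching Id.
            var jobQuery = from j in _context.Jobs
                           where j.Id == jobId
                           select j;

            return jobQuery.FirstOrDefault();
        }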

        /// <summary>
        /// A helper method to handle events
        /// </summary>
        private static void StateChanged(object sender, JobStateChangedEventArgs e)
        {
            Console.WriteLine("Job state changed event:");
            Console.WriteLine("  Previous state: " + e.PreviousState);
            Console.WriteLine("  Current state: " + e.CurrentState);

            switch (e.CurrentState)
            {
                case JobState.Finished:
                    Console.WriteLine();
                    Console.WriteLine("********************");
                    Console.WriteLine("Job is finished.");
                    Console.WriteLine("Please wait while local tasks or downloads complete...");
                    Console.WriteLine("********************");
                    Console.WriteLine();
                    Console.WriteLine();
                    break;
                case JobState.Canceling:
                case JobState.Queued:
                case JobState.Scheduled:
                case JobState.Processing:
                    Console.WriteLine("Please wait...\n");
                    break;
                case JobState.Canceled:
                case JobState.Error:
                    // Cast sender as a job.
                    IJob job = (IJob)sender;
                    // Display or log error details as needed.
                    LogJobStop(job.Id);
                    break;
                default:
                    break;
            }
        }

        /// <summary>
        /// A helper method to log information about a failed Job
        /// </summary>
        private static void LogJobStop(string jobId)
        {
            StringBuilder builder = new StringBuilder();
            IJob job = GetJob(jobId);

            builder.AppendLine("\nThe job stopped due to cancellation or an error.");
            builder.AppendLine("***************************");
            builder.AppendLine("Job ID: " + job.Id);
            builder.AppendLine("Job Name: " + job.Name);
            builder.AppendLine("Job State: " + job.State.ToString());
            builder.AppendLine("Job started (server UTC time): " + job.StartTime.ToString());
            // Log job errors if they exist.  
            if (job.State == JobState.Error)
            {
                builder.Append("Error Details: \n");
                foreach (ITask task in job.Tasks)
                {
                    foreach (ErrorDetail detail in task.ErrorDetails)
                    {
                        builder.AppendLine("  Task Id: " + task.Id);
                        builder.AppendLine("    Error Code: " + detail.Code);
                        builder.AppendLine("    Error Message: " + detail.Message + "\n");
                    }
                }
            }
            builder.AppendLine("***************************\n");
            Console.Write(builder.ToString());
        }

        /// <summary>
        /// This function creates a SAS locator for the given Asset
        /// </summary>
        private static void CreateSASLocator(IAsset asset)
        {
            Console.WriteLine("Publishing asset " + asset.Name);
            // Publish the output asset by creating an Origin locator.  
            // Define the Read only access policy and
            // specify that the asset can be accessed for 30 days.  
            _context.Locators.Create(
                LocatorType.Sas,
                asset,
                AccessPermissions.Read,
                TimeSpan.FromDays(30));

            // Generate a SAS Locator for the MP4 file.
            var mp4AssetFile = asset.AssetFiles.ToList().Where(f => f.Name.EndsWith(".mp4", StringComparison.OrdinalIgnoreCase)).FirstOrDefault();
            Uri mp4Uri = mp4AssetFile.GetSasUri();
            Console.WriteLine("Output is now available for progressive download: ");
            Console.WriteLine(mp4Uri.OriginalString);

            return;
        }
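
        /// <summary>
        /// Optional cleanup sketch: the Main method above notes that the 30-day
        /// SAS locator should be deleted once testing is complete. This
        /// hypothetical helper removes all locators on an asset.
        /// </summary>
        private static void DeleteLocators(IAsset asset)
        {
            // Snapshot the collection before deleting from it.
            foreach (var locator in asset.Locators.ToList())
            {
                locator.Delete();
            }
        }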
    }
}

Playback Demonstration

A simple way to demonstrate playback is to launch the Windows Media Player application on Windows 8.1, go to File > Open URL, and enter the SAS locator for your encoded Asset.
If you have your PC hooked up to an AV Receiver capable of 5.1 playback, you will be able to hear the output in full 5.1 surround.

You can download the output of my encoding job sample here.

Considerations

Encoding content with stereo audio

If your input asset has stereo audio, then Azure Media Encoder will insert silence into the surround channels – the output Asset will still have 5.1 audio. Note that the insertion of silence is recommended only when delivering the output content as Smooth Streaming.

Alternatively, you can modify the audio profile element in the preset to encode to a stereo output, using the settings documented in the XML in Encoding to Dolby Digital Plus Stereo.

Streaming Dolby Digital Plus audio via Smooth Streaming

You can deliver Dolby Digital Plus audio to Modern applications on Windows 8.1, or to Xbox One, via Smooth Streaming. To do this, you will need to modify the sample code to produce a Smooth Streaming output instead of an MP4.

You will then need to build a Windows 8 Modern application using the Smooth Streaming Client SDK for Windows 8. For details on building an application with the Smooth Streaming Client SDK to play back your Dolby content, check out the article “How to Build a Smooth Streaming Windows Store Application”.

Dolby decoders are not available on all platforms (like Apple iOS) today, but there are many other client frameworks on the market that support decoding of Dolby on devices like set top boxes and Smart TVs. If you are trying to reach other devices, you will need to check with the manufacturer on what decoders are supported.

Enabling Dolby Professional Loudness Metering

In the past, broadcasters working with multichannel audio had problems with soundtracks whose average levels fell above or below those of other programming. You may have experienced this yourself when a program switched to a really loud commercial during a break. Issues also arose when surround sound content was played back on television sets with stereo or mono audio output. As discussed in this report from Dolby, a common problem when working with multichannel audio is maintaining consistent loudness across different programs. To address this problem, the recommended practice is to specify a dialog level parameter (also known as Dialog Normalization, or DialNorm) within the Dolby Digital Plus stream. This value sets the volume of the audio to a preset level, which aids the decoder in level matching from one program to another and eliminates jarring volume changes.

The preset provided above assumes a default Dialog Normalization value of -31 dB for your source content. See the section titled “Using Dolby Professional Loudness Metering (DPLM) Support” to learn how to measure the actual loudness of the dialog in your source content and set the correct value for Dialog Normalization.

To learn more about Dolby Digital Plus technologies, have a look at the details provided by our partner on this Dolby page.