When we build a professional-grade online video streaming solution on Azure Media Services
(AMS), there are quite a few different flows of data. Let's take live streaming as an example:
- The first and the most important is, of course, the video flow: from capture, long-haul feed, encoding, ingest into AMS, archiving, dynamic packaging/protection, then to CDN, and finally arriving at client players. Putting pretty pictures on screen is only the first part, and may not be the hardest part: showing pretty pictures on screen is "easy"; what is "hard" is how to "not show pretty pictures" (DRM protection), and how to let users watch things other than pretty pictures (dynamically inserted ads, CC, metadata such as game scores and athlete info, emergency announcements, weather delay messages, etc.)
- The second is the ad flow: from ad markers being inserted, to ad markers being dynamically packaged into different SCTE formats based on the streaming format, to the player parsing ad markers, followed by ad decisioning, ad delivery and finally ad playback;
- CC flow: textual content is generated, added into the contribution feed, and flows through the whole path alongside the video content;
- Metadata flow: metadata is first generated in various subsystems, then added to content management storage, flowing to the web portal as well as to client players for display or other purposes such as failover. For example, after publishing an asset in AMS and generating its streaming URL, the URL may need to flow in the following directions, all in an automated and integrated fashion:
- the URL needs to flow into the CMS database which feeds the customer's video portal for end-user browsing, searching and playback;
- the URL needs to flow to the customer's ad provider as one piece of the metadata;
- the URL needs to flow into the endpoints (such as a WebAPI service) supplying metadata to video player clients, for example primary and secondary URLs for the player to fail over to in case of an origin issue.
- The content protection flow: from generating the content key and setting up dynamic content protection, to issuing the authentication token, then delivering the DRM license or decryption key, and finally decrypting content on the client player;
- Analytics flow: from client players to data aggregation in the cloud, then to decision making that controls other flows, or to the front end for display;
- We have not included account provisioning, billing, the end-user portal, user authentication, authorization, EPG (Electronic Programming Guide), and EMS, which are more important in OTT scenarios.
All of these data “flows” have to move in the right direction (from one subsystem to another), at the right speed (based on audience size/demand, network bandwidth and client CPU), and arrive at their destinations at the right moment (down to second or even frame accuracy). Therefore, we need a “controller” to direct these “flows” and make them all work together. We often loosely call this “controller” a CMS.
How can we build such a CMS faster and better? Is there anything we can leverage?
When building a video solution on Azure Media Services, we may have the following scenarios:
- A customer may choose to build a CMS tool integrated with their existing systems for automated workflows.
- A system integrator (SI) who helps customers build video solutions based on Azure Media Services may choose to build a packaged SDK or software framework so that such projects do not have to start from zero. This approach speeds up time to market, makes project delivery more repeatable, and makes the SI more competitive in the market.
In these situations, what customers/SIs need may not be a stand-alone tool for humans, but rather a faster way to build a fully integrated solution with automated workflows. The author would like to share a toolset which has been used in such solutions; its code is structured by scenario. The value of this toolset is in making the source code available: it can be used as sample code, used directly in a solution, or leveraged for the design and planning of such a solution.
The source code has been approved by Microsoft LCA for OSS release under the MIT license and can be found on GitHub.
This is a comprehensive toolset covering all key areas of Azure Media Services:
- Live-to-VOD management and live archive diagnostics
- PlayReady protection (both live and VOD)
- AES-128 Encryption (both live and VOD)
- Ad insertion (for Aventus live encoder)
The functionality included in this tool is based on real-world solutions and use cases. This tool has been “battle tested”:
- This tool is used to test scenarios in the NBC Sports solution as part of the release sign-off testing process of Azure Media Services.
- This tool was used for NBC Sports event live streaming before the iStreamPlanet CMS tool was ready.
- This tool was used extensively in supporting Sochi Olympics streaming at both NBC Sports and Deltatre (for its diagnostics functionality).
- This tool was used to operate a 24x7 live streaming engineering environment used by 15 vendors across North America, Europe and Asia throughout the engineering process of the NBC Sports/Sochi Olympics project.
- This tool was used for all the VOD transcode workflow performance testing, on both Azure Media Processors and the Kayak Media Processor, for the VOD subsystem in the NBC Sports/Sochi Olympics project.
- The majority of the source code in this tool has been shared with iStreamPlanet for developing their CMS tool for operating the NBC Sports/Sochi Olympics solution.
- Part of the source code in this tool has been shared with Deltatre for some critical features during the Sochi Olympics project.
- This tool was also used to create the end-to-end PlayReady protection prototype and the end-to-end AES encryption prototype.
To highlight the scope and capability of this tool, I’m listing a subset of its features, all of which are used in real-world solutions such as the NBC Sports/Sochi Olympics solution, a line-of-business (LOB) solution used by NBC Sports for their daily sports streaming.
ACS Redundancy
It is well known that all AMS API calls, whether made through REST or the .NET API, require a valid ACS bearer token, which is obtained from AMS ACS. Naturally, you would not want to rely on a single ACS instance during an important live event: if you cannot get an ACS token during a live event, you lose access to everything in your media service, which is disastrous. This tool provides an implementation for automatic failover among multiple AMS ACS instances, with “stickiness” to a “preferred” ACS. Specifically, the ACS redundancy feature in this tool meets the following live production requirements, which proved good enough for running 2014 Sochi Olympics live streaming:
| Requirement | Description |
| --- | --- |
| Automatic ACS failover | Need to be able to automatically switch to another ACS instance when the current ACS fails. |
| ACS token reuse | Need to use the new feature in Media Services SDK 3.0: if an ACS token has not expired, reuse it when creating a CloudMediaContext. |
| ACS token auto-renewal | If an ACS token has expired, renew it before using it to create a CloudMediaContext. |
| Support multiple context instances | An application needs to be able to interact with multiple instances of CloudMediaContext, due to the need to interact with multiple media services. |
| Future-proof for SDK change | In a future AMS SDK release, the ACS token may no longer be renewed in the CloudMediaContext constructor, so we need to perform the ACS token expiration check and renewal in our own code (manual ACS token renewal). |
| Preferred ACS stickiness | Needs to use the “preferred” ACS instance as long as it is available, instead of waiting for a failover back to it. |
| Support of ACS instances | Needs to support all AMS ACS instances. |
| Support of single ACS | By setting the SwitchACSEnabled and PreferredACSEnabled settings to false, a single ACS instance can be used. In some environments, only a single AMS ACS instance is available. |
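The requirements above boil down to a small piece of token-management logic. The following is an illustrative Python sketch of that logic (the real tool is a C# implementation against the AMS SDK); the class, the `fetch_token` callable and the setting names mirroring `SwitchACSEnabled`/`PreferredACSEnabled` are all hypothetical stand-ins, not actual SDK APIs.

```python
import time

class AcsTokenManager:
    """Sketch of the ACS redundancy logic described above: prefer one ACS
    instance, fail over automatically on error, reuse unexpired tokens,
    and renew expired ones. Names are illustrative, not the AMS SDK."""

    def __init__(self, acs_instances, fetch_token, preferred_index=0,
                 switch_enabled=True, preferred_enabled=True):
        self.acs_instances = acs_instances    # list of ACS endpoint URLs
        self.fetch_token = fetch_token        # callable(url) -> (token, expires_at)
        self.preferred_index = preferred_index
        self.switch_enabled = switch_enabled      # ~ SwitchACSEnabled
        self.preferred_enabled = preferred_enabled  # ~ PreferredACSEnabled
        self.current_index = preferred_index
        self._token = None
        self._expires_at = 0.0

    def get_token(self, now=None):
        now = time.time() if now is None else now
        # Token reuse: an unexpired cached token is returned as-is.
        if self._token is not None and now < self._expires_at:
            return self._token
        # Stickiness: when enabled, always try the preferred ACS first
        # instead of waiting to fail back over to it.
        order = [self.current_index]
        if self.preferred_enabled and self.preferred_index != self.current_index:
            order.insert(0, self.preferred_index)
        if self.switch_enabled:
            order += [i for i in range(len(self.acs_instances)) if i not in order]
        last_error = None
        for i in order:
            try:
                token, expires_at = self.fetch_token(self.acs_instances[i])
            except Exception as exc:  # automatic failover on any fetch error
                last_error = exc
                continue
            self.current_index = i
            self._token, self._expires_at = token, expires_at
            return token
        raise RuntimeError("all ACS instances failed") from last_error
```

With both flags set to false, only the single current instance is ever tried, matching the single-ACS requirement in the table.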
Converting Azure Blob to AMS Asset
Often you have a physical video file as a blob in Azure Storage and need to convert it to an AMS IAsset. This tool provides the conversion in the following scenarios:
- The blob is in the storage account associated with the media service you are working with;
- The blob is in a storage account NOT associated with the media service you are working with.
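The branch between the two scenarios can be sketched as follows. This is a hedged Python illustration of the decision logic only; `copy_blob` and `register_as_asset` are hypothetical stand-ins for the actual storage-copy and AMS SDK calls the tool performs.

```python
def blob_to_asset(blob, media_service, copy_blob, register_as_asset):
    """Sketch of the two blob-to-IAsset conversion scenarios described above.

    blob.storage_account / media_service.storage_account are assumed
    attributes; copy_blob and register_as_asset are injected stand-ins
    for the real Azure Storage and AMS SDK operations.
    """
    if blob.storage_account == media_service.storage_account:
        # Scenario 1: the blob already lives in the storage account attached
        # to the media service, so it can be registered as an asset directly.
        return register_as_asset(media_service, blob)
    # Scenario 2: the blob is in an unattached storage account; copy it into
    # the media service's storage first, then register the copy.
    local_blob = copy_blob(blob, media_service.storage_account)
    return register_as_asset(media_service, local_blob)
```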
Creating Consistent URL across Data Centers for CDN
The URL for a live program in AMS has the following format: https://[origin_name]-[media_service_name].streaming.mediaservices.windows.net/[locator_id]/[manifest_file_name].ism
Normally, a locator ID is a generated GUID. If we use two data centers for redundancy, then for the same live channel and program we need the two URLs to differ only in the host name part and be identical in the virtual path. This requires that when we create the two corresponding programs in the two separate data centers, the locator IDs (and hence the locator paths) are exactly the same, and the manifest file names are exactly the same. This tool provides this capability when creating a program for live streaming.
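The effect of sharing the locator ID can be shown with a small Python sketch that just assembles the URL format quoted above. The origin and media service names used are hypothetical placeholders.

```python
from urllib.parse import urlparse

def streaming_url(origin_name, media_service_name, locator_id, manifest_file_name):
    # URL format from the text:
    # https://[origin]-[service].streaming.mediaservices.windows.net/[locator_id]/[manifest].ism
    return ("https://{0}-{1}.streaming.mediaservices.windows.net/{2}/{3}.ism"
            .format(origin_name, media_service_name, locator_id, manifest_file_name))

# With the same locator ID (normally a GUID) used when creating the program
# in both data centers, only the host name differs between the two URLs:
shared_locator = "9f1c2b7e-0000-0000-0000-000000000000"  # illustrative GUID
url_east = streaming_url("origin1", "svc-east", shared_locator, "manifest")
url_west = streaming_url("origin1", "svc-west", shared_locator, "manifest")
```

A player (or a failover layer) can then switch hosts without touching the virtual path.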
Creating a Program in Specified Storage
Azure Media Services allows multiple Azure storage accounts to be attached to a single media service. For better reliability, a media service may use multiple underlying Azure blob storage accounts for storing live archives or on-demand assets. The tool allows one to specify which storage account to use when creating a program. This way, multiple concurrent channels can distribute their archiving load across multiple blob storage accounts.
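One simple way to spread that load, sketched below in Python, is to cycle through the attached storage accounts as channels are created. This is an illustrative assignment policy, not the tool's actual implementation; the account names are hypothetical.

```python
import itertools

def storage_assigner(storage_accounts):
    """Round-robin assignment of storage accounts to live channels, so that
    concurrent channels archive to different underlying accounts.
    Illustrative sketch only; the chosen account name would be the one
    passed when creating the program for the channel."""
    cycle = itertools.cycle(storage_accounts)
    def assign(channel_name):
        account = next(cycle)
        return (channel_name, account)
    return assign
```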
Live Archive Diagnostics
Besides providing normal operations, this tool also provides diagnostics of live streams and live archives. It can quickly discover live archiving issues during and after a live stream. For example, the tool can diagnose and report the following issues:
- “Jagged start” in a live archive, and which quality levels are missing their first fragments. Live archives often have a “jagged start”; according to the WAMS team, this is not really a bug but a consequence of fragments arriving at different times.
- Network delay between Aventus feeds and Azure ingest: delays and bursts from the Aventus buffer. Normally, with a 6-second GOP, the time difference between consecutive fragblob writes should be in the neighborhood of 6 seconds. In the tool’s report, the second column shows the actual time difference between two consecutive fragblobs minus 6 seconds.
- Missing fragblobs in a live archive: the start time difference between two consecutive fragblobs should be around 6 seconds; if it is much bigger than that, fragblobs are missing.
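The last two checks reduce to simple arithmetic over consecutive fragblob timestamps, sketched below in Python. The function name and report shape are illustrative, not the tool's actual output format; only the 6-second GOP assumption comes from the text.

```python
def diagnose_fragblobs(start_times, gop_seconds=6.0):
    """Sketch of the fragblob diagnostics described above.

    For consecutive fragblob start times (in seconds), the difference should
    be about one GOP (6 s here). The deviation (difference minus GOP length)
    reflects network delay/burstiness; a difference of several GOPs implies
    missing fragblobs.
    """
    report = []
    for prev, cur in zip(start_times, start_times[1:]):
        diff = cur - prev
        report.append({
            "start": cur,
            "deviation": diff - gop_seconds,            # ~ the "second column"
            "missing": max(0, round(diff / gop_seconds) - 1),
        })
    return report
```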
Content Protection
- Dynamic PlayReady protection of on-demand streaming;
- Dynamic PlayReady protection of live streaming;
- Dynamic AES encryption for on-demand streaming of assets without storage encryption;
- Dynamic AES encryption for on-demand streaming of assets with storage encryption.
Content protection involves multiple subsystems in a solution: dynamic protection/encryption, license/key delivery, an STS for authenticating a player client and issuing an authorization token, the origin, and the player. Please see the following two blogs, published on the Azure Media Services blog, for end-to-end content protection design and implementation:
- An End-to-End Prototype of AES Encryption with ACS Authentication and ACS Token Authorization
- An End-to-End Prototype of PlayReady Protection with ACS Authentication and ACS Token Authorization
Note that the two end-to-end prototypes/designs have since been upgraded to support JWT token authentication in addition to SWT.
This tool does have limitations. The main one is its lack of a user-friendly UI. The tool consists of Windows console Visual Studio projects (one for live, one for on-demand, and one for ad marker insertion), with a custom color console class for color-coded output. I personally found the most convenient way to use the tool is to “use Visual Studio as the user interface” (change input parameters, compile and run). Of course, this requires the user to understand basic C# syntax. The ideal UI would be a web UI backed by the following:
- Authentication provided by Azure Active Directory;
- Multi-tenancy, so that different customers can use it against their own media services;
- Hosting as SaaS in Azure, so that users do not need to install or set up anything.
I’m sure that day is not far off.
Special acknowledgment goes to many members of Azure Media Services Team for their help and insights over the years.