
Customized Renditions and Asset trimming with Dynamic Manifest

Published May 28, 2015

Senior Program Manager, Azure Media Services
As a follow-up to Jason’s announcement of our dynamic manifest capabilities last month during NAB, I am pleased to announce that they are now available for you to begin using in your streaming applications. Jason’s post covered many of the common usage scenarios, so I encourage you to start there if this is the first time you’re reading about dynamic manifests; here I will focus primarily on technical implementation details. Before going into the details of how it works, I want to provide some background on the following terminology.
  • Adaptive Bit Rate over HTTP (ABR). Adaptive bit rate streaming is a method of video streaming over HTTP in which the source content is encoded at multiple bit rates and chunked into small multi-second parts. A manifest file makes the streaming client aware of the available streams at differing bit rates and of the chunks within each stream. To meet subscriber demand for multi-screen viewing and deliver video to any device across any network, operators use ABR protocols in their multi-screen implementations, such as Apple HLS, Microsoft Smooth Streaming, Adobe HDS, and MPEG-DASH.
  • The manifest, sometimes referred to as “the playlist”. This file includes streaming metadata such as qualities, audio languages, and the presentation window, and also tells the player which fragments are currently available, which are playable next, and where to find them. It is a text-based or XML file with format-specific attributes. Using “asset filters”, which I’ll describe below, you can create a dynamic manifest based on the following properties and attributes: track type (audio, video, or text), track name, start and end time, presentation window (a sliding window of fixed duration, anchored from the live position), bitrate (qualities), track language, and FourCC.
  • Fragments/segments, actual chunks of video, audio or text content.

What is a filter?

An asset filter is a combination of rules, based on manifest attributes and the timeline, that describes to the streaming endpoint (backend service) how to manipulate the output playlist (manifest).

Filter Types

There are two types of asset filters: global and local (asset level). Both filter types have exactly the same features and use the same attributes to define filters. The main difference between the two is the scenarios they are suitable for and where they are stored.
  • Global filters are generally suitable for device profiles (track filtering), whereas local filters are mostly preferred for range filters (trimming), since trimming is usually specific to an asset.
  • Global filters are stored at the account level and, once created, can be applied to any asset under the account. Local filters are per asset and can only be used with the asset under which they were created. A local filter’s lifetime is tied to the asset’s lifetime: when the asset is deleted, all local filters created on it are deleted as well.

How does it work?

Filters can consist of multiple rules. For example, over a live playlist (manifest) you can remove a slate, apply a back-off, and also remove qualities, all in one filter. When multiple filtering rules are combined, the end result is the composition (intersection only) of the rules.
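To illustrate the intersection semantics, here is a minimal, hypothetical Python model (not the actual streaming endpoint code): each rule is applied in turn, and only tracks that satisfy every rule survive.

```python
def apply_rules(tracks, rules):
    """Keep only tracks that satisfy every rule (intersection semantics)."""
    result = tracks
    for rule in rules:
        result = [t for t in result if rule(t)]
    return result

# Hypothetical track list for illustration.
tracks = [
    {"type": "video", "bitrate": 1_000_000},
    {"type": "video", "bitrate": 3_000_000},
    {"type": "audio", "bitrate": 128_000},
]
rules = [
    lambda t: t["type"] != "text",        # drop text tracks
    lambda t: t["bitrate"] <= 2_427_000,  # cap quality
]
filtered = apply_rules(tracks, rules)  # the 3 Mbps video is filtered out
```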

Creating Filters through Our APIs

The asset filter API is available in Azure Media Services REST version 2.11 and later. This release includes only REST APIs; a corresponding .NET SDK is underway and targeted for the following weeks. You can find more information about the Azure Media Services REST API here. Asset filters consist of two sections (groups): PresentationTimeRange and Tracks. PresentationTimeRange:
  • StartTimestamp: Media that starts after this timestamp is included in the playlist (manifest). Applies to: Live and VOD. Constraints: absolute time; the value is rounded to the closest next GOP start.
  • EndTimestamp: Media that ends before this timestamp is included in the playlist (manifest). Applies to: VOD; for a live presentation it is silently ignored and takes effect when the presentation ends (Live-to-VOD). Constraints: absolute time; the value is rounded to the closest next GOP start.
  • PresentationWindowDuration: Defines a sliding window at the live edge or end of the presentation; media within this sliding window is included in the playlist (manifest). Applies to: Live, but also to VOD to enable smooth transitions when the presentation ends. Constraints: the minimum presentation window duration is 120 seconds.
  • LiveBackoffDuration: Applies a live presentation backoff, or delay, to the media. Applies to: Live only; silently ignored for VOD to enable smooth transitions when the presentation ends. Constraints: the maximum live backoff duration is 60 seconds.
  • Timescale: The timescale used by the timestamps and durations specified above. Applies to: Live and VOD. Constraints: default is 10000000 (HNS, 100-nanosecond units).
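Since all timestamps and durations are expressed in Timescale units (100-nanosecond HNS units by default), a small conversion helper, sketched here in Python purely for illustration, makes the sample values later in this post easier to read.

```python
HNS_PER_SECOND = 10_000_000  # the default Timescale: 100-nanosecond units

def seconds_to_hns(seconds):
    """Convert seconds to 100-nanosecond (HNS) units, the default Timescale."""
    return int(seconds * HNS_PER_SECOND)

def hns_to_seconds(hns):
    """Convert 100-nanosecond (HNS) units back to seconds."""
    return hns / HNS_PER_SECOND

# e.g. a 5-minute presentation window and a 30-second live backoff:
window = seconds_to_hns(300)   # 3000000000
backoff = seconds_to_hns(30)   # 300000000
```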
Tracks:
  • Type: Type of the track. Applies to: Live and VOD. Constraints: values can be video, audio, or text.
  • Name: Name of the track. Applies to: Live and VOD.
  • Language: Track language. Applies to: Live and VOD. Constraints: format specified in RFC 5646.
  • FourCC: Track FourCC value. Applies to: Live and VOD. Constraints: first element of the codecs format specified in RFC 6381.
  • Bitrate: Bitrate of the track. Applies to: Live and VOD. Constraints: can be a range value or an exact value.
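Because a Bitrate condition value can be either exact or a range (as in the “0-2427000” example later in this post), matching logic along these lines, sketched here as hypothetical Python, can be useful when building or validating filters client-side.

```python
def bitrate_matches(value_spec, bitrate):
    """True if a track bitrate satisfies a Bitrate condition value,
    which may be exact (e.g. "128000") or a range (e.g. "0-2427000")."""
    if "-" in value_spec:
        low, high = (int(part) for part in value_spec.split("-"))
        return low <= bitrate <= high
    return bitrate == int(value_spec)
```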
 
Note: The Azure Media Services dynamic packager can source from multi-bitrate MP4 and Smooth Streaming files. Dynamic manifests (asset filters) use Smooth Streaming server and client manifest data for filtering. Please also note that you need at least one streaming unit to use this feature.
Sample Global filter REST request:
<?xml version="1.0" encoding="utf-8"?>
<entry xmlns="http://www.w3.org/2005/Atom" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns:georss="http://www.georss.org/georss" xmlns:gml="http://www.opengis.net/gml">
<id /><title />
<updated>2015-04-03T20:50:34Z</updated>
<author><name /></author>
<content type="application/xml">

<m:properties><d:Name>MyGlobalFilter</d:Name>

<d:PresentationTimeRange>
<d:EndTimestamp m:type="Edm.Int64">9223372036854775807</d:EndTimestamp>
<d:LiveBackoffDuration m:type="Edm.Int64">0</d:LiveBackoffDuration>
<d:PresentationWindowDuration m:type="Edm.Int64">800000000</d:PresentationWindowDuration>
<d:StartTimestamp m:type="Edm.Int64">0</d:StartTimestamp>
<d:Timescale m:type="Edm.Int64">10000000</d:Timescale>
</d:PresentationTimeRange>

<d:Tracks /></m:properties>

</content>
</entry>
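To give a sense of how a body like this might be submitted, here is a Python sketch that builds (but does not send) the POST request. The API base URL, the /Filters path, the bearer token, and the header set are assumptions for illustration only; consult the REST documentation for your account’s actual endpoint and required headers.

```python
import urllib.request

# Hypothetical placeholders, not verified values:
API_BASE = "https://media.windows.net/api"  # your Media Services REST endpoint
ACCESS_TOKEN = "<bearer-token>"             # a valid OAuth access token

# The Atom entry body from the sample above would go here.
body = b'<?xml version="1.0" encoding="utf-8"?><entry>...</entry>'

request = urllib.request.Request(
    API_BASE + "/Filters",  # assumed path for account-level (global) filters
    data=body,
    method="POST",
    headers={
        "Content-Type": "application/atom+xml",
        "x-ms-version": "2.11",  # asset filters require REST version >= 2.11
        "Authorization": "Bearer " + ACCESS_TOKEN,
    },
)
# urllib.request.urlopen(request) would send it; omitted in this sketch.
```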
Sample Local (Asset Level) filter REST request:
<?xml version="1.0" encoding="utf-8"?>
<entry xmlns="http://www.w3.org/2005/Atom" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns:georss="http://www.georss.org/georss" xmlns:gml="http://www.opengis.net/gml">
<id /><title />
<updated>2015-04-03T20:53:34Z</updated>
<author><name /></author>
<content type="application/xml">
<m:properties><d:Id></d:Id>

<d:Name>MyLocalFilter</d:Name>

<d:ParentAssetId>nb:cid:UUID:fcef555d-1500-80c4-def5-f1e4d721ea68</d:ParentAssetId>

<d:PresentationTimeRange>

<d:EndTimestamp m:type="Edm.Int64">9223372036854775807</d:EndTimestamp>
<d:LiveBackoffDuration m:type="Edm.Int64">300000000</d:LiveBackoffDuration>
<d:PresentationWindowDuration m:type="Edm.Int64">3000000000</d:PresentationWindowDuration>
<d:StartTimestamp m:type="Edm.Int64">400000</d:StartTimestamp>
<d:Timescale m:type="Edm.Int64">10000000</d:Timescale>

</d:PresentationTimeRange>

<d:Tracks>

<d:element>

<d:PropertyConditions>

<d:element>

<d:Operator>Equal</d:Operator><d:Property>Type</d:Property><d:Value>audio</d:Value>

</d:element>

<d:element>

<d:Operator>Equal</d:Operator><d:Property>Language</d:Property><d:Value>en</d:Value>

</d:element>

</d:PropertyConditions>

</d:element>

<d:element>

<d:PropertyConditions>

<d:element>

<d:Operator>NotEqual</d:Operator><d:Property>Type</d:Property><d:Value>video</d:Value>

</d:element>

<d:element>

<d:Operator>Equal</d:Operator><d:Property>Bitrate</d:Property><d:Value>0-2427000</d:Value>

</d:element>

</d:PropertyConditions>

</d:element>

</d:Tracks>

</m:properties>

</content></entry>
After a filter has been created, you can start using it by adding a filter parameter to the manifest request URL, as in the example below.

http://cenkteststreaming.streaming.mediaservices.windows.net/3d56a4d-b71d-489b-854f-1d67c0596966/64ff1f89-b430-43f8-87dd-56c87b7bd9e2.ism/Manifest(filter=MyLocalFilter)

Filters can also be used in combination with dynamic packaging and dynamic encryption. As an example, here is a request for the same stream as above, but in HLS v4 format.

http://cenkteststreaming.streaming.mediaservices.windows.net/3d56a4d-b71d-489b-854f-1d67c0596966/64ff1f89-b430-43f8-87dd-56c87b7bd9e2.ism/Manifest(format=m3u8-aapl,filter=MyLocalFilter)
Note: Global filters can be used with all assets under the same Media Services account simply by adding the filter name to the asset streaming URL. Local filters can only be used with the asset they are associated with; see the “ParentAssetId” element in the local filter example above.
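The URL pattern above can be captured in a small helper. This is a hypothetical Python sketch; the base URL and asset path below are placeholders, not real endpoints.

```python
def manifest_url(streaming_base, asset_path, fmt=None, flt=None):
    """Build a manifest URL with optional dynamic packaging format
    (e.g. "m3u8-aapl" for HLS v4) and filter name parameters."""
    params = []
    if fmt:
        params.append("format=" + fmt)
    if flt:
        params.append("filter=" + flt)
    suffix = "(" + ",".join(params) + ")" if params else ""
    return "{0}/{1}/Manifest{2}".format(streaming_base, asset_path, suffix)

url = manifest_url("http://example.streaming.mediaservices.windows.net",
                   "locator-id/video.ism",
                   fmt="m3u8-aapl", flt="MyLocalFilter")
```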

What happens behind the scenes?

When the streaming endpoint receives a request that includes a filter, the original playlist (manifest) is processed using the filter rules and a filtered playlist is served in the response. If there are multiple filter rules, the intersection of the rules defines the output. Presentation range (time) rules are executed at GOP (group of pictures, i.e. key frame) boundaries and rounded to the nearest next fragment. Filters apply only to playlist (manifest) requests; the actual fragments are not processed, so players can still benefit from fragments already cached in proxies and CDNs. After a presentation range filter is applied, the output manifest includes only the segments within the defined filter start and end times.
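The GOP-boundary rounding can be modeled roughly as snapping a requested timestamp to the next fragment start. A simplified, hypothetical sketch, not the service’s actual algorithm:

```python
def snap_to_next_boundary(timestamp, fragment_starts):
    """Round a presentation-range timestamp up to the nearest fragment
    (GOP) start at or after it; None if it falls past the last one."""
    for start in sorted(fragment_starts):
        if start >= timestamp:
            return start
    return None

# 2-second GOPs expressed in HNS (100-nanosecond) units:
gops = [0, 20_000_000, 40_000_000, 60_000_000]
```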

Known issues and limitations

  • Dynamic manifest operates at GOP boundaries (key frames), hence trimming has GOP accuracy. If you need frame-accurate sub-clipping, you can use the rendered sub-clips feature.
  • You can use the same filter name for a local and a global filter. In that case the local filter takes precedence and overrides the global one.
  • If you update a filter, it can take up to 2 minutes for the streaming endpoint to refresh the rules. Also keep in mind that updating a filter can result in player failures if content served with the old filter is already cached in proxies and CDN caches. It is recommended to purge the cache after updating a filter; if that is not possible, consider using a different filter instead.
  • This release doesn’t support combining multiple filters in the URL. You can only apply one filter at a time.
  • You can set “EndTimestamp” rules for both on-demand and live playlists. However, this rule is silently ignored while the presentation is live and takes effect when the live presentation transitions to on-demand.
  • You can set “PresentationWindowDuration” and “LiveBackoffDuration” rules for both on-demand and live presentations. However, for on-demand presentations these rules are silently ignored.

Conclusion

Dynamic manifest (asset filters) is a powerful new feature that helps you reach multiple endpoints without the hassle of post-production and without keeping multiple copies of the same asset. It allows you to create multiple renditions and views in seconds. I hope you enjoy this cool feature! For feature requests or general feedback, we want to hear it all! Please use the comments section below and I’ll respond as quickly as I can.