Live streaming is now available for public preview, and one of the supported ingest protocols is RTMP. RTMP is a commonly used protocol for ingesting and delivering rich media, including live streams. Azure Media Services supports ingesting live feeds over RTMP and uses Dynamic Packaging to dynamically transmux live streams for delivery in MPEG-DASH, Microsoft Smooth Streaming, Apple HLS, or Adobe HDS formats. This lets you use the widely adopted RTMP protocol for input and multiple output protocols to reach many devices and endpoints while maintaining compatibility with legacy players and formats. For information on setting up an Azure Media Services account with a live channel and streaming endpoint, see Jason Suess’s excellent blog, “Getting Started with Live Streaming Using the Azure Media Management Portal”. This article focuses on the RTMP ingest feature in Azure Media Services and how to use it to source a multi-bitrate live feed to Azure Media Services channels using the Wirecast, Flash Media Live Encoder (FMLE) and FFmpeg encoders.

 

Architecture and General Information

At a high level, the Live Streaming architecture consists of three main components: Channel/Program, Storage and Streaming Endpoints.

  • Channel/Program - Channels enable live streaming. They include the ingest point for your live encoder. As of today, RTMP and Fragmented MP4 (Smooth Streaming) are the supported ingest protocols. A program is the logical component inside a channel. Programs publish the received data for streaming and also archive the content for on-demand use and the live presentation window (DVR).
  • Streaming Endpoint and Streaming Units - a Streaming Endpoint provides you with a URL from which you can pull your live and VOD assets. Streaming Endpoints also provide dynamic packaging capabilities and secure the delivery of the streams.
  • Storage- Programs use Azure storage for storing live archives. On-demand streaming and encoding services also use storage.

Channel RTMP support

Azure Media Services channels support an RTMP push model. A channel can accept both single- and multi-bitrate input, but multi-bitrate input is highly recommended so that you get the benefits of adaptive bitrate streaming. In the future, Azure Media Services will provide a live transcoding service that converts single-bitrate input to multi-bitrate output. To use RTMP ingest, the following is required:

  • An encoder that supports RTMP output
  • H.264 video and AAC audio codec output
  • Key frame or GOP (group of pictures) alignment across video qualities
  • A 2-second key frame interval (you can use up to 6 seconds, but this requires special configuration; see the Advanced Configuration section below)
  • Unique stream names for each quality
  • Network connectivity (available bandwidth for the aggregated video + audio bitrates)
  • Strict CBR encoding is recommended for optimum adaptive bitrate performance
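
As a rough illustration, here is how these requirements map onto the FFmpeg settings used in the example commands later in this post (other encoders expose equivalent options under different names):

  H.264 video + AAC audio         ->  -c:v libx264 and -c:a aac
  2-second key frame interval     ->  -r 30 -g 60 -keyint_min 60 (30 fps x 2 s = 60-frame GOP)
  GOP alignment across qualities  ->  identical -r, -g and -keyint_min for every quality, plus -sc_threshold 0 to prevent extra scene-cut key frames
  Strict CBR encoding             ->  -b:v, -minrate and -maxrate set to the same value, with a matching -bufsize
  Unique stream names             ->  a different stream name at the end of each rtmp:// ingest URL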

In this post I will use three video qualities for output and ingest them into an Azure Media Services channel. You can use more qualities, but keep in mind that the number of qualities will be limited by your machine’s encoding capabilities and your network connection to the channel ingest. If you exceed your available bandwidth or have a poor network connection, you may need to reduce the number of qualities and adjust the encoding settings to use a lower resolution and bitrate. When using multiple qualities, pay attention to the aggregated bitrate of all qualities; for example, the multi-bitrate FFmpeg example later in this post uses 500, 300 and 150 Kbps video with 128 Kbps audio per quality, roughly 1.3 Mbps in aggregate, so your upload connection needs to sustain at least that rate.

Note: Make sure to reset your channel every time you change encoder settings, and then disconnect and reconnect your encoder to the channel.

Wirecast Configuration

Wirecast is a well-known RTMP encoder from Telestream. It enables capture, live production, and encoding of live streams for broadcast. You can get more information and download a trial version from the Telestream web site. Wirecast versions 5 and 6 are supported and tested with Azure Media Services, and the steps below apply to both. Version 6 also includes ready-to-use Azure Media Services encoding presets. You can either use the default presets or create your own by following the steps below.

 

Configuring Input

  • Hover over the “+” button.

  • Select from the available source options; Wirecast includes multiple choices. For this post I will use my integrated video camera.

  • The camera capture appears in a new small window. Click it, and your camera capture will appear in the Preview window.

Configuring Output

  • On the Main Menu, go to Output -> Output Settings.

  • In the Select an Output Destination dialog box, select RTMP Server.

The Output Settings dialog appears.

  • Name your first output (quality), for example “Azure Media Services Quality1”.
  • Enter your channel’s ingest URL in the Address field. See “Getting Started with Live Streaming Using the Azure Media Management Portal” for information on how to get the channel ingest URL.
  • Enter a unique stream name (for example, myStream1). If you have multiple qualities, each quality needs to have a unique stream name.
  • Create a new encoding preset for your first quality.
    • In the Output Settings dialog box, select New Preset.
    • In the Enter New Preset Name text box, enter a name for the new preset, for example “MyQuality1”.
    • Set “Encoder:” type to H.264 or x264.
    • Configure the preset with the desired values.

When creating your own encoding presets, set the “Frames per second” and “Key frame every” values to the same value for each quality. Use the same audio encoding settings for each quality, and check the “Keyframe Aligned” setting for each quality. Otherwise the streams will not be ingested by the channel or will not play correctly. For proper adaptive bitrate streaming, these settings must be the same across all qualities.

  • After configuring your first output, the Output Settings dialog shows your first quality.

  • To add other quality levels, click Add and follow the steps above for each additional quality. Don’t forget to set the same values for “Frames per second” and “Key frame every”, check the “Keyframe Aligned” setting, and give each stream a unique name.
  • After configuring all three qualities, the Output Settings dialog lists all three outputs.

Start Encoding and Pushing Data to the Channel Ingest

  • Click the arrow to start encoding.

  • Start pushing live encoded data to the channel by pressing the Stream button. The indicator light turns red when streaming starts.

Preview the Stream

You can use the Azure Management Portal to preview your streams. You can also publish your streams using the Azure Portal. See “Getting Started with Live Streaming Using the Azure Media Management Portal” for information on how to publish and preview. As an alternative, you can use a player of your choice to preview the published stream.

 

Flash Media Live Encoder (FMLE) Configuration

FMLE is free software from Adobe. You can get more information and download a copy of the encoder from https://www.adobe.com/products/flash-media-encoder.html. By default, FMLE supports only MP3 audio output. Currently Azure Media Services doesn’t offer a live transcoding service and requires the AAC audio codec in order to dynamically package source streams into multiple formats such as MPEG-DASH, Smooth Streaming and HLS. To use FMLE with Azure Media Services, you therefore need an AAC plugin. In this post I will use a plugin from MainConcept. You can get more information and download a trial version from https://www.mainconcept.com/eu/products/plug-ins/plug-ins-for-adobe/aac-encoder-fmle.html

Configuring FMLE

  • The first thing you should do is configure FMLE to use an NTP time source (absolute time) for RTMP timestamps. To do this:
    1. Close your encoder.
    2. Open the FMLE config.xml file in a text editor. The default installation location on Windows is C:\Program Files\Adobe\Flash Media Live Encoder 3.2 (on x64 it is C:\Program Files (x86)\Adobe\Flash Media Live Encoder 3.2). The default installation location on Mac OS is Macintosh HD:Applications:Adobe:Flash Media Live Encoder 3.2.
    3. In the <streamsynchronization> element, set <enable> to true (that is, <enable>true</enable>).
    4. Save the file and start your encoder again.
  • Select your source device from the device list. For this post I will be using “Integrated Camera”.
  • Select your encoding preset from the preset menu or create your own. If you create your own, please read the “Channel RTMP support” section above for the requirements. I will be using “Multi Bitrate - 3 streams (1500 Kbps) - H.264” from the presets menu, which creates three qualities.

  • Configure the H.264 advanced settings and set “Key Frame frequency” to 2 seconds by clicking the advanced settings icon next to the video format. Note: This will change the preset name to “Custom”.

  • Set the frame rate to 30 fps.
  • Select your audio device.
  • Set the audio output format to AAC. Note: AAC and HE-AAC are not available by default; FMLE only supports MP3 audio out of the box, and you need a plugin to enable the AAC codec.
  • Set the desired audio bandwidth. For this post I will use 96 Kbps and a 44100 Hz sampling rate.

Your initial configuration is now complete.

Start Encoding and Pushing Data to the Channel Ingest

  • Enter your channel’s ingest URL in the FMS URL field and your stream name in the Stream field, then click Connect. This connects the encoder to the channel ingest URL.

  • Click Start to begin encoding.

Note: You can also use FMLE in command-line mode. For more information, please see “Start Flash Media Live Encoder in command-line mode”.
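
As a rough sketch of command-line mode: FMLE installs a console executable (FMLECmd.exe) alongside the GUI that can start encoding from a profile saved in the FMLE UI. The profile path below is a hypothetical example, and the exact set of switches may differ, so see the Adobe article referenced above for the full syntax.

rem Start FMLE in command-line mode using a saved encoding profile (path is an example only)
cd "C:\Program Files (x86)\Adobe\Flash Media Live Encoder 3.2"
FMLECmd /p C:\profiles\AzureMediaServices_3streams.xml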

Preview the Stream

You can use the Azure Management Portal to preview your streams. You can also publish your streams using the Azure Portal. See “Getting Started with Live Streaming Using the Azure Media Management Portal” for information on how to publish and preview. As an alternative, you can use a player of your choice to preview the published stream.

Using FFmpeg with Azure Media Services Channel ingest

FFmpeg is a well-known open-source tool that encodes and creates many different media formats, and RTMP is one of its supported protocols. You can get more information and download a copy of FFmpeg from the official web site. In this post I will not go into the details of FFmpeg commands and their usage, but will provide a sample command that streams a local file to simulate a live stream. You can also use FFmpeg to capture data from multiple sources and devices, including a local camera, desktop capture and other devices (see the device-capture sketch at the end of this section); more information can be found on the official web site.

Example Command

I will use a Windows build of FFmpeg for this post; you can use the FFmpeg version that matches your platform. Single bitrate:

C:\tools\ffmpeg\bin\ffmpeg.exe -v verbose -i MysampleVideo.mp4 -strict -2 -c:a aac -b:a 128k -ar 44100 -r 30 -g 60 -keyint_min 60 -b:v 400000 -c:v libx264 -preset medium -bufsize 400k -maxrate 400k -f flv rtmp://channel001-streamingtest.channel.media.windows.net:1935/live/a9bcd589da4b424099364f7ad5bd4940/mystream1

Multi-bitrate (three bitrates: 500, 300 and 150 Kbps):

C:\tools\ffmpeg\bin\ffmpeg.exe -threads 15 -re -i MysampleVideo.mp4 -strict experimental -acodec aac -ab 128k -ac 2 -ar 44100 -vcodec libx264 -s svga -b:v 500k -minrate 500k -maxrate 500k -bufsize 500k -r 30 -g 60 -keyint_min 60 -sc_threshold 0 -f flv rtmp://channel001-streamingtest.channel.media.windows.net:1935/live/a9bcd589da4b424099364f7ad5bd4940/Streams_500 -strict experimental -acodec aac -ab 128k -ac 2 -ar 44100 -vcodec libx264 -s vga -b:v 300k -minrate 300k -maxrate 300k -bufsize 300k -r 30 -g 60 -keyint_min 60 -sc_threshold 0 -f flv rtmp://channel001-streamingtest.channel.media.windows.net:1935/live/a9bcd589da4b424099364f7ad5bd4940/Streams_300 -strict experimental -acodec aac -ab 128k -ac 2 -ar 44100 -vcodec libx264 -s qvga -b:v 150k -minrate 150k -maxrate 150k -bufsize 150k -r 30 -g 60 -keyint_min 60 -sc_threshold 0 -f flv rtmp://channel001-streamingtest.channel.media.windows.net:1935/live/a9bcd589da4b424099364f7ad5bd4940/Streams_150

The multi-bitrate command creates three video qualities (500, 300 and 150 Kbps) with a 2-second key frame interval (-g 60 at -r 30) and sends the output to the Azure Media Services channel ingest. See “Getting Started with Live Streaming Using the Azure Media Management Portal” for information on how to get the channel ingest URL.
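
As mentioned earlier, FFmpeg can also capture directly from a local device instead of a file. Here is a rough single-bitrate sketch using Windows DirectShow capture; the device names in quotes are placeholders (list yours with: ffmpeg -list_devices true -f dshow -i dummy), and the ingest URL and stream name must be replaced with your channel’s values.

C:\tools\ffmpeg\bin\ffmpeg.exe -f dshow -i video="Integrated Camera":audio="Microphone" -strict -2 -c:a aac -b:a 128k -ar 44100 -c:v libx264 -preset veryfast -r 30 -g 60 -keyint_min 60 -sc_threshold 0 -b:v 500k -minrate 500k -maxrate 500k -bufsize 500k -f flv rtmp://<your channel ingest URL>/mystream1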

Preview the Stream

You can use the Azure Management Portal to preview your streams. You can also publish your streams using the Azure Portal. See “Getting Started with Live Streaming Using the Azure Media Management Portal” for information on how to publish and preview channels and programs. As an alternative, you can use a player of your choice to preview the published stream.

Advanced Configuration

By default, an Azure Media Services channel is configured to ingest data with a 2-second key frame interval, or GOP (KeyFrameInterval). Dynamic packaging also uses a 3-to-1 mapping for HLS output, which means that if you ingest data configured with a 2-second key frame interval, your HLS output segments will be 6 seconds long (3 × 2 seconds). If you want to ingest data with a key frame interval other than 2 seconds, you need to adjust these values when you create the channel, and you must use the SDK to do so, since these advanced settings cannot be changed in the Portal. You can get more information on how to create and configure a channel using the SDK from “Creating a Live Streaming Application with the Media Services SDK for .NET”.
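
As a rough, non-authoritative sketch of where these settings live when you create a channel programmatically: the channel entity exposes the ingest key frame interval on its input and the HLS packaging ratio on its output. KeyFrameInterval is the setting named above; the FragmentsPerSegment property name and the duration format shown below are assumptions for illustration only, so check the SDK/REST reference from the article above before relying on them.

  Input.KeyFrameInterval         = PT6S   (assumed ISO 8601 duration: tells the channel to expect 6-second GOPs from the encoder)
  Output.Hls.FragmentsPerSegment = 1      (assumed property name: overrides the default 3-to-1 HLS mapping so segments stay at roughly 6 seconds)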

 

Summary and Next Steps

Hopefully this post demonstrates how easy it is to get RTMP live encoders working with Azure Media Services and configured for basic streaming. You can do much more with Azure Media Services live streaming and its other features. You can find more information about live streaming and SDK support in “Working with Azure Media Services Live Streaming”, and general information about Azure Media Services on the “Azure Media Services” official site. We hope you try the new live streaming service with RTMP ingest support and let us know if you have any feedback. Happy live streaming!
