How to signal timed metadata with Azure Media Services



Warning

Azure Media Services will be retired June 30th, 2024. For more information, see the AMS Retirement Guide.

Timed metadata is custom data that is inserted into a live stream. Both the data and its insertion timestamp are preserved in the media stream itself, so clients playing the video stream receive the same custom metadata at exactly the same point relative to the video.

Note

Timed metadata works only for live events created with RTMP and RTMPS ingestion.

Prerequisites

  • A Media Services account
  • Familiarity with live streaming from an on-premises encoder. If you haven't done this before, try live streaming with the OBS quickstart first. Once you have that set up and running, you should be able to perform the following steps.
  • A tool for sending HTTP POST requests, such as cURL.

View the sample

The example below shows how a video player catches and displays the timed metadata of the video stream. It uses the Shaka player and its built-in support for Event Message data (‘emsg’) through the EmsgEvent.

Media Services also supports the Shaka player ID3 MetadataEvent for 'emsg' events that use the scheme ID URI https://aomedia.org/emsg/ID3.
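In Shaka Player 4.3.0 and later, ID3 'emsg' payloads surface through the 'metadata' event instead of the 'emsg' event. As a minimal sketch, assuming the player and onEventMessage handler shown later in this article (the payload property follows the Shaka 4.x MetadataEvent and is worth verifying against your Shaka version):

player.addEventListener('emsg', onEventMessage); // fires in Shaka 4.2.x and earlier

player.addEventListener('metadata', (event) => {
    // In Shaka 4.3.0+, ID3 frames parsed from 'emsg' boxes arrive here.
    console.log('ID3 metadata:', event.payload, 'at', event.startTime);
});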

Review the code on Stackblitz

We've provided a sample Shaka player on Stackblitz for you to work with. Fork the sample code on Stackblitz.com to follow along.


Review the HTML page

The index.html document contains:

  • a div element where the message will show up once it's sent.
  • a standard HTML5 video element. Notice that the video element is set to autoplay and to start muted.
  • an input field for the streaming locator URL. The input field contains a placeholder URL that you can view, but it isn't a live stream. You'll replace this value with your own streaming locator URL.
<script type="module" src="./index.js"></script>
<link href="./style.css" type="text/css" rel="stylesheet">

<div class="grid-container">

  <div id="header">
    <a href="https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/main/Player/examples/shaka">
      <span class="microsoft"><svg aria-hidden="true" role="presentation" viewBox="0 0 26 25"
          xmlns="http://www.w3.org/2000/svg">
          <path d="M12.5708 0.981934H0.907471V12.3682H12.5708V0.981934Z" fill="#F25022"></path>
          <path d="M25.4625 0.981934H13.7992V12.3682H25.4625V0.981934Z" fill="#7FBA00"></path>
          <path d="M12.5708 13.5649H0.907471V24.9512H12.5708V13.5649Z" fill="#00A4EF"></path>
          <path d="M25.4629 13.5649H13.7996V24.9512H25.4629V13.5649Z" fill="#FFB900"></path>
        </svg></span>
      <span class="title">Shaka Player LL-HLS with Timed Metadata Sample</span>
    </a>
  </div>
  <div id="videoArea">


    <div id="video-container" data-shaka-player-cast-receiver-id="07AEE832">
      <div id="metadata" class="metadata-hide"></div>
      <video autoplay muted playsinline id="video" style="width: 100%; height: 100%"></video>
    </div>
  </div>
  <div id="clock">
  </div>

  <div id="console">Waiting for timed metadata signals to arrive...</div>
  <div id="manifest">
    <label>Your Manifest (paste and hit enter):</label>
    <input id="manifestUrl" type="url" placeholder="place manifest URL here" size="80"
      value="//aka.ms/lowlatencydemo.m3u8" />
  </div>

  <div id="footer">
  <a href="http://media.azure">Azure Media Services</a>
  </div>
</div>

Review the JavaScript

The index.js file creates and manages the player and player events. The onEventMessage function is registered to handle the emsg event from the Shaka Player and display the messages received from the POST.

player.addEventListener('emsg', onEventMessage);

function onEventMessage(event) {
    // In version 4.2.x of Shaka Player, the event message from AMS fires here.
    // In version 4.3.0 and higher, the message only fires in the 'metadata' event,
    // because Shaka Player looks for ID3 messages and filters them out to that event.

    console.log('Timed Metadata Event Message');
    //console.log('emsg:', event)
    // The emsg box information is in event.detail.
    // Decode the raw message bytes (useful when the payload isn't ID3).
    const dataMsg = new TextDecoder().decode(event.detail.messageData);
    console.log('EMSG: messageData (decoded) = ' + dataMsg);
    console.log('EMSG: Scheme = ' + event.detail.schemeIdUri);
    console.log('EMSG: StartTime = ' + event.detail.startTime);
    console.log(
        'video.currenttime=' + document.getElementById('video').currentTime
    );

    // The startTime and presentationTimeDelta are in seconds on the presentation
    // timeline; Shaka player does this conversion for us. The value
    // startTime - presentationTimeDelta gives the exact time on the video player's
    // timeline at which to display the event.
    console.log(
        'EMSG: startTime-presentationTimeDelta = ' +
        (event.detail.startTime - event.detail.presentationTimeDelta)
    );

    console.log(
        'EMSG: presentationTimeDelta = ' + event.detail.presentationTimeDelta
    );
    console.log('EMSG: endTime = ' + event.detail.endTime);
    console.log('EMSG: timescale = ' + event.detail.timescale);
    console.log('EMSG: duration = ' + event.detail.eventDuration);
    console.log('EMSG: message length = ' + event.detail.messageData.length);

    try {
        const frames = shaka.util.Id3Utils.getID3Frames(event.detail.messageData);

        if (frames.length > 0) {
            console.log('EMSG: message = ', frames[0]);
            console.log('EMSG: mimeType = ', frames[0].mimeType);

            if (frames[0].mimeType === 'application/json') {
                const jsonPayload = JSON.parse(frames[0].data);
                let message = jsonPayload.message;
                console.log('message=' + message);

                // Now do something with your custom JSON payload
                let metadataDiv = document.getElementById('metadata');
                metadataDiv.innerText = message;

                let logLine = document.createElement('p');
                logLine.innerText = 'onEmsg - timestamp:' + (event.detail.startTime - event.detail.presentationTimeDelta).toFixed(2) + ' ' + JSON.stringify(jsonPayload);
                document.getElementById('console').appendChild(logLine).scrollIntoView(false);

                metadataDiv.className = 'metadata-show';

                setTimeout(() => {
                    metadataDiv.className = 'metadata-hide';
                }, 5000); // clear the message

                console.log('JSON= ' + JSON.stringify(jsonPayload));
            }
        }
    } catch (err) {
        console.error(err.stack);
    }
}
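For context, the rest of index.js (not shown here) creates the player and loads the manifest. A minimal sketch of that wiring, assuming the element IDs from the HTML page above rather than the sample's exact code:

const video = document.getElementById('video');
const player = new shaka.Player(video);
player.addEventListener('emsg', onEventMessage);

const manifestInput = document.getElementById('manifestUrl');
manifestInput.addEventListener('change', async () => {
    // Load (or reload) the stream whenever a new manifest URL is entered.
    await player.load(manifestInput.value);
});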

Create a live event with a streaming locator

If you haven’t done so already with the OBS quickstart mentioned earlier, create a live event with a streaming locator.

  1. Use the Azure portal, REST, or your favorite SDK to create a live event (a hedged SDK sketch follows this list). Copy the ingest URL and paste it into a text editor; you'll need to edit it to send a message to the player with an HTTP POST request.
  2. Start the live event and make sure the associated streaming endpoint is also started.
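If you prefer code over the portal, here's a minimal sketch using the @azure/arm-mediaservices Node SDK. The resource names, location, and option shapes are placeholders and assumptions; check them against your SDK version:

import { AzureMediaServices } from '@azure/arm-mediaservices';
import { DefaultAzureCredential } from '@azure/identity';

// Placeholder values; substitute your own subscription and resource names.
const client = new AzureMediaServices(new DefaultAzureCredential(), '<subscription-id>');

const liveEvent = await client.liveEvents.beginCreateAndWait(
    '<resource-group>',
    '<account-name>',
    'myLiveEvent',
    {
        location: '<region>',
        input: {
            streamingProtocol: 'RTMP', // timed metadata requires RTMP/RTMPS ingest
            accessControl: { ip: { allow: [{ name: 'AllowAll', address: '0.0.0.0', subnetPrefixLength: 0 }] } },
        },
    },
    { autoStart: true } // start the event as part of creation
);

// The RTMPS ingest URL you'll edit in the next section:
console.log(liveEvent.input?.endpoints?.map((e) => e.url));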

Stream the live event

Copy and paste the streaming locator into the input field in the player on Stackblitz or optionally update the value on the input element in the index.html file. You should see the live event streaming to the player.

Create the POST URL

Edit the ingest URL:

  1. Change RTMPS to HTTPS.
  2. Remove the port number, including the colon.
  3. Remove /live/ from the path.
  4. Append ingest.isml/eventdata to the path.

Example:

rtmps://mylivestream.channel.media.azure-test.net:2935/live/0251458ba5df44b2b807ea02f40fed76

becomes

https://mylivestream.channel.media.azure-test.net/0251458ba5df44b2b807ea02f40fed76/ingest.isml/eventdata
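If you're scripting this, here's the same edit as a small JavaScript sketch (the helper name is ours, not part of the sample):

// Convert an RTMPS ingest URL into the timed-metadata POST URL.
function ingestToEventDataUrl(ingestUrl) {
    const url = new URL(ingestUrl.replace('rtmps://', 'https://')); // 1. RTMPS -> HTTPS
    url.port = '';                                                  // 2. remove the port
    const path = url.pathname.replace('/live/', '/');               // 3. remove /live/
    url.pathname = path.replace(/\/?$/, '/ingest.isml/eventdata');  // 4. append the suffix
    return url.toString();
}

console.log(ingestToEventDataUrl(
    'rtmps://mylivestream.channel.media.azure-test.net:2935/live/0251458ba5df44b2b807ea02f40fed76'
));
// -> https://mylivestream.channel.media.azure-test.net/0251458ba5df44b2b807ea02f40fed76/ingest.isml/eventdata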

Create and send a request

You can use any tool or SDK you like for sending an HTTP POST with the metadata in the body to the player.

Headers and request body

Reminder: the HTTP Content-Type header MUST be set to application/json. Then add the information you want to display, with the key set to "message". Here's a simple example message:

POST https://mylivestream.channel.media.azure-test.net/0251458ba5df44b2b807ea02f40fed76/ingest.isml/eventdata
Content-Type: application/json

{
    "message": "Hello world!"
}

When you send the request, you should see the message in the JSON payload show up in the div floating over the video element.
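For example, here's a minimal sketch of the same request with fetch (Node.js 18+ or a browser), using the example eventdata URL from the previous section:

const eventDataUrl =
    'https://mylivestream.channel.media.azure-test.net/0251458ba5df44b2b807ea02f40fed76/ingest.isml/eventdata';

const response = await fetch(eventDataUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' }, // required
    body: JSON.stringify({ message: 'Hello world!' }),
});
console.log('POST status:', response.status);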

Alternative requests

You can send additional information for an interactive overlay. The full setup for that scenario isn't covered here, but here's what the request body could look like for a quiz; a rendering sketch follows the example. You could iterate through the answers for each "question" (which replaces "message" as the key) and supply a button for the viewer to select.

POST https://mylivestream.channel.media.azure-test.net/0251458ba5df44b2b807ea02f40fed76/ingest.isml/eventdata
Content-Type: application/json


{
    "question": "What is the airspeed velocity of an unladen swallow?",
    "answers": [
        { "a1": "A shrubbery!" },
        { "a2": "I am not a witch!" },
        { "a3": "An African or European swallow?" },
        { "a4": "It's just a flesh wound." }
    ]
}
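As a minimal sketch of that idea (not part of the sample), you could render the quiz inside onEventMessage once jsonPayload is parsed:

function renderQuiz(jsonPayload, container) {
    const question = document.createElement('p');
    question.innerText = jsonPayload.question;
    container.appendChild(question);

    for (const answer of jsonPayload.answers) {
        // Each entry is an object with a single key, for example {"a1": "A shrubbery!"}.
        const [key, text] = Object.entries(answer)[0];
        const button = document.createElement('button');
        button.innerText = text;
        button.onclick = () => console.log('Viewer selected', key);
        container.appendChild(button);
    }
}

// For example: renderQuiz(jsonPayload, document.getElementById('metadata'));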

Tip

Open the Developer Tools for the browser and watch the video events that are fired as well as the messages received from the request JSON payload.

Example POST using cURL

When using cURL, you must set the header with -H "Content-Type: application/json". Use the -d flag to set the JSON data on the command line, escaping the quotes in the JSON body with backslashes. Optionally, you can point to a JSON file using -d @<path-to-json-file>.

A POST is implicit when sending data, so you don't need to use the -X POST flag.

Example POST:

curl https://mylivestream.channel.media.azure.net/618377123f4c49b3937ade20204ca0b2/ingest.isml/eventdata -H "Content-Type: application/json" -d "{\"message\":\"Hello from Seattle\"}" -v

Clean up resources

Make sure you shut down the live event and the streaming endpoint, and delete any resources you don't intend to keep using, or you'll continue to be billed.
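For example, here's a hedged sketch using the same @azure/arm-mediaservices client as in the create step (method names follow the generated v3 SDK; verify against your version):

// Stop the live event; removeOutputsOnStop also deletes its live outputs.
await client.liveEvents.beginStopAndWait('<resource-group>', '<account-name>', 'myLiveEvent', {
    removeOutputsOnStop: true,
});

// Stop the streaming endpoint so it no longer accrues charges.
await client.streamingEndpoints.beginStopAndWait('<resource-group>', '<account-name>', 'default');

// Delete the live event entirely if you're done with it.
await client.liveEvents.beginDeleteAndWait('<resource-group>', '<account-name>', 'myLiveEvent');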

Get help and support

You can contact Media Services with questions or follow our updates by one of the following methods: