This post was co-authored by William Zhang and Cenk Dingiloglu at Microsoft, and Lakshminarasimhan Sundararajan and Claire Van de Voorde at VideoStitch.
With the successful launch of the Azure Media Services live streaming and encoding platform, we have seen an amazingly wide range of use cases and scenarios being built out by our customers and partners. One of the current hot topics on our forums and in our inboxes here on the Azure Media Services team has been 360° virtual reality streaming.
With the launch of 360° live video services from Facebook and YouTube, Comcast's recent announcement with NASCAR, and the release of new camera capture hardware from GoPro, virtual reality live streaming has become the new 4K HEVC/UHD of 2016.
The Media Services team is no stranger to large-scale live streaming events like the Olympics. For the first time in Olympic broadcasting, virtual reality live streaming is being used for events like the opening ceremony, as well as daily highlights throughout the 10 days of the Youth Olympics in Lillehammer, Norway. We expect to see even more use of the technology during the 2016 Rio Summer games. For example, Olympic Broadcasting Services has already announced plans for a live virtual reality feed during the games.
More and more customers are exploring the benefits of 360° live streaming and virtual reality. Some of these benefits include:
- Improved user engagement and interactivity, so users are no longer passive observers as they are with TV and movies.
- An immersive experience, which is particularly important for live events like sports and concerts.
As the interest has grown, a set of vendors are beginning to specialize in this area by providing products ranging from head gear (Samsung and Facebook), to cameras (GoPro), to software (VideoStitch). In this post, we will present a live VR streaming workflow on Azure Media Services leveraging Vahana VR from VideoStitch, one of our Azure Media Services partners.
The live 360° streaming workflow
Today, a live streaming workflow can be built on Azure Media Services to stream any standard camera to an audience of millions with a few button clicks (or a few lines of code). This same proven platform and set of APIs can be used along with a partner like VideoStitch to provide a complete live 360°/VR experience to your customers. Such a workflow has the following components:
- To capture virtual reality video, you need access to a 360°-capable video camera. Such a camera uses either fish-eye lenses or an array of lenses to collectively cover a 360° (θ) by 180° (ϕ) sphere, as shown below.
- The video feeds from the array of camera lenses are then calibrated, cropped, re-oriented, and stitched into a single video called spherical video. In a spherical (equirectangular) video, the longitude and latitude coordinates of a sphere are mapped onto the horizontal and vertical coordinates of a rectangular grid, with one side of the grid being twice as long as the other.
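The equirectangular mapping described above can be sketched in a few lines. This is a minimal illustration, not part of any stitching product; the function name and frame size are ours:

```python
import math

def equirect_pixel(lon_deg, lat_deg, width, height):
    """Map a spherical direction (longitude, latitude in degrees) to
    (x, y) pixel coordinates in an equirectangular frame.
    Longitude spans -180..180 across the width; latitude spans 90..-90
    down the height, which is why the frame is twice as wide as it is tall."""
    x = (lon_deg + 180.0) / 360.0 * width
    y = (90.0 - lat_deg) / 180.0 * height
    return x, y

# In a 4096x2048 frame (the 2:1 aspect ratio typical of spherical video),
# the point straight ahead (longitude 0, latitude 0) lands at the center.
print(equirect_pixel(0, 0, 4096, 2048))    # (2048.0, 1024.0)
```

Because every direction on the sphere gets a fixed spot in this rectangle, the stitched output can travel through ordinary video pipelines unchanged.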
- A few partners of Azure Media Services provide such video-stitching products and services. VideoStitch is one such partner, providing on-premises software and GPU-based hardware called Vahana VR to enable capturing and stitching from a camera array. Vahana VR can output live 360° spherical video over the RTMP protocol at a single high bitrate, which can then be easily ingested into an existing Azure Media Services live channel. This is possible because the stitched video is sent to us as a normal rectangle with a wide aspect ratio.
- To prepare for multi-bitrate streaming over the Internet, the live 360° RTMP stream is ingested into an Azure Media Services live channel. A live encoder in Azure Media Services can then re-encode and package the live input into multiple bitrates for adaptive streaming on any device (iOS, Android, browser, Windows, Mac, etc.).
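To make the multi-bitrate step concrete, here is an illustrative sketch of an adaptive bitrate ladder for equirectangular video. The resolutions and bitrate heuristic are our own examples, not Azure Media Services' actual encoder presets; the one real constraint shown is that every rung must keep the 2:1 aspect ratio of the spherical layout:

```python
def vr_bitrate_ladder(src_width=4096, steps=(1.0, 0.5, 0.25)):
    """Build an example ladder for 2:1 equirectangular video.
    Each rung scales the width down and derives the height as width / 2,
    preserving the equirectangular layout at every quality level.
    The kbps figure is a crude bits-per-pixel heuristic for illustration."""
    ladder = []
    for scale in steps:
        w = int(src_width * scale)
        h = w // 2                      # keep the 2:1 aspect ratio
        kbps = int(w * h * 0.0008)      # illustrative, not an AMS preset
        ladder.append({"width": w, "height": h, "kbps": kbps})
    return ladder

for rung in vr_bitrate_ladder():
    print(rung)
```

The point of the 2:1 constraint is that a player treats every rung as the same sphere, so switching bitrates mid-stream never distorts the projection.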
- The spherical video is delivered to virtual reality player clients in an adaptive bitrate protocol. A VR player is responsible for converting the spherical video (equirectangular layout) to flat video, presenting a subset of it at a time. As the user changes viewing angle, a different portion of the video is shown, producing a 360° immersive experience. Currently, our own Azure Media Player does not support this rendering, but our partners can provide solutions for rendering in multiple player types.
There are two types of VR players available from our partners.
- HMDs (head-mounted or helmet-mounted displays), which usually have two lenses. Embedded accelerometers and gyroscopes track head movement to control the viewing angle. Samsung Gear VR and the Oculus Rift from Facebook are such products. The eleVR player lets you watch 360° flat and stereo video on your Oculus Rift or on an Android device with a VR headset (Cardboard, Durovis Dive, etc.) from a web browser. It is written with JavaScript, HTML5, and WebGL. See the Examples section below for samples.
- Regular computing devices such as mobile phones, tablets, or laptops, which rely on player software for display and on a touch screen, mouse, or keyboard to control the viewing angle. Azure Media Player does not yet support VR playback of spherical video, but you can play back the flat equirectangular video directly.
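The lookup a VR player performs when flattening spherical video can be sketched as follows. This is a simplified, CPU-side illustration of the per-pixel sampling a real player does in a GPU shader; the function name and angles are our own:

```python
import math

def view_to_equirect(yaw_deg, pitch_deg, width, height):
    """Given a viewing direction (yaw = left/right, pitch = up/down,
    in degrees), return the (x, y) pixel in an equirectangular frame
    at the center of the viewer's gaze. A VR player repeats a lookup
    like this for every screen pixel to render the visible portion
    of the sphere as flat video."""
    # Wrap yaw into [-180, 180) so turning all the way around works.
    yaw = (yaw_deg + 180.0) % 360.0 - 180.0
    x = (yaw + 180.0) / 360.0 * width
    y = (90.0 - pitch_deg) / 180.0 * height
    return x, y

# Looking 90 degrees to the right in a 4096x2048 frame:
print(view_to_equirect(90, 0, 4096, 2048))   # (3072.0, 1024.0)
```

On an HMD, yaw and pitch come from the headset's motion sensors; on a phone or laptop, they come from touch, mouse, or keyboard input, but the sampling is the same.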
In summary, the overall workflow is as shown below:
Examples
You can experience 360°/VR streaming from Azure Media Services for yourself, even without a virtual reality headset handy. Just open the following URLs in a Chrome or Edge browser. Click once in the video to capture the mouse so you can look around the 360° scene. Sorry, but these links do not currently work in Safari or on iOS.
- Las Vegas, NV flyover:
- Snowy takeoff:
Let us know what you think about 360° live streaming and virtual reality in the comments below. Send us feedback through our MSDN Forum and vote for this feature or your favorites on our UserVoice (https://aka.ms/amsvoice).
Also, come visit us at the National Association of Broadcasters (NAB) show, April 18-21 at the Microsoft booth in South Hall Lower SL6810 and our partner, VideoStitch, in South Upper Hall SU6418.