Fri Dec 14 2018

Create a Live Encoding from an RTMP stream

Introduction

The Bitmovin cloud encoding service is a powerful tool for live streaming, and our API makes it easy to implement. This tutorial concentrates on feeds contributed with the RTMP protocol, which are the simplest to set up. There are four basic steps involved in setting up a live stream with our cloud encoding service.

(Figure: setting up adaptive streaming from an RTMP stream)

1. Ingest RTMP Stream to our Live Encoder

Usually, a mezzanine or "contribution" encoder that is processing the live signal will transcode it to a high-quality mezzanine format and push it to the RTMP ingest point of our live encoder. You can use a hardware encoder from Elemental, Teradek, Teracue, or any other vendor, or software like the popular OBS Studio or FFmpeg.

2. Encoding of the Input Stream to MPEG-DASH and HLS

You can define multiple output resolutions and bitrates for MPEG-DASH and HLS, and choose whether to encode to H.264 (AVC) or H.265 (HEVC). There are practically no limits on the output you can request from our live encoder; for example, it can easily handle multiple 4K 60 FPS streams encoded to HEVC.

3. Direct Output to Your Storage

Our live encoder writes the encoded files directly to your AWS S3 or GCS bucket. You can use this output storage as the origin for one or more CDNs. This works particularly well with GCS, as Google offers a CDN Interconnect for several CDNs such as Akamai, Fastly, Level3, HighWinds, CloudFlare, Limelight, and Verizon. AWS S3 can also easily be connected to the CloudFront CDN, giving you an out-of-the-box solution in one cloud.

4. Playback on any Device or Browser

If you are using the Bitmovin HTML5 player, your content will play back on any device or browser. However, you are not limited to our player and can also choose among others, as described in our recent blog post on encoding for multiple HTML5 players. You can also use the Bitmovin Player SDKs if you want to build native apps for iOS, Android, Roku, and other devices.


Starting with the API

Now, let’s see how we can start a live stream that generates MPEG-DASH and HLS. To showcase the setup of the live stream, we will use the .NET C# API client. The full example can also be found in the examples list of the .NET C# API client.

You will also find similar getting-started examples in our other API client SDKs.

So, first, set up the Bitmovin API client with your API key:

var bitmovin = new BitmovinApi(API_KEY);

Create an Output

We are using a Google Cloud Storage bucket as the output location for the live encoder. However, it's a simple change to use an AWS S3 bucket instead if you prefer.

var output = bitmovin.Output.Gcs.Create(new GcsOutput
{
    Name = "GCS Output",
    AccessKey = GCS_ACCESS_KEY,
    SecretKey = GCS_SECRET_KEY,
    BucketName = GCS_BUCKET_NAME
});

Create an Encoding

To guarantee a stable connection between the contribution encoder and our live encoder, it is crucial to carefully select the cloud and region in which your live encoder will run to reduce latencies.

Besides the cloud region, you can also pin a specific encoder version, or use BETA to access newer features, although we recommend using our STABLE branch in most cases.
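The later steps all reference an encoding object. A minimal sketch of creating it could look like the following; the name, cloud region, and version values are illustrative assumptions, so adjust them to your setup:

// Create the encoding resource itself; pick the cloud region closest to
// your contribution encoder (the values below are illustrative assumptions)
var encoding = bitmovin.Encoding.Encoding.Create(new Encoding.Encoding
{
    Name = "Live Encoding from RTMP",
    CloudRegion = EncodingCloudRegion.GOOGLE_EUROPE_WEST_1,
    EncoderVersion = "STABLE"
});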

Select the RTMP Live Input

For live streams, we currently support RTMP as the ingest protocol. Your account comes with a pre-configured RTMP input that you can retrieve with the line below:

var rtmpInput = bitmovin.Input.Rtmp.RetrieveList(0, 100)[0];

Note that the IP address of that RTMP endpoint is only determined after you start the encoding, as you will see later. This address will typically be different for every new encoding, even if you only stop and restart the encoding to change parameters.

Add Video and Audio Codec Configurations

A codec configuration contains the encoding-related settings for a video or audio rendition. You need to link the codec configuration to a stream of your encoding, which connects an input stream with the codec configuration. For example, linking your input video stream to an H.264 1080p codec configuration will encode that stream to H.264 1080p output. The following example uses the AUTO selection mode and position 0, thus linking this configuration to the first video stream of the RTMP input. Besides H.264, we also support H.265 (HEVC) as codec, and resolutions of 8K or even more.

var videoConfig1080p = bitmovin.Codec.H264.Create(new H264VideoConfiguration
{
    Name = "H264_Profile_1080p",
    Profile = H264Profile.HIGH,
    Width = 1920,
    Height = 1080,
    Bitrate = 4800000,
    Rate = 30.0f
});

// CreateStream is a helper from the full example; it builds a Stream that
// links the RTMP input (application "live", input position 0) to the codec configuration
var videoStream1080p = bitmovin.Encoding.Encoding.Stream.Create(encoding.Id,
    CreateStream(rtmpInput, "live", 0, videoConfig1080p, SelectionMode.AUTO));
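For instance, a hypothetical second rendition at 720p could be added the same way (the 720p bitrate here is an illustrative assumption):

// Hypothetical 720p rendition, following the same pattern as above
var videoConfig720p = bitmovin.Codec.H264.Create(new H264VideoConfiguration
{
    Name = "H264_Profile_720p",
    Profile = H264Profile.HIGH,
    Width = 1280,
    Height = 720,
    Bitrate = 2400000,
    Rate = 30.0f
});
var videoStream720p = bitmovin.Encoding.Encoding.Stream.Create(encoding.Id,
    CreateStream(rtmpInput, "live", 0, videoConfig720p, SelectionMode.AUTO));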

In the same way, you can add further video codec configurations and streams to generate more renditions (360p, 240p, etc.). In addition to the video renditions, you will likely also want an audio track. For audio, it works much the same way as for video, as you can see in this code sample:

var audioConfig = bitmovin.Codec.Aac.Create(new AACAudioConfiguration
{
    Name = "AAC_Profile_128k",
    Bitrate = 128000,
    Rate = 48000
});
var audioStream = bitmovin.Encoding.Encoding.Stream.Create(encoding.Id,
    CreateStream(rtmpInput, "live", 0, audioConfig, SelectionMode.AUTO));

Create Muxings for MPEG-DASH and HLS

In order to create MPEG-DASH and HLS content, the encoded data needs to be packaged accordingly. In the following lines of code we define segmented fMP4 for MPEG-DASH and segmented TS for HLS. Here you also define how long the video segments should be. This can be an important parameter when creating video clips out of your live stream: the smaller the value, the more accurately you can “cut” a clip, and the lower the latency. However, too short a value will reduce the encoding performance and the quality of the output for a given bitrate. By default, our encoder chooses a segment length of 4 seconds, which is what we recommend you start with for the best balance between latency and encoding efficiency. We also define where the segments should be stored in your output bucket; you have full control over the output location of the video and audio streams.

Note that it is also possible to create VoD manifests from the data encoded from the live stream in this way. This is explained in our blog post: Implement Live-to-VoD with the Bitmovin API.

First we will create the required fMP4 muxings for MPEG-DASH:

// segmentLength is the segment duration in seconds (4 is the default and our recommendation)
var videoFMP4Muxing1080p = bitmovin.Encoding.Encoding.Fmp4.Create(encoding.Id,
    CreateFMP4Muxing(videoStream1080p, output, OUTPUT_PATH + "video/1080p_dash", segmentLength));
var audioFMP4Muxing = bitmovin.Encoding.Encoding.Fmp4.Create(encoding.Id,
    CreateFMP4Muxing(audioStream, output, OUTPUT_PATH + "audio/128kbps_dash", segmentLength));
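CreateFMP4Muxing is a small helper defined in the full example. The following is a rough sketch of what it might look like, hedged since the exact model properties can differ between client versions; the CreateTsMuxing helper used next works analogously for TS muxings:

// Sketch of the muxing helper (property names assumed from the .NET client models)
private static Fmp4Muxing CreateFMP4Muxing(Stream stream, Output output, string outputPath, double segmentLength)
{
    return new Fmp4Muxing
    {
        SegmentLength = segmentLength,
        SegmentNaming = "seg_%number%.m4s",
        Outputs = new List<Encoding.Output>
        {
            new Encoding.Output
            {
                OutputPath = outputPath,
                OutputId = output.Id,
                Acl = new List<Acl> { new Acl { Permission = Permission.PUBLIC_READ } }
            }
        },
        Streams = new List<MuxingStream> { new MuxingStream { StreamId = stream.Id } }
    };
}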

The following shows the same for segmented TS muxings:

var videoTsMuxing1080p = bitmovin.Encoding.Encoding.Ts.Create(encoding.Id,
    CreateTsMuxing(videoStream1080p, output, OUTPUT_PATH + "video/1080p_hls", segmentLength));
var audioTsMuxing = bitmovin.Encoding.Encoding.Ts.Create(encoding.Id,
    CreateTsMuxing(audioStream, output, OUTPUT_PATH + "audio/128kbps_hls", segmentLength));

If you added multiple video renditions in the previous steps, you also need to create fMP4 and TS muxings for each rendition.
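For instance, for the hypothetical 720p rendition sketched earlier, that could look like this:

// Muxings for the hypothetical 720p rendition (repeat for each rendition you add)
var videoFMP4Muxing720p = bitmovin.Encoding.Encoding.Fmp4.Create(encoding.Id,
    CreateFMP4Muxing(videoStream720p, output, OUTPUT_PATH + "video/720p_dash", segmentLength));
var videoTsMuxing720p = bitmovin.Encoding.Encoding.Ts.Create(encoding.Id,
    CreateTsMuxing(videoStream720p, output, OUTPUT_PATH + "video/720p_hls", segmentLength));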

Define the MPEG-DASH Live Manifest

In order to play back MPEG-DASH content, we also need a manifest. With the Bitmovin API you have full control over creating manifests; for example, you can create multiple manifests with different renditions.

When creating the MPEG-DASH manifest (or manifests), you also specify where it is written, with the output path and the filename for the manifest:

var manifestOutput = new Encoding.Output
{
    OutputPath = OUTPUT_PATH,
    OutputId = output.Id,
    Acl = new List<Acl> { new Acl { Permission = Permission.PUBLIC_READ } }
};

var manifestDash = bitmovin.Manifest.Dash.Create(new Dash
{
    Name = "MPEG-DASH Live Manifest",
    ManifestName = "stream.mpd",
    Outputs = new List<Encoding.Output> { manifestOutput }
});

Define a default DASH period, and video and audio adaptation sets. In the audio adaptation set, you can define the language of the audio track.

var period = bitmovin.Manifest.Dash.Period.Create(manifestDash.Id, new Period());
var videoAdaptationSet =
    bitmovin.Manifest.Dash.VideoAdaptationSet.Create(manifestDash.Id, period.Id, new VideoAdaptationSet());
var audioAdaptationSet = bitmovin.Manifest.Dash.AudioAdaptationSet.Create(manifestDash.Id, period.Id,
    new AudioAdaptationSet { Lang = "en" });

Add the created fMP4 muxings to the relevant adaptation sets. You also need to define the path to the segments relative to the manifest output location, based on your choices in the previous step:

bitmovin.Manifest.Dash.Fmp4.Create(manifestDash.Id, period.Id, videoAdaptationSet.Id,
    new Manifest.Fmp4
    {
        Type = SegmentScheme.TEMPLATE,
        EncodingId = encoding.Id,
        MuxingId = videoFMP4Muxing1080p.Id,
        SegmentPath = "video/1080p_dash"
    });
bitmovin.Manifest.Dash.Fmp4.Create(manifestDash.Id, period.Id, audioAdaptationSet.Id,
    new Manifest.Fmp4
    {
        Type = SegmentScheme.TEMPLATE,
        EncodingId = encoding.Id,
        MuxingId = audioFMP4Muxing.Id,
        SegmentPath = "audio/128kbps_dash"
    });

Repeat those steps for each audio and video rendition you want to add to the DASH manifest.
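For example, the hypothetical 720p rendition from earlier would be added to the video adaptation set like this:

// DASH manifest entry for the hypothetical 720p rendition
bitmovin.Manifest.Dash.Fmp4.Create(manifestDash.Id, period.Id, videoAdaptationSet.Id,
    new Manifest.Fmp4
    {
        Type = SegmentScheme.TEMPLATE,
        EncodingId = encoding.Id,
        MuxingId = videoFMP4Muxing720p.Id,
        SegmentPath = "video/720p_dash"
    });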

Define the HLS Live Manifest

The definition of the HLS live manifest works in a similar fashion to MPEG-DASH. First, we define the output location and name of the live HLS manifest:

var manifestHls = bitmovin.Manifest.Hls.Create(new Hls
{
    Name = "HLS Live Manifest",
    ManifestName = "stream.m3u8",
    Outputs = new List<Encoding.Output> { manifestOutput }
});

We then add the audio TS muxing to the HLS manifest as media info, and give it a group name. Here you also define the name of the playlist file that will be created for the audio segments:

var mediaInfo = new MediaInfo
{
    GroupId = "audio",
    Name = "English",
    Uri = "audio.m3u8",
    Type = MediaType.AUDIO,
    SegmentPath = "audio/128kbps_hls/",
    StreamId = audioStream.Id,
    MuxingId = audioTsMuxing.Id,
    EncodingId = encoding.Id,
    Language = "en",
    AssocLanguage = "en",
    Autoselect = false,
    IsDefault = false,
    Forced = false
};
bitmovin.Manifest.Hls.AddMediaInfo(manifestHls.Id, mediaInfo);

Next, we create a variant stream for each video rendition and link it to the defined audio group. Here you also define the name of the video playlist file for the rendition:

bitmovin.Manifest.Hls.AddStreamInfo(manifestHls.Id, new StreamInfo
{
    Uri = "video_1080.m3u8",
    EncodingId = encoding.Id,
    StreamId = videoStream1080p.Id,
    MuxingId = videoTsMuxing1080p.Id,
    Audio = "audio",
    SegmentPath = "video/1080p_hls/"
});

As before, you repeat those steps for each TS audio or video muxing you want to include in the manifest. For the hypothetical 720p rendition, that would look like this:
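// Variant stream for the hypothetical 720p rendition
bitmovin.Manifest.Hls.AddStreamInfo(manifestHls.Id, new StreamInfo
{
    Uri = "video_720.m3u8",
    EncodingId = encoding.Id,
    StreamId = videoStream720p.Id,
    MuxingId = videoTsMuxing720p.Id,
    Audio = "audio",
    SegmentPath = "video/720p_hls/"
});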

Start the Live Encoding

Finally, we can start the live stream, passing both created manifests in the start call. You also have the option to set specific live options, such as the timeshift parameter, which defines how many seconds a user will be able to seek back in time. In the example below we allow users to seek back 5 minutes. For MPEG-DASH you can also define how far behind the real live signal the player will start playing back the segments. If you are not aiming for low-latency live streams, choose a value between 60 and 120 seconds to give the player enough room for buffering.

You can set the stream key (used in the definition of the RTMP entry point) to anything that makes sense for your setup, such as the channel name or event name. Or, you know, just "live". In the example below, we use "yourStreamKey".

bitmovin.Encoding.Encoding.StartLive(encoding.Id, new StartLiveEncodingRequest
{
    StreamKey = "yourStreamKey",
    HlsManifests = new List<LiveHlsManifest>
    {
        new LiveHlsManifest
        {
            ManifestId = manifestHls.Id,
            Timeshift = 300
        }
    },
    DashManifests = new List<LiveDashManifest>
    {
        new LiveDashManifest
        {
            ManifestId = manifestDash.Id,
            Timeshift = 300,
            LiveEdgeOffset = 90
        }
    }
});

Retrieve RTMP Ingest Point Information

Now that you have started the live stream, you need to wait for it to be ready before you can ingest your RTMP stream.
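A simple way to wait is to poll the live details endpoint until it responds successfully. The sketch below assumes that the call throws an exception while the encoder is still starting up, and that it returns a LiveDetails object; both are assumptions based on the full example:

// Poll until the live encoding is reachable (requires System and System.Threading)
LiveDetails liveEncoding = null;
while (liveEncoding == null)
{
    try
    {
        liveEncoding = bitmovin.Encoding.Encoding.RetrieveLiveDetails(encoding.Id);
    }
    catch (Exception)
    {
        Thread.Sleep(10000); // not ready yet, retry every 10 seconds
    }
}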

Once it is ready, you can easily query the live stream details and (for example) print them to the console:

liveEncoding = bitmovin.Encoding.Encoding.RetrieveLiveDetails(encoding.Id);
Console.WriteLine("Live stream started");
Console.WriteLine("Encoding ID: {0}", encoding.Id);
Console.WriteLine("IP: {0}", liveEncoding.EncoderIp);
Console.WriteLine("Rtmp URL: rtmp://{0}/live", liveEncoding.EncoderIp);
Console.WriteLine("Stream Key: {0}", liveEncoding.StreamKey);

The console output will contain the RTMP push URL and the stream key. You need to pass this information to your contribution encoder to configure the RTMP URL it should publish to.

As stated before, you should choose your encoding region wisely to get a stable RTMP ingest from your location. However, should the RTMP signal to our ingest point be interrupted, we will continue to stream the last image we received from your encoder as a still image, in order to avoid interrupting playback of the stream. As soon as your encoder is able to connect to our ingest point again, we will resume streaming your content.

Shut Down the Live Encoding

Because of this mechanism for providing continuous output, the live encoding won't stop automatically when the input feed stops. You therefore have to shut the encoding down at the end of your event, to avoid generating unnecessary costs in your account. The following line of code shows how to stop a live encoding:

bitmovin.Encoding.Encoding.StopLive(encoding.Id);

You can also use the Bitmovin dashboard to stop the encoding. The controls are shown in the Live Encoding details.
