HLS

HTTP Live Streaming, or HLS, is a multimedia network streaming protocol. The protocol is popular on Apple devices such as iPads, iPhones, and iMacs. Newer versions of Android and VLC can also play HLS.

Among the benefits of streaming over HTTP:

  • the ability to pass through network firewalls,
  • no special software is required to render HLS-encoded audio and video,
  • multiple bitrates in a single stream, allowing an easy response to changing network bandwidth.

A commonly cited disadvantage is the long latency (around 30 seconds with Apple's recommended implementation) caused by the buffering of A/V data that is exposed via HTTP.

Basic Functionality

HLS basic functionality works as follows.

The video to be streamed is encoded and stored in a specific container. The content is split into multiple segments of fixed duration, and each of these segments is added to a playlist called the index. The index and the segments are exposed to clients through an embedded web server. Whenever a client connects to the HLS server, it retrieves the index file to learn which segments to play and when to play each one.
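
To illustrate the client side of this flow, the following is a minimal sketch (not RidgeRun's implementation; the server address and index path are placeholders) that downloads the index and then fetches each listed segment:

import urllib.request

SERVER = "http://192.168.0.10"              # placeholder board IP
INDEX_URL = SERVER + "/streams/index.m3u"   # placeholder index location

def fetch(url):
    with urllib.request.urlopen(url) as response:
        return response.read()

# Retrieve the playlist (index) file first, as any HLS client does
index = fetch(INDEX_URL).decode("utf-8")

# Every non-comment, non-empty line in the playlist is a segment URI
for line in index.splitlines():
    line = line.strip()
    if not line or line.startswith("#"):
        continue
    # Segment URIs in the example index are absolute paths on the server
    segment = fetch(SERVER + line)
    print("downloaded %s (%d bytes)" % (line, len(segment)))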

Encoder

HLS only supports H.264 encoding. The H.264 properties must be set in compliance with the HLS specification in order to produce a valid HLS stream. These properties depend on various factors and must be set according to the nature of each stream. For example, for 720p video the encoder properties can be set as follows:

Property            Value
------------------  -------
encodingpreset      2
intraframeinterval  23
idrinterval         69
ratecontrol         2
profile             66
level               30
entropy             0
t8x8intra           0
seqscaling          0
targetbitrate       1500000
aud                 true
headers             true
bytestream          true

There are also technical recommendations that take into account the type of network connection and image resolution.

Muxer

The encoded video stream must be placed into a transport container. A good option is the MPEG transport stream muxer, available as the ffmux_mpegts GStreamer element. Avoid the mpegtsmux GStreamer element: it consumes considerably more CPU and is not as efficient.

Segmenter

The segmenter splits the video into files that can be independently retrieved from the web server.

Split Video

The video is split into segments of a fixed duration. Every new segment must start on a key frame, so the segmenter must ensure that each new file begins with one. In practice this means choosing the intraframeinterval and the segment duration so that the number of frames in a segment is a multiple of the key frame interval.

For example, if the segment duration is 3 seconds and the framerate is 30 fps, then choosing an intraframeinterval of 30 guarantees that every new segment starts with a key frame, since each segment contains 3 × 30 = 90 frames, which is divisible by the intraframeinterval.
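
As a quick sanity check, this relationship can be verified with a few lines of Python (the parameter names mirror the encoder properties above; the values are only examples):

def segment_starts_on_keyframe(segment_duration_s, framerate_fps, intraframeinterval):
    """Return True if every segment boundary lands exactly on a key frame."""
    frames_per_segment = segment_duration_s * framerate_fps
    return frames_per_segment % intraframeinterval == 0

# Example from the text: 3 s segments at 30 fps with a key frame every 30 frames
print(segment_starts_on_keyframe(3, 30, 30))   # True  (90 % 30 == 0)
print(segment_starts_on_keyframe(3, 30, 23))   # False (90 % 23 != 0)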

A common complication in the segmenter is determining whether a buffer is a key frame: this information is only available after the buffer has been encoded but before it is muxed, while the segmenter typically sits after the muxer.
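
One way to detect key frames before the muxer, sketched here with the GStreamer 1.x Python bindings (has_flags requires GStreamer 1.10 or newer; the element name is an assumption and this is not RidgeRun's segmenter code), is to install a buffer probe on the encoder's source pad:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def mark_keyframes(pad, info):
    buf = info.get_buffer()
    # Encoded buffers that are NOT delta units are key frames
    if not buf.has_flags(Gst.BufferFlags.DELTA_UNIT):
        # Record the timestamp so the segmenter knows where it may cut
        print("key frame at PTS %d ns" % buf.pts)
    return Gst.PadProbeReturn.OK

# 'encoder' is assumed to be the H.264 encoder element of an existing pipeline:
# encoder.get_static_pad("src").add_probe(Gst.PadProbeType.BUFFER, mark_keyframes)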

Update Index

Every time a new segment is created, the index file must be updated. The index file is a playlist that clients consult to know which files to retrieve and play, and in what order. A typical index file looks like the following:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:15
#EXT-X-TARGETDURATION:5
#EXTINF:3.000,
/streams/hls_segment_0015.ts
#EXTINF:3.000,
/streams/hls_segment_0016.ts
#EXTINF:3.000,
/streams/hls_segment_0017.ts

In addition, if the stream has finished, the following line must be added:

 #EXT-X-ENDLIST

Once a client reaches the end-list tag, it will no longer continue to download segments.

One important consideration is that changes to the index file must be atomic, so that clients never retrieve a corrupted (partially written) playlist.
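
A minimal sketch of such an atomic update, assuming a POSIX filesystem and using the segment names from the example above (this is not RidgeRun's segmenter code):

import os
import tempfile

def write_index(index_path, segment_paths, segment_duration=3.0,
                media_sequence=0, target_duration=5):
    """Atomically rewrite the playlist so clients never see a partial file."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        "#EXT-X-MEDIA-SEQUENCE:%d" % media_sequence,
        "#EXT-X-TARGETDURATION:%d" % target_duration,
    ]
    for path in segment_paths:
        lines.append("#EXTINF:%.3f," % segment_duration)
        lines.append(path)

    # Write to a temporary file in the same directory and rename it over the
    # old index; rename() is atomic on POSIX filesystems.
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(index_path))
    with os.fdopen(fd, "w") as tmp:
        tmp.write("\n".join(lines) + "\n")
    os.chmod(tmp_path, 0o644)        # make it readable by the web server
    os.replace(tmp_path, index_path)

# Example matching the playlist above
write_index("/hls/streams/index.m3u",
            ["/streams/hls_segment_%04d.ts" % n for n in (15, 16, 17)],
            media_sequence=15)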

Delete old segments

For privacy or storage space reasons, you might want to delete old segments; this functionality is optional. If old segments are not deleted, new clients that connect to the HLS server can watch the stream from the very beginning and seek within it. This, however, requires more storage space for the videos, which can be critical on limited devices such as embedded systems, so it is common to delete old files and expose only a real-time stream. In that case, newly connected clients are only able to retrieve the most recent segments.

Segments must not only be deleted from storage, but also removed from the index file, so that clients never request a non-existent file.

In the index file example above, notice how the segments start at hls_segment_0015.ts and end at hls_segment_0017.ts, because the segmenter continuously erases deprecated segments.
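
Combining this with the index update above, a sliding-window cleanup might look like the following sketch (the paths and the three-segment window simply match the example index; they are not mandated by HLS):

import os

def prune_segments(segment_dir, window=3):
    """Keep only the newest `window` segments on disk; return the survivors."""
    segments = sorted(f for f in os.listdir(segment_dir)
                      if f.startswith("hls_segment_") and f.endswith(".ts"))
    for old in segments[:-window]:
        os.remove(os.path.join(segment_dir, old))   # free storage space
    return segments[-window:]

# After pruning, rewrite the index so it only lists files that still exist,
# for example with the write_index() sketch shown earlier.
remaining = prune_segments("/hls/streams", window=3)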

Web Server Configuration

Once the segmenter is working and both the segments and the index file are being written to storage, a web server must be configured so that clients can retrieve the media.

Lighttpd

RidgeRun integrates the Lighttpd web server in the professional SDK. The web server starts automatically when the board boots. In order to integrate HLS with lighttpd, some lines must be added to the configuration file "lighttpd.conf" (typically at /etc/init.d/lighttpd.conf). In general, the HLS extensions must be added to the mimetype.assign list:

".m3u" => "audio/mpegURL" 
".ts" => "video/MP2T"

Some HLS clients only support index files in UTF-8 format. In this case the index must be written with that encoding and the mimetype entry must be changed to:

 ".m3u8" => "audio/mpegURL" 

An alias is also appended to the configuration file to map the public URL to the actual HLS location:

 alias.url = ( "/alias/path/" => "/actual/path/" ) 
 expire.url = ( "/alias/path/" => "access plus 3 seconds" )

where /actual/path/ is the directory where the index file and the transport stream segments are stored. Note that the contents of these paths are marked to expire, so they are not cached; this prevents clients from accidentally seeing the same video over and over again.

An example lighttpd.conf file follows:

server.document-root = "/srv/www/" 
server.modules = (  "mod_rewrite",
                    "mod_alias",
                    "mod_expire",
                    "mod_fastcgi",
                    "mod_accesslog" )
 
mimetype.assign = (
  ".html" => "text/html",
  ".txt" => "text/plain",
  ".ico" => "image/vnd.microsoft.icon",
  ".jpg" => "image/jpeg",
  ".png" => "image/png",
  ".css" => "text/css",
  ".js"  => "text/javascript",
  ".m3u" => "audio/mpegURL",
  ".ts" => "video/MP2T"
)

static-file.exclude-extensions = ( ".fcgi", "php", ".rb", "~", ".inc" )
index-file.names = ( "index.html", "index.php" )
 
alias.url = ( "/streams/" => "/hls/streams/" )
expire.url = ( "/streams/" => "access plus 3 seconds" ) 
 
server.pid-file = "/var/run/lighttpd.pid"
#server.use-ipv6 = "enable"
server.port = 80
accesslog.filename = "/var/log/lighttpd.log"

HTML

Once the web server is configured, an HTML file containing the HLS logic must be created inside /srv/www/. A simple example HTML file could be:

<html> 
   <head> 
     <title>RidgeRun HTTP Live Streaming (HLS)</title> 
   </head> 
   <body> 
     <video src="/alias/path/index.m3u" controls autoplay></video> 
   </body> 
</html>

where /alias/path/ is the alias configured in the lighttpd.conf file and index.m3u is the name of the index file. This HTML file is shipped with the test scripts in a file named "hls.html". Clients who wish to connect to the HLS stream must enter the following URL in a web browser: http://<IP address>/hls.html

Starting Point

Convenience Tools

Apple provides several convenience tools to help you start developing for HLS: segmenters, validators, tag generators, and playlist creators, among others. These tools can be downloaded from Apple's Using HTTP Live Streaming guide.

Technical Documentation