How to Play HLS Live Stream using ExoPlayer


In the previous blog, we discussed the basics of Android ExoPlayer along with its advantages and disadvantages. You can find it here: Introduction to Android ExoPlayer.

In this blog, we are going to learn how to play a live stream using ExoPlayer. We will use HLS (HTTP Live Streaming) as the communication protocol to serve the multimedia content. HLS is an adaptive streaming protocol, so let's first discuss adaptive streaming, since it is central to how HLS works.

Adaptive Streaming:
Adaptive streaming provides multiple bit-rate streams to end users. The client player decides which stream plays best at the user's end, on the basis of parameters such as the client's computational capacity (CPU), internet bandwidth, and memory utilization. This is what gives adaptive streaming the best user experience.
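To make the idea concrete, here is a minimal, hypothetical sketch (plain Java, not ExoPlayer API) of the kind of decision an adaptive client makes: given the bitrates advertised by the server and the bandwidth the client has measured, pick the highest variant that safely fits.

```java
public class VariantSelector {

    // Hypothetical helper: pick the highest-bitrate variant that fits
    // within a safety fraction of the measured bandwidth. Falls back to
    // the first (lowest) variant when nothing fits.
    static int selectBitrate(int[] variantBitrates, long measuredBps, double fraction) {
        int best = variantBitrates[0];
        for (int bitrate : variantBitrates) {
            if (bitrate <= measuredBps * fraction && bitrate > best) {
                best = bitrate;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        int[] variants = {300_000, 800_000, 2_000_000};
        // 1.5 Mbps measured, 80% safety margin -> the 800 kbps variant wins.
        System.out.println(selectBitrate(variants, 1_500_000, 0.8)); // prints 800000
    }
}
```

Real players (including ExoPlayer's bandwidth meter) use smoothed estimates and hysteresis rather than a single snapshot, but the core trade-off is the same.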


The index file (.m3u8) contains the index of all the different variant streams and maintains the continuity and consistency of the stream during playback. For adaptive streaming, the player chooses files from the low-, medium-, or high-resolution variants according to the internet connection, and the indexes of those files are provided by the .m3u8 file. That is a brief introduction to HLS; now let's jump directly to the ExoPlayer implementation.
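For illustration, a master .m3u8 playlist pointing at three variants typically looks like the following (the URIs, bandwidths, and resolutions here are made-up examples):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=300000,RESOLUTION=416x234
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
medium/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
high/index.m3u8
```

Each variant URI in turn points to a media playlist listing the actual segment files for that bitrate.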

ExoPlayer supports HLS adaptive playback through the use of HlsSampleSource. This HlsSampleSource loads chunks of media data, and individual samples are extracted from those chunks.

HlsSampleSource is a SampleSource for HLS streams. The class exposes various constructors and methods that are useful during media playback. Some of them are listed below:

continueBuffering(int track, long playbackPositionUs)
Indicates to the source that it should still be buffering data for the specified track.

disable(int track)
Disable the specified track.

enable(int track, long positionUs)
Enable the specified track.

getBufferedPositionUs()
Returns an estimate of the position up to which data is buffered.

getFormat(int track)
Returns the format of the specified track.

getTrackCount()
Returns the number of tracks exposed by the source.
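As a quick, hypothetical sketch of how these query methods fit together (assuming an already-prepared sampleSource on ExoPlayer 1.x and a known playbackPositionUs):

```java
// Assumed: sampleSource has been prepared and track 0 has been enabled.
int trackCount = sampleSource.getTrackCount();          // tracks exposed by the source
MediaFormat format = sampleSource.getFormat(0);         // format of the first track
long bufferedUs = sampleSource.getBufferedPositionUs(); // buffered position estimate
boolean ready = sampleSource.continueBuffering(0, playbackPositionUs);
```

In practice the renderers call these for you; you would only touch them directly when building custom playback UI such as a buffering indicator.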

HlsSampleSource sampleSource = new HlsSampleSource(chunkSource, loadControl, MAIN_BUFFER_SEGMENTS * BUFFER_SEGMENT_SIZE);

Hence it is clear from the constructor that HlsSampleSource needs an HlsChunkSource, and for the HlsChunkSource we need a DataSource. Refer to the figure below:

Object model for HLS playbacks using ExoPlayer

HlsChunkSource is the source of HLS chunks. It holds chunks of video and audio before they are passed to HlsSampleSource.

Now let's look at the code-based implementation:

  1. LoadControl loadControl = new DefaultLoadControl(new DefaultAllocator(BUFFER_SEGMENT_SIZE));
  2. DefaultBandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
  3. PtsTimestampAdjusterProvider timestampAdjusterProvider = new PtsTimestampAdjusterProvider();
  4. DataSource dataSource = new DefaultUriDataSource(context, bandwidthMeter, userAgent);
  5. HlsChunkSource chunkSource = new HlsChunkSource(true /* isMaster */, dataSource, url, manifest, DefaultHlsTrackSelector.newDefaultInstance(context), bandwidthMeter, timestampAdjusterProvider, HlsChunkSource.ADAPTIVE_MODE_SPLICE);
  6. HlsSampleSource sampleSource = new HlsSampleSource(chunkSource, loadControl, MAIN_BUFFER_SEGMENTS * BUFFER_SEGMENT_SIZE);
  7. MediaCodecVideoTrackRenderer videoRenderer = new MediaCodecVideoTrackRenderer(context, sampleSource, MediaCodecSelector.DEFAULT, MediaCodec.VIDEO_SCALING_MODE_SCALE_TO_FIT);
  8. MediaCodecAudioTrackRenderer audioRenderer = new MediaCodecAudioTrackRenderer(sampleSource, MediaCodecSelector.DEFAULT);

In Line 1, the LoadControl object is created and BUFFER_SEGMENT_SIZE is provided to the allocator in the constructor. By default we keep this value as 64 * 1024.

In Lines 2 & 3, the DefaultBandwidthMeter & PtsTimestampAdjusterProvider objects are created; they will be needed in the renderer implementation.

In Line 4, the DataSource object is created by passing the context, the bandwidthMeter object, and the userAgent. To obtain the userAgent we can use the code below:

String userAgent = Util.getUserAgent(this, "ExoPlayerDemo");

In Line 5, the HlsChunkSource object is created from the DataSource object, and in Line 6 the HlsSampleSource object is created from the HlsChunkSource object.
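One thing Line 5 leaves implicit is where the manifest object comes from. In the ExoPlayer 1.x demo app it is fetched and parsed asynchronously with a ManifestFetcher before the chunk source is built; a sketch of that step (names follow the demo's HlsRendererBuilder; mainHandler is assumed to be a Handler on the main thread) looks roughly like this:

```java
// Fetch and parse the HLS playlist before building the chunk source.
HlsPlaylistParser parser = new HlsPlaylistParser();
ManifestFetcher<HlsPlaylist> playlistFetcher = new ManifestFetcher<>(
        url, new DefaultUriDataSource(context, userAgent), parser);

// singleLoad() runs asynchronously; onSingleManifest() receives the parsed
// playlist, which is the `manifest` passed to the HlsChunkSource constructor.
playlistFetcher.singleLoad(mainHandler.getLooper(),
        new ManifestFetcher.ManifestCallback<HlsPlaylist>() {
            @Override
            public void onSingleManifest(HlsPlaylist manifest) {
                // Build HlsChunkSource, HlsSampleSource and the renderers here.
            }

            @Override
            public void onSingleManifestError(IOException e) {
                // Report the failure to the player.
            }
        });
```

Because the fetch is asynchronous, Lines 5-8 of the listing above belong inside the onSingleManifest() callback.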

Now in Lines 7 & 8, we create the MediaCodecVideoTrackRenderer & MediaCodecAudioTrackRenderer objects from the SampleSource object.
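Once the renderers exist, they still have to be handed to a player instance. Assuming the ExoPlayer 1.x API and a surface obtained from the UI layer, the wiring is roughly:

```java
// Create a player for two renderers (video + audio) and start playback.
ExoPlayer player = ExoPlayer.Factory.newInstance(2);
player.prepare(videoRenderer, audioRenderer);

// Hand the output surface to the video renderer.
player.sendMessage(videoRenderer,
        MediaCodecVideoTrackRenderer.MSG_SET_SURFACE, surface);

player.setPlayWhenReady(true);
```

Remember to call player.release() when the hosting Activity is destroyed, or the codec resources will leak.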

For more understanding you can refer to the sample project, which is a modified version of the standard Google ExoPlayer demo project. In this sample you just need to enter the URL that needs to be played, and it starts playing seamlessly.


How to playback live mpeg2 stream?


DirectShow (Programmer), Jan 28, 2002
I have a "push" filter (based on the ball example) to receive the multicast stream from the network. I connected it to Microsoft's MPEG-2 demultiplexer and configured the demux in transport mode. When the graph starts running, the video window is created, but there is no moving picture. I guess the cause is that the demux is not being fed valid data. Has anyone met this problem and can give me some help?
Thanks,
Bruce An

Tmon (Visitor), Feb 5, 2002
I am interested in doing a similar thing. I might be able to help, but I haven't figured out how to use a push stream to receive streaming media. Can you direct me to the ball example or show me the code that you wrote for the push filter?

kanoun (Programmer), Feb 20, 2002
Maybe you could check the PID mapping of the output pin you've created on the demux filter. Another cause of trouble could be the format of the data on this pin (elementary stream?).

aresding (Programmer), Mar 26, 2002
I met this problem too. It's because Microsoft's MPEG-2 demultiplexer does not support the live stream. You can use the Elecard MPEG-2 decoder and demultiplexer; I use it and my application is working well now. You can download it from

DirectShow (Programmer), Mar 27, 2002
Dear Aresding,
Thank you for your help and advice. I want to play back the live transport stream multicast on the network. At first I made a source filter which supports the IFileSourceFilter interface; it can work with Elecard's demux and decoder, but the stream feeding was not perfect. So I want to make a push-mode source filter. Did you mean that in your application Elecard's demultiplexer can work in push mode, connected to a live push-mode source filter? Will you give some details about it?

posdnya (Visitor), Mar 27, 2002
Elecard has two demultiplexers: one works in pull mode with the Async Source and Elecard's SyncToAsync filters, and another works in push mode and can be used with something like a bouncing-ball-based MPEG TS push filter. To use the push-mode demultiplexer you must connect it to your source, then play the graph for a couple of seconds (sending real data to the demuxer); the demuxer will find all the streams in the TS and create the correct output pins for them. Then stop the graph, render the needed output pins on the demuxer, and start playing again.

aresding (Programmer), Mar 27, 2002
If I'm not wrong, Elecard's filters work in push mode, but they can be used in pull mode too.

DirectShow (Programmer), Mar 28, 2002
Dear Posdnya and Aresding,
Following your advice, I tried to connect my bouncing-ball-based push source filter to Elecard's demultiplexer. The push source filter offers the media type MEDIATYPE_Stream & MEDIASUBTYPE_NULL. But the connection failed, because the Elecard demultiplexer's input pin doesn't provide the IID_IMemInputPin interface. Did I use the wrong demultiplexer or provide the wrong media type? And where can I download Elecard's push-mode demultiplexer?
Thank you!
Bruce An, 2002/3/28

aresding (Programmer), Mar 28, 2002
How did you get the pin? I use code as follows, and it worked fine:

    IPin *pInputPin;
    hr = pElecardFilter->FindPin(L"Input", &pInputPin);
    if (FAILED(hr)) {
        MessageBox(NULL, "FindPin error", "", MB_OK);
        return hr;
    }
    pSourceFilter->Connect(pInputPin, &MediaType);

DirectShow (Programmer), Mar 29, 2002
Dear Aresding,
The following is my testing code. When connecting the pins, it still returns error 0x80004002 (E_NOINTERFACE). I think the error is returned by the Elecard demultiplexer's input pin, because it doesn't support the IMemInputPin interface. Do you think so? Are you sure your source filter is a push-mode filter?
Thanks,
Bruce An (mailto: brucean@szonline.net)

    // Create the push-mode source filter.
    hr = CoCreateInstance(CLSID_BouncingBall,
                          NULL,
                          CLSCTX_INPROC_SERVER,
                          IID_IBaseFilter,
                          reinterpret_cast<void**>(&pBouncingBall));
    hr = pGB->AddFilter(pBouncingBall, L"Bouncing Ball");

    IPin *pSourceOutputPin;
    pBouncingBall->FindPin(L"1", &pSourceOutputPin);

    // Create the demux filter.
    hr = CoCreateInstance(CLSID_ElecardDemultiplexer,
                          NULL,
                          CLSCTX_INPROC_SERVER,
                          IID_IBaseFilter,
                          reinterpret_cast<void**>(&pMpeg2Splitter));
    pGB->AddFilter(pMpeg2Splitter, L"Mpeg-2 demultiplexer");

    IPin *pInputPin;
    hr = pMpeg2Splitter->FindPin(L"Input", &pInputPin);

    CMediaType mt;
    mt.SetType(&MEDIATYPE_Stream);
    mt.SetSubtype(&MEDIASUBTYPE_NULL);
    mt.SetTemporalCompression(TRUE);
    mt.SetSampleSize(4096);
    hr = pSourceOutputPin->Connect(pInputPin, &mt);

aresding (Programmer), Mar 31, 2002
Sorry, my source filter is a pull-mode filter.

DirectShow (Programmer), Mar 31, 2002
Dear Aresding,
Thank you for your help and kindness. I have finished my job with Microsoft's MPEG-2 demultiplexer.
Best regards,
Bruce An

posdnya (Programmer), Apr 16, 2002
Elecard has finished a demo application which demonstrates live MPEG-2 playback. You may download it from the UDP Streaming Application page. It consists of server and client parts, and can work in unicast/multicast mode. Source code is available with the Elecard MPEG2 Video Decoder SDK.

For everyone's reference!