FAQ LIVE555

Why do most RTP sessions use separate streams for audio and video? How can a receiving client synchronize these streams?

Sending audio and video in separate RTP streams provides a great deal of flexibility. For example, this makes it possible for a player to receive only the audio stream, but not video (or vice-versa). It would even be possible to have one computer receive and play audio, and a separate computer receive and play video.

These audio and video streams are synchronized using RTCP "Sender Report" (SR) packets - which map each stream's RTP timestamp to 'wall clock' (NTP) time. For more information, see the IETF's RTP/RTCP specification.

Receivers can then use this mapping to synchronize the incoming RTP streams. The LIVE555 Streaming Media code does this automatically: For subclasses of "RTPSource", the "presentationTime" parameter that's passed to the 'afterGettingFunc' of "getNextFrame()" (see "liveMedia/include/FramedSource.hh") will be an accurate, time-synchronized presentation time. (For this to work, you need to have also created an "RTCPInstance" for each RTP source.)

For example, if you use "openRTSP" to receive RTSP/RTP streams, then the contents of each RTP stream (audio and video) are written into separate files. This is done using the "FileSink" class. If you look at the "FileSink::afterGettingFrame()" member function, you'll notice that there's a "presentationTime" parameter for each incoming frame. A different receiving application could instead use this "presentationTime" parameter to synchronize the audio and video streams.
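As a rough illustration, here is a minimal sketch (with hypothetical names such as "MyRenderer" and "fReceiveBuffer") of how a receiving application might request frames from a source and use the synchronized "presentationTime" that gets delivered to the 'after getting' callback, modeled loosely on "FileSink::afterGettingFrame()". It assumes that a "FramedSource*" (e.g., an "H264VideoRTPSource") and an associated "RTCPInstance" have already been created:

  #include "liveMedia.hh"
  #include <cstdio>

  class MyRenderer { // hypothetical consumer class
  public:
    MyRenderer(FramedSource* source): fSource(source) {}

    void start() { requestNextFrame(); }

  private:
    void requestNextFrame() {
      // Ask the source for its next frame; "afterGettingFrame" is called when it arrives:
      fSource->getNextFrame(fReceiveBuffer, sizeof fReceiveBuffer,
                            afterGettingFrame, this,
                            onSourceClosure, this);
    }

    static void afterGettingFrame(void* clientData, unsigned frameSize,
                                  unsigned /*numTruncatedBytes*/,
                                  struct timeval presentationTime,
                                  unsigned /*durationInMicroseconds*/) {
      MyRenderer* self = (MyRenderer*)clientData;
      // "presentationTime" is already synchronized across streams (via RTCP "SR" packets),
      // so it can be used directly to schedule the rendering of this frame:
      self->render(self->fReceiveBuffer, frameSize, presentationTime);
      self->requestNextFrame();
    }

    static void onSourceClosure(void* /*clientData*/) { /* the stream has ended */ }

    void render(unsigned char* /*data*/, unsigned size, struct timeval pts) {
      // Hypothetical rendering step; here we just report the synchronized timestamp:
      fprintf(stderr, "Frame of %u bytes, presentation time %ld.%06ld\n",
              size, (long)pts.tv_sec, (long)pts.tv_usec);
    }

    FramedSource* fSource;
    unsigned char fReceiveBuffer[100000];
  };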

 

The "test*Streamer" test programs read from a file. Can I modify them so that they take input from a H.264 or MPEG encoder instead, so I can stream live (rather than prerecorded) video and/or audio?

Yes. The easiest way to do this is to change the appropriate "test*Streamer.cpp" file to read from "stdin" (instead of "test.*"), and then pipe the output of your encoder to (your modified) "test*Streamer" application. (Even simpler: if your operating system represents the encoder device as a file, then you can just use the name of that file (instead of "test.*").)
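For example, in "testH264VideoStreamer.cpp" (assuming that is the test program you are modifying), the change is a single line near the top of the file:

  // was: char const* inputFileName = "test.264";
  char const* inputFileName = "stdin"; // read the encoder's output from standard input

You could then run something like "your_encoder | testH264VideoStreamer".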

Alternatively, if your encoder presents you with a sequence of frames, rather than a sequence of bytes, then a more efficient solution would be to write your own "FramedSource" subclass that encapsulates your encoder, and delivers audio or video frames directly to the appropriate "*RTPSink" object. This avoids the need for an intermediate 'framer' filter that parses the input byte stream. (If, however, you are streaming H.264, or MPEG-4 (or MPEG-2 video with "B" frames), then you should insert the appropriate "*DiscreteFramer" filter between your source object and your "*RTPSink" object.)

For a model of how to do that, see "liveMedia/DeviceSource.cpp" (and "liveMedia/include/DeviceSource.hh"). You will need to fill in parts of this code to do the actual reading from your encoder.
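To give a feel for the shape of such a subclass, here is a heavily abridged sketch modeled on "DeviceSource" (the class name "MyEncoderSource" and the "readFrameFromEncoder()" call are hypothetical, and a real implementation - like "DeviceSource.cpp" itself - would normally deliver frames asynchronously, e.g. via an event trigger, rather than reading them synchronously inside "doGetNextFrame()"):

  #include "FramedSource.hh"
  #include <sys/time.h>
  #include <cstring>

  class MyEncoderSource: public FramedSource { // hypothetical class name
  public:
    static MyEncoderSource* createNew(UsageEnvironment& env) {
      return new MyEncoderSource(env);
    }

  protected:
    MyEncoderSource(UsageEnvironment& env): FramedSource(env) {}

  private:
    // Called by the downstream object (a "*DiscreteFramer" or "*RTPSink")
    // whenever it wants the next frame:
    virtual void doGetNextFrame() {
      unsigned char frameBuf[100000];
      unsigned frameSize = readFrameFromEncoder(frameBuf, sizeof frameBuf); // hypothetical

      // Deliver (at most "fMaxSize" bytes of) the frame to the downstream object:
      if (frameSize > fMaxSize) {
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = frameSize - fMaxSize;
      } else {
        fFrameSize = frameSize;
        fNumTruncatedBytes = 0;
      }
      gettimeofday(&fPresentationTime, NULL); // or use your encoder's own timestamps
      memmove(fTo, frameBuf, fFrameSize);

      // Tell the downstream object that a frame is now available:
      FramedSource::afterGetting(this);
    }

    // To be implemented against your encoder's API:
    unsigned readFrameFromEncoder(unsigned char* to, unsigned maxSize);
  };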

 

But what about the "testOnDemandRTSPServer" test program (for streaming via unicast)? How can I modify it so that it takes input from a live source instead of from a file?

First, you will need to modify "testProgs/testOnDemandRTSPServer.cpp" to set the variable "reuseFirstSource" to "True". This tells the server to use the same input source object, even if more than one client is streaming from the server concurrently.
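In "testOnDemandRTSPServer.cpp" this is a one-line change to a variable that already exists near the top of the file:

  // Make the RTSP server share one input source among all concurrent clients:
  Boolean reuseFirstSource = True; // the unmodified test program sets this to False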

Then, as above, if your input device is accessible by a file name (including "stdin" for standard input), then simply replace the appropriate "test.*" file name with the file name of your input device.

If, however, you have written your own "FramedSource" subclass (e.g., based on "DeviceSource", as noted above) to encapsulate your input source, then the solution is a little more complicated. In this case, you will also need to define and implement your own new subclass of "OnDemandServerMediaSubsession" that gets its input from your live source, rather than from a file. In particular, you will need to provide your own implementation of the two pure virtual functions "createNewStreamSource()" and "createNewRTPSink()". For a model of how to do this, see the existing "FileServerMediaSubsession" subclass that is used to stream your desired type of data from an input file. (For example, if you are streaming H.264 video, you would use "H264VideoFileServerMediaSubsession" as a model; a sketch of such a subclass also appears after the list below.) Note that:

  • Your "createNewStreamSource()" implementation will create (and return) an instance of your input source object. (You should also set the "estBitrate" result parameter to be the estimated bit rate (in kbps) of the stream. This estimate is used to determine the frequency of RTCP packets; it is not essential that it be accurate.)
  • Your "createNewRTPSink()" implementation will create (and return) an appropriate new "RTPSink" (subclass) object. (The code for this will usually be the same as the code for "createNewRTPSink()" in the corresponding "FileServerMediaSubsession" subclass.)

Is this code 'thread safe'? I.e., can it be accessed by more than one thread at the same time?

Short answer: No. As noted above, the code assumes a single thread of execution, using an event loop - rather than multiple threads - for concurrency. This generally makes the code much easier to debug, and much easier to port across multiple operating systems, which may have different thread APIs, or no thread support at all. (For even stronger arguments along these same lines, see John Ousterhout's presentation.)

Therefore, although it's true to say that the code is not 'thread safe', that statement is also somewhat misleading. It's like saying that a high-speed rail carriage is not 'airworthy'.

Longer answer: More than one thread can still use this code, if only one thread runs the library code proper, and the other thread(s) communicate with the library only by setting global 'flag' variables (such as event loop 'watch variables'), or by calling 'event triggers'. (Note that "triggerEvent()" is the only LIVE555 function that may be called from an external (i.e., non-LIVE555) thread.)
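As a rough sketch of that pattern (the "handleFrameReady()" handler and the worker thread here are hypothetical; the only LIVE555 calls made from the non-LIVE555 thread are "triggerEvent()" and the setting of the "doEventLoop()" 'watch variable'):

  #include "BasicUsageEnvironment.hh"
  #include <thread>

  char volatile eventLoopWatchVariable = 0;
  TaskScheduler* scheduler = NULL;
  EventTriggerId frameReadyTrigger = 0;

  // Runs in the LIVE555 (event loop) thread whenever the worker thread triggers the event:
  void handleFrameReady(void* /*clientData*/) {
    // ... e.g., hand a newly produced frame to a "FramedSource" subclass ...
  }

  // Runs in a separate (non-LIVE555) thread:
  void workerThread() {
    for (int i = 0; i < 100; ++i) {
      // ... produce a frame ...
      scheduler->triggerEvent(frameReadyTrigger, NULL); // safe to call from this thread
    }
    eventLoopWatchVariable = 1; // setting the 'watch variable' makes doEventLoop() return
  }

  int main() {
    scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    frameReadyTrigger = scheduler->createEventTrigger(handleFrameReady);

    std::thread worker(workerThread);
    env->taskScheduler().doEventLoop(&eventLoopWatchVariable); // runs in this (LIVE555) thread
    worker.join();
    return 0;
  }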

Another possible way to access the code from multiple threads is to have each thread use its own "UsageEnvironment" and "TaskScheduler" objects, and thus its own event loop. The objects created by each thread (i.e., using its own "UsageEnvironment") must not interact (except via global variables). Such a configuration is not recommended, however; instead, it is safer to structure such an application as multiple processes, not multiple threads.

In any case, when using the "LIVE555 Streaming Media" code, you should be familiar with event-driven programming, and understand that an event-driven application can perform at least as well as one that uses threads (unless you're actually running on a multiprocessor, in which case it's usually simpler to have your application consist of multiple processes (not just threads) - one running on each processor). Note, in particular, that you do not need multiple threads in order to transmit (or receive) multiple streams concurrently.
