1. Linux environment:
Download from the official site: http://www.live555.com/liveMedia/public/
live555 version: "2018.12.14"
Reference: http://www.live555.com/liveMedia/faq.html (read this FAQ carefully).
2. Building
Configure for the target platform and generate the corresponding Makefile.
2.1 ARM platform:
Set up the cross-compile toolchain:
cp config.armlinux config.arm
vi config.arm
CROSS_COMPILE?= arm-buildroot-linux-uclibcgnueabi-
Generate the Makefile: ./genMakefiles arm
2.2 Linux 64-bit platform (x86-64):
./genMakefiles linux-64bit
2.3 Linux 32-bit platform (x86):
./genMakefiles linux
make
This produces mediaServer/live555MediaServer.
3. Testing
3.1 The build produces live555MediaServer under mediaServer. Start it with a test stream file:
live555MediaServer test.264
On startup the server prints the stream's URL (rtsp://<server-ip>:8554/test.264 when run as an ordinary user; port 554 when run as root).
If the following error appears:
Correct this by increasing "OutPacketBuffer::maxSize" to at least 186818, before creating this 'RTPSink'. (Current value is 100000.)
increase OutPacketBuffer::maxSize in ServerMediaSession* createNewSMS() in DynamicRTSPServer.cpp:
if (strcmp(extension, ".264") == 0) {
  // Assumed to be a H.264 Video Elementary Stream file:
  NEW_SMS("H.264 Video");
  OutPacketBuffer::maxSize = 300000; // was 100000; allow for some possibly large H.264 frames
  sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(env, fileName, reuseSource));
}
createNewSMS() is called when the RTSP session is set up (it is invoked from DynamicRTSPServer::lookupServerMediaSession() when a client first requests the stream).
3.2 testProgs
The testProgs directory contains a number of test programs; the purpose and usage of each one is documented in detail on the official site. These test programs essentially all use a file as the input source. The two methods below, based mainly on modifying testH264VideoStreamer and testOnDemandRTSPServer, show how to use a live stream as the input source instead.
4. Using a live video stream as the input source instead of reading a file
The simplest method: push the live video stream into a FIFO (named pipe) or stdin, and pass the pipe's name as the input file name. This is not covered in detail here.
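Still, the idea can be sketched briefly. A minimal sketch, assuming a hypothetical pipe path such as /tmp/test.264 (live555 opens the pipe exactly like a regular file):

```cpp
// Sketch: create a FIFO that live555 can later open as if it were "test.264".
// The path is an example; error handling is kept minimal.
#include <cerrno>
#include <sys/stat.h>
#include <sys/types.h>

// Returns true if `path` exists as a FIFO afterwards.
bool ensureFifo(const char* path) {
  if (mkfifo(path, 0644) == 0) return true;   // freshly created
  if (errno != EEXIST) return false;          // real error
  struct stat st;                             // already existed: verify type
  return stat(path, &st) == 0 && S_ISFIFO(st.st_mode);
}
```

An encoder (or e.g. ffmpeg) then opens the pipe for writing and streams Annex-B H.264 NAL units into it, while live555MediaServer /tmp/test.264 reads from the other end.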
4.1 Method 1: based on testH264VideoStreamer
Following "liveMedia/DeviceSource.cpp", define an H264LiveVideoSource class that inherits from DeviceSource, and fill in its members.
The stock play() opens a file source:
void play() {
  // Open the input file as a 'byte-stream file source':
  ByteStreamFileSource* fileSource
    = ByteStreamFileSource::createNew(*env, inputFileName);
  ...
}
Here, H264LiveVideoSource replaces ByteStreamFileSource; the concrete code for the H264LiveVideoSource class is given later.
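Independent of the live555 glue (the DeviceSource subclass, doGetNextFrame(), event triggers), the heart of such a live source is a thread-safe frame queue that the capture/encoder thread fills and the source drains. A self-contained sketch, with hypothetical names:

```cpp
#include <condition_variable>
#include <cstdint>
#include <mutex>
#include <queue>
#include <vector>

// Thread-safe NAL-unit queue an H264LiveVideoSource could wrap.
// The encoder thread push()es encoded frames; the live555 side
// (not shown, it is library-specific) would pop() one per delivery.
class FrameQueue {
public:
  void push(std::vector<uint8_t> nal) {
    std::lock_guard<std::mutex> lk(m_);
    q_.push(std::move(nal));
    cv_.notify_one();           // wake a waiting consumer
  }
  // Blocks until a frame is available, then hands it over.
  std::vector<uint8_t> pop() {
    std::unique_lock<std::mutex> lk(m_);
    cv_.wait(lk, [this] { return !q_.empty(); });
    std::vector<uint8_t> f = std::move(q_.front());
    q_.pop();
    return f;
  }
  bool empty() {
    std::lock_guard<std::mutex> lk(m_);
    return q_.empty();
  }
private:
  std::mutex m_;
  std::condition_variable cv_;
  std::queue<std::vector<uint8_t>> q_;
};
```

Inside doGetNextFrame(), the source would take one frame from such a queue, copy it into fTo (clipped to fMaxSize), set fFrameSize and fPresentationTime, and call FramedSource::afterGetting(this).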
Modify main() in testH264VideoStreamer.cpp:
ServerMediaSession* sms
= ServerMediaSession::createNew(*env, "testStream", NULL,
"Session streamed by \"testH264VideoStreamer\"",
True /*SSM*/);
Modify the play() function as follows:
void play() {
  // Open the live video source (instead of a 'byte-stream file source'):
#if 1
  H264LiveVideoSource* fileSource = new H264LiveVideoSource(*env);
  if (fileSource == NULL) {
    *env << "Unable to open the live video source\n";
    exit(1);
  }
#else
  ByteStreamFileSource* fileSource
    = ByteStreamFileSource::createNew(*env, inputFileName);
  if (fileSource == NULL) {
    *env << "Unable to open file \"" << inputFileName
         << "\" as a byte-stream file source\n";
    exit(1);
  }
#endif
  FramedSource* videoES = fileSource;
  // Create a framer for the Video Elementary Stream:
  videoSource = H264VideoStreamFramer::createNew(*env, videoES);
  // Finally, start playing:
  *env << "Beginning to read from the live source...\n";
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}
4.2 Method 2: based on testOnDemandRTSPServer
1) Set the variable "reuseFirstSource" to "True".
2) Model a new class, H264LiveVideoServerMediaSubsession, on H264VideoFileServerMediaSubsession, providing implementations of the two pure virtual functions "createNewStreamSource()" and "createNewRTPSink()". In createNewStreamSource(), use the H264LiveVideoSource above in place of ByteStreamFileSource.
Inheritance chain of H264VideoRTPSink:
H264VideoRTPSink -> H264or5VideoRTPSink -> VideoRTPSink -> MultiFramedRTPSink -> RTPSink -> MediaSink -> Medium
Inheritance chain of H264VideoRTPSource:
H264VideoRTPSource -> MultiFramedRTPSource -> RTPSource -> FramedSource -> MediaSource -> Medium
Inheritance chain of H264VideoStreamFramer:
H264VideoStreamFramer -> H264or5VideoStreamFramer -> MPEGVideoStreamFramer -> FramedFilter -> FramedSource -> MediaSource -> Medium
The concrete implementation code is listed below.
#ifndef _H264_LIVE_VIDEO_SERVER_MEDIA_SUBSESSION_HH
#define _H264_LIVE_VIDEO_SERVER_MEDIA_SUBSESSION_HH
#include "OnDemandServerMediaSubsession.hh"
#include "liveMedia.hh"
#include "UsageEnvironment.hh"
#include "GroupsockHelper.hh"
class H264LiveVideoServerMediaSubsession: public OnDemandServerMediaSubsession
{
public:
H264LiveVideoServerMediaSubsession(UsageEnvironment & env,Boolean reuseFirstSource);
~H264LiveVideoServerMediaSubsession();
static H264LiveVideoServerMediaSubsession* createNew(UsageEnvironment& env,Boolean reuseFirstSource);
public: // new virtual functions, defined by all subclasses
virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
unsigned& estBitrate) ;
// "estBitrate" is the stream's estimated bitrate, in kbps
virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* inputSource);
virtual char const * getAuxSDPLine(RTPSink * rtpSink, FramedSource * inputSource);
static H264LiveVideoServerMediaSubsession* createNew(UsageEnvironment & env, FramedSource * source);
static void afterPlayingDummy(void * ptr);
static void chkForAuxSDPLine(void * ptr);
void chkForAuxSDPLine1();
private:
FramedSource * m_pSource;