Analysis of the PLAY Command in live555

Preface

The previous post walked through the DESCRIBE command; this one looks at PLAY. Once the PLAY request has been handled and the response sent, the server has to start pushing video, so here we go through that flow at a high level. I hope it is of some help. Among blogs on live555, the best write-up of PLAY handling is "live555PLAY讲解", but some of the code in it looks outdated (the current sources have since changed) and its formatting is hard to bear, so I have corrected parts of it here. Interested readers may still want to refer to it.

Main Body

The basic processing flow here is the same as in the previous post: the request is received and handled directly in RTSPServer::RTSPClientSession. The rest you can easily find yourself; the core code is:

void RTSPServer::RTSPClientSession
::handleCmd_PLAY(RTSPServer::RTSPClientConnection* ourClientConnection,
    ServerMediaSubsession* subsession, char const* fullRequestStr) {
  ......
  fStreamStates[i].subsession->startStream(fOurSessionId,
  ......
}

void OnDemandServerMediaSubsession::startStream(unsigned clientSessionId,
                        void* streamToken,
                        TaskFunc* rtcpRRHandler,
                        void* rtcpRRHandlerClientData,
                        unsigned short& rtpSeqNum,
                        unsigned& rtpTimestamp,
                        ServerRequestAlternativeByteHandler* serverRequestAlternativeByteHandler,
                        void* serverRequestAlternativeByteHandlerClientData) {
  StreamState* streamState = (StreamState*)streamToken;
  Destinations* destinations
    = (Destinations*)(fDestinationsHashTable->Lookup((char const*)clientSessionId));
  if (streamState != NULL) {
    streamState->startPlaying(destinations, clientSessionId,
                  rtcpRRHandler, rtcpRRHandlerClientData,
                  serverRequestAlternativeByteHandler, serverRequestAlternativeByteHandlerClientData);
    ......
  }
}
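
A quick note on the Destinations lookup above: fDestinationsHashTable is a live555 HashTable created with ONE_WORD_HASH_KEYS, so the unsigned clientSessionId is simply cast to char const* and used as a one-word key, which is why the cast looks so odd. A minimal stand-alone sketch of that usage (FakeDestinations is just a stand-in for illustration, not live555's Destinations class):

#include "HashTable.hh"   // from live555's UsageEnvironment library
#include <cstdint>
#include <cstdio>

struct FakeDestinations { int rtpPort; };   // stand-in for live555's Destinations

int main() {
  // ONE_WORD_HASH_KEYS: the key is the word value itself, not a string.
  HashTable* table = HashTable::create(ONE_WORD_HASH_KEYS);

  unsigned clientSessionId = 0x12345678;   // example session id
  FakeDestinations dests = { 6970 };

  // Store and retrieve using the session id cast to a one-word key,
  // just like OnDemandServerMediaSubsession does:
  table->Add((char const*)(intptr_t)clientSessionId, &dests);
  FakeDestinations* found =
    (FakeDestinations*)table->Lookup((char const*)(intptr_t)clientSessionId);

  std::printf("rtpPort = %d\n", found != NULL ? found->rtpPort : -1);
  delete table;
  return 0;
}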

void StreamState
::startPlaying(Destinations* dests, unsigned clientSessionId,
           TaskFunc* rtcpRRHandler, void* rtcpRRHandlerClientData,
           ServerRequestAlternativeByteHandler* serverRequestAlternativeByteHandler,
           void* serverRequestAlternativeByteHandlerClientData) {

  ......
  fRTCPInstance = fMaster.createRTCP(fRTCPgs, fTotalBW, (unsigned char*)fMaster.fCNAME, fRTPSink);
  ......

  if (dests->isTCP) {
    // Change RTP and RTCP to use the TCP socket instead of UDP:
    if (fRTPSink != NULL) {
      fRTPSink->addStreamSocket(dests->tcpSocketNum, dests->rtpChannelId);
      RTPInterface
    ::setServerRequestAlternativeByteHandler(fRTPSink->envir(), dests->tcpSocketNum,
                         serverRequestAlternativeByteHandler, serverRequestAlternativeByteHandlerClientData);
        // So that we continue to handle RTSP commands from the client
    }
    if (fRTCPInstance != NULL) {
      fRTCPInstance->addStreamSocket(dests->tcpSocketNum, dests->rtcpChannelId);
      fRTCPInstance->setSpecificRRHandler(dests->tcpSocketNum, dests->rtcpChannelId,
                      rtcpRRHandler, rtcpRRHandlerClientData);
    }
  }

  ......
  fRTPSink->startPlaying(*fMediaSource, afterPlayingStreamState, this);
}

Boolean MediaSink::startPlaying(MediaSource& source,
                afterPlayingFunc* afterFunc,
                void* afterClientData) {
......
  return continuePlaying();
}

Boolean H264or5VideoRTPSink::continuePlaying() {
  ......
  return MultiFramedRTPSink::continuePlaying();
}
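
The part elided from H264or5VideoRTPSink::continuePlaying() is where, in the current sources, the input source gets wrapped in a fragmenter, so that NAL units too big for one RTP packet go out as FU-A fragments. As a rough, stand-alone illustration of FU-A framing per RFC 6184 (simplified; this is not live555's actual fragmenter, and maxPayload is just an example parameter):

#include <cstdint>
#include <vector>

// Split one H.264 NAL unit (including its 1-byte header) into FU-A payloads.
// Each returned buffer is what would go into the RTP payload of one packet.
std::vector<std::vector<uint8_t>> fragmentFUA(const uint8_t* nal, size_t nalSize,
                                              size_t maxPayload) {
  std::vector<std::vector<uint8_t>> out;
  if (maxPayload <= 2) return out;         // must leave room for the 2-byte FU overhead
  if (nalSize <= maxPayload) {             // small enough: single NAL unit packet
    out.emplace_back(nal, nal + nalSize);
    return out;
  }

  uint8_t nalHeader = nal[0];
  uint8_t fuIndicator = (nalHeader & 0xE0) | 28;   // keep F/NRI bits, type = 28 (FU-A)
  const uint8_t* payload = nal + 1;                // original header byte is not repeated
  size_t remaining = nalSize - 1;
  bool first = true;

  while (remaining > 0) {
    size_t chunk = remaining;
    if (chunk > maxPayload - 2) chunk = maxPayload - 2;
    bool last = (chunk == remaining);

    uint8_t fuHeader = nalHeader & 0x1F;             // original NAL type
    if (first) fuHeader |= 0x80;                     // S bit on the first fragment
    if (last)  fuHeader |= 0x40;                     // E bit on the last fragment

    std::vector<uint8_t> pkt;
    pkt.push_back(fuIndicator);
    pkt.push_back(fuHeader);
    pkt.insert(pkt.end(), payload, payload + chunk);
    out.push_back(std::move(pkt));

    payload += chunk;
    remaining -= chunk;
    first = false;
  }
  return out;
}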

Boolean MultiFramedRTPSink::continuePlaying() {
  // Send the first packet.
  // (This will also schedule any future sends.)
  buildAndSendPacket(True);
  return True;
}

void MultiFramedRTPSink::buildAndSendPacket(Boolean isFirstPacket) {

......
  packFrame();
}
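
What buildAndSendPacket() mainly does in the elided part is write the fixed 12-byte RTP header (RFC 3550) into fOutBuf, after which packFrame() appends the payload. A stand-alone sketch of that header layout (this is not live555's buffer code, which uses fOutBuf->enqueueWord() instead):

#include <cstdint>

// Write a 12-byte RFC 3550 fixed RTP header into buf (no CSRCs, no extension).
void writeRtpHeader(uint8_t buf[12], uint8_t payloadType, bool marker,
                    uint16_t seqNum, uint32_t timestamp, uint32_t ssrc) {
  buf[0] = 0x80;                                  // V=2, P=0, X=0, CC=0
  buf[1] = (uint8_t)((marker ? 0x80 : 0x00) | (payloadType & 0x7F));
  buf[2] = (uint8_t)(seqNum >> 8);                // sequence number, big-endian
  buf[3] = (uint8_t)(seqNum & 0xFF);
  buf[4] = (uint8_t)(timestamp >> 24);            // RTP timestamp, big-endian
  buf[5] = (uint8_t)(timestamp >> 16);
  buf[6] = (uint8_t)(timestamp >> 8);
  buf[7] = (uint8_t)(timestamp & 0xFF);
  buf[8] = (uint8_t)(ssrc >> 24);                 // SSRC, big-endian
  buf[9] = (uint8_t)(ssrc >> 16);
  buf[10] = (uint8_t)(ssrc >> 8);
  buf[11] = (uint8_t)(ssrc & 0xFF);
}

The sequence number and timestamp written here are the same values that startStream() reported back through rtpSeqNum and rtpTimestamp, which the PLAY response then advertises in its RTP-Info header.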

void MultiFramedRTPSink::packFrame() {

......
    fSource->getNextFrame(fOutBuf->curPtr(), fOutBuf->totalBytesAvailable(),
              afterGettingFrame, this, ourHandleClosure, this);
  }
}

All of this is much the same as before, so I won't go over how each frame is fetched in detail. What matters here is the afterGettingFrame callback that gets passed in: once a frame has been read from the H.264 file, it is this callback that pushes the data on toward the client, so let's take a brief look at it. The concrete call chain, exactly as in the previous post, ends up in:

void ByteStreamFileSource::doReadFromFile() {
  // Read (up to) one buffer's worth of data from the file:
  fFrameSize = fread(fTo, 1, fMaxSize, fFid);

  ......

  // Return to the event loop; (TaskFunc*)FramedSource::afterGetting then invokes
  // the callback pointer we registered, and the real work begins.
  nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
				(TaskFunc*)FramedSource::afterGetting, this);
}
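
ByteStreamFileSource is just one FramedSource. If you want to feed live555 from your own encoder instead of a file, the same contract applies: doGetNextFrame() copies at most fMaxSize bytes into fTo, fills in fFrameSize, fPresentationTime and fDurationInMicroseconds, and then arranges for FramedSource::afterGetting(this) to run, which is what fires the afterGettingFrame callback the sink registered. A minimal sketch of such a source (the class name and getFrameFromMyEncoder() are made up for illustration):

#include "FramedSource.hh"   // live555 liveMedia header
#include <sys/time.h>
#include <cstring>

// Hypothetical source that pulls frames from "some encoder"; only the
// FramedSource plumbing here follows the real live555 contract.
class MyEncoderSource : public FramedSource {
public:
  static MyEncoderSource* createNew(UsageEnvironment& env) {
    return new MyEncoderSource(env);
  }

protected:
  MyEncoderSource(UsageEnvironment& env) : FramedSource(env) {}

private:
  virtual void doGetNextFrame() {
    // (Hypothetical) fetch one encoded frame from somewhere:
    unsigned char frame[1500];
    unsigned frameLen = getFrameFromMyEncoder(frame, sizeof frame);

    // Copy at most fMaxSize bytes into the sink's buffer, noting truncation:
    if (frameLen > fMaxSize) {
      fNumTruncatedBytes = frameLen - fMaxSize;
      frameLen = fMaxSize;
    } else {
      fNumTruncatedBytes = 0;
    }
    memcpy(fTo, frame, frameLen);
    fFrameSize = frameLen;
    gettimeofday(&fPresentationTime, NULL);
    fDurationInMicroseconds = 40000;   // e.g. 25 fps

    // Return to the event loop before signalling completion, exactly like
    // ByteStreamFileSource above, to avoid deep recursion:
    nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
				(TaskFunc*)FramedSource::afterGetting, this);
  }

  // Placeholder: hook your encoder up here.
  unsigned getFrameFromMyEncoder(unsigned char* /*to*/, unsigned /*maxLen*/) { return 0; }
};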

Now we turn to today's topic: sending the data out.

void MultiFramedRTPSink
::afterGettingFrame(void* clientData, unsigned numBytesRead,
            unsigned numTruncatedBytes,
            struct timeval presentationTime,
            unsigned durationInMicroseconds) {
  MultiFramedRTPSink* sink = (MultiFramedRTPSink*)clientData;
  sink->afterGettingFrame1(numBytesRead, numTruncatedBytes,
               presentationTime, durationInMicroseconds);
}

void MultiFramedRTPSink
::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
             struct timeval presentationTime,
             unsigned durationInMicroseconds) {
  ......
  sendPacketIfNecessary();
}

void MultiFramedRTPSink::sendPacketIfNecessary() {
  ......
  if (!fRTPInterface.sendPacket(fOutBuf->packet(), fOutBuf->curPacketSize())) {
    ......
  }
  ......
  // Schedule sending of the next packet; from there the cycle above
  // (get a frame, pack it, send it) simply repeats, so there is nothing new to worry about.
  nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecondsToGo, (TaskFunc*)sendNext, this);
}
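
The uSecondsToGo used above comes from the elided part of sendPacketIfNecessary(): live555 keeps a "next send time" that it advances by each frame's durationInMicroseconds, and the delay is simply that target minus "now", clamped at zero. A stand-alone sketch of that calculation (simplified from the real code):

#include <sys/time.h>
#include <cstdint>

// Compute how long to wait (in microseconds) before sending the next packet,
// given the absolute time at which it is due. Never returns a negative value.
int64_t uSecondsUntil(const struct timeval& nextSendTime) {
  struct timeval now;
  gettimeofday(&now, NULL);

  int64_t secsDiff = (int64_t)nextSendTime.tv_sec - now.tv_sec;
  int64_t uSecondsToGo = secsDiff * 1000000 + (nextSendTime.tv_usec - now.tv_usec);
  if (uSecondsToGo < 0) uSecondsToGo = 0;   // already late: send immediately
  return uSecondsToGo;
}

// After each send, the target is advanced by the frame's duration, e.g.:
//   nextSendTime.tv_usec += durationInMicroseconds;
//   nextSendTime.tv_sec  += nextSendTime.tv_usec / 1000000;
//   nextSendTime.tv_usec %= 1000000;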

Boolean RTPInterface::sendPacket(unsigned char* packet, unsigned packetSize) {
  ......
  // For the UDP case the packet is handed to the groupsock; along that path
  // (inside Groupsock::output()) the outgoing-packet statistics are counted:
  statsOutgoing.countPacket(bufferSize);
  statsGroupOutgoing.countPacket(bufferSize);
  ......
}
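
One more detail: when the client asked for RTP-over-TCP (the dests->isTCP branch we saw in StreamState::startPlaying()), the packet is also written onto the RTSP TCP connection registered via addStreamSocket(), wrapped in the interleaved framing of RFC 2326 §10.12: a '$' byte, the channel id, a two-byte big-endian length, then the RTP/RTCP packet itself. A minimal sketch of that framing (not live555's internal code, which also has to cope with partial writes):

#include <cstdint>
#include <sys/socket.h>

// Send one RTP/RTCP packet over the RTSP TCP connection using RFC 2326
// "interleaved" framing: '$', channel id, 16-bit big-endian length, payload.
bool sendInterleaved(int rtspSocket, uint8_t channelId,
                     const uint8_t* packet, uint16_t packetSize) {
  uint8_t framing[4];
  framing[0] = '$';
  framing[1] = channelId;                      // e.g. the rtpChannelId from SETUP
  framing[2] = (uint8_t)(packetSize >> 8);
  framing[3] = (uint8_t)(packetSize & 0xFF);

  if (send(rtspSocket, framing, sizeof framing, 0) != (ssize_t)sizeof framing)
    return false;
  return send(rtspSocket, packet, packetSize, 0) == (ssize_t)packetSize;
}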

That is roughly the whole flow. I don't feel like redrawing the class diagram here; anyone who is interested can study it further, but the overall flow is all above. There are actually a couple of genuinely complex pieces left, such as the parsing of the video file itself, which I won't go into in detail here.
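
Finally, if you want to step through this whole PLAY path yourself, the easiest way is a minimal on-demand server along the lines of live555's own testOnDemandRTSPServer: serve an H.264 elementary-stream file, open the printed URL in a player such as VLC, and put breakpoints in the functions above. A sketch (the port, the stream name and the "test.264" file name are arbitrary examples):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  // Event loop plumbing:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // RTSP server listening on port 8554 (example port):
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // One on-demand session, backed by an H.264 elementary-stream file:
  ServerMediaSession* sms =
    ServerMediaSession::createNew(*env, "testStream", "testStream",
                                  "Session streamed from test.264");
  sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, "test.264",
                                                                   False /*reuseFirstSource*/));
  rtspServer->addServerMediaSession(sms);

  char* url = rtspServer->rtspURL(sms);
  *env << "Play this stream using: " << url << "\n";
  delete[] url;

  env->taskScheduler().doEventLoop();   // does not return
  return 0;
}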

Postscript

With this I can finally say I have more or less understood the live555 source code. There are still some regrets, though: I never tidied up the code properly. If I need to later, I will come back and study it again more carefully.
