RTP - Video Stream Broadcasting

http://blog.csdn.net/Tinnal/archive/2008/09/03/2871734.aspx

This is a sender that encapsulates MPEG ES (elementary stream) data in RTP (RFC 3550) following RFC 2250. Learning RTP has been a genuinely hard road. The RTP programs you can collect on the net are all libraries that only take care of sending RTP packets, jrtplib and the like, and their demo programs do nothing but send a few strings or build little chat applications; neither at home nor abroad is there a demo tied to a real application. My goal was actually simple: write a video streaming server. It did not have to be complicated, I only had to understand the basic principles, because only then can you aim your effort where it counts. And ready-made servers are no better than the application-less RTP libraries: when you try searching baidu or google for "streaming media server" and "linux", what you turn up falls into just a few categories:

1. FFSERVER
     The demo that ships with FFMPEG. It is famous mainly because programs of this kind are so scarce. FFMPEG itself is excellent and I still use it, but this demo smells a bit of showmanship: it feels more like a demonstration of the FFMPEG libraries than a real video streaming server. Then again, that is probably exactly what its author wanted; it just is not what I wanted. For the codec part I lean heavily toward the FFMPEG "grab bag"; for the other parts I would rather pick other specialists.

2. Darwin, Helix
     Both are very famous pieces of software, and "software" is about all you can call them: even though Darwin comes with source code, code of that scale is not suitable for embedded use. As software they really are well known, and both are genuinely built to be operated commercially. But I am a developer, not a streaming service operator, so sorry, Apple; sorry, RealNetworks.

3. LIVE555
     If the two above were completely irrelevant to me (a conclusion, of course, reached painfully after being stuck for N weeks), then LIVE555 really did offer me a way out. It is a media solution of very manageable code size and yet remarkable power (I call it a solution because its feature set is so rich). For quite a while I tried to understand its source code, but like many people on the net I eventually gave up; after all, when you fold so many different things together the framework becomes complex, because all those quite different things have to be abstracted layer after layer until they look the same (philosophy, really). Its complex structure was one reason I stopped analyzing it, but not the main one, and it is not nearly as bad as many people online claim. If you are a C++ enthusiast you may even fall in love with this code; for a C devotee, of course, it is torture. For now let me count myself among the C++ enthusiasts; I genuinely admire this code. The main reason was that I did not want to be tied to any single library: LIVE555 does have codec capability, but I would rather it did nothing beyond the server's job.

     So in the end I came back to the old road: with no help to be had, you have to help yourself and start from the most basic RFCs. After N days (weeks) of English I finally understood how to carry MPEG data in RTP, and along the way LIVE555 still helped a great deal (through analysis of LIVE555 packets captured with Ethereal). Let me post the program first; I will write up the theory when I have time. The program is a single .cpp file and compiles under VS.NET 2003. The video file to play is in http://www.cnitblog.com/Files/tinnal/ES流解释程序.rar, and the client is VLC, downloadable from http://www.videolan.org/. In VLC choose "Open Network Stream", select "UDP/RTP", enter the program's output port 1000, and only then run the program; you will then see the test broadcast video in VLC (a command-line equivalent is sketched below). If your IP is different, just change it in the source. As for the "theory", it comes down to the key parts of RFC 3550, RFC 2250 and ISO/IEC 13818-2.
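For reference, VLC can also be told to listen for the stream from the command line; with the default port 1000 from the source, something like the following should be equivalent to the GUI steps above:

    vlc rtp://@:1000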

 

   // MPEG2RTP.h
#include <stdio.h>
#include <stdlib.h>
#include <conio.h>
#include <string.h>

#include <winsock2.h>


//#include "mem.h"

// MPEG-1/2 start codes (ISO/IEC 13818-2)
#define PACK_STARTCODE              (unsigned int)0x000001ba
#define SYSTEM_HEADER_STARTCODE     (unsigned int)0x000001bb
#define PICTURE_START_CODE          (unsigned int)0x00000100
#define GROUP_START_CODE            (unsigned int)0x000001B8    /* group_start_code is 0x000001B8 */
#define ISO_11172_ENDCODE           (unsigned int)0x000001b9
#define SEQUENCE_HEADER_CODE        (unsigned int)0x000001b3

#define PACKET_BUFFER_END           (unsigned int)0x00000000    /* sentinel: no start code left in the buffer */


#define MAX_RTP_PKT_LENGTH      1440            /* keep each RTP packet below a typical Ethernet MTU */
#define HEADER_LENGTH           16              /* 12-byte RTP fixed header + 4-byte MPEG video-specific header */
#define DEST_IP                 "192.168.0.98"
#define DEST_PORT               1000
#define MPA                     14              /* static RTP payload type for MPEG audio (RFC 3551) */
#define MPV                     32              /* static RTP payload type for MPEG video (RFC 3551) */

typedef struct
{
    /* byte 0 */
    unsigned char csrc_len:4;       /* CC: CSRC count, 0 here */
    unsigned char extension:1;      /* X: header extension flag, 0 here */
    unsigned char padding:1;        /* P: padding flag, 0 here */
    unsigned char version:2;        /* V: RTP version, always 2 */
    /* byte 1 */
    unsigned char payload:7;        /* PT: payload type, MPV (32) for MPEG video */
    unsigned char marker:1;         /* M: set on the last packet of a picture */
    /* bytes 2, 3 */
    unsigned short seq_no;
    /* bytes 4-7 */
    unsigned long timestamp;
    /* bytes 8-11 */
    unsigned long ssrc;             /* the stream number is used here */
} RTP_FIXED_HEADER;                 /* bitfield order assumes LSB-first allocation (MSVC/GCC on x86) */
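/* Side note (illustrative sketch, not used by the program): the bitfield struct
   above matches the RTP wire format only because MSVC/GCC on x86 allocate
   bitfields LSB-first. A compiler-independent way to build the same 12-byte
   RFC 3550 fixed header is to pack the bytes explicitly: */
static void pack_rtp_fixed_header(unsigned char out[12], int marker, int payload_type,
                                  unsigned short seq, unsigned long ts, unsigned long ssrc)
{
    out[0]  = 0x80;                                                 /* V=2, P=0, X=0, CC=0 */
    out[1]  = (unsigned char)(((marker & 1) << 7) | (payload_type & 0x7f));
    out[2]  = (unsigned char)((seq  >>  8) & 0xff);  out[3]  = (unsigned char)(seq  & 0xff);
    out[4]  = (unsigned char)((ts   >> 24) & 0xff);  out[5]  = (unsigned char)((ts   >> 16) & 0xff);
    out[6]  = (unsigned char)((ts   >>  8) & 0xff);  out[7]  = (unsigned char)(ts   & 0xff);
    out[8]  = (unsigned char)((ssrc >> 24) & 0xff);  out[9]  = (unsigned char)((ssrc >> 16) & 0xff);
    out[10] = (unsigned char)((ssrc >>  8) & 0xff);  out[11] = (unsigned char)(ssrc & 0xff);
}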

typedef struct {
    /* byte 0 */
    unsigned char TR_high2:2;       /* Temporal Reference, high 2 bits */
    unsigned char T:1;              /* video-specific header extension flag */
    unsigned char MBZ:5;            /* must be zero */
    /* byte 1 */
    unsigned char TR_low8:8;        /* Temporal Reference, low 8 bits */
    /* byte 2 */
    unsigned char P:3;              /* picture type: 1=I, 2=P, 3=B, 4=D */
    unsigned char E:1;              /* set if the last byte of the payload ends a slice */
    unsigned char B:1;              /* set if the payload contains the start of a slice */
    unsigned char S:1;              /* sequence header present flag */
    unsigned char N:1;              /* N bit; used for MPEG-2 */
    unsigned char AN:1;             /* Active N bit */
    /* byte 3 */
    unsigned char FFC:3;            /* forward_f_code */
    unsigned char FFV:1;            /* full_pel_forward_vector */
    unsigned char BFC:3;            /* backward_f_code */
    unsigned char FBV:1;            /* full_pel_backward_vector */
} MPEG_VID_SPECIFIC_HDR;            /* 4 bytes (RFC 2250, section 3.4) */
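/* Side note (illustrative sketch, not used by the program): the same 4-byte
   MPEG video-specific header, packed explicitly in the bit order of RFC 2250
   section 3.4, independent of compiler bitfield layout: */
static void pack_mpeg_vid_specific_hdr(unsigned char out[4],
                                       unsigned int TR, unsigned int P,
                                       unsigned int S, unsigned int B, unsigned int E,
                                       unsigned int FBV, unsigned int BFC,
                                       unsigned int FFV, unsigned int FFC)
{
    out[0] = (unsigned char)((TR >> 8) & 0x03);                     /* MBZ=0, T=0, TR bits 9..8 */
    out[1] = (unsigned char)(TR & 0xff);                            /* TR bits 7..0 */
    out[2] = (unsigned char)(((S & 1) << 5) | ((B & 1) << 4) |      /* AN=0, N=0 */
                             ((E & 1) << 3) | (P & 0x07));
    out[3] = (unsigned char)(((FBV & 1) << 7) | ((BFC & 7) << 4) |
                             ((FFV & 1) << 3) | (FFC & 7));
}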


enum reading_status {
    SLICE_AGAIN,        /* a further slice boundary has been found inside the current packet */
    SLICE_BREAK,        /* the packet ended in the middle of a slice; it continues in the next packet */
    UNKNOWN,            /* a broken slice was just flushed; waiting for the next start code */
    SLICE,              /* inside the first slice of the current packet */
    SEQUENCE_HEADER,    /* the last start code seen was a sequence header */
    GROUP_START,        /* the last start code seen was a group-of-pictures header */
    PICTURE             /* the last start code seen was a picture header */
};

void validate_file();
float frame_rate(int buffer_index);
unsigned int read_picture_type(int buffer_index);
unsigned int read_FBV(int buffer_index);
unsigned int read_BFC(int buffer_index);
unsigned int read_FFV(int buffer_index);
unsigned int read_FFC(int buffer_index);
unsigned int extract_temporal_reference(int buffer_index);
unsigned int find_next_start_code(unsigned int *buffer_index);
void reset_buffer_index(void);
BOOL InitWinsock();

 

//MPEG2RTP.cpp
//This program is only meant for learning and testing RTP encapsulation of MPEG-2 data; it is not intended for any other use.
//It compiles under VS.NET 2003; with minor changes it should also compile under Linux.
//Tested with VLC: VLC correctly receives and decodes the TEST.MPV elementary stream sent by this program.
//
//Author: Feng Fuqiu (Tinnal)
//E-mail: tinnal@163.com


#include "MPEG2RTP.h"

#pragma   comment(lib,"Ws2_32")

unsigned char            buf[MAX_RTP_PKT_LENGTH + 4]; //input buffer
enum reading_status        state = SEQUENCE_HEADER;
unsigned int            g_index_in_packet_buffer = HEADER_LENGTH;
static unsigned long    g_time_stamp = 0;
static unsigned long    g_time_stamp_current =0;
static float            g_frame_rate = 0;
static unsigned int        g_delay_time = 0;
static unsigned int        g_timetramp_increment = 0;
FILE    *mpfd;
SOCKET    socket1;
RTP_FIXED_HEADER        *rtp_hdr;
MPEG_VID_SPECIFIC_HDR    *mpeg_hdr;

#if 0
void Send_RTP_Packet(unsigned char *buf,int bytes)
{

    int i = 0;
    int count = 0;

    printf("/nPacket length %d/n",bytes);
    printf("RTP Header: [M]:%s [sequence number]:0x%lx [timestamp]:0x%lx/n",
        rtp_hdr->marker == 1?"TRUE":"FALSE",
        rtp_hdr->seq_no,
        rtp_hdr->timestamp);
    printf(" [TR]:%d [AN]:%d [N]:%d [Sequence Header]:%s /
        /n [Begin Slice]:%s [End Slice]:%s /
        /n [Pictute Type]:%d /
        /n [FBV]:%d [BFC]:%d [FFV]:%d [FFC]:%d/n",
        (mpeg_hdr->TR_high2 << 8 | mpeg_hdr->TR_low8),
        mpeg_hdr->AN, mpeg_hdr->N, mpeg_hdr->S == 1?"TRUE":"FALSE",
        mpeg_hdr->B ==1?"TRUE":"FALSE", mpeg_hdr->E ==1?"TRUE":"FALSE",
        mpeg_hdr->P,
        mpeg_hdr->FBV, mpeg_hdr->BFC, mpeg_hdr->FFV, mpeg_hdr->FFC);

    while(bytes --)
    {
        printf("%02x ",buf[count++]);
        if(++i == 16)
        {
            i=0;
            printf("\n");
        }
    }
    printf("\n");

}

#else

int Send_RTP_Packet(unsigned char *buf,int bytes)
{
    return send( socket1,  (char*) buf, bytes, 0 );
}

#endif


int main(int argc, char *argv[])
{
    unsigned int next_start_code;
    unsigned int next_start_code_index;
    unsigned int sent_bytes;
    unsigned short seq_num =0;
    unsigned short stream_num = 10;
    struct sockaddr_in server;
    int len =sizeof(server);


#if 0
    mpfd = fopen("E://tinnal//live555//vc_proj//es//Debug//test.mpv", "rb"); 
#else
    if (argc < 2)
    {
        printf("/nUSAGE: %s mpegfile/nExiting../n/n",argv[0]);
                exit(0);
    }
    mpfd = fopen(argv[1], "rb");
#endif

 

    if (mpfd == NULL )
    {
        printf("/nERROR: could not open input file %s/n/n",argv[1]);
        exit(0);
    }
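    /* buf layout: bytes 0..11 hold the RTP fixed header, bytes 12..15 the MPEG
       video-specific header, and the MPEG ES payload starts at HEADER_LENGTH (16) */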
    rtp_hdr = (RTP_FIXED_HEADER*)&buf[0];
    mpeg_hdr = (MPEG_VID_SPECIFIC_HDR*)&buf[12];

    memset((void *)rtp_hdr,0,12); //zero-out the rtp fixed hdr
    memset((void *)mpeg_hdr,0,4); //zero-out the video specific hdr
    memset((void *)buf,0,MAX_RTP_PKT_LENGTH + 4);

    InitWinsock();

    server.sin_family=AF_INET;
    server.sin_port=htons(DEST_PORT);          //the destination (receiver) port
    server.sin_addr.s_addr=inet_addr(DEST_IP); //the destination (receiver) address

    socket1=socket(AF_INET,SOCK_DGRAM,0);
    connect(socket1, (const sockaddr *)&server, len) ;

    //read the first packet from the mpeg file
    //always read 4 extra bytes in (in case there's a startcode there)
    //but dont send  more than MAX_RTP_PKT_LENGTH in one packet
    fread(&(buf[HEADER_LENGTH]), MAX_RTP_PKT_LENGTH-HEADER_LENGTH+4, 1,mpfd);

    validate_file();

    do
    {   

        /* initialization of the two RTP headers */
        rtp_hdr->seq_no     = htons(seq_num ++);
        rtp_hdr->payload     = MPV;
        rtp_hdr->version     = 2;
        rtp_hdr->marker    = 0;
        rtp_hdr->ssrc        = htonl(stream_num);   

        mpeg_hdr->S = mpeg_hdr->E = mpeg_hdr->B= 0;

        do{
            next_start_code = find_next_start_code(&next_start_code_index);


            if ((next_start_code >0x100) && (next_start_code<0x1b0) )
            {
                // start codes 0x101..0x1AF are slice start codes: the first slice
                // after header data (or after a packet break) sets the B flag
                if(state == SEQUENCE_HEADER
                    || state ==GROUP_START
                    || state ==PICTURE
                    || state == UNKNOWN)
                {
                    state = SLICE;
                    mpeg_hdr->B = 1;
                }
                // a further slice start code: the previous slice ends just before it,
                // so remember this position as a possible packet boundary (E flag)
                else if (state == SLICE ||state == SLICE_AGAIN)
                {

                    state = SLICE_AGAIN;
                    sent_bytes = next_start_code_index;
                    mpeg_hdr->E     = 1;
                }
                // a slice carried over from the previous packet ends here: send up to this start code
                else if (state == SLICE_BREAK)
                {
                    state = UNKNOWN;
                    sent_bytes = next_start_code_index;
                    mpeg_hdr->E     = 1;
                    goto Sent_Packet;
                }

            }

            switch(next_start_code)
            {
            case SEQUENCE_HEADER_CODE:
                // a sequence header following slice data closes the current packet
                if(state == SLICE || state == SLICE_AGAIN)
                {               
                    state = SEQUENCE_HEADER;
                    sent_bytes = next_start_code_index;
                    // the previous picture ends in this packet: set the RTP marker bit
                    rtp_hdr->marker = 1;
                    goto Sent_Packet;
                }

                state = SEQUENCE_HEADER;
                g_frame_rate = frame_rate(next_start_code_index);
                g_delay_time = (unsigned int)(1000.0 / g_frame_rate +0.5); //ms
                g_timetramp_increment = (unsigned int)(90000.0 / g_frame_rate +0.5); //90K Hz
                mpeg_hdr->S=1;
                break;

            case GROUP_START_CODE:
                // a GOP header following slice data closes the current packet
                if(state == SLICE || state == SLICE_AGAIN)
                {
                    state = GROUP_START;
                    sent_bytes = next_start_code_index;
                    // the previous picture ends in this packet: set the RTP marker bit
                    rtp_hdr->marker = 1;
                    goto Sent_Packet;
                }   

                state = GROUP_START;
                break;

            case PICTURE_START_CODE:
                // a new picture following slice data closes the current packet
                if(state == SLICE || state == SLICE_AGAIN)
                {
                    state = PICTURE;
                    sent_bytes = next_start_code_index;
                    // the previous picture ends in this packet: set the RTP marker bit
                    rtp_hdr->marker = 1;
                    goto Sent_Packet;
                }

                state = PICTURE;

                mpeg_hdr->TR_high2    = (extract_temporal_reference(next_start_code_index) & 0x300 )>> 8;
                mpeg_hdr->TR_low8     =  extract_temporal_reference(next_start_code_index) & 0xff;
                mpeg_hdr->P             = read_picture_type(next_start_code_index);
                //now read the motion vectors information
                if( (mpeg_hdr->P==2) || (mpeg_hdr->P==3))
                { //if B- or P-type picture, need forward mv
                    mpeg_hdr->FFV = read_FFV(next_start_code_index);
                    mpeg_hdr->FFC = read_FFC(next_start_code_index);
                }
                if( mpeg_hdr->P==3)
                { // if B-type pictue, need backward mv
                    mpeg_hdr->FBV = read_FBV(next_start_code_index);
                    mpeg_hdr->BFC = read_BFC(next_start_code_index);
                }

                // advance the 90 kHz RTP timestamp by one frame period per picture;
                // only I and P pictures update the timestamp actually placed in the packets
                if( mpeg_hdr->P== 1 || mpeg_hdr->P == 2 ){
                    g_time_stamp += g_timetramp_increment;
                    g_time_stamp_current = g_time_stamp;
                }else{
                    g_time_stamp += g_timetramp_increment;
                }
               

                break;

            case PACKET_BUFFER_END:
                // buffer exhausted with a complete slice already marked: send up to that boundary
                if(state == SLICE_AGAIN) {
                    state = UNKNOWN;
                    goto Sent_Packet;
                }

                // buffer exhausted in the middle of the packet's only slice:
                // send it all and continue the slice in the next packet
                if(state == SLICE)
                {
                    state = SLICE_BREAK;
                    sent_bytes = next_start_code_index;
                    goto Sent_Packet;
                }

                // still inside a slice spanning several packets: keep sending full packets
                if(state == SLICE_BREAK )
                {
                    state = SLICE_BREAK;
                    sent_bytes = next_start_code_index;
                    goto Sent_Packet;
                }

                break;
            }
        }while(next_start_code != PACKET_BUFFER_END);

Sent_Packet:
        rtp_hdr->timestamp = htonl(g_time_stamp_current);
        Send_RTP_Packet(buf, sent_bytes);
       
        //copy the tail data to the head of the packet buffer
        memmove(&buf[HEADER_LENGTH], &buf[sent_bytes], MAX_RTP_PKT_LENGTH-sent_bytes);
        //reset the buffer index to zero
        reset_buffer_index();
        //reading data into buffer again
        fread(&(buf[(MAX_RTP_PKT_LENGTH-sent_bytes)+HEADER_LENGTH]), sent_bytes -HEADER_LENGTH , 1,mpfd);

        // sleep g_delay_time msec for sending next picture data
        if(rtp_hdr->marker ==1) Sleep( g_delay_time );

    }while(!feof(mpfd));
    closesocket(socket1);
    fclose(mpfd);

    printf("stream end./n");
}

//==================================================================

unsigned int find_next_start_code(unsigned int *next_start_code_index) //NOTE: all start codes ARE byte-aligned
{
    unsigned int byte0=0,byte1=0,byte2=0,byte3=0,startcode=0;

    //while not startcode and have not exceeded max packet length
    while (g_index_in_packet_buffer < MAX_RTP_PKT_LENGTH)
    {
        if (buf[g_index_in_packet_buffer+0] == 0
            && buf[g_index_in_packet_buffer+1] == 0
            && buf[g_index_in_packet_buffer+2] ==1)
        {
            //printf("FOUND startcode %d/n",indx);
            byte0=(int)buf[g_index_in_packet_buffer+0];
            byte1=(int)buf[g_index_in_packet_buffer+1];
            byte2=(int)buf[g_index_in_packet_buffer+2];
            byte3=(int)buf[g_index_in_packet_buffer+3];
            startcode=(byte0 << 24) + (byte1 << 16) + (byte2 << 8) + byte3;
            *next_start_code_index = g_index_in_packet_buffer;
            g_index_in_packet_buffer = g_index_in_packet_buffer+4;
            return(startcode);
        }
        else
            g_index_in_packet_buffer++;
    }

    // reached the end of the packet buffer without finding another start code
    if (g_index_in_packet_buffer >= (MAX_RTP_PKT_LENGTH))
    {
        *next_start_code_index = g_index_in_packet_buffer -1;
        g_index_in_packet_buffer = HEADER_LENGTH;
        return PACKET_BUFFER_END;
    }

    printf("Error reading buffer../n");
    exit(-1);
    return -1;
}

void reset_buffer_index(void)
{
    g_index_in_packet_buffer = HEADER_LENGTH;
}

 

//========================================================
float frame_rate(int buffer_index)
{
    unsigned char frame_rate_code;
    frame_rate_code = (unsigned char)buf[buffer_index +7] & 0xf;
    switch(frame_rate_code)
    {
    case 0x1:
        return 23.976;
    case 0x2:
        return 24.0;
    case 0x3:
        return 25.0;
    case 0x4:
        return 29.97;
    case 0x5:
        return 30.0;
    case 0x6:
        return 50.0;
    case 0x7:
        return 59.94;
    case 0x8:
        return 60.0;
    default:
        return 0;
    }
}
//========================================================
unsigned int extract_temporal_reference(int buffer_index) // 10 bits
{
    unsigned int low2bits=0,TR=0; // TR = temporal reference;

    TR = (unsigned int) (buf[buffer_index+4]);
    TR <<= 2;
    low2bits = (unsigned int) (buf[buffer_index+5]);
    TR |= (low2bits >> 6);
    return(TR);
}

//========================================================

unsigned int read_picture_type(int buffer_index)
{
    unsigned int pictype=0;

    pictype = (unsigned int) buf[buffer_index+5];
    pictype = (pictype >> 3) & (0x7);
    return (pictype);
}

//=======================================================
unsigned int read_FFV(int buffer_index) // 1 bit
{
    return( (int) ((buf[buffer_index+7] & (0x4)) >> 2));
}
//=======================================================
unsigned int read_FFC(int buffer_index) // 3 bits
{
    unsigned int FFC=0,lowbit=0;
    FFC = (int) (buf[buffer_index+7] & (0x3));
    FFC <<= 1;
    lowbit = (int) ((buf[buffer_index+8]) & (0x80));
    FFC = FFC | (lowbit >> 7 );
    return(FFC);
}

//=======================================================
unsigned int read_FBV(int buffer_index) // 1 bit
{      
    return( (int) ((buf[buffer_index+8] & (0x40))>>6) );
}

//=======================================================
unsigned int read_BFC(int buffer_index) // 3 bits
{               
    return( (int) ( (buf[buffer_index+8] & (0x38) ) >> 3 ) );
}

void validate_file()
{
    /* to validate the file, ensure the existence of a start code */
    int j=0,valid=0;

    while ((j++<MAX_RTP_PKT_LENGTH) && (!valid))
    {
        if (!((int)buf[j+0] + (int)buf[j+1]) && (((int)buf[j+2])==1))
            valid=1;
    }
    if (!valid)
    {
        printf("/nERROR: start code not found. /
               /nInput file must be a valid MPEG I file./n");
        exit(0);
    }          
}

BOOL InitWinsock()
{
    int Error;
    WORD VersionRequested;
    WSADATA WsaData;
    VersionRequested=MAKEWORD(2,2);
    Error=WSAStartup(VersionRequested,&WsaData); //start WinSock 2
    if(Error!=0)
    {
        return FALSE;
    }
    else
    {
        if(LOBYTE(WsaData.wVersion)!=2||HIBYTE(WsaData.wHighVersion)!=2)
        {
            WSACleanup();
            return FALSE;
        }
       
    }
    return TRUE;
}

 

      After finishing this test program I gained a lot of confidence and reread RFC 3550 a few more times. If you actually read the program you will notice that I only send RTP and never send RTCP packets, so multiple RTP streams cannot be synchronized. I did not code any further because I felt this was enough. And to stress the point: nobody said RTP cannot work without RTCP! The next step would be to rip out this program's low-level packet-sending function and use the RTP library JRTPLIB instead; in my view that is what a JRTPLIB demo should really look like (a rough sketch of that follows after the list below). If someone asks why JRTPLIB is needed at all when a program like this already does the job, well, the reasons I did not write the RTCP-related code myself are several:

    1. RTCP involves a lot of computation of RTCP transmission intervals and statistics about the RTP stream. That kind of work is not hard, just tedious, and I did not want to write it.
    2. RTCP and RTP were originally created for video conferencing, not for applications such as video on demand. RTCP carries a participant-management aspect, and in many scenarios that functionality is simply never used.
    3. I wanted to keep things simple, so I did not implement synchronization between multiple streams, such as a film's video and audio streams. That is really RTCP's job.

    I could not be bothered to write all of that, because the various RTP libraries already do this work very well; I think that is the biggest advantage of using a library.
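    To give an idea of that direction, here is a rough, untested sketch of how the raw-socket Send_RTP_Packet could be replaced by JRTPLIB, which would also bring RTCP along for free. The header paths, the jrtplib namespace and the exact method signatures differ between JRTPLIB versions, so take it only as an illustration:

    // Sketch only: a JRTPLIB 3.x based sender for the same MPEG video payload.
    #include <jrtplib3/rtpsession.h>
    #include <jrtplib3/rtpsessionparams.h>
    #include <jrtplib3/rtpudpv4transmitter.h>
    #include <jrtplib3/rtpipv4address.h>
    using namespace jrtplib;

    RTPSession session;

    bool InitRtpSession()
    {
        RTPSessionParams sessparams;
        RTPUDPv4TransmissionParams transparams;

        sessparams.SetOwnTimestampUnit(1.0 / 90000.0);   // 90 kHz MPEG video clock
        transparams.SetPortbase(9000);                   // local RTP/RTCP port pair

        if (session.Create(sessparams, &transparams) < 0)
            return false;
        // DEST_IP / DEST_PORT as in MPEG2RTP.h; RTPIPv4Address expects host byte order
        return session.AddDestination(
            RTPIPv4Address(ntohl(inet_addr(DEST_IP)), DEST_PORT)) >= 0;
    }

    // Replacement for Send_RTP_Packet(): hand the library only the payload
    // (the 4-byte MPEG video-specific header plus the ES data) and let it build
    // the RTP fixed header, sequence numbers and RTCP reports itself.
    int Send_RTP_Packet_jrtplib(unsigned char *payload, int bytes,
                                bool marker, unsigned long timestamp_inc)
    {
        return session.SendPacket(payload, bytes, MPV /* 32 */, marker,
                                  (unsigned long)timestamp_inc);
    }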

 

This article comes from a CSDN blog. Please credit the source when reposting: http://blog.csdn.net/Tinnal/archive/2008/09/03/2871734.aspx
