Implementing the ONVIF specification (Profile S): streaming RTSP video to ONVIF Device Manager

This post explains how to implement the ONVIF protocol, in particular the video streaming part, so that the device's RTSP video stream can be received in ONVIF Device Manager. It walks through GetProfiles, GetStreamUri and the other required commands, uses the open-source live555 library for the RTSP stream itself, shows how to generate the ONVIF source framework, and shows how to respond to commands such as GetCapabilities and GetVideoEncoderConfiguration. Finally, the device is added manually in ONVIF Device Manager and the video stream is verified.

With the groundwork laid in the previous posts, the real ONVIF implementation work can now begin. One crucial piece is video streaming: an ONVIF-compliant monitoring client must be able to receive the RTSP video stream sent by the device (NVT). The client software used here is ONVIF Device Manager v2.2. [From http://blog.csdn.net/ghostyu]

The ONVIF Profile S Specification describes a profile that a device (NVT) and a client (NVC) can implement. "Profile" is a very common term in computing; think of it as a scheme, a configuration, or a framework.

The specification lists the conditions a device and a client must meet in order to implement video streaming; a device that satisfies all of them can claim Profile S conformance.

To implement just video streaming, the following commands are enough.

1. GetProfiles

2. GetStreamUri

Fill in the RTSP path, e.g. rtsp://192.168.1.201/petrov.m4e (a handler sketch follows this list).

3. Media Streaming using RTSP

The open-source live555 library handles the RTSP side.

4. GetVideoEncoderConfiguration

5. GetVideoEncoderConfigurationOptions

6. GetCapabilities

The command the client (NVC) uses to discover what the device (NVT) supports.

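
For GetStreamUri, the handler only has to hand back the RTSP URL served by live555. Below is a minimal sketch, assuming the gSOAP-generated handler name __trt__GetStreamUri and the tt__MediaUri response member from the ONVIF Media schema; it ignores the requested profile and protocol, so treat it as a starting point rather than the definitive implementation:

int __trt__GetStreamUri(struct soap *soap, struct _trt__GetStreamUri *trt__GetStreamUri, struct _trt__GetStreamUriResponse *trt__GetStreamUriResponse)
{
    DBG("__trt__GetStreamUri\n");

    /* Sketch: always return the live555 test URL, regardless of the
       requested ProfileToken or StreamSetup. */
    trt__GetStreamUriResponse->MediaUri = (struct tt__MediaUri *)soap_malloc(soap, sizeof(struct tt__MediaUri));
    memset(trt__GetStreamUriResponse->MediaUri, 0, sizeof(struct tt__MediaUri));

    trt__GetStreamUriResponse->MediaUri->Uri = (char *)soap_malloc(soap, sizeof(char) * LARGE_INFO_LENGTH);
    strcpy(trt__GetStreamUriResponse->MediaUri->Uri, "rtsp://192.168.1.201/petrov.m4e");

    /* InvalidAfterConnect, InvalidAfterReboot and Timeout are left zeroed here;
       fill them in according to the generated onvif.h if the client rejects
       the response. */
    return SOAP_OK;
}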

Reference documents:

1. ONVIF Profile S Specification

Describes what Profile S is and how to implement it.

2. Reference_of_ONVIF_Development_v1.01.02

A design reference for an ONVIF device; it points the way but contains little concrete detail.

3. ONVIF-Media-Service-Spec-v220

An introduction to the ONVIF Media service.

4. http://www.onvif.org/onvif/ver20/util/operationIndex.html

Detailed documentation of nearly every ONVIF operation; extremely important. It explains what each structure member means and how to fill it in. ONVIF development is essentially the filling in of structures.


I. Generating the ONVIF source framework

1. Generate the C header from the WSDL files


wsdl2h -o onvif.h -c -s -t .\typemap.dat http://www.onvif.org/onvif/ver10/device/wsdl/devicemgmt.wsdl http://www.onvif.org/onvif/ver10/event/wsdl/event.wsdl http://www.onvif.org/onvif/ver10/display.wsdl http://www.onvif.org/onvif/ver10/deviceio.wsdl http://www.onvif.org/onvif/ver20/imaging/wsdl/imaging.wsdl http://www.onvif.org/onvif/ver10/media/wsdl/media.wsdl http://www.onvif.org/onvif/ver20/ptz/wsdl/ptz.wsdl http://www.onvif.org/onvif/ver10/receiver.wsdl http://www.onvif.org/onvif/ver10/recording.wsdl http://www.onvif.org/onvif/ver10/search.wsdl http://www.onvif.org/onvif/ver10/network/wsdl/remotediscovery.wsdl http://www.onvif.org/onvif/ver10/replay.wsdl http://www.onvif.org/onvif/ver20/analytics/wsdl/analytics.wsdl http://www.onvif.org/onvif/ver10/analyticsdevice.wsdl http://www.onvif.org/onvif/ver10/schema/onvif.xsd http://www.onvif.org/ver10/actionengine.wsdl

The only difference from the discovery example in the previous post is that many more WSDL files are listed here; this time we generate the complete ONVIF code framework.

2. Generate the source framework from the header

soapcpp2 -c onvif.h -x -I /root/onvif/gsoap-2.8/gsoap/import -I /root/onvif/gsoap-2.8/gsoap/


The generated C files are quite large (the biggest is over ten megabytes), mostly because very little of the generated code is reused between services.

II. Setting up the SOAP runtime


int main(int argc, char **argv)
{
    int m, s;
    struct soap add_soap;
    int server_udp;

    /* UDP socket used by the WS-Discovery threads (Hello/Probe). */
    server_udp = create_server_socket_udp();
    //bind_server_udp1(server_udp);

    pthread_t thrHello;
    pthread_t thrProbe;
    //pthread_create(&thrHello, NULL, main_Hello, server_udp);
    //sleep(2);
    pthread_create(&thrProbe, NULL, main_Probe, server_udp);

    soap_init(&add_soap);
    soap_set_namespaces(&add_soap, namespaces);

    if (argc < 0) {
        printf("usage: %s \n", argv[0]);
        exit(1);
    } else {
        /* ONVIF requests arrive over HTTP, so bind to port 80. */
        m = soap_bind(&add_soap, NULL, 80, 100);
        if (m < 0) {
            soap_print_fault(&add_soap, stderr);
            exit(-1);
        }
        fprintf(stderr, "Socket connection successful: master socket = %d\n", m);

        for (;;) {
            s = soap_accept(&add_soap);
            if (s < 0) {
                soap_print_fault(&add_soap, stderr);
                exit(-1);
            }
            fprintf(stderr, "Socket connection successful: slave socket = %d\n", s);

            /* Dispatch the request to the generated skeleton, then free the
               deserialized data belonging to this request. */
            soap_serve(&add_soap);
            soap_end(&add_soap);
        }
    }
    return 0;
}

Note that we bind to port 80: ONVIF uses HTTP requests that carry XML. Normally ONVIF would be integrated into the device's web server, with ordinary HTTP requests handled by the web server and ONVIF requests handed over to the SOAP engine. Our standalone approach also works; the only drawback is that the device's regular web pages cannot be served on this port.

III. RTSP video integration

1. Implement the GetCapabilities command

The client sends GetCapabilities to find out what the device supports, and decides what to do next based on the response.

In __tds__GetCapabilities we only need to fill in the Media part plus a few other required fields:


// Media must be filled in for RTSP video streaming to work
tds__GetCapabilitiesResponse->Capabilities->Media = (struct tt__MediaCapabilities *)soap_malloc(soap, sizeof(struct tt__MediaCapabilities));
tds__GetCapabilitiesResponse->Capabilities->Media->XAddr = (char *)soap_malloc(soap, sizeof(char) * LARGE_INFO_LENGTH);
strcpy(tds__GetCapabilitiesResponse->Capabilities->Media->XAddr, _IPv4Address);
tds__GetCapabilitiesResponse->Capabilities->Media->StreamingCapabilities = (struct tt__RealTimeStreamingCapabilities *)soap_malloc(soap, sizeof(struct tt__RealTimeStreamingCapabilities));
tds__GetCapabilitiesResponse->Capabilities->Media->StreamingCapabilities->RTPMulticast = (int *)soap_malloc(soap, sizeof(int));
*tds__GetCapabilitiesResponse->Capabilities->Media->StreamingCapabilities->RTPMulticast = _false;
tds__GetCapabilitiesResponse->Capabilities->Media->StreamingCapabilities->RTP_USCORETCP = (int *)soap_malloc(soap, sizeof(int));
*tds__GetCapabilitiesResponse->Capabilities->Media->StreamingCapabilities->RTP_USCORETCP = _true;
tds__GetCapabilitiesResponse->Capabilities->Media->StreamingCapabilities->RTP_USCORERTSP_USCORETCP = (int *)soap_malloc(soap, sizeof(int));
*tds__GetCapabilitiesResponse->Capabilities->Media->StreamingCapabilities->RTP_USCORERTSP_USCORETCP = _true;
tds__GetCapabilitiesResponse->Capabilities->Media->StreamingCapabilities->Extension = NULL;
tds__GetCapabilitiesResponse->Capabilities->Media->Extension = NULL;
tds__GetCapabilitiesResponse->Capabilities->Media->__size = 0;
tds__GetCapabilitiesResponse->Capabilities->Media->__any = 0;

A few other fields also have to be filled in:


// Important: since only video streaming is implemented here, VideoSources must be set
tds__GetCapabilitiesResponse->Capabilities->Extension->DeviceIO->VideoSources = TRUE;
tds__GetCapabilitiesResponse->Capabilities->Extension->DeviceIO->VideoOutputs = FALSE;
tds__GetCapabilitiesResponse->Capabilities->Extension->DeviceIO->AudioSources = FALSE;
tds__GetCapabilitiesResponse->Capabilities->Extension->DeviceIO->AudioOutputs = FALSE;
tds__GetCapabilitiesResponse->Capabilities->Extension->DeviceIO->RelayOutputs = FALSE;
tds__GetCapabilitiesResponse->Capabilities->Extension->DeviceIO->__size = 0;
tds__GetCapabilitiesResponse->Capabilities->Extension->DeviceIO->__any = NULL;
tds__GetCapabilitiesResponse->Capabilities->Extension->Display = NULL;
tds__GetCapabilitiesResponse->Capabilities->Extension->Recording = NULL;
tds__GetCapabilitiesResponse->Capabilities->Extension->Search = NULL;
tds__GetCapabilitiesResponse->Capabilities->Extension->Replay = NULL;
tds__GetCapabilitiesResponse->Capabilities->Extension->Receiver = NULL;
tds__GetCapabilitiesResponse->Capabilities->Extension->AnalyticsDevice = NULL;
tds__GetCapabilitiesResponse->Capabilities->Extension->Extensions = NULL;
tds__GetCapabilitiesResponse->Capabilities->Extension->__size = 0;
tds__GetCapabilitiesResponse->Capabilities->Extension->__any = NULL;
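
The assignments above assume that Capabilities, Capabilities->Extension and Capabilities->Extension->DeviceIO were already allocated earlier in the handler. A minimal allocation sketch, assuming the generated type names tt__Capabilities, tt__CapabilitiesExtension and tt__DeviceIOCapabilities (verify them against your onvif.h):

/* Allocate and zero the capability structures before filling them in.
   soap_malloc does not zero memory, so memset avoids serializing garbage. */
tds__GetCapabilitiesResponse->Capabilities = (struct tt__Capabilities *)soap_malloc(soap, sizeof(struct tt__Capabilities));
memset(tds__GetCapabilitiesResponse->Capabilities, 0, sizeof(struct tt__Capabilities));

tds__GetCapabilitiesResponse->Capabilities->Extension = (struct tt__CapabilitiesExtension *)soap_malloc(soap, sizeof(struct tt__CapabilitiesExtension));
memset(tds__GetCapabilitiesResponse->Capabilities->Extension, 0, sizeof(struct tt__CapabilitiesExtension));

tds__GetCapabilitiesResponse->Capabilities->Extension->DeviceIO = (struct tt__DeviceIOCapabilities *)soap_malloc(soap, sizeof(struct tt__DeviceIOCapabilities));
memset(tds__GetCapabilitiesResponse->Capabilities->Extension->DeviceIO, 0, sizeof(struct tt__DeviceIOCapabilities));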

2. Implement the GetServices command


int __tds__GetServices(struct soap *soap, struct _tds__GetServices *tds__GetServices, struct _tds__GetServicesResponse *tds__GetServicesResponse)
{
    DBG("__tds__GetServices\n");
    /* This handler is required */
    char _IPAddr[INFO_LENGTH];

    sprintf(_IPAddr, "http://%d.%d.%d.%d/onvif/services", 192, 168, 1, 233);

    tds__GetServicesResponse->__sizeService = 1;
    tds__GetServicesResponse->Service = (struct tds__Service *)soap_malloc(soap, sizeof(struct tds__Service));
    tds__GetServicesResponse->Service[0].XAddr = (char *)soap_malloc(soap, sizeof(char) * INFO_LENGTH);
    tds__GetServicesResponse->Service[0].Namespace = (char *)soap_malloc(soap, sizeof(char) * INFO_LENGTH);
    strcpy(tds__GetServicesResponse->Service[0].Namespace, "http://www.onvif.org/ver10/events/wsdl");
    strcpy(tds__GetServicesResponse->Service[0].XAddr, _IPAddr);
    tds__GetServicesResponse->Service[0].Capabilities = NULL;
    tds__GetServicesResponse->Service[0].Version = (struct tt__OnvifVersion *)soap_malloc(soap, sizeof(struct tt__OnvifVersion));
    tds__GetServicesResponse->Service[0].Version->Major = 0;
    tds__GetServicesResponse->Service[0].Version->Minor = 3;
    /* Allocate room for two __any entries, since two are filled in below. */
    tds__GetServicesResponse->Service[0].__any = (char **)soap_malloc(soap, sizeof(char *) * 2);
    tds__GetServicesResponse->Service[0].__any[0] = (char *)soap_malloc(soap, sizeof(char) * INFO_LENGTH);
    strcpy(tds__GetServicesResponse->Service[0].__any[0], "why1");
    tds__GetServicesResponse->Service[0].__any[1] = (char *)soap_malloc(soap, sizeof(char) * INFO_LENGTH);
    strcpy(tds__GetServicesResponse->Service[0].__any[1], "why2");
    tds__GetServicesResponse->Service[0].__size = 0;
    tds__GetServicesResponse->Service[0].__anyAttribute = NULL;

    return SOAP_OK;
}

3. Implement the GetVideoSources command


int __tmd__GetVideoSources(struct soap *soap, struct _trt__GetVideoSources *trt__GetVideoSources, struct _trt__GetVideoSourcesResponse *trt__GetVideoSourcesResponse)
{
    DBG("__tmd__GetVideoSources\n");
    int size1;

    size1 = 1;
    trt__GetVideoSourcesResponse->__sizeVideoSources = size1;
    trt__GetVideoSourcesResponse->VideoSources = (struct tt__VideoSource *)soap_malloc(soap, sizeof(struct tt__VideoSource) * size1);
    trt__GetVideoSourcesResponse->VideoSources[0].Framerate = 30;
    trt__GetVideoSourcesResponse->VideoSources[0].Resolution = (struct tt__VideoResolution *)soap_malloc(soap, sizeof(struct tt__VideoResolution));
    trt__GetVideoSourcesResponse->VideoSources[0].Resolution->Height = 720;
    trt__GetVideoSourcesResponse->VideoSources[0].Resolution->Width = 1280;
    trt__GetVideoSourcesResponse->VideoSources[0].token = (char *)soap_malloc(soap, sizeof(char) * INFO_LENGTH);
    strcpy(trt__GetVideoSourcesResponse->VideoSources[0].token, "GhostyuSource_token"); // must match the SourceToken in GetProfiles
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging = (struct tt__ImagingSettings *)soap_malloc(soap, sizeof(struct tt__ImagingSettings));
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->Brightness = (float *)soap_malloc(soap, sizeof(float));
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->Brightness[0] = 128;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->ColorSaturation = (float *)soap_malloc(soap, sizeof(float));
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->ColorSaturation[0] = 128;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->Contrast = (float *)soap_malloc(soap, sizeof(float));
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->Contrast[0] = 128;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->IrCutFilter = (int *)soap_malloc(soap, sizeof(int));
    *trt__GetVideoSourcesResponse->VideoSources[0].Imaging->IrCutFilter = 0;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->Sharpness = (float *)soap_malloc(soap, sizeof(float));
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->Sharpness[0] = 128;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->BacklightCompensation = (struct tt__BacklightCompensation *)soap_malloc(soap, sizeof(struct tt__BacklightCompensation));
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->BacklightCompensation->Mode = 0;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->BacklightCompensation->Level = 20;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->Exposure = NULL;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->Focus = NULL;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->WideDynamicRange = (struct tt__WideDynamicRange *)soap_malloc(soap, sizeof(struct tt__WideDynamicRange));
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->WideDynamicRange->Mode = 0;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->WideDynamicRange->Level = 20;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->WhiteBalance = (struct tt__WhiteBalance *)soap_malloc(soap, sizeof(struct tt__WhiteBalance));
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->WhiteBalance->Mode = 0;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->WhiteBalance->CrGain = 0;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->WhiteBalance->CbGain = 0;
    trt__GetVideoSourcesResponse->VideoSources[0].Imaging->Extension = NULL;
    trt__GetVideoSourcesResponse->VideoSources[0].Extension = NULL;

    return SOAP_OK;
}

The most important thing in __tmd__GetVideoSources is the token: it must match the SourceToken used in the profile below so that the client can associate the profile with this video source (one way to keep the two in sync is sketched right after this paragraph).
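
A hypothetical shared macro makes the match explicit; both handlers then copy the same constant:

/* Hypothetical shared definition, e.g. in a common header of this project */
#define VIDEO_SOURCE_TOKEN "GhostyuSource_token"

/* In __tmd__GetVideoSources: */
strcpy(trt__GetVideoSourcesResponse->VideoSources[0].token, VIDEO_SOURCE_TOKEN);

/* In __trt__GetProfiles: */
strcpy(trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->SourceToken, VIDEO_SOURCE_TOKEN);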

4. Implement the GetProfiles command


size = 1;
trt__GetProfilesResponse->Profiles = (struct tt__Profile *)soap_malloc(soap, sizeof(struct tt__Profile) * size);
trt__GetProfilesResponse->__sizeProfiles = size;
i = 0;
trt__GetProfilesResponse->Profiles[i].Name = (char *)soap_malloc(soap, sizeof(char) * MAX_PROF_TOKEN);
strcpy(trt__GetProfilesResponse->Profiles[i].Name, "my_profile");
trt__GetProfilesResponse->Profiles[i].token = (char *)soap_malloc(soap, sizeof(char) * MAX_PROF_TOKEN);
strcpy(trt__GetProfilesResponse->Profiles[i].token, "token_profile");
trt__GetProfilesResponse->Profiles[i].fixed = _false;
trt__GetProfilesResponse->Profiles[i].__anyAttribute = NULL;

Besides the basic information above, two major members must be filled in: VideoSourceConfiguration and VideoEncoderConfiguration. The former describes the video source, the latter the video encoding.

First allocate the VideoSourceConfiguration:


trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration = (struct tt__VideoSourceConfiguration *)soap_malloc(soap, sizeof(struct tt__VideoSourceConfiguration));
trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->Name = (char *)soap_malloc(soap, sizeof(char) * MAX_PROF_TOKEN);
trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->token = (char *)soap_malloc(soap, sizeof(char) * MAX_PROF_TOKEN);
trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->SourceToken = (char *)soap_malloc(soap, sizeof(char) * MAX_PROF_TOKEN);
trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->Bounds = (struct tt__IntRectangle *)soap_malloc(soap, sizeof(struct tt__IntRectangle));

Then fill it in:


/* Note the SourceToken */
strcpy(trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->Name, "VS_Name");
strcpy(trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->token, "VS_Token");
strcpy(trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->SourceToken, "GhostyuSource_token"); /* must match the token in __tmd__GetVideoSources */
trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->UseCount = 1;
trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->Bounds->x = 1;
trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->Bounds->y = 1;
trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->Bounds->height = 720;
trt__GetProfilesResponse->Profiles[i].VideoSourceConfiguration->Bounds->width = 1280;

Remember that any pointer member must be allocated with soap_malloc before it can be assigned; a helper for the common string case is sketched below.
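
To cut down on the repeated allocate-then-strcpy pattern, a small helper can be written on top of soap_malloc (a sketch; gSOAP also provides soap_strdup, which does essentially the same thing):

/* Hypothetical helper: allocate a string on the soap heap and copy s into it.
   The memory is released together with the request data by soap_end(). */
static char *onvif_strdup(struct soap *soap, const char *s)
{
    char *p = (char *)soap_malloc(soap, strlen(s) + 1);
    if (p)
        strcpy(p, s);
    return p;
}

/* Usage example: */
trt__GetProfilesResponse->Profiles[i].Name = onvif_strdup(soap, "my_profile");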

Next comes the VideoEncoderConfiguration:


trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration = (struct tt__VideoEncoderConfiguration *)soap_malloc(soap, sizeof(struct tt__VideoEncoderConfiguration));
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->Name = (char *)soap_malloc(soap, sizeof(char) * MAX_PROF_TOKEN);
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->token = (char *)soap_malloc(soap, sizeof(char) * MAX_PROF_TOKEN);
strcpy(trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->Name, "VE_Name1");
strcpy(trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->token, "VE_token1");
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->UseCount = 1;
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->Quality = 10;
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->Encoding = 1; // JPEG = 0, MPEG4 = 1, H264 = 2
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->Resolution = (struct tt__VideoResolution *)soap_malloc(soap, sizeof(struct tt__VideoResolution));
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->Resolution->Height = 720;
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->Resolution->Width = 1280;
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->RateControl = (struct tt__VideoRateControl *)soap_malloc(soap, sizeof(struct tt__VideoRateControl));
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->RateControl->FrameRateLimit = 30;
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->RateControl->EncodingInterval = 1;
trt__GetProfilesResponse->Profiles[i].VideoEncoderConfiguration->RateControl->BitrateLimit = 500;

5. GetVideoSourceConfiguration and GetVideoEncoderConfiguration


int __trt__GetVideoSourceConfiguration(struct soap *soap, struct _trt__GetVideoSourceConfiguration *trt__GetVideoSourceConfiguration, struct _trt__GetVideoSourceConfigurationResponse *trt__GetVideoSourceConfigurationResponse)
{
    DBG("__trt__GetVideoSourceConfiguration\n");
    // This handler is required; live video needs it
    return SOAP_OK;
}

int __trt__GetVideoEncoderConfiguration(struct soap *soap, struct _trt__GetVideoEncoderConfiguration *trt__GetVideoEncoderConfiguration, struct _trt__GetVideoEncoderConfigurationResponse *trt__GetVideoEncoderConfigurationResponse)
{
    DBG("__trt__GetVideoEncoderConfiguration\n");
    return SOAP_OK;
}

6. GetVideoEncoderConfigurationOptions


int __trt__GetVideoEncoderConfigurationOptions(struct soap *soap, struct _trt__GetVideoEncoderConfigurationOptions *trt__GetVideoEncoderConfigurationOptions, struct _trt__GetVideoEncoderConfigurationOptionsResponse *trt__GetVideoEncoderConfigurationOptionsResponse)
{
    DBG("__trt__GetVideoEncoderConfigurationOptions\n");
    // This handler is required; video streaming needs it
    return SOAP_OK;
}

The handlers in steps 5 and 6 simply return SOAP_OK. Strictly speaking their responses should be filled in, but leaving them empty does not affect the RTSP video stream, so they are left alone for now; a sketch of a filled-in version follows.
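
For reference, here is a minimal sketch of what a filled-in __trt__GetVideoEncoderConfiguration could look like, mirroring the encoder settings reported in GetProfiles above. It assumes the response struct exposes a Configuration member of type struct tt__VideoEncoderConfiguration; check the generated onvif.h before relying on it. Zeroing the struct leaves the optional members empty:

int __trt__GetVideoEncoderConfiguration(struct soap *soap, struct _trt__GetVideoEncoderConfiguration *trt__GetVideoEncoderConfiguration, struct _trt__GetVideoEncoderConfigurationResponse *trt__GetVideoEncoderConfigurationResponse)
{
    DBG("__trt__GetVideoEncoderConfiguration\n");

    /* Sketch only: values mirror the GetProfiles code above. */
    struct tt__VideoEncoderConfiguration *cfg = (struct tt__VideoEncoderConfiguration *)soap_malloc(soap, sizeof(struct tt__VideoEncoderConfiguration));
    memset(cfg, 0, sizeof(struct tt__VideoEncoderConfiguration));

    cfg->Name = (char *)soap_malloc(soap, sizeof(char) * MAX_PROF_TOKEN);
    strcpy(cfg->Name, "VE_Name1");
    cfg->token = (char *)soap_malloc(soap, sizeof(char) * MAX_PROF_TOKEN);
    strcpy(cfg->token, "VE_token1");
    cfg->UseCount = 1;
    cfg->Quality = 10;
    cfg->Encoding = 1; // JPEG = 0, MPEG4 = 1, H264 = 2

    cfg->Resolution = (struct tt__VideoResolution *)soap_malloc(soap, sizeof(struct tt__VideoResolution));
    cfg->Resolution->Width = 1280;
    cfg->Resolution->Height = 720;

    cfg->RateControl = (struct tt__VideoRateControl *)soap_malloc(soap, sizeof(struct tt__VideoRateControl));
    cfg->RateControl->FrameRateLimit = 30;
    cfg->RateControl->EncodingInterval = 1;
    cfg->RateControl->BitrateLimit = 500;

    trt__GetVideoEncoderConfigurationResponse->Configuration = cfg;
    return SOAP_OK;
}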

IV. Running the live555MediaServer

The live555 website offers a number of test files; here an MPEG-4 test file is used, served at rtsp://192.168.1.201/petrov.m4e.

V. Testing with ONVIF Device Manager

One issue: ONVIF Device Manager does not discover this device automatically (the ONVIF Test Tool does), but fortunately it provides a manual add feature.

Click add and enter: http://192.168.1.233/onvif/device_service

Note that two IP addresses are hard-coded in the program: Linux 192.168.1.233 and Windows 192.168.1.201; change them to suit your setup.

Test screenshots:

1. Live video

2. Video streaming

3. Profiles

Finally, the running live555 RTSP server.

DEBUG output printed to the terminal.

Source code download: http://download.csdn.net/detail/ghostyu/4796093
