An Introduction to the DeepStream Sample Apps

Contents

1 Commonly Used Sample Source Code

2 Analysis of DeepStream Metadata


The DeepStream SDK package contains an archive of plugins, libraries, applications, and source code. For Debian installations (on Jetson or dGPU) and SDK Manager installations, the source directory is located at /opt/nvidia/deepstream/deepstream-6.2/sources.

1 Commonly Used Sample Source Code

Each entry below lists the reference sample application, its path inside the sources tree, and a description.

Sample test application 1
Path: apps/sample_apps/deepstream-test1
Shows how to use DeepStream for a single H.264 stream: filesrc → decode → nvstreammux → nvinfer or nvinferserver (primary detector) → nvdsosd → renderer. This app uses the resnet10.caffemodel model for detection.

Sample test application 2
Path: apps/sample_apps/deepstream-test2
Shows how to use DeepStream for a single H.264 stream: filesrc → decode → nvstreammux → nvinfer or nvinferserver (primary detector) → nvtracker → nvinfer or nvinferserver (secondary classifiers) → nvdsosd → renderer. This app uses resnet10.caffemodel for detection and three classifier models (Car Color, Make, and Model).

Sample test application 3
Path: apps/sample_apps/deepstream-test3
Builds on deepstream-test1 (sample test application 1) to demonstrate:

  1. how to use multiple sources in the pipeline;
  2. using uridecodebin to accept any type of input (e.g. RTSP/file), any container format supported by GStreamer, and any codec;
  3. configuring Gst-nvstreammux to form a batch of frames and run inference on it for better resource utilization;
  4. extracting stream metadata, which contains useful information about the frames in the batched buffer.

This app uses the resnet10.caffemodel model for detection.

Sample test application 4
Path: apps/sample_apps/deepstream-test4
Builds on deepstream-test1 for a single H.264 stream (filesrc, decode, nvstreammux, nvinfer or nvinferserver, nvdsosd, renderer) to demonstrate how to:

  • use the Gst-nvmsgconv and Gst-nvmsgbroker plugins in the pipeline;
  • create NVDS_META_EVENT_MSG type metadata and attach it to the buffer;
  • use NVDS_META_EVENT_MSG for different types of objects, e.g. vehicles and people;
  • implement "copy" and "free" functions for use if the metadata is extended through the extMsg field.

This app uses the resnet10.caffemodel model for detection.

Custom meta data example
Path: apps/sample_apps/deepstream-user-metadata-test
Demonstrates how to add custom or user-specific metadata to any component of DeepStream. The test code attaches a 16-byte array of user data to a chosen element; the data is then retrieved in another function. This app uses the resnet10.caffemodel model for detection.

Gst-nvinfer tensor meta flow example
Path: apps/sample_apps/deepstream-infer-tensor-meta-app
Demonstrates how to obtain and access the nvinfer tensor output as metadata. This app uses the resnet10.caffemodel detection model and three classifier models (Car Color, Make, and Model).

Preprocess example
Path: apps/sample_apps/deepstream-preprocess-test
Demonstrates inference on preprocessing ROIs configured for the video streams. This app uses the resnet10.caffemodel model for detection.

Note:

At present the DeepStream preprocess operation can only be used with: 1. fixed ROIs; 2. the primary inference engine; it cannot be inserted before a secondary inference engine.

2 Analysis of DeepStream Metadata

Metadata in DeepStream is organized as nested layers. As the probe function in apps/sample_apps/deepstream-test3 shows, NvDsObjectMeta is the most basic metadata unit describing a single object; it can hold the object's detection, tracking, display, classification, and segmentation information. NvDsFrameMeta is organized per frame and can hold multiple NvDsObjectMeta; NvDsBatchMeta is organized per batch and can hold multiple NvDsFrameMeta. Thanks to this nesting, users can process metadata batch by batch. Part of the code from deepstream-test3 follows.

static GstPadProbeReturn
tiler_src_pad_buffer_probe (GstPad * pad, GstPadProbeInfo * info, gpointer u_data)
{
    GstBuffer *buf = (GstBuffer *) info->data;
    guint num_rects = 0;
    NvDsObjectMeta *obj_meta = NULL;
    guint vehicle_count = 0;
    guint person_count = 0;
    NvDsMetaList *l_frame = NULL;
    NvDsMetaList *l_obj = NULL;

    /* Retrieve the batch-level metadata attached to the GstBuffer. */
    NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);

    /* Iterate over the frames in the batch. */
    for (l_frame = batch_meta->frame_meta_list; l_frame != NULL; l_frame = l_frame->next)
    {
        NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) (l_frame->data);

        /* Iterate over the objects detected in this frame. */
        for (l_obj = frame_meta->obj_meta_list; l_obj != NULL; l_obj = l_obj->next)
        {
            obj_meta = (NvDsObjectMeta *) (l_obj->data);
            if (obj_meta->class_id == PGIE_CLASS_ID_VEHICLE)
            {
                vehicle_count++;
                num_rects++;
            }
            if (obj_meta->class_id == PGIE_CLASS_ID_PERSON)
            {
                person_count++;
                num_rects++;
            }
        }
        g_print ("Frame Number = %d Number of objects = %d "
            "Vehicle Count = %d Person Count = %d\n",
            frame_meta->frame_num, num_rects, vehicle_count, person_count);
    }
    return GST_PAD_PROBE_OK;
}

The figure below, taken from a DeepStream run, illustrates how the metadata is composed.

The next figure is a schematic of the DeepStream metadata structure; see https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_metadata.html

Appendix: the structure definition of NvDsObjectMeta; see /opt/nvidia/deepstream/deepstream-6.1/sources/includes/nvdsmeta.h for details.


/**
 * Holds metadata for an object in the frame.
 */
typedef struct _NvDsObjectMeta {
  NvDsBaseMeta base_meta;
  /** Holds a pointer to the parent @ref NvDsObjectMeta. Set to NULL if
   no parent exists. */
  struct _NvDsObjectMeta *parent;
  /** Holds a unique component ID that identifies the metadata
   in this structure. */
  gint unique_component_id;
  /** Holds the index of the object class inferred by the primary
   detector/classifier. */
  gint class_id;
  /** Holds a unique ID for tracking the object. @ref UNTRACKED_OBJECT_ID
   indicates that the object has not been tracked. */
  guint64 object_id;
  /** Holds a structure containing bounding box parameters of the object when
    detected by detector. */
  NvDsComp_BboxInfo detector_bbox_info;
  /** Holds a structure containing bounding box coordinates of the object when
   * processed by tracker. */
  NvDsComp_BboxInfo tracker_bbox_info;
  /** Holds a confidence value for the object, set by the inference
   component. confidence will be set to -0.1, if "Group Rectangles" mode of
   clustering is chosen since the algorithm does not preserve confidence
   values. Also, for objects found by tracker and not inference component,
   confidence will be set to -0.1 */
  gfloat confidence;
  /** Holds a confidence value for the object set by nvdcf_tracker.
   * tracker_confidence will be set to -0.1 for KLT and IOU tracker */
  gfloat tracker_confidence;
  /** Holds a structure containing positional parameters of the object
   * processed by the last component that updates it in the pipeline.
   * e.g. If the tracker component is after the detector component in the
   * pipeline then positional parameters are from tracker component.
   * Positional parameters are clipped so that they do not fall outside frame
   * boundary. Can also be used to overlay borders or semi-transparent boxes on
   * objects. @see NvOSD_RectParams. */
  NvOSD_RectParams rect_params;
  /** Holds mask parameters for the object. This mask is overlayed on object
   * @see NvOSD_MaskParams. */
  NvOSD_MaskParams mask_params;
  /** Holds text describing the object. This text can be overlayed on the
   standard text that identifies the object. @see NvOSD_TextParams. */
  NvOSD_TextParams text_params;
  /** Holds a string describing the class of the detected object. */
  gchar obj_label[MAX_LABEL_SIZE];
  /** Holds a pointer to a list of pointers of type @ref NvDsClassifierMeta. */
  NvDsClassifierMetaList *classifier_meta_list;
  /** Holds a pointer to a list of pointers of type @ref NvDsUserMeta. */
  NvDsUserMetaList *obj_user_meta_list;
  /** Holds additional user-defined object information. */
  gint64 misc_obj_info[MAX_USER_FIELDS];
  /** For internal use. */
  gint64 reserved[MAX_RESERVED_FIELDS];
} NvDsObjectMeta;

deepstream-app is a program that is compiled and installed on the system together with DeepStream, and can be invoked from a terminal anywhere. It comes in two versions, deepstream-app and deepstream-test5-app. Their functionality is essentially the same; the only difference is that deepstream-test5-app supports sinks of type 6, i.e. the component that sends data to and receives data from a Kafka server. NVIDIA also provides the source code for both apps, located at /opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-app and /opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-test5, where you can study them for reference. [1]

To use deepstream-app, invoke it from a terminal as follows:

$ deepstream-app -c ./configs/deepstream-app/source12_1080p_dec_infer-resnet_tracker_tiled_display_fp16_tx2.txt

This runs deepstream-app with the specified configuration file. [2]

To see its usage information:

$ deepstream-app --help

In addition, sample configuration files can be found in /opt/nvidia/deepstream/deepstream-5.1/samples; for example, /opt/nvidia/deepstream/deepstream-5.1/samples/configs/deepstream-app/config_infer_primary.txt. [3]

References:
[1] A hands-on Jetson DeepStream tutorial (4): how to use deepstream-app and apply it to engineering validation. https://blog.csdn.net/u013963960/article/details/129032242
[2][3] Running deepstream-app. https://blog.csdn.net/quicmous/article/details/117817657