GStreamer Basic Tutorial 8 - Short-cutting the pipeline

Index: https://blog.csdn.net/knowledgebao/article/details/84621238

Goal

Pipelines built with GStreamer are not closed structures: we can inject data into a pipeline or extract data from it. This chapter explains how to inject, extract, and manipulate the data flowing through a pipeline.

Pipelines constructed with GStreamer do not need to be completely closed. Data can be injected into the pipeline and extracted from it at any time, in a variety of ways. This tutorial shows:

  • How to inject external data into a general GStreamer pipeline.

  • How to extract data from a general GStreamer pipeline.

  • How to access and manipulate this data.

Playback tutorial 3: Short-cutting the pipeline explains how to achieve the same goals in a playbin-based pipeline.

Introduction

This chapter presents the simplest way to interact with a pipeline, using two elements created for exactly this purpose: appsrc, which injects data into the pipeline, and appsink, which extracts data from it.

appsrc and appsink are very powerful and have their own API, which a GStreamer application can link against; here we only use simple functions and signals to control them. appsrc can work in pull mode or in push mode. In pull mode, appsrc actively requests data from the application. In push mode, the application pushes data to appsrc, and it can either block when appsrc's internal queue is full or regulate the flow through the enough-data and need-data signals. This chapter uses the latter approach; the other methods are described in the appsrc documentation.

Applications can interact with the data flowing through a GStreamer pipeline in several ways. This tutorial describes the easiest one, since it uses elements that have been created for this sole purpose.

The element used to inject application data into a GStreamer pipeline is appsrc, and its counterpart, used to extract GStreamer data back to the application, is appsink. To avoid confusing the names, think of it from GStreamer's point of view: appsrc is just a regular source, that provides data magically fallen from the sky (provided by the application, actually). appsink is a regular sink, where the data flowing through a GStreamer pipeline goes to die (it is recovered by the application, actually).

appsrc and appsink are so versatile that they offer their own API (see their documentation), which can be accessed by linking against the gstreamer-app library. In this tutorial, however, we will use a simpler approach and control them through signals.

appsrc can work in a variety of modes: in pull mode, it requests data from the application every time it needs it. In push mode, the application pushes data at its own pace. Furthermore, in push mode, the application can choose to be blocked in the push function when enough data has already been provided, or it can listen to the enough-data and need-data signals to control flow. This example implements the latter approach. Information regarding the other methods can be found in the appsrc documentation.

Buffers

A GstBuffer is an abstraction composed of one or more GstMemory objects. Every GstBuffer carries a time-stamp and a duration that describe the data it contains.

Data travels through a GStreamer pipeline in chunks called buffers. Since this example produces and consumes data, we need to know about GstBuffers.

Source Pads produce buffers, that are consumed by Sink Pads; GStreamer takes these buffers and passes them from element to element.

A buffer simply represents a unit of data; do not assume that all buffers will have the same size, or represent the same amount of time. Neither should you assume that if a single buffer enters an element, a single buffer will come out. Elements are free to do with the received buffers as they please. GstBuffers may also contain more than one actual memory buffer. Actual memory buffers are abstracted away using GstMemory objects, and a GstBuffer can contain multiple GstMemory objects.
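
As a side note, here is a minimal sketch (not part of this tutorial's code) showing how a single GstBuffer can aggregate several GstMemory blocks, using the standard gst_allocator_alloc() and gst_buffer_append_memory() calls:

#include <gst/gst.h>

/* Build one buffer out of two separately allocated memory blocks */
static GstBuffer *make_two_part_buffer (void) {
  GstBuffer *buffer = gst_buffer_new ();  /* empty buffer, no memory attached yet */
  GstMemory *part1 = gst_allocator_alloc (NULL, 512, NULL);
  GstMemory *part2 = gst_allocator_alloc (NULL, 512, NULL);

  gst_buffer_append_memory (buffer, part1);  /* the buffer takes ownership */
  gst_buffer_append_memory (buffer, part2);

  /* gst_buffer_get_size (buffer) now reports 1024 bytes spread over two GstMemory objects */
  return buffer;
}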

Every buffer has attached time-stamps and duration, that describe in which moment the content of the buffer should be decoded, rendered or displayed. Time stamping is a very complex and delicate subject, but this simplified vision should suffice for now.

As an example, a filesrc (a GStreamer element that reads files) produces buffers with the “ANY” caps and no time-stamping information. After demuxing (see Basic tutorial 3: Dynamic pipelines) buffers can have some specific caps, for example “video/x-h264”. After decoding, each buffer will contain a single video frame with raw caps (for example, “video/x-raw”) and very precise time stamps indicating when that frame should be displayed.

This tutorial

This chapter is built by modifying chapter 7 in two ways. First, the audiotestsrc is replaced with an appsrc; second, a new src pad is requested from the tee and connected to an appsink. The appsink hands the data back to the application, which then simply notifies the user that data has been received, although it could of course do much more.

This tutorial expands Basic tutorial 7: Multithreading and Pad Availability in two ways: firstly, the audiotestsrc is replaced by an appsrc that will generate the audio data. Secondly, a new branch is added to the tee so data going into the audio sink and the wave display is also replicated into an appsink. The appsink uploads the information back into the application, which then just notifies the user that data has been received, but it could obviously perform more complex tasks.

A crude waveform generator

Copy this code into a text file named basic-tutorial-8.c (or find it in your GStreamer installation).

#include <gst/gst.h>
#include <gst/audio/audio.h>
#include <string.h>

#define CHUNK_SIZE 1024   /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline, *app_source, *tee, *audio_queue, *audio_convert1, *audio_resample, *audio_sink;
  GstElement *video_queue, *audio_convert2, *visual, *video_convert, *video_sink;
  GstElement *app_queue, *app_sink;

  guint64 num_samples;   /* Number of samples generated so far (for timestamp generation) */
  gfloat a, b, c, d;     /* For waveform generation */

  guint sourceid;        /* To control the GSource */

  GMainLoop *main_loop;  /* GLib's Main Loop */
} CustomData;

/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
 * and is removed when appsrc has enough data (enough-data signal).
 */
static gboolean push_data (CustomData *data) {
  GstBuffer *buffer;
  GstFlowReturn ret;
  int i;
  GstMapInfo map;
  gint16 *raw;
  gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
  gfloat freq;

  /* Create a new empty buffer */
  buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);

  /* Set its timestamp and duration */
  GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (num_samples, GST_SECOND, SAMPLE_RATE);

  /* Generate some psychodelic waveforms */
  gst_buffer_map (buffer, &map, GST_MAP_WRITE);
  raw = (gint16 *)map.data;
  data->c += data->d;
  data->d -= data->c / 1000;
  freq = 1100 + 1000 * data->d;
  for (i = 0; i < num_samples; i++) {
    data->a += data->b;
    data->b -= data->a / freq;
    raw[i] = (gint16)(500 * data->a);
  }
  gst_buffer_unmap (buffer, &map);
  data->num_samples += num_samples;

  /* Push the buffer into the appsrc */
  g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

  /* Free the buffer now that we are done with it */
  gst_buffer_unref (buffer);

  if (ret != GST_FLOW_OK) {
    /* We got some error, stop sending data */
    return FALSE;
  }

  return TRUE;
}

/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
  if (data->sourceid == 0) {
    g_print ("Start feeding\n");
    data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
  }
}

/* This callback triggers when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
  if (data->sourceid != 0) {
    g_print ("Stop feeding\n");
    g_source_remove (data->sourceid);
    data->sourceid = 0;
  }
}

/* The appsink has received a buffer */
static GstFlowReturn new_sample (GstElement *sink, CustomData *data) {
  GstSample *sample;

  /* Retrieve the buffer */
  g_signal_emit_by_name (sink, "pull-sample", &sample);
  if (sample) {
    /* The only thing we do in this example is print a * to indicate a received buffer */
    g_print ("*");
    gst_sample_unref (sample);
    return GST_FLOW_OK;
  }

  return GST_FLOW_ERROR;
}

/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;

  /* Print error details on the screen */
  gst_message_parse_error (msg, &err, &debug_info);
  g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
  g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
  g_clear_error (&err);
  g_free (debug_info);

  g_main_loop_quit (data->main_loop);
}

int main(int argc, char *argv[]) {
  CustomData data;
  GstPad *tee_audio_pad, *tee_video_pad, *tee_app_pad;
  GstPad *queue_audio_pad, *queue_video_pad, *queue_app_pad;
  GstAudioInfo info;
  GstCaps *audio_caps;
  GstBus *bus;

  /* Initialize custom data structure */
  memset (&data, 0, sizeof (data));
  data.b = 1; /* For waveform generation */
  data.d = 1;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.app_source = gst_element_factory_make ("appsrc", "audio_source");
  data.tee = gst_element_factory_make ("tee", "tee");
  data.audio_queue = gst_element_factory_make ("queue", "audio_queue");
  data.audio_convert1 = gst_element_factory_make ("audioconvert", "audio_convert1");
  data.audio_resample = gst_element_factory_make ("audioresample", "audio_resample");
  data.audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
  data.video_queue = gst_element_factory_make ("queue", "video_queue");
  data.audio_convert2 = gst_element_factory_make ("audioconvert", "audio_convert2");
  data.visual = gst_element_factory_make ("wavescope", "visual");
  data.video_convert = gst_element_factory_make ("videoconvert", "video_convert");
  data.video_sink = gst_element_factory_make ("autovideosink", "video_sink");
  data.app_queue = gst_element_factory_make ("queue", "app_queue");
  data.app_sink = gst_element_factory_make ("appsink", "app_sink");

  /* Create the empty pipeline */
  data.pipeline = gst_pipeline_new ("test-pipeline");

  if (!data.pipeline || !data.app_source || !data.tee || !data.audio_queue || !data.audio_convert1 ||
      !data.audio_resample || !data.audio_sink || !data.video_queue || !data.audio_convert2 || !data.visual ||
      !data.video_convert || !data.video_sink || !data.app_queue || !data.app_sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Configure wavescope */
  g_object_set (data.visual, "shader", 0, "style", 0, NULL);

  /* Configure appsrc */
  gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
  audio_caps = gst_audio_info_to_caps (&info);
  g_object_set (data.app_source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL);
  g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
  g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);

  /* Configure appsink */
  g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
  g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
  gst_caps_unref (audio_caps);

  /* Link all elements that can be automatically linked because they have "Always" pads */
  gst_bin_add_many (GST_BIN (data.pipeline), data.app_source, data.tee, data.audio_queue, data.audio_convert1, data.audio_resample,
      data.audio_sink, data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, data.app_queue,
      data.app_sink, NULL);
  if (gst_element_link_many (data.app_source, data.tee, NULL) != TRUE ||
      gst_element_link_many (data.audio_queue, data.audio_convert1, data.audio_resample, data.audio_sink, NULL) != TRUE ||
      gst_element_link_many (data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, NULL) != TRUE ||
      gst_element_link_many (data.app_queue, data.app_sink, NULL) != TRUE) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Manually link the Tee, which has "Request" pads */
  tee_audio_pad = gst_element_get_request_pad (data.tee, "src_%u");
  g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
  queue_audio_pad = gst_element_get_static_pad (data.audio_queue, "sink");
  tee_video_pad = gst_element_get_request_pad (data.tee, "src_%u");
  g_print ("Obtained request pad %s for video branch.\n", gst_pad_get_name (tee_video_pad));
  queue_video_pad = gst_element_get_static_pad (data.video_queue, "sink");
  tee_app_pad = gst_element_get_request_pad (data.tee, "src_%u");
  g_print ("Obtained request pad %s for app branch.\n", gst_pad_get_name (tee_app_pad));
  queue_app_pad = gst_element_get_static_pad (data.app_queue, "sink");
  if (gst_pad_link (tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK ||
      gst_pad_link (tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK ||
      gst_pad_link (tee_app_pad, queue_app_pad) != GST_PAD_LINK_OK) {
    g_printerr ("Tee could not be linked\n");
    gst_object_unref (data.pipeline);
    return -1;
  }
  gst_object_unref (queue_audio_pad);
  gst_object_unref (queue_video_pad);
  gst_object_unref (queue_app_pad);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.pipeline);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
  gst_object_unref (bus);

  /* Start playing the pipeline */
  gst_element_set_state (data.pipeline, GST_STATE_PLAYING);

  /* Create a GLib Main Loop and set it to run */
  data.main_loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (data.main_loop);

  /* Release the request pads from the Tee, and unref them */
  gst_element_release_request_pad (data.tee, tee_audio_pad);
  gst_element_release_request_pad (data.tee, tee_video_pad);
  gst_element_release_request_pad (data.tee, tee_app_pad);
  gst_object_unref (tee_audio_pad);
  gst_object_unref (tee_video_pad);
  gst_object_unref (tee_app_pad);

  /* Free resources */
  gst_element_set_state (data.pipeline, GST_STATE_NULL);
  gst_object_unref (data.pipeline);
  return 0;
}

Walkthrough

Most of this code was already explained in chapter 7; here we focus on the parts related to appsrc and appsink.

The code that creates the pipeline is an enlarged version of Basic tutorial 7: Multithreading and Pad Availability. It involves instantiating all the elements, linking the elements that have Always Pads, and manually linking the Request Pads of the tee element.

Regarding the configuration of the appsrc and appsink elements:

/* Configure appsrc */
gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
audio_caps = gst_audio_info_to_caps (&info);
g_object_set (data.app_source, "caps", audio_caps, NULL);
g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);

This sets the caps on appsrc's pad and connects the need-data and enough-data signals. The first line builds a GstAudioInfo describing the desired format, the second converts it into caps, the third sets those caps on appsrc, and the last two register the signal callbacks that fire when appsrc runs low on data or has too much.

The first property that needs to be set on the appsrc is caps. It specifies the kind of data that the element is going to produce, so GStreamer can check if linking with downstream elements is possible (this is, if the downstream elements will understand this kind of data). This property must be a GstCaps object, which is easily built from a string with gst_caps_from_string().
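
For instance, a hedged alternative to the gst_audio_info_to_caps() call used above would be to build equivalent caps directly from a string (this sketch assumes a little-endian host, where GST_AUDIO_FORMAT_S16 corresponds to "S16LE"; gst_audio_info_to_caps() picks the right variant automatically):

/* Same configuration as above, but building the caps from a string */
GstCaps *audio_caps =
    gst_caps_from_string ("audio/x-raw,format=S16LE,layout=interleaved,rate=44100,channels=1");
g_object_set (data.app_source, "caps", audio_caps, NULL);
gst_caps_unref (audio_caps);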

We then connect to the need-data and enough-data signals. These are fired by appsrc when its internal queue of data is running low or almost full, respectively. We will use these signals to start and stop (respectively) our signal generation process.

/* Configure appsink */
g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
gst_caps_unref (audio_caps);

For the appsink we only need to connect to the new-sample signal, whose callback fires whenever data arrives. Signal emission is disabled by default and has to be enabled through the emit-signals property.

Regarding the appsink configuration, we connect to the new-sample signal, which is emitted every time the sink receives a buffer. Also, the signal emission needs to be enabled through the emit-signals property, because, by default, it is disabled.
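
Optionally (this is not part of the tutorial code), appsink's internal queue can also be bounded through its max-buffers and drop properties, so that a slow application cannot stall the pipeline:

/* Keep at most 16 samples queued inside appsink, dropping old ones if the
 * application does not pull fast enough (optional tweak, not in the tutorial) */
g_object_set (data.app_sink, "max-buffers", 16, "drop", TRUE, NULL);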

Starting the pipeline, waiting for messages and final cleanup is done as usual. Let's review the callbacks we have just registered:

/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
  if (data->sourceid == 0) {
    g_print ("Start feeding\n");
    data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
  }
}

start_feed is triggered when appsrc is running low on data. It does not push anything itself; it only registers an idle callback (similar to a timer that fires whenever the main loop has nothing else to do), so that push_data does the actual feeding of appsrc. The return value of g_idle_add() is stored in data->sourceid (it is guaranteed to be non-zero).

This function is called when the internal queue of appsrc is about to starve (run out of data). The only thing we do here is register a GLib idle function with g_idle_add() that feeds data to appsrc until it is full again. A GLib idle function is a method that GLib will call from its main loop whenever it is “idle”, this is, when it has no higher-priority tasks to perform. It requires a GLib GMainLoop to be instantiated and running, obviously.
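
To illustrate the idle-source mechanics in isolation, here is a tiny standalone GLib sketch (not GStreamer-specific): the callback keeps being scheduled while it returns TRUE and is removed as soon as it returns FALSE, which is also why push_data returns FALSE to stop itself on errors:

#include <glib.h>

/* Called whenever the main loop has nothing better to do */
static gboolean tick (gpointer user_data) {
  static int count = 0;
  g_print ("idle tick %d\n", ++count);
  if (count == 5) {
    g_main_loop_quit ((GMainLoop *) user_data);
    return G_SOURCE_REMOVE;   /* FALSE: the idle source removes itself */
  }
  return G_SOURCE_CONTINUE;   /* TRUE: keep being called */
}

int main (void) {
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);
  g_idle_add (tick, loop);
  g_main_loop_run (loop);     /* returns after the fifth tick */
  g_main_loop_unref (loop);
  return 0;
}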

This is only one of the multiple approaches that appsrc allows. In particular, buffers do not need to be fed into appsrc from the main thread using GLib, and you do not need to use the need-data and enough-data signals to synchronize with appsrc (although this is allegedly the most convenient).

We take note of the sourceid that g_idle_add() returns, so we can disable it later.
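
As mentioned above, appsrc does not have to be fed from the GLib main loop at all. Here is a rough sketch of the alternative, feeding from a dedicated thread through the gstreamer-app API (the fill_chunk() helper is hypothetical, and the program would link against gstreamer-app-1.0):

#include <gst/app/gstappsrc.h>

static gpointer feeder_thread (gpointer user_data) {
  CustomData *data = user_data;

  /* With "block" enabled, the push call simply blocks while appsrc's internal
   * queue is full, so the need-data/enough-data signals are not needed */
  g_object_set (data->app_source, "block", TRUE, NULL);

  while (TRUE) {
    GstBuffer *buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);
    /* fill_chunk (buffer, data);   hypothetical helper: write samples and timestamps */
    if (gst_app_src_push_buffer (GST_APP_SRC (data->app_source), buffer) != GST_FLOW_OK)
      break;   /* push_buffer takes ownership of the buffer, even on error */
  }
  return NULL;
}

Such a thread could be started, for example, with g_thread_new ("feeder", feeder_thread, &data) once the pipeline is PLAYING.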

/* This callback triggers when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
  if (data->sourceid != 0) {
    g_print ("Stop feeding\n");
    g_source_remove (data->sourceid);
    data->sourceid = 0;
  }
}

stop_feed is triggered when appsrc has enough data. Here we remove the previously registered idle source, using the ID returned by g_idle_add() and stored in data->sourceid.

This function is called when the internal queue of appsrc is full enough so we stop pushing data. Here we simply remove the idle function by using g_source_remove() (The idle function is implemented as a GSource).

/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
 * and is removed when appsrc has enough data (enough-data signal).
 */
static gboolean push_data (CustomData *data) {
  GstBuffer *buffer;
  GstFlowReturn ret;
  int i;
  GstMapInfo map;
  gint16 *raw;
  gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
  gfloat freq;

  /* Create a new empty buffer */
  buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);

  /* Set its timestamp and duration */
  GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (num_samples, GST_SECOND, SAMPLE_RATE);

  /* Generate some psychodelic waveforms */
  gst_buffer_map (buffer, &map, GST_MAP_WRITE);
  raw = (gint16 *)map.data;

This is the function that actually feeds data to appsrc: it allocates a buffer, fills it with samples, and so on.

This is the function that feeds appsrc. It will be called by GLib at times and rates which are out of our control, but we know that we will disable it when its job is done (when the queue in appsrc is full).

Its first task is to create a new buffer with a given size (in this example, it is arbitrarily set to 1024 bytes) with gst_buffer_new_and_alloc().

We count the number of samples that we have generated so far with the CustomData.num_samples variable, so we can time-stamp this buffer using the GST_BUFFER_TIMESTAMP macro in GstBuffer.

Since we are producing buffers of the same size, their duration is the same and is set using the GST_BUFFER_DURATION in GstBuffer.

gst_util_uint64_scale() is a utility function that scales (multiply and divide) numbers which can be large, without fear of overflows.
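
As a quick sanity check of the numbers used in push_data (SAMPLE_RATE and CHUNK_SIZE are the tutorial's defines), gst_util_uint64_scale (val, num, denom) computes val * num / denom on 64-bit values:

guint64 ts, dur;

/* After 44100 samples at 44100 Hz, the next buffer starts exactly at 1 second */
ts = gst_util_uint64_scale (44100, GST_SECOND, SAMPLE_RATE);
g_assert (ts == GST_SECOND);

/* Each 512-sample chunk (1024 bytes of 16-bit audio) lasts 512/44100 s, about 11.6 ms */
dur = gst_util_uint64_scale (512, GST_SECOND, SAMPLE_RATE);
g_print ("chunk duration: %" GST_TIME_FORMAT "\n", GST_TIME_ARGS (dur));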

To access the bytes of the buffer, we first map it with gst_buffer_map(), which gives us a data pointer and a size inside a GstMapInfo structure, and unmap it with gst_buffer_unmap() when we are done (be careful not to write past the end of the buffer: you allocated it, so you know its size).

We will skip over the waveform generation, since it is outside the scope of this tutorial (it is simply a funny way of generating a pretty psychedelic wave).

/* Push the buffer into the appsrc */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

/* Free the buffer now that we are done with it */
gst_buffer_unref (buffer);

The buffer is handed over to appsrc through the push-buffer action signal.

Once we have the buffer ready, we pass it to appsrc with the push-buffer action signal (see information box at the end of Playback tutorial 1: Playbin usage), and then gst_buffer_unref() it since we no longer need it.

/* The appsink has received a buffer */
static GstFlowReturn new_sample (GstElement *sink, CustomData *data) {
  GstSample *sample;
  /* Retrieve the buffer */
  g_signal_emit_by_name (sink, "pull-sample", &sample);
  if (sample) {
    /* The only thing we do in this example is print a * to indicate a received buffer */
    g_print ("*");
    gst_sample_unref (sample);
    return GST_FLOW_OK;
  }
  return GST_FLOW_ERROR;
}

This function is triggered when the appsink receives data. Here we can retrieve the received sample (and its size); do not forget to release the sample when done with it.

Finally, this is the function that gets called when the appsink receives a buffer. We use the pull-sample action signal to retrieve the sample and then just print some indicator on the screen. To get at the actual data we would retrieve the GstBuffer from the sample with gst_sample_get_buffer() and map it with gst_buffer_map(), just as on the sending side. Remember that this buffer does not have to match the buffer that we produced in the push_data function; any element in the path could have altered the buffers in any way (not in this example: there is only a tee in the path between appsrc and appsink, and it does not change the content of the buffers).

We then gst_sample_unref() the retrieved sample, and this tutorial is done.
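
If the application actually needed the audio data, a hedged variant of the callback (not in the original tutorial) could map the buffer contained in the sample, mirroring the gst_buffer_map() call on the sending side:

/* Sketch: inspect the received audio instead of just printing '*' */
static GstFlowReturn new_sample_inspect (GstElement *sink, CustomData *data) {
  GstSample *sample;
  g_signal_emit_by_name (sink, "pull-sample", &sample);
  if (sample) {
    GstBuffer *buffer = gst_sample_get_buffer (sample);  /* still owned by the sample */
    GstMapInfo map;
    if (gst_buffer_map (buffer, &map, GST_MAP_READ)) {
      /* map.data points to map.size bytes of interleaved 16-bit samples */
      g_print ("Received %" G_GSIZE_FORMAT " bytes\n", map.size);
      gst_buffer_unmap (buffer, &map);
    }
    gst_sample_unref (sample);
    return GST_FLOW_OK;
  }
  return GST_FLOW_ERROR;
}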

Conclusion

This tutorial has shown how applications can:

  • Inject data into a pipeline using the appsrc element.
  • Retrieve data from a pipeline using the appsink element.
  • Manipulate this data by accessing the GstBuffer.

In a playbin-based pipeline, the same goals are achieved in a slightly different way. Playback tutorial 3: Short-cutting the pipeline shows how to do it.

It has been a pleasure having you here, and see you soon!
