Short-cutting the pipeline

References:
Basic tutorial 8: Short-cutting the pipeline (link)
GStreamer Basic Tutorial 08: Short-cutting the pipeline (link)
In this post (gstreamer-1.0 version 1.14.5; glib-2.0 version 2.56.4), the parts of both articles' code that would not run with these versions have been modified, and detailed comments have been added.
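To build either program below, the usual GStreamer tutorial command line should work (a hedged sketch: the file name basic-tutorial-8.c is an assumption, and the gstreamer-audio-1.0 package is only needed for the second program, which includes gst/audio/audio.h):

gcc basic-tutorial-8.c -o basic-tutorial-8 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0`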

//
// Created by wxf on 2021/11/3.
// First program: annotated version of the older example, adapted to build against GStreamer 1.0.
//
#include <gst/gst.h>
#include <string.h>

#define CHUNK_SIZE 1024   /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */
/* 1.0-style caps string: the 0.10 fields (signed/width/depth/endianness) are replaced by "format".
 * S16LE assumes a little-endian host. */
#define AUDIO_CAPS "audio/x-raw,format=S16LE,layout=interleaved,channels=1,rate=%d"

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
    GstElement *pipeline, *app_source, *tee, *audio_queue, *audio_convert1, *audio_resample, *audio_sink;
    GstElement *video_queue, *audio_convert2, *visual, *video_convert, *video_sink;
    GstElement *app_queue, *app_sink;

    guint64 num_samples;   /* Number of samples generated so far (for timestamp generation) */
    gfloat a, b, c, d;     /* For waveform generation */

    guint sourceid;        /* To control the GSource */

    GMainLoop *main_loop;  /* GLib's Main Loop */
} CustomData;

/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
 * and is removed when appsrc has enough data (enough-data signal).
 */
/*
 * This is the function that feeds appsrc. GLib calls it at a time and rate we do not control,
 * but we know that we will disable it when its job is done (when the internal queue of appsrc
 * is full).
 *
 * Its first task is to create a new buffer of the given size (1024 bytes in this example) with
 * gst_buffer_new_and_alloc(). We keep a count of the number of samples generated so far in
 * CustomData.num_samples, so we can timestamp the buffer with the GST_BUFFER_TIMESTAMP macro.
 *
 * Since we produce buffers of the same size, their duration is also the same and is set with the
 * GST_BUFFER_DURATION macro. gst_util_uint64_scale() is a utility function that scales
 * (multiplies and divides) numbers that can be large, without fear of overflow (a worked example
 * follows right after this comment).
 *
 * In GStreamer 0.10 the bytes of the buffer were accessed through GST_BUFFER_DATA; in 1.0 the
 * buffer has to be mapped with gst_buffer_map() before writing to it (take care not to write
 * past the end of the buffer: you allocated it, so you know its size).
 */
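/* As a worked example of the scaling below (illustration only): each chunk carries
 * CHUNK_SIZE / 2 = 512 samples, so
 *   GST_BUFFER_DURATION = gst_util_uint64_scale (512, GST_SECOND, 44100)
 *                       = 512 * 1000000000 / 44100 ≈ 11609977 ns ≈ 11.6 ms,
 * evaluated without overflow even though the intermediate product does not fit in 32 bits. */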
static gboolean push_data (CustomData *data) {
    GstBuffer *buffer;
    GstFlowReturn ret;
    int i;
    GstMapInfo map;
    gint16 *raw;
    gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
    gfloat freq;

    /* Create a new empty buffer */
    buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);

    /* Set its timestamp and duration */
    GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
    GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (num_samples, GST_SECOND, SAMPLE_RATE); /* num_samples, not CHUNK_SIZE: each sample is 2 bytes */

    /* Generate some psychedelic waveforms */
    /* GST_BUFFER_DATA() from 0.10 no longer exists in 1.0 (and GST_BUFFER_DTS() is a timestamp,
     * not a data pointer), so the buffer must be mapped for writing instead */
    gst_buffer_map (buffer, &map, GST_MAP_WRITE);
    raw = (gint16 *)map.data;
    data->c += data->d;
    data->d -= data->c / 1000;
    freq = 1100 + 1000 * data->d;
    for (i = 0; i < num_samples; i++) {
        data->a += data->b;
        data->b -= data->a / freq;
        raw[i] = (gint16)(500 * data->a);
    }
    gst_buffer_unmap (buffer, &map);
    data->num_samples += num_samples;

    /* Push the buffer into the appsrc */
    /* Once the buffer is ready, we pass it to appsrc with the push-buffer action signal
     * (see the information box at the end of Playback tutorial 1: Playbin usage), and then
     * gst_buffer_unref() it, since we no longer need it. */
    g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

    /* Free the buffer now that we are done with it */
    gst_buffer_unref (buffer);

    if (ret != GST_FLOW_OK) {
        /* We got some error, stop sending data */
        return FALSE;
    }

    return TRUE;
}

/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
/*
 * This function is called when the internal queue of appsrc is about to starve (run out of data).
 * The only thing we do here is register a GLib idle function with g_idle_add(); that function
 * feeds data into appsrc until its queue is full again. A GLib idle function is a method that
 * GLib calls from its main loop whenever it is "idle", i.e. when it has no higher-priority task
 * to perform. Obviously it requires a GMainLoop to be instantiated and running.
 *
 * This is only one of the approaches appsrc allows. In particular, buffers do not need to be fed
 * into appsrc from the main thread using GLib, and you do not need to use the need-data and
 * enough-data signals to synchronize with appsrc (although this is allegedly the most convenient
 * way); a sketch of the thread-based alternative follows this comment. We take note of the
 * sourceid returned by g_idle_add(), so that stop_feed can disable it later.
 */
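/* A minimal sketch of the thread-based alternative mentioned above (an illustration under
 * assumptions, not part of this example): using the gst-app library (gst/app/gstappsrc.h,
 * linked via gstreamer-app-1.0), a producer thread could push buffers directly:
 *
 *   GstFlowReturn flow = gst_app_src_push_buffer (GST_APP_SRC (data->app_source), buffer);
 *   // gst_app_src_push_buffer() takes ownership of the buffer, so no unref here.
 *
 * In that model the need-data / enough-data signals become optional back-pressure hints. */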
static void start_feed (GstElement *source, guint size, CustomData *data) {
    if (data->sourceid == 0) {
        g_print ("Start feeding\n");
        data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
    }
}

/* This callback triggers when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
/*
 * This function is called when the internal queue of appsrc is full enough, so we stop pushing
 * data. Here we simply remove the idle function with g_source_remove() (idle functions are
 * implemented as GSources).
 */
static void stop_feed (GstElement *source, CustomData *data) {
    if (data->sourceid != 0) {
        g_print ("Stop feeding\n");
        g_source_remove (data->sourceid);
        data->sourceid = 0;
    }
}

///* The appsink has received a buffer */
//static GstFlowReturn new_buffer (GstElement *sink, CustomData *data) {
//    GstBuffer *buffer;
//
//    /* Retrieve the buffer */
//    g_signal_emit_by_name (sink, "pull-buffer", &buffer);
//    if (buffer) {
//        /* The only thing we do in this example is print a * to indicate a received buffer */
//        g_print ("*");
//        gst_buffer_unref (buffer);
//    }
//    return GST_FLOW_CUSTOM_SUCCESS;
//}
/*
 * This function is called when appsink receives a buffer. We use the pull-sample action signal
 * to retrieve the sample and, since this is just an example, we only print a marker on screen.
 * (The commented-out new_buffer function above is the 0.10-era version, which used the
 * pull-buffer signal and the GST_BUFFER_DATA / GST_BUFFER_SIZE macros; in 1.0 the GstBuffer is
 * obtained from the sample with gst_sample_get_buffer() and its data is accessed with
 * gst_buffer_map(), as sketched after this comment.)
 *
 * Keep in mind that the buffer arriving here does not have to match the buffer produced in
 * push_data exactly: any element in the path could have altered it in some way. (In this example
 * only a tee sits between appsrc and appsink, so the buffers are unchanged.)
 */
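/* A minimal sketch (illustration only) of how the received data could be inspected in 1.0:
 *
 *   GstBuffer *buf = gst_sample_get_buffer (sample);   // owned by the sample, do not unref
 *   GstMapInfo info;
 *   if (buf && gst_buffer_map (buf, &info, GST_MAP_READ)) {
 *     g_print ("received %" G_GSIZE_FORMAT " bytes\n", info.size);
 *     gst_buffer_unmap (buf, &info);
 *   }
 */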
static GstFlowReturn new_sample (GstElement *sink, CustomData *data) {
    GstSample *sample;

    /* Retrieve the buffer */
    g_signal_emit_by_name (sink, "pull-sample", &sample);
    if (sample) {
        /* The only thing we do in this example is print a * to indicate a received buffer */
        g_print ("*");
        gst_sample_unref (sample);
        return GST_FLOW_OK;
    }

    return GST_FLOW_ERROR;
}
/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
    GError *err;
    gchar *debug_info;

    /* Print error details on the screen */
    gst_message_parse_error (msg, &err, &debug_info);
    g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
    g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
    g_clear_error (&err);
    g_free (debug_info);

    g_main_loop_quit (data->main_loop);
}



int main(int argc, char *argv[]) {
    CustomData data;
    GstPadTemplate *tee_src_pad_template;
    GstPad *tee_audio_pad, *tee_video_pad, *tee_app_pad;
    GstPad *queue_audio_pad, *queue_video_pad, *queue_app_pad;
    gchar *audio_caps_text;
    GstCaps *audio_caps;
    GstBus *bus;

    /* Initialize custom data structure */
    memset (&data, 0, sizeof (data));
    data.b = 1; /* For waveform generation */
    data.d = 1;

    /* Initialize GStreamer */
    gst_init (&argc, &argv);

    /* Create the elements */
    /* Instantiate all the elements; the elements with Always Pads are linked automatically below,
     * while the Request Pads of the tee element are linked manually */
    data.app_source = gst_element_factory_make ("appsrc", "audio_source");
    data.tee = gst_element_factory_make ("tee", "tee");
    data.audio_queue = gst_element_factory_make ("queue", "audio_queue");
    data.audio_convert1 = gst_element_factory_make ("audioconvert", "audio_convert1");
    data.audio_resample = gst_element_factory_make ("audioresample", "audio_resample");
    data.audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
    data.video_queue = gst_element_factory_make ("queue", "video_queue");
    data.audio_convert2 = gst_element_factory_make ("audioconvert", "audio_convert2");
    data.visual = gst_element_factory_make ("wavescope", "visual");
    data.video_convert = gst_element_factory_make ("videoconvert", "csp");
    data.video_sink = gst_element_factory_make ("autovideosink", "video_sink");
    data.app_queue = gst_element_factory_make ("queue", "app_queue");
    data.app_sink = gst_element_factory_make ("appsink", "app_sink");

    /* Create the empty pipeline */
    data.pipeline = gst_pipeline_new ("test-pipeline");

    if (!data.pipeline || !data.app_source || !data.tee || !data.audio_queue || !data.audio_convert1 ||
        !data.audio_resample || !data.audio_sink || !data.video_queue || !data.audio_convert2 || !data.visual ||
        !data.video_convert || !data.video_sink || !data.app_queue || !data.app_sink) {
        g_printerr ("Not all elements could be created.\n");
        return -1;
    }

    /* Configure wavescope */
    g_object_set (data.visual, "shader", 0, "style", 1, NULL);

    /* Configure appsrc */
    audio_caps_text = g_strdup_printf (AUDIO_CAPS, SAMPLE_RATE);
    audio_caps = gst_caps_from_string (audio_caps_text);

    /*
     * The first property we set on appsrc is caps. It specifies the kind of data the element is
     * going to produce, so GStreamer can check whether linking with downstream elements is
     * possible (i.e. whether the downstream elements will understand this kind of data). The
     * property must be a GstCaps object, which is easily built from a string with
     * gst_caps_from_string().
     *
     * We then connect to the need-data and enough-data signals. They are fired by appsrc when its
     * internal queue of data is running low or is almost full, respectively. We use these signals
     * to start and stop (respectively) our signal-generation process.
     */
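    /* For concreteness (assuming the AUDIO_CAPS definition above), with SAMPLE_RATE = 44100 the
     * string expands to: audio/x-raw,format=S16LE,layout=interleaved,channels=1,rate=44100 */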
    g_object_set (data.app_source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL); /* format=time, as in the 1.0 tutorial, since we timestamp the buffers ourselves */
    g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
    g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);

    /* Configure appsink */
    /*
     Element Signals:
  "eos" :  void user_function (GstElement* object,
                               gpointer user_data);
  "new-preroll" :  GstFlowReturn user_function (GstElement* object,
                                                gpointer user_data);
  "new-sample" :  GstFlowReturn user_function (GstElement* object,
                                               gpointer user_data);

Element Actions:
  "pull-preroll" :  GstSample * user_function (GstElement* object);
  "pull-sample" :  GstSample * user_function (GstElement* object);
  "try-pull-preroll" :  GstSample * user_function (GstElement* object,
                                                   guint64 arg0);
  "try-pull-sample" :  GstSample * user_function (GstElement* object,
                                                  guint64 arg0);

     */
    /*
     * Regarding the appsink configuration, we connect to the new-sample signal, which is emitted
     * every time the sink receives a buffer. Signal emission also has to be enabled through the
     * emit-signals property, because it is disabled by default.
     */
    g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
    g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
    gst_caps_unref (audio_caps);
    g_free (audio_caps_text);

    /* Link all elements that can be automatically linked because they have "Always" pads */
    gst_bin_add_many (GST_BIN (data.pipeline), data.app_source, data.tee, data.audio_queue, data.audio_convert1, data.audio_resample,
                      data.audio_sink, data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, data.app_queue,
                      data.app_sink, NULL);
    if (gst_element_link_many (data.app_source, data.tee, NULL) != TRUE ||
        gst_element_link_many (data.audio_queue, data.audio_convert1, data.audio_resample, data.audio_sink, NULL) != TRUE ||
        gst_element_link_many (data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, NULL) != TRUE ||
        gst_element_link_many (data.app_queue, data.app_sink, NULL) != TRUE) {
        g_printerr ("Elements could not be linked.\n");
        gst_object_unref (data.pipeline);
        return -1;
    }

    /* Manually link the Tee, which has "Request" pads */
    tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (data.tee), "src_%u");
    tee_audio_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
    g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
    queue_audio_pad = gst_element_get_static_pad (data.audio_queue, "sink");

    tee_video_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
    g_print ("Obtained request pad %s for video branch.\n", gst_pad_get_name (tee_video_pad));
    queue_video_pad = gst_element_get_static_pad (data.video_queue, "sink");

    tee_app_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
    g_print ("Obtained request pad %s for app branch.\n", gst_pad_get_name (tee_app_pad));
    queue_app_pad = gst_element_get_static_pad (data.app_queue, "sink");
    if (gst_pad_link (tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK ||
        gst_pad_link (tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK ||
        gst_pad_link (tee_app_pad, queue_app_pad) != GST_PAD_LINK_OK) {
        g_printerr ("Tee could not be linked\n");
        gst_object_unref (data.pipeline);
        return -1;
    }
    gst_object_unref (queue_audio_pad);
    gst_object_unref (queue_video_pad);
    gst_object_unref (queue_app_pad);

    /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
    bus = gst_element_get_bus (data.pipeline);
    gst_bus_add_signal_watch (bus);
    g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
    gst_object_unref (bus);

    /* Start playing the pipeline */
    gst_element_set_state (data.pipeline, GST_STATE_PLAYING);

    /* Create a GLib Main Loop and set it to run */
    data.main_loop = g_main_loop_new (NULL, FALSE);
    g_main_loop_run (data.main_loop);

    /* Release the request pads from the Tee, and unref them */
    gst_element_release_request_pad (data.tee, tee_audio_pad);
    gst_element_release_request_pad (data.tee, tee_video_pad);
    gst_element_release_request_pad (data.tee, tee_app_pad);
    gst_object_unref (tee_audio_pad);
    gst_object_unref (tee_video_pad);
    gst_object_unref (tee_app_pad);

    /* Free resources */
    gst_element_set_state (data.pipeline, GST_STATE_NULL);
    gst_object_unref (data.pipeline);
    return 0;
}

//
// Created by wxf on 2021/11/3.
// Second program: the same example written against the GStreamer 1.0 API throughout
// (GstAudioInfo-based caps, gst_buffer_map, pull-sample).
//
#include <gst/gst.h>
#include <gst/audio/audio.h>
#include <string.h>

#define CHUNK_SIZE 1024   /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
    GstElement *pipeline, *app_source, *tee, *audio_queue, *audio_convert1, *audio_resample, *audio_sink;
    GstElement *video_queue, *audio_convert2, *visual, *video_convert, *video_sink;
    GstElement *app_queue, *app_sink;

    guint64 num_samples;   /* Number of samples generated so far (for timestamp generation) */
    gfloat a, b, c, d;     /* For waveform generation */

    guint sourceid;        /* To control the GSource */

    GMainLoop *main_loop;  /* GLib's Main Loop */
} CustomData;

/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
 * and is removed when appsrc has enough data (enough-data signal).
 */
static gboolean push_data (CustomData *data) {
    GstBuffer *buffer;
    GstFlowReturn ret;
    int i;
    GstMapInfo map;
    gint16 *raw;
    gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
    gfloat freq;

    /* Create a new empty buffer */
    buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);

    /* Set its timestamp and duration */
    GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
    GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (num_samples, GST_SECOND, SAMPLE_RATE);

    /* Generate some psychedelic waveforms */
    gst_buffer_map (buffer, &map, GST_MAP_WRITE);
    raw = (gint16 *)map.data;
    data->c += data->d;
    data->d -= data->c / 1000;
    freq = 1100 + 1000 * data->d;
    for (i = 0; i < num_samples; i++) {
        data->a += data->b;
        data->b -= data->a / freq;
        raw[i] = (gint16)(500 * data->a);
    }
    gst_buffer_unmap (buffer, &map);
    data->num_samples += num_samples;

    /* Push the buffer into the appsrc */
    g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

    /* Free the buffer now that we are done with it */
    gst_buffer_unref (buffer);

    if (ret != GST_FLOW_OK) {
        /* We got some error, stop sending data */
        return FALSE;
    }

    return TRUE;
}

/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
    if (data->sourceid == 0) {
        g_print ("Start feeding\n");
        data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
    }
}

/* This callback triggers when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
    if (data->sourceid != 0) {
        g_print ("Stop feeding\n");
        g_source_remove (data->sourceid);
        data->sourceid = 0;
    }
}

/* The appsink has received a buffer */
static GstFlowReturn new_sample (GstElement *sink, CustomData *data) {
    GstSample *sample;

    /* Retrieve the buffer */
    g_signal_emit_by_name (sink, "pull-sample", &sample);
    if (sample) {
        /* The only thing we do in this example is print a * to indicate a received buffer */
        g_print ("*");
        gst_sample_unref (sample);
        return GST_FLOW_OK;
    }

    return GST_FLOW_ERROR;
}

/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
    GError *err;
    gchar *debug_info;

    /* Print error details on the screen */
    gst_message_parse_error (msg, &err, &debug_info);
    g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
    g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
    g_clear_error (&err);
    g_free (debug_info);

    g_main_loop_quit (data->main_loop);
}

int main(int argc, char *argv[]) {
    CustomData data;
    GstPad *tee_audio_pad, *tee_video_pad, *tee_app_pad;
    GstPad *queue_audio_pad, *queue_video_pad, *queue_app_pad;
    GstAudioInfo info;
    GstCaps *audio_caps;
    GstBus *bus;

    /* Initialize custom data structure */
    memset (&data, 0, sizeof (data));
    data.b = 1; /* For waveform generation */
    data.d = 1;

    /* Initialize GStreamer */
    gst_init (&argc, &argv);

    /* Create the elements */
    data.app_source = gst_element_factory_make ("appsrc", "audio_source");
    data.tee = gst_element_factory_make ("tee", "tee");
    data.audio_queue = gst_element_factory_make ("queue", "audio_queue");
    data.audio_convert1 = gst_element_factory_make ("audioconvert", "audio_convert1");
    data.audio_resample = gst_element_factory_make ("audioresample", "audio_resample");
    data.audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
    data.video_queue = gst_element_factory_make ("queue", "video_queue");
    data.audio_convert2 = gst_element_factory_make ("audioconvert", "audio_convert2");
    data.visual = gst_element_factory_make ("wavescope", "visual");
    data.video_convert = gst_element_factory_make ("videoconvert", "video_convert");
    data.video_sink = gst_element_factory_make ("autovideosink", "video_sink");
    data.app_queue = gst_element_factory_make ("queue", "app_queue");
    data.app_sink = gst_element_factory_make ("appsink", "app_sink");

    /* Create the empty pipeline */
    data.pipeline = gst_pipeline_new ("test-pipeline");

    if (!data.pipeline || !data.app_source || !data.tee || !data.audio_queue || !data.audio_convert1 ||
        !data.audio_resample || !data.audio_sink || !data.video_queue || !data.audio_convert2 || !data.visual ||
        !data.video_convert || !data.video_sink || !data.app_queue || !data.app_sink) {
        g_printerr ("Not all elements could be created.\n");
        return -1;
    }

    /* Configure wavescope */
    g_object_set (data.visual, "shader", 0, "style", 0, NULL);

    /* Configure appsrc */
    gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
    audio_caps = gst_audio_info_to_caps (&info);
    g_object_set (data.app_source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL);
    g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
    g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);

    /* Configure appsink */
    g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
    g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
    gst_caps_unref (audio_caps);

    /* Link all elements that can be automatically linked because they have "Always" pads */
    gst_bin_add_many (GST_BIN (data.pipeline), data.app_source, data.tee, data.audio_queue, data.audio_convert1, data.audio_resample,
                      data.audio_sink, data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, data.app_queue,
                      data.app_sink, NULL);
    if (gst_element_link_many (data.app_source, data.tee, NULL) != TRUE ||
        gst_element_link_many (data.audio_queue, data.audio_convert1, data.audio_resample, data.audio_sink, NULL) != TRUE ||
        gst_element_link_many (data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, NULL) != TRUE ||
        gst_element_link_many (data.app_queue, data.app_sink, NULL) != TRUE) {
        g_printerr ("Elements could not be linked.\n");
        gst_object_unref (data.pipeline);
        return -1;
    }

    /* Manually link the Tee, which has "Request" pads */
    GstPadTemplate * tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (data.tee), "src_%u");
    tee_audio_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
    //tee_audio_pad = gst_element_request_pad_simple(data.tee, "src_%u");
    g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
    queue_audio_pad = gst_element_get_static_pad (data.audio_queue, "sink");
    //tee_video_pad = gst_element_request_pad_simple (data.tee, "src_%u");
    tee_video_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
    g_print ("Obtained request pad %s for video branch.\n", gst_pad_get_name (tee_video_pad));
    queue_video_pad = gst_element_get_static_pad (data.video_queue, "sink");
    //tee_app_pad = gst_element_request_pad_simple (data.tee, "src_%u");
    tee_app_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
    g_print ("Obtained request pad %s for app branch.\n", gst_pad_get_name (tee_app_pad));
    queue_app_pad = gst_element_get_static_pad (data.app_queue, "sink");
    if (gst_pad_link (tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK ||
        gst_pad_link (tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK ||
        gst_pad_link (tee_app_pad, queue_app_pad) != GST_PAD_LINK_OK) {
        g_printerr ("Tee could not be linked\n");
        gst_object_unref (data.pipeline);
        return -1;
    }
    gst_object_unref (queue_audio_pad);
    gst_object_unref (queue_video_pad);
    gst_object_unref (queue_app_pad);

    /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
    bus = gst_element_get_bus (data.pipeline);
    gst_bus_add_signal_watch (bus);
    g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
    gst_object_unref (bus);

    /* Start playing the pipeline */
    gst_element_set_state (data.pipeline, GST_STATE_PLAYING);

    /* Create a GLib Main Loop and set it to run */
    data.main_loop = g_main_loop_new (NULL, FALSE);
    g_main_loop_run (data.main_loop);

    /* Release the request pads from the Tee, and unref them */
    gst_element_release_request_pad (data.tee, tee_audio_pad);
    gst_element_release_request_pad (data.tee, tee_video_pad);
    gst_element_release_request_pad (data.tee, tee_app_pad);
    gst_object_unref (tee_audio_pad);
    gst_object_unref (tee_video_pad);
    gst_object_unref (tee_app_pad);

    /* Free resources */
    gst_element_set_state (data.pipeline, GST_STATE_NULL);
    gst_object_unref (data.pipeline);
    return 0;
}
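When the second program is compiled and run (assuming working audio and video output devices), the expected behaviour, judging from the g_print calls above, is: three "Obtained request pad src_N for ... branch." lines at startup, alternating "Start feeding" / "Stop feeding" messages as the appsrc queue drains and fills, a stream of '*' characters printed by new_sample for every buffer the appsink receives, and a wavescope window rendering the generated tone through autovideosink while autoaudiosink plays it.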
