GStreamer Learning Notes (6) - Basic Tutorial 5: GUI Toolkit Integration

Original: Basic tutorial 5: GUI toolkit integration (gstreamer.freedesktop.org)

Goal

This tutorial shows how to integrate GStreamer into a Graphical User Interface (GUI) toolkit such as GTK+. Basically, GStreamer handles media playback while the GUI toolkit handles user interaction. The most interesting parts are those where the two libraries have to interact: instructing GStreamer to output video to a GTK+ window, and forwarding user actions to GStreamer.

In particular, you will learn:

  • How to tell GStreamer to output video to a particular window (instead of creating its own window).
  • How to continuously refresh the GUI with information from GStreamer.
  • How to update the GUI from GStreamer's multiple threads, an operation forbidden in most GUI toolkits.
  • A mechanism to subscribe only to the messages you are interested in, instead of being notified of all of them.

Overview

We are going to build a media player using the GTK+ toolkit, but the concepts apply to other toolkits like Qt, for example. A minimum knowledge of GTK+ will help understand this tutorial.

The main point is telling GStreamer to output the video to a window of our choice.

A common issue is that GUI toolkits usually only allow manipulating the graphical "widgets" from the main (or application) thread, whereas GStreamer usually spawns multiple threads to take care of different tasks. Calling GTK+ functions from within callbacks will usually fail, because callbacks execute in the calling thread, which does not need to be the main thread. This problem can be solved by posting a message on the GStreamer bus from the callback: the messages will be received by the main thread, which will then react accordingly.

Finally, so far we have registered a handle_message function that was called every time a message appeared on the bus, which forced us to parse every message to see if it was of interest to us. In this tutorial a different method is used that registers a callback for each kind of message, so there is less parsing and less code overall.
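Conceptually, registering a callback per message kind turns the parse-everything handler into a dispatch on the message type. Below is a minimal plain-C sketch of that idea; the names (`dispatch`, `entry`, `on_eos`) are hypothetical and this is not the GStreamer API — the real mechanism (gst_bus_add_signal_watch() and message::detail signals) appears later in the code.

```c
#include <stddef.h>
#include <string.h>

typedef void (*handler_fn) (void *user_data);

/* Hypothetical dispatch table: each entry pairs a message name with the
 * callback registered for it. */
typedef struct
{
  const char *name;
  handler_fn fn;
} entry;

/* Example handler: flag that end-of-stream was seen */
static void on_eos (void *user_data)
{
  *(int *) user_data = 1;
}

/* Invoke the handler registered for `name`; messages nobody registered
 * for are simply ignored, which is what per-type subscription buys us. */
static int dispatch (const entry *table, size_t n, const char *name, void *data)
{
  size_t i;
  for (i = 0; i < n; i++) {
    if (strcmp (table[i].name, name) == 0) {
      table[i].fn (data);
      return 1;                 /* handled */
    }
  }
  return 0;                     /* no handler registered */
}
```

With this shape, adding interest in a new message kind means adding one table entry, not another branch in a monolithic parser.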

A media player in GTK+

Let's write a very simple media player based on playbin, this time with a GUI!

#include <string.h>

#include <gtk/gtk.h>
#include <gst/gst.h>
#include <gst/video/videooverlay.h>

#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData
{
  GstElement *playbin;          /* Our one and only pipeline */

  GtkWidget *slider;            /* Slider widget to keep track of current position */
  GtkWidget *streams_list;      /* Text widget to display info about the streams */
  gulong slider_update_signal_id;       /* Signal ID for the slider update signal */

  GstState state;               /* Current state of the pipeline */
  gint64 duration;              /* Duration of the clip, in nanoseconds */
} CustomData;

/* This function is called when the GUI toolkit creates the physical window that will hold the video.
 * At this point we can retrieve its handle (which has a different meaning depending on the windowing system)
 * and pass it to GStreamer through the GstVideoOverlay interface. */
static void
realize_cb (GtkWidget * widget, CustomData * data)
{
  GdkWindow *window = gtk_widget_get_window (widget);
  guintptr window_handle;

  if (!gdk_window_ensure_native (window))
    g_error ("Couldn't create native window needed for GstVideoOverlay!");

  /* Retrieve window handle from GDK */
#if defined (GDK_WINDOWING_WIN32)
  window_handle = (guintptr) GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
  window_handle = (guintptr) gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
  window_handle = GDK_WINDOW_XID (window);
#endif
  /* Pass it to playbin, which implements GstVideoOverlay and will forward it to the video sink */
  gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin),
      window_handle);
}

/* This function is called when the PLAY button is clicked */
static void
play_cb (GtkButton * button, CustomData * data)
{
  gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}

/* This function is called when the PAUSE button is clicked */
static void
pause_cb (GtkButton * button, CustomData * data)
{
  gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}

/* This function is called when the STOP button is clicked */
static void
stop_cb (GtkButton * button, CustomData * data)
{
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

/* This function is called when the main window is closed */
static void
delete_event_cb (GtkWidget * widget, GdkEvent * event, CustomData * data)
{
  stop_cb (NULL, data);
  gtk_main_quit ();
}

/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
 * rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
 * we simply draw a black rectangle to avoid garbage showing up. */
static gboolean
draw_cb (GtkWidget * widget, cairo_t * cr, CustomData * data)
{
  if (data->state < GST_STATE_PAUSED) {
    GtkAllocation allocation;

    /* Cairo is a 2D graphics library which we use here to clean the video window.
     * It is used by GStreamer for other reasons, so it will always be available to us. */
    gtk_widget_get_allocation (widget, &allocation);
    cairo_set_source_rgb (cr, 0, 0, 0);
    cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
    cairo_fill (cr);
  }

  return FALSE;
}

/* This function is called when the slider changes its position. We perform a seek to the
 * new position here. */
static void
slider_cb (GtkRange * range, CustomData * data)
{
  gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
  gst_element_seek_simple (data->playbin, GST_FORMAT_TIME,
      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
      (gint64) (value * GST_SECOND));
}

/* This creates all the GTK+ widgets that compose our application, and registers the callbacks */
static void
create_ui (CustomData * data)
{
  GtkWidget *main_window;       /* The uppermost window, containing all other windows */
  GtkWidget *video_window;      /* The drawing area where the video will be shown */
  GtkWidget *main_box;          /* VBox to hold main_hbox and the controls */
  GtkWidget *main_hbox;         /* HBox to hold the video_window and the stream info text widget */
  GtkWidget *controls;          /* HBox to hold the buttons and the slider */
  GtkWidget *play_button, *pause_button, *stop_button;  /* Buttons */

  main_window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  g_signal_connect (G_OBJECT (main_window), "delete-event",
      G_CALLBACK (delete_event_cb), data);

  video_window = gtk_drawing_area_new ();
  gtk_widget_set_double_buffered (video_window, FALSE);
  g_signal_connect (video_window, "realize", G_CALLBACK (realize_cb), data);
  g_signal_connect (video_window, "draw", G_CALLBACK (draw_cb), data);

  play_button =
      gtk_button_new_from_icon_name ("media-playback-start",
      GTK_ICON_SIZE_SMALL_TOOLBAR);
  g_signal_connect (G_OBJECT (play_button), "clicked", G_CALLBACK (play_cb),
      data);

  pause_button =
      gtk_button_new_from_icon_name ("media-playback-pause",
      GTK_ICON_SIZE_SMALL_TOOLBAR);
  g_signal_connect (G_OBJECT (pause_button), "clicked", G_CALLBACK (pause_cb),
      data);

  stop_button =
      gtk_button_new_from_icon_name ("media-playback-stop",
      GTK_ICON_SIZE_SMALL_TOOLBAR);
  g_signal_connect (G_OBJECT (stop_button), "clicked", G_CALLBACK (stop_cb),
      data);

  data->slider =
      gtk_scale_new_with_range (GTK_ORIENTATION_HORIZONTAL, 0, 100, 1);
  gtk_scale_set_draw_value (GTK_SCALE (data->slider), 0);
  data->slider_update_signal_id =
      g_signal_connect (G_OBJECT (data->slider), "value-changed",
      G_CALLBACK (slider_cb), data);

  data->streams_list = gtk_text_view_new ();
  gtk_text_view_set_editable (GTK_TEXT_VIEW (data->streams_list), FALSE);

  controls = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
  gtk_box_pack_start (GTK_BOX (controls), play_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), pause_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), stop_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), data->slider, TRUE, TRUE, 2);

  main_hbox = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
  gtk_box_pack_start (GTK_BOX (main_hbox), video_window, TRUE, TRUE, 0);
  gtk_box_pack_start (GTK_BOX (main_hbox), data->streams_list, FALSE, FALSE, 2);

  main_box = gtk_box_new (GTK_ORIENTATION_VERTICAL, 0);
  gtk_box_pack_start (GTK_BOX (main_box), main_hbox, TRUE, TRUE, 0);
  gtk_box_pack_start (GTK_BOX (main_box), controls, FALSE, FALSE, 0);
  gtk_container_add (GTK_CONTAINER (main_window), main_box);
  gtk_window_set_default_size (GTK_WINDOW (main_window), 640, 480);

  gtk_widget_show_all (main_window);
}

/* This function is called periodically to refresh the GUI */
static gboolean
refresh_ui (CustomData * data)
{
  gint64 current = -1;

  /* We do not want to update anything unless we are in the PAUSED or PLAYING states */
  if (data->state < GST_STATE_PAUSED)
    return TRUE;

  /* If we didn't know it yet, query the stream duration */
  if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
    if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME,
            &data->duration)) {
      g_printerr ("Could not query current duration.\n");
    } else {
      /* Set the range of the slider to the clip duration, in SECONDS */
      gtk_range_set_range (GTK_RANGE (data->slider), 0,
          (gdouble) data->duration / GST_SECOND);
    }
  }

  if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
    /* Block the "value-changed" signal, so the slider_cb function is not called
     * (which would trigger a seek the user has not requested) */
    g_signal_handler_block (data->slider, data->slider_update_signal_id);
    /* Set the position of the slider to the current pipeline position, in SECONDS */
    gtk_range_set_value (GTK_RANGE (data->slider),
        (gdouble) current / GST_SECOND);
    /* Re-enable the signal */
    g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
  }
  return TRUE;
}

/* This function is called when new metadata is discovered in the stream */
static void
tags_cb (GstElement * playbin, gint stream, CustomData * data)
{
  /* We are possibly in a GStreamer working thread, so we notify the main
   * thread of this event through a message in the bus */
  gst_element_post_message (playbin,
      gst_message_new_application (GST_OBJECT (playbin),
          gst_structure_new_empty ("tags-changed")));
}

/* This function is called when an error message is posted on the bus */
static void
error_cb (GstBus * bus, GstMessage * msg, CustomData * data)
{
  GError *err;
  gchar *debug_info;

  /* Print error details on the screen */
  gst_message_parse_error (msg, &err, &debug_info);
  g_printerr ("Error received from element %s: %s\n",
      GST_OBJECT_NAME (msg->src), err->message);
  g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
  g_clear_error (&err);
  g_free (debug_info);

  /* Set the pipeline to READY (which stops playback) */
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

/* This function is called when an End-Of-Stream message is posted on the bus.
 * We just set the pipeline to READY (which stops playback) */
static void
eos_cb (GstBus * bus, GstMessage * msg, CustomData * data)
{
  g_print ("End-Of-Stream reached.\n");
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

/* This function is called when the pipeline changes states. We use it to
 * keep track of the current state. */
static void
state_changed_cb (GstBus * bus, GstMessage * msg, CustomData * data)
{
  GstState old_state, new_state, pending_state;
  gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
    data->state = new_state;
    g_print ("State set to %s\n", gst_element_state_get_name (new_state));
    if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
      /* For extra responsiveness, we refresh the GUI as soon as we reach the PAUSED state */
      refresh_ui (data);
    }
  }
}

/* Extract metadata from all the streams and write it to the text widget in the GUI */
static void
analyze_streams (CustomData * data)
{
  gint i;
  GstTagList *tags;
  gchar *str, *total_str;
  guint rate;
  gint n_video, n_audio, n_text;
  GtkTextBuffer *text;

  /* Clean current contents of the widget */
  text = gtk_text_view_get_buffer (GTK_TEXT_VIEW (data->streams_list));
  gtk_text_buffer_set_text (text, "", -1);

  /* Read some properties */
  g_object_get (data->playbin, "n-video", &n_video, NULL);
  g_object_get (data->playbin, "n-audio", &n_audio, NULL);
  g_object_get (data->playbin, "n-text", &n_text, NULL);

  for (i = 0; i < n_video; i++) {
    tags = NULL;
    /* Retrieve the stream's video tags */
    g_signal_emit_by_name (data->playbin, "get-video-tags", i, &tags);
    if (tags) {
      total_str = g_strdup_printf ("video stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
      total_str = g_strdup_printf ("  codec: %s\n", str ? str : "unknown");
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      g_free (str);
      gst_tag_list_free (tags);
    }
  }

  for (i = 0; i < n_audio; i++) {
    tags = NULL;
    /* Retrieve the stream's audio tags */
    g_signal_emit_by_name (data->playbin, "get-audio-tags", i, &tags);
    if (tags) {
      total_str = g_strdup_printf ("\naudio stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
        total_str = g_strdup_printf ("  codec: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_uint (tags, GST_TAG_BITRATE, &rate)) {
        total_str = g_strdup_printf ("  bitrate: %d\n", rate);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
      }
      gst_tag_list_free (tags);
    }
  }

  for (i = 0; i < n_text; i++) {
    tags = NULL;
    /* Retrieve the stream's subtitle tags */
    g_signal_emit_by_name (data->playbin, "get-text-tags", i, &tags);
    if (tags) {
      total_str = g_strdup_printf ("\nsubtitle stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      gst_tag_list_free (tags);
    }
  }
}

/* This function is called when an "application" message is posted on the bus.
 * Here we retrieve the message posted by the tags_cb callback */
static void
application_cb (GstBus * bus, GstMessage * msg, CustomData * data)
{
  if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)),
          "tags-changed") == 0) {
    /* If the message is the "tags-changed" (only one we are currently issuing), update
     * the stream info GUI */
    analyze_streams (data);
  }
}

int
main (int argc, char *argv[])
{
  CustomData data;
  GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
  data.duration = GST_CLOCK_TIME_NONE;

  /* Create the elements */
  data.playbin = gst_element_factory_make ("playbin", "playbin");

  if (!data.playbin) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin, "uri",
      "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm",
      NULL);

  /* Connect to interesting signals in playbin */
  g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed",
      (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed",
      (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed",
      (GCallback) tags_cb, &data);

  /* Create the GUI */
  create_ui (&data);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.playbin);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback) error_cb,
      &data);
  g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback) eos_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::state-changed",
      (GCallback) state_changed_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::application",
      (GCallback) application_cb, &data);
  gst_object_unref (bus);

  /* Start playing */
  ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.playbin);
    return -1;
  }

  /* Register a function that GLib will call every second */
  g_timeout_add_seconds (1, (GSourceFunc) refresh_ui, &data);

  /* Start the GTK main loop. We will not regain control until gtk_main_quit is called. */
  gtk_main ();

  /* Free resources */
  gst_element_set_state (data.playbin, GST_STATE_NULL);
  gst_object_unref (data.playbin);
  return 0;
}

Walkthrough

Regarding this tutorial's structure, we are not going to use forward function definitions anymore: functions will be defined as needed. Also, for clarity of explanation, the order in which the snippets are presented will not always match the program order. Use the complete listing above to locate each snippet in context.

This tutorial is composed mostly of callback functions, which will be called from GStreamer or GTK+, so let's review the main function, which registers all of these callbacks.

int
main (int argc, char *argv[])
{
  CustomData data;
  GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
  data.duration = GST_CLOCK_TIME_NONE;

  /* Create the elements */
  data.playbin = gst_element_factory_make ("playbin", "playbin");

  if (!data.playbin) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin, "uri",
      "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm",
      NULL);

Standard GStreamer initialization and playbin pipeline creation, along with GTK+ initialization.

  /* Connect to interesting signals in playbin */
  g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed",
      (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed",
      (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed",
      (GCallback) tags_cb, &data);

We are interested in being notified when new tags (metadata) appear on the stream. For simplicity, we are going to handle all kinds of tags (video, audio and text) from the same callback, tags_cb.

  /* Create the GUI */
  create_ui (&data);

All the GTK+ widget creation and signal registration happens in this function. It contains only GTK-related function calls, so we will skip over its definition. The signals to which it registers convey the user commands, as shown below when reviewing the callbacks.

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.playbin);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback) error_cb,
      &data);
  g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback) eos_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::state-changed",
      (GCallback) state_changed_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::application",
      (GCallback) application_cb, &data);
  gst_object_unref (bus);

In Playback tutorial 1, gst_bus_add_watch() is used to register a function that receives every message posted to the GStreamer bus. We can achieve a finer granularity by using signals instead, which allow us to register only to the messages we are interested in. By calling gst_bus_add_signal_watch() we instruct the bus to emit a signal every time it receives a message. This signal has the name message::detail, where detail is the message that triggered the signal emission. For example, when the bus receives an EOS message, it emits a signal with the name message::eos.

This tutorial uses the signal's details to register only to the messages we care about. If we had registered to the message signal (without any detail), we would be notified of every single message, just like gst_bus_add_watch() would do.

Keep in mind that, in order for the bus watches to work (be it a gst_bus_add_watch() or a gst_bus_add_signal_watch()), a GLib main loop must be running. In this case, it is hidden inside the GTK+ main loop.

  /* Register a function that GLib will call every second */
  g_timeout_add_seconds (1, (GSourceFunc) refresh_ui, &data);

Before transferring control to GTK+, we use g_timeout_add_seconds() to register yet another callback, this time with a timeout, so it gets called every second. We are going to use it to refresh the GUI from the refresh_ui function.

After this, we are done with the setup and can start the GTK+ main loop. We will regain control from our callbacks when interesting things happen. Let's review the callbacks. Each callback has a different signature, depending on who will call it. You can look up the signature (the meaning of the parameters and the return value) in the documentation of the signal.

/* This function is called when the PLAY button is clicked */
static void
play_cb (GtkButton * button, CustomData * data)
{
  gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}

/* This function is called when the PAUSE button is clicked */
static void
pause_cb (GtkButton * button, CustomData * data)
{
  gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}

/* This function is called when the STOP button is clicked */
static void
stop_cb (GtkButton * button, CustomData * data)
{
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

These three little callbacks are associated with the PLAY, PAUSE and STOP buttons in the GUI. They simply set the pipeline to the corresponding state. Note that for STOP we set the pipeline to READY. We could have brought the pipeline all the way down to the NULL state, but the transition would then be a little slower, since some resources (like the audio device) would need to be released and re-acquired.

/* This function is called when the main window is closed */
static void
delete_event_cb (GtkWidget * widget, GdkEvent * event, CustomData * data)
{
  stop_cb (NULL, data);
  gtk_main_quit ();
}

gtk_main_quit() will eventually terminate the call to gtk_main() in main, which, in this case, finishes the program. Here, we call it when the main window is closed, after stopping the pipeline (just for the sake of tidiness).

/* This function is called when the slider changes its position. We perform a seek to the
 * new position here. */
static void
slider_cb (GtkRange * range, CustomData * data)
{
  gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
  gst_element_seek_simple (data->playbin, GST_FORMAT_TIME,
      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
      (gint64) (value * GST_SECOND));
}

This is an example of how a complex GUI element like a seek bar (or slider that allows seeking) can be implemented very easily thanks to GStreamer and GTK+ collaborating. If the slider has been dragged to a new position, we tell GStreamer to seek to that position with gst_element_seek_simple() (as seen in Basic tutorial 4: Time management). The slider has been set up so its value represents seconds.

It is worth mentioning that some performance (and responsiveness) can be gained by doing some throttling, that is, by not responding to every single user request. Since the seek operation is bound to take some time, it is often nicer to wait half a second (for example) after one seek before allowing another. Otherwise, the application might look unresponsive if the user drags the slider frantically, which would not allow any seek to complete before a new one is queued.

/* This function is called periodically to refresh the GUI */
static gboolean
refresh_ui (CustomData * data)
{
  gint64 current = -1;

  /* We do not want to update anything unless we are in the PAUSED or PLAYING states */
  if (data->state < GST_STATE_PAUSED)
    return TRUE;

This function will move the slider to reflect the current position of the media. First off, if we are not in the PAUSED or PLAYING state, we have nothing to do here (plus, position and duration queries will normally fail).

  /* If we didn't know it yet, query the stream duration */
  if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
    if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME,
            &data->duration)) {
      g_printerr ("Could not query current duration.\n");
    } else {
      /* Set the range of the slider to the clip duration, in SECONDS */
      gtk_range_set_range (GTK_RANGE (data->slider), 0,
          (gdouble) data->duration / GST_SECOND);
    }
  }

If we do not know the duration of the clip yet, we query it; once it is known, we can set the range of the slider accordingly.

  if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
    /* Block the "value-changed" signal, so the slider_cb function is not called
     * (which would trigger a seek the user has not requested) */
    g_signal_handler_block (data->slider, data->slider_update_signal_id);
    /* Set the position of the slider to the current pipeline position, in SECONDS */
    gtk_range_set_value (GTK_RANGE (data->slider),
        (gdouble) current / GST_SECOND);
    /* Re-enable the signal */
    g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
  }

We query the current pipeline position and set the position of the slider accordingly. This would normally trigger the emission of the value-changed signal (which we use to know when the user is dragging the slider) and, in turn, a seek the user did not request. Since we do not want seeks happening unless the user requests them, we disable the value-changed signal emission during this operation with g_signal_handler_block() and g_signal_handler_unblock().

Returning TRUE from this function will keep it being called in the future. If we return FALSE, the timer will be removed.
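Both refresh_ui and slider_cb rely on converting between GStreamer's nanosecond timestamps and the slider's seconds. A minimal sketch of those two conversions is shown below; GST_SECOND is defined locally with its documented value of 10^9 nanoseconds so the snippet does not need the GStreamer headers (in real code you would use the constant from gst/gst.h).

```c
#include <stdint.h>

/* GStreamer's GST_SECOND: one second expressed in nanoseconds */
#define GST_SECOND ((int64_t) 1000000000)

/* Pipeline position (ns) -> slider value (s), as done in refresh_ui */
static double ns_to_seconds (int64_t position_ns)
{
  return (double) position_ns / GST_SECOND;
}

/* Slider value (s) -> seek target (ns), as done in slider_cb */
static int64_t seconds_to_ns (double value)
{
  return (int64_t) (value * GST_SECOND);
}
```

Keeping the slider in seconds and converting at the boundary means the GUI code never has to think about nanoseconds at all.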

/* This function is called when new metadata is discovered in the stream */
static void
tags_cb (GstElement * playbin, gint stream, CustomData * data)
{
  /* We are possibly in a GStreamer working thread, so we notify the main
   * thread of this event through a message in the bus */
  gst_element_post_message (playbin,
      gst_message_new_application (GST_OBJECT (playbin),
          gst_structure_new_empty ("tags-changed")));
}

This is one of the key points of this tutorial. This function will be called when new tags are found in the media, from a streaming thread, that is, from a thread other than the application (or main) thread. What we want to do here is update a GTK+ widget to reflect this new information, but GTK+ does not allow operating from threads other than the main one.

The solution is to make playbin post a message on the bus and return to the calling thread. When appropriate, the main thread will pick up this message and update GTK.

gst_element_post_message() makes a GStreamer element post the given message to the bus. gst_message_new_application() creates a new message of the APPLICATION type. GStreamer messages have different types, and this particular type is reserved to the application: it will go through the bus unaffected by GStreamer. The list of types can be found in the GstMessageType documentation.

Messages can deliver additional information through their embedded GstStructure, which is a very flexible data container. Here, we create a new structure with gst_structure_new_empty() and name it tags-changed, to avoid confusion in case we wanted to send other application messages later.

Later, once in the main thread, the bus will receive this message and emit the message::application signal, which we have associated with the application_cb function:

/* This function is called when an "application" message is posted on the bus.
 * Here we retrieve the message posted by the tags_cb callback */
static void
application_cb (GstBus * bus, GstMessage * msg, CustomData * data)
{
  if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)),
          "tags-changed") == 0) {
    /* If the message is the "tags-changed" (only one we are currently issuing), update
     * the stream info GUI */
    analyze_streams (data);
  }
}

Once we are sure the message is the tags-changed one, we call the analyze_streams function, which is also used in Playback tutorial 1: Playbin usage and is explained in more detail there. It basically recovers the tags from the streams and writes them into a text widget in the GUI.

The error_cb, eos_cb and state_changed_cb functions do not really need explaining, since they do the same as in all previous tutorials, only from their own dedicated functions now.

And this is it! The amount of code in this tutorial might seem daunting, but the required concepts are few and simple. If you have followed the previous tutorials and have a little knowledge of GTK, you probably understood this one and can now enjoy your very own media player!
