IJKPLAYER Source Code Analysis: The Software-Decoding Main Structure

This article analyzes IJKPLAYER's main software-decoding pipeline in detail: protocol handling and demuxing in read_thread, video decoding in ffplay_video_thread, audio/video synchronization in video_refresh_thread, and audio decoding in audio_thread. It also examines the message-queue communication model, showing how AVMessage carries information between the player core and the app business layer.

1 Preface

    This article analyzes the main software-decoding flow of the IJKPLAYER source; hardware decoding will be covered in a separate article. The IJKPLAYER version used:

#define IJKPLAYER_VERSION "f0.7.17-28-gd7040f97"

2 Main Structure

    The main IJKPLAYER pipeline, covering protocol handling, demuxing, decoding, audio/video synchronization and display, along with the major threads involved, is roughly as follows:

Responsibilities of read_thread:

  • Protocol handling: every protocol ffmpeg supports is supported, including streaming protocols such as http-flv/rtsp/rtmp/rtp/srt/hls, as well as the file protocol for local files;
  • Demuxing: after the protocol layer comes demuxing; av_read_frame separates the compressed audio, video, and subtitle data, which are pushed into their respective packet queues;
  • Seek: a seek request during playback is actually executed on the read_thread: ijkmp_seek_to_l => ffp_seek_to_l => stream_seek => avformat_seek_file requests the nearest IDR frame.

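The demux-and-enqueue step can be modeled with a minimal, self-contained sketch. Note this is plain C with no ffmpeg: `Packet`, `dispatch_packet`, and the queue fields are illustrative stand-ins for ffmpeg's AVPacket and ijkplayer's packet_queue_put, not the real identifiers.

```c
#include <assert.h>
#include <stdlib.h>

/* Illustrative stand-ins for ffmpeg's AVPacket and ijkplayer's PacketQueue. */
typedef struct Packet { int stream_index; struct Packet *next; } Packet;
typedef struct PacketQueue { Packet *first, *last; int nb_packets; } PacketQueue;

static void packet_queue_put(PacketQueue *q, Packet *pkt) {
    pkt->next = NULL;
    if (q->last) q->last->next = pkt;
    else         q->first = pkt;
    q->last = pkt;
    q->nb_packets++;
}

/* Route one demuxed packet to the matching stream queue, as read_thread
 * does after av_read_frame(); packets of other streams are discarded. */
static void dispatch_packet(Packet *pkt, int audio_idx, int video_idx,
                            PacketQueue *audioq, PacketQueue *videoq) {
    if (pkt->stream_index == audio_idx)
        packet_queue_put(audioq, pkt);
    else if (pkt->stream_index == video_idx)
        packet_queue_put(videoq, pkt);
    else
        free(pkt);
}
```

In the real read_thread this loop runs until EOF or abort, with buffering checks between iterations.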
Responsibilities of ffplay_video_thread:

  • Video decoding: consumes compressed video data from the videoq (PacketQueue), software-decodes it with ffmpeg, and pushes the decoded pixel data into the pictq (FrameQueue), where it awaits display by video_refresh_thread;
  • Accurate seek: if the app layer enables the accurate-seek switch, each decoded frame's pts is compared against the seek timestamp; if pts < seek_timestamp, the frame is dropped rather than rendered;
  • vf: video filter, disabled by default;

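The accurate-seek drop decision above boils down to a simple predicate. This is a hedged sketch with illustrative names, not ijkplayer's actual identifiers:

```c
#include <assert.h>

/* With accurate seek enabled, frames decoded before the requested
 * timestamp are dropped, not rendered. Illustrative names only. */
static int should_drop_frame(int enable_accurate_seek,
                             double frame_pts, double seek_timestamp) {
    return enable_accurate_seek && frame_pts < seek_timestamp;
}
```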
Responsibilities of video_refresh_thread:

  • Audio/video synchronization: if the master clock is not AV_SYNC_VIDEO_MASTER, video must synchronize its pts against the audio clock (see the separate article on audio/video synchronization); otherwise frames are paced by the video frame duration;
  • Displaying video via SDL: video_display2(ffp) => video_image_display2(ffp) => SDL_VoutDisplayYUVOverlay(ffp->vout, vp->bmp);
  • Updating the video clock: update_video_pts(is, vp->pts, vp->pos, vp->serial);

Responsibilities of audio_thread:

  • Audio decoding: consumes compressed audio data from the audioq (PacketQueue), software-decodes it with ffmpeg, and pushes the resulting PCM data into the sampq (FrameQueue), where it awaits being copied out for playback by the audio driver;
  • Accurate seek: same logic as the video accurate seek;
  • af: audio filter, disabled by default;

Responsibilities of sdl_audio_callback:

  • Playback by the audio driver: unlike video, where decoded pixel data is actively pushed to the display, audio PCM data is pulled by the audio driver (AudioUnit on iOS), while AudioTrack and OpenSL ES on Android start a thread that actively feeds PCM data; the actual PCM copy is performed by sdl_audio_callback;
  • Resampling: if the audio parameters change (sample rate, channel count, channel layout, or sample format), or frames must be dropped or padded to stay in sync with video, resampling is needed;
  • Updating the audio clock: set_clock_at => sync_clock_to_slave;

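The clock update behind set_clock_at can be sketched as follows. This is a simplified model (no serial number, playback speed, or pause handling): the clock stores a pts together with the wall-clock time it was set, so the current value can be extrapolated later.

```c
#include <assert.h>
#include <math.h>

/* Simplified model of set_clock_at/get_clock. */
typedef struct Clock {
    double pts;           /* last pts set */
    double pts_drift;     /* pts minus the wall time at which it was set */
    double last_updated;
} Clock;

static void set_clock_at(Clock *c, double pts, double time) {
    c->pts = pts;
    c->last_updated = time;
    c->pts_drift = pts - time;
}

static double get_clock(const Clock *c, double now) {
    return c->pts_drift + now;   /* the clock advances with wall time */
}
```

Storing the drift rather than the pts alone means readers never need to know when the clock was last fed; they just add the current wall time.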
Responsibilities of subtitle_thread:

Subtitles will be covered in a later article.

3 Message Loop

3.1 Communication Model

    So how does IJKPLAYER communicate with the app layer? From the IJKPLAYER source we can see that, apart from subtitle_thread (which currently has no interaction with the business thread), every major thread communicates with the upper layer through the MessageQueue:

    MessageQueue is a typical multi-producer, single-consumer queue: a singly linked list with head and tail pointers plus a recycle list for reusing popped nodes. Its definition is as follows:

typedef struct MessageQueue {
    AVMessage *first_msg, *last_msg;
    int nb_messages;
    int abort_request;
    SDL_mutex *mutex;
    SDL_cond *cond;

    AVMessage *recycle_msg;
    int recycle_count;
    int alloc_count;
} MessageQueue;

    A single AVMessage, shown below, is a singly linked list node carrying its own resource-release callback:

typedef struct AVMessage {
    int what;
    int arg1;
    int arg2;
    void *obj;
    size_t len;
    void (*free_l)(void *obj);
    struct AVMessage *next;
} AVMessage;

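How the recycle list avoids a malloc per message can be shown with a single-threaded sketch. The real msg_queue_put/msg_queue_get additionally hold the mutex and wait on the condition variable, and the struct fields here are trimmed to the essentials:

```c
#include <assert.h>
#include <stdlib.h>

/* Trimmed-down message and queue: a popped node goes onto recycle_msg
 * instead of being freed, and the next put reuses it. */
typedef struct Msg { int what; struct Msg *next; } Msg;
typedef struct MsgQueue {
    Msg *first_msg, *last_msg;
    int nb_messages;
    Msg *recycle_msg;
    int recycle_count, alloc_count;
} MsgQueue;

static void msg_queue_put(MsgQueue *q, int what) {
    Msg *msg = q->recycle_msg;
    if (msg) { q->recycle_msg = msg->next; q->recycle_count++; }
    else     { msg = malloc(sizeof(Msg));  q->alloc_count++; }
    msg->what = what;
    msg->next = NULL;
    if (q->last_msg) q->last_msg->next = msg;
    else             q->first_msg = msg;
    q->last_msg = msg;
    q->nb_messages++;
}

static int msg_queue_get(MsgQueue *q, int *what) {
    Msg *msg = q->first_msg;
    if (!msg) return 0;                /* the real get blocks on cond here */
    q->first_msg = msg->next;
    if (!q->first_msg) q->last_msg = NULL;
    q->nb_messages--;
    *what = msg->what;
    msg->next = q->recycle_msg;        /* recycle instead of free */
    q->recycle_msg = msg;
    return 1;
}
```

Because message traffic is steady (position updates, buffering updates), reusing nodes keeps allocation count bounded by the queue's high-water mark.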
3.2 AVMessage Producer Interface

inline static void ffp_notify_msg1(FFPlayer *ffp, int what) {
    msg_queue_put_simple3(&ffp->msg_queue, what, 0, 0);
}

inline static void ffp_notify_msg2(FFPlayer *ffp, int what, int arg1) {
    msg_queue_put_simple3(&ffp->msg_queue, what, arg1, 0);
}

inline static void ffp_notify_msg3(FFPlayer *ffp, int what, int arg1, int arg2) {
    msg_queue_put_simple3(&ffp->msg_queue, what, arg1, arg2);
}

inline static void ffp_notify_msg4(FFPlayer *ffp, int what, int arg1, int arg2, void *obj, int obj_len) {
    msg_queue_put_simple4(&ffp->msg_queue, what, arg1, arg2, obj, obj_len);
}

inline static void ffp_notify_msg5(FFPlayer *ffp, int what, int arg1, int arg2, void *obj,
        size_t len, void (*free_l)(void *obj)) {
    msg_queue_put_simple5(&ffp->msg_queue, what, arg1, arg2, obj, len, free_l);
}

inline static void ffp_remove_msg(FFPlayer *ffp, int what) {
    msg_queue_remove(&ffp->msg_queue, what);
}

3.3 AVMessage Consumer Interface

    The consumer blocks in ijkmp_get_msg to fetch the next message:

int retval = ijkmp_get_msg(mp, &msg, 1);

3.4 Interface for Posting AVMessage to the Business Layer

    The interfaces through which the IJKPLAYER core posts a msg:

post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);

post_event2(env, weak_thiz, FFP_MSG_GET_IMG_STATE, msg.arg1, msg.arg2, file_name);

inline static void post_event(JNIEnv *env, jobject weak_this, int what, int arg1, int arg2)
{
    // MPTRACE("post_event(%p, %p, %d, %d, %d)", (void*)env, (void*) weak_this, what, arg1, arg2);
    J4AC_IjkMediaPlayer__postEventFromNative(env, weak_this, what, arg1, arg2, NULL);
    // MPTRACE("post_event()=void");
}

void J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer__postEventFromNative(JNIEnv *env, jobject weakThiz, jint what, jint arg1, jint arg2, jobject obj)
{
    (*env)->CallStaticVoidMethod(env, class_J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer.id, class_J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer.method_postEventFromNative, weakThiz, what, arg1, arg2, obj);
}

    On Android, the native layer ultimately calls up through JNI into the Java-layer postEventFromNative method, delivering the msg to the business layer:

    @CalledByNative
    private static void postEventFromNative(Object weakThiz, int what,
            int arg1, int arg2, Object obj) {
        if (weakThiz == null)
            return;

        @SuppressWarnings("rawtypes")
        IjkMediaPlayer mp = (IjkMediaPlayer) ((WeakReference) weakThiz).get();
        if (mp == null) {
            return;
        }

        if (what == IJK_MSG_VIDEO_SNAP_SHOT) {
            ByteBuffer byteBuffer = (ByteBuffer) obj;
            Matrix matrix = new Matrix();
            matrix.postScale(1, -1, arg1/2f, arg2/2f);
            Bitmap bitmap = Bitmap.createBitmap(arg1, arg2, Bitmap.Config.ARGB_8888);
            bitmap.copyPixelsFromBuffer(byteBuffer);
            bitmap =  Bitmap.createBitmap(bitmap, 0, 0, arg1, arg2, matrix, true);
            obj = bitmap;
        }
        // native ijkplayer never post this message
        // if (what == MEDIA_INFO && arg1 == MEDIA_INFO_STARTED_AS_NEXT) {
        //     // this acquires the wakelock if needed, and sets the client side
        //     // state
        //     mp.start();
        // }
        if (mp.mEventHandler != null) {
            Message m = mp.mEventHandler.obtainMessage(what, arg1, arg2, obj);
            mp.mEventHandler.sendMessage(m);
        }
    }

3.5 Message Loop Thread

    From the IJKPLAYER source we can see that the message loop thread does not live in the IJKPLAYER core but in each platform's native layer (the JNI layer on Android, the Objective-C layer on iOS).

3.5.1 Android Message Loop Thread

    On Android, the message loop function is a callback registered through ijkmp_android_create:

IjkMediaPlayer *mp = ijkmp_android_create(message_loop);

    The message loop function:

static int message_loop(void *arg)
{
    MPTRACE("%s\n", __func__);

    JNIEnv *env = NULL;
    if (JNI_OK != SDL_JNI_SetupThreadEnv(&env)) {
        ALOGE("%s: SetupThreadEnv failed\n", __func__);
        return -1;
    }

    IjkMediaPlayer *mp = (IjkMediaPlayer*) arg;
    JNI_CHECK_GOTO(mp, env, NULL, "mpjni: native_message_loop: null mp", LABEL_RETURN);

    message_loop_n(env, mp);

LABEL_RETURN:
    ijkmp_dec_ref_p(&mp);

    MPTRACE("message_loop exit");
    return 0;
}

static void message_loop_n(JNIEnv *env, IjkMediaPlayer *mp)
{
    jobject weak_thiz = (jobject) ijkmp_get_weak_thiz(mp);
    JNI_CHECK_GOTO(weak_thiz, env, NULL, "mpjni: message_loop_n: null weak_thiz", LABEL_RETURN);

    while (1) {
        AVMessage msg;

        int retval = ijkmp_get_msg(mp, &msg, 1);
        if (retval < 0)
            break;

        // block-get should never return 0
        assert(retval > 0);

        switch (msg.what) {
        case FFP_MSG_FLUSH:
            MPTRACE("FFP_MSG_FLUSH:\n");
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_ERROR:
            MPTRACE("FFP_MSG_ERROR: %d\n", msg.arg1);
            if (msg.obj) {
                jstring text = (*env)->NewStringUTF(env, (const char *)msg.obj);
                post_event2(env, weak_thiz, msg.what, msg.arg1, msg.arg2, text);
                J4A_DeleteLocalRef__p(env, &text);
            } else {
                post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            }
            break;
        case FFP_MSG_PREPARED:
            MPTRACE("FFP_MSG_PREPARED:\n");
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_COMPLETED:
            MPTRACE("FFP_MSG_COMPLETED:\n");
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_VIDEO_SIZE_CHANGED:
            MPTRACE("FFP_MSG_VIDEO_SIZE_CHANGED: %d, %d\n", msg.arg1, msg.arg2);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_SAR_CHANGED:
            MPTRACE("FFP_MSG_SAR_CHANGED: %d, %d\n", msg.arg1, msg.arg2);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_VIDEO_RENDERING_START:
            MPTRACE("FFP_MSG_VIDEO_RENDERING_START:\n");
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_RENDERING_START, 0);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_AUDIO_RENDERING_START:
            MPTRACE("FFP_MSG_AUDIO_RENDERING_START:\n");
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_AUDIO_RENDERING_START, 0);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_VIDEO_ROTATION_CHANGED:
            MPTRACE("FFP_MSG_VIDEO_ROTATION_CHANGED: %d\n", msg.arg1);
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_ROTATION_CHANGED, msg.arg1);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_AUDIO_DECODED_START:
            MPTRACE("FFP_MSG_AUDIO_DECODED_START:\n");
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_AUDIO_DECODED_START, 0);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_VIDEO_DECODED_START:
            MPTRACE("FFP_MSG_VIDEO_DECODED_START:\n");
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_DECODED_START, 0);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_OPEN_INPUT:
            MPTRACE("FFP_MSG_OPEN_INPUT:\n");
            //post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_OPEN_INPUT, 0);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_FIND_STREAM_INFO:
            MPTRACE("FFP_MSG_FIND_STREAM_INFO:\n");
            //post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_FIND_STREAM_INFO, 0);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_COMPONENT_OPEN:
            MPTRACE("FFP_MSG_COMPONENT_OPEN:\n");
            //post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_COMPONENT_OPEN, 0);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_BUFFERING_START:
            MPTRACE("FFP_MSG_BUFFERING_START:\n");
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_BUFFERING_START, msg.arg1);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_BUFFERING_END:
            MPTRACE("FFP_MSG_BUFFERING_END:\n");
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_BUFFERING_END, msg.arg1);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_BUFFERING_UPDATE:
        case FFP_MSG_CURRENT_POSITION_UPDATE:
        case FFP_MSG_DISKBUFFER_UPDATE:
            // MPTRACE("FFP_MSG_BUFFERING_UPDATE: %d, %d", msg.arg1, msg.arg2);
            // post_event(env, weak_thiz, MEDIA_BUFFERING_UPDATE, msg.arg1, msg.arg2);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_BUFFERING_BYTES_UPDATE:
            break;
        case FFP_MSG_BUFFERING_TIME_UPDATE:
            break;
        case FFP_MSG_SEEK_COMPLETE:
            MPTRACE("FFP_MSG_SEEK_COMPLETE:\n");
            // post_event(env, weak_thiz, MEDIA_SEEK_COMPLETE, 0, 0);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_ACCURATE_SEEK_COMPLETE:
            MPTRACE("FFP_MSG_ACCURATE_SEEK_COMPLETE:\n");
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_MEDIA_ACCURATE_SEEK_COMPLETE, msg.arg1);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_PLAYBACK_STATE_CHANGED:
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_TIMED_TEXT:
            if (msg.obj) {
                jstring text = (*env)->NewStringUTF(env, (const char *)msg.obj);
                post_event2(env, weak_thiz, FFP_MSG_TIMED_TEXT, 0, 0, text);
                J4A_DeleteLocalRef__p(env, &text);
            }
            else {
                post_event2(env, weak_thiz, FFP_MSG_TIMED_TEXT, 0, 0, NULL);
            }
            break;
        case FFP_MSG_GET_IMG_STATE:
            if (msg.obj) {
                jstring file_name = (*env)->NewStringUTF(env, (char *)msg.obj);
                post_event2(env, weak_thiz, FFP_MSG_GET_IMG_STATE, msg.arg1, msg.arg2, file_name);
                J4A_DeleteLocalRef__p(env, &file_name);
            }
            else {
                post_event2(env, weak_thiz, FFP_MSG_GET_IMG_STATE, msg.arg1, msg.arg2, NULL);
            }
            break;
        case FFP_MSG_VIDEO_SNAP_SHOT:
            if (msg.obj) {
                jobject byte_buffer = (*env)->NewDirectByteBuffer(env, msg.obj, msg.len);
                post_event2(env, weak_thiz, FFP_MSG_VIDEO_SNAP_SHOT, msg.arg1, msg.arg2, byte_buffer);
            }
            break;
        case FFP_MSG_DOWNLOAD_HLS_PROGRESS:
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_VIDEO_SEEK_RENDERING_START:
            MPTRACE("FFP_MSG_VIDEO_SEEK_RENDERING_START:\n");
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_SEEK_RENDERING_START, msg.arg1);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_AUDIO_SEEK_RENDERING_START:
            MPTRACE("FFP_MSG_AUDIO_SEEK_RENDERING_START:\n");
            // post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_AUDIO_SEEK_RENDERING_START, msg.arg1);
            post_event(env, weak_thiz, msg.what, msg.arg1, msg.arg2);
            break;
        default:
            ALOGE("unknown FFP_MSG_xxx(%d)\n", msg.what);
            break;
        }
        msg_free_res(&msg);
    }

LABEL_RETURN:
    ;
}

3.5.2 iOS Message Loop Thread

    On iOS, the msg loop function is likewise a callback registered in ijkmp_ios_create:

_nativeMediaPlayer = ijkmp_ios_create(ff_media_player_msg_loop);

    Its message loop implementation:

int ff_media_player_msg_loop(void* arg)
{
    @autoreleasepool {
        IjkMediaPlayer *mp = (IjkMediaPlayer*)arg;
        __weak IJKFFMediaPlayer *ffPlayer = ffplayerRetain(ijkmp_set_weak_thiz(mp, NULL));
        while (ffPlayer) {
            @autoreleasepool {
                IJKFFMoviePlayerMessage *msg = [ffPlayer obtainMessage];
                if (!msg)
                    break;
                
                int retval = ijkmp_get_msg(mp, &msg->_msg, 1);
                if (retval < 0)
                    break;
                
                // block-get should never return 0
                assert(retval > 0);
                [ffPlayer performSelectorOnMainThread:@selector(postEvent:) withObject:msg waitUntilDone:NO];
            }
        }
        
        // retained in prepare_async, before SDL_CreateThreadEx
        ijkmp_dec_ref_p(&mp);
        return 0;
    }
}