Android Media: Decoding and Playing MP4 Video with MediaCodec

I. Code Analysis

1. Creating and initializing the MediaCodec

A decoder instance can be created with createDecoderByType:

mediaCodec = MediaCodec.createDecoderByType("video/avc");

This creates a decoder for the "video/avc" MIME type, i.e. an H.264 (AVC) video decoder.
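If you need more control over which decoder is chosen (for example, to verify that one exists for the track's format), the codec can also be looked up by name via MediaCodecList. A minimal sketch, assuming API 21+ and that a MediaFormat has already been built; this lookup is not part of the original demo:

// Ask the system for the name of a decoder that supports the given format
MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
String decoderName = codecList.findDecoderForFormat(mediaFormat);
if (decoderName != null) {
    mediaCodec = MediaCodec.createByCodecName(decoderName);
}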

Initialization mainly sets the format and configuration on the decoder. The complete creation and initialization code:

    private void initMediaCodecSys() {
        try {
            // Create the decoder
            mediaCodec = MediaCodec.createDecoderByType("video/avc");
            // Fallback format; replaced below by the actual track format
            mediaFormat = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            mediaExtractor = new MediaExtractor();
            // Location of the MP4 file
            mediaExtractor.setDataSource(MainActivity.MP4_PLAY_PATH);
            Log.d(TAG, "getTrackCount: " + mediaExtractor.getTrackCount());
            for (int i = 0; i < mediaExtractor.getTrackCount(); i++) {
                MediaFormat format = mediaExtractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                Log.d(TAG, "mime: " + mime);
                if (mime.startsWith("video")) {
                    mediaFormat = format;
                    mediaExtractor.selectTrack(i);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        Surface surface = MainActivity.getSurface();
        // Configure with the chosen format and the output Surface, then start
        mediaCodec.configure(mediaFormat, surface, null, 0);
        mediaCodec.start();
    }
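Once the loop above has latched the real track format, you can read the actual video dimensions from it instead of relying on the hard-coded 1280x720 fallback. A small sketch, assuming the container exposes these keys:

// Log the actual size reported by the container
if (mediaFormat.containsKey(MediaFormat.KEY_WIDTH) && mediaFormat.containsKey(MediaFormat.KEY_HEIGHT)) {
    int width = mediaFormat.getInteger(MediaFormat.KEY_WIDTH);
    int height = mediaFormat.getInteger(MediaFormat.KEY_HEIGHT);
    Log.d(TAG, "video size: " + width + "x" + height);
}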

2. The video decoding thread

The video decoding thread drives the decoding flow. The relevant APIs at this stage are:

 // Get the index of an available input buffer
 public int dequeueInputBuffer (long timeoutUs)
 // Get the input buffer for that index
 public ByteBuffer getInputBuffer(int index)
 // Submit the filled input buffer to the codec's input queue
 public final void queueInputBuffer(int index, int offset, int size, long presentationTimeUs, int flags)
 // Get the index of an output buffer holding decoded data
 public final int dequeueOutputBuffer(BufferInfo info, long timeoutUs)
 // Get the output buffer for that index
 public ByteBuffer getOutputBuffer(int index)
 // Release the output buffer (optionally rendering it to the Surface)
 public final void releaseOutputBuffer(int index, boolean render) 

Get the index of an available input buffer; a timeout of -1 blocks until one becomes available:

int inputIndex = mediaCodec.dequeueInputBuffer(-1);

Get the input buffer:

ByteBuffer byteBuffer = mediaCodec.getInputBuffer(inputIndex);

Read the sample data and its timestamp:

// Read one sample (typically one frame) of data
int sampSize = mediaExtractor.readSampleData(byteBuffer, 0);
// Read the presentation timestamp
long time = mediaExtractor.getSampleTime();

Submit the filled input buffer to the codec's input queue:

if (sampSize > 0 && time >= 0) { // >= 0: the first sample's timestamp is usually 0
    mediaCodec.queueInputBuffer(inputIndex, 0, sampSize, time, 0);
    // Pace playback against the stream's timestamps (about 30 fps here)
    mSpeedController.preRender(time);
    // Must be called after each sample to advance to the next one
    mediaExtractor.advance();
}
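The demo never signals end-of-stream: when readSampleData returns a negative size, the extractor is exhausted and the loop simply keeps spinning. A hedged sketch of what EOS handling could look like (not in the original code):

if (sampSize < 0) {
    // End of stream: queue an empty buffer carrying the EOS flag so the
    // decoder can flush any frames still buffered internally
    mediaCodec.queueInputBuffer(inputIndex, 0, 0, 0,
            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
}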

Get the index of an output buffer holding decoded data:

BufferInfo bufferInfo = new BufferInfo();
int outIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);

Release the output buffer; passing true renders it to the Surface:

if (outIndex >= 0) {
    mediaCodec.releaseOutputBuffer(outIndex, true);
}
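dequeueOutputBuffer can also return negative status codes, which the demo ignores. A minimal sketch of handling the two common ones:

if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    // The output format changed (typically once, right after start)
    MediaFormat newFormat = mediaCodec.getOutputFormat();
    Log.d(TAG, "output format changed: " + newFormat);
} else if (outIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // No decoded output available within the given timeout; try again later
}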

The complete video decoding thread code:

/**
 * Play the MP4 file Thread
 * Main decoding loop
 */
private class DecoderMP4Thread extends Thread {
    long pts = 0;
 
    @Override
    public void run() {
        super.run();
        while (!isDecodeFinish) {
            int inputIndex = mediaCodec.dequeueInputBuffer(-1);
            Log.d(TAG, "inputIndex: " + inputIndex);
            if (inputIndex >= 0) {
                ByteBuffer byteBuffer = mediaCodec.getInputBuffer(inputIndex);
                // Read one sample (typically one frame) of data
                int sampSize = mediaExtractor.readSampleData(byteBuffer, 0);
                // Read the presentation timestamp
                long time = mediaExtractor.getSampleTime();
                if (sampSize > 0 && time >= 0) { // >= 0: the first sample's timestamp is usually 0
                    mediaCodec.queueInputBuffer(inputIndex, 0, sampSize, time, 0);
                    // Pace playback against the stream's timestamps (about 30 fps here)
                    mSpeedController.preRender(time);
                    // Must be called after each sample to advance to the next one
                    mediaExtractor.advance();
                }
            }
            BufferInfo bufferInfo = new BufferInfo();
            int outIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            if (outIndex >= 0) {
                // true: render the decoded frame to the Surface
                mediaCodec.releaseOutputBuffer(outIndex, true);
            }
        }
    }
}

3. Shutting down and releasing resources

After decoding and playback finish, release the codec and related resources:

public void close() {
    try {
        Log.d(TAG, "close start");
        if (mediaCodec != null) {
            isDecodeFinish = true;
            try {
                if (mDecodeMp4Thread != null) {
                    // The decode thread may be blocked in dequeueInputBuffer(-1),
                    // so wait at most 2 seconds for it to exit
                    mDecodeMp4Thread.join(2000);
                }
            } catch (InterruptedException e) {
                Log.e(TAG, "InterruptedException " + e);
            }
            boolean isAlive = mDecodeMp4Thread != null && mDecodeMp4Thread.isAlive();
            Log.d(TAG, "close end isAlive :" + isAlive);
            mediaCodec.stop();
            mediaCodec.release();
            mediaCodec = null;
            mSpeedController.reset();
        }
    } catch (IllegalStateException e) {
        e.printStackTrace();
    }
    instance = null;
}
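Note that because the loop blocks in dequeueInputBuffer(-1), the thread may never re-check isDecodeFinish, join(2000) can time out, and stopping the codec while the thread is still inside a blocking call is exactly what the IllegalStateException catch guards against. A more robust variant, sketched here as a suggestion rather than taken from the demo, polls with a finite timeout inside the loop:

// Inside the while (!isDecodeFinish) loop: wait at most 10 ms per call
int inputIndex = mediaCodec.dequeueInputBuffer(10000); // timeout in microseconds
if (inputIndex < 0) {
    continue; // no input buffer yet; loop around and re-check isDecodeFinish
}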

II. Complete Demo Code

DecoderManager.java

package com.example.mediacodec_decodemp4;


import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.util.Log;
import java.io.IOException;
import java.nio.ByteBuffer;
import android.media.MediaCodec.BufferInfo;
import android.view.Surface;


public class DecoderManager {


    private static final String TAG = "weekend";


    private static DecoderManager instance;
    private MediaCodec mediaCodec;
    private MediaFormat mediaFormat;
    private volatile boolean isDecodeFinish = false;
    private MediaExtractor mediaExtractor;
    private SpeedManager mSpeedController = new SpeedManager();
    private DecoderMP4Thread mDecodeMp4Thread;


    private DecoderManager() {
    }


    public static DecoderManager getInstance() {
        if (instance == null) {
            instance = new DecoderManager();
        }
        return instance;
    }


    /**
     * Synchronous-mode decoder setup
     */
    private void initMediaCodecSys() {
        try {
            mediaCodec = MediaCodec.createDecoderByType("video/avc");
            mediaFormat = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            mediaExtractor = new MediaExtractor();
            // Location of the MP4 file
            mediaExtractor.setDataSource(MainActivity.MP4_PLAY_PATH);
            Log.d(TAG, "getTrackCount: " + mediaExtractor.getTrackCount());
            for (int i = 0; i < mediaExtractor.getTrackCount(); i++) {
                MediaFormat format = mediaExtractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                Log.d(TAG, "mime: " + mime);
                if (mime.startsWith("video")) {
                    mediaFormat = format;
                    mediaExtractor.selectTrack(i);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        Surface surface = MainActivity.getSurface();
        mediaCodec.configure(mediaFormat, surface, null, 0);
        mediaCodec.start();
    }


    /**
     * Play the MP4 file Thread
     * Main decoding loop
     */
    private class DecoderMP4Thread extends Thread {
        long pts = 0;


        @Override
        public void run() {
            super.run();
            while (!isDecodeFinish) {
                int inputIndex = mediaCodec.dequeueInputBuffer(-1);
                Log.d(TAG, "inputIndex: " + inputIndex);
                if (inputIndex >= 0) {
                    ByteBuffer byteBuffer = mediaCodec.getInputBuffer(inputIndex);
                    // Read one sample (typically one frame) of data
                    int sampSize = mediaExtractor.readSampleData(byteBuffer, 0);
                    // Read the presentation timestamp
                    long time = mediaExtractor.getSampleTime();
                    if (sampSize > 0 && time >= 0) { // >= 0: the first sample's timestamp is usually 0
                        mediaCodec.queueInputBuffer(inputIndex, 0, sampSize, time, 0);
                        // Pace playback against the stream's timestamps (about 30 fps here)
                        mSpeedController.preRender(time);
                        // Must be called after each sample to advance to the next one
                        mediaExtractor.advance();
                    }
                }
                BufferInfo bufferInfo = new BufferInfo();
                int outIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
                if (outIndex >= 0) {
                    mediaCodec.releaseOutputBuffer(outIndex, true);
                }
            }
        }
    }


    public void close() {
        try {
            Log.d(TAG, "close start");
            if (mediaCodec != null) {
                isDecodeFinish = true;
                try {
                    if (mDecodeMp4Thread != null) {
                        mDecodeMp4Thread.join(2000);
                    }
                } catch (InterruptedException e) {
                    Log.e(TAG, "InterruptedException " + e);
                }
                boolean isAlive = mDecodeMp4Thread != null && mDecodeMp4Thread.isAlive();
                Log.d(TAG, "close end isAlive :" + isAlive);
                mediaCodec.stop();
                mediaCodec.release();
                mediaCodec = null;
                mSpeedController.reset();
            }
        } catch (IllegalStateException e) {
            e.printStackTrace();
        }
        instance = null;
    }




    public void startMP4Decode() {
        initMediaCodecSys();
        mDecodeMp4Thread = new DecoderMP4Thread();
        mDecodeMp4Thread.setName("DecoderMP4Thread");
        mDecodeMp4Thread.start();
    }
}

SpeedManager.java

package com.example.mediacodec_decodemp4;


import android.util.Log;


/**
 * Movie player callback.
 * <p>
 * The goal here is to play back frames at the original rate.  This is done by introducing
 * a pause before the frame is submitted to the renderer.
 * <p>
 * This is not coordinated with VSYNC.  Since we can't control the display's refresh rate, and
 * the source material has time stamps that specify when each frame should be presented,
 * we will have to drop or repeat frames occasionally.
 * <p>
 * Thread restrictions are noted in the method descriptions.  The FrameCallback overrides should
 * only be called from the MoviePlayer.
 */
public class SpeedManager {


    private static final String TAG = "weekend";


    private static final boolean CHECK_SLEEP_TIME = false;


    private static final long ONE_MILLION = 1000000L;


    private long mPrevPresentUsec;
    private long mPrevMonoUsec;
    private long mFixedFrameDurationUsec;
    private boolean mLoopReset;


    /**
     * Sets a fixed playback rate.  If set, this will ignore the presentation time stamp
     * in the video file.  Must be called before playback thread starts.
     */
    public void setFixedPlaybackRate(int fps) {
        mFixedFrameDurationUsec = ONE_MILLION / fps;
    }


    // runs on decode thread
    public void preRender(long presentationTimeUsec) {
        // For the first frame, we grab the presentation time from the video
        // and the current monotonic clock time.  For subsequent frames, we
        // sleep for a bit to try to ensure that we're rendering frames at the
        // pace dictated by the video stream.
        //
        // If the frame rate is faster than vsync we should be dropping frames.  On
        // Android 4.4 this may not be happening.


        if (mPrevMonoUsec == 0) {
            // Latch current values, then return immediately.
            mPrevMonoUsec = System.nanoTime() / 1000;
            mPrevPresentUsec = presentationTimeUsec;
        } else {
            // Compute the desired time delta between the previous frame and this frame.
            long frameDelta;
            if (mLoopReset) {
                // We don't get an indication of how long the last frame should appear
                // on-screen, so we just throw a reasonable value in.  We could probably
                // do better by using a previous frame duration or some sort of average;
                // for now we just use 30fps.
                mPrevPresentUsec = presentationTimeUsec - ONE_MILLION / 30;
                mLoopReset = false;
            }
            if (mFixedFrameDurationUsec != 0) {
                // Caller requested a fixed frame rate.  Ignore PTS.
                frameDelta = mFixedFrameDurationUsec;
            } else {
                frameDelta = presentationTimeUsec - mPrevPresentUsec;
                Log.d(TAG," frameDelta: "+frameDelta);
            }
            if (frameDelta < 0) {
                //LogManager.w("Weird, video times went backward");
                frameDelta = 0;
            } else if (frameDelta == 0) {
                // This suggests a possible bug in movie generation.
                //LogManager.i("Warning: current frame and previous frame had same timestamp");
            } else if (frameDelta > 10 * ONE_MILLION) {
                // Inter-frame times could be arbitrarily long.  For this player, we want
                // to alert the developer that their movie might have issues (maybe they
                // accidentally output timestamps in nsec rather than usec).
                frameDelta = 5 * ONE_MILLION;
            }


            long desiredUsec = mPrevMonoUsec + frameDelta;  // when we want to wake up
            long nowUsec = System.nanoTime() / 1000;
            while (nowUsec < (desiredUsec - 100) /*&& mState == RUNNING*/) {
                // Sleep until it's time to wake up.  To be responsive to "stop" commands
                // we're going to wake up every half a second even if the sleep is supposed
                // to be longer (which should be rare).  The alternative would be
                // to interrupt the thread, but that requires more work.
                //
                // The precision of the sleep call varies widely from one device to another;
                // we may wake early or late.  Different devices will have a minimum possible
                // sleep time. If we're within 100us of the target time, we'll probably
                // overshoot if we try to sleep, so just go ahead and continue on.
                long sleepTimeUsec = desiredUsec - nowUsec;
                if (sleepTimeUsec > 500000) {
                    sleepTimeUsec = 500000;
                }
                try {
                    if (CHECK_SLEEP_TIME) {
                        long startNsec = System.nanoTime();
                        Thread.sleep(sleepTimeUsec / 1000, (int) (sleepTimeUsec % 1000) * 1000);
                        long actualSleepNsec = System.nanoTime() - startNsec;
                    } else {
                        long time = sleepTimeUsec / 1000;
                        Log.d(TAG," time: "+time);
                        Thread.sleep(time, (int) (sleepTimeUsec % 1000) * 1000);
                    }
                } catch (InterruptedException ie) {
                }
                nowUsec = System.nanoTime() / 1000;
            }


            // Advance times using calculated time values, not the post-sleep monotonic
            // clock time, to avoid drifting.
            mPrevMonoUsec += frameDelta;
            mPrevPresentUsec += frameDelta;
        }
    }


    // runs on decode thread
    public void postRender() {
    }


    public void loopReset() {
        mLoopReset = true;
    }


    public void reset() {
        mPrevPresentUsec = 0;
        mPrevMonoUsec = 0;
        mFixedFrameDurationUsec = 0;
        mLoopReset = false;
    }
}
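If the source's timestamps are unreliable, playback can instead be pinned to a fixed rate before the decode thread starts. A one-line usage sketch (the demo itself paces against the stream's PTS values):

// e.g. in DecoderManager.startMP4Decode(), before starting the thread:
mSpeedController.setFixedPlaybackRate(30); // ignore PTS, render at ~30 fps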

MainActivity.java

package com.example.mediacodec_decodemp4;


import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import android.content.pm.PackageManager;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.os.Handler;
import android.os.Message;
import android.util.Log;
import android.view.Surface;
import android.view.SurfaceView;
import android.view.View;
import android.widget.Button;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;


public class MainActivity extends AppCompatActivity {


    // The demo ships the video in res/raw and first copies it to the local storage
    // path below (MP4_PLAY_PATH); note the copy requires the storage permission
    public static final String MP4_PLAY_PATH = Environment.getExternalStorageDirectory().getAbsolutePath() + "/TestInputV.mp4";
    private static final String TAG = "weekend";
    private boolean mWorking = false;
    public static SurfaceView surfaceView;
    private Button mStartBtn;
    private static final int INIT_MANAGER_MSG = 0x01;
    private static final int INIT_MANAGER_DELAY = 500;
    private final static int CAMERA_OK = 10001;
    private static String[] PERMISSIONS_STORAGE = {
            "android.permission.CAMERA",
            "android.permission.WRITE_EXTERNAL_STORAGE" };




    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        copyResourceToMemory(R.raw.video, MP4_PLAY_PATH);


        surfaceView = findViewById(R.id.surfaceview);
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            if (!checkPermissionAllGranted(PERMISSIONS_STORAGE)){
                ActivityCompat.requestPermissions(MainActivity.this,
                        PERMISSIONS_STORAGE, CAMERA_OK);
            }
        }
        mStartBtn = findViewById(R.id.btnStartPlay);


        mStartBtn.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                if(mWorking){
                    stopWork();
                    mWorking = false;
                    mStartBtn.setText("start");
                }else{
                    mHandler.sendEmptyMessageDelayed(INIT_MANAGER_MSG, INIT_MANAGER_DELAY);
                    mWorking = true;
                    mStartBtn.setText("stop");
                }
            }
        });
    }


    private Handler mHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            super.handleMessage(msg);
            if (msg.what == INIT_MANAGER_MSG) {
                startWork();
            }
        }
    };


    private boolean checkPermissionAllGranted(String[] permissions) {
        for (String permission : permissions) {
            if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
                // If any single permission has not been granted, return false immediately
                return false;
            }
        }
        return true;
    }


    private void startWork() {
        DecoderManager.getInstance().startMP4Decode();
    }


    private void stopWork() {
        DecoderManager.getInstance().close();
    }


    private void copyResourceToMemory(int srcPath, String destPath) {
        InputStream fileInputStream = null;
        FileOutputStream fileOutputStream = null;
        try {
            fileInputStream = getResources().openRawResource(srcPath);
            File file = new File(destPath);
            if (file.exists()) {
                return;
            }
            file.createNewFile();
            fileOutputStream = new FileOutputStream(file);
            byte[] bytes = new byte[1024];
            int len;
            while ((len = fileInputStream.read(bytes)) > 0) {
                // Write only the bytes actually read, not the whole buffer
                fileOutputStream.write(bytes, 0, len);
            }


        } catch (FileNotFoundException e) {
            Log.e(TAG, "copyVideoResourceToMemory FileNotFoundException : " + e);
        } catch (IOException e) {
            Log.e(TAG, "copyVideoResourceToMemory IOException : " + e);
        } finally {
            try {
                if(fileInputStream!=null){
                    fileInputStream.close();
                }
                if(fileOutputStream!=null){
                    fileOutputStream.close();
                }


            } catch (IOException e) {
                Log.e(TAG, "close stream IOException : " + e);
            }
        }
    }


    public static Surface getSurface() {
        return surfaceView.getHolder().getSurface();
    }
}

activity_main.xml

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >


    <SurfaceView
        android:id="@+id/surfaceview"
        android:layout_width="match_parent"
        android:layout_marginBottom="60dp"
        android:layout_height="match_parent"/>


    <Button
        android:id="@+id/btnStartPlay"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_centerHorizontal="true"
        android:layout_alignParentBottom="true"
        android:text="Start"/>


</RelativeLayout>

AndroidManifest.xml

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.mediacodec_decodemp4">


    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
    <uses-permission android:name="android.permission.CAMERA" />


    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/Theme.MediaCodec_DecodeMP4">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />


                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>


</manifest>