Pitfalls of encoding camera data with MediaCodec in Android Studio

Today is the first day of the 2023 Lunar New Year. I've recently been working on a small audio/video project (for my graduation thesis) that records video by encoding camera frames and saving them to the phone, using MediaCodec for the encoding. This post is mainly about the bugs I ran into while writing the code; if you are not familiar with MediaCodec's producer/consumer model, I recommend reading an introduction elsewhere first. I hope it saves others who hit the same problems some time.

The first step is getting data from the camera. There is nothing special to say about that; code for opening the camera is everywhere online.

Next, configure MediaCodec and MediaMuxer (which produces the MP4 container) to save the frames into an MP4 file. First we initialize the MediaCodec. I started from the grafika sample code, but in practice it had several problems:

mBufferInfo = new MediaCodec.BufferInfo();
MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
format.setInteger(MediaFormat.KEY_BIT_RATE, width*height*5);
format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
if (VERBOSE) Log.d(TAG, "format: " + format);
mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mEncoder.start();
ouputFile1=new File(outputFile,"camera-test.mp4");
outputFile2=new File(outputFile,"mengying.264");
mMuxer = new MediaMuxer(ouputFile1.toString(),
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

Every parameter here needs care. First, the data we feed the encoder is YUV420 (code for converting the camera's NV21 frames to YUV420 is given near the end). The width and height must match the frame size configured on the camera exactly. I had set the camera to 1920*1080 but configured 1280*720 here, which threw a BufferOverflowException: once encoding actually starts, each frame is copied into a buffer obtained from mEncoder.dequeueInputBuffer, and every ByteBuffer in that input queue is allocated according to the configured width and height (a YUV420 frame occupies width*height*3/2 bytes). Some articles claim you can enlarge the buffers through a MAX_SIZE-style property on MediaFormat (presumably KEY_MAX_INPUT_SIZE); in my experience the size cannot be set manually and is determined solely by the width and height you configure. The bitrate cannot be set carelessly either. When I got it wrong, the encoder logged the following and threw an IllegalStateException:

2023-01-22 01:56:50.065 25734-25895/com.example.testmedias E/ACodec: [OMX.qcom.video.encoder.avc] ERROR(0x80001009)
2023-01-22 01:56:50.065 25734-25895/com.example.testmedias E/ACodec: signalError(omxError 0x80001009, internalError -2147483648)
2023-01-22 01:56:50.065 25734-25894/com.example.testmedias E/MediaCodec: Codec reported err 0x80001009, actionCode 0, while in state 6

The explanations of this error you find online are mostly copied from one another, claiming the input data is wrong or that some flag was not set. In my view, when this happens you should also check whether the encoding parameters are reasonable; exceeding the device's maximum supported resolution can trigger it too (a sketch of such a parameter check is below). One more thing: remember to release() the MediaMuxer when you are done, otherwise the resulting MP4 file will not play.
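
For example, on API 21+ you can ask the codec itself what it supports before calling configure(). This is only a minimal sketch that was not in my project; width, height, and bitRate stand for the values you intend to configure:

// Sketch (API 21+): validate size and bitrate against the encoder's
// capabilities before configure(), instead of guessing after ERROR(0x80001009).
MediaCodec probe = MediaCodec.createEncoderByType("video/avc");
MediaCodecInfo.VideoCapabilities caps = probe.getCodecInfo()
        .getCapabilitiesForType("video/avc")
        .getVideoCapabilities();
if (!caps.isSizeSupported(width, height)) {
    Log.e(TAG, "unsupported size " + width + "x" + height
            + ", supported widths: " + caps.getSupportedWidths());
}
if (!caps.getBitrateRange().contains(bitRate)) {
    Log.e(TAG, "bitrate " + bitRate + " outside " + caps.getBitrateRange());
}
probe.release();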

Another big pitfall: the original grafika project appears to feed input frames to the encoder through a Surface, hence this line:

mInputSurface = mEncoder.createInputSurface();

But my project obtains each frame as a byte[] from Camera's onPreviewFrame callback, converts it to YUV420, and pushes it into the MediaCodec input queue; no Surface is involved anywhere. With that line present, fetching an input buffer throws an IllegalStateException, preceded in the log by something like "can't use with inputSurface". Deleting the line solved it: the two input paths are mutually exclusive, as the sketch below shows.
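
A minimal illustration of the rule (USE_SURFACE_INPUT is a made-up constant, not real API; a real project picks one branch):

// A MediaCodec encoder takes input EITHER from a Surface OR from
// ByteBuffers, never both.
if (USE_SURFACE_INPUT) {
    // Surface input: create between configure() and start();
    // dequeueInputBuffer() must NOT be called afterwards.
    mInputSurface = mEncoder.createInputSurface();
} else {
    // ByteBuffer input: feed frames via dequeueInputBuffer()/queueInputBuffer();
    // do NOT call createInputSurface().
}
mEncoder.start();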

Below is my FrameEncoder code. I am only a fourth-year undergraduate and the code style is rough, so take it as a reference only.

/*
 * Copyright 2014 Google Inc. All rights reserved.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.example.testmedias;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.os.Build;
import android.os.Handler;
import android.util.Log;
import android.view.Surface;

import androidx.annotation.RequiresApi;

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

/**
 * This class wraps up the core components used for ByteBuffer-input video encoding.
 * <p>
 * Frames arrive as NV12 byte arrays via getBufferData(), which copies them into
 * the encoder's input queue and then calls drainEncoder() to forward any pending
 * output to the muxer.
 * <p>
 * This class is not thread-safe.
 */
public class FrameEncoder {
    private static final String TAG = MainActivity.TAG;
    private static final boolean VERBOSE = false;

    // TODO: these ought to be configurable as well
    private static final String MIME_TYPE = "video/avc";    // H.264 Advanced Video Coding
    private static final int FRAME_RATE = 30;               // 30fps
    private static final int IFRAME_INTERVAL = 5;           // 5 seconds between I-frames

    private Surface mInputSurface;
    private MediaMuxer mMuxer;
    private MediaCodec mEncoder;
    private MediaCodec.BufferInfo mBufferInfo;
    private int mTrackIndex;
    private boolean mMuxerStarted;
    private boolean endOfStream=false;
    public boolean isRecording=true;

    private BufferedOutputStream bufferedOutputStream;

    private File ouputFile1,outputFile2;


    /**
     * Configures encoder and muxer state, and prepares the input Surface.
     */
    @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN)
    public FrameEncoder(int width, int height, int bitRate, File outputFile)
            throws IOException {
        mBufferInfo = new MediaCodec.BufferInfo();

        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);

        // Set some properties.  Failing to specify some of these can cause the MediaCodec
        // configure() call to throw an unhelpful exception.
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        format.setInteger(MediaFormat.KEY_BIT_RATE, width*height*5);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
        if (VERBOSE) Log.d(TAG, "format: " + format);

        // Create a MediaCodec encoder, and configure it with our format.  Get a Surface
        // we can use for input and wrap it with a class that handles the EGL work.
        mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
        mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mEncoder.start();

        ouputFile1=new File(outputFile,"camera-test.mp4");
        outputFile2=new File(outputFile,"mengying.264");
        // Create a MediaMuxer.  We can't add the video track and start() the muxer here,
        // because our MediaFormat doesn't have the Magic Goodies.  These can only be
        // obtained from the encoder after it has started processing data.
        //
        // We're not actually interested in multiplexing audio.  We just want to convert
        // the raw H.264 elementary stream we get from MediaCodec into a .mp4 file.
        mMuxer = new MediaMuxer(ouputFile1.toString(),
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

        bufferedOutputStream=new BufferedOutputStream(new FileOutputStream(outputFile2));
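        // Note: this raw-H.264 side file is opened but never written in this
        // version; only the muxer's .mp4 output is actually produced.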

        mTrackIndex = -1;
        mMuxerStarted = false;
    }

    /**
     * Returns the encoder's input surface (always null in this ByteBuffer-input
     * version, since createInputSurface() is no longer called).
     */
    public Surface getInputSurface() {
        return mInputSurface;
    }

    public void setStreamState(boolean endOfStream)
    {
        this.endOfStream=endOfStream;
    }
    /**
     * Releases encoder resources.
     */
    public void release() {
        if (VERBOSE) Log.d(TAG, "releasing encoder objects");
        if (mEncoder != null) {
            mEncoder.stop();
            mEncoder.release();
            mEncoder = null;
        }
        if (mMuxer != null) {
            // TODO: stop() throws an exception if you haven't fed it any data.  Keep track
            //       of frames submitted, and don't call stop() if we haven't written anything.
            mMuxer.stop();
            mMuxer.release();
            mMuxer = null;
        }
    }

    @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
    public void getBufferData(byte[] buffer)
    {
        if(!isRecording)
        {
            return;
        }

        // new Handler() binds to the current thread's Looper, so the Runnable
        // runs on this same thread once onPreviewFrame() returns.
        Handler handler = new Handler();
        handler.post(new Runnable() {
            @Override
            public void run() {
                // Lock the encoder wrapper, not the Runnable: each call posts a
                // fresh Runnable, so "synchronized (this)" would guard nothing.
                synchronized (FrameEncoder.this) {
                    try {
                        int index = mEncoder.dequeueInputBuffer(0);
                        // 0 is a valid buffer index; "index > 0" would silently
                        // drop frames, so test ">= 0".
                        if (index >= 0) {
                            ByteBuffer inputBuffer = mEncoder.getInputBuffer(index);
                            inputBuffer.clear();
                            inputBuffer.put(buffer, 0, buffer.length);
                            mEncoder.queueInputBuffer(index, 0, buffer.length,
                                    System.nanoTime() / 1000, 0);
                            drainEncoder();
                        }
                    }
                    catch (NullPointerException ex)
                    {
                        ex.printStackTrace();
                    }
                }
            }
        });


    }

    /**
     * Extracts all pending data from the encoder and forwards it to the muxer.
     * <p>
     * If endOfStream is not set, this returns when there is no more data to drain.  If it
     * is set, we send EOS to the encoder, and then iterate until we see EOS on the output.
     * Calling this with endOfStream set should be done once, right before stopping the muxer.
     * <p>
     * We're just using the muxer to get a .mp4 file (instead of a raw H.264 stream).  We're
     * not recording audio.
     */
    public void drainEncoder() {
        final int TIMEOUT_USEC = 10000;
        if (VERBOSE) Log.d(TAG, "drainEncoder(" + endOfStream + ")");

        if (endOfStream) {
            if (VERBOSE) Log.d(TAG, "sending EOS to encoder");
            // signalEndOfInputStream() only applies to Surface input; with
            // ByteBuffer input, EOS is sent via queueInputBuffer() with
            // BUFFER_FLAG_END_OF_STREAM instead.
//            mEncoder.signalEndOfInputStream();
        }

        ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();
        while (true) {
            int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
                if (!endOfStream) {
                    break;      // out of while
                } else {
                    if (VERBOSE) Log.d(TAG, "no output available, spinning to await EOS");
                }
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                encoderOutputBuffers = mEncoder.getOutputBuffers();
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // should happen before receiving buffers, and should only happen once
                if (mMuxerStarted) {
                    throw new RuntimeException("format changed twice");
                }
                MediaFormat newFormat = mEncoder.getOutputFormat();
                Log.d(TAG, "encoder output format changed: " + newFormat);

                // now that we have the Magic Goodies, start the muxer
                mTrackIndex = mMuxer.addTrack(newFormat);
                mMuxer.start();
                mMuxerStarted = true;
            } else if (encoderStatus < 0) {
                Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " +
                        encoderStatus);

                // let's ignore it
            } else {
                ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                if (encodedData == null) {
                    System.out.println("mengying:ERROR:encodedData is null");
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                            " was null");
                }

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // The codec config data was pulled out and fed to the muxer when we got
                    // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                    if (VERBOSE) Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                    mBufferInfo.size = 0;
                }

                if (mBufferInfo.size != 0) {
                    if (!mMuxerStarted) {
                        System.out.println("mengying:muxer hasn't started");
                        throw new RuntimeException("muxer hasn't started");
                    }

                    System.out.println("mengying is encoding");
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    encodedData.position(mBufferInfo.offset);
                    encodedData.limit(mBufferInfo.offset + mBufferInfo.size);

                    mMuxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
                    if (VERBOSE) {
                        Log.d(TAG, "sent " + mBufferInfo.size + " bytes to muxer, ts=" +
                                mBufferInfo.presentationTimeUs);
                    }
                }

                mEncoder.releaseOutputBuffer(encoderStatus, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    if (!endOfStream) {
                        Log.w(TAG, "reached end of stream unexpectedly");
                    } else {
                        if (VERBOSE) Log.d(TAG, "end of stream reached");
                    }
                    break;      // out of while
                }
            }
        }
    }
}

onPreviewFrame has a pitfall of its own: you have to give the camera a texture. If you never call setPreviewTexture(), the callback simply never fires; that one took me over an hour to figure out. The texture ID I passed when creating the SurfaceTexture was picked arbitrarily, since it only matters for OpenGL drawing, and I did not want to drag OpenGL into this project and add complications for now.
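
A minimal sketch of that workaround (camera and previewCallback stand for your own objects; the texture name 10 is arbitrary because nothing is ever drawn with it):

// The old Camera API delivers no preview frames without a preview target,
// so attach a dummy SurfaceTexture (texture id chosen arbitrarily).
SurfaceTexture dummyTexture = new SurfaceTexture(10);
try {
    camera.setPreviewTexture(dummyTexture);
} catch (IOException e) {
    e.printStackTrace();
}
camera.setPreviewCallback(previewCallback);  // onPreviewFrame now fires
camera.startPreview();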

Here is the code that converts NV21 into the YUV420 semi-planar (NV12) layout the encoder expects:

    private byte[] NV21ToNV12(byte[] nv21, int width, int height) {
        byte[] nv12 = new byte[width * height * 3 / 2];
        int frameSize = width * height;
        // The Y plane is identical in both layouts.
        System.arraycopy(nv21, 0, nv12, 0, frameSize);
        // Both layouts interleave the chroma plane, but with U and V swapped:
        // copy each VU pair from NV21 into NV12 as a UV pair.
        for (int j = 0; j < frameSize / 2; j += 2) {
            nv12[frameSize + j] = nv21[frameSize + j + 1];     // U
            nv12[frameSize + j + 1] = nv21[frameSize + j];     // V
        }
        return nv12;
    }
    //NV21: YYYY... VUVU...
    //NV12 (YUV420 semi-planar): YYYY... UVUV...
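
For context, a hypothetical call site (previewWidth, previewHeight, and mFrameEncoder are placeholders for your own fields; the old Camera API delivers NV21 by default):

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // data arrives as NV21; convert before handing it to the encoder.
    byte[] nv12 = NV21ToNV12(data, previewWidth, previewHeight);
    mFrameEncoder.getBufferData(nv12);
}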

Don't forget to add the necessary permissions; without them, opening the camera will crash:

<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.FLASHLIGHT" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
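
On Android 6.0 and above the manifest entry alone is not enough: the CAMERA permission (and, on older versions, WRITE_EXTERNAL_STORAGE) must also be requested at runtime, along these lines (REQUEST_CAMERA is an arbitrary request code; this is assumed to run inside an Activity):

// Runtime check/request for the dangerous CAMERA permission (API 23+).
if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.CAMERA}, REQUEST_CAMERA);
}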
