Android Audio/Video Recording (2) — Buffer Recording

Overview

If you have not read the previous post on Surface recording, please first read Android音视频录制概述 and Android音视频录制(1)——Surface录制, then come back to this article.

After you finish this article, I also recommend another post: Android全关键帧视频录制——视频编辑必备.

As the earlier articles explained, Surface recording draws the camera frames onto the encoder's input Surface via EGL and OpenGL, and the encoded result ends up in a file. Buffer recording is more direct: the raw camera frames are fed straight into the encoder, which encodes them and writes the output to a file. The details follow below.

Audio recording was already covered in detail in the Surface recording article, and the audio code and principles here are essentially the same, so to save your time this article only covers the video track. Let's begin.

Flow overview: the Camera is bound to a SurfaceView; raw frames arrive through onPreviewFrame(); each frame is fed into the video encoder (MediaCodec); the encoded output is handed to the muxer (MediaMuxer); finally MediaMuxer writes the data to a file.
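
To make the flow concrete, here is a minimal sketch of the pipeline (the variable names such as camera and videoEncoder are placeholders; the real classes follow later in this article):

// Sketch only: camera frames -> encoder -> muxer -> file.
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] nv21, Camera camera) {
        videoEncoder.addFrame(nv21);   // 1. cache the raw NV21 frame
    }
});
// 2. The encoder thread converts NV21 -> NV12 and queues it into MediaCodec's input buffer.
// 3. The encoded output is drained and passed to MediaMuxer.writeSampleData().
// 4. MediaMuxer writes the sample into the .mp4 file.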

To repeat: please read Android音视频录制概述 and Android音视频录制(1)——Surface录制 before this article, otherwise some points may be hard to follow. In addition to a basic understanding of the Android multimedia APIs mentioned before (MediaCodec, MediaMuxer, Camera, etc.), you also need to know a bit about YUV frame data, because Buffer recording works on raw YUV video frames. For an introduction to color formats, see these two articles: Android颜色模式详解 and YUV详解.

Preview

In Android音视频录制(1)——Surface录制 we used a GLSurfaceView as the preview target for recording; in this article we use a plain SurfaceView instead. The reason is that getting the camera's YUV data out of a GLSurfaceView is very hard: what you read back from a GLSurfaceView is ARGB data, and converting it to YUV by hand would be a serious performance problem.
When initializing the camera you must set the preview format to NV21 (one of the YUV formats). Once the camera is initialized it is bound to the SurfaceView, the NV21 preview data arrives in the onPreviewFrame() callback, and each frame is handed to the encoder. The preview-related code is below.

package lda.com.myrecorder;

import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.media.MediaMetadataRetriever;
import android.os.Environment;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.widget.Button;

import java.io.File;
import java.io.IOException;
import java.util.List;

public class PreviewActivity extends Activity {

    private static final String TAG = PreviewActivity.class.getSimpleName();
    private Button mRecordCtrlView;
    private Button mCapturePictureView;
    private Button mSwitchCameraView;
    private SurfaceView mSurfaceView;
    private Camera mCamera;
    private SurfaceHolder mSurfaceHolder;
    private SurfaceHolder.Callback mSurfaceCallback;
    private Camera.Parameters mParameters;
    private Camera.PreviewCallback mPreviewCallback;
    private MMuxer mMuxer;
    private VideoEncoder mVideoEncoder;
    private boolean mIsRecording = false;
    private long mPreviewImgTime = 0;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_preview);

        initData();
        initView();
    }

    private void initData() {
        mSurfaceCallback = new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder surfaceHolder) {
                boolean isInit = true;
                if (mCamera == null) {
                    isInit = initCamera();
                }
                if (isInit) {
                    // only touch mCamera after initCamera() succeeded, otherwise it may still be null
                    Log.d(TAG, "surfaceCreated format: " + mCamera.getParameters().getPreviewFormat());
                    startPreview();
                }
            }

            @Override
            public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {

            }

            @Override
            public void surfaceDestroyed(SurfaceHolder surfaceHolder) {

            }
        };
        mPreviewImgTime = 0;
        mPreviewCallback = new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] bytes, Camera camera) {
                if(mIsRecording) {
                    Frame frame = new Frame();
                    frame.mData = bytes;
                    frame.mTime = System.nanoTime() / 1000;
                    if(frame.mTime - mPreviewImgTime > 1000 * 1000) {
//                        VideoEncoder.saveBitmap(frame, mCamera.getParameters().getPreviewFormat());
                        mPreviewImgTime = frame.mTime;
                    }
                    // hand the preview NV21 frame to the encoder
                    mVideoEncoder.addFrame(bytes);
                }
            }
        };
    }

    private void startPreview() {
        try {
            // bind the SurfaceView as the preview display
            mCamera.setPreviewDisplay(mSurfaceHolder);
            mCamera.startPreview(); // start preview
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // initialize the camera
    private boolean initCamera() {
        int num = Camera.getNumberOfCameras();
        if(num <= 0){
            return false;
        }
        boolean open = true;
        try {
            if (num == 1) {
                mCamera = Camera.open(0);
            } else {
                mCamera = Camera.open(1);
            }
            mCamera.setPreviewCallback(mPreviewCallback);
            mParameters = mCamera.getParameters();
            mParameters.setRotation(90);
            mParameters.setPreviewFormat(ImageFormat.NV21); // request NV21 preview frames
            List<Camera.Size> list = mCamera.getParameters().getSupportedPreviewSizes();
            if(list != null && !list.isEmpty()){
                for(Camera.Size size : list){
                    Log.d(TAG, "camera support size=" + size.width + " " + size.height);
                }
                for(Camera.Size size : list){
                    if(size.height == Config.VIDEO_WIDTH && size.width == Config.VIDEO_HEIGHT){
                        mParameters.setPreviewSize(size.width, size.height); // preview size
                        mCamera.setParameters(mParameters);
                        mCamera.setDisplayOrientation(90); // preview orientation
                        return true;
                    }
                }
            }
        }catch (Exception e){
            Log.e(TAG, "initCamera error", e);
        }
        return false;
    }

    private void initView() {
        mRecordCtrlView = (Button)findViewById(R.id.record_ctrl);
        mCapturePictureView = (Button)findViewById(R.id.catch_pic);
        mSwitchCameraView = (Button)findViewById(R.id.switch_camera);
        mSurfaceView = (SurfaceView)findViewById(R.id.preview_view);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(mSurfaceCallback);

        mRecordCtrlView.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                if(mIsRecording){
                    mRecordCtrlView.setText("Start Recording");
                    mVideoEncoder.stop();
                    new Thread(new Runnable() {
                        @Override
                        public void run() {
                            try {
                                // crude wait for the encoder/muxer to finish writing the file before reading it back
                                Thread.sleep(3000);
                                printVideoInfo();
                                saveFirstFrame();

                            } catch (InterruptedException e) {
                                e.printStackTrace();
                            }
                        }
                    }).start();

                }else{
                    mMuxer = new MMuxer(getSaveVideoPath());
                    mVideoEncoder = new VideoEncoder(mMuxer);
                    mVideoEncoder.setAllKeyFrame(true);
                    mVideoEncoder.prepare();
                    mVideoEncoder.start();
                    mRecordCtrlView.setText("Stop Recording");
                }
                mIsRecording = !mIsRecording;
            }
        });
    }

    private void saveFirstFrame() {
        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        retriever.setDataSource(getSaveVideoPath());
        Bitmap bitmap = retriever.getFrameAtTime(0);
        BitmapUtil.saveBitmap(bitmap, new File(Config.getSaveDir(), "first.jpg"));
        Log.e(TAG, "saveFirstFrame: save first frame");
    }

    private void printVideoInfo() {
        VideoInfo videoInfo = VideoInfo.getVideoInfo(getSaveVideoPath());
        if(videoInfo != null) {
            Log.d(TAG, "videoInfo width=" + videoInfo.mWidth + " height=" + videoInfo.mHeight + " duration=" + videoInfo.mDuration);
        }else{
            Log.d(TAG, "video info null");
        }
    }

    private String getSaveVideoPath() {
        File dir = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator + "00recorder" + File.separator);
        if(!dir.exists() || !dir.isDirectory()){
            dir.mkdirs();
        }
        File file = new File(dir, "buffer.mp4");
        return file.getAbsolutePath();
    }
}

The video frame data class:

package lda.com.myrecorder;

/**
 * Created by lda on 2017/11/16.
 */

public class Frame{
    public byte[] mData; // raw frame data, NV21
    public long mTime; // timestamp in microseconds
    public boolean mIsEos = false; // end-of-stream marker: stop encoding
}

Configuration:

package lda.com.myrecorder;

import android.os.Environment;

import java.io.File;

/**
 * Created by lda on 2017/10/11.
 */

public class Config {
    public static final int VIDEO_WIDTH = 720;
    public static final int VIDEO_HEIGHT = 1280;
    public static String getSaveDir(){
        String path = Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator + "00recorder" + File.separator;
        File f = new File(path);
        if(!f.exists() || !f.isDirectory()){
            f.mkdirs();
        }
        return path;
    }

    public static String getSavePath(){
        return getSaveDir() + "aa.mp4";
    }
}

File helper:

package lda.com.myrecorder;

import android.text.TextUtils;

import java.io.File;

/**
 * Created by lda on 2017/11/10.
 */

public class FileUtil {

    public static boolean isFileExisted(String path) {
        if(TextUtils.isEmpty(path)){
            return false;
        }
        File file = new File(path);
        if(file.exists()){
            return true;
        }
        return false;
    }
}

Bitmap helper:

package lda.com.myrecorder;

import android.graphics.Bitmap;

import java.io.File;
import java.io.FileOutputStream;

/**
 * Created by lda on 2017/11/16.
 */

public class BitmapUtil{

    public static boolean saveBitmap(Bitmap bitmap, File dstFile) {
        if (bitmap == null || bitmap.isRecycled()) {
            return false;
        }
        if (dstFile == null) {
            return false;
        }
        if (dstFile.exists() && dstFile.isFile()) {
            dstFile.delete();
        }
        try {
            FileOutputStream fos = new FileOutputStream(dstFile);
            bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fos);
            fos.flush();
            fos.close();
            return true;
        } catch (Exception e) {
            // ignore and report failure
        }
        return false;
    }
}

Retrieving video info:

package lda.com.myrecorder;

import android.media.MediaMetadataRetriever;
import android.util.Log;

/**
 * Created by lda on 2017/11/10.
 * 
 * Retrieves basic video metadata (width, height, rotation, duration)
 */

public class VideoInfo {
    private static final String TAG = "VideoInfo";
    public int mHeight = 0;
    public int mWidth = 0;
    public int mAngle = 0;
    public String mPath;
    public long mDuration = 0;

    public static VideoInfo getVideoInfo(String path) {

        if (FileUtil.isFileExisted(path)) {
            MediaMetadataRetriever retr = null;
            try {
                retr = new MediaMetadataRetriever();
                retr.setDataSource(path);
                String height = retr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT); // video height
                String width = retr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH); // video width
                String angle = retr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION);
                String duration = retr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
                Log.i("dao.log.rotate", "height=" + height + " width=" + width);
                VideoInfo videoInfo = new VideoInfo();
                videoInfo.mHeight = Integer.parseInt(height);
                videoInfo.mWidth = Integer.parseInt(width);
                videoInfo.mAngle = Integer.parseInt(angle);
                videoInfo.mPath = path;
                videoInfo.mDuration = Long.parseLong(duration);
                return videoInfo;
            } catch (Exception e) {
                Log.e(TAG, e + "");
            } finally {
                if (retr != null) {
                    try {
                        retr.release();
                    } catch (Exception e) {
                    }
                }
            }
        }
        return null;
    }
}

The UI layout file:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context="lda.com.myrecorder.PreviewActivity">
    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content">
        <Button
            android:text="Start Recording"
            android:id="@+id/record_ctrl"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"/>
        <Button
            android:id="@+id/switch_camera"
            android:text="Switch Camera"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"/>
        <Button
            android:id="@+id/catch_pic"
            android:text="Capture"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"/>
    </LinearLayout>
    <SurfaceView
        android:id="@+id/preview_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"/>
</LinearLayout>

Preview is much simpler than in Surface recording, and the code above is already well commented, so I won't dwell on it here.

Encoding

The encoding principle is simple. While previewing, the camera keeps pushing frames in through addFrame(), and the encoder keeps a list (mFrameList) that buffers these frames. The encoder itself is a polling thread that checks this list at an interval shorter than 1000 / frame-rate milliseconds; whenever a frame is available it is removed from the list and encoded. The encoding step is: convert the NV21 frame to NV12, queue the NV12 data into the encoder, let the encoder do the actual encoding, and then drain the encoded output (output()) into the muxer.
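
Stripped down to its core, the producer/consumer loop looks roughly like this (a simplified sketch of what the VideoEncoder class below does):

// Producer: the camera thread calls addFrame(), which appends to mFrameList.
// Consumer: this polling thread drains the list and feeds the encoder.
@Override
public void run() {
    long loopInterval = 1000 / FRAME_RATE / 2;   // poll faster than one frame interval
    while (mIsRecording) {
        if (!mFrameList.isEmpty()) {
            Frame frame = mFrameList.remove(0);  // oldest cached frame
            input(frame);                        // NV21 -> NV12, queue into MediaCodec, drain output()
        }
        try {
            Thread.sleep(loopInterval);
        } catch (InterruptedException ignored) {
        }
    }
}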

One thing you must pay attention to is the encoder configuration, in particular the color format. My code uses COLOR_FormatYUV420SemiPlanar, which is exactly the NV12 layout, but some devices' encoders do not support it, so you should first check whether the encoder supports this format and switch to another one if it does not. MediaCodecInfo.getCapabilitiesForType() gives you the encoder's MediaCodecInfo.CodecCapabilities, whose colorFormats array lists the color formats the encoder supports; pick a usable format from that list (this article uses COLOR_FormatYUV420SemiPlanar as the example, and I suggest preferring it whenever the encoder supports it). Because the camera delivers NV21, you then have to convert the NV21 data into whatever color format you configured the encoder with.
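
For example, a helper along these lines (a standalone sketch, not code taken from the project) would prefer NV12 and fall back to another recognized YUV420 variant:

// Sketch: ask the encoder which color formats it supports and pick one.
private static int chooseColorFormat(MediaCodecInfo codecInfo, String mimeType) {
    MediaCodecInfo.CodecCapabilities caps = codecInfo.getCapabilitiesForType(mimeType);
    // Prefer NV12 (COLOR_FormatYUV420SemiPlanar) when the encoder lists it.
    for (int format : caps.colorFormats) {
        if (format == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
            return format;
        }
    }
    // Otherwise fall back to any other YUV420 variant we know how to fill.
    for (int format : caps.colorFormats) {
        if (format == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar
                || format == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar) {
            return format;
        }
    }
    return -1; // nothing usable; the caller has to handle this case
}

Note that the VideoEncoder below hardcodes COLOR_FormatYUV420SemiPlanar and only logs what the encoder supports; on a device that rejects NV12 you would plug a helper like this into prepare() and adapt the NV21 conversion to the chosen format.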

As the two color-format articles recommended above explain, in NV12 the chroma plane after the Y plane is stored as UVUV..., while in NV21 it is stored as VUVU...; in the code the conversion is done by the NV21toI420SemiPlanar() function.
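
To make the difference concrete, here is how a tiny 4x2 YUV 4:2:0 frame (8 Y samples sharing 2 U and 2 V samples) is laid out in each format, plus the swap loop that maps one to the other (width, height, nv21 and nv12 are assumed to describe the frame and its buffers):

// 4x2 frame, YUV 4:2:0 — 8 luma bytes followed by 4 interleaved chroma bytes:
// NV21: Y0 Y1 Y2 Y3 Y4 Y5 Y6 Y7 | V0 U0 V1 U1
// NV12: Y0 Y1 Y2 Y3 Y4 Y5 Y6 Y7 | U0 V0 U1 V1
//
// Conversion: copy the Y plane unchanged, then swap every (V, U) byte pair.
System.arraycopy(nv21, 0, nv12, 0, width * height);
for (int i = width * height; i < nv21.length; i += 2) {
    nv12[i]     = nv21[i + 1];  // U
    nv12[i + 1] = nv21[i];      // V
}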

Since the Surface recording article already explained the encoder in detail, I won't repeat it here. Just read the code; the key parts are commented:

package lda.com.myrecorder;

/**
 * Created by lda on 2017/10/11.
 */

public interface IEncoder {
    void prepare();
    void input(Frame frame);
    void output(boolean isEos);
    void release();
}
package lda.com.myrecorder;

import android.annotation.TargetApi;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.media.MediaFormat;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

/**
 * Created by lda on 2017/10/11.
 */

public class VideoEncoder implements IEncoder, Runnable {
    private static final String TAG = "video_encoder";
    private static final String MIME_TYPE = "video/avc";
    private static final int FRAME_RATE = 30;
    private static final int BIT_RATE = 4 * 1024 * 1024;
    private static final long TIMEOUT_USEC = 10000;
    private long mLoopInterval;
    private MediaCodec mMediaCodec;
    private int mColorFormat;
    private long mPresentTimeUs;
    private long mStartTime;
    private MediaCodec.BufferInfo mBufferInfo;
    private MMuxer mMuxer;
    private MediaFormat mMediaFormat;
    private boolean mIsRecording = false;
    // cached raw frames waiting to be encoded
    private List<Frame> mFrameList;
    private boolean mIsRunning = true;
    private boolean mIsAllKeyFrame = false;



    public VideoEncoder(MMuxer muxer){
        mBufferInfo = new MediaCodec.BufferInfo();
        mMuxer = muxer;
        mFrameList = new ArrayList<>();
        mLoopInterval = 1000 / FRAME_RATE / 2; // poll at half the frame interval
    }

    // convert NV21 to NV12: the Y plane is identical, only the interleaved V/U bytes are swapped
    private void NV21toI420SemiPlanar(byte[] nv21bytes, byte[] i420bytes, int width, int height) {
        System.arraycopy(nv21bytes, 0, i420bytes, 0, width * height);
        for (int i = width * height; i < nv21bytes.length; i += 2) {
            i420bytes[i] = nv21bytes[i + 1];
            i420bytes[i + 1] = nv21bytes[i];
        }
    }

    public void addFrame(byte[] data){
        Frame frame = new Frame();
        frame.mTime = System.nanoTime() / 1000;
        frame.mData = data;
        mFrameList.add(frame);
    }

    // push an end-of-stream marker frame so the encoder can finish
    public void eosFrame(){
        Frame frame = new Frame();
        frame.mTime = System.nanoTime() / 1000;
        frame.mIsEos = true;
        mFrameList.add(frame);
    }

    @Override
    public void prepare() {
        MediaCodecInfo codecInfo = selectCodec(MIME_TYPE);
        if (codecInfo == null) {
            return;
        }
        mColorFormat = MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar; // NV12
        // Note: checkColorFormat() only logs what the encoder supports; mColorFormat stays hardcoded here.
        checkColorFormat(codecInfo, MIME_TYPE);
        Log.i(TAG, "colorformat=" + mColorFormat);
        // width/height are swapped so the encoder size (1280x720) matches the landscape camera preview buffer
        MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, Config.VIDEO_HEIGHT, Config.VIDEO_WIDTH);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, mColorFormat);
        try {
            mMediaCodec = MediaCodec.createByCodecName(codecInfo.getName());
        } catch (Exception e) {
            Log.i(TAG, e + "");
        }
        try {
            if(!mIsAllKeyFrame) {
                mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // key frame interval, in seconds
            }else{
                mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 0); // 0 requests a key frame for every frame (works with requestKeyFrame())
            }
            mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Log.i(TAG, "success configure-----------");
        } catch (Exception e) {
            Log.v(TAG, "config failed " + e);
            try {
                mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // fall back to a 1-second key frame interval
                mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
                Log.i(TAG, "config second success");
            } catch (Exception e1) {
                Log.i(TAG, "config second failed " + e1);
            }
        }

        try {
            mMediaCodec.start();
        } catch (Exception e) {
            Log.i(TAG, "start error--" + e);
        }
        mTrackIndex = -1;
    }

    private int mTrackIndex;

    private static int checkColorFormat(MediaCodecInfo codecInfo, String mimeType) {
        MediaCodecInfo.CodecCapabilities capabilities = codecInfo
                .getCapabilitiesForType(mimeType);
        for (int i = 0; i < capabilities.colorFormats.length; i++) {
            int colorFormat = capabilities.colorFormats[i];
            Log.d(TAG, "checkColorFormat support=" + colorFormat);
        }
        for (int i = 0; i < capabilities.colorFormats.length; i++) {
            int colorFormat = capabilities.colorFormats[i];
            if (isRecognizedFormat(colorFormat)) {
                return colorFormat;
            }
        }
        return 0;
    }
    // Alternative NV21 -> NV12 conversion (kept for reference; input() uses NV21toI420SemiPlanar()).
    private void NV21ToNV12(byte[] nv21, byte[] nv12, int width, int height) {
        if (nv21 == null || nv12 == null) return;
        int frameSize = width * height;
        System.arraycopy(nv21, 0, nv12, 0, frameSize); // Y plane is identical
        for (int j = 0; j < frameSize / 2; j += 2) {
            nv12[frameSize + j] = nv21[frameSize + j + 1];     // U
            nv12[frameSize + j + 1] = nv21[frameSize + j];     // V
        }
    }


    private static boolean isRecognizedFormat(int colorFormat) {
        switch (colorFormat) {
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
            case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible:
                return true;
            default:
                return false;
        }
    }

    private static MediaCodecInfo selectCodec(String mimeType) {
        int numCodecs = MediaCodecList.getCodecCount();
        for (int i = 0; i < numCodecs; i++) {
            MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);
            if (!codecInfo.isEncoder()) {
                continue;
            }
            String[] types = codecInfo.getSupportedTypes();
            for (int j = 0; j < types.length; j++) {
                if (types[j].equalsIgnoreCase(mimeType)) {
                    return codecInfo;
                }
            }
        }
        return null;
    }

    public void setAllKeyFrame(boolean allKeyFrame) {
        mIsAllKeyFrame = allKeyFrame;
    }

    @TargetApi(19)
    protected void requestKeyFrame() {
        // ask the encoder to emit a sync (key) frame as soon as possible; used for all-key-frame recording
        if (mIsAllKeyFrame){
            try {
                Bundle reqKeyCmd = new Bundle();
                reqKeyCmd.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0);
                mMediaCodec.setParameters(reqKeyCmd);
            } catch (Exception e) {
            }
        }
    }

    @Override
    public void input(Frame frame) {
        if(mIsAllKeyFrame){
            requestKeyFrame();
        }
        ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
        int inputBufferIndex = mMediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
        byte[] dst = null;
        if(frame.mData != null){
            dst = new byte[frame.mData.length];
            // convert NV21 -> NV12 before feeding the encoder
            NV21toI420SemiPlanar(frame.mData, dst, Config.VIDEO_WIDTH, Config.VIDEO_HEIGHT);
        }
        logD("input frame time =" + frame.mTime + " isEos=" + frame.mIsEos);
        if (inputBufferIndex >= 0) {
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            if(dst != null) {
                inputBuffer.put(dst);
                mMediaCodec.queueInputBuffer(inputBufferIndex, 0, dst.length, frame.mTime, 0);
                output(false);
            }else{
                mMediaCodec.queueInputBuffer(inputBufferIndex, 0, 0, frame.mTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                output(true);
            }
        }
    }

    // Elapsed time since the first call, in microseconds (currently unused; frame.mTime is used as the pts instead).
    private long getPresentTimeUs() {
        if (mStartTime == 0) {
            mStartTime = System.currentTimeMillis();
            return 0;
        }
        mPresentTimeUs = System.currentTimeMillis() - mStartTime;
        return mPresentTimeUs * 1000;
    }

    @Override
    public void output(boolean isEos) {
        String tag = TAG + "-output";
        if(mIsAllKeyFrame){
            requestKeyFrame();
        }
        ByteBuffer[] outputBuffers = null;
        int count = 0;
        int outputIndex = mMediaCodec.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
        try{
            outputBuffers = mMediaCodec.getOutputBuffers();
            do{
                if(outputIndex == MediaCodec.INFO_TRY_AGAIN_LATER){
                    Log.i(tag, "output from encoder not available");
                    if(!isEos){
                        count++;
                        if(count >= 10){
//                            break;
                        }
                    }
                }else if(outputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED){
                    outputBuffers = mMediaCodec.getOutputBuffers();
                    Log.i(tag, "encoder output buffers changed");
                }else if(outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED){
                    // the encoder's real output format is now known: add the video track
                    addTrack();
                    Log.i(tag, "encoder output format change");
                }else if(outputIndex < 0){
                    Log.e(tag, "output buffer wrong " + outputIndex);
                }else{
                    ByteBuffer outputBuffer = outputBuffers[outputIndex];
                    if(outputBuffer == null){
                        Log.e(tag, "output buffer null");
                        return;
                    }
                    if((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0){
                        mBufferInfo.size = 0;
                    }
                    Log.d(tag, "buffer size=" + mBufferInfo.size + " pts=" + mBufferInfo.presentationTimeUs);
                    if(mBufferInfo.size != 0){
                        if(!mMuxer.isVideoTrackAdd()){
                            addTrack();
                        }
                        if(!mMuxer.isStarted()){
                            mMuxer.start();
                        }
                        outputBuffer.position(mBufferInfo.offset);
                        outputBuffer.limit(mBufferInfo.offset + mBufferInfo.size);
                        // hand the encoded sample to the muxer
                        mMuxer.writeSampleData(mTrackIndex, outputBuffer, mBufferInfo);
                    }
                    mMediaCodec.releaseOutputBuffer(outputIndex, false);
                    if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        // EOS reached
                        mIsRecording = false;
                        stopMuxer();
                        release();
                        logD("eos coming");
                        break;      // out of while
                    }
                }
                outputIndex = mMediaCodec.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);

            }while (outputIndex >= 0);
        }catch (Exception e){
            Log.e(tag, "output error " + e);
        }
    }

    private void stopMuxer() {
        mMuxer.stop();
    }

    // add the video track to the muxer using the encoder's output format
    private void addTrack() {
        mMediaFormat = mMediaCodec.getOutputFormat();
        mTrackIndex = mMuxer.addTrack(mMediaFormat, true);
    }

    @Override
    public void release() {
        if(!mIsRecording){
            mMuxer.release();
            mMediaCodec.release();
        }
    }

    @Override
    public void run() {
        int count = 0;
        // polling loop: drain cached frames and feed them to the encoder
        while (true){
            count++;
            if(mFrameList.size() > 0){
                Frame frame = mFrameList.remove(0);
                input(frame);
                if(count % 30 == 0) {
//                    saveBitmap(frame, ImageFormat.NV21);
                }
            }
            try {
                Thread.sleep(mLoopInterval);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            if(!mIsRecording){
                 break;
            }
        }
    }

    public static void saveBitmap(Frame frame, int format) {
        YuvImage yuvimage = new YuvImage(frame.mData, format, Config.VIDEO_HEIGHT, Config.VIDEO_WIDTH, null);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        yuvimage.compressToJpeg(new Rect(0, 0, Config.VIDEO_HEIGHT, Config.VIDEO_WIDTH), 100, baos); // JPEG quality 0-100, 100 is best
        byte[] rawImage = baos.toByteArray();
        // decode the JPEG bytes back into a Bitmap
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inPreferredConfig = Bitmap.Config.ARGB_8888;
        Bitmap bitmap = BitmapFactory.decodeByteArray(rawImage, 0, rawImage.length, options);
        File dir = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator + "00recorder" + File.separator);
        File file = new File(dir, frame.mTime + ".jpg");
        BitmapUtil.saveBitmap(bitmap, file);
    }

    public void stop() {
        eosFrame();
        logD("stop encoder");
    }

    public void start() {
        mIsRecording = true;
        logD("start encoder");
        new Thread(this).start();
    }

    private void logD(String str){
        Log.d(TAG, str);
    }

    private void logE(String str){
        Log.e(TAG, str);
    }
}

Muxing

The Surface recording article already explained the muxer in detail, so I won't repeat all of it here, but one key point is worth restating: the timing of adding tracks. The video track is added to the muxer via addTrack(), and the format you configured the encoder with is not the format the encoder actually outputs; the encoder derives its real output format from your configuration, and it is that output format, once available, that must be added to the muxer's video track.
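
In other words, the canonical pattern looks roughly like this (a sketch using the field names of the VideoEncoder above and the MMuxer below; the real logic lives in VideoEncoder.output() and addTrack()):

// Sketch: add the track only when the encoder reports its real output format.
int outputIndex = mMediaCodec.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat outputFormat = mMediaCodec.getOutputFormat(); // NOT the format passed to configure()
    mTrackIndex = mMuxer.addTrack(outputFormat, true);        // register the video track
    mMuxer.start();                                           // start the muxer only after all tracks are added
} else if (outputIndex >= 0 && mBufferInfo.size > 0) {
    // encoded sample: write it only after the muxer has started
    mMuxer.writeSampleData(mTrackIndex, outputBuffers[outputIndex], mBufferInfo);
}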

package lda.com.myrecorder;

import android.media.MediaCodec;

import java.nio.ByteBuffer;

/**
 * Created by lda on 2017/10/11.
 */

public interface IMuxer {
    void stop();
    void release();
    void writeSampleData(int trackIndex, ByteBuffer byteBuf, MediaCodec.BufferInfo bufferInfo);
    void start();
}
package lda.com.myrecorder;

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.util.Log;

import java.io.IOException;
import java.nio.ByteBuffer;

/**
 * Created by lda on 2017/10/23.
 */

public class MMuxer implements IMuxer {

    String TAG = MMuxer.class.getSimpleName();
    private MediaMuxer mMuxer;
    private int mVideoIndex;
    private boolean mIsStarted;
    private boolean mIsVideoTrackAdd;
    private int mTrackCount = 0;

    public MMuxer(String path){
        try {
            mMuxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException e) {
            Log.i(TAG, "init mmuxer error " + e);
        }
    }

    // add a track for the given format (only the video track is handled in this demo)
    public int addTrack(MediaFormat format, boolean isVideo){
        Log.i(TAG, "add track=" + format + " isVideo=" + isVideo);
        if(isVideo){
            mVideoIndex = mMuxer.addTrack(format);
            mIsVideoTrackAdd = true;
            mTrackCount++;
            return mVideoIndex;
        }else{
            return 0;
        }
    }

    public boolean isVideoTrackAdd(){
        return mIsVideoTrackAdd;
    }

    @Override
    public void stop() {
        mIsStarted = false;
        mMuxer.stop();
        Log.d(TAG, "muxer stop---");
    }

    // resources must be released when recording finishes
    @Override
    public void release() {
        mMuxer.release();
        Log.d(TAG, "muxer release---");

    }

    // write one encoded sample into the file
    @Override
    public void writeSampleData(int trackIndex, ByteBuffer byteBuf, MediaCodec.BufferInfo bufferInfo) {
        try{
            if (mTrackCount > 0 && mIsStarted) {
                mMuxer.writeSampleData(trackIndex, byteBuf, bufferInfo);
                Log.d(TAG, "writeSampleData-" + trackIndex + " pts=" + bufferInfo.presentationTimeUs);
            }
        }catch (Exception e){
            Log.e(TAG, "writeSampleData Error=" + e);
        }
    }

    @Override
    public void start() {
        mIsStarted = true;
        mMuxer.start();
        Log.i(TAG, "start_muxer");
    }

    public boolean isStarted(){
        return mIsStarted;
    }
}

That covers both recording modes for video recording. If you have any questions, please leave a comment!
