Android Project Notes: Hardware Decoding (Decoding MP4 to Various YUV420 Formats with MediaCodec)

YUV420

  • yuv420p (planar): YV12 (YYYYYYYY VV UU), I420 (YYYYYYYY UU VV)
  • yuv420sp (semi-planar): NV12 (YYYYYYYY UV UV), NV21 (YYYYYYYY VU VU); a repacking sketch follows below
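
To make the difference between the planar and semi-planar layouts concrete, here is a minimal sketch (plain Java; the helper name is hypothetical) that repacks a tightly packed I420 frame into NV21 by interleaving the chroma planes:

    // Hypothetical helper: repack I420 (Y plane, U plane, V plane) into NV21 (Y plane, interleaved VU).
    // Assumes width and height are even and the input has no row padding.
    public static byte[] i420ToNv21(byte[] i420, int width, int height) {
        int ySize = width * height;
        int uvSize = ySize / 4;                              // each chroma plane is a quarter of Y
        byte[] nv21 = new byte[ySize + 2 * uvSize];
        System.arraycopy(i420, 0, nv21, 0, ySize);           // Y plane is identical
        for (int i = 0; i < uvSize; i++) {
            nv21[ySize + 2 * i]     = i420[ySize + uvSize + i]; // V comes first in NV21
            nv21[ySize + 2 * i + 1] = i420[ySize + i];          // then U
        }
        return nv21;
    }

Going the other way (NV21 back to I420) is the same loop with the source and destination indices swapped.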

As with encoding, there are three approaches:

(1) Synchronous decoding with ByteBuffers
(2) Asynchronous decoding with ByteBuffers
(3) Synchronous decoding with buffer arrays (deprecated; probably less efficient than the other two)

Synchronous ByteBuffer-based decoding

(1) The decoder outputs COLOR_FormatYUV420Flexible. Through the Image's Planes, plane [0] is the Y data with a pixel stride of 1, plane [1] is the U data with a pixel stride of 2, and plane [2] is the V data with a pixel stride of 2; with the corresponding array operations you can produce whichever YUV420 layout you need.
(2) On the input side, use MediaExtractor to parse the MP4 file and determine each sample's type (what was written at encoding time as codec-config, key-frame and P-frame data); see the short sketch after this list.
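
For reference, MediaExtractor exposes per-sample flags that make this check straightforward. A minimal sketch (not part of the decode loop shown later; extractor is assumed to be an already-configured MediaExtractor):

    // Inspect the current sample before queuing it to the decoder.
    int sampleFlags = extractor.getSampleFlags();
    boolean isSyncSample = (sampleFlags & MediaExtractor.SAMPLE_FLAG_SYNC) != 0; // key frame (I-frame)
    long ptsUs = extractor.getSampleTime();                                      // presentation time in microseconds
    Log.d(TAG, "sample pts=" + ptsUs + "us, keyFrame=" + isSyncSample);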

Main methods

Setting the parameters

    public void setDecoderParams(String yuvPath, String mp4Path, int fileType) throws IOException {
        if (fileType != FILE_TypeI420 && fileType != FILE_TypeNV21 && fileType != FILE_TypeJPEG) {
            throw new IllegalArgumentException("only support FILE_TypeI420 " + "and FILE_TypeNV21 " + "and FILE_TypeJPEG");
        }
        File mp4File = new File(mp4Path);
        File yuvFile = new File(yuvPath);
        if (!mp4File.exists()) {
            throw new RuntimeException("mp4 file do not exist");
        }
        if (mp4File.isDirectory()) {
            throw new IllegalArgumentException("mp4Path is not a mp4 file , it is a directory");
        }
        if (yuvFile.isDirectory()) {
            throw new IllegalArgumentException("yuvPath is not a yuv file , it is a directory");
        }
        INPUT_FILE_PATH = mp4Path;
        outputImageFileType = fileType;
        OUTPUT_FILE_PATH = yuvPath;
        if (!yuvFile.getParentFile().exists()) {
            yuvFile.getParentFile().mkdirs();
        }

        RandomAccessFile acf = new RandomAccessFile(yuvFile, "rw");
        fc_out = acf.getChannel();
        File videoFile = new File(INPUT_FILE_PATH);
        extractor = new MediaExtractor();
        extractor.setDataSource(videoFile.getPath());
        int trackIndex = selectTrack(extractor);
        if (trackIndex < 0) {
            throw new RuntimeException("No video track found in " + INPUT_FILE_PATH);
        }
        extractor.selectTrack(trackIndex);
        mediaFormat = extractor.getTrackFormat(trackIndex);
        String mime = mediaFormat.getString(MediaFormat.KEY_MIME);
        decoder = MediaCodec.createDecoderByType(mime);
        showSupportedColorFormat(decoder.getCodecInfo().getCapabilitiesForType(mime));
        if (isColorFormatSupported(decodeColorFormat, decoder.getCodecInfo().getCapabilitiesForType(mime))) {
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, decodeColorFormat);
            Log.i(TAG, "set decode color format to type " + decodeColorFormat);
        } else {
            Log.i(TAG, "unable to set decode color format, color format type " + decodeColorFormat + " not supported");
        }

    }

This sets the input and output paths, initializes a MediaExtractor to parse the MP4, and creates the matching MediaCodec decoder based on the track's MIME type.
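
The snippet above also calls a few helpers that are not shown (selectTrack, isColorFormatSupported, showSupportedColorFormat). A minimal sketch of what they might look like, using only the standard MediaExtractor and MediaCodecInfo APIs:

    // Returns the index of the first video track, or -1 if none is found.
    private static int selectTrack(MediaExtractor extractor) {
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime != null && mime.startsWith("video/")) {
                return i;
            }
        }
        return -1;
    }

    // Checks whether the decoder advertises the requested color format.
    private boolean isColorFormatSupported(int colorFormat, MediaCodecInfo.CodecCapabilities caps) {
        for (int c : caps.colorFormats) {
            if (c == colorFormat) {
                return true;
            }
        }
        return false;
    }

    // Logs every color format the decoder supports.
    private void showSupportedColorFormat(MediaCodecInfo.CodecCapabilities caps) {
        StringBuilder sb = new StringBuilder("supported color formats: ");
        for (int c : caps.colorFormats) {
            sb.append(c).append('\t');
        }
        Log.d(TAG, sb.toString());
    }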

Decoding

    public void videoDecode() throws IOException {
        try {
            decodeFramesToYUV(decoder, extractor, mediaFormat);
        } finally {
            if (decoder != null) {
                decoder.stop();
                decoder.release();
                decoder = null;
            }
            if (extractor != null) {
                extractor.release();
                extractor = null;
            }
        }
    }
    private void decodeFramesToYUV(MediaCodec decoder, MediaExtractor extractor, MediaFormat mediaFormat) throws IOException {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean sawInputEOS = false;
        boolean sawOutputEOS = false;
        decoder.configure(mediaFormat, null, null, 0);
        decoder.start();
        final int width = mediaFormat.getInteger(MediaFormat.KEY_WIDTH);
        final int height = mediaFormat.getInteger(MediaFormat.KEY_HEIGHT);
        int outputFrameCount = 0;
        while (!sawOutputEOS) {
            if (!sawInputEOS) {
                int inputBufferId = decoder.dequeueInputBuffer(DEFAULT_TIMEOUT_US);
                if (inputBufferId >= 0) {
                    ByteBuffer inputBuffer = decoder.getInputBuffer(inputBufferId);
                    int sampleSize = extractor.readSampleData(inputBuffer, 0);
                    if (sampleSize < 0) {
                        decoder.queueInputBuffer(inputBufferId, 0, 0, 0L, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        sawInputEOS = true;
                    } else {
                        long presentationTimeUs = extractor.getSampleTime();
                        decoder.queueInputBuffer(inputBufferId, 0, sampleSize, presentationTimeUs, 0);
                        extractor.advance();
                    }
                }
            }
            int outputBufferId = decoder.dequeueOutputBuffer(info, DEFAULT_TIMEOUT_US);
            if (outputBufferId >= 0) {
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    sawOutputEOS = true;
                    try {
                        fc_out.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
                boolean doRender = (info.size != 0);
                if (doRender) {
                    Image image = decoder.getOutputImage(outputBufferId);
//                    System.out.println("image format: " + image.getFormat());
                    if (outputImageFileType != -1) {
                        switch (outputImageFileType) {
                            case FILE_TypeI420:
                                byte[] data = getDataFromImage(image, COLOR_FormatI420);
                                data = rotationYuvByOpenCV(data, width, height, 1);
                                MappedByteBuffer outMappedBuffer = fc_out.map(FileChannel.MapMode.READ_WRITE, (long) outputFrameCount * data.length, (long) data.length);
                                outMappedBuffer.put(data);
                                final int finalOutputFrameCount = outputFrameCount;
                                executorPools.submit(new Runnable() {
                                    @Override
                                    public void run() {
                                        decodeProgressListener.publishProgress(finalOutputFrameCount);
                                    }
                                });
                                outputFrameCount++;
//                                dumpFile(OUTPUT_FILE_PATH, data);
                                break;
                            case FILE_TypeNV21:
//                                dumpFile(OUTPUT_FILE_PATH, getDataFromImage(image, COLOR_FormatNV21));
                                break;
                            case FILE_TypeJPEG:
                                compressToJpeg(OUTPUT_FILE_PATH, image);
                                break;
                        }
//                        Log.d(TAG, "finished frame " + outputFrameCount);
                    }
                    image.close();
                }
                // Release the output buffer even when it carries no data (e.g. the EOS-only buffer).
                decoder.releaseOutputBuffer(outputBufferId, false);
            }
        }
    }

Nothing much to add here; it is the mirror image of encoding: feed compressed samples in, read decoded frames out.
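
For context, a typical call sequence might look like this (a sketch only; the class name is assumed):

    // Hypothetical usage; VideoDecoder stands for the class holding the methods above,
    // and FILE_TypeI420 is assumed to be an accessible constant of that class.
    final VideoDecoder videoDecoder = new VideoDecoder();
    new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                // Decode /sdcard/input.mp4 into a raw I420 file.
                videoDecoder.setDecoderParams("/sdcard/out/frames.yuv",
                        "/sdcard/input.mp4", VideoDecoder.FILE_TypeI420);
                videoDecoder.videoDecode();   // blocks until the output EOS flag is seen
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }).start();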

Reading YUV data from the Image

    private byte[] getDataFromImage(Image image, int colorFormat) {
        if (colorFormat != COLOR_FormatI420 && colorFormat != COLOR_FormatNV21) {
            throw new IllegalArgumentException("only support COLOR_FormatI420 " + "and COLOR_FormatNV21");
        }
        if (!isImageFormatSupported(image)) {
            throw new RuntimeException("can't convert Image to byte array, format " + image.getFormat());
        }
        Rect crop = image.getCropRect();
        int format = image.getFormat();
        int width = crop.width();
        int height = crop.height();
        Image.Plane[] planes = image.getPlanes();
        byte[] data = new byte[width * height * ImageFormat.getBitsPerPixel(format) / 8];
        byte[] rowData = new byte[planes[0].getRowStride()];
        if (VERBOSE) Log.v(TAG, "get data from " + planes.length + " planes");
        int channelOffset = 0;
        int outputStride = 1;
        for (int i = 0; i < planes.length; i++) {
            switch (i) {
                case 0:
                    channelOffset = 0;
                    outputStride = 1;
                    break;
                case 1:
                    if (colorFormat == COLOR_FormatI420) {
                        channelOffset = width * height;
                        outputStride = 1;
                    } else if (colorFormat == COLOR_FormatNV21) {
                        channelOffset = width * height + 1;
                        outputStride = 2;
                    }
                    break;
                case 2:
                    if (colorFormat == COLOR_FormatI420) {
                        channelOffset = (int) (width * height * 1.25);
                        outputStride = 1;
                    } else if (colorFormat == COLOR_FormatNV21) {
                        channelOffset = width * height;
                        outputStride = 2;
                    }
                    break;
            }
            ByteBuffer buffer = planes[i].getBuffer();
            int rowStride = planes[i].getRowStride();
            int pixelStride = planes[i].getPixelStride();
            if (VERBOSE) {
                Log.v(TAG, "pixelStride " + pixelStride);
                Log.v(TAG, "rowStride " + rowStride);
                Log.v(TAG, "width " + width);
                Log.v(TAG, "height " + height);
                Log.v(TAG, "buffer size " + buffer.remaining());
            }
            int shift = (i == 0) ? 0 : 1;
            int w = width >> shift;
            int h = height >> shift;
            buffer.position(rowStride * (crop.top >> shift) + pixelStride * (crop.left >> shift));
            for (int row = 0; row < h; row++) {
                int length;
                if (pixelStride == 1 && outputStride == 1) {
                    length = w;
                    buffer.get(data, channelOffset, length);
                    channelOffset += length;
                } else {
                    length = (w - 1) * pixelStride + 1;
                    buffer.get(rowData, 0, length);
                    for (int col = 0; col < w; col++) {
                        data[channelOffset] = rowData[col * pixelStride];
                        channelOffset += outputStride;
                    }
                }
                if (row < h - 1) {
                    buffer.position(buffer.position() + rowStride - length);
                }
            }
            if (VERBOSE) Log.v(TAG, "Finished reading data from plane " + i);
        }
        return data;
    }

This works because of how the Image exposes its planes:
plane[0] is the Y data, pixel stride 1;
plane[1] is the U data, pixel stride 2 (on this device);
plane[2] is the V data, pixel stride 2 (on this device).
The strides are device dependent, which is why the code reads each plane's getRowStride() and getPixelStride() instead of hard-coding them.
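
getDataFromImage also relies on an isImageFormatSupported helper that is not listed above; a minimal sketch, assuming it simply whitelists the formats the copy loop can handle:

    // Accept only the image formats the plane-copy loop above knows how to read.
    private static boolean isImageFormatSupported(Image image) {
        switch (image.getFormat()) {
            case ImageFormat.YUV_420_888:
            case ImageFormat.NV21:
            case ImageFormat.YV12:
                return true;
            default:
                return false;
        }
    }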

Rotating the YUV (I420) data

Because the MP4 file carries a rotation attribute, the decoded frames do not come out upright; they need an extra 90-degree rotation. With OpenCV, a transpose followed by a flip (mirror) completes the rotation.

    private byte[] rotationYuvByOpenCV(byte[] oldData, int width, int height, int rotation) {
        byte[] newData = new byte[(int) (width * height * 1.5)];
        Mat mat = new Mat((int) (height * 1.5), width, CvType.CV_8UC1);
        mat.put(0, 0, oldData);
        Imgproc.cvtColor(mat, mat, Imgproc.COLOR_YUV2BGR_I420);
        Core.transpose(mat, mat);
        // Flip mode: flipCode == 0 flips vertically (around the X axis), flipCode > 0 flips horizontally (around the Y axis), flipCode < 0 flips both axes (equivalent to rotating 180 degrees).
        Core.flip(mat, mat, 1);
        if (isCoverFace) {
            Mat black = Mat.zeros(coverFaceHeight, mat.cols(), mat.type());
            Mat roi = new Mat(mat, new org.opencv.core.Rect(0, coverFaceY, black.width(), coverFaceHeight));
            black.copyTo(roi);
        }
        Imgproc.cvtColor(mat, mat, Imgproc.COLOR_BGR2YUV_I420);
        mat.get(0, 0, newData);
        return newData;
    }

With flipCode = 1, transpose plus flip gives a 90-degree clockwise rotation (flipCode = 0 would give counterclockwise); other angles follow the same transpose-and-flip pattern.
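
If you only need the rotation (no face masking), the two color-space conversions can be skipped by rotating the I420 planes directly. A sketch under the assumption of even dimensions and tightly packed input (the helper names are hypothetical):

    // Rotate a tightly packed I420 frame 90 degrees clockwise, plane by plane.
    private static byte[] rotateI420Clockwise90(byte[] src, int width, int height) {
        byte[] dst = new byte[width * height * 3 / 2];
        int ySize = width * height;
        int cSize = ySize / 4;
        rotatePlaneClockwise90(src, 0, dst, 0, width, height);                       // Y plane
        rotatePlaneClockwise90(src, ySize, dst, ySize, width / 2, height / 2);       // U plane
        rotatePlaneClockwise90(src, ySize + cSize, dst, ySize + cSize, width / 2, height / 2); // V plane
        return dst;
    }

    private static void rotatePlaneClockwise90(byte[] src, int srcOff, byte[] dst, int dstOff,
                                               int w, int h) {
        // Source pixel (x, y) maps to destination (h - 1 - y, x) in a plane that is now h pixels wide.
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                dst[dstOff + x * h + (h - 1 - y)] = src[srcOff + y * w + x];
            }
        }
    }

Note that the rotated frame swaps width and height, so any downstream consumer must be told the new dimensions.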

Summary

I previously struggled with how to identify the frames inside an MP4 and how to feed them to the decoder. After reading about the MP4 container format and a few other blogs, I found that Android ships a dedicated parser, MediaExtractor, which reads data track by track and sample by sample; those samples can be fed straight into the decoder.
The code above is the synchronous decoder. For the asynchronous version, only decodeFramesToYUV needs to change: move the same feed and drain logic into the MediaCodec callback.
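
A minimal sketch of that callback wiring, with the file-writing details elided (only the structure differs from the synchronous loop):

    // Sketch of the asynchronous variant: the same feed/drain logic moves into MediaCodec.Callback.
    decoder.setCallback(new MediaCodec.Callback() {
        @Override
        public void onInputBufferAvailable(MediaCodec codec, int inputBufferId) {
            ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferId);
            int sampleSize = extractor.readSampleData(inputBuffer, 0);
            if (sampleSize < 0) {
                codec.queueInputBuffer(inputBufferId, 0, 0, 0L, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
            } else {
                codec.queueInputBuffer(inputBufferId, 0, sampleSize, extractor.getSampleTime(), 0);
                extractor.advance();
            }
        }

        @Override
        public void onOutputBufferAvailable(MediaCodec codec, int outputBufferId, MediaCodec.BufferInfo info) {
            if (info.size > 0) {
                Image image = codec.getOutputImage(outputBufferId);
                // ... same getDataFromImage / file-writing code as the synchronous version ...
                image.close();
            }
            codec.releaseOutputBuffer(outputBufferId, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                // signal completion, close the output channel, etc.
            }
        }

        @Override
        public void onError(MediaCodec codec, MediaCodec.CodecException e) {
            Log.e(TAG, "decoder error", e);
        }

        @Override
        public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
            Log.i(TAG, "output format changed: " + format);
        }
    });
    // setCallback must be called before configure(); then configure() and start() as usual.
    decoder.configure(mediaFormat, null, null, 0);
    decoder.start();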

Source code: https://github.com/shen511460468/MediaCodecDemo
