Saving a Bitmap with CameraX (Android Jetpack)

First, a look at the official documentation.

Image capture

The image capture use case is designed for taking high-resolution, high-quality photos; it provides auto white balance, auto exposure, and auto focus (3A) in addition to simple manual camera controls. The caller is responsible for deciding how to use the captured photo, which includes the following options:

There are two types of customizable executors on which ImageCapture runs: a callback executor and an IO executor.

  • The callback executor is a parameter of the takePicture methods. It is used to run the user-provided OnImageCapturedCallback().
  • If the caller chooses to save the image to a file location, an executor can be specified to perform the IO. To set the IO executor, call ImageCapture.Builder.setIoExecutor(Executor). If no executor is provided, CameraX defaults to an internal IO executor for the task.
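The second option can be sketched like this (a minimal, hedged example; the file name, log tag, and executor choice are mine, not from the official docs):

```java
// Sketch: ImageCapture with a custom IO executor for saving to a file.
// Assumes this runs inside an Activity and that imageCapture gets bound to a lifecycle first.
Executor ioExecutor = Executors.newSingleThreadExecutor();
ImageCapture imageCapture = new ImageCapture.Builder()
        .setIoExecutor(ioExecutor) // used for the file IO; omit to use CameraX's internal IO executor
        .build();
File photoFile = new File(getExternalFilesDir(null), "photo.jpg");
ImageCapture.OutputFileOptions options =
        new ImageCapture.OutputFileOptions.Builder(photoFile).build();
imageCapture.takePicture(options, ioExecutor, new ImageCapture.OnImageSavedCallback() {
    @Override
    public void onImageSaved(@NonNull ImageCapture.OutputFileResults results) {
        Log.i("CameraX", "Saved: " + results.getSavedUri());
    }

    @Override
    public void onError(@NonNull ImageCaptureException exception) {
        Log.e("CameraX", "Capture failed", exception);
    }
});
```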

The official sample code for takePicture(OutputFileOptions, Executor, OnImageSavedCallback) also doesn't explain how to get at the image data.

So I tried the first overload, takePicture(Executor, OnImageCapturedCallback), and ran into a problem.

image.getFormat() returns 256 (ImageFormat.JPEG), not the YUV format I was expecting. (The official explanation: the image capture method fully supports the JPEG format. For sample code showing how to convert a Media.Image object from YUV_420_888 format to an RGB Bitmap object, see YuvToRgbConverter.kt.)

So this method doesn't hand you a YUV frame to convert, and at the time I couldn't find documentation on decoding the JPEG into a Bitmap. My bad.
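That said, a JPEG ImageProxy can be decoded into a Bitmap fairly directly: for JPEG, the compressed bytes sit in plane 0, and BitmapFactory can decode them. A minimal sketch (the helper name is mine):

```java
// Sketch: decode a JPEG ImageProxy (as delivered to OnImageCapturedCallback) into a Bitmap.
// Assumes image.getFormat() == ImageFormat.JPEG; remember to close() the ImageProxy afterwards.
private Bitmap jpegImageProxyToBitmap(ImageProxy image) {
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    return BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
}
```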

So I turned to ImageAnalysis.

mImageAnalysis.setAnalyzer(CameraXExecutors.mainThreadExecutor(), image -> {
                if (image.getFormat() == YUV_420_888) {
                    // Create a new blank bitmap
                    Bitmap bgBitmap = Bitmap.createBitmap(image.getWidth(), image.getHeight(), Bitmap.Config.ARGB_8888);
                    YuvToRgbConverter yuvToRgbConverter = new YuvToRgbConverter(mContext);
                    yuvToRgbConverter.yuvToRgb(image.getImage(), bgBitmap);
                    if (bgBitmap != null) {
                        String path = SDP + "photo/" + TimeExtUtils.getCurr7() + ".jpg";
                        boolean fileByDeleteOldFile = FileUtils.createFileByDeleteOldFile(path);
                        if (fileByDeleteOldFile) {
                            ImageUtils.save(bgBitmap, path, Bitmap.CompressFormat.JPEG);
                            LogUtils.i("Saved to: " + path);
                            ThreadUtils.runOnUiThread(() -> {
                                T.info("Saved to: " + path);
                            });
                        }
                    }
                }
                // analyzeQRCode() is expected to close the ImageProxy when done
                analyzeQRCode(image);
            });

I logged the image format here, and sure enough it was YUV_420_888! The bitmap saved to disk without a hitch.
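One caveat: an analyzer must close every ImageProxy it receives, or with STRATEGY_KEEP_ONLY_LATEST the pipeline stops delivering frames (above I rely on analyzeQRCode(image) doing the close). A self-contained version of the analyzer might look like this (a sketch, using the public main-executor helper instead of the restricted CameraXExecutors):

```java
// Sketch: minimal YUV analyzer that always closes the frame.
mImageAnalysis.setAnalyzer(ContextCompat.getMainExecutor(this), image -> {
    try {
        if (image.getFormat() == ImageFormat.YUV_420_888 && image.getImage() != null) {
            // Create a blank ARGB bitmap and fill it from the YUV frame
            Bitmap bitmap = Bitmap.createBitmap(image.getWidth(), image.getHeight(),
                    Bitmap.Config.ARGB_8888);
            new YuvToRgbConverter(this).yuvToRgb(image.getImage(), bitmap);
            // ... save or display the bitmap ...
        }
    } finally {
        image.close(); // required so the next frame can be delivered
    }
});
```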

YuvToRgbConverter.kt source:

/*
 * Copyright 2020 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     https://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.example.android.camera.utils

import android.content.Context
import android.graphics.Bitmap
import android.graphics.ImageFormat
import android.media.Image
import android.renderscript.Allocation
import android.renderscript.Element
import android.renderscript.RenderScript
import android.renderscript.ScriptIntrinsicYuvToRGB
import android.renderscript.Type
import java.nio.ByteBuffer

/**
 * Helper class used to convert a [Image] object from
 * [ImageFormat.YUV_420_888] format to an RGB [Bitmap] object, it has equivalent
 * functionality to
 * https://github.com/androidx/androidx/blob/androidx-main/camera/camera-core/src/main/java/androidx/camera/core/ImageYuvToRgbConverter.java
 *
 * NOTE: This has been tested in a limited number of devices and is not
 * considered production-ready code. It was created for illustration purposes,
 * since this is not an efficient camera pipeline due to the multiple copies
 * required to convert each frame. For example, this
 * implementation
 * (https://stackoverflow.com/questions/52726002/camera2-captured-picture-conversion-from-yuv-420-888-to-nv21/52740776#52740776)
 * might have better performance.
 */
class YuvToRgbConverter(context: Context) {
    private val rs = RenderScript.create(context)
    private val scriptYuvToRgb =
        ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs))

    // Do not add getter/setter functions to these private variables
    // because yuvToRgb() assumes they won't be modified elsewhere
    private var yuvBits: ByteBuffer? = null
    private var bytes: ByteArray = ByteArray(0)
    private var inputAllocation: Allocation? = null
    private var outputAllocation: Allocation? = null

    @Synchronized
    fun yuvToRgb(image: Image, output: Bitmap) {
        val yuvBuffer = YuvByteBuffer(image, yuvBits)
        yuvBits = yuvBuffer.buffer

        if (needCreateAllocations(image, yuvBuffer)) {
            val yuvType = Type.Builder(rs, Element.U8(rs))
                .setX(image.width)
                .setY(image.height)
                .setYuvFormat(yuvBuffer.type)
            inputAllocation = Allocation.createTyped(
                rs,
                yuvType.create(),
                Allocation.USAGE_SCRIPT
            )
            bytes = ByteArray(yuvBuffer.buffer.capacity())
            val rgbaType = Type.Builder(rs, Element.RGBA_8888(rs))
                .setX(image.width)
                .setY(image.height)
            outputAllocation = Allocation.createTyped(
                rs,
                rgbaType.create(),
                Allocation.USAGE_SCRIPT
            )
        }

        yuvBuffer.buffer.get(bytes)

        // Convert NV21 or YUV_420_888 format to RGB
        inputAllocation!!.copyFrom(bytes)
        scriptYuvToRgb.setInput(inputAllocation)
        scriptYuvToRgb.forEach(outputAllocation)
        outputAllocation!!.copyTo(output)
    }

    private fun needCreateAllocations(image: Image, yuvBuffer: YuvByteBuffer): Boolean {
        return (inputAllocation == null ||               // the very 1st call
            inputAllocation!!.type.x != image.width ||   // image size changed
            inputAllocation!!.type.y != image.height ||
            inputAllocation!!.type.yuv != yuvBuffer.type || // image format changed
            bytes.size != yuvBuffer.buffer.capacity())   // buffer size changed
    }
}

Yuv.kt source:

package com.example.android.camera.utils

import android.graphics.ImageFormat
import android.media.Image
import androidx.annotation.IntDef
import java.nio.ByteBuffer

/*
This file is converted from part of https://github.com/gordinmitya/yuv2buf.
Follow the link to find demo app, performance benchmarks and unit tests.

Intro to YUV image formats:
YUV_420_888 - is a generic format that can be represented as I420, YV12, NV21, and NV12.
420 means that for each 4 luminosity pixels we have 2 chroma pixels: U and V.

* I420 format represents an image as Y plane followed by U then followed by V plane
    without chroma channels interleaving.
    For example:
    Y Y Y Y
    Y Y Y Y
    U U V V

* NV21 format represents an image as Y plane followed by V and U interleaved. First V then U.
    For example:
    Y Y Y Y
    Y Y Y Y
    V U V U

* YV12 and NV12 are the same as previous formats but with swapped order of V and U. (U then V)

Visualization of these 4 formats:
https://user-images.githubusercontent.com/9286092/89119601-4f6f8100-d4b8-11ea-9a51-2765f7e513c2.jpg

It's guaranteed that image.getPlanes() always returns planes in order Y U V for YUV_420_888.
https://developer.android.com/reference/android/graphics/ImageFormat#YUV_420_888

Because I420 and NV21 are more widely supported (RenderScript, OpenCV, MNN)
the conversion is done into these formats.

More about each format: https://www.fourcc.org/yuv.php
*/

@kotlin.annotation.Retention(AnnotationRetention.SOURCE)
@IntDef(ImageFormat.NV21, ImageFormat.YUV_420_888)
annotation class YuvType

class YuvByteBuffer(image: Image, dstBuffer: ByteBuffer? = null) {
    @YuvType
    val type: Int
    val buffer: ByteBuffer

    init {
        val wrappedImage = ImageWrapper(image)

        type = if (wrappedImage.u.pixelStride == 1) {
            ImageFormat.YUV_420_888
        } else {
            ImageFormat.NV21
        }
        val size = image.width * image.height * 3 / 2
        buffer = if (
            dstBuffer == null || dstBuffer.capacity() < size ||
            dstBuffer.isReadOnly || !dstBuffer.isDirect
        ) {
            ByteBuffer.allocateDirect(size)
        } else {
            dstBuffer
        }
        buffer.rewind()

        removePadding(wrappedImage)
    }

    // Input buffers are always direct as described in
    // https://developer.android.com/reference/android/media/Image.Plane#getBuffer()
    private fun removePadding(image: ImageWrapper) {
        val sizeLuma = image.y.width * image.y.height
        val sizeChroma = image.u.width * image.u.height
        if (image.y.rowStride > image.y.width) {
            removePaddingCompact(image.y, buffer, 0)
        } else {
            buffer.position(0)
            buffer.put(image.y.buffer)
        }
        if (type == ImageFormat.YUV_420_888) {
            if (image.u.rowStride > image.u.width) {
                removePaddingCompact(image.u, buffer, sizeLuma)
                removePaddingCompact(image.v, buffer, sizeLuma + sizeChroma)
            } else {
                buffer.position(sizeLuma)
                buffer.put(image.u.buffer)
                buffer.position(sizeLuma + sizeChroma)
                buffer.put(image.v.buffer)
            }
        } else {
            if (image.u.rowStride > image.u.width * 2) {
                removePaddingNotCompact(image, buffer, sizeLuma)
            } else {
                buffer.position(sizeLuma)
                var uv = image.v.buffer
                val properUVSize = image.v.height * image.v.rowStride - 1
                if (uv.capacity() > properUVSize) {
                    uv = clipBuffer(image.v.buffer, 0, properUVSize)
                }
                buffer.put(uv)
                val lastOne = image.u.buffer[image.u.buffer.capacity() - 1]
                buffer.put(buffer.capacity() - 1, lastOne)
            }
        }
        buffer.rewind()
    }

    private fun removePaddingCompact(
        plane: PlaneWrapper,
        dst: ByteBuffer,
        offset: Int
    ) {
        require(plane.pixelStride == 1) {
            "use removePaddingCompact with pixelStride == 1"
        }

        val src = plane.buffer
        val rowStride = plane.rowStride
        var row: ByteBuffer
        dst.position(offset)
        for (i in 0 until plane.height) {
            row = clipBuffer(src, i * rowStride, plane.width)
            dst.put(row)
        }
    }

    private fun removePaddingNotCompact(
        image: ImageWrapper,
        dst: ByteBuffer,
        offset: Int
    ) {
        require(image.u.pixelStride == 2) {
            "use removePaddingNotCompact pixelStride == 2"
        }
        val width = image.u.width
        val height = image.u.height
        val rowStride = image.u.rowStride
        var row: ByteBuffer
        dst.position(offset)
        for (i in 0 until height - 1) {
            row = clipBuffer(image.v.buffer, i * rowStride, width * 2)
            dst.put(row)
        }
        row = clipBuffer(image.u.buffer, (height - 1) * rowStride - 1, width * 2)
        dst.put(row)
    }

    private fun clipBuffer(buffer: ByteBuffer, start: Int, size: Int): ByteBuffer {
        val duplicate = buffer.duplicate()
        duplicate.position(start)
        duplicate.limit(start + size)
        return duplicate.slice()
    }

    private class ImageWrapper(image: Image) {
        val width = image.width
        val height = image.height
        val y = PlaneWrapper(width, height, image.planes[0])
        val u = PlaneWrapper(width / 2, height / 2, image.planes[1])
        val v = PlaneWrapper(width / 2, height / 2, image.planes[2])

        // Check this is a supported image format
        // https://developer.android.com/reference/android/graphics/ImageFormat#YUV_420_888
        init {
            require(y.pixelStride == 1) {
                "Pixel stride for Y plane must be 1 but got ${y.pixelStride} instead."
            }
            require(u.pixelStride == v.pixelStride && u.rowStride == v.rowStride) {
                "U and V planes must have the same pixel and row strides " +
                "but got pixel=${u.pixelStride} row=${u.rowStride} for U " +
                "and pixel=${v.pixelStride} and row=${v.rowStride} for V"
            }
            require(u.pixelStride == 1 || u.pixelStride == 2) {
                "Supported pixel strides for U and V planes are 1 and 2"
            }
        }
    }

    private class PlaneWrapper(width: Int, height: Int, plane: Image.Plane) {
        val width = width
        val height = height
        val buffer: ByteBuffer = plane.buffer
        val rowStride = plane.rowStride
        val pixelStride = plane.pixelStride
    }
}

Further optimization can wait!

A few problems I ran into

1. The docs say setOutputImageFormat(int) can set the output image format, but I can't find that API anywhere. Was it removed?

2. Why is the captured image's format different from the image-analysis format?

3. Why doesn't the preview match the generated image?

Things left to explore...
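A partial answer to questions 1 and 2, as far as I can tell: ImageCapture is documented to deliver JPEG while ImageAnalysis delivers YUV_420_888, which is why the two formats differ; and ImageAnalysis.Builder.setOutputImageFormat(int) was only added in a later CameraX release (around 1.1.0-alpha), so on older artifacts it is simply not there yet rather than removed. On a version that has it, RGBA output makes the manual YUV conversion unnecessary (a sketch; version availability is an assumption):

```java
// Sketch: request RGBA_8888 analysis frames directly (needs a CameraX version with this API).
ImageAnalysis analysis = new ImageAnalysis.Builder()
        .setOutputImageFormat(ImageAnalysis.OUTPUT_IMAGE_FORMAT_RGBA_8888)
        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
        .build();
analysis.setAnalyzer(ContextCompat.getMainExecutor(this), image -> {
    // Plane 0 holds packed RGBA pixels (row padding ignored here for brevity)
    Bitmap bitmap = Bitmap.createBitmap(image.getWidth(), image.getHeight(),
            Bitmap.Config.ARGB_8888);
    bitmap.copyPixelsFromBuffer(image.getPlanes()[0].getBuffer());
    image.close();
});
```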

Full source:

package com.jszy.baselib.camera;

import static android.graphics.ImageFormat.YUV_420_888;

import android.Manifest;
import android.annotation.SuppressLint;
import android.content.ContentValues;
import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.os.Bundle;
import android.provider.MediaStore;
import android.util.Log;
import android.util.Size;
import android.view.OrientationEventListener;
import android.view.SoundEffectConstants;
import android.view.Surface;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ImageView;
import android.widget.PopupWindow;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.camera.core.AspectRatio;
import androidx.camera.core.Camera;
import androidx.camera.core.CameraControl;
import androidx.camera.core.CameraSelector;
import androidx.camera.core.FocusMeteringAction;
import androidx.camera.core.FocusMeteringResult;
import androidx.camera.core.ImageAnalysis;
import androidx.camera.core.ImageCapture;
import androidx.camera.core.ImageCaptureException;
import androidx.camera.core.ImageProxy;
import androidx.camera.core.ImageReaderProxyProvider;
import androidx.camera.core.MeteringPoint;
import androidx.camera.core.MeteringPointFactory;
import androidx.camera.core.Preview;
import androidx.camera.core.SurfaceOrientedMeteringPointFactory;
import androidx.camera.core.VideoCapture;
import androidx.camera.core.impl.ImageReaderProxy;
import androidx.camera.core.impl.utils.executor.CameraXExecutors;
import androidx.camera.extensions.BeautyPreviewExtender;
import androidx.camera.extensions.NightImageCaptureExtender;
import androidx.camera.lifecycle.ProcessCameraProvider;
import androidx.camera.view.PreviewView;
import androidx.core.content.ContextCompat;

import com.blankj.utilcode.util.FileUtils;
import com.blankj.utilcode.util.ImageUtils;
import com.blankj.utilcode.util.LogUtils;
import com.blankj.utilcode.util.PermissionUtils;
import com.blankj.utilcode.util.ThreadUtils;
import com.blankj.utilcode.util.Utils;
import com.example.android.camera.utils.YuvToRgbConverter;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.zxing.BinaryBitmap;
import com.google.zxing.MultiFormatReader;
import com.google.zxing.PlanarYUVLuminanceSource;
import com.google.zxing.Result;
import com.google.zxing.common.HybridBinarizer;
import com.jszy.baselib.R;
import com.jszy.baselib.base.AppBindingActivity;
import com.jszy.baselib.databinding.ActivityCameraxBinding;
import com.jszy.comm.utils.T;
import com.jszy.comm.utils.TimeExtUtils;

import java.io.File;
import java.nio.ByteBuffer;
import java.util.List;
import java.util.concurrent.TimeUnit;


/**
 * CameraX sample activity
 */
public class CameraX1Activity extends AppBindingActivity<ActivityCameraxBinding> {
    public String SDP = Utils.getApp().getExternalFilesDir("") + File.separator;
    private static final int REQUEST_CAMERA = 20,
            REQUEST_STORAGE = 30,
            REQUEST_STORAGE_BINDING = 35,
            REQUEST_STORAGE_VIDEO = 40,
            REQUEST_STORAGE_VIDEO_BINDING = 45;
    private static final String CAPTURED_FILE_NAME = "captured_picture",
            CAPTURED_FILE_NAME_END = "image/jpeg";
    private static final String RECORDED_FILE_NAME = "recorded_video",
            RECORDED_FILE_NAME_END = "video/mp4";
    private String[] pers =
            {
                    Manifest.permission.CAMERA,
                    Manifest.permission.RECORD_AUDIO
            };
    private Camera mCameraX;
    private ProcessCameraProvider mCameraProvider;
    private ImageAnalysis mImageAnalysis;
    private VideoCapture mVideoCapture;
    private Preview mPreview;
    private ImageCapture mImageCapture;

    private boolean isBack = true; // using the back-facing camera?
    private boolean isVideoMode = false; // in video mode?
    private boolean isRecording = false; // currently recording?
    private boolean isCameraXHandling = false;
    private boolean isAnalyzing = false; // scanning QR codes?

    @Override
    protected void onResume() {
        super.onResume();
        if (!PermissionUtils.isGranted(pers)) {
            PermissionUtils.permission(pers)
                    .callback(new PermissionUtils.FullCallback() {
                        @Override
                        public void onGranted(@NonNull List<String> granted) {
                            setupCamera();
                        }

                        @Override
                        public void onDenied(@NonNull List<String> deniedForever, @NonNull List<String> denied) {
                            if (deniedForever.size() > 0 || denied.size() > 0) {
                                finish();
                                PermissionUtils.launchAppDetailsSettings();
                            }
                        }
                    })
                    .request();
            return;
        } else {
            setupCamera();
        }
    }

    @Override
    public void initView(Bundle savedInstanceState) {

    }

    @Override
    public void initData() {

    }

    @SuppressLint("RestrictedApi")
    @Override
    public void setListener() {
//        getBinding().previewView.setOnTouchListener((view, motionEvent) -> {
//            MeteringPoint point = getBinding().previewView
//                    .getMeteringPointFactory()
//                    .createPoint(motionEvent.getX(), motionEvent.getY());
//            FocusMeteringAction action = new FocusMeteringAction.Builder(point).build();
//            try {
//                // Show the focus indicator
//                showFocusView((int) motionEvent.getX(), (int) motionEvent.getY());
//                mCameraX.getCameraControl().startFocusAndMetering(action);
//            } catch (Exception e) {
//                e.printStackTrace();
//            }
//            return false;
//        });

        getBinding().previewView.setOnTouchListener((view, motionEvent) -> {
            // Show the focus indicator
            showFocusView((int) motionEvent.getX(), (int) motionEvent.getY());
            //
            MeteringPointFactory factory = new SurfaceOrientedMeteringPointFactory(getBinding().previewView.getWidth()
                    , getBinding().previewView.getHeight());
            MeteringPoint point = factory.createPoint(motionEvent.getX(), motionEvent.getY());
            FocusMeteringAction action = new FocusMeteringAction.Builder(point, FocusMeteringAction.FLAG_AF)
//                .addPoint(point2, FocusMeteringAction.FLAG_AE) // could have many
                    // automatically cancel focus and metering after the duration below
                    .setAutoCancelDuration(3, TimeUnit.SECONDS)
                    .build();
            CameraControl cameraControl = mCameraX.getCameraControl();
            ListenableFuture future = cameraControl.startFocusAndMetering(action);
            future.addListener(() -> {
                try {
                    FocusMeteringResult result = (FocusMeteringResult) future.get();
                    LogUtils.i("Focus successful: " + result.isFocusSuccessful());
                    // process the result
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }, CameraXExecutors.mainThreadExecutor());
            return false;
        });
    }

    /**
     * Switch between the front and back cameras.
     *
     * @param view
     */
    public void onChangeGo(View view) {
        if (mCameraProvider != null) {
            isBack = !isBack;
            bindPreview(mCameraProvider, getBinding().previewView);
            if (mImageAnalysis != null) {
                mImageAnalysis.clearAnalyzer();
            }
        }
    }

    /**
     * Take a photo or record a video.
     */
    public void onCaptureGo(View view) {
        if (isVideoMode) {
            if (!isRecording) {
                // Check permission first.
                ensureAudioStoragePermission(REQUEST_STORAGE_VIDEO);
            } else {
                // Update status right now.
                toggleRecordingStatus();
            }
        } else {
            ensureAudioStoragePermission(REQUEST_STORAGE);
        }
    }

    public void onVideoGo(View view) {
        if (!isVideoMode) {
            // Check audio and storage permission before go to video mode.
            ensureAudioStoragePermission(REQUEST_STORAGE_VIDEO_BINDING);
        } else {
            // Check storage permission before go to camera mode.
            ensureAudioStoragePermission(REQUEST_STORAGE_BINDING);
        }
    }

    /**
     * Toggle QR-code scanning.
     */
    @SuppressLint({"RestrictedApi", "UnsafeOptInUsageError"})
    public void onAnalyzeGo(View view) {
        if (mImageAnalysis == null) {
            return;
        }
        if (!isAnalyzing) {
            mImageAnalysis.setAnalyzer(CameraXExecutors.mainThreadExecutor(), image -> {
                if (image.getFormat() == YUV_420_888) {
                    // Create a new blank bitmap
                    Bitmap bgBitmap = Bitmap.createBitmap(image.getWidth(), image.getHeight(), Bitmap.Config.ARGB_8888);
                    YuvToRgbConverter yuvToRgbConverter = new YuvToRgbConverter(mContext);
                    yuvToRgbConverter.yuvToRgb(image.getImage(), bgBitmap);
                    if (bgBitmap != null) {
                        String path = SDP + "photo/" + TimeExtUtils.getCurr7() + ".jpg";
                        boolean fileByDeleteOldFile = FileUtils.createFileByDeleteOldFile(path);
                        if (fileByDeleteOldFile) {
                            ImageUtils.save(bgBitmap, path, Bitmap.CompressFormat.JPEG);
                            LogUtils.i("Saved to: " + path);
                            ThreadUtils.runOnUiThread(() -> {
                                T.info("Saved to: " + path);
                            });
                        }
                    }
                }
                // analyzeQRCode() is expected to close the ImageProxy when done
                analyzeQRCode(image);
            });
        } else {
            // Remove the previously set analyzer
            mImageAnalysis.clearAnalyzer();
        }
        isAnalyzing = !isAnalyzing;
        getBinding().qrCodeZone.setVisibility(isAnalyzing ? View.VISIBLE : View.GONE);
    }


    private void setupCamera() {
        // Request a CameraProvider
        ListenableFuture<ProcessCameraProvider> cameraProviderFuture =
                ProcessCameraProvider.getInstance(this);
        // Check CameraProvider availability
        cameraProviderFuture.addListener(() -> {
            try {
                // The camera provider is now guaranteed to be available
                mCameraProvider = cameraProviderFuture.get();
                bindPreview(mCameraProvider, getBinding().previewView);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }, ContextCompat.getMainExecutor(this));
    }

    private void bindPreview(@NonNull ProcessCameraProvider cameraProvider, PreviewView previewView) {
        bindPreview(cameraProvider, previewView, false);
    }

    @SuppressLint("RestrictedApi")
    private void bindPreview(@NonNull ProcessCameraProvider cameraProvider,
                             PreviewView previewView, boolean isVideo) {
        // 1. Create the Preview
        Preview.Builder previewBuilder = new Preview.Builder();
        // 2. Specify the desired camera LensFacing option
        CameraSelector cameraSelector = isBack ? CameraSelector.DEFAULT_BACK_CAMERA
                : CameraSelector.DEFAULT_FRONT_CAMERA;
//        // CameraSelector.DEFAULT_BACK_CAMERA is already a fully built selector
//        CameraSelector cameraSelector = new CameraSelector.Builder()
//                .requireLensFacing(CameraSelector.LENS_FACING_BACK)
//                .build();
        /**
         * Image analysis
         *
         * The image analysis use case provides your app with CPU-accessible images on which
         * you can perform image processing, computer vision, or machine-learning inference.
         * The app implements an analyze() method that runs on every frame.
         */
        mImageAnalysis = new ImageAnalysis.Builder()
                // Enable the line below if RGBA output is needed.
//                .setOutputImageFormat (ImageAnalysis.OUTPUT_IMAGE_FORMAT_RGBA_8888)
                .setTargetRotation(previewView.getDisplay().getRotation())
                .setTargetResolution(new Size(720, 1440))
                .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                .build();
        // Configure the capture use case so the user can take photos
        ImageCapture.Builder captureBuilder = new ImageCapture.Builder()
                // Set the photo aspect ratio; currently only 16:9 and 4:3 are supported
                .setTargetAspectRatio(AspectRatio.RATIO_16_9)
                // To minimize capture latency, set ImageCapture.CaptureMode to CAPTURE_MODE_MINIMIZE_LATENCY;
                // to maximize photo quality, set it to CAPTURE_MODE_MAXIMIZE_QUALITY.
                .setCaptureMode(ImageCapture.CAPTURE_MODE_MAXIMIZE_QUALITY)
//                // JPEG compression quality [1,100]; higher is sharper. Default: 95 or 100, depending on capture mode
//                .setJpegQuality(100)
                // rotation
//                .setTargetRotation(previewView.getDisplay().getRotation())
//                .setFlashMode()
                ;
        //
        mVideoCapture = new VideoCapture.Builder() // video capture use case configuration
                .setTargetRotation(previewView.getDisplay().getRotation())
                // Frame rate: above 30 fps the files get very large; below 25 fps playback stutters.
                .setVideoFrameRate(25)
                // Bitrate: roughly 2000 Kbps for 720p and 6000 Kbps for 1080p.
                // Without this, phones with high-resolution cameras produce huge video files.
                .setBitRate(3 * 1024 * 1024)
//                // Set the output video resolution
//                .setTargetResolution(new Size(720, 1280))
                .build();
        //
        setPreviewExtender(previewBuilder, cameraSelector);
        mPreview = previewBuilder.build();
        //
//        setCaptureExtender(captureBuilder, cameraSelector);
        mImageCapture = captureBuilder.build();
        // Set the rotation angle from device-orientation events
        OrientationEventListener orientationEventListener = new OrientationEventListener(this) {
            @Override
            public void onOrientationChanged(int orientation) {
                int rotation;
                // Monitors orientation values to determine the target rotation value
                if (orientation >= 45 && orientation < 135) {
                    rotation = Surface.ROTATION_270;
                } else if (orientation >= 135 && orientation < 225) {
                    rotation = Surface.ROTATION_180;
                } else if (orientation >= 225 && orientation < 315) {
                    rotation = Surface.ROTATION_90;
                } else {
                    rotation = Surface.ROTATION_0;
                }
                mImageCapture.setTargetRotation(rotation);
            }
        };
        orientationEventListener.enable();
        //
        cameraProvider.unbindAll();
        // 3. Bind the selected camera and use cases to the lifecycle
        if (isVideo) {
            mCameraX = cameraProvider.bindToLifecycle(this, cameraSelector,
                    mPreview, mVideoCapture);
        } else {
            mCameraX = cameraProvider.bindToLifecycle(this, cameraSelector,
                    mPreview, mImageCapture, mImageAnalysis);
        }
        // 4. Connect the Preview to the PreviewView
        mPreview.setSurfaceProvider(previewView.getSurfaceProvider());
    }

    private void setPreviewExtender(Preview.Builder builder, CameraSelector cameraSelector) {
        BeautyPreviewExtender beautyPreviewExtender = BeautyPreviewExtender.create(builder);
        if (beautyPreviewExtender.isExtensionAvailable(cameraSelector)) {
            // Enable the extension if available.
            LogUtils.i("beauty preview extension enable");
            beautyPreviewExtender.enableExtension(cameraSelector);
        } else {
            LogUtils.i("beauty preview extension not available");
        }
    }

    private void setCaptureExtender(ImageCapture.Builder builder, CameraSelector cameraSelector) {
        NightImageCaptureExtender nightImageCaptureExtender = NightImageCaptureExtender.create(builder);
        if (nightImageCaptureExtender.isExtensionAvailable(cameraSelector)) {
            LogUtils.i("night capture extension enable");
            // Enable the extension if available.
            nightImageCaptureExtender.enableExtension(cameraSelector);
        } else {
            LogUtils.i("night capture extension not available");
        }

//        BokehImageCaptureExtender bokehImageCapture = BokehImageCaptureExtender.create(builder);
//        if (bokehImageCapture.isExtensionAvailable(cameraSelector)) {
//            // Enable the extension if available.
//            Log.d("Camera", "hdr extension enable");
//            bokehImageCapture.enableExtension(cameraSelector);
//        } else {
//            Log.d("Camera", "hdr extension not available");
//        }
//
//        HdrImageCaptureExtender hdrImageCaptureExtender = HdrImageCaptureExtender.create(builder);
//        if (hdrImageCaptureExtender.isExtensionAvailable(cameraSelector)) {
//            // Enable the extension if available.
//            Log.d("Camera", "night extension enable");
//            hdrImageCaptureExtender.enableExtension(cameraSelector);
//        } else {
//            Log.d("Camera", "night extension not available");
//        }
//
//        BeautyImageCaptureExtender beautyImageCaptureExtender = BeautyImageCaptureExtender.create(builder);
//        if (beautyImageCaptureExtender.isExtensionAvailable(cameraSelector)) {
//            // Enable the extension if available.
//            Log.d("Camera", "beauty extension enable");
//            beautyImageCaptureExtender.enableExtension(cameraSelector);
//        } else {
//            Log.d("Camera", "beauty extension not available");
//        }
    }

    private void ensureAudioStoragePermission(int requestId) {
        if (requestId == REQUEST_STORAGE || requestId == REQUEST_STORAGE_BINDING) {
            if (requestId == REQUEST_STORAGE) {
                takenPictureInternal();
            } else {
                toggleVideoMode();
            }
        } else if (requestId == REQUEST_STORAGE_VIDEO || requestId == REQUEST_STORAGE_VIDEO_BINDING) {
            if (requestId == REQUEST_STORAGE_VIDEO) {
                recordVideo();
            } else {
                toggleVideoMode();
            }
        }
    }


    /**
     * Take a photo.
     */
    @SuppressLint("RestrictedApi")
    private void takenPictureInternal() {
        if (mImageCapture != null) {
//            mImageCapture.takePicture(CameraXExecutors.ioExecutor(), new ImageCapture.OnImageCapturedCallback() {
//                @SuppressLint("UnsafeOptInUsageError")
//                @Override
//                public void onCaptureSuccess(@NonNull ImageProxy image) {
//                    super.onCaptureSuccess(image);
//                    // Logs 256 (JPEG), not the expected YUV_420_888
//                    LogUtils.i("Image format: " + image.getFormat());
//
//                    if (image.getFormat() != YUV_420_888) {
//                        return;
//                    }
//                    // Create a new blank bitmap
//                    Bitmap bgBitmap = Bitmap.createBitmap(image.getWidth(), image.getHeight(), Bitmap.Config.ARGB_8888);
//                    YuvToRgbConverter yuvToRgbConverter = new YuvToRgbConverter(mContext);
//                    yuvToRgbConverter.yuvToRgb(image.getImage(), bgBitmap);
//                    if (bgBitmap != null) {
//                        String path = SDP + "photo/" + TimeExtUtils.getCurr7() + ".jpg";
//                        boolean fileByDeleteOldFile = FileUtils.createFileByDeleteOldFile(path);
//                        if (fileByDeleteOldFile) {
//                            ImageUtils.save(bgBitmap, path, Bitmap.CompressFormat.JPEG);
//                            LogUtils.i("Saved to: " + path);
//                            ThreadUtils.runOnUiThread(() -> {
//                                T.info("Saved to: " + path);
//                            });
//                        }
//                    }
//                }
//
//                @Override
//                public void onError(@NonNull ImageCaptureException exception) {
//                    super.onError(exception);
//                    exception.printStackTrace();
//                }
//            });
            final ContentValues contentValues = new ContentValues();
            contentValues.put(MediaStore.MediaColumns.DISPLAY_NAME, CAPTURED_FILE_NAME
                    + "_" + TimeExtUtils.getCurr7());
            contentValues.put(MediaStore.MediaColumns.MIME_TYPE, CAPTURED_FILE_NAME_END);
            // Save through MediaStore via OutputFileOptions
            ImageCapture.OutputFileOptions outputFileOptions =
                    new ImageCapture.OutputFileOptions.Builder(
                            getContentResolver(),
                            MediaStore.Images.Media.EXTERNAL_CONTENT_URI, contentValues)
                            .build();
            mImageCapture.takePicture(outputFileOptions, CameraXExecutors.ioExecutor(),
                    new ImageCapture.OnImageSavedCallback() {
                        @Override
                        public void onImageSaved(@NonNull ImageCapture.OutputFileResults outputFileResults) {
                            // getSavedUri() may be null for some save destinations; guard before use.
                            android.net.Uri savedUri = outputFileResults.getSavedUri();
                            StringBuilder builder = new StringBuilder();
                            builder.append("Image saved to: ")
                                    .append(savedUri != null ? savedUri.getPath() : "(unknown)").append("\n");
                            LogUtils.i(builder);
                            ThreadUtils.runOnUiThread(() -> T.info(builder.toString()));
                        }

                        @Override
                        public void onError(@NonNull ImageCaptureException exception) {
                            exception.printStackTrace();
                        }
                    });
        }
    }
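A note on the commented-out `takePicture(Executor, OnImageCapturedCallback)` path above: the `ImageProxy` it receives is JPEG (format 256), not YUV, so `YuvToRgbConverter` cannot be applied. A JPEG frame can still become a `Bitmap`, though: for JPEG, the proxy has a single plane whose `ByteBuffer` holds the complete compressed stream, so copying it into a `byte[]` and passing that to `BitmapFactory.decodeByteArray` is enough. Below is a minimal sketch of the buffer copy (the `BitmapFactory` call is Android-only and shown only as a comment); `JpegBytes` and `drain` are illustrative names, not CameraX API:

```java
import java.nio.ByteBuffer;

public class JpegBytes {
    // Copy the remaining bytes of a ByteBuffer into a byte[].
    // For a JPEG ImageProxy, planes[0].getBuffer() is the full compressed JPEG stream,
    // so on Android the result can be fed straight to:
    //   Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    public static byte[] drain(ByteBuffer buffer) {
        byte[] bytes = new byte[buffer.remaining()];
        buffer.get(bytes);
        return bytes;
    }

    public static void main(String[] args) {
        // JPEG streams start with the SOI marker 0xFFD8 -- used here as demo data.
        ByteBuffer demo = ByteBuffer.wrap(new byte[]{(byte) 0xFF, (byte) 0xD8, (byte) 0xFF});
        byte[] out = drain(demo);
        System.out.println(out.length); // prints 3
    }
}
```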

    /**
     * Record a video.
     */
    @SuppressLint({"MissingPermission", "RestrictedApi"})
    private void recordVideo() {
        LogUtils.i("recordVideo() isCameraXHandling:" + isCameraXHandling);
        if (isCameraXHandling) return;
        final ContentValues contentValues = new ContentValues();
        contentValues.put(MediaStore.MediaColumns.DISPLAY_NAME, RECORDED_FILE_NAME
                + "_" + TimeExtUtils.getCurr7());
        contentValues.put(MediaStore.MediaColumns.MIME_TYPE, RECORDED_FILE_NAME_END);
        try {
            VideoCapture.OutputFileOptions outputFileOptions = new VideoCapture.OutputFileOptions.Builder(getContentResolver(),
                    MediaStore.Video.Media.EXTERNAL_CONTENT_URI, contentValues)
                    .build();

            mVideoCapture.startRecording(outputFileOptions, CameraXExecutors.ioExecutor()
                    , new VideoCapture.OnVideoSavedCallback() {
                        @Override
                        public void onVideoSaved(@NonNull VideoCapture.OutputFileResults outputFileResults) {
                            // getSavedUri() may be null for some save destinations; guard before use.
                            android.net.Uri savedUri = outputFileResults.getSavedUri();
                            LogUtils.i("Video saved to: " + (savedUri != null ? savedUri.getPath() : "(unknown)"));
                            videoRecordingPrepared();
                        }

                        @Override
                        public void onError(int videoCaptureError, @NonNull String message, @Nullable Throwable cause) {
                            LogUtils.i("startRecording error: " + message);
                            videoRecordingPrepared();
                        }
                    }
            );
        } catch (Exception e) {
            e.printStackTrace();
        }
        toggleRecordingStatus();
        isCameraXHandling = true;
    }

    @SuppressLint("UnsafeOptInUsageError")
    private void analyzeQRCode(@NonNull ImageProxy imageProxy) {
        // For YUV_420_888 the planes are: plane[0] = Y (luminance), plane[1] = U, plane[2] = V.
        // ZXing only needs the luminance plane, so plane[0] is enough for decoding.
        ImageProxy.PlaneProxy[] planeProxies = imageProxy.getPlanes();
        LogUtils.i("Image format: " + imageProxy.getFormat() + ", planes: " + planeProxies.length);
        // Analyze the image to produce a decode result
        ByteBuffer byteBuffer = imageProxy.getPlanes()[0].getBuffer();
        byte[] data = new byte[byteBuffer.remaining()];
        byteBuffer.get(data);

        int width = imageProxy.getWidth(), height = imageProxy.getHeight();
        PlanarYUVLuminanceSource source = new PlanarYUVLuminanceSource(
                data, width, height, 0, 0, width, height, false);
        BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
        Result result;
        try {
            MultiFormatReader multiFormatReader = new MultiFormatReader();
            result = multiFormatReader.decode(bitmap);
            LogUtils.i("result:" + result);
        } catch (Exception e) {
            e.printStackTrace();
            result = null;
        }
        showQRCodeResult(result);
        // Close the ImageProxy to hand it back to CameraX; otherwise no further frames are delivered
        imageProxy.close();
    }
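One caveat about `analyzeQRCode()`: it hands plane 0 to `PlanarYUVLuminanceSource` using `width` as the stride. That implicitly assumes the Y plane is tightly packed (`rowStride == width`); on devices where `getPlanes()[0].getRowStride()` is larger, each row carries padding bytes and the decoder sees garbled rows. A pure-Java sketch of stripping that padding first; `YPlaneCompactor` is an illustrative helper under this assumption, not part of CameraX or ZXing:

```java
public class YPlaneCompactor {
    // Strip per-row padding from a Y plane so it can be used as a tightly packed
    // width*height luminance array: copy `width` bytes out of each `rowStride`-long row.
    public static byte[] compact(byte[] yPlane, int rowStride, int width, int height) {
        if (rowStride == width) return yPlane; // already tight, nothing to do
        byte[] out = new byte[width * height];
        for (int row = 0; row < height; row++) {
            System.arraycopy(yPlane, row * rowStride, out, row * width, width);
        }
        return out;
    }

    public static void main(String[] args) {
        // 2x2 image stored with rowStride 3 (one padding byte, 99, per row).
        byte[] padded = {1, 2, 99, 3, 4, 99};
        byte[] tight = compact(padded, 3, 2, 2);
        System.out.println(java.util.Arrays.toString(tight)); // prints [1, 2, 3, 4]
    }
}
```

The compacted array can then replace `data` when constructing `PlanarYUVLuminanceSource`.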

    private void showQRCodeResult(@Nullable Result result) {
        if (getBinding() != null && getBinding().qrCodeResult != null) {
            getBinding().qrCodeResult.post(() ->
                    getBinding().qrCodeResult.setText(result != null ? "Link:\n" + result.getText() : ""));
            getBinding().qrCodeResult.playSoundEffect(SoundEffectConstants.CLICK);
        }
    }


    private void videoRecordingPrepared() {
        isCameraXHandling = false;
        // Keep disabled status for a while to avoid fast click error with "Muxer stop failed!".
        getBinding().capture.postDelayed(() -> getBinding().capture.setEnabled(true), 500);
    }

    /**
     * Toggle video mode.
     */
    private void toggleVideoMode() {
        LogUtils.i("isVideoMode: " + isVideoMode);
        isVideoMode = !isVideoMode;
        getBinding().recordVideo.setImageResource(isVideoMode ? R.drawable.ic_camera_new
                : R.drawable.ic_video);
        getBinding().capture.setImageResource(isVideoMode ? R.drawable.ic_capture_record
                : R.drawable.ic_capture);
        bindPreview(mCameraProvider, getBinding().previewView, isVideoMode);
    }

    /**
     * Toggle the recording status.
     */
    @SuppressLint("RestrictedApi")
    private void toggleRecordingStatus() {
        LogUtils.i("toggleRecordingStatus() isVideoMode:" + isVideoMode + " isRecording:" + isRecording);
        if (!isVideoMode) return;

        isRecording = !isRecording;
        getBinding().capture.setImageResource(isRecording
                ? R.drawable.ic_capture_record_pressing : R.drawable.ic_capture_record);

        // Stop recording when toggle to false.
        if (!isRecording && mVideoCapture != null) {
            Log.d("Camera", "toggleRecordingStatus() stopRecording");
            mVideoCapture.stopRecording();
            // Keep record button disabled till video recording truly stopped.
            getBinding().capture.post(() -> getBinding().capture.setEnabled(false));
        }
    }

    /**
     * Show the focus indicator.
     */
    private void showFocusView(int x, int y) {
        LogUtils.i("Focus x=" + x + "\ty=" + y);
        PopupWindow popupWindow = new PopupWindow(ViewGroup.LayoutParams.WRAP_CONTENT,
                ViewGroup.LayoutParams.WRAP_CONTENT);
        // popupWindow.setBackgroundDrawable(getDrawable(android.R.color.holo_blue_bright));
        ImageView imageView = new ImageView(this);
        imageView.setImageResource(R.drawable.ic_focus_view);
        popupWindow.setContentView(imageView);
        // popupWindow.showAtLocation(binding.previewView, Gravity.CENTER, x, y);
        popupWindow.showAsDropDown(getBinding().previewView, x, y);
        getBinding().previewView.postDelayed(popupWindow::dismiss, 600);
        // binding.previewView.playSoundEffect(SoundEffectConstants.CLICK);
    }
}
