Implementing the Zero Shutter Lag (ZSL) Feature

Preface

While studying 3A recently, I came across the ReprocessableCaptureSession and learned that it can be used to implement zero shutter lag, a feature that greatly speeds up photo capture. As it happens, I am writing a camera demo right now, so adding this feature would be a great optimization.

That, however, was also the start of a nightmare. I need to say up front that this article is not complete: by the time I finished writing, the final step still had not succeeded, so I decided to record my progress first and put the full implementation on hold.

What ZSL Is

ZSL (zero shutter lag) is a technique that reduces capture latency so that taking a photo and seeing the result happen almost instantly.

Once preview starts, the sensor and VFE produce preview and snapshot frames, and the most recent snapshot frames are kept in a buffer. When a capture is triggered, the system works out the actual capture time, finds the matching frame in the buffer, and returns that frame to the user; that is the "zero" in zero shutter lag.

Put simply: while previewing, the preview frames are also stored in a buffer, and when the shutter is pressed, the frame in that buffer that is closest to the capture time and in the best state is taken out and used.
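Conceptually (this is only an illustrative sketch with made-up names and a made-up capacity, not code from any particular implementation; it assumes java.util.ArrayDeque and android.media.Image), the buffer behaves like a small queue that always holds the latest few frames:

//Illustrative only: a bounded queue that always keeps the most recent frames
private static final int MAX_BUFFERED_FRAMES = 10;        //hypothetical capacity
private final ArrayDeque<Image> zslBuffer = new ArrayDeque<>();

private void onNewFrame(Image image) {
    if (zslBuffer.size() == MAX_BUFFERED_FRAMES) {
        zslBuffer.pollFirst().close();                     //evict and release the oldest frame
    }
    zslBuffer.addLast(image);                              //the newest frame goes to the tail
}

When the shutter is pressed, the entry whose timestamp is closest to the capture moment is taken from this buffer instead of waiting for a brand-new frame.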

The ZSL Flow

The ZSL flow has quite a few steps, so I am borrowing the flow diagram from 极客教程 to show it here.

Just skim it for now; without having studied some demos the flow is hard to follow, but once you have gone through the demo code as a whole and come back to it, the flow becomes very clear. Next we will walk through the code for each step.

Code Implementation

The core of a basic ZSL implementation lies in the logic for preview and for capture, so this post covers them in that order, from preview to capture. Note that, to get the most basic ZSL behavior working while writing this post, I stripped the code down as far as possible and its robustness is lacking, so don't scrutinize the details; focus on the logical flow, ideally alongside the flow diagram above.

Preview Implementation

As is well known, the features supported by camera hardware differ from device to device, so before writing any logic we must check whether the current device supports reprocessing; only then can we optimize capture with ZSL.

Checking for reprocess support

StreamConfigurationMap map = cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
//Get the capabilities this camera supports
int[] cameraCapabilities = cameraCharacteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
for (int cameraCapability : cameraCapabilities) {
    if (cameraCapability == CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_PRIVATE_REPROCESSING
            || cameraCapability == CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_YUV_REPROCESSING) {
        canReprocess = true;
    }
}

Once we know the device supports it, we can start writing the logic. First, let's understand how a ZSL preview differs from a normal preview.

From the above we can see that, compared with a normal preview, the tricky parts are the InputConfiguration and how to set up the extra surface; that extra surface also needs its own ImageReader. Solve these three points and the preview falls into place.

Declaring the InputConfiguration

Creating this configuration is simple and tricky at the same time: simple because you can instantiate it directly with new, and tricky for exactly the same reason. Let's look at the constructor parameters.

InputConfiguration(int width, int height, int format)

All three parameters must take values the device actually supports. The first two, width and height, must match a size the device reports. So where does that size come from? We can obtain it while configuring the camera.

if (map != null) {
    Size[] outputSizes = map.getOutputSizes(SurfaceTexture.class);
    Size size = getOptimalSize(outputSizes, width, height);
    previewSize = size;
    maxJpegOutputSize = getMaxOutputSize(map, ImageFormat.JPEG);
    minJpegOutputSize = getMinOutputSize(map, ImageFormat.JPEG);
    //Get the YUV output and input sizes used for reprocessing
    if (canReprocess) {
        maxprivateOutputSize = getMaxOutputSize(map, ImageFormat.YUV_420_888);
        minprivateOutputSize = getMinOutputSize(map, ImageFormat.YUV_420_888);

        maxprivateIutputSize = getMaxInputSize(map, ImageFormat.YUV_420_888);
        minprivateIutputSize = getMinInputSize(map, ImageFormat.YUV_420_888);
    }
}

//Get the largest supported input size
    private Size getMaxInputSize(StreamConfigurationMap map, int format) {
        Size maxInputSize = null;
        Size[] InputSizes = null;
        if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.M) {
            InputSizes = map.getInputSizes(format);
        }
        //Pick the largest of the supported input sizes
        if (InputSizes.length > 0) {
            maxInputSize = findMaxSize(InputSizes);
        }
        return maxInputSize;
    }

    //Get the smallest supported input size
    public Size getMinInputSize(StreamConfigurationMap map, int format) {
        Size minInputSize = null;
        Size[] inputSize = null;
        if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.M) {
            inputSize = map.getInputSizes(format);
        }
        //Pick the smallest of the supported input sizes
        if (inputSize.length > 0) {
            minInputSize = findMinSize(inputSize);
        }
        return minInputSize;
    }

    //Get the smallest supported output size
    private Size getMinOutputSize(StreamConfigurationMap map, int format) {
        Size minOutputSize = null;
        Size[] OutputSizes = map.getOutputSizes(format);
        //Pick the smallest of the supported output sizes
        if (OutputSizes.length > 0) {
            minOutputSize = findMinSize(OutputSizes);
        }
        return minOutputSize;
    }

    //Get the largest supported output size
    private Size getMaxOutputSize(StreamConfigurationMap map, int format) {
        Size maxOutputSize = null;
        Size[] OutputSizes = map.getOutputSizes(format);
        //Pick the largest of the supported output sizes
        if (OutputSizes.length > 0) {
            maxOutputSize = findMaxSize(OutputSizes);
        }
        return maxOutputSize;
    }

    //Find the size with the smallest area
    private Size findMinSize(Size[] Sizes) {
        Size minSize = Sizes[0];
        for (int i = 1; i < Sizes.length; i++) {
            //Compare by area (width * height)
            if (Sizes[i].getHeight() * Sizes[i].getWidth() < minSize.getWidth() * minSize.getHeight()) {
                minSize = Sizes[i];
            }
        }
        return minSize;
    }

    //Find the size with the largest area
    private Size findMaxSize(Size[] Sizes) {
        Size maxSize = Sizes[0];
        for (int i = 1; i < Sizes.length; i++) {
            //Compare by area (width * height)
            if (Sizes[i].getHeight() * Sizes[i].getWidth() > maxSize.getWidth() * maxSize.getHeight()) {
                maxSize = Sizes[i];
            }
        }
        return maxSize;
    }

The code is long, but you can skip it; all it does is query the input formats the device supports and the resolutions supported for that input format. Since the query returns a list of resolutions, I simply pick the largest one. The whole thing really comes down to two lines: get the array of Sizes, then choose one yourself. Which one you choose does matter, though; pick the wrong one and the later capture steps hit a fatal bug, which I will come back to.

int[] inputFormats = map.getInputFormats();
//getInputSizes() needs the chosen input format, e.g. YUV_420_888
Size[] inputSizes = map.getInputSizes(ImageFormat.YUV_420_888);

Generally, if reprocessing is supported, two input formats are available: PRIVATE and YUV_420_888. Either one can be used as the third InputConfiguration parameter; I recommend YUV_420_888, and I may explain why later. Overall it has worked better for me.
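As a minimal sketch of that choice (reusing the map from earlier and the getMaxInputSize helper above; InputConfiguration requires API 23+), checking that YUV_420_888 really is an allowed input format and then building the InputConfiguration could look like this:

//Sketch: verify YUV_420_888 is a supported input format before committing to it
boolean yuvInputSupported = false;
for (int format : map.getInputFormats()) {
    if (format == ImageFormat.YUV_420_888) {
        yuvInputSupported = true;
    }
}
if (yuvInputSupported) {
    //Use one of the sizes reported for this input format (here the largest one)
    Size inputSize = getMaxInputSize(map, ImageFormat.YUV_420_888);
    InputConfiguration inputConfiguration = new InputConfiguration(
            inputSize.getWidth(), inputSize.getHeight(), ImageFormat.YUV_420_888);
}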

With the steps above, all three InputConfiguration parameters are in hand and we can move on. Once we have the InputConfiguration, we can start thinking about creating the reprocessable capture session for preview.

Creating the ReprocessableCaptureSession

As usual, let's look at the function that creates the session and the parameters it needs.

public abstract void createReprocessableCaptureSession(@NonNull InputConfiguration inputConfig,
            @NonNull List<Surface> outputs, @NonNull CameraCaptureSession.StateCallback callback,
            @Nullable Handler handler)

There are four parameters. The first, inputConfig, is the InputConfiguration we created earlier; the second is the list of the three surfaces our preview needs; the third is the callback invoked once the session has been created; and the fourth is a Handler indicating which thread the session work should run on.

Let's start with the second parameter, the surface list. We need a surface for the preview, a surface for saving the picture, and a surface that receives the frame data to be reprocessed. The first two are obtained the same way as in a normal preview; for the third we set up an additional ImageReader and call getSurface() on it. Here is the code that sets up that ImageReader.

privateImageReader = ImageReader.newInstance(maxprivateIutputSize.getWidth(), maxprivateIutputSize.getHeight(),
        ImageFormat.YUV_420_888, CIRCULAR_BUFFER_SIZE + 1);
privateImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage();
        //Cache the frame: always keep only the most recent image,
        //closing the previous one so the reader does not run out of buffers
        if (lastImage != null) {
            lastImage.close();
        }
        lastImage = image;
    }
}, null);

One point to note: be clear about the relationship between the InputConfiguration and this ImageReader. They must correspond one-to-one, because this ImageReader holds the frames that will be fed back through the input surface described by the InputConfiguration. The resolution and the format must therefore match exactly; make sure they do, especially the width, the height, and the format!

Next, let's look at the important parts of the onImageAvailable callback. Before that, here is a diagram.

If the diagram above makes sense, the rest is straightforward: during preview we need to cache two things, the result and the image. I will get to caching the result shortly; the code above handles caching the image. Keep in mind that privateImageReader keeps receiving frames during preview, i.e. onImageAvailable is called continuously, so the purpose of that code is simply to always hold on to the most recently received image.
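In a fuller implementation, each cached image would be paired with the TotalCaptureResult whose sensor timestamp matches it before being reprocessed; the demo above simplifies this to keeping just the single latest image and result. A sketch of the matching idea (a hypothetical helper, not in the demo):

//Sketch: match a cached Image to its TotalCaptureResult by sensor timestamp
private TotalCaptureResult findResultFor(Image image, List<TotalCaptureResult> cachedResults) {
    long frameTimestamp = image.getTimestamp();
    for (TotalCaptureResult result : cachedResults) {
        Long resultTimestamp = result.get(CaptureResult.SENSOR_TIMESTAMP);
        if (resultTimestamp != null && resultTimestamp == frameTimestamp) {
            return result;
        }
    }
    return null;    //no matching metadata cached, so this frame should be skipped
}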

That covers the second parameter of createReprocessableCaptureSession. On to the third parameter, the state callback.

new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(@NonNull CameraCaptureSession session) {
        mCaptureSession = session;
        //The ImageWriter must be bound to the session's input surface
        recaptureImageWriter = ImageWriter.newInstance(session.getInputSurface(), 11);
        setRepeatCapture(mPreviewRequestBuilder);
    }

    @Override
    public void onConfigureFailed(@NonNull CameraCaptureSession session) {
        System.out.println("Session configuration failed");
    }
}

The callback does two things. First, it creates the ImageWriter; this writer must also correspond to the InputConfiguration, which is easy because we can bind it directly to session.getInputSurface(). Second, it kicks off the repeating capture request.
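I have not shown the CaptureRequest setup, but one part of it matters for ZSL, so here is a rough sketch under my own assumptions (I use TEMPLATE_ZERO_SHUTTER_LAG; a plain preview template also displays fine): the repeating request has to target both the preview surface and the YUV ImageReader's surface, otherwise onImageAvailable never fires and there is nothing to cache.

//Sketch (assumed, not the demo's exact code): feed both the screen and the frame cache
//createCaptureRequest throws CameraAccessException; handle it in the caller
mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_ZERO_SHUTTER_LAG);
mPreviewRequestBuilder.addTarget(mPreviewSurface);             //on-screen preview
mPreviewRequestBuilder.addTarget(privateImageReaderSurface);   //frames cached for reprocessing
mPreviewRequest = mPreviewRequestBuilder.build();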

The rest of my CaptureRequest setup code is omitted; let's go straight to the repeating-request call.

mCaptureSession.setRepeatingRequest(mPreviewRequest, new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
        super.onCaptureCompleted(session, request, result);
        //Cache the metadata: always keep the most recent TotalCaptureResult
        lastTotalResult = result;
    }
}, null);

As you can see, the result is cached in onCaptureCompleted. With that, the ZSL preview is done.
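For reference, this is how the pieces come together when the session is created, simplified from the full code at the end of this post (sessionStateCallback stands for the anonymous StateCallback shown above):

InputConfiguration inputConfiguration = new InputConfiguration(
        maxprivateIutputSize.getWidth(), maxprivateIutputSize.getHeight(), ImageFormat.YUV_420_888);
mCameraDevice.createReprocessableCaptureSession(
        inputConfiguration,
        Arrays.asList(mPreviewSurface, jpegImageReaderSurface, privateImageReaderSurface),
        sessionStateCallback,
        null);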

Capture Implementation

Unlike a normal capture, a ZSL capture does not need to acquire new frame data; it only needs the previously cached result and image. You might ask: if we already have the image from preview, why not just convert it to JPEG and save it, instead of going through all the remaining steps? I had the same idea and tried exactly that, but the saved photos came out extremely strange, nothing like what I expected. Besides, only by going through the full ZSL pipeline do we actually get ZSL, and get the maximum speed benefit.

If the preview above works, capture is not a big problem; one method handles it.

private void recaptureRequest(ZSLPair pair) {
        captureStart = System.currentTimeMillis();

        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            try {
                //Build the reprocess request from the cached TotalCaptureResult
                CaptureRequest.Builder reprocessCaptureRequest = mCameraDevice.createReprocessCaptureRequest(lastTotalResult);
                reprocessCaptureRequest.addTarget(jpegImageReaderSurface);
                //The image returned by dequeueInputImage and the one passed to queueInputImage must match;
                //in my case they differ, which is very likely the main reason the capture fails
                //Image image = recaptureImageWriter.dequeueInputImage();
                System.out.println(lastImage.getWidth() + ":" + lastImage.getHeight() + ":" + lastImage.getFormat());
                //lastImage is in YUV_420_888 format
                recaptureImageWriter.queueInputImage(lastImage);
                //reprocessCaptureRequest.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_STILL_CAPTURE);
                mCaptureSession.capture(reprocessCaptureRequest.build(), new CameraCaptureSession.CaptureCallback() {
                    @Override
                    public void onCaptureStarted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, long timestamp, long frameNumber) {
                        super.onCaptureStarted(session, request, timestamp, frameNumber);
                        System.out.println("onCaptureStarted");
                    }

                    @Override
                    public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
                        super.onCaptureProgressed(session, request, partialResult);
                        System.out.println("onCaptureProgressed");
                    }

                    @Override
                    public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                        System.out.println("onCaptureCompleted");
                        super.onCaptureCompleted(session, request, result);
                    }

                    @Override
                    public void onCaptureFailed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
                        super.onCaptureFailed(session, request, failure);
                        System.out.println("onCaptureFailed");
                        boolean b = failure.wasImageCaptured();
                        System.out.println(b);
                    }

                    @Override
                    public void onCaptureSequenceCompleted(@NonNull CameraCaptureSession session, int sequenceId, long frameNumber) {
                        super.onCaptureSequenceCompleted(session, sequenceId, frameNumber);
                        System.out.println("onCaptureSequenceCompleted");
                        //The image would be saved here on a worker thread (omitted in this simplified version)
                    }

                    @Override
                    public void onCaptureSequenceAborted(@NonNull CameraCaptureSession session, int sequenceId) {
                        super.onCaptureSequenceAborted(session, sequenceId);
                        System.out.println("onCaptureSequenceAborted");
                    }

                    @Override
                    public void onCaptureBufferLost(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull Surface target, long frameNumber) {
                        super.onCaptureBufferLost(session, request, target, frameNumber);
                        System.out.println("onCaptureBufferLost");
                        System.out.println(target);
                    }
                }, null);
                isFinishSavePhoto = true;
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
    }

It looks long, but the core is just four lines:

CaptureRequest.Builder reprocessCaptureRequest = mCameraDevice.createReprocessCaptureRequest(lastTotalResult);
reprocessCaptureRequest.addTarget(jpegImageReaderSurface);
recaptureImageWriter.queueInputImage(lastImage);
mCaptureSession.capture(...)

The first line creates the reprocess CaptureRequest from the previously cached result.

The second line adds the surface of the ImageReader used for saving the picture as a target of that request.

The third line submits the previously cached image to the camera HAL through the ImageWriter.

The fourth line executes the capture request; when it completes, the ImageReader used for saving receives its callback, and the image can be saved there.
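For reference, the listener of the ImageReader that receives the final output simply hands the Image to a saver thread (taken from the full code below). Note that with reprocessing enabled this reader is currently created with the YUV_420_888 format, which is exactly where the unfinished saving step mentioned next comes in.

jpegImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireLatestImage();
        //Save the image on a worker thread
        ImageSaver imageSaver = new ImageSaver(getContext(), image);
        new Thread(imageSaver).start();
        Toast.makeText(getContext(), "Image saved", Toast.LENGTH_SHORT).show();
    }
}, null);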

Note that the code for saving the YUV image is still unfinished in my project; just be aware of what it is supposed to do.
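Here is a minimal sketch of what that saving step could look like, under my own assumptions and not yet verified in the demo: copy the YUV_420_888 Image into an NV21 byte array while respecting the plane strides, then let android.graphics.YuvImage produce the JPEG bytes.

//Sketch: convert a YUV_420_888 Image to JPEG bytes via NV21 (assumes even width/height and an open Image)
private static byte[] yuv420ToJpeg(Image image, int quality) {
    int width = image.getWidth();
    int height = image.getHeight();
    byte[] nv21 = new byte[width * height * 3 / 2];

    //Copy the Y plane row by row, respecting its row stride
    Image.Plane yPlane = image.getPlanes()[0];
    ByteBuffer yBuf = yPlane.getBuffer();
    int yRowStride = yPlane.getRowStride();
    int pos = 0;
    for (int row = 0; row < height; row++) {
        yBuf.position(row * yRowStride);
        yBuf.get(nv21, pos, width);
        pos += width;
    }

    //Interleave V and U (NV21 layout is Y followed by VU), respecting row and pixel strides
    ByteBuffer uBuf = image.getPlanes()[1].getBuffer();
    ByteBuffer vBuf = image.getPlanes()[2].getBuffer();
    int chromaRowStride = image.getPlanes()[1].getRowStride();
    int chromaPixelStride = image.getPlanes()[1].getPixelStride();
    for (int row = 0; row < height / 2; row++) {
        for (int col = 0; col < width / 2; col++) {
            int idx = row * chromaRowStride + col * chromaPixelStride;
            nv21[pos++] = vBuf.get(idx);
            nv21[pos++] = uBuf.get(idx);
        }
    }

    //Let YuvImage handle the JPEG encoding
    YuvImage yuvImage = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuvImage.compressToJpeg(new Rect(0, 0, width, height), quality, out);
    return out.toByteArray();
}

The resulting byte array could then be written to a file or to MediaStore the same way a normal JPEG capture would be saved.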

Summary

With that, the whole ZSL optimization is complete. You may still feel lost after reading this; I was in the same place a few days ago. I recommend studying a demo first; after going through it, coming back to this article should make things much clearer.

That demo has a few small bugs, but it is really good for learning. It is not written in Java, but with some effort it is still easy to follow.

Finally, here is all of my code. It is quite messy, so it may be hard to follow.

package com.yjs.cameraapplication.FragmentPackage;

import static android.hardware.camera2.CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL;
import static android.view.MotionEvent.ACTION_DOWN;

import android.Manifest;
import android.annotation.SuppressLint;
import android.app.AlertDialog;
import android.content.ContentResolver;
import android.content.Context;
import android.content.DialogInterface;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.res.Configuration;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.ColorFilter;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.Paint;
import android.graphics.Path;
import android.graphics.PixelFormat;
import android.graphics.Rect;
import android.graphics.RectF;
import android.graphics.SurfaceTexture;
import android.graphics.YuvImage;
import android.graphics.drawable.Drawable;
import android.graphics.drawable.GradientDrawable;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureFailure;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.InputConfiguration;
import android.hardware.camera2.params.MeteringRectangle;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.media.ImageWriter;
import android.media.MediaMetadataRetriever;
import android.net.Uri;
import android.os.Build;
import android.os.Bundle;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import androidx.core.graphics.drawable.RoundedBitmapDrawable;
import androidx.core.graphics.drawable.RoundedBitmapDrawableFactory;
import androidx.core.view.MotionEventCompat;
import androidx.fragment.app.DialogFragment;
import androidx.fragment.app.Fragment;

import android.os.Environment;
import android.os.Handler;
import android.os.Looper;
import android.os.Message;
import android.provider.MediaStore;
import android.util.Log;
import android.util.Size;
import android.util.SparseIntArray;
import android.view.Display;
import android.view.LayoutInflater;
import android.view.MotionEvent;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ImageButton;
import android.widget.ImageView;
import android.widget.Toast;

import com.yjs.cameraapplication.MainActivity;
import com.yjs.cameraapplication.MyOrientoinListener;
import com.yjs.cameraapplication.PhotoAlbumActivity;
import com.yjs.cameraapplication.R;
import com.yjs.cameraapplication.SelfView.MyImageView;
import com.yjs.cameraapplication.ZslHelper.CircularImageBuffer;
import com.yjs.cameraapplication.ZslHelper.ZSLCoordinator;
import com.yjs.cameraapplication.ZslHelper.ZSLPair;
import com.yjs.cameraapplication.utils.CoordinateTransformer;

import org.w3c.dom.ls.LSOutput;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.FilenameFilter;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.sql.SQLOutput;
import java.sql.Time;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.TimeUnit;

/**
 * A simple {@link Fragment} subclass.
 * Use the {@link TakePhotoFragment#newInstance} factory method to
 * create an instance of this fragment.
 */
public class TakePhotoFragment extends Fragment implements View.OnClickListener {
    private static final String TAG = "TakePhotoFragment";
    private static final String ARG_PARAM1 = "param1";
    private static final String ARG_PARAM2 = "param2";
    private String mParam1;
    private String mParam2;
    /*View declarations for the fragment*/
    private TextureView textureView;
    private ImageButton takePhotoBtn;
    private ImageView mImageView;
    /*private MyImageView mImageView;*/
    private ImageButton changeCamBtn;
    private ImageButton flashState;
    /*Besides the views, some other parameters are needed*/
    private CameraManager cameraManager;
    private String mCameraId; //Camera ID
    private Size previewSize; //Preview resolution
    private ImageReader mImageReader; //Image reader
    private static CameraDevice mCameraDevice;   //Camera device
    private static CameraCaptureSession mCaptureSession;   //Capture session
    private CaptureRequest mPreviewRequest;      //Preview request
    private CaptureRequest.Builder mPreviewRequestBuilder;   //Builder used to create the preview request
    private Surface mPreviewSurface;  //Surface the preview is drawn on
    //Permissions that need to be requested
    private String[] permissions = {Manifest.permission.CAMERA, Manifest.permission.READ_EXTERNAL_STORAGE,
            Manifest.permission.WRITE_EXTERNAL_STORAGE, Manifest.permission.RECORD_AUDIO};
    private List<String> permissionList = new ArrayList();
    //List of images
    List<String> imageList = new ArrayList<>();
    private Boolean isCreated = false;
    private Boolean isLeave = false;
    private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
    //Path of the latest image, used for the thumbnail
    private String lastImagePath;
    private boolean isFinishSavePhoto = true;
    private MyOrientoinListener myOrientoinListener;
    //Variables for pinch-to-zoom below
    private float stepWidth; //zoom step in width
    private float stepHeigh; //zoom step in height
    private Rect rect;
    //Maximum digital zoom
    private float max_digital_zoom;
    //Minimum crop width and height
    private float minWidth;
    private float minHeigh;
    //Maximum zoom level
    private final int maxZoom = 100;
    private int currentZoom = 0;
    double oldlength = 0;
    private Rect zoomRect;
    //Flash mode button states
    private LinkedHashMap<Integer, Integer> flashMap;
    private final int FLASH_OFF = 0;
    private final int FLASH_ON = 1;
    private final int FLASH_AUTO = 2;
    private final int FLASH_TORCH = 3;
    private int currentFlashState = 0;
    //Whether the device supports reprocessing
    private boolean canReprocess = false;
    private Size maxprivateOutputSize;
    private Size maxJpegOutputSize;
    private Size minprivateOutputSize;
    private Size minJpegOutputSize;
    private Size maxprivateIutputSize;
    private Size minprivateIutputSize;

    private final int CIRCULAR_BUFFER_SIZE = 10;
    //ImageReaders backing the different surfaces
    private ImageReader jpegImageReader;
    private ImageReader privateImageReader;
    //ZSL coordinator
    private ZSLCoordinator zslCoordinator;
    //Input surface
    private Surface privateImageReaderSurface;
    //Surface that will later replace the one generated by mImageReader
    private Surface jpegImageReaderSurface;
    //Debug capture result
    private TotalCaptureResult debugResult=null;
    //Debug image
    private Image debugImage=null;
    //Capture start time
    private long captureStart;
    private ImageWriter recaptureImageWriter;
    private Image.Plane[] planes;
    private TotalCaptureResult lastTotalResult;
    private Image lastImage;

    public TakePhotoFragment() {
        // Required empty public constructor
    }

    /**
     * Use this factory method to create a new instance of
     * this fragment using the provided parameters.
     *
     * @param param1 Parameter 1.
     * @param param2 Parameter 2.
     * @return A new instance of fragment TakePhotoFragment.
     */
    // TODO: Rename and change types and number of parameters
    public static TakePhotoFragment newInstance(String param1, String param2) {
        TakePhotoFragment fragment = new TakePhotoFragment();
        Bundle args = new Bundle();
        args.putString(ARG_PARAM1, param1);
        args.putString(ARG_PARAM2, param2);
        fragment.setArguments(args);
        return fragment;
    }

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (getArguments() != null) {
            mParam1 = getArguments().getString(ARG_PARAM1);
            mParam2 = getArguments().getString(ARG_PARAM2);
        }
        myOrientoinListener = new MyOrientoinListener(getContext());
        myOrientoinListener.enable();
        //Load the flash mode icons
        flashMap = new LinkedHashMap<Integer, Integer>();
        flashMap.put(FLASH_OFF, R.mipmap.flashoff);
        flashMap.put(FLASH_ON, R.mipmap.flashon);
        flashMap.put(FLASH_AUTO, R.mipmap.autoflash);
        flashMap.put(FLASH_TORCH, R.mipmap.flashtorch);

    }

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        Log.d(TAG, "onCreateView: success");
        // Inflate the layout for this fragment
        View view = inflater.inflate(R.layout.fragment_take_photo, container, false);
        //Runtime permissions
        isCreated = true;
        return view;
    }

    private void previewZoom(boolean isZoomBig) {
        //isZoomBig == true means zoom in, false means zoom out
        if (mCameraDevice == null || mCaptureSession == null || mPreviewRequest == null) {
            return;
        }
        if (currentZoom < maxZoom && isZoomBig) {
            //Zoom in
            currentZoom++;
        }
        if (currentZoom > 0 && !isZoomBig) {
            //Zoom out
            currentZoom--;
        }
        int cropw = (int) (stepWidth * currentZoom);
        int croph = (int) (stepHeigh * currentZoom);
        zoomRect = new Rect(this.rect.left + cropw, this.rect.top + croph, this.rect.right - cropw, this.rect.bottom - croph);
        mPreviewRequestBuilder.set(CaptureRequest.SCALER_CROP_REGION, zoomRect);
        setRepeatCapture(mPreviewRequestBuilder);
        //setZoomCaptrue();
    }

    //Observe how the ViewPager affects the fragment lifecycle
    @Override
    public void onPause() {
        super.onPause();
        Log.d(TAG, "onPause: success");
        closeCamera();
        if (textureView.isAvailable()) {
            //If the TextureView is still available, this is any case other than switching ViewPager pages
            isLeave = true;
        }
        /*else {
            //Not available means the ViewPager page was switched directly
            closeCamera();
        }*/

    }

    @Override
    public void onStop() {
        super.onStop();
        Log.d(TAG, "onStop: success");
    }

    @Override
    public void onDestroyView() {
        super.onDestroyView();
        Log.d(TAG, "onDestroyView: success");
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        Log.d(TAG, "onDestroy: success");
        myOrientoinListener.disable();
    }

    @Override
    public void onResume() {
        super.onResume();
        Log.d(TAG, "onResume: success");
        System.out.println("++++++++++");
        System.out.println(textureView.isAvailable());
        currentZoom = 0;
        cameraManager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
        if (textureView.isAvailable()) {
            try {
                CameraCharacteristics cameraCharacteristics = cameraManager.getCameraCharacteristics(mCameraId);
                rect = cameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
                //Reset to the initial state
                zoomRect = rect;
                currentZoom = 0;
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
            if (isLeave) {

                String getimagepath = getimagepath();
                Bitmap bitmap = null;
                if (getimagepath == null) {
                    Canvas canvas = new Canvas();
                    bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.cam);
                    Paint paint = new Paint();
                    paint.setColor(Color.BLACK);
                    canvas.drawBitmap(bitmap, bitmap.getWidth(), bitmap.getHeight(), paint);
                } else {
                    if (getimagepath.contains("mp4")) {
                        //For a video, grab its first frame
                        MediaMetadataRetriever object = new MediaMetadataRetriever();
                        object.setDataSource(getimagepath);
                        //frameTime is in microseconds (us)
                        bitmap = object.getFrameAtTime(1, MediaMetadataRetriever.OPTION_CLOSEST);
                    } else {
                        bitmap = BitmapFactory.decodeFile(getimagepath);
                    }
                }

                mImageView.setImageBitmap(bitmap);
                openCamera();
                isLeave = false;
            } else {
                try {
                    CameraCharacteristics cameraCharacteristics = cameraManager.getCameraCharacteristics(mCameraId);
                    StreamConfigurationMap map = cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                    Size[] outputSizes = map.getOutputSizes(SurfaceTexture.class);
                    Size optimalSize = getOptimalSize(outputSizes, textureView.getWidth(), textureView.getHeight());
                    previewSize = optimalSize;
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }

                openCamera();
            }
        } else {
            textureView.setSurfaceTextureListener(textureListener);
        }
    }

    @Override
    public void onConfigurationChanged(@NonNull Configuration newConfig) {
        super.onConfigurationChanged(newConfig);

        switch (newConfig.orientation) {

            case Configuration.ORIENTATION_PORTRAIT://Portrait

                Log.i(TAG, "Portrait");

                break;

            case Configuration.ORIENTATION_LANDSCAPE://Landscape

                Log.i(TAG, "Landscape");

            default:

                break;

        }
    }

    @Override
    public void onStart() {
        super.onStart();
        Log.d(TAG, "onStart: success");
        /*if (textureView.isAvailable()){

            openCamera();
        }*/
    }

    @Override
    public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) {
        super.onViewCreated(view, savedInstanceState);
        Log.d(TAG, "onViewCreated: success");
        initView(view);
        view.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {

                if (event.getPointerCount() == 2) {
                    //Only act when two touch points are detected
                    switch (event.getAction() & MotionEvent.ACTION_MASK) {
                        case MotionEvent.ACTION_POINTER_DOWN:
                            //On press, record the initial distance between the two fingers
                            float x = event.getX(0) - event.getX(1);
                            float y = event.getY(0) - event.getY(1);
                            oldlength = Math.sqrt(Math.pow(x, 2) + Math.pow(y, 2));
                            break;
                        case MotionEvent.ACTION_MOVE:
                            //While the fingers move
                            float newx = event.getX(0) - event.getX(1);
                            float newy = event.getY(0) - event.getY(1);
                            double newlength = Math.sqrt(Math.pow(newx, 2) + Math.pow(newy, 2));
                            System.out.println("对比一下");
                            System.out.println(oldlength + ":" + newlength);

                            if (newlength >= oldlength) {
                                //Zoom in
                                previewZoom(true);
                            } else {
                                //Zoom out
                                previewZoom(false);
                            }
                            break;
                    }
                }
                return true;
            }
        });

    }


    private void closeCamera() {
        Log.d(TAG, "closeCamera: success");
        //Close the session first
        if (mCaptureSession != null) {
            mCaptureSession.close();
        }
        if (mCameraDevice != null) {
            mCameraDevice.close();
        }
    }

    //Set the photo orientation
    private void front() {
        //Front camera: keep photos upright
        Log.d(TAG, "front: success");
        ORIENTATIONS.append(0, 270);
        ORIENTATIONS.append(90, 180);
        ORIENTATIONS.append(180, 90);
        ORIENTATIONS.append(270, 0);
    }

    private void rear() {
        //Rear camera: keep photos upright
        Log.d(TAG, "rear: success");
        ORIENTATIONS.append(0, 90);
        ORIENTATIONS.append(90, 180);
        ORIENTATIONS.append(180, 270);
        ORIENTATIONS.append(270, 0);
    }

    //Bind views
    private void initView(View view) {
        textureView = view.findViewById(R.id.textureView);
        textureView.setOnTouchListener(onTouchLister);
        takePhotoBtn = view.findViewById(R.id.takePicture);
        takePhotoBtn.setOnClickListener(this);
        mImageView = view.findViewById(R.id.image_show);
        mImageView.setOnClickListener(this);
        changeCamBtn = view.findViewById(R.id.change);
        changeCamBtn.setOnClickListener(this);
        flashState = view.findViewById(R.id.flashstate);
        flashState.setOnClickListener(this);
    }

    private View.OnTouchListener onTouchLister = new View.OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            int actionMasked = MotionEventCompat.getActionMasked(event);
            switch (actionMasked) {
                case ACTION_DOWN:
                    //Tap on screen: focus on the tapped area
                    float fingerX = event.getX();
                    float fingerY = event.getY();
                    //triggerFocusArea(v,fingerX,fingerY);
            }
            return false;
        }
    };

    //Focus handling
    private void triggerFocusArea(View v, float fingerX, float fingerY) {
        //A focus Rect needs to be set
        //Use a square roughly eighty dp on each side
        //This approach is not strictly correct: the preview tap coordinates and the camera's coordinate system differ, so a transform is needed
        int length = (int) (getResources().getDisplayMetrics().density * 80);
        CameraManager cameraManager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
        CameraCharacteristics cameraCharacteristics = null;
        try {
            cameraCharacteristics = cameraManager.getCameraCharacteristics(mCameraId);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }

        RectF rectF = new RectF(0, 0, v.getWidth(), v.getHeight());
        CoordinateTransformer coordinateTransformer = new CoordinateTransformer(cameraCharacteristics, rectF);
        RectF focusRect = new RectF((int) fingerX - length / 2, (int) fingerY - length / 2, (int) fingerX + length / 2, (int) fingerY + length / 2);
        RectF finalRect = coordinateTransformer.toCameraSpace(focusRect);
        Rect rect = new Rect();
        finalRect.round(rect);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[]{new MeteringRectangle(rect, 1000)});
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);

        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_REGIONS, new MeteringRectangle[]{new MeteringRectangle(rect, 1000)});

        try {
            mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(), null, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
        //With auto exposure, starting focus in a dark environment also triggers the flash/precapture exposure
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CameraMetadata.CONTROL_AE_PRECAPTURE_TRIGGER_START);
        try {
            mCaptureSession.capture(mPreviewRequestBuilder.build(), new CameraCaptureSession.CaptureCallback() {
                @Override
                public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                    Log.d(TAG, "onCaptureCompleted: success");
                    super.onCaptureCompleted(session, request, result);
                    //After focusing finishes, set the trigger back to IDLE, otherwise it keeps focusing
                    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_IDLE);
                    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_IDLE);
                    Integer integer = result.get(CaptureResult.CONTROL_AE_STATE);
                    Log.d(TAG, "自动曝光状态:" + integer);
                }
            }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }

    }

    private void getPermission() {
        Log.d(TAG, "getPermission: success");
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            for (String permission : permissions) {
                if (ContextCompat.checkSelfPermission(getContext(), permission) != PackageManager.PERMISSION_GRANTED) {
                    permissionList.add(permission);
                }
            }
            if (!permissionList.isEmpty()) {
                //Request the missing permissions
                ActivityCompat.requestPermissions(getActivity(), permissionList.toArray(new String[permissionList.size()]), 1);
            } else {
                //All permissions are already granted
                //So register the preview TextureView's listener
                textureView.setSurfaceTextureListener(textureListener);
            }
        }
    }
    //This can only live in an Activity; next time put the permission handling in the Activity to save trouble
    /*@Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        Log.d(TAG, "onRequestPermissionsResult: success");
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (requestCode==1){
            if (grantResults.length!=0){
                //Some permissions were not granted
                getPermission();
            }
            else {
                //All permissions granted
                openCamera();
            }
        }
    }*/


    /*SurfaceTexture state callback for the TextureView*/
    TextureView.SurfaceTextureListener textureListener = new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
            Log.d(TAG, "onSurfaceTextureAvailable: success");
            //First configure the camera, then open it
            /*setLastImagePath();*/
            String getimagepath = getimagepath();
            Bitmap bitmap = null;
            if (getimagepath == null) {
                Canvas canvas = new Canvas();
                bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.cam);
                Paint paint = new Paint();
                paint.setColor(Color.BLACK);
                canvas.drawBitmap(bitmap, bitmap.getWidth(), bitmap.getHeight(), paint);
            } else {
                if (getimagepath.contains("mp4")) {
                    //For a video, grab its first frame
                    MediaMetadataRetriever object = new MediaMetadataRetriever();
                    object.setDataSource(getimagepath);
                    //frameTime is in microseconds (us)
                    bitmap = object.getFrameAtTime(1, MediaMetadataRetriever.OPTION_CLOSEST);
                } else {
                    bitmap = BitmapFactory.decodeFile(getimagepath);
                }
            }

            mImageView.setImageBitmap(bitmap);
            setupCamera(width, height);
            zslCoordinator = new ZSLCoordinator();
            if (isCreated) {
                openCamera();
                isCreated = false;
            }
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
            Log.d(TAG, "onSurfaceTextureSizeChanged: success");
            configureTransform(width, height);
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
            return false;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture surface) {

        }
    };

    //Handle screen rotation
    private void configureTransform(int viewWidth, int viewHeight) {
        Log.d(TAG, "configureTransform: success");
        if (textureView == null || previewSize == null) {
            return;
        }
        int rotation = getActivity().getWindowManager().getDefaultDisplay().getRotation();
        Log.i("TAGggg", "rotation: " + rotation);
        Matrix matrix = new Matrix();
        //Dimensions after the change
        RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
        //Dimensions before the change
        RectF bufferRect = new RectF(0, 0, previewSize.getHeight(), previewSize.getWidth());
        float centerX = viewRect.centerX();
        float centerY = viewRect.centerY();
        if (rotation == Surface.ROTATION_90 || rotation == Surface.ROTATION_270) {
            bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
            matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
            float scale = Math.max((float) viewHeight / previewSize.getHeight(),
                    (float) viewWidth / previewSize.getWidth());
            matrix.postScale(scale, scale, centerX, centerY);
            matrix.postRotate(90 * (rotation - 2), centerX, centerY);
        } else if (rotation == Surface.ROTATION_180) {
            Log.i("TAGggg", "rotation  --- : " + rotation);
            matrix.postRotate(180, centerX, centerY);
        }
        textureView.setTransform(matrix);
    }


    @SuppressLint("Range")
    public String getimagepath() {
        String imagePath = new String();
        String videoPath = new String();
        String path = null;
        boolean photoIsEmpty = false;
        boolean videoIsEmpty = false;
        int imageDateAdd = 0;
        int videoDateAdd = 0;
        ContentResolver mResolver = getActivity().getContentResolver();
        //Needs to cover both photos and videos, i.e. query two cursors, one for photos and one for videos

        String[] imageProjection = new String[]{MediaStore.Images.Media.DATA, MediaStore.Images.Media.DATE_ADDED, "MAX(" + MediaStore.Images.Media.DATE_ADDED + ")"};
        Cursor imageCursor = mResolver.query(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, imageProjection, MediaStore.Images.Media.DATA + " like ?", new String[]{"%.jpg"}, MediaStore.Images.Media.DEFAULT_SORT_ORDER);
        imageCursor.moveToFirst();

        while (!imageCursor.isAfterLast()) {
            if (imageCursor.getString(imageCursor.getColumnIndex(MediaStore.Images.Media.DATA)) == null) {
                photoIsEmpty = true;
                break;
            }
            Log.i("image", imageCursor.getString(imageCursor.getColumnIndex(MediaStore.Images.Media.DATA)));

            imagePath = imageCursor.getString(imageCursor.getColumnIndex(MediaStore.Images.Media.DATA));
            imageDateAdd = imageCursor.getInt(imageCursor.getColumnIndex(MediaStore.Images.Media.DATE_ADDED));
            System.out.println(imageDateAdd);
            imageCursor.moveToNext();

        }
        imageCursor.close();
        /*Query the video entries*/
        String[] videoProjection = new String[]{MediaStore.Video.Media.DATA, MediaStore.Video.Media.DATE_ADDED, "MAX(" + MediaStore.Video.Media.DATE_ADDED + ")"};
        Cursor videoCursor = mResolver.query(MediaStore.Video.Media.EXTERNAL_CONTENT_URI, videoProjection, null, null, MediaStore.Video.Media.DEFAULT_SORT_ORDER);
        videoCursor.moveToFirst();
        int count = videoCursor.getCount();
        while (!videoCursor.isAfterLast()) {
            if (videoCursor.getString(videoCursor.getColumnIndex(MediaStore.Video.Media.DATA)) == null) {
                videoIsEmpty = true;
                break;
            }
            Log.i("video", videoCursor.getString(videoCursor.getColumnIndex(MediaStore.Video.Media.DATA)));
            videoPath = videoCursor.getString(videoCursor.getColumnIndex(MediaStore.Video.Media.DATA));
            videoDateAdd = videoCursor.getInt(videoCursor.getColumnIndex(MediaStore.Video.Media.DATE_ADDED));
            System.out.println(videoDateAdd);
            videoCursor.moveToNext();
        }
        videoCursor.close();
        //Decide whether the newest entry is a photo or a video
        //If either one is empty, just return the other
        if (photoIsEmpty) {
            if (videoIsEmpty) {
                //Both photos and videos are empty
                path = null;
            } else {
                //No photos, but there are videos
                path = videoPath;
            }
        } else {
            if (videoIsEmpty) {
                //There are photos, but no videos
                path = imagePath;
            } else {
                //Neither is empty, so compare which was added more recently
                if (imageDateAdd >= videoDateAdd) {
                    path = imagePath;
                } else {
                    path = videoPath;
                }
            }
        }
        lastImagePath = path;
        return path;

    }

    //Compute the zoom step sizes
    private void getZoomStep(CameraCharacteristics cameraCharacteristics) {
        //Get the active sensor array size
        rect = cameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
        //Get the maximum digital zoom
        max_digital_zoom = cameraCharacteristics.get(CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM);
        //Now compute the minimum crop width and height
        minWidth = rect.width() / max_digital_zoom;
        minHeigh = rect.height() / max_digital_zoom;
        //Compute the step sizes
        stepWidth = (rect.width() - minWidth) / maxZoom / 2;
        stepHeigh = (rect.height() - minHeigh) / maxZoom / 2;
    }

    private void setupCamera(int width, int height) {
        Log.d(TAG, "setupCamera: success");
        cameraManager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
        try {
            String[] cameraIdList = cameraManager.getCameraIdList();
            for (String cameraId : cameraIdList) {
                CameraCharacteristics cameraCharacteristics = cameraManager.getCameraCharacteristics(cameraId);
                if (cameraCharacteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {
                    continue;
                }
                StreamConfigurationMap map = cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                //Get the supported capabilities
                int[] cameraCapabilities = cameraCharacteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
                for (int cameraCapability : cameraCapabilities) {
                    if (cameraCapability == CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_PRIVATE_REPROCESSING||cameraCapability==CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_YUV_REPROCESSING) {
                        canReprocess = true;
                    }
                }
                Log.d("CameraCharacteristics", "Camera can reprocess: " + canReprocess);
                Log.d("CameraCharacteristics", cameraCharacteristics.get(INFO_SUPPORTED_HARDWARE_LEVEL) + "");
                //All resolutions the camera supports; next pick the most suitable one
                if (map != null) {
                    Size[] outputSizes = map.getOutputSizes(SurfaceTexture.class);
                    Size size = getOptimalSize(outputSizes, width, height);
                    previewSize = size;
                    maxJpegOutputSize = getMaxOutputSize(map, ImageFormat.JPEG);
                    minJpegOutputSize = getMinOutputSize(map, ImageFormat.JPEG);
                    //Get the YUV output and input sizes used for reprocessing
                    if (canReprocess){
                        maxprivateOutputSize = getMaxOutputSize(map, ImageFormat.YUV_420_888);
                        minprivateOutputSize = getMinOutputSize(map, ImageFormat.YUV_420_888);

                        maxprivateIutputSize=getMaxInputSize(map,ImageFormat.YUV_420_888);
                        minprivateIutputSize = getMinInputSize(map, ImageFormat.YUV_420_888);
                    }
                }

                getZoomStep(cameraCharacteristics);

                if (mCameraId == null) {
                    mCameraId = cameraId;
                }
                /*Check the camera's orientation*/
                if (mCameraId.equals("0")) {
                    rear();
                    //configureTransform(width, height);
                } else {
                    front();
                    //configureTransform(width, height);
                }
                break;
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
    //Get the largest supported input size
    private Size getMaxInputSize(StreamConfigurationMap map, int format) {
        Size maxInputSize = null;
        Size[] InputSizes = null;
        if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.M) {
            InputSizes = map.getInputSizes(format);
        }
        System.out.println("++++++++++++1");
        for (Size privateOutputSize : InputSizes) {
            System.out.println(privateOutputSize.getWidth()+":"+privateOutputSize.getHeight());
        }
        //Find the largest of the input sizes
        if (InputSizes.length > 0) {
            maxInputSize = findMaxSize(InputSizes);
        }
        return maxInputSize;
    }

    //Get the smallest supported input size
    public Size getMinInputSize(StreamConfigurationMap map,int format){
        Size minInputSize = null;
        Size[] inputSize = null;
        if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.M) {
            inputSize = map.getInputSizes(format);
        }
        //Find the smallest of the input sizes
        if (inputSize.length > 0) {
            minInputSize = findMinSize(inputSize);
        }
        return minInputSize;
    }

    //Get the smallest supported output size
    private Size getMinOutputSize(StreamConfigurationMap map, int format) {
        Size minOutputSize = null;
        Size[] OutputSizes = map.getOutputSizes(format);
        //Find the smallest of the output sizes
        if (OutputSizes.length > 0) {
            minOutputSize = findMinSize(OutputSizes);
        }
        return minOutputSize;
    }

    //Get the largest supported output size
    private Size getMaxOutputSize(StreamConfigurationMap map, int format) {
        Size maxOutputSize = null;
        Size[] OutputSizes = map.getOutputSizes(format);
        /*System.out.println("++++++++++++2");
        for (Size privateOutputSize : OutputSizes) {
            System.out.println(privateOutputSize.getWidth()+":"+privateOutputSize.getHeight());
        }*/
        //Find the largest of the output sizes
        if (OutputSizes.length > 0) {
            maxOutputSize = findMaxSize(OutputSizes);
        }
        return maxOutputSize;
    }

    //Find the size with the smallest area
    private Size findMinSize(Size[] Sizes) {
        Size minSize = Sizes[0];
        for (int i = 1; i < Sizes.length; i++) {
            if (Sizes[i].getHeight() * Sizes[i].getWidth() < minSize.getWidth() * minSize.getHeight()) {
                minSize = Sizes[i];
            }
        }
        return minSize;
    }

    //Find the size with the largest area
    private Size findMaxSize(Size[] Sizes) {
        Size maxSize = Sizes[0];
        for (int i = 1; i < Sizes.length; i++) {
            if (Sizes[i].getHeight() * Sizes[i].getWidth() > maxSize.getWidth() * maxSize.getHeight()) {
                maxSize = Sizes[i];
            }
        }
        return maxSize;
    }

    //Open the camera
    private void openCamera() {
        Log.d(TAG, "openCamera: success");
        CameraManager cameraManager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
        try {
            if (ActivityCompat.checkSelfPermission(getContext(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                // TODO: Consider calling
                //    ActivityCompat#requestPermissions
                // here to request the missing permissions, and then overriding
                //   public void onRequestPermissionsResult(int requestCode, String[] permissions,
                //                                          int[] grantResults)
                // to handle the case where the user grants the permission. See the documentation
                // for ActivityCompat#requestPermissions for more details.
                AlertDialog.Builder builder = new AlertDialog.Builder(getContext());
                builder.setMessage("该应用需要相机授权,点击授权按钮跳转到设置进行设置");
                builder.setNegativeButton("取消", new DialogInterface.OnClickListener() {
                    @Override
                    public void onClick(DialogInterface dialog, int which) {
                        getActivity().finish();
                    }
                });
                builder.setPositiveButton("设置", new DialogInterface.OnClickListener() {
                    @Override
                    public void onClick(DialogInterface dialog, int which) {
                        Intent intent = new Intent("android.settings.APPLICATION_DETAILS_SETTINGS");
                        intent.setData(Uri.fromParts("package", getActivity().getPackageName(), null));
                        startActivity(intent);
                    }
                }).create().show();
                return;
            }
            getPermission();
            cameraManager.openCamera(mCameraId, stateCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    //Camera device state callback
    private CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            Log.d(TAG, "onOpened: success");
            mCameraDevice = camera;
            //Start the preview
            startPreview();
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            Toast.makeText(getContext(), "摄像头设备连接失败", Toast.LENGTH_SHORT).show();
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            Toast.makeText(getContext(), "摄像头设备连接出错", Toast.LENGTH_SHORT).show();
        }
    };


    //Preview setup
    private void startPreview() {
        Log.d(TAG, "startPreview: success");
        //Set up the image readers
        setImageReader();
        //Note: SurfaceTexture and SurfaceView are two different things!
        //The SurfaceTexture is a key property of the TextureView
        SurfaceTexture surfaceTexture = textureView.getSurfaceTexture();
        //Set the TextureView's buffer size, i.e. the preview resolution
        surfaceTexture.setDefaultBufferSize(1920, 1080);
        //Surface that displays the preview frames
        mPreviewSurface = new Surface(surfaceTexture);
        //Surface used for saving pictures
        jpegImageReaderSurface = jpegImageReader.getSurface();
        if (canReprocess){
            //Surface that caches frames for reprocessing
            privateImageReaderSurface = privateImageReader.getSurface();
        }
        //Create the CaptureRequest
        setPreviewRequest();
        //Create the capture session
        /*The surface list holds the output streams; each display target needs its own output stream.
        For taking photos there are two outputs: one for preview and one for the photo.
        For recording video there are two outputs: one for preview and one for recording.*/
        // previewSurface is used for preview, mImageReader.getSurface() for capture
        try {
            if (canReprocess) {
                if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
                    InputConfiguration inputConfiguration = new InputConfiguration(maxprivateIutputSize.getWidth(), maxprivateIutputSize.getHeight(), ImageFormat.YUV_420_888);
                    /*The privateImageReaderSurface resolution must stay below 1080p; this is the input reader*/
                    mCameraDevice.createReprocessableCaptureSession(inputConfiguration,
                            Arrays.asList(mPreviewSurface, jpegImageReaderSurface, privateImageReaderSurface),
                            new CameraCaptureSession.StateCallback() {
                                @Override
                                public void onConfigured(@NonNull CameraCaptureSession session) {
                                    mCaptureSession = session;
                                    recaptureImageWriter = ImageWriter.newInstance(session.getInputSurface(), 11);
                                    //recaptureImageWriter = ImageWriter.newInstance(new Surface(surfaceTexture), 11);
                                    setRepeatCapture(mPreviewRequestBuilder);
                                }

                                @Override
                                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                                    System.out.println("失败啦");
                                    //openCamera();
                                }
                            }
                            , null);
                }
            } else {
                mCameraDevice.createCaptureSession(Arrays.asList(mPreviewSurface, jpegImageReaderSurface), new CameraCaptureSession.StateCallback() {
                    @Override
                    public void onConfigured(@NonNull CameraCaptureSession session) {
                        //Called once the session has been configured successfully
                        mCaptureSession = session;
                        setRepeatCapture(mPreviewRequestBuilder);
                    }

                    @Override
                    public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                        Log.e(TAG, "createCaptureSession: onConfigureFailed");
                    }
                }, null);
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }

    }

    //Set up the ImageReaders
    private void setImageReader() {
        Log.d(TAG, "setImageReader: success");
        //Create the ImageReader that receives the final picture
        if (canReprocess){
            //With reprocessing, the output reader is created in YUV_420_888 at the maximum supported output size
            jpegImageReader = ImageReader.newInstance(maxprivateOutputSize.getWidth(), maxprivateOutputSize.getHeight(), ImageFormat.YUV_420_888, CIRCULAR_BUFFER_SIZE+1);
        }
        else {
            jpegImageReader = ImageReader.newInstance(previewSize.getWidth(), previewSize.getHeight(), ImageFormat.JPEG, CIRCULAR_BUFFER_SIZE+1);
        }
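        //Whichever format was chosen above, this listener fires when the final picture arrives and hands it to ImageSaver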
        jpegImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader reader) {
                System.out.println("真的走到这了");
                Image image = reader.acquireLatestImage();
                //进行保存图片,开一个线程进行保存
                ImageSaver imageSaver = new ImageSaver(getContext(), image);
                new Thread(imageSaver).start();
                Toast.makeText(getContext(), "保存图片成功", Toast.LENGTH_SHORT).show();
            }
        }, null);
        //Create the second ImageReader that backs the ZSL frame cache
        if (canReprocess){
            //This part matters: for ZSL, Google expects a queue of 10+ images, which is memory hungry,
            //so the resolution should not go above 1080p.
            //privateImageReader = ImageReader.newInstance(1920, 1080, ImageFormat.PRIVATE, CIRCULAR_BUFFER_SIZE);
            //(Creating it at the maximum input size, as below, happens to work again on my test device.)
            privateImageReader = ImageReader.newInstance(maxprivateIutputSize.getWidth(), maxprivateIutputSize.getHeight(), ImageFormat.YUV_420_888, CIRCULAR_BUFFER_SIZE+1);

            privateImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
                @Override
                public void onImageAvailable(ImageReader reader) {
                    Image image = reader.acquireNextImage();
                    //System.out.println(image.getWidth()+":"+image.getHeight()+":"+image.getFormat());
                    //Ideally every frame would be added to an image buffer here:
                    //zslCoordinator.getImageBuffer().add(image);
                    /*TODO*/
                    /*if (null == debugResult || null == debugImage) {
                        if (null != debugResult) {
                            debugImage = image;
                            ZSLPair tempPair = new ZSLPair(image, debugResult);
                            recaptureRequest(tempPair);
                        } else {
                            debugImage = image;
                        }
                    }*/
                    //For now only the most recent frame is kept; older frames must be closed,
                    //otherwise the ImageReader queue fills up and the camera stalls.
                    if (lastImage != null){
                        lastImage.close();
                    }
                    lastImage = image;
                }
            }, null);
        }


    }

    //Pick the size from sizeMap that is larger than, and closest to, width x height
    private Size getOptimalSize(Size[] outputSizes, int width, int height) {
        Size tempSize = new Size(width, height);
        List<Size> sizes = new ArrayList<>();
        for (Size outputSize : outputSizes) {
            if (width > height) {
                //Landscape (or tablet-style) layout
                if (outputSize.getHeight() > height && outputSize.getWidth() > width) {
                    sizes.add(outputSize);
                }
            } else {
                //Portrait layout: sensor sizes are landscape, so compare the swapped dimensions
                if (outputSize.getWidth() > height && outputSize.getHeight() > width) {
                    sizes.add(outputSize);
                }
            }
        }
        if (sizes.size() > 0) {
            //If several sizes qualify, pick the one whose pixel count is closest to the requested resolution
            tempSize = sizes.get(0);
            int minnum = Integer.MAX_VALUE;
            for (Size size : sizes) {
                int num = size.getWidth() * size.getHeight() - width * height;
                if (num < minnum) {
                    minnum = num;
                    tempSize = size;
                }
            }
        }
        return tempSize;
        /*if (sizes.size() > 0) {
            return Collections.min(sizes, new Comparator<Size>() {
                @Override
                public int compare(Size size, Size t1) {
                    return Long.signum(size.getWidth() * size.getHeight() - t1.getWidth() * t1.getHeight());
                }
            });
        }
        return outputSizes[0];*/

    }

    /*Click handlers for the Fragment's controls*/
    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.change:
                changeLens();
                break;
            case R.id.takePicture:
                capturePicture();
                break;
            case R.id.image_show:
                openAlbum();
                break;
            case R.id.flashstate:
                //Toggle the flash mode
                changeFlashState();
                break;
        }
    }

    //Cycle through the flash modes
    private void changeFlashState() {
        currentFlashState++;
        if (currentFlashState >= 4) {
            currentFlashState = 0;
        }
        Bitmap bitmap = BitmapFactory.decodeResource(getResources(), flashMap.get(currentFlashState));
        flashState.setImageBitmap(bitmap);
        changeAEMode();
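        //Restart the repeating preview request so the new AE/flash settings take effect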
        setRepeatCapture(mPreviewRequestBuilder);
    }

    //打开相册
    private void openAlbum() {
        if (lastImagePath == null) {
            Toast.makeText(getContext(), "相册无照片", Toast.LENGTH_SHORT).show();
        } else {
            Intent intent = new Intent(getContext(), PhotoAlbumActivity.class);
            intent.putExtra("lastImage", lastImagePath);
            startActivity(intent);
        }

    }

    public void capturePicture() {
        //Taking a picture needs its own CaptureRequest
        if (isFinishSavePhoto) {
            isFinishSavePhoto = false;
            if (canReprocess){
                //Reprocessing is supported: go down the ZSL path
                /*ZSLPair bestFrame = zslCoordinator.getBestFrame();
                if (bestFrame!=null){
                    recaptureRequest(bestFrame);
                    zslCoordinator.getImageBuffer().remove(bestFrame.getImage());
                    zslCoordinator.getResultBuffer().remove(bestFrame.getResult());
                }
                else {
                    try {
                        throw new Exception("No Best frame found");
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }*/
                recaptureRequest(null);
            }
            else {
                try {
                    CaptureRequest.Builder previewRequest = getPreviewRequest();
                    CaptureRequest.Builder captureRequest = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
                    captureRequest.addTarget(jpegImageReader.getSurface());
                    previewRequest.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                    // Decide whether the flash is needed
                    // (this block is not quite right yet, so it is left disabled for now)
                    /*switch (currentFlashState) {
                        case FLASH_ON:
                            captureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
                            captureRequest.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
                            break;
                        case FLASH_AUTO:
                            captureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
                            break;
                        case FLASH_OFF:
                            captureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
                            captureRequest.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
                    }*/

                    // Rotate the saved picture to match the device orientation so it ends up "upright"
                    int orientation = getActivity().getWindowManager().getDefaultDisplay().getRotation();
                    int requestedOrientation = getActivity().getRequestedOrientation();
                    //In testing the camera orientation always came back as 0
                    /*System.out.println("Camera orientation:");
                    android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
                    int cameraOrientation = info.orientation;
                    System.out.println(cameraOrientation);*/

                    Log.d(TAG, "display rotation: " + orientation);
                    Log.d(TAG, "sensor angle: " + myOrientoinListener.angle
                            + " -> JPEG orientation: " + ORIENTATIONS.get(myOrientoinListener.angle));

                    captureRequest.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(myOrientoinListener.angle));
                    captureRequest.set(CaptureRequest.SCALER_CROP_REGION, zoomRect);
                    captureRequest.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
                    /*No need to stop the preview: firing a single capture while a repeating request is
                    active pauses the repeating request, and it resumes on its own once the capture finishes.*/
                    //mCaptureSession.stopRepeating();
                    CaptureRequest takePictureRequest = captureRequest.build();
                    mCaptureSession.capture(takePictureRequest, new CameraCaptureSession.CaptureCallback() {
                        @Override
                        public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                            super.onCaptureCompleted(session, request, result);
                            //setRepeatCapture();
                        }
                    }, null);


                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }

        }


    }

    private void changeLens() {
        Log.d(TAG, "changeLens: success");
        if (mCameraId.equals(String.valueOf(CameraCharacteristics.LENS_FACING_BACK))) {
            mCameraId = String.valueOf(CameraCharacteristics.LENS_FACING_FRONT);
            rear();
        } else {
            if (mCameraId.equals(String.valueOf(CameraCharacteristics.LENS_FACING_FRONT))) {
                mCameraId = String.valueOf(CameraCharacteristics.LENS_FACING_BACK);
                front();
            }
        }
        CameraManager cameraManager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
        try {
            CameraCharacteristics cameraCharacteristics = cameraManager.getCameraCharacteristics(mCameraId);
            StreamConfigurationMap map = cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            Size[] outputSizes = map.getOutputSizes(SurfaceTexture.class);
            Size optimalSize = getOptimalSize(outputSizes, textureView.getWidth(), textureView.getHeight());
            previewSize = optimalSize;
            rect = cameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
            getZoomStep(cameraCharacteristics);
            //Reset the zoom back to its initial state
            zoomRect = rect;
            currentZoom = 0;
            mPreviewRequestBuilder.set(CaptureRequest.SCALER_CROP_REGION, zoomRect);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        //Close the current camera before reopening it with the new camera id
        closeCamera();
        openCamera();
    }


    //Worker that saves a captured Image to a JPEG file
    public class ImageSaver implements Runnable {
        private Context context;
        private Image image;

        public ImageSaver(Context context, Image image) {
            Log.d(TAG, "ImageSaver: success");
            this.context = context;
            this.image = image;
        }

        @Override
        public void run() {
            Image.Plane[] planes = image.getPlanes();
            ByteBuffer buffer = planes[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            //Un-mirror pictures taken with the front-facing camera
            if (mCameraId.equals("1")) {
                Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                Matrix m = new Matrix();
                m.postScale(-1, 1); // horizontal mirror flip
                bitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), m, true);
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                bitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos);
                bytes = baos.toByteArray();
            }

            String fileName = Environment.getExternalStorageDirectory() + "/DCIM/Camera/" + System.currentTimeMillis() + ".jpg";
            File file = new File(fileName);
            FileOutputStream fileOutputStream = null;
            try {
                //Write the picture to disk
                fileOutputStream = new FileOutputStream(file);
                fileOutputStream.write(bytes, 0, bytes.length);

            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                if (fileOutputStream != null) {
                    try {
                        fileOutputStream.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
                //Broadcast so the system gallery rescans the new file
                notiBroadcast();
                //Post back to the main thread through the handler to update the UI
                Message message = new Message();
                message.what = 0;
                Bundle bundle = new Bundle();
                bundle.putString("path", fileName);
                message.setData(bundle);
                handler.sendMessage(message);
                //The Image wraps a buffer from the ImageReader queue and must be closed,
                //otherwise the reader runs out of buffers (maxImages) and later captures fail.
                image.close();
            }
        }
    }

    private Handler handler = new Handler(Looper.myLooper()) {
        @Override
        public void handleMessage(Message msg) {
            super.handleMessage(msg);
            if (msg.what == 0) {
                Bundle pathdata = msg.getData();
                String path = (String) pathdata.get("path");
                imageList.add(path);
                //Update the thumbnail shown in the bottom-left corner of the capture screen
                setLastImagePath(path);
                isFinishSavePhoto = true;
            }
        }
    };

    private void notiBroadcast() {
        String path = Environment.getExternalStorageDirectory() + "/DCIM/";
        Intent intent = new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE);
        Uri uri = Uri.fromFile(new File(path));
        intent.setData(uri);
        getContext().sendBroadcast(intent);
    }

    private void setLastImagePath(String path) {
        Log.d(TAG, "setLastImagePath: success");
        //First check that the app still has read permission
        if (ActivityCompat.checkSelfPermission(getContext(), Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
            return;
        }
        Bitmap bitmap = BitmapFactory.decodeFile(path);
        lastImagePath = path;
        mImageView.setImageBitmap(bitmap);

    }

    //Build the preview request separately so the ZSL target and 3A settings can be configured in one place
    private void setPreviewRequest() {
        try {
            if (canReprocess) {
                mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_ZERO_SHUTTER_LAG);
            } else {
                mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            }
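            //TEMPLATE_ZERO_SHUTTER_LAG tunes the request for ZSL use: it aims for maximum still image
            //quality without compromising the preview frame rate.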
            mPreviewRequestBuilder.addTarget(mPreviewSurface);
            //When reprocessing is available, also send every preview frame to the ZSL cache surface
            if (privateImageReaderSurface != null) {
                mPreviewRequestBuilder.addTarget(privateImageReaderSurface);
            }
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_IDLE);
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_ANTIBANDING_MODE, CaptureRequest.CONTROL_AE_ANTIBANDING_MODE_AUTO);
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);

            changeAEMode();

        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void changeAEMode() {
        switch (currentFlashState) {
            case FLASH_AUTO:
                //Qualcomm platforms
                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
                mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_SINGLE);
                //MTK platforms
                /*mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
                mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);*/
                break;
            case FLASH_ON:
                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
                mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_SINGLE);
                break;
            case FLASH_OFF:
                //Flash disabled
                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
                mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_OFF);
                break;
            case FLASH_TORCH:
                //Keep the flash on continuously, usually as a fill light
                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
                mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_TORCH);
                break;
        }
    }

    private CaptureRequest.Builder getPreviewRequest() {
        if (mPreviewRequestBuilder != null) {
            return mPreviewRequestBuilder;
        } else {
            try {
                mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
            mPreviewRequestBuilder.addTarget(mPreviewSurface);
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_ANTIBANDING_MODE, CaptureRequest.CONTROL_AE_ANTIBANDING_MODE_AUTO);
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_IDLE);
            return mPreviewRequestBuilder;
        }
    }


    private void setRepeatCapture(CaptureRequest.Builder capture) {
        Log.d(TAG, "setRepeatCapture: success");
        mPreviewRequestBuilder.setTag(TAG);
        //Call order: the CameraDevice creates a CaptureRequest.Builder; once the parameters are set,
        //build() produces the CaptureRequest that the session will repeat.
        mPreviewRequest = capture.build();
        //The session needs this CaptureRequest for setRepeatingRequest
        try {
            if (canReprocess) {
                mCaptureSession.setRepeatingRequest(mPreviewRequest, new CameraCaptureSession.CaptureCallback() {
                    @Override
                    public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                        super.onCaptureCompleted(session, request, result);
                        //Ideally the CaptureResult of every preview frame would be buffered here:
                        //zslCoordinator.getResultBuffer().add(result);
                        //Debug code for pairing a CaptureResult with its Image:
                        /*if (null == debugResult || null == debugImage) {
                            if (null != debugImage) {
                                debugResult = result;
                                ZSLPair tempPair = new ZSLPair(debugImage, result);
                                recaptureRequest(tempPair);
                            } else {
                                debugResult = result;
                            }
                        }*/
                        //For now only the most recent TotalCaptureResult is kept
                        lastTotalResult = result;
                    }
                }, null);
            } else {
                mCaptureSession.setRepeatingRequest(mPreviewRequest, new CameraCaptureSession.CaptureCallback() {
                    @Override
                    public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                        super.onCaptureCompleted(session, request, result);
                        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                        //The AE state can be inspected here if needed
                        //Log.d(TAG, "AE_STATE: " + aeState);
                    }
                }, null);
            }

        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

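    //Note: the reprocess request below is built from the TotalCaptureResult saved by the repeating
    //preview request, and it must describe the same frame that gets queued into the ImageWriter.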
    private void recaptureRequest(ZSLPair pair) {
        captureStart = System.currentTimeMillis();

        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            try {
                //Build a reprocess request from the CaptureResult of the cached frame
                CaptureRequest.Builder reprocessCaptureRequest = mCameraDevice.createReprocessCaptureRequest(lastTotalResult);
                reprocessCaptureRequest.addTarget(jpegImageReaderSurface);
                //planes = pair.getImage().getPlanes();
                //The Image handed to queueInputImage must match what the session expects on its input
                //surface; in my tests dequeueInputImage and the cached image did not match, which is very
                //likely why the reprocess capture fails.
                //Image image = recaptureImageWriter.dequeueInputImage();
                //System.out.println(image.getWidth()+":"+image.getHeight()+":"+image.getFormat());
                Log.d(TAG, "cached frame: " + lastImage.getWidth() + "x" + lastImage.getHeight() + " format " + lastImage.getFormat());
                /*Alternative: try saving the cached frame directly*/
                //saveImageToJPEG(lastImage);
                //lastImage is in YUV_420_888
                recaptureImageWriter.queueInputImage(lastImage);
                //pair.getImage().close();
                //reprocessCaptureRequest.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_STILL_CAPTURE);
                mCaptureSession.capture(reprocessCaptureRequest.build(), new CameraCaptureSession.CaptureCallback() {
                    @Override
                    public void onCaptureStarted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, long timestamp, long frameNumber) {
                        super.onCaptureStarted(session, request, timestamp, frameNumber);
                        System.out.println("onCaptureStarted");
                    }

                    @Override
                    public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
                        super.onCaptureProgressed(session, request, partialResult);
                        System.out.println("onCaptureProgressed");
                    }

                    @Override
                    public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                        System.out.println("onCaptureCompleted");
                        super.onCaptureCompleted(session, request, result);
                    }

                    @Override
                    public void onCaptureFailed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
                        super.onCaptureFailed(session, request, failure);
                        System.out.println("onCaptureFailed");
                        boolean b = failure.wasImageCaptured();
                        System.out.println(b);

                    }

                    @Override
                    public void onCaptureSequenceCompleted(@NonNull CameraCaptureSession session, int sequenceId, long frameNumber) {
                        super.onCaptureSequenceCompleted(session, sequenceId, frameNumber);
                        System.out.println("onCaptureSequenceCompleted");
                        //进行保存图片,开一个线程进行保存
                        /*ImageSaver imageSaver = new ImageSaver(getContext(), pair.getImage());
                        new Thread(imageSaver).start();
                        Toast.makeText(getContext(), "保存图片成功", Toast.LENGTH_SHORT).show();*/
                    }

                    @Override
                    public void onCaptureSequenceAborted(@NonNull CameraCaptureSession session, int sequenceId) {
                        super.onCaptureSequenceAborted(session, sequenceId);
                        System.out.println("onCaptureSequenceAborted");
                    }

                    @Override
                    public void onCaptureBufferLost(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull Surface target, long frameNumber) {
                        super.onCaptureBufferLost(session, request, target, frameNumber);
                        System.out.println("onCaptureBufferLost");
                        System.out.println(target);
                    }
                }, null);
                //System.out.println("ok");
                isFinishSavePhoto=true;
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
    }

    private void saveImageToJPEG(Image image) {
        Thread thread = new Thread(new Runnable() {
            @Override
            public void run() {
                int planes = image.getPlanes().length;
                Log.i(TAG,"length of getPlanes:"+planes);

                ByteBuffer[] buffers = new ByteBuffer[3];
                int[] number_buffers = new int[3];
                byte[][] bytes_plans = new byte[3][];
                for (int i = 0; i < planes; i++) {
                    buffers[i] = image.getPlanes()[i].getBuffer();
                    number_buffers[i] = buffers[i].remaining();
                    Log.i(TAG,"测试相机 imagereader 回调数据大小 planes["+i+"]:"+number_buffers[i]);
                    bytes_plans[i] = new byte[number_buffers[i]];
                    buffers[i].get(bytes_plans[i]);
                    //System.out.println(Arrays.toString(bytes_plans[i]));
                }
                if (planes > 1){
                    //Merge the Y plane and plane 2 into an NV21 buffer.
                    //(This assumes plane 2 is already VU-interleaved with pixelStride 2, which is
                    //device-dependent for YUV_420_888.)
                    byte[] bytes = new byte[buffers[0].remaining()];
                    byte[] yuv_buffer = byteMergerAll(bytes_plans[0], bytes_plans[2]);
                    Log.i(TAG,"yuv_buffer size is :" + yuv_buffer.length);
                    if (yuv_buffer != null){
                        Bitmap Jpeg_bitmap = YuvTransformJpeg(yuv_buffer, 1920, 1080, 80);
                        ByteArrayOutputStream baos = new ByteArrayOutputStream();
                        Jpeg_bitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos);
                        bytes = baos.toByteArray();
                    }
                    String fileName = Environment.getExternalStorageDirectory() + "/DCIM/Camera/" + System.currentTimeMillis() + ".jpg";
                    File file = new File(fileName);
                    FileOutputStream fileOutputStream = null;
                    //Write the JPEG to disk
                    try {
                        fileOutputStream = new FileOutputStream(file);
                        fileOutputStream.write(bytes, 0, bytes.length);
                    } catch (FileNotFoundException e) {
                        e.printStackTrace();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }finally {
                        if (fileOutputStream!=null){
                            try {
                                fileOutputStream.close();
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                        }
                    }
                    //Broadcast so the system gallery rescans the new file
                    notiBroadcast();
                    //Post back to the main thread through the handler to update the UI
                    Message message = new Message();
                    message.what = 0;
                    Bundle bundle = new Bundle();
                    bundle.putString("path", fileName);
                    message.setData(bundle);
                    handler.sendMessage(message);
                    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
                        recaptureImageWriter.queueInputImage(lastImage);
                    }
                }
            }
        });
        thread.start();


    }

    private static Bitmap YuvTransformJpeg(byte[] data, int width, int height, int quality) {
        Log.i(TAG,"Converting YUV to JPEG");
        try {
            YuvImage image_jpeg =
                    new YuvImage(data, ImageFormat.NV21, width, height, null);
            ByteArrayOutputStream stream = new ByteArrayOutputStream();
            image_jpeg.compressToJpeg(new Rect(0, 0, width, height), quality, stream);
            Bitmap jpeg_bitmap =
                    BitmapFactory.decodeByteArray(stream.toByteArray(), 0, stream.toByteArray().length);
            Log.i(TAG,"Yuv转换Jpeg完成");

            return jpeg_bitmap;
        } catch (Exception e) {
            Log.e(TAG,"failed :"+e);
        }
        return null;
    }

    private static byte[] byteMergerAll(byte[]... values) {
        int length_byte = 0;
        for (int i = 0; i < values.length; i++) {
            length_byte += values[i].length;
        }
        byte[] all_byte = new byte[length_byte];
        int countLength = 0;
        for (int i = 0; i < values.length; i++) {
            byte[] b = values[i];
            System.arraycopy(b, 0, all_byte, countLength, b.length);
            countLength += b.length;
        }
        return all_byte;
    }


    /*private void setZoomCaptrue(){
        mPreviewRequest = mPreviewRequestBuilder.build();
        try {
            mCaptureSession.capture(mPreviewRequest, new CameraCaptureSession.CaptureCallback() {
                @Override
                public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                    super.onCaptureCompleted(session, request, result);
                }
            }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }*/
}
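
The biggest missing piece above is the frame cache itself: the commented-out zslCoordinator / ZSLPair calls show the intended design, but in the end only the single lastImage / lastTotalResult pair is kept. For reference, here is a minimal sketch of what such a coordinator could look like. The class name, method names and buffer size below are my own assumptions and are not part of the demo; the only Camera2 facts it relies on are that Image.getTimestamp() and CaptureResult.SENSOR_TIMESTAMP come from the same sensor clock, so a cached frame can be matched to its metadata by timestamp.

//A rough sketch only; ZslFrameBuffer, ZslPair and MAX_FRAMES are hypothetical names.
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.media.Image;
import java.util.ArrayDeque;

public class ZslFrameBuffer {

    public static class ZslPair {
        public final Image image;
        public final TotalCaptureResult result;
        ZslPair(Image image, TotalCaptureResult result) {
            this.image = image;
            this.result = result;
        }
    }

    //Keep fewer frames than the ImageReader's maxImages, otherwise acquireNextImage starts failing
    private static final int MAX_FRAMES = 10;
    private final ArrayDeque<Image> images = new ArrayDeque<>();
    private final ArrayDeque<TotalCaptureResult> results = new ArrayDeque<>();

    //Called from the private ImageReader's onImageAvailable
    public synchronized void addImage(Image image) {
        if (images.size() >= MAX_FRAMES) {
            //Drop (and close!) the oldest frame so the reader never runs out of buffers
            images.removeFirst().close();
        }
        images.addLast(image);
    }

    //Called from the repeating preview request's onCaptureCompleted
    public synchronized void addResult(TotalCaptureResult result) {
        if (results.size() >= MAX_FRAMES) {
            results.removeFirst();
        }
        results.addLast(result);
    }

    //Return the newest cached frame together with its metadata, matched by sensor timestamp
    public synchronized ZslPair getBestFrame() {
        Image newest = images.peekLast();
        if (newest == null) return null;
        for (TotalCaptureResult result : results) {
            Long ts = result.get(CaptureResult.SENSOR_TIMESTAMP);
            if (ts != null && ts.longValue() == newest.getTimestamp()) {
                //Hand ownership of the Image to the caller, who must queue or close it
                images.removeLast();
                return new ZslPair(newest, result);
            }
        }
        return null;
    }
}

With something like this in place, capturePicture() would call getBestFrame(), queue the returned Image into recaptureImageWriter, and build the reprocess request from the paired TotalCaptureResult instead of from lastTotalResult. Whatever is taken out of the buffer has to be either queued for reprocessing or closed, or the ImageReader queue fills up and preview frames stop arriving.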

 
