Building a Camera App with camera2

Before Android 5.0, building your own camera app required the android.hardware.Camera class. Android 5.0 introduced the android.hardware.camera2 package to replace that legacy class. This article focuses on building an app with the camera2 package; if you want to learn how camera apps were built before Android 5.0, see the official documentation.

I. Overview of android.hardware.camera2

The official documentation describes it as follows: the android.hardware.camera2 package provides an interface to individual camera devices connected to an Android device, replacing the deprecated Camera class. The package models a camera device as a pipeline, which takes in requests to capture a single frame or a single image, and outputs a capture-result metadata packet along with a set of output image buffers for the request. Requests are processed in order, and multiple requests can be in flight at once.

II. Usage Steps

1. Get the target camera and set its parameters

To find the available camera devices, first obtain a CameraManager object:

 CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);

Each CameraDevice provides a set of static properties describing the hardware, its settings, and its output parameters. These properties are exposed through a CameraCharacteristics object, which in turn is obtained from CameraManager's getCameraCharacteristics(String) method. Most Android devices today have two cameras, one front-facing and one rear-facing. The following code finds the rear camera:

CameraManager cameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);

try {
    for (String cameraId : cameraManager.getCameraIdList()) {

        CameraCharacteristics cameraCharacteristics = cameraManager.getCameraCharacteristics(cameraId);

        Integer facing = cameraCharacteristics.get(CameraCharacteristics.LENS_FACING);

        if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
            continue;
        }

        //TODO found the rear camera

    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}

The LENS_FACING characteristic of CameraCharacteristics describes which way the camera faces relative to the screen; its possible values are FRONT, BACK, and EXTERNAL. Once the desired camera is found, save its cameraId — that id identifies the camera from then on.
With the cameraId in hand, call CameraManager's openCamera method to open the camera. Its declaration is:

void openCamera (String cameraId, 
                CameraDevice.StateCallback callback, 
                Handler handler)

Here callback is invoked once the camera is opened, and handler is the Handler the callback is posted to; if handler is null, the current thread's Looper is used. In other words, the handler determines which thread the callback runs on. StateCallback is declared as:

public static abstract class StateCallback {

        public abstract void onOpened(@NonNull CameraDevice camera);

        public abstract void onDisconnected(@NonNull CameraDevice camera);

        public abstract void onError(@NonNull CameraDevice camera, int error);
}

Once the camera is open, a CameraDevice object becomes available. Typically you create the capture session in onOpened, and release camera resources in onDisconnected and onError.
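Since these callbacks run on the thread of the Handler you pass in, camera work is usually kept off the UI thread. As a minimal sketch (the thread name here is arbitrary), a dedicated background handler can be set up like this:

```java
// A background thread whose Looper receives the camera callbacks,
// keeping camera work off the UI thread.
HandlerThread backgroundThread = new HandlerThread("CameraBackground");
backgroundThread.start();
Handler backgroundHandler = new Handler(backgroundThread.getLooper());

// Pass backgroundHandler as the handler argument of openCamera() and
// the capture methods. When finished (e.g. in onPause), stop it with:
// backgroundThread.quitSafely();
```

The complete example in section 6 does exactly this in onResume.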

2. Set up the preview surface

To take photos or record video, the app must first configure the camera capture session with a set of output Surfaces. Each Surface must be configured beforehand with an appropriate size and format. A Surface can come from several sources, including a SurfaceView or a SurfaceTexture. Typically, preview frames are sent to a SurfaceView or a TextureView; here we simply use a TextureView (an example of building a preview with SurfaceView can be found elsewhere). Using a TextureView is straightforward: we just need its SurfaceTexture object, which is what the content is rendered onto. In practice we set a SurfaceTextureListener on the TextureView to be notified when the SurfaceTexture becomes available, changes size, is updated, or is destroyed.
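As a minimal sketch of this step (assuming an openCamera() method like the one in the complete example below):

```java
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        // The preview surface now exists, so it is safe to open the camera.
        openCamera();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
        // React to size changes here, e.g. recompute the preview transform.
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return true; // true: nothing else renders to it, so it can be released
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
    }
});
```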

3. Create a camera session

With the CameraDevice from step 1 and the preview surface from step 2, we can now create a camera session from them. The app creates a CaptureRequest object, which defines all the parameters the camera needs to capture one frame, and which must list the configured Surfaces as its output targets. Once built, a CaptureRequest can be submitted to the active session to capture one image or a burst of images. A session is created with CameraDevice's createCaptureSession method, declared as:

void createCaptureSession (List<Surface> outputs, 
                CameraCaptureSession.StateCallback callback, 
                Handler handler)

createCaptureSession creates a new session using the supplied list of Surfaces. callback receives the session's state changes, and handler is the Handler the callback posts to; if it is null, a Handler is created from the current thread's Looper. CameraCaptureSession.StateCallback is an abstract class with two abstract methods, onConfigured and onConfigureFailed; CaptureRequests can be submitted in onConfigured. Here is an example:

try {

    SurfaceTexture surfaceTexture = textureView.getSurfaceTexture();
    surfaceTexture.setDefaultBufferSize(textureView.getWidth(), textureView.getHeight());

    Surface surface = new Surface(surfaceTexture);
    priviewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    priviewRequestBuilder.addTarget(surface);

    cameraDevice.createCaptureSession(Arrays.asList(surface, imageReader.getSurface()), new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(CameraCaptureSession session) {

            try {
                if (null == cameraDevice)
                    return;

                cameraCaptureSession = session;

                // For preview, enable continuous auto-focus
                priviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                previewRequest = priviewRequestBuilder.build();

                cameraCaptureSession.setRepeatingRequest(previewRequest, mCaptureCallback, mBackgroundHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }

        }

        @Override
        public void onConfigureFailed(CameraCaptureSession session) {

            Toast.makeText(MainActivity.this, "Failed", Toast.LENGTH_SHORT).show();

        }
    }, null);

} catch (Exception e) {
    e.printStackTrace();
}

As the code shows, onConfigured saves the session object and then builds the preview CaptureRequest; since this is a continuous preview, the request is submitted with CameraCaptureSession's setRepeatingRequest method.

4. Take a picture

With the preview running and the session variable saved once the session was configured, all subsequent capture requests are submitted to that CameraCaptureSession. To capture a single image, call capture:

   mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
                    mBackgroundHandler);

Here mCaptureCallback tracks the progress of the submitted CaptureRequest. CameraCaptureSession.CaptureCallback is defined as:

 public static abstract class CaptureCallback {

        /**
         * This constant is used to indicate that no images were captured for
         * the request.
         *
         * @hide
         */
        public static final int NO_FRAMES_CAPTURED = -1;


        public void onCaptureStarted(@NonNull CameraCaptureSession session,
                @NonNull CaptureRequest request, long timestamp, long frameNumber) {
            // Temporary trampoline for API change transition
            onCaptureStarted(session, request, timestamp);
        }

        /**
         * Temporary for API change transition
         * @hide
         */
        public void onCaptureStarted(CameraCaptureSession session,
                CaptureRequest request, long timestamp) {
            // default empty implementation
        }

        /**
         * @hide
         */
        public void onCapturePartial(CameraCaptureSession session,
                CaptureRequest request, CaptureResult result) {
            // default empty implementation
        }

        public void onCaptureProgressed(@NonNull CameraCaptureSession session,
                @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
            // default empty implementation
        }


        public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
            // default empty implementation
        }


        public void onCaptureFailed(@NonNull CameraCaptureSession session,
                @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
            // default empty implementation
        }

        public void onCaptureSequenceCompleted(@NonNull CameraCaptureSession session,
                int sequenceId, long frameNumber) {
            // default empty implementation
        }

        public void onCaptureSequenceAborted(@NonNull CameraCaptureSession session,
                int sequenceId) {
            // default empty implementation
        }

        public void onCaptureBufferLost(@NonNull CameraCaptureSession session,
                @NonNull CaptureRequest request, @NonNull Surface target, long frameNumber) {
            // default empty implementation
        }
    }

As you can see, most of the methods are empty default implementations.

5. Save the image

How do we save captured images with the camera2 package? This is where the ImageReader class comes in: create an ImageReader and set an OnImageAvailableListener on it; the listener fires when an image becomes available, and that is where we save it. The ImageReader's Surface must also be included as one of the output Surfaces when creating the CameraCaptureSession, so that it receives the captured images.
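A minimal sketch of this setup (the size and maxImages values here are illustrative, and backgroundHandler is assumed to be a Handler on a background thread):

```java
// Create a reader that receives JPEG captures; maxImages bounds how many
// images may be held open from acquireNextImage() at once.
ImageReader imageReader = ImageReader.newInstance(1920, 1080, ImageFormat.JPEG, 2);
imageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage();
        // ...write image.getPlanes()[0].getBuffer() out to a file...
        image.close(); // always close the Image, or the reader's buffers run out
    }
}, backgroundHandler);

// Remember to add imageReader.getSurface() to the output list passed to
// createCaptureSession(), or no captures will ever reach this listener.
```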

6. Complete example

@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
public class MainActivity extends AppCompatActivity implements ActivityCompat.OnRequestPermissionsResultCallback, View.OnClickListener {

    public static final String TAG = "MainActivity";


    private static final int STATE_PREVIEW = 0;
    private static final int STATE_WAITING_LOCK = 1;
    private static final int STATE_WAITING_PRECAPTURE = 2;
    private static final int STATE_WAITING_NON_PRECAPTURE = 3;
    private static final int STATE_PICTURE_TAKEN = 4;

    private int mState = STATE_PREVIEW;

    private TextureView textureView;
    private Button captureBtn;

    private CameraCaptureSession cameraCaptureSession;
    private String cameraId;
    private CameraDevice cameraDevice;

    private HandlerThread mBackgroundThread;
    private Handler mBackgroundHandler;

    private CaptureRequest.Builder priviewRequestBuilder;
    private CaptureRequest previewRequest;

    private ImageReader imageReader;
    private File file;

    private TextureView.SurfaceTextureListener mSurfaceTextureListener = new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
            openCamera();
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
            return false;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture surface) {

        }
    };

    private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice camera) {

            cameraDevice = camera;
            createCameraPreviewSession();

        }

        @Override
        public void onDisconnected(CameraDevice camera) {

            camera.close();
            cameraDevice = null;

        }

        @Override
        public void onError(CameraDevice camera, int error) {

            camera.close();
            cameraDevice = null;

        }
    };

    private CameraCaptureSession.CaptureCallback mCaptureCallback = new CameraCaptureSession.CaptureCallback() {
        @Override
        public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {

            process(result);

        }

        @Override
        public void onCaptureProgressed(CameraCaptureSession session, CaptureRequest request, CaptureResult partialResult) {

            process(partialResult);

        }

        private void process(CaptureResult result) {

            switch (mState) {
                case STATE_PREVIEW: {
                    // We have nothing to do when the camera preview is working normally.
                    break;
                }
                case STATE_WAITING_LOCK: {
                    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
                    if (afState == null) {
                        captureStillPicture();
                    } else if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
                            CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
                        // CONTROL_AE_STATE can be null on some devices
                        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                        if (aeState == null ||
                                aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
                            mState = STATE_PICTURE_TAKEN;
                            captureStillPicture();
                        } else {
                            runPrecaptureSequence();
                        }
                    }
                    break;
                }
                case STATE_WAITING_PRECAPTURE: {
                    // CONTROL_AE_STATE can be null on some devices
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    if (aeState == null ||
                            aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
                            aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
                        mState = STATE_WAITING_NON_PRECAPTURE;
                    }
                    break;
                }
                case STATE_WAITING_NON_PRECAPTURE: {
                    // CONTROL_AE_STATE can be null on some devices
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
                        mState = STATE_PICTURE_TAKEN;
                        captureStillPicture();
                    }
                    break;
                }
            }

        }

    };

    private void runPrecaptureSequence() {
        try {
            // This is how to tell the camera to trigger.
            priviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
                    CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
            // Tell #mCaptureCallback to wait for the precapture sequence to be set.
            mState = STATE_WAITING_PRECAPTURE;
            cameraCaptureSession.capture(priviewRequestBuilder.build(), mCaptureCallback,
                    mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void captureStillPicture() {

        try {
            CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            builder.addTarget(imageReader.getSurface());
            builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);

            CameraCaptureSession.CaptureCallback callback = new CameraCaptureSession.CaptureCallback() {
                @Override
                public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {

                    Toast.makeText(MainActivity.this, "Saved:" + file, Toast.LENGTH_SHORT).show();
                    unlockFocus();
                }
            };

            cameraCaptureSession.stopRepeating();
            cameraCaptureSession.capture(builder.build(), callback, null);

        } catch (CameraAccessException e) {
            e.printStackTrace();
        }

    }

    @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        textureView = (TextureView) findViewById(R.id.surface);
        captureBtn = (Button) findViewById(R.id.capture);
        captureBtn.setOnClickListener(this);

        file = new File(getExternalFilesDir(null), "test.jpg");

    }

    @Override
    protected void onResume() {
        super.onResume();
        mBackgroundThread = new HandlerThread("camera");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());

        if (textureView.isAvailable()) {
            openCamera();
        } else {
            textureView.setSurfaceTextureListener(mSurfaceTextureListener);
        }
    }

    private void openCamera() {

        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {

            ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA},
                    1);
            return;
        }

        //Get the camera manager
        CameraManager cameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        try {
            for (String id : cameraManager.getCameraIdList()) {

                CameraCharacteristics cameraCharacteristics = cameraManager.getCameraCharacteristics(id);
                Integer facing = cameraCharacteristics.get(CameraCharacteristics.LENS_FACING);
                if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                    continue;
                }

                cameraId = id;

                imageReader = ImageReader.newInstance(textureView.getWidth(), textureView.getHeight(), ImageFormat.JPEG, 2);
                imageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);

                break;

            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }

        try {
            cameraManager.openCamera(cameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    /**
     * Creates the preview session.
     */
    private void createCameraPreviewSession() {

        try {

            SurfaceTexture surfaceTexture = textureView.getSurfaceTexture();
            surfaceTexture.setDefaultBufferSize(textureView.getWidth(), textureView.getHeight());

            Surface surface = new Surface(surfaceTexture);
            priviewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            priviewRequestBuilder.addTarget(surface);

            cameraDevice.createCaptureSession(Arrays.asList(surface, imageReader.getSurface()), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession session) {

                    try {
                        if (null == cameraDevice)
                            return;

                        cameraCaptureSession = session;

                        //For preview, enable continuous auto-focus
                        priviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                        previewRequest = priviewRequestBuilder.build();

                        cameraCaptureSession.setRepeatingRequest(previewRequest, mCaptureCallback, mBackgroundHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }

                }

                @Override
                public void onConfigureFailed(CameraCaptureSession session) {

                    Toast.makeText(MainActivity.this, "Failed", Toast.LENGTH_SHORT).show();

                }
            }, null);


        } catch (Exception e) {
            e.printStackTrace();
        }

    }


    @Override
    public void onClick(View v) {

        //Take a picture
        takePic();

    }

    private void takePic() {

        lockFocus();
    }

    //Lock focus
    private void lockFocus() {

        try {
            priviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_START);
            mState = STATE_WAITING_LOCK;
            cameraCaptureSession.capture(priviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    //Unlock focus
    private void unlockFocus() {
        try {
            priviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
                    CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
            cameraCaptureSession.capture(priviewRequestBuilder.build(), mCaptureCallback,
                    mBackgroundHandler);
            mState = STATE_PREVIEW;
            cameraCaptureSession.setRepeatingRequest(previewRequest, mCaptureCallback,
                    mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {

            //Save the image
            mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), file));

        }
    };

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
                                           @NonNull int[] grantResults) {
        if (requestCode == 1) {
            if (grantResults.length == 1 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                openCamera();
            } else {
                //TODO permission denied
            }
        } else {
            super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        }
    }

    /**
     * Saves a JPEG image to the specified file.
     */
    private static class ImageSaver implements Runnable {

        private Image mImage;
        private File mFile;

        public ImageSaver(Image mImage, File mFile) {
            this.mImage = mImage;
            this.mFile = mFile;
        }

        @Override
        public void run() {

            ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            FileOutputStream output = null;
            try {
                output = new FileOutputStream(mFile);
                output.write(bytes);
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                mImage.close();
                if (null != output) {
                    try {
                        output.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }

        }
    }

}

Notes

The code above is adapted from Google's official sample.
The full code for this article is available in my GitHub repository.
