ArcSoft Face Recognition: Using the Camera1, Camera2, and CameraX APIs

This article walks through obtaining the camera preview stream via the Camera1, Camera2, and CameraX APIs respectively, and integrating the ArcSoft face recognition algorithm with each of them.

ArcSoft's official demo uses the Camera1 API. I previously wrote a standalone article on integrating the ArcSoft SDK via the Camera2 API (全网首发:Android Camera2 集成人脸识别算法).

01

Application Design Flowchart

As shown in the flowchart below, the application flow is fairly simple: camera frame data is obtained through each of the different APIs, fed into the ArcSoft face recognition library, and the recognition results are finally drawn on screen.

02

Application UI

CameraX has to be bound to the UI's lifecycle, so the main screen is designed with two Button entry points: one shared by Camera1 and Camera2, and a separate one dedicated to CameraX.

As shown below, you can switch back and forth between Camera1 and Camera2.

CameraX gets a separate screen of its own.

03

Code Implementation

1) Using the Camera1 API:

 private void startCameraByApi1() {
        DisplayMetrics metrics = new DisplayMetrics();
        getWindowManager().getDefaultDisplay().getMetrics(metrics);
        CameraListener cameraListener = new CameraListener() {
            @Override
            public void onCameraOpened(Camera camera, int cameraId, int displayOrientation, boolean isMirror) {
                // Cache the actual preview size and set up the mapping from
                // preview-buffer coordinates to view coordinates
                Camera.Size previewSize = camera.getParameters().getPreviewSize();
                mPreviewSize = new Size(previewSize.width, previewSize.height);
                drawHelper = new DrawHelper(previewSize.width, previewSize.height, previewView.getWidth(), previewView.getHeight(), displayOrientation
                        , cameraId, isMirror, false, false);
            }

            @Override
            public void onPreview(byte[] nv21, Camera camera) {
                // Each preview frame arrives as NV21 and goes straight
                // into the ArcSoft pipeline
                drawFaceInfo(nv21);
            }

            @Override
            public void onCameraClosed() {
            }

            @Override
            public void onCameraError(Exception e) {
            }

            @Override
            public void onCameraConfigurationChanged(int cameraID, int displayOrientation) {
                if (drawHelper != null) {
                    drawHelper.setCameraDisplayOrientation(displayOrientation);
                }
            }
        };

        cameraAPI1Helper = new Camera1ApiHelper.Builder()
                .previewViewSize(new Point(previewView.getMeasuredWidth(), previewView.getMeasuredHeight()))
                .rotation(getWindowManager().getDefaultDisplay().getRotation())
                .specificCameraId(Camera.CameraInfo.CAMERA_FACING_BACK)
                .isMirror(false)
                .previewOn(previewView)
                .cameraListener(cameraListener)
                .build();
        cameraAPI1Helper.init();
        cameraAPI1Helper.start();
    }
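
All three camera paths funnel their NV21 frames into drawFaceInfo(), which isn't shown in this article. As a rough sketch, assuming the ArcFace Java API (FaceEngine.detectFaces, FaceInfo, ErrorInfo), the DrawHelper from ArcSoft's official demo, and a hypothetical overlay method drawFaceRects(), it could look like this:

 // A minimal sketch of drawFaceInfo(); faceEngine is assumed to be an
 // already-initialized com.arcsoft.face.FaceEngine, and drawFaceRects()
 // is a hypothetical overlay-view method, not part of the SDK.
 private void drawFaceInfo(byte[] nv21) {
        List<FaceInfo> faceInfoList = new ArrayList<>();
        // CP_PAF_NV21: every camera path in this article delivers NV21
        int code = faceEngine.detectFaces(nv21, mPreviewSize.getWidth(),
                mPreviewSize.getHeight(), FaceEngine.CP_PAF_NV21, faceInfoList);
        if (code != ErrorInfo.MOK) {
            Log.w(TAG, "detectFaces failed, code: " + code);
            return;
        }
        // Map each face rect from preview-buffer coordinates to view coordinates
        List<Rect> viewRects = new ArrayList<>();
        for (FaceInfo faceInfo : faceInfoList) {
            viewRects.add(drawHelper.adjustRect(faceInfo.getRect()));
        }
        faceRectView.drawFaceRects(viewRects);
    }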

2) Using the Camera2 API:

 private void openCameraApi2(int width, int height) {
        Log.v(TAG, "---- openCameraAPi2();width: " + width + ";height: " + height);

        setUpCameraOutputs(width, height);
        CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        try {
            // The semaphore prevents opening a new camera before the
            // previous one has finished closing
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock back camera opening.");
            }
            manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
        }
    }
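
setUpCameraOutputs() isn't shown here; for this pipeline its key job is to pick mPreviewSize and create a YUV_420_888 ImageReader whose frames are handed to the listener further below. A minimal sketch of that part, with the preview-size selection logic omitted:

 // Assumed inside setUpCameraOutputs(): create the YUV reader that feeds
 // the ArcSoft engine, and register the listener on the background thread
 mImageReader = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(),
            ImageFormat.YUV_420_888, /* maxImages */ 2);
 mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);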


 private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice cameraDevice) {
            initArcsoftDrawHelper();

            mCameraOpenCloseLock.release();
            mCameraDevice = cameraDevice;

            // Brief delay before creating the session (workaround kept
            // from the original demo code)
            try {
                Thread.sleep(500);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }

            createCameraPreviewSession();
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice cameraDevice) {
            mCameraOpenCloseLock.release();
            cameraDevice.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice cameraDevice, int error) {
            mCameraOpenCloseLock.release();
            cameraDevice.close();
            mCameraDevice = null;
            finish();
        }
    };


 private void createCameraPreviewSession() {
        try {
            SurfaceTexture texture = previewView.getSurfaceTexture();
            assert texture != null;

            // We configure the size of default buffer to be the size of camera preview we want.
            texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());

            // This is the output Surface we need to start preview.
            Surface surface = new Surface(texture);

            mPreviewRequestBuilder
                    = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mPreviewRequestBuilder.addTarget(surface);
            mPreviewRequestBuilder.addTarget(mImageReader.getSurface());

            mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {

                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                            Log.v(TAG, "--- Camera2API:onConfigured();");

                            if (null == mCameraDevice) {
                                return;
                            }

                            mPreviewCaptureSession = cameraCaptureSession;
                            try {
                                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                                        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);

                                mPreviewRequest = mPreviewRequestBuilder.build();
                                mPreviewCaptureSession.setRepeatingRequest(mPreviewRequest,
                                        null, mBackgroundHandler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }

                        @Override
                        public void onConfigureFailed(
                                @NonNull CameraCaptureSession cameraCaptureSession) {
                            showToast("Failed");
                        }
                    }, null
            );
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
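
Note that the repeating request above targets two surfaces at once: the TextureView's Surface, which displays the preview, and the ImageReader's Surface, whose YUV_420_888 frames drive the face detection in the listener below.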


private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
            = new ImageReader.OnImageAvailableListener() {

        @Override
        public void onImageAvailable(ImageReader reader) {
            Image image = reader.acquireLatestImage();
            if (image == null) {
                return;
            }

            synchronized (mImageReaderLock) {
                if (!mImageReaderLock.equals(1)) {
                    Log.v(TAG, "--- image not available,just return!!!");
                    image.close();
                    return;
                }
                if (ImageFormat.YUV_420_888 == image.getFormat()) {
                    Image.Plane[] planes = image.getPlanes();

                    lock.lock();
                    if (y == null) {
                        y = new byte[planes[0].getBuffer().limit() - planes[0].getBuffer().position()];
                        u = new byte[planes[1].getBuffer().limit() - planes[1].getBuffer().position()];
                        v = new byte[planes[2].getBuffer().limit() - planes[2].getBuffer().position()];
                    }

                    if (image.getPlanes()[0].getBuffer().remaining() == y.length) {
                        planes[0].getBuffer().get(y);
                        planes[1].getBuffer().get(u);
                        planes[2].getBuffer().get(v);

                        if (nv21 == null) {
                            nv21 = new byte[planes[0].getRowStride() * mPreviewSize.getHeight() * 3 / 2];
                        }

                        if (nv21.length != planes[0].getRowStride() * mPreviewSize.getHeight() * 3 / 2) {
                            // Unexpected buffer size; release the lock and the
                            // image before bailing out
                            lock.unlock();
                            image.close();
                            return;
                        }

                        // Returned data is YUV422
                        if (y.length / u.length == 2) {
                            ImageUtil.yuv422ToYuv420sp(y, u, v, nv21, planes[0].getRowStride(), mPreviewSize.getHeight());
                        }
                        // Returned data is YUV420
                        else if (y.length / u.length == 4) {
                            ImageUtil.yuv420ToYuv420sp(y, u, v, nv21, planes[0].getRowStride(), mPreviewSize.getHeight());
                        }

                        // Run the ArcSoft algorithm and draw the face info
                        drawFaceInfo(nv21);
                    }
                    lock.unlock();
                }
            }
            image.close();
        }
    };
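
The ImageUtil helpers come from ArcSoft's demo code. At its core, a yuv420ToYuv420sp-style conversion just copies the Y plane and re-interleaves the planar U/V data into NV21's VU order. A simplified sketch that assumes pixelStride == 1 and rowStride == width (the real helper has to handle the general strided case):

 // Simplified planar YUV420 -> NV21 (YUV420SP) conversion
 public static void yuv420ToNv21Planar(byte[] y, byte[] u, byte[] v,
                                       byte[] nv21, int width, int height) {
        // NV21 layout: full-resolution Y plane, then interleaved V/U pairs
        // at quarter resolution
        System.arraycopy(y, 0, nv21, 0, width * height);
        int offset = width * height;
        for (int i = 0; i < width * height / 4; i++) {
            nv21[offset + 2 * i] = v[i];     // V comes first in NV21
            nv21[offset + 2 * i + 1] = u[i]; // then U
        }
    }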

3) Using the CameraX API:

 private void startCameraX() {
        Log.v(TAG, "--- startCameraX();");
        mPreviewSize = new Size(640, 480);
        setPreviewViewAspectRatio();
        initArcsoftDrawHelper();

        Rational rational = new Rational(mPreviewSize.getHeight(), mPreviewSize.getWidth());
        // 1. preview
        PreviewConfig previewConfig = new PreviewConfig.Builder()
                .setTargetAspectRatio(rational)
                .setTargetResolution(mPreviewSize)
                .build();

        Preview preview = new Preview(previewConfig);
        preview.setOnPreviewOutputUpdateListener(new Preview.OnPreviewOutputUpdateListener() {
            @Override
            public void onUpdated(Preview.PreviewOutput output) {
                previewView.setSurfaceTexture(output.getSurfaceTexture());
                configureTransform(previewView.getWidth(), previewView.getHeight());
            }
        });

        // 2. capture
        ImageCaptureConfig imageCaptureConfig = new ImageCaptureConfig.Builder()
                .setTargetAspectRatio(rational)
                .setCaptureMode(ImageCapture.CaptureMode.MIN_LATENCY)
                .build();
        final ImageCapture imageCapture = new ImageCapture(imageCaptureConfig);

        // 3. analyze
        HandlerThread handlerThread = new HandlerThread("Analyze-thread");
        handlerThread.start();

        ImageAnalysisConfig imageAnalysisConfig = new ImageAnalysisConfig.Builder()
                .setCallbackHandler(new Handler(handlerThread.getLooper()))
                .setImageReaderMode(ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE)
                .setTargetAspectRatio(rational)
                .setTargetResolution(mPreviewSize)
                .build();

        ImageAnalysis imageAnalysis = new ImageAnalysis(imageAnalysisConfig);
        imageAnalysis.setAnalyzer(new MyAnalyzer());

        CameraX.bindToLifecycle(this, preview, imageCapture, imageAnalysis);
    }
private class MyAnalyzer implements ImageAnalysis.Analyzer {
        private byte[] y;
        private byte[] u;
        private byte[] v;
        private byte[] nv21;
        private ReentrantLock lock = new ReentrantLock();
        private Object mImageReaderLock = 1; // 1 = available, 0 = unavailable

        @Override
        public void analyze(ImageProxy imageProxy, int rotationDegrees) {
            Image image = imageProxy.getImage();
            if (image == null) {
                return;
            }

            synchronized (mImageReaderLock) {
                if (!mImageReaderLock.equals(1)) {
                    image.close();
                    return;
                }
                if (ImageFormat.YUV_420_888 == image.getFormat()) {
                    Image.Plane[] planes = image.getPlanes();
                    if (mImageReaderSize == null) {
                        mImageReaderSize = new Size(planes[0].getRowStride(), image.getHeight());
                    }

                    lock.lock();
                    if (y == null) {
                        y = new byte[planes[0].getBuffer().limit() - planes[0].getBuffer().position()];
                        u = new byte[planes[1].getBuffer().limit() - planes[1].getBuffer().position()];
                        v = new byte[planes[2].getBuffer().limit() - planes[2].getBuffer().position()];
                    }

                    if (image.getPlanes()[0].getBuffer().remaining() == y.length) {
                        planes[0].getBuffer().get(y);
                        planes[1].getBuffer().get(u);
                        planes[2].getBuffer().get(v);

                        if (nv21 == null) {
                            nv21 = new byte[planes[0].getRowStride() * image.getHeight() * 3 / 2];
                        }

                        if (nv21.length != planes[0].getRowStride() * image.getHeight() * 3 / 2) {
                            // Unexpected buffer size; release the lock before bailing out
                            lock.unlock();
                            return;
                        }

                        // Returned data is YUV422
                        if (y.length / u.length == 2) {
                            ImageUtil.yuv422ToYuv420sp(y, u, v, nv21, planes[0].getRowStride(), image.getHeight());
                        }
                        // Returned data is YUV420
                        else if (y.length / u.length == 4) {
                            nv21 = ImageUtil.yuv420ToNv21(image);
                        }

                        // Run the ArcSoft algorithm and draw the face info
                        drawFaceInfo(nv21, mPreviewSize);
                    }
                    lock.unlock();
                }
            }
        }
    }
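
The onUpdated callback above calls configureTransform(), which isn't shown in the article. A sketch adapted from Google's Camera2Basic sample, assuming previewView is a TextureView, scales and rotates the view so the fixed 640x480 buffer isn't stretched:

 // A sketch of configureTransform(), adapted from Google's Camera2Basic
 // sample; assumes mPreviewSize is set and previewView is a TextureView
 private void configureTransform(int viewWidth, int viewHeight) {
        int rotation = getWindowManager().getDefaultDisplay().getRotation();
        Matrix matrix = new Matrix();
        RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
        RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
        float centerX = viewRect.centerX();
        float centerY = viewRect.centerY();
        if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
            bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
            matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
            // Fill the view while keeping the buffer's aspect ratio
            float scale = Math.max(
                    (float) viewHeight / mPreviewSize.getHeight(),
                    (float) viewWidth / mPreviewSize.getWidth());
            matrix.postScale(scale, scale, centerX, centerY);
            matrix.postRotate(90 * (rotation - 2), centerX, centerY);
        } else if (Surface.ROTATION_180 == rotation) {
            matrix.postRotate(180, centerX, centerY);
        }
        previewView.setTransform(matrix);
    }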


04

Problems Encountered

1) Preview distortion

This happens when the aspect ratio of the configured camera preview size doesn't match that of the TextureView.

The usual approach: based on the current device's screen size, iterate over the camera's supported preview sizes to find the best fit, then dynamically adjust the TextureView's display to match, as sketched below.
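
A possible implementation of that selection step for the Camera1 API (the helper name here is illustrative, not from the demo):

 // Pick the supported preview size whose aspect ratio best matches the view
 private Camera.Size chooseBestPreviewSize(List<Camera.Size> supported,
                                           int viewWidth, int viewHeight) {
        // Preview sizes are reported in landscape, so compare against the
        // view's longer-edge / shorter-edge ratio
        float targetRatio = (float) Math.max(viewWidth, viewHeight)
                / Math.min(viewWidth, viewHeight);
        Camera.Size best = supported.get(0);
        float bestDiff = Float.MAX_VALUE;
        for (Camera.Size size : supported) {
            float diff = Math.abs((float) size.width / size.height - targetRatio);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = size;
            }
        }
        return best;
    }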

2) ArcSoft SDK errors

For the ArcSoft SDK errors encountered along the way, you can enter the corresponding error code on the help page of the ArcSoft developer center; the description it returns makes it much quicker to pinpoint and troubleshoot the problem.

05

Appendix

1) Demo source code:

Follow the WeChat public account 小驰笔记 and reply 人脸识别 to get the download link for the code.

2) ArcSoft SDK download (official site):

    https://ai.arcsoft.com.cn/

Recommended reading:

全网首发:Android Camera2 集成人脸识别算法

哪些坑爹的Android CTS测试

高通Camera数字成像系统简介

一篇文章带你了解Android 最新Camera框架

这可能是介绍Android UvcCamera最详细的文章了

Working in Shenzhen,

living a simple life.

I've been doing Android Camera software development since 2014,

covering in-car systems, phones, body cameras......

This public account records bits and pieces of work and life.

Tap to follow 小驰笔记, hope to meet you there~

You're also welcome to visit my personal blog: http://www.xiaochibiji.com
