Android: taking pictures with Camera and SurfaceView, with picture preview, saving images locally, manual focus, and face detection via FaceDetectionListener

I spent a few days recently looking into how to build a custom camera on Android. The sample implements taking pictures, previewing, and saving the image using a SurfaceView together with the Camera API.

It can switch between the front and back cameras, focus automatically or manually (tap to focus), set the flash mode, and detect faces. Some parts could certainly be done better; feedback is welcome.

First, a few screenshots of the result.


1. The SurfaceView used for the preview

package com.example.camera.preview;

import android.content.Context;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Handler;
import android.util.AttributeSet;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import com.example.camera.util.CameraUtil;

/**
 * Created by renlei
 * DATE: 15-11-5
 * Time: 4:52 PM
 */
public class MySurfacePreview extends SurfaceView implements SurfaceHolder.Callback {
    private SurfaceHolder surfaceHolder;
    private Handler mHandler;

    public MySurfacePreview(Context context, AttributeSet attrs) {
        super(context, attrs);
        surfaceHolder = getHolder();
        surfaceHolder.setFormat(PixelFormat.TRANSPARENT); // TRANSLUCENT is semi-transparent, TRANSPARENT is fully transparent
        surfaceHolder.addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        CameraUtil.getInstance().doOpenCamera(Camera.CameraInfo.CAMERA_FACING_BACK);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        CameraUtil.getInstance().doStartPreview(surfaceHolder);
        if (mHandler != null) {
            // Give the preview a moment to start before notifying the activity,
            // which then starts face detection.
            mHandler.sendEmptyMessageDelayed(CameraUtil.PREVIEW_HAS_STARTED, 1000);
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        CameraUtil.getInstance().doStopPreview();
    }

    public void setmHandler(Handler handler) {
        this.mHandler = handler;
    }
}
2. CameraActivity

package com.example.camera;

import android.app.Activity;
import android.hardware.Camera;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;
import android.util.Log;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.View;
import android.widget.FrameLayout;
import android.widget.ImageButton;
import android.widget.ImageView;
import android.widget.RelativeLayout;
import com.example.camera.preview.MySurfacePreview;
import com.example.camera.util.CameraUtil;
import com.example.camera.util.FaceView;
import com.example.camera.util.GoogleDetectListenerImpl;

public class CameraActivity extends Activity {
    private MySurfacePreview mySurfacePreview;
    private ImageButton takeBtn;
    private FrameLayout focusLayout;
    private ImageView changeFlashModeIV;
    private ImageView swichCameraIV;
    private RelativeLayout settingRl;
    private FaceView faceView;
    int width;
    int height;

    /**
     * Called when the activity is first created.
     */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        initView();
        bindListener();
    }

    private void initView() {
        mySurfacePreview = (MySurfacePreview) findViewById(R.id.my_surfaceview);
        mySurfacePreview.setmHandler(mainHandler);
        takeBtn = (ImageButton) findViewById(R.id.take_btn);
        focusLayout = (FrameLayout) findViewById(R.id.camera_focus_layout);
        int w = View.MeasureSpec.makeMeasureSpec(0, View.MeasureSpec.UNSPECIFIED);
        int h = View.MeasureSpec.makeMeasureSpec(0, View.MeasureSpec.UNSPECIFIED);
        Log.d("showFocusIcon initview", "w " + w + " h " + h);
        focusLayout.measure(w, h);
        width = focusLayout.getMeasuredWidth() / 2;
        height = focusLayout.getMeasuredHeight() / 2;
        Log.d("showFocusIcon initview", "focusLayout.getMeasuredWidth()/2" + focusLayout.getMeasuredWidth() / 2 + "focusLayout.getMeasuredHeight()/2" + focusLayout.getMeasuredHeight() / 2);
        changeFlashModeIV = (ImageView) findViewById(R.id.flash_iv);
        swichCameraIV = (ImageView) findViewById(R.id.swich_camera_iv);
        settingRl = (RelativeLayout)findViewById(R.id.setting_rl);
        faceView = (FaceView)findViewById(R.id.face_view);

    }
    private void bindListener() {
        takeBtn.setOnClickListener(new TakeBtnClickListener());
        mySurfacePreview.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                if (CameraUtil.getInstance().getmCameraInfo().facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
                    return gestureDetector.onTouchEvent(event);
                }else {
                    return false;
                }
            }
        });

        changeFlashModeIV.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                CameraUtil.getInstance().setFlashMode(changeFlashModeIV);
            }
        });

        swichCameraIV.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                changeCamera();
            }
        });
    }

    private class TakeBtnClickListener implements View.OnClickListener {
        @Override
        public void onClick(View v) {
            CameraUtil.getInstance().doTakePic();
        }
    }

    GestureDetector gestureDetector = new GestureDetector(new GestureDetector.OnGestureListener() {
        @Override
        public boolean onDown(MotionEvent e) {
            Log.d("MyGestureDetector", "onDown");
            return true;
        }

        @Override
        public void onShowPress(MotionEvent e) {
            Log.d("MyGestureDetector", "onShowPress");

        }

        @Override
        public boolean onSingleTapUp(final MotionEvent e) {
            Log.d("MyGestureDetector", "onSingleTapUp");
            CameraUtil.getInstance().autoFocus(new Camera.AutoFocusCallback() {
                @Override
                public void onAutoFocus(boolean success, Camera camera) {
                    if (success) {
                        Log.d("renlei", "focus succeeded");
                    } else {
                        Log.d("renlei", "focus failed");
                    }
                    focusLayout.setVisibility(View.GONE);
                }
            });
            CameraUtil.getInstance().setFocusArea(CameraActivity.this, e);
            showFocusIcon(e);
            return true;
        }

        @Override
        public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
            Log.d("MyGestureDetector", "onScroll");

            return false;
        }

        @Override
        public void onLongPress(MotionEvent e) {
            Log.d("MyGestureDetector", "onLongPress");

        }

        @Override
        public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
            Log.d("MyGestureDetector", "onFling");
            return false;
        }
    });

    private void showFocusIcon(MotionEvent e) {
        int x = (int) e.getX();
        int y = (int) e.getY();
        RelativeLayout.LayoutParams params = (RelativeLayout.LayoutParams) focusLayout.getLayoutParams();
        params.leftMargin = (int) (x - width + 0.5);
        params.topMargin = (int) (y - height + 0.5 + settingRl.getHeight());
        Log.d("showFocusIcon", "x=" + x + " y=" + y + " width=" + params.width + " height=" + params.height);
        focusLayout.requestLayout();
        focusLayout.setVisibility(View.VISIBLE);
    }

    public void changeCamera() {
        CameraUtil.getInstance().doStopPreview();
        int newCameraId = (CameraUtil.getInstance().getCameraId() + 1) % 2;
        CameraUtil.getInstance().doOpenCamera(newCameraId);
        CameraUtil.getInstance().doStartPreview(mySurfacePreview.getHolder());
        if (newCameraId == Camera.CameraInfo.CAMERA_FACING_BACK){
            swichCameraIV.setImageResource(R.drawable.camera_setting_switch_back);
            changeFlashModeIV .setVisibility(View.VISIBLE);
        }else {
            swichCameraIV.setImageResource(R.drawable.camera_setting_switch_front);
            changeFlashModeIV.setVisibility(View.GONE);
        }
    }
    private MainHandler mainHandler = new MainHandler();
    private void startGoogleDetect() {
        // startFaceDetection() must be called after startPreview(); that is why the
        // SurfaceView delays the PREVIEW_HAS_STARTED message.
        Camera.Parameters parameters = CameraUtil.getInstance().getCameraParameters();
        Camera camera = CameraUtil.getInstance().getCamera();
        if (parameters != null && parameters.getMaxNumDetectedFaces() > 0) {
            if (faceView != null) {
                faceView.clearFaces();
                faceView.setVisibility(View.VISIBLE);
            }
            camera.setFaceDetectionListener(new GoogleDetectListenerImpl(CameraActivity.this, mainHandler));
            camera.startFaceDetection();
        }
    }
    private class MainHandler extends Handler{

        @Override
        public void handleMessage(Message msg) {
            switch (msg.what) {
                case CameraUtil.PREVIEW_HAS_STARTED:
                    startGoogleDetect();
                    Log.e("renlei110", "face detection started");
                    break;
                case CameraUtil.RECEIVE_FACE_MSG: {
                    // This handler is created on the main thread, so the views can be
                    // updated directly here.
                    Camera.Face[] faces = (Camera.Face[]) msg.obj;
                    faceView.setFaces(faces);
                    Log.e("renlei111", "received face detection result");
                    break;
                }
            }
            super.handleMessage(msg);
        }
    }
}
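The camera lifetime in this sample is tied to the surface callbacks (surfaceCreated/surfaceDestroyed). If you also want to release the camera as soon as the Activity leaves the foreground, so that other apps can open it immediately, a minimal sketch, assuming the CameraUtil helper used above, is an onPause override in CameraActivity:

    @Override
    protected void onPause() {
        super.onPause();
        // doStopPreview() stops the preview and releases the Camera object (see
        // CameraUtil in section 4); the surface callbacks reopen the camera when
        // the preview surface is created again.
        CameraUtil.getInstance().doStopPreview();
    }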
3. ImageUtil, the utility class for saving images

package com.example.camera.util;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.media.ExifInterface;
import android.os.Environment;
import android.util.Log;

import java.io.File;
import java.io.IOException;

/**
 * Created by renlei
 * DATE: 15-11-5
 * Time: 7:21 PM
 * Email: lei.ren@renren-inc.com
 */
public class ImageUtil {
    public static void saveImage(File file, byte[] data, String filePath) {
        // The raw JPEG has already been written by the picture callback; this method
        // only reads the image bounds and the EXIF rotation. Applying the rotation is
        // left out here; see the sketch after this class.
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;
        Bitmap tempBitmap = BitmapFactory.decodeFile(filePath, options);
        int degrees = getExifRotateDegree(filePath);
    }

    public static String getSaveImagePath() {
        if (Environment.getExternalStorageState().equals(Environment.MEDIA_MOUNTED)){
            String path = Environment.getExternalStorageDirectory().getPath()+"/renlei/"+System.currentTimeMillis()+".jpg";
            File file = new File(path);
            if (!file.getParentFile().exists()){
                file.getParentFile().mkdirs();
            }
            return path;
        }
        return System.currentTimeMillis()+".jpg";
    }

    public static int getExifRotateDegree(String path){
        try {
            ExifInterface exifInterface = new ExifInterface(path);
            int orientation = exifInterface.getAttributeInt(ExifInterface.TAG_ORIENTATION,ExifInterface.ORIENTATION_NORMAL);
            int degrees = getExifRotateDegrees(orientation);
            Log.d("imageutil degrees",degrees+"");
            return degrees;
        } catch (IOException e) {
            e.printStackTrace();
        }
        return 0;
    }

    public static int getExifRotateDegrees(int exifOrientation) {
        int degrees = 0;
        switch (exifOrientation) {
            case ExifInterface.ORIENTATION_NORMAL:
                degrees = 0;
                break;
            case ExifInterface.ORIENTATION_ROTATE_90:
                degrees = 90;
                break;
            case ExifInterface.ORIENTATION_ROTATE_180:
                degrees = 180;
                break;
            case ExifInterface.ORIENTATION_ROTATE_270:
                degrees = 270;
                break;
        }
        return degrees;
    }
}
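As the comment in saveImage() notes, the EXIF rotation is computed but never applied; the raw JPEG has already been written by the picture callback in CameraUtil. A minimal sketch of the missing rotation step, assuming you simply want to overwrite the file with an upright copy (this helper class is hypothetical and not part of the original project):

package com.example.camera.util;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Matrix;

import java.io.FileOutputStream;
import java.io.IOException;

/**
 * Hypothetical helper: rotates an already-saved JPEG according to its EXIF
 * orientation and writes the upright copy back to the same path.
 */
public class RotateHelper {
    public static void saveRotatedCopy(String filePath) {
        int degrees = ImageUtil.getExifRotateDegree(filePath);
        if (degrees == 0) {
            return; // already upright
        }
        Bitmap source = BitmapFactory.decodeFile(filePath);
        if (source == null) {
            return; // decoding failed
        }
        Matrix matrix = new Matrix();
        matrix.postRotate(degrees);
        Bitmap rotated = Bitmap.createBitmap(source, 0, 0,
                source.getWidth(), source.getHeight(), matrix, true);
        FileOutputStream fos = null;
        try {
            fos = new FileOutputStream(filePath); // overwrite the original file
            rotated.compress(Bitmap.CompressFormat.JPEG, 100, fos);
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (fos != null) {
                try {
                    fos.close();
                } catch (IOException ignored) {
                }
            }
            if (rotated != source) {
                source.recycle();
            }
        }
    }
}

Re-encoding with compress() also drops the original EXIF orientation tag, so the copy is not rotated a second time the next time it is loaded.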
4. CameraUtil, the core class. It handles opening the camera, starting and stopping the preview, taking the picture, setting the flash mode, focusing, and so on; the code is commented throughout.

package com.example.camera.util;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.PixelFormat;
import android.graphics.Rect;
import android.hardware.Camera;
import android.os.Build;
import android.util.Log;
import android.view.MotionEvent;
import android.view.SurfaceHolder;
import android.widget.ImageView;
import com.example.camera.R;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

/**
 * Created by renlei
 * DATE: 15-11-5
 * Time: 4:57 PM
 * Email: lei.ren@renren-inc.com
 */
public class CameraUtil {
    private Camera mCamera;
    private static CameraUtil mCameraUtil;
    private boolean isPreview;
    private int cameraId = -1; // 0 = back camera, 1 = front camera
    private Camera.CameraInfo mCameraInfo = new Camera.CameraInfo();
    public static final int PREVIEW_HAS_STARTED = 110;
    public static final int RECEIVE_FACE_MSG = 111;
    public static synchronized CameraUtil getInstance() {
        if (mCameraUtil == null) {
            mCameraUtil = new CameraUtil();
        }
        return mCameraUtil;
    }

    /**
     * Open the camera
     * @param cameraId
     */
    public void doOpenCamera(int cameraId) {
        Log.d("renlei", "open camera " + cameraId);
        try {
            this.cameraId = cameraId;
            mCamera = Camera.open(cameraId);
            Camera.getCameraInfo(cameraId, mCameraInfo); // mCameraInfo must already be instantiated, not null
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /**
     * Start the preview
     * @param holder
     */
    public void doStartPreview(SurfaceHolder holder) {
        Log.d("CameraUtil", "doStartPreview");
        if (isPreview) {
            mCamera.stopPreview();
        }
        if (mCamera != null) {
            Camera.Parameters parameters = mCamera.getParameters();
            parameters.setPictureFormat(PixelFormat.JPEG); // format used for the saved picture
            mCamera.setDisplayOrientation(90); // otherwise the preview orientation is wrong on portrait phones
            if (mCameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
                // Front and back cameras support different parameters; only the back
                // camera is configured here, the front one can be handled similarly.
                parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
                parameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
            }
            printSupportPreviewSize(parameters);
            printSupportPictureSize(parameters);
            printSupportFocusMode(parameters);
            // The preview and picture sizes must come from the supported lists, otherwise
            // setParameters() will throw. Index 0 is simply the first supported size; a real
            // app should pick one that matches the view's aspect ratio.
            parameters.setPreviewSize(parameters.getSupportedPreviewSizes().get(0).width, parameters.getSupportedPreviewSizes().get(0).height);
            parameters.setPictureSize(parameters.getSupportedPictureSizes().get(0).width, parameters.getSupportedPictureSizes().get(0).height);
            mCamera.setParameters(parameters);
            Camera.Parameters mParams = mCamera.getParameters();
            Log.i("renlei", "Final preview size: width = " + mParams.getPreviewSize().width
                    + " height = " + mParams.getPreviewSize().height);
            Log.i("renlei", "Final picture size: width = " + mParams.getPictureSize().width
                    + " height = " + mParams.getPictureSize().height);
            try {
                mCamera.setPreviewDisplay(holder);
                mCamera.startPreview();
            } catch (IOException e) {
                e.printStackTrace();
            }
            isPreview = true;
        }
    }

    /**
     * Stop the preview and release the camera
     */
    public void doStopPreview() {
        if (isPreview) {
            isPreview = false;
            mCamera.stopPreview();

            mCamera.release();
            mCamera = null;
        }
    }

    /**
     * Take a picture
     */
    public void doTakePic() {
        if (isPreview && mCamera != null) {
            mCamera.takePicture(new ShutCallBackImpl(), null, new PicCallBacKImpl());
        }
    }

    /**
     * Shutter callback; by default the system plays the shutter sound
     */
    private class ShutCallBackImpl implements Camera.ShutterCallback {
        @Override
        public void onShutter() {

        }
    }

    /**
     * The main picture callback, delivering the JPEG data after the shot
     */
    private class PicCallBacKImpl implements Camera.PictureCallback {
        @Override
        public void onPictureTaken(final byte[] data, Camera camera) {
            isPreview = false;
            new Thread(new Runnable() {
                @Override
                public void run() {
                    String filePath = ImageUtil.getSaveImagePath();
                    File file = new File(filePath);
                    FileOutputStream fos = null;
                    try {
                        fos = new FileOutputStream(file, true);
                        fos.write(data);
                        ImageUtil.saveImage(file, data, filePath);
                        fos.close();

                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }).start();

            mCamera.startPreview(); // restart the preview, otherwise no further pictures can be taken
            isPreview = true;
        }
    }


    /**
     * Log the supported preview sizes
     *
     * @param params
     */
    public void printSupportPreviewSize(Camera.Parameters params) {
        List<Camera.Size> previewSizes = params.getSupportedPreviewSizes();
        for (int i = 0; i < previewSizes.size(); i++) {
            Camera.Size size = previewSizes.get(i);
            Log.i("camerautil", "previewSizes:width = " + size.width + " height = " + size.height);
        }

    }

    /**
     * Log the supported picture sizes
     *
     * @param params
     */
    public void printSupportPictureSize(Camera.Parameters params) {
        List<Camera.Size> pictureSizes = params.getSupportedPictureSizes();
        for (int i = 0; i < pictureSizes.size(); i++) {
            Camera.Size size = pictureSizes.get(i);
            Log.i("camerautil", "pictureSizes:width = " + size.width
                    + " height = " + size.height);
        }
    }

    /**
     * Tap-to-focus
     *
     * @param autoFocusCallback
     * @return
     */
    public boolean autoFocus(Camera.AutoFocusCallback autoFocusCallback) {
        Log.d("CameraUtil", "autoFocus");
        Camera.Parameters parameters = mCamera.getParameters();
        List<String> supportMode = parameters.getSupportedFocusModes();
        if (supportMode.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
            String focusMode = parameters.getFocusMode();
            if (!Camera.Parameters.FOCUS_MODE_AUTO.equals(focusMode)) {
                parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
                mCamera.setParameters(parameters);
            }
            if (autoFocusCallback != null) {
                mCamera.autoFocus(autoFocusCallback);
            }
            return true;
        }
        return false;
    }

    /**
     * Set the focus and metering area from a touch event
     * @param mContext
     * @param event
     */
    public void setFocusArea(Context mContext, MotionEvent event) {
        if (!CameraUtil.isSupportFocusArea() || mCamera == null) {
            return;
        }
        Camera.Parameters parameters = mCamera.getParameters();
        int ax = (int) (2000f * event.getRawX() / mContext.getResources().getDisplayMetrics().widthPixels - 1000);
        int ay = (int) (2000f * event.getRawY() / mContext.getResources().getDisplayMetrics().heightPixels - 1000);
//        Log.d("renlei",parameters.getMaxNumFocusAreas()+"");
        int rawx = (int) event.getRawX();
        int rawy = (int) event.getRawY();
        Log.d("renlei", "widthpix" + mContext.getResources().getDisplayMetrics().widthPixels + "heightpix" + mContext.getResources().getDisplayMetrics().heightPixels);
        Log.d("renlei", "rawx" + rawx + "rawy" + rawy);
        // Clamp so the 100px focus rect stays within the (-1000, 1000) driver coordinate range
        if (ay > 900) {
            ay = 900;
        } else if (ay < -900) {
            ay = -900;
        }

        if (ax < -900) {
            ax = -900;
        } else if (ax > 900) {
            ax = 900;
        }
        Log.d("renlei09", "ax" + ax + "ay" + ay);
        Camera.Area area = new Camera.Area(new Rect(ax - 100, ay - 100, ax + 100, ay + 100), 1000);
        List<Camera.Area> areas = new ArrayList<Camera.Area>();
        areas.add(area);
        parameters.setFocusAreas(areas);
        parameters.setMeteringAreas(areas);
        mCamera.setParameters(parameters);
    }

    /**
     * Whether the SDK version supports focus areas (API 14+)
     *
     * @return
     */
    public static boolean isSupportFocusArea() {
        return Build.VERSION.SDK_INT >= 14;
    }

    /**
     * Cycle the flash mode (off -> on -> auto -> off) and update the icon
     * @param imageView
     */
    public void setFlashMode(ImageView imageView) {
        Camera.Parameters parameters = mCamera.getParameters();
        String flashMode = parameters.getFlashMode();
        Log.d("setFlashMode  ", flashMode);
        if (flashMode != null) {
            if (flashMode.equals(Camera.Parameters.FLASH_MODE_OFF)) {
                imageView.setImageResource(R.drawable.camera_setting_flash_on_normal);
                parameters.setFlashMode(Camera.Parameters.FLASH_MODE_ON);
            } else if (flashMode.equals(Camera.Parameters.FLASH_MODE_ON)) {
                imageView.setImageResource(R.drawable.camera_setting_flash_auto_normal);
                parameters.setFlashMode(Camera.Parameters.FLASH_MODE_AUTO);
            } else if (flashMode.equals(Camera.Parameters.FLASH_MODE_AUTO)) {
                parameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
                imageView.setImageResource(R.drawable.camera_setting_flash_off_normal);
            } else {
                imageView.setImageResource(R.drawable.camera_setting_flash_off_normal);
                parameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
            }
            mCamera.setParameters(parameters);
        }
    }

    public int getCameraId() {
        return cameraId;
    }


    /**
     * Log the supported focus modes
     *
     * @param params
     */
    public void printSupportFocusMode(Camera.Parameters params) {
        List<String> focusModes = params.getSupportedFocusModes();
        for (String mode : focusModes) {
            Log.i("CameraUtil", "focusModes--" + mode);
        }
    }

    public Camera.CameraInfo getmCameraInfo(){
        return mCameraInfo;
    }

    public Camera getCamera(){
        return mCamera;
    }
    public Camera.Parameters getCameraParameters(){
        if (mCamera!=null){
            return mCamera.getParameters();
        }
        return null;
    }
}
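doStartPreview() hard-codes setDisplayOrientation(90), which is fine for a portrait-locked Activity on most phones. The value can also be computed from the camera sensor's mounting orientation and the current display rotation. The sketch below follows the approach described in the documentation of Camera.setDisplayOrientation() and assumes it is called from an Activity; it is not part of the original project:

package com.example.camera.util;

import android.app.Activity;
import android.hardware.Camera;
import android.view.Surface;

/**
 * Sketch: compute the preview display orientation instead of hard-coding 90 degrees.
 */
public class OrientationHelper {
    public static void setCameraDisplayOrientation(Activity activity, int cameraId, Camera camera) {
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:   degrees = 0;   break;
            case Surface.ROTATION_90:  degrees = 90;  break;
            case Surface.ROTATION_180: degrees = 180; break;
            case Surface.ROTATION_270: degrees = 270; break;
        }
        int result;
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + degrees) % 360;
            result = (360 - result) % 360; // compensate for the front camera mirror
        } else {
            result = (info.orientation - degrees + 360) % 360;
        }
        camera.setDisplayOrientation(result);
    }
}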

Face detection is implemented through Google's built-in face detection interface (Camera.FaceDetectionListener):

package com.example.camera.util;

import android.content.Context;
import android.hardware.Camera;
import android.os.Handler;
import android.os.Message;
import android.util.Log;


/**
 * Created by renlei
 * DATE: 15-11-10
 * Time: 4:49 PM
 * Email: renlei0109@yeah.net
 */
public class GoogleDetectListenerImpl implements Camera.FaceDetectionListener{
    private Handler mHandler; // used to deliver results to the main thread
    private Context mContext;

    public GoogleDetectListenerImpl(Context mContext,Handler mHandler) {
        this.mHandler = mHandler;
        this.mContext = mContext;
    }

    @Override
    public void onFaceDetection(Camera.Face[] faces, Camera camera) {
        if (faces!=null){
            Message msg = mHandler.obtainMessage();
            msg.what = CameraUtil.RECEIVE_FACE_MSG;
            msg.obj = faces;
            msg.sendToTarget();
        }

    }
}
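One detail worth handling: if the preview is stopped or the camera is switched while detection is running, it is safer to stop detection first. A minimal sketch, assuming the CameraUtil accessors shown above (it could go at the top of changeCamera() in CameraActivity):

        // Stop face detection before stopping the preview or switching cameras.
        Camera camera = CameraUtil.getInstance().getCamera();
        if (camera != null) {
            camera.stopFaceDetection();
            camera.setFaceDetectionListener(null);
        }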
The detected face regions are drawn by a FaceView overlay:

package com.example.camera.util;

import android.content.Context;
import android.graphics.*;
import android.graphics.drawable.Drawable;
import android.hardware.Camera;
import android.util.AttributeSet;
import android.util.Log;
import android.view.View;
import android.widget.ImageView;
import com.example.camera.R;

/**
 * Created by renlei
 * DATE: 15-11-11
 * Time: 11:34 AM
 * Email: renlei0109@yeah.net
 */
public class FaceView extends ImageView {
    private Context mContext;
    private Camera.Face[] mFaces;
    private Matrix mMatrix = new Matrix();
    private boolean mirror;
    private Paint mLinePaint;

    private RectF rectF = new RectF();
    private Drawable mFaceIndicator = null;

    public FaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        initPaint();
        this.mContext = context;
        mFaceIndicator = mContext.getResources().getDrawable(R.drawable.ic_face_find_2);
    }

    public void setFaces(Camera.Face[] faces) {
        this.mFaces = faces;
        Log.d("Faceview", "invalidate");
        invalidate();
    }
    public void clearFaces(){
        mFaces = null;
        invalidate();
    }
    @Override
    protected void onDraw(Canvas canvas) {
        if (mFaces == null || mFaces.length < 1) {
            return;
        }
        Log.d("renlei", "onDraw " + mFaces.length);
        int id = CameraUtil.getInstance().getCameraId();
        mirror = (id == Camera.CameraInfo.CAMERA_FACING_FRONT);
        canvas.save();
        prepareMatrix();
        for (int i = 0; i < mFaces.length; i++) {
            // Face rects arrive in driver coordinates (-1000..1000); map them to view pixels.
            rectF.set(mFaces[i].rect);
            mMatrix.mapRect(rectF);
            mFaceIndicator.setBounds(Math.round(rectF.left), Math.round(rectF.top),
                    Math.round(rectF.right), Math.round(rectF.bottom));
            mFaceIndicator.draw(canvas);
        }
        canvas.restore();
        super.onDraw(canvas);
    }



    /**
     * <p>Here is the matrix to convert driver coordinates to View coordinates
     * in pixels.</p>
     * <pre>
     * Matrix matrix = new Matrix();
     * CameraInfo info = CameraHolder.instance().getCameraInfo()[cameraId];
     * // Need mirror for front camera.
     * boolean mirror = (info.facing == CameraInfo.CAMERA_FACING_FRONT);
     * matrix.setScale(mirror ? -1 : 1, 1);
     * // This is the value for android.hardware.Camera.setDisplayOrientation.
     * matrix.postRotate(displayOrientation);
     * // Camera driver coordinates range from (-1000, -1000) to (1000, 1000).
     * // UI coordinates range from (0, 0) to (width, height).
     * matrix.postScale(view.getWidth() / 2000f, view.getHeight() / 2000f);
     * matrix.postTranslate(view.getWidth() / 2f, view.getHeight() / 2f);
     * </pre>
     */
    private void prepareMatrix() {
        mMatrix.setScale(mirror ? -1 : 1, 1);
        // Must match the value passed to Camera.setDisplayOrientation() (90 in this project).
        mMatrix.postRotate(90);
        mMatrix.postScale(getWidth() / 2000f, getHeight() / 2000f);
        mMatrix.postTranslate(getWidth() / 2f, getHeight() / 2f);
    }

    private void initPaint() {
        // Paint for drawing plain rectangles around faces; currently unused because
        // onDraw() uses the drawable indicator instead (see the note after this class).
        mLinePaint = new Paint(Paint.ANTI_ALIAS_FLAG);
        int color = Color.rgb(98, 212, 68);
        mLinePaint.setColor(color);
        mLinePaint.setStyle(Paint.Style.STROKE);
        mLinePaint.setStrokeWidth(5f);
        mLinePaint.setAlpha(180);
    }
}
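A small side note: mLinePaint is initialized but never used, because onDraw() draws the drawable indicator instead. If you prefer a plain rectangle around each face, the loop body in onDraw() could draw with the paint, for example:

                // Alternative to the drawable indicator inside the onDraw() loop:
                canvas.drawRect(rectF, mLinePaint);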

Project download:

http://download.csdn.net/detail/renlei0109/9280637
