Fixing the camera preview orientation of OpenCV's JavaCameraView

Most of the solutions found online turned out to be broken, so I spent quite a while digging through the source code to fix this. I approached it by tracing the whole camera startup and data-delivery process; I'll cover the key points here and attach the two modified source files at the end.

First, one concept needs to be clear.

(The original post shows a diagram here; the small circle in it is the Home button, not the camera :) )

The question now is where to perform the rotation, and how. That requires understanding how the JavaCameraView class works. JavaCameraView implements connectCamera, an abstract method of its parent class CameraBridgeViewBase, and this method does two main things:

1. Initialize the camera: choose which camera to open, choose the preview frame size, and so on. This runs on the UI thread, so onPreviewFrame is also invoked on the UI thread. Once the camera is started, data arrives at onPreviewFrame, which does not process it: it only stores the data and notifies another thread to do the processing.

2. Start the thread that processes the data stored by onPreviewFrame.
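The hand-off between these two steps is a classic producer/consumer pattern. Below is a minimal, self-contained sketch of that pattern with invented class and field names — it is not the OpenCV source; the real JavaCameraView swaps between two Mat buffers instead of blocking the producer:

```java
// Sketch of the hand-off JavaCameraView uses: the camera callback
// (producer) stores a frame and notifies; a worker thread (consumer)
// wakes up and processes it. All names here are invented for the demo.
public class FrameHandoff {
    private final Object lock = new Object();
    private int[] pendingFrame = null;   // stands in for the stored Mat
    private boolean stopped = false;
    public int processedCount = 0;

    // called by the "camera" thread, like onPreviewFrame
    public void onFrame(int[] data) throws InterruptedException {
        synchronized (lock) {
            // demo only: block until the worker took the previous frame
            // (the real code swaps buffers rather than blocking)
            while (pendingFrame != null) lock.wait();
            pendingFrame = data;         // store the data...
            lock.notifyAll();            // ...and wake the worker
        }
    }

    public void stop() {
        synchronized (lock) { stopped = true; lock.notifyAll(); }
    }

    // the worker loop, playing the role of the processing thread
    public void workerLoop() {
        while (true) {
            synchronized (lock) {
                while (pendingFrame == null && !stopped) {
                    try { lock.wait(); } catch (InterruptedException e) { return; }
                }
                if (pendingFrame == null) return;  // stopped, nothing left
                pendingFrame = null;               // take the frame
                processedCount++;                  // "process" it
                lock.notifyAll();                  // slot is free again
            }
        }
    }

    public static void main(String[] args) throws Exception {
        FrameHandoff h = new FrameHandoff();
        Thread worker = new Thread(h::workerLoop);
        worker.start();
        for (int i = 0; i < 5; i++) h.onFrame(new int[]{i});
        h.stop();
        worker.join();
        System.out.println(h.processedCount);
    }
}
```

Because the demo producer waits for the slot to free up, every frame is processed exactly once; the real implementation instead overwrites the idle buffer so the camera thread never stalls.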

The problem with JavaCameraView is that neither step distinguishes between the SurfaceView's coordinate system and the camera frame's. For example, when step 1 chooses the preview size, the preview height must not exceed the SurfaceView's width: in portrait mode the camera frame's height corresponds to the SurfaceView's width, and the frame's width to the SurfaceView's height. Every place that uses the SurfaceView's width and height therefore needs changing, because the original code uses the camera frame's width and height throughout.

Most importantly, step 1 calls the AllocateCache function to create a Bitmap; in step 2 each camera frame is converted into this Bitmap, which is then drawn directly onto the SurfaceView's canvas. This Bitmap's width and height should be the reverse of the camera frame's, because the frame data is used only after being rotated 90° clockwise. Accordingly, in step 2 the frame data must be rotated clockwise before the frame-to-Bitmap conversion; after the rotation the frame and the Bitmap agree.

I chose to implement the rotation in the JavaCameraFrame class. That class is simple: it holds one Mat member, and whenever onPreviewFrame receives data it stores it into this Mat in YUV420sp format. The class also implements the gray() and rgba() methods of the CvCameraViewFrame interface, which return the grayscale and RGBA images derived from the YUV420sp data, still in the camera frame's orientation. I rotate the results inside gray() and rgba(). This is a good place to do it: the object handed to the user of JavaCameraView through onCameraFrame is a JavaCameraFrame, so inside onCameraFrame the user calls gray() or rgba() as usual and gets a correctly oriented Mat; and since this happens before the frame-to-Bitmap conversion, the data is already in the right orientation by the time it is converted.
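The rotation itself is just an index remapping: pixel (row r, col c) of an H×W frame moves to (row c, col H−1−r) of the rotated W×H output, which is what OpenCV's transpose-then-horizontal-flip combination computes. A tiny plain-Java illustration of that mapping, with no OpenCV dependency and purely for illustration:

```java
// 90-degree clockwise rotation of a single-channel frame:
// source pixel (row r, col c) of an H x W frame lands at
// (row c, col H-1-r) in the rotated W x H output.
public class Rotate90 {
    static byte[] rotateCW(byte[] src, int w, int h) {
        byte[] dst = new byte[w * h];          // output is h wide, w tall
        for (int r = 0; r < h; r++) {
            for (int c = 0; c < w; c++) {
                dst[c * h + (h - 1 - r)] = src[r * w + c];
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        // 2x3 frame:      rotated 90 CW -> 3x2 frame:
        // 1 2 3           4 1
        // 4 5 6           5 2
        //                 6 3
        byte[] src = {1, 2, 3, 4, 5, 6};
        byte[] dst = rotateCW(src, 3, 2);
        StringBuilder sb = new StringBuilder();
        for (byte b : dst) sb.append(b);
        System.out.println(sb);   // row-major: 4 1 5 2 6 3
    }
}
```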

My explanation may not be the clearest, so here is the code. Every change I made is marked with a #Modified comment; the modifications are few.
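For reference, the effect of the #Modified comparison in calculateCameraFrameSize — limiting the frame width by the surface height and the frame height by the surface width — can be sketched in isolation like this (invented names, simplified from the real method):

```java
// Sketch of the #Modified size selection: because the camera frame is
// rotated 90 degrees relative to the portrait surface, the frame WIDTH
// is limited by the surface HEIGHT and the frame HEIGHT by the surface
// WIDTH. Among the allowed sizes, the largest is chosen.
public class SizeSelect {
    static int[] pick(int[][] supported, int surfaceW, int surfaceH) {
        int bestW = 0, bestH = 0;
        int maxAllowedWidth = surfaceH;   // frame width vs surface height
        int maxAllowedHeight = surfaceW;  // frame height vs surface width
        for (int[] s : supported) {
            int w = s[0], h = s[1];
            if (w <= maxAllowedWidth && h <= maxAllowedHeight
                    && w >= bestW && h >= bestH) {
                bestW = w;
                bestH = h;
            }
        }
        return new int[]{bestW, bestH};
    }

    public static void main(String[] args) {
        // portrait surface 1080x1920; landscape-oriented camera sizes
        int[][] sizes = {{640, 480}, {1280, 720}, {1920, 1080}, {2560, 1440}};
        int[] best = pick(sizes, 1080, 1920);
        System.out.println(best[0] + "x" + best[1]);
    }
}
```

With the unmodified comparison, 1920×1080 would be rejected against a 1080-wide surface; with the swapped comparison it is correctly accepted.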

 

1.CameraBridgeViewBase

  1 package org.opencv.android;
  2 
  3 import java.util.List;
  4 
  5 import org.opencv.BuildConfig;
  6 import org.opencv.R;
  7 import org.opencv.core.Mat;
  8 import org.opencv.core.Size;
  9 
 10 import android.app.Activity;
 11 import android.app.AlertDialog;
 12 import android.content.Context;
 13 import android.content.DialogInterface;
 14 import android.content.res.TypedArray;
 15 import android.graphics.Bitmap;
 16 import android.graphics.Canvas;
 17 import android.graphics.Rect;
 18 import android.util.AttributeSet;
 19 import android.util.Log;
 20 import android.view.SurfaceHolder;
 21 import android.view.SurfaceView;
 22 
 23 /**
 24  * This is a basic class, implementing the interaction with Camera and OpenCV library.
 25  * The main responsibility of it - is to control when camera can be enabled, process the frame,
 26  * call external listener to make any adjustments to the frame and then draw the resulting
 27  * frame to the screen.
 28  * The clients shall implement CvCameraViewListener.
 29  */
 30 public abstract class CameraBridgeViewBase extends SurfaceView implements SurfaceHolder.Callback {
 31 
 32     private static final String TAG = "CameraBridge";
 33     private static final int MAX_UNSPECIFIED = -1;
 34     private static final int STOPPED = 0;
 35     private static final int STARTED = 1;
 36 
 37     private int mState = STOPPED;
 38     private Bitmap mCacheBitmap;
 39     private CvCameraViewListener2 mListener;
 40     private boolean mSurfaceExist;
 41     private final Object mSyncObject = new Object();
 42 
 43     protected int mFrameWidth;
 44     protected int mFrameHeight;
 45     protected int mMaxHeight;
 46     protected int mMaxWidth;
 47     protected float mScale = 0;
 48     protected int mPreviewFormat = RGBA;
 49     protected int mCameraIndex = CAMERA_ID_ANY;
 50     protected boolean mEnabled;
 51     protected FpsMeter mFpsMeter = null;
 52 
 53     public static final int CAMERA_ID_ANY   = -1;
 54     public static final int CAMERA_ID_BACK  = 99;
 55     public static final int CAMERA_ID_FRONT = 98;
 56     public static final int RGBA = 1;
 57     public static final int GRAY = 2;
 58 
 59     public CameraBridgeViewBase(Context context, int cameraId) {
 60         super(context);
 61         mCameraIndex = cameraId;
 62         getHolder().addCallback(this);
 63         mMaxWidth = MAX_UNSPECIFIED;
 64         mMaxHeight = MAX_UNSPECIFIED;
 65     }
 66 
 67     public CameraBridgeViewBase(Context context, AttributeSet attrs) {
 68         super(context, attrs);
 69 
 70         int count = attrs.getAttributeCount();
 71         Log.d(TAG, "Attr count: " + Integer.valueOf(count));
 72 
 73         TypedArray styledAttrs = getContext().obtainStyledAttributes(attrs, R.styleable.CameraBridgeViewBase);
 74         if (styledAttrs.getBoolean(R.styleable.CameraBridgeViewBase_show_fps, false))
 75             enableFpsMeter();
 76 
 77         mCameraIndex = styledAttrs.getInt(R.styleable.CameraBridgeViewBase_camera_id, -1);
 78 
 79         getHolder().addCallback(this);
 80         mMaxWidth = MAX_UNSPECIFIED;
 81         mMaxHeight = MAX_UNSPECIFIED;
 82         styledAttrs.recycle();
 83     }
 84 
 85     /**
 86      * Sets the camera index
 87      * @param cameraIndex new camera index
 88      */
 89     public void setCameraIndex(int cameraIndex) {
 90         this.mCameraIndex = cameraIndex;
 91     }
 92 
 93     public interface CvCameraViewListener {
 94         /**
 95          * This method is invoked when camera preview has started. After this method is invoked
 96          * the frames will start to be delivered to client via the onCameraFrame() callback.
 97          * @param width -  the width of the frames that will be delivered
 98          * @param height - the height of the frames that will be delivered
 99          */
100         public void onCameraViewStarted(int width, int height);
101 
102         /**
103          * This method is invoked when camera preview has been stopped for some reason.
104          * No frames will be delivered via onCameraFrame() callback after this method is called.
105          */
106         public void onCameraViewStopped();
107 
108         /**
109          * This method is invoked when delivery of the frame needs to be done.
110          * The returned values - is a modified frame which needs to be displayed on the screen.
111          * TODO: pass the parameters specifying the format of the frame (BPP, YUV or RGB and etc)
112          */
113         public Mat onCameraFrame(Mat inputFrame);
114     }
115 
116     public interface CvCameraViewListener2 {
117         /**
118          * This method is invoked when camera preview has started. After this method is invoked
119          * the frames will start to be delivered to client via the onCameraFrame() callback.
120          * @param width -  the width of the frames that will be delivered
121          * @param height - the height of the frames that will be delivered
122          */
123         public void onCameraViewStarted(int width, int height);
124 
125         /**
126          * This method is invoked when camera preview has been stopped for some reason.
127          * No frames will be delivered via onCameraFrame() callback after this method is called.
128          */
129         public void onCameraViewStopped();
130 
131         /**
132          * This method is invoked when delivery of the frame needs to be done.
133          * The returned values - is a modified frame which needs to be displayed on the screen.
134          * TODO: pass the parameters specifying the format of the frame (BPP, YUV or RGB and etc)
135          */
136         public Mat onCameraFrame(CvCameraViewFrame inputFrame);
137     };
138 
139     protected class CvCameraViewListenerAdapter implements CvCameraViewListener2  {
140         public CvCameraViewListenerAdapter(CvCameraViewListener oldStypeListener) {
141             mOldStyleListener = oldStypeListener;
142         }
143 
144         public void onCameraViewStarted(int width, int height) {
145             mOldStyleListener.onCameraViewStarted(width, height);
146         }
147 
148         public void onCameraViewStopped() {
149             mOldStyleListener.onCameraViewStopped();
150         }
151 
152         public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
153              Mat result = null;
154              switch (mPreviewFormat) {
155                 case RGBA:
156                     result = mOldStyleListener.onCameraFrame(inputFrame.rgba());
157                     break;
158                 case GRAY:
159                     result = mOldStyleListener.onCameraFrame(inputFrame.gray());
160                     break;
161                 default:
162                     Log.e(TAG, "Invalid frame format! Only RGBA and Gray Scale are supported!");
163             };
164 
165             return result;
166         }
167 
168         public void setFrameFormat(int format) {
169             mPreviewFormat = format;
170         }
171 
172         private int mPreviewFormat = RGBA;
173         private CvCameraViewListener mOldStyleListener;
174     };
175 
176     /**
177      * This class interface is abstract representation of single frame from camera for onCameraFrame callback
178      * Attention: Do not use objects, that represents this interface out of onCameraFrame callback!
179      */
180     public interface CvCameraViewFrame {
181 
182         /**
183          * This method returns RGBA Mat with frame
184          */
185         public Mat rgba();
186 
187         /**
188          * This method returns single channel gray scale Mat with frame
189          */
190         public Mat gray();
191     };
192 
193     /*
194     Overrides of the SurfaceHolder.Callback methods
195      */
196     /*
197     Access to the underlying surface is provided via the SurfaceHolder interface,
198     which can be retrieved by calling getHolder().
199     The Surface will be created for you while the SurfaceView's window is visible;
200     you should implement SurfaceHolder.Callback.surfaceCreated(SurfaceHolder)
201     and SurfaceHolder.Callback.surfaceDestroyed(SurfaceHolder) to discover when the
202     Surface is created and destroyed as the window is shown and hidden.
203     One of the purposes of this class is to provide a surface in which a secondary
204     thread can render into the screen. If you are going to use it this way,
205     you need to be aware of some threading semantics:
206     All SurfaceView and SurfaceHolder.Callback methods will be called from
207     the thread running the SurfaceView's window (typically the main thread of the application). They thus need to correctly synchronize with any state that is also touched by the drawing thread.
208     You must ensure that the drawing thread only touches the underlying Surface
209     while it is valid -- between SurfaceHolder.Callback.surfaceCreated()
210     and SurfaceHolder.Callback.surfaceDestroyed().
211      */
212     /*
213     This is called immediately after any structural changes (format or size)
214     have been made to the surface. You should at this point update the imagery
215     in the surface. This method is always called at least once,
216     after surfaceCreated(SurfaceHolder).
217      */
218     public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
219         Log.d(TAG, "call surfaceChanged event");
220         synchronized(mSyncObject) {
221             if (!mSurfaceExist) {
222                 mSurfaceExist = true;
223                 checkCurrentState();
224             } else {
225                 /** Surface changed. We need to stop camera and restart with new parameters */
226                 /* Pretend that old surface has been destroyed */
227                 mSurfaceExist = false;
228                 checkCurrentState();
229                 /* Now use new surface. Say we have it now */
230                 mSurfaceExist = true;
231                 checkCurrentState();
232             }
233         }
234     }
235 
236     /*
237     This is called immediately after the surface is first created.
238     Implementations of this should start up whatever rendering code they desire.
239     Note that only one thread can ever draw into a Surface,
240     so you should not draw into the Surface here if your normal rendering
241     will be in another thread.
242      */
243     public void surfaceCreated(SurfaceHolder holder) {
244         /* Do nothing. Wait until surfaceChanged delivered */
245     }
246 
247     /*
248     This is called immediately before a surface is being destroyed.
249     After returning from this call, you should no longer try to access this surface.
250     If you have a rendering thread that directly accesses the surface,
251     you must ensure that thread is no longer touching the Surface before returning
252     from this function.
253      */
254     public void surfaceDestroyed(SurfaceHolder holder) {
255         synchronized(mSyncObject) {
256             mSurfaceExist = false;
257             checkCurrentState();
258         }
259     }
260 
261     /**
262      * This method is provided for clients, so they can enable the camera connection.
263      * The actual onCameraViewStarted callback will be delivered only after both this method is called and surface is available
264      */
265     public void enableView() {
266         synchronized(mSyncObject) {
267             mEnabled = true;
268             checkCurrentState();
269         }
270     }
271 
272     /**
273      * This method is provided for clients, so they can disable camera connection and stop
274      * the delivery of frames even though the surface view itself is not destroyed and still stays on the scren
275      */
276     public void disableView() {
277         synchronized(mSyncObject) {
278             mEnabled = false;
279             checkCurrentState();
280         }
281     }
282 
283     /**
284      * This method enables label with fps value on the screen
285      */
286     public void enableFpsMeter() {
287         if (mFpsMeter == null) {
288             mFpsMeter = new FpsMeter();
289             mFpsMeter.setResolution(mFrameWidth, mFrameHeight);
290         }
291     }
292 
293     public void disableFpsMeter() {
294             mFpsMeter = null;
295     }
296 
297     /**
298      *
299      * @param listener
300      */
301 
302     public void setCvCameraViewListener(CvCameraViewListener2 listener) {
303         mListener = listener;
304     }
305 
306     public void setCvCameraViewListener(CvCameraViewListener listener) {
307         CvCameraViewListenerAdapter adapter = new CvCameraViewListenerAdapter(listener);
308         adapter.setFrameFormat(mPreviewFormat);
309         mListener = adapter;
310     }
311 
312     /**
313      * This method sets the maximum size that camera frame is allowed to be. When selecting
314      * size - the biggest size which less or equal the size set will be selected.
315      * As an example - we set setMaxFrameSize(200,200) and we have 176x152 and 320x240 sizes. The
316      * preview frame will be selected with 176x152 size.
317      * This method is useful when need to restrict the size of preview frame for some reason (for example for video recording)
318      * @param maxWidth - the maximum width allowed for camera frame.
319      * @param maxHeight - the maximum height allowed for camera frame
320      */
321     public void setMaxFrameSize(int maxWidth, int maxHeight) {
322         mMaxWidth = maxWidth;
323         mMaxHeight = maxHeight;
324     }
325 
326     public void SetCaptureFormat(int format)
327     {
328         mPreviewFormat = format;
329         if (mListener instanceof CvCameraViewListenerAdapter) {
330             CvCameraViewListenerAdapter adapter = (CvCameraViewListenerAdapter) mListener;
331             adapter.setFrameFormat(mPreviewFormat);
332         }
333     }
334 
335     /**
336      * Called when mSyncObject lock is held
337      */
338     private void checkCurrentState() {
339         Log.d(TAG, "call checkCurrentState");
340         int targetState;
341         //enableView() sets mEnabled to true; surfaceChanged() sets mSurfaceExist
342         //getVisibility() == VISIBLE appears to always hold
343         //when the surface is ready and the client has called enableView(), targetState becomes STARTED
344         if (mEnabled && mSurfaceExist && getVisibility() == VISIBLE) {
345             targetState = STARTED;
346         } else {
347             targetState = STOPPED;
348         }
349 
350         //mState starts out as STOPPED
351         //if the target state differs from the current one, exit the current state and enter the target
352         if (targetState != mState) {
353             /* The state change detected. Need to exit the current state and enter target state */
354             processExitState(mState);
355             mState = targetState;
356             processEnterState(mState);
357         }
358     }
359 
360     private void processEnterState(int state) {
361         Log.d(TAG, "call processEnterState: " + state);
362         switch(state) {
363         case STARTED:
364             //this is where the camera is actually started
365             onEnterStartedState();
366             if (mListener != null) {
367             //after entering STARTED, if the CvCameraViewListener2 mListener member is
368             //not null, call its onCameraViewStarted method to report that the camera started
369                 mListener.onCameraViewStarted(mFrameWidth, mFrameHeight);
370             }
371             break;
372         case STOPPED:
373             onEnterStoppedState();
374             if (mListener != null) {
375             //after entering STOPPED, if the CvCameraViewListener2 mListener member is
376             //not null, call its onCameraViewStopped method to report that the camera stopped
377                 mListener.onCameraViewStopped();
378             }
379             break;
380         };
381     }
382 
383     private void processExitState(int state) {
384         Log.d(TAG, "call processExitState: " + state);
385         switch(state) {
386         case STARTED:
387             onExitStartedState();
388             break;
389         case STOPPED:
390             onExitStoppedState();
391             break;
392         };
393     }
394 
395     private void onEnterStoppedState() {
396         /* nothing to do */
397     }
398 
399     private void onExitStoppedState() {
400         /* nothing to do */
401     }
402 
403     // NOTE: The order of bitmap constructor and camera connection is important for android 4.1.x
404     // Bitmap must be constructed before surface
405     private void onEnterStartedState() {
406         Log.d(TAG, "call onEnterStartedState");
407         /* Connect camera */
408         //connectCamera takes the width and height of the CameraBridgeViewBase
409         if (!connectCamera(getWidth(), getHeight())) {
410             AlertDialog ad = new AlertDialog.Builder(getContext()).create();
411             ad.setCancelable(false); // This blocks the 'BACK' button
412             ad.setMessage("It seems that your device does not support camera (or it is locked). Application will be closed.");
413             ad.setButton(DialogInterface.BUTTON_NEUTRAL,  "OK", new DialogInterface.OnClickListener() {
414                 public void onClick(DialogInterface dialog, int which) {
415                     dialog.dismiss();
416                     ((Activity) getContext()).finish();
417                 }
418             });
419             ad.show();
420 
421         }
422     }
423 
424     private void onExitStartedState() {
425         disconnectCamera();
426         if (mCacheBitmap != null) {
427             mCacheBitmap.recycle();
428         }
429     }
430 
431     /*
432         onPreviewFrame is called on the UI thread; after storing the data it
433         notifies another thread, which then calls this method to process the data.
434         onPreviewFrame acts as the producer and the other thread as the consumer.
435         when the JavaCameraView class is used, frame is of type JavaCameraFrame,
436         which implements the interface using OpenCV
437      */
438     /**
439      * This method shall be called by the subclasses when they have valid
440      * object and want it to be delivered to external client (via callback) and
441      * then displayed on the screen.
442      * @param frame - the current frame to be delivered
443      */
444     protected void deliverAndDrawFrame(CvCameraViewFrame frame) {
445         Mat modified;
446 
447         if (mListener != null) {
448             //CvCameraViewListener2 mListener is supplied by the client
449             //call the client's overridden callback here and take its return value
450             //all of this runs on the data-processing thread
451             modified = mListener.onCameraFrame(frame);
452         } else {
453             //if the client set no CvCameraViewListener2, i.e. it will not process preview data,
454             //modified is set to the RGBA Mat
455             //converted from the data delivered by onPreviewFrame
456             modified = frame.rgba();
457         }
458 
459         //log the sizes of the Mat and of the Bitmap
460         Log.d("FunnyAR","mScale: "+mScale+" modified.rows: "+modified.rows()
461                 +" modified.cols: "+modified.cols()+" mCacheBitmap.getWidth(): "+
462                 mCacheBitmap.getWidth()+" mCacheBitmap.getHeight() "+
463                 mCacheBitmap.getHeight());
464 
465         //flags whether converting modified to the Bitmap succeeded
466         boolean bmpValid = true;
467         //if we actually have a modified Mat, convert it to the Bitmap
468         if (modified != null) {
469             try {
470                 Utils.matToBitmap(modified, mCacheBitmap);
471             } catch(Exception e) {
472                 Log.e(TAG, "Mat type: " + modified);
473                 Log.e(TAG, "Bitmap type: " + mCacheBitmap.getWidth() + "*" + mCacheBitmap.getHeight());
474                 Log.e(TAG, "Utils.matToBitmap() throws an exception: " + e.getMessage());
475                 bmpValid = false;
476             }
477         }
478         //if the conversion succeeded, draw it onto the surface via the canvas
479         if (bmpValid && mCacheBitmap != null) {
480             Canvas canvas = getHolder().lockCanvas();
481             if (canvas != null) {
482                 canvas.drawColor(0, android.graphics.PorterDuff.Mode.CLEAR);
483                 if (BuildConfig.DEBUG)
484                     Log.d(TAG, "mStretch value: " + mScale);
485 
486                 if (mScale != 0) {
487                     canvas.drawBitmap(mCacheBitmap, new Rect(0,0,mCacheBitmap.getWidth(), mCacheBitmap.getHeight()),
488                          new Rect((int)((canvas.getWidth() - mScale*mCacheBitmap.getWidth()) / 2),
489                          (int)((canvas.getHeight() - mScale*mCacheBitmap.getHeight()) / 2),
490                          (int)((canvas.getWidth() - mScale*mCacheBitmap.getWidth()) / 2 + mScale*mCacheBitmap.getWidth()),
491                          (int)((canvas.getHeight() - mScale*mCacheBitmap.getHeight()) / 2 + mScale*mCacheBitmap.getHeight())), null);
492                 } else {
493                      canvas.drawBitmap(mCacheBitmap, new Rect(0,0,mCacheBitmap.getWidth(), mCacheBitmap.getHeight()),
494                          new Rect((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,
495                          (canvas.getHeight() - mCacheBitmap.getHeight()) / 2,
496                          (canvas.getWidth() - mCacheBitmap.getWidth()) / 2 + mCacheBitmap.getWidth(),
497                          (canvas.getHeight() - mCacheBitmap.getHeight()) / 2 + mCacheBitmap.getHeight()), null);
498                 }
499 
500                 if (mFpsMeter != null) {
501                     mFpsMeter.measure();
502                     mFpsMeter.draw(canvas, 20, 30);
503                 }
504                 getHolder().unlockCanvasAndPost(canvas);
505             }
506         }
507     }
508 
509     /**
510      * This method is invoked shall perform concrete operation to initialize the camera.
511      * CONTRACT: as a result of this method variables mFrameWidth and mFrameHeight MUST be
512      * initialized with the size of the Camera frames that will be delivered to external processor.
513      * @param width - the width of this SurfaceView
514      * @param height - the height of this SurfaceView
515      */
516     //the concrete camera startup procedure is implemented by subclasses
517     protected abstract boolean connectCamera(int width, int height);
518 
519     /**
520      * Disconnects and release the particular camera object being connected to this surface view.
521      * Called when syncObject lock is held
522      */
523     protected abstract void disconnectCamera();
524 
525     // NOTE: On Android 4.1.x the function must be called before SurfaceTexture constructor!
526     protected void AllocateCache()
527     {
528         //mCacheBitmap = Bitmap.createBitmap(mFrameWidth, mFrameHeight, Bitmap.Config.ARGB_8888);
529         //#Modified portrait step2
530         //for a correct orientation, mCacheBitmap holds the camera frame data after a 90-degree rotation
531         //after that rotation, mFrameWidth and mFrameHeight are swapped
532         int portraitWidth=mFrameHeight;
533         int portraitHeight=mFrameWidth;
534         mCacheBitmap = Bitmap.createBitmap(portraitWidth, portraitHeight, Bitmap.Config.ARGB_8888);
535     }
536 
537     public interface ListItemAccessor {
538         public int getWidth(Object obj);
539         public int getHeight(Object obj);
540     };
541 
542     /**
543      * This helper method can be called by subclasses to select camera preview size.
544      * It goes over the list of the supported preview sizes and selects the maximum one which
545      * fits both values set via setMaxFrameSize() and surface frame allocated for this view
546      * @param supportedSizes
547      * @param surfaceWidth
548      * @param surfaceHeight
549      * @return optimal frame size
550      */
551     protected Size calculateCameraFrameSize(List<?> supportedSizes, ListItemAccessor accessor, int surfaceWidth, int surfaceHeight) {
552         //select a camera frame size
553         int calcWidth = 0;
554         int calcHeight = 0;
555 
556         //maximum allowed width and height
557         //#Modified step4
558         //the camera frame's mMaxWidth must be compared against the surface's surfaceHeight
559         //the camera frame's mMaxHeight must be compared against the surface's surfaceWidth
560         //int maxAllowedWidth = (mMaxWidth != MAX_UNSPECIFIED && mMaxWidth < surfaceWidth)? mMaxWidth : surfaceWidth;
561         //int maxAllowedHeight = (mMaxHeight != MAX_UNSPECIFIED && mMaxHeight < surfaceHeight)? mMaxHeight : surfaceHeight;
562         int maxAllowedWidth = (mMaxWidth != MAX_UNSPECIFIED && mMaxWidth < surfaceHeight)? mMaxWidth : surfaceHeight;
563         int maxAllowedHeight = (mMaxHeight != MAX_UNSPECIFIED && mMaxHeight < surfaceWidth)? mMaxHeight : surfaceWidth;
564 
565         for (Object size : supportedSizes) {
566             int width = accessor.getWidth(size);
567             int height = accessor.getHeight(size);
568 
569             //pick the largest size within the allowed range
570             //the client can select a low-resolution frame by setting small mMaxWidth, mMaxHeight
571             if (width <= maxAllowedWidth && height <= maxAllowedHeight) {
572                 if (width >= calcWidth && height >= calcHeight) {
573                     calcWidth = (int) width;
574                     calcHeight = (int) height;
575                 }
576             }
577         }
578 
579         return new Size(calcWidth, calcHeight);
580     }
581 }

 2.JavaCameraView

package org.opencv.android;

import java.util.List;

import android.content.Context;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.os.Build;
import android.util.AttributeSet;
import android.util.Log;
import android.view.ViewGroup.LayoutParams;

import org.opencv.BuildConfig;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;

/**
 * This class is an implementation of the Bridge View between OpenCV and Java Camera.
 * This class relays on the functionality available in base class and only implements
 * required functions:
 * connectCamera - opens Java camera and sets the PreviewCallback to be delivered.
 * disconnectCamera - closes the camera and stops preview.
 * When frame is delivered via callback from Camera - it processed via OpenCV to be
 * converted to RGBA32 and then passed to the external callback for modifications if required.
 */
public class JavaCameraView extends CameraBridgeViewBase implements PreviewCallback {

    private static final int MAGIC_TEXTURE_ID = 10;
    private static final String TAG = "JavaCameraView";

    private byte mBuffer[];
    private Mat[] mFrameChain;
    private int mChainIdx = 0;
    private Thread mThread;
    private boolean mStopThread;

    protected Camera mCamera;
    protected JavaCameraFrame[] mCameraFrame;
    private SurfaceTexture mSurfaceTexture;
    private int mPreviewFormat = ImageFormat.NV21;

    public static class JavaCameraSizeAccessor implements ListItemAccessor {

        @Override
        public int getWidth(Object obj) {
            Camera.Size size = (Camera.Size) obj;
            return size.width;
        }

        @Override
        public int getHeight(Object obj) {
            Camera.Size size = (Camera.Size) obj;
            return size.height;
        }
    }

    public JavaCameraView(Context context, int cameraId) {
        super(context, cameraId);
    }

    public JavaCameraView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    //the width and height of the JavaCameraView are passed in
    protected boolean initializeCamera(int width, int height) {
        Log.d(TAG, "Initialize java camera");
        boolean result = true;
        synchronized (this){
            mCamera = null;
            //mCameraIndex specifies the camera type; it is an app-level setting, not a camera ID
            //the camera ID has to be looked up from the type
            //it is inherited from the parent class CameraBridgeViewBase,
            //with initial value CAMERA_ID_ANY
            if (mCameraIndex == CAMERA_ID_ANY) {
                Log.d(TAG, "Trying to open camera with old open()");
                try {
                    //first try to open the camera without specifying a type
                    mCamera = Camera.open();
                }
                catch (Exception e){
                    Log.e(TAG, "Camera is not available (in use or does not exist): " + e.getLocalizedMessage());
                }

                if(mCamera == null && Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
                    boolean connected = false;
                    for (int camIdx = 0; camIdx < Camera.getNumberOfCameras(); ++camIdx) {
                        Log.d(TAG, "Trying to open camera with new open(" + Integer.valueOf(camIdx) + ")");
                        try {
                            //if opening without a specified type failed, iterate over all camera IDs
                            //and try each in turn; use the first one that opens successfully
                            mCamera = Camera.open(camIdx);
                            connected = true;
                        } catch (RuntimeException e) {
                            Log.e(TAG, "Camera #" + camIdx + " failed to open: " + e.getLocalizedMessage());
                        }
                        if (connected) break;
                    }
                }
            } else {
                //this branch handles the case where a camera type is specified
                if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
                    int localCameraIndex = mCameraIndex;
                    if (mCameraIndex == CAMERA_ID_BACK) {
                        Log.i(TAG, "Trying to open back camera");
                        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
                        //find the camera ID that matches this camera type
                        for (int camIdx = 0; camIdx < Camera.getNumberOfCameras(); ++camIdx) {
                            Camera.getCameraInfo( camIdx, cameraInfo );
                            if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
                                localCameraIndex = camIdx;
                                break;
                            }
                        }
                    } else if (mCameraIndex == CAMERA_ID_FRONT) {
                        Log.i(TAG, "Trying to open front camera");
                        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
                        for (int camIdx = 0; camIdx < Camera.getNumberOfCameras(); ++camIdx) {
                            Camera.getCameraInfo( camIdx, cameraInfo );
                            if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
                                localCameraIndex = camIdx;
                                break;
                            }
                        }
                    }
                    if (localCameraIndex == CAMERA_ID_BACK) {
                        //localCameraIndex was initialized to CAMERA_ID_BACK, requesting the back camera;
                        //if a back camera exists, it has already been replaced with that camera's ID
                        Log.e(TAG, "Back camera not found!");
                    } else if (localCameraIndex == CAMERA_ID_FRONT) {
                        Log.e(TAG, "Front camera not found!");
                    } else {
                        Log.d(TAG, "Trying to open camera with new open(" + Integer.valueOf(localCameraIndex) + ")");
                        try {
                            //Open the camera using the ID found above
                            mCamera = Camera.open(localCameraIndex);
                        } catch (RuntimeException e) {
                            Log.e(TAG, "Camera #" + localCameraIndex + " failed to open: " + e.getLocalizedMessage());
                        }
                    }
                }
            }

            //Return false if the camera failed to open
            if (mCamera == null)
                return false;

            /* Now set camera parameters */
            try {
                Camera.Parameters params = mCamera.getParameters();
                Log.d(TAG, "getSupportedPreviewSizes()");
                List<android.hardware.Camera.Size> sizes = params.getSupportedPreviewSizes();

                if (sizes != null) {
                    //Choose the preview size
                    /* Select the size that fits surface considering maximum size allowed */
                    Size frameSize = calculateCameraFrameSize(sizes, new JavaCameraSizeAccessor(), width, height);
                    //width and height come from connectCamera(getWidth(), getHeight()),
                    //i.e. the size of the SurfaceView (and of its surface)
                    //Log the camera frame size and the surface size
                    Log.d("FunnyAR","surface width: "+width+" surface height: "+height+
                            "frameSize: "+frameSize.toString());

                    //Choose the preview format
                    /* Image format NV21 causes issues in the Android emulators */
                    if (Build.FINGERPRINT.startsWith("generic")
                            || Build.FINGERPRINT.startsWith("unknown")
                            || Build.MODEL.contains("google_sdk")
                            || Build.MODEL.contains("Emulator")
                            || Build.MODEL.contains("Android SDK built for x86")
                            || Build.MANUFACTURER.contains("Genymotion")
                            || (Build.BRAND.startsWith("generic") && Build.DEVICE.startsWith("generic"))
                            || "google_sdk".equals(Build.PRODUCT))
                        params.setPreviewFormat(ImageFormat.YV12);  // "generic" or "android" = android emulator
                    else
                        params.setPreviewFormat(ImageFormat.NV21);

                    //Record the preview format in a member variable
                    mPreviewFormat = params.getPreviewFormat();

                    Log.d(TAG, "Set preview size to " + Integer.valueOf((int)frameSize.width) + "x" + Integer.valueOf((int)frameSize.height));
                    params.setPreviewSize((int)frameSize.width, (int)frameSize.height);

                    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH && !android.os.Build.MODEL.equals("GT-I9100"))
                        params.setRecordingHint(true);

                    //JavaCameraView's focus mode is hard-coded as well
                    List<String> FocusModes = params.getSupportedFocusModes();
                    if (FocusModes != null && FocusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO))
                    {
                        params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
                    }

                    mCamera.setParameters(params);
                    params = mCamera.getParameters();

                    //Record the frame size
                    mFrameWidth = params.getPreviewSize().width;
                    mFrameHeight = params.getPreviewSize().height;

                    //Scaling is involved here
                    /*
                        #Modified portrait step1
                        So that deliverAndDrawFrame applies the scale when drawing
                        onto the canvas, declare <JavaCameraView> with
                        android:layout_width="match_parent"
                        android:layout_height="match_parent"
                        To pin down the scaled size, wrap <JavaCameraView> in a
                        LinearLayout that has a fixed size.
                        In portrait orientation the ratios are
                        surface width / camera frame mFrameHeight
                        surface height / camera frame mFrameWidth
                        If you don't want to configure <JavaCameraView>, simply
                        removing the if statement here should also work.
                     */
                    if ((getLayoutParams().width == LayoutParams.MATCH_PARENT) && (getLayoutParams().height == LayoutParams.MATCH_PARENT))
                        //mScale = Math.min(((float)height)/mFrameHeight, ((float)width)/mFrameWidth);
                        mScale = Math.min(((float)width)/mFrameHeight, ((float)height)/mFrameWidth);
                    else
                        mScale = 0;

                    //Log the scale and the camera frame size
                    Log.d("FunnyAR","mScale: "+mScale+" mFrameWidth: "+mFrameWidth+
                            " mFrameHeight: "+mFrameHeight);

                    if (mFpsMeter != null) {
                        mFpsMeter.setResolution(mFrameWidth, mFrameHeight);
                    }

                    //Compute the frame size in bytes and allocate a matching receive buffer
                    //Number of pixels
                    int size = mFrameWidth * mFrameHeight;
                    //pixels x bits per pixel of the current format / 8 bits per byte == bytes per frame
                    size  = size * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
                    mBuffer = new byte[size];

                    /*
                    Adds a pre-allocated buffer to the preview callback buffer queue.
                    Applications can add one or more buffers to the queue.
                    When a preview frame arrives and there is still at least
                    one available buffer, the buffer will be used and removed from the queue.
                    Then preview callback is invoked with the buffer.
                    If a frame arrives and there is no buffer left, the frame is discarded.
                    Applications should add buffers back when they finish processing the data
                     in them.
                     */
                    /*
                    This method is only necessary when setPreviewCallbackWithBuffer(PreviewCallback)
                     is used. When setPreviewCallback(PreviewCallback) or
                     setOneShotPreviewCallback(PreviewCallback) are used,
                     buffers are automatically allocated.
                     When a supplied buffer is too small to hold the preview frame data,
                     preview callback will return null and the buffer will be removed from the
                     buffer queue.
                     */
                    mCamera.addCallbackBuffer(mBuffer);
                    /*
                    Installs a callback to be invoked for every preview frame,
                    using buffers supplied with addCallbackBuffer(byte[]),
                    in addition to displaying them on the screen.
                    The callback will be repeatedly called for as long as preview is active
                    and buffers are available. Any other preview callbacks are overridden.
                     */
                    mCamera.setPreviewCallbackWithBuffer(this);

                    //An array of two Mats
                    //Note the YUV420sp format, hence height * 1.5 rows
                    mFrameChain = new Mat[2];
                    mFrameChain[0] = new Mat(mFrameHeight + (mFrameHeight/2), mFrameWidth, CvType.CV_8UC1);
                    mFrameChain[1] = new Mat(mFrameHeight + (mFrameHeight/2), mFrameWidth, CvType.CV_8UC1);

                    //Inherited helper that allocates memory for the inherited Bitmap mCacheBitmap
                    AllocateCache();

                    //JavaCameraFrame holds a reference to the Mat:
                    //mCameraFrame[0].mYuvFrameData is the same Mat as mFrameChain[0]
                    mCameraFrame = new JavaCameraFrame[2];
                    mCameraFrame[0] = new JavaCameraFrame(mFrameChain[0], mFrameWidth, mFrameHeight);
                    mCameraFrame[1] = new JavaCameraFrame(mFrameChain[1], mFrameWidth, mFrameHeight);

                    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB) {
                        mSurfaceTexture = new SurfaceTexture(MAGIC_TEXTURE_ID);
                        mCamera.setPreviewTexture(mSurfaceTexture);
                    } else
                       mCamera.setPreviewDisplay(null);

                    /* Finally we are ready to start the preview */
                    Log.d(TAG, "startPreview");
                    mCamera.startPreview();
                }
                else
                    result = false;
            } catch (Exception e) {
                result = false;
                e.printStackTrace();
            }
        }

        return result;
    }

    protected void releaseCamera() {
        synchronized (this) {
            if (mCamera != null) {
                mCamera.stopPreview();
                mCamera.setPreviewCallback(null);

                mCamera.release();
            }
            mCamera = null;
            if (mFrameChain != null) {
                mFrameChain[0].release();
                mFrameChain[1].release();
            }
            if (mCameraFrame != null) {
                mCameraFrame[0].release();
                mCameraFrame[1].release();
            }
        }
    }

    private boolean mCameraFrameReady = false;

    //Overrides the parent's abstract method; responsible for starting the camera
    @Override
    protected boolean connectCamera(int width, int height) {

        /* 1. We need to instantiate camera
         * 2. We need to start thread which will be getting frames
         */
        /* First step - initialize camera connection */
        Log.d(TAG, "Connecting to camera");
        //initializeCamera performs the camera connection setup
        if (!initializeCamera(width, height))
            return false;

        mCameraFrameReady = false;

        /* now we can start update thread */
        Log.d(TAG, "Starting processing thread");
        mStopThread = false;
        mThread = new Thread(new CameraWorker());
        mThread.start();

        return true;
    }

    @Override
    protected void disconnectCamera() {
        /* 1. We need to stop thread which updating the frames
         * 2. Stop camera and release it
         */
        Log.d(TAG, "Disconnecting from camera");
        try {
            mStopThread = true;
            Log.d(TAG, "Notify thread");
            synchronized (this) {
                this.notify();
            }
            Log.d(TAG, "Waiting for thread");
            if (mThread != null)
                mThread.join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        } finally {
            mThread =  null;
        }

        /* Now release camera */
        releaseCamera();

        mCameraFrameReady = false;
    }

    /*
    Overrides Camera.PreviewCallback's onPreviewFrame method.
    onPreviewFrame runs on the UI thread here, because the camera was opened
    on the main thread, but the data is taken away and processed by another thread.
     */
    /*
    Callback interface used to deliver copies of preview frames as they are displayed.
    Called as preview frames are displayed. This callback is invoked
     on the event thread Camera.open(int) was called from.
     */
    @Override
    public void onPreviewFrame(byte[] frame, Camera arg1) {
        if (BuildConfig.DEBUG)
            Log.d(TAG, "Preview Frame received. Frame size: " + frame.length);
        synchronized (this) {
            //mChainIdx toggles between 0 and 1 and is managed by the other thread
            //put() is an OpenCV Java-layer method
            //mFrameChain[mChainIdx] is 1.5*height x 1.0*width; store the data into it
            mFrameChain[mChainIdx].put(0, 0, frame);
            //Flag that the data is ready
            mCameraFrameReady = true;
            //Wake up one thread waiting on this JavaCameraView

            this.notify();
        }
        /*
        While onPreviewFrame processes data, the buffer from addCallbackBuffer() is
        dequeued; once done, it must be handed back for the next onPreviewFrame call
         */
        if (mCamera != null)
            mCamera.addCallbackBuffer(mBuffer);
    }

    /*
    JavaCameraFrame implements CvCameraViewFrame's rgba() and gray() methods.
    Instances of this type are handed to the user via mListener.onCameraFrame(frame)
    inside deliverAndDrawFrame(), so JavaCameraFrame's methods are the best place
    to rotate the Mat: the client calls gray()/rgba() as usual inside onCameraFrame
    and gets a Mat that is already in portrait orientation.
    #Modified portrait step3
     */
    private class JavaCameraFrame implements CvCameraViewFrame {
        @Override
        public Mat gray() {
            //Return the selected region of the Mat; this is tied to the YUV420sp layout
            //return mYuvFrameData.submat(0, mHeight, 0, mWidth);
            //#Modified step3.1
            Core.rotate(mYuvFrameData.submat(0, mHeight, 0, mWidth),
                    portrait_gray,Core.ROTATE_90_CLOCKWISE);
            return portrait_gray;
        }

        @Override
        public Mat rgba() {
            if (mPreviewFormat == ImageFormat.NV21)
                Imgproc.cvtColor(mYuvFrameData, mRgba, Imgproc.COLOR_YUV2RGBA_NV21, 4);
            else if (mPreviewFormat == ImageFormat.YV12)
                Imgproc.cvtColor(mYuvFrameData, mRgba, Imgproc.COLOR_YUV2RGB_I420, 4);  // COLOR_YUV2RGBA_YV12 produces inverted colors
            else
                throw new IllegalArgumentException("Preview Format can be NV21 or YV12");

            //#Modified step3.2
            //Rotate the converted RGBA Mat (not the raw YUV data)
            Core.rotate(mRgba, portrait_rgba, Core.ROTATE_90_CLOCKWISE);

            return portrait_rgba;
        }

        public JavaCameraFrame(Mat Yuv420sp, int width, int height) {
            super();
            mWidth = width;
            mHeight = height;
            //#Modified
            portrait_mHeight=mWidth;
            portrait_mWidth=mHeight;
            portrait_gray=new Mat(portrait_mHeight,portrait_mWidth,CvType.CV_8UC1);
            portrait_rgba=new Mat(portrait_mHeight,portrait_mWidth,CvType.CV_8UC4);
            mYuvFrameData = Yuv420sp;
            mRgba = new Mat();
        }

        public void release() {
            mRgba.release();
            //#Modified: release the rotated Mats as well
            portrait_gray.release();
            portrait_rgba.release();
        }

        private Mat mYuvFrameData;
        private Mat mRgba;
        private int mWidth;
        private int mHeight;
        //#Modified
        private int portrait_mHeight;
        private int portrait_mWidth;
        private Mat portrait_gray;
        private Mat portrait_rgba;
    };

    private class CameraWorker implements Runnable {

        @Override
        public void run() {
            do {
                boolean hasFrame = false;
                synchronized (JavaCameraView.this) {
                    try {
                        //When a frame is ready, onPreviewFrame sets mCameraFrameReady to true and wakes this thread
                        //mStopThread stays false for as long as the camera is running
                        //While the camera is running and no frame is ready, the thread waits
                        //The wait sits inside a while loop to guard against spurious wakeups
                        while (!mCameraFrameReady && !mStopThread) {
                            JavaCameraView.this.wait();
                        }
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                    //The thread was woken because onPreviewFrame has a frame ready
                    if (mCameraFrameReady)
                    {
                        //mChainIdx toggles between 0 and 1, selecting mCameraFrame's current buffer
                        mChainIdx = 1 - mChainIdx;
                        //Reset mCameraFrameReady to wait for the next frame from onPreviewFrame
                        mCameraFrameReady = false;
                        //A frame is currently available
                        hasFrame = true;
                    }
                }

                //The thread has not been stopped and a frame is available
                if (!mStopThread && hasFrame) {
                    //Process the current buffer if it is not empty
                    //mChainIdx starts at 0; mChainIdx = 1 - mChainIdx sets it to 1,
                    //so 1 - mChainIdx here is 0; in the next round mChainIdx becomes 0
                    //and 1 - mChainIdx here is 1, and so on in a cycle
                    //mCameraFrame[1 - mChainIdx].mYuvFrameData is a reference to
                    //mFrameChain[1 - mChainIdx], i.e. JavaCameraFrame holds a reference
                    //to the Mat
                    if (!mFrameChain[1 - mChainIdx].empty())
                        deliverAndDrawFrame(mCameraFrame[1 - mChainIdx]);
                }
            } while (!mStopThread);
            Log.d(TAG, "Finish processing thread");
        }
    }
}
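The double-buffer hand-off between onPreviewFrame (producer) and CameraWorker (consumer) can be sketched without any Android or OpenCV dependency. The class and method names below are mine for illustration; only the `1 - mChainIdx` swap logic comes from the source above.

```java
// Android-free sketch of the mFrameChain double buffering above.
// producerWrites() models onPreviewFrame storing into mFrameChain[mChainIdx];
// consumerReads() models CameraWorker flipping the index and then processing
// mFrameChain[1 - mChainIdx].
public class ChainIdxSketch {
    private int mChainIdx = 0; // buffer the producer writes next

    // onPreviewFrame: store the frame, return the index just written
    public int producerWrites() {
        return mChainIdx;
    }

    // CameraWorker: swap buffers, then return the index it will process
    public int consumerReads() {
        mChainIdx = 1 - mChainIdx;
        return 1 - mChainIdx;
    }
}
```

The swap guarantees the worker always processes the buffer the producer has just finished, while the next frame lands in the other buffer.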

3. The layout file

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context=".MainActivity">

    <!-- 1080px and 1440px are hard-coded here to fit the Redmi Note 4X screen -->
    <!--<org.opencv.android.JavaCameraView
        android:id="@+id/javaCameraView"
        android:layout_width="1080px"
        android:layout_height="1440px" />-->

    <!-- Wrapped in a LinearLayout so the scale is applied -->
    <LinearLayout
        android:layout_width="1080px"
        android:layout_height="1440px"
        android:orientation="vertical">

        <org.opencv.android.JavaCameraView
            android:id="@+id/javaCameraView"
            android:layout_width="match_parent"
            android:layout_height="match_parent" />

    </LinearLayout>

    <TextView
        android:layout_width="1080px"
        android:layout_height="480px"
        android:text="FunnyAR!" />

</LinearLayout>
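With the 1080x1440 container from this layout, the portrait mScale formula from #Modified step1 can be checked with plain arithmetic. The 1280x720 frame size below is an assumed example for illustration, not a value from the post.

```java
// Stand-alone version of the portrait mScale computation:
// mScale = min(surfaceWidth / frameHeight, surfaceHeight / frameWidth),
// because in portrait the frame's height spans the surface's width and
// the frame's width spans the surface's height.
public class ScaleSketch {
    static float portraitScale(int surfaceW, int surfaceH, int frameW, int frameH) {
        return Math.min((float) surfaceW / frameH, (float) surfaceH / frameW);
    }
}
```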

 

Reposted from: https://www.cnblogs.com/qq2523984508/p/10512396.html
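The 90-degree clockwise rotation that the modified gray()/rgba() perform with Core.rotate can be reproduced in plain Java to verify the index mapping. This is an illustrative stand-in, not OpenCV code.

```java
// Pure-Java equivalent of Core.rotate(src, dst, Core.ROTATE_90_CLOCKWISE)
// for a rows x cols matrix: element (r, c) maps to (c, rows - 1 - r),
// so a landscape frame becomes a portrait one with swapped dimensions.
public class Rotate90Sketch {
    static int[][] rotate90cw(int[][] src) {
        int rows = src.length, cols = src[0].length;
        int[][] dst = new int[cols][rows]; // dimensions swap, as for mCacheBitmap
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
                dst[c][rows - 1 - r] = src[r][c];
        return dst;
    }
}
```

This also shows why the cached Bitmap must be allocated with width and height swapped relative to the camera frame.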
