Multiple Ways to Implement a Circular Camera Preview on Android: One Article Is All You Need

```java
setOutlineProvider(new ViewOutlineProvider() {
    @Override
    public void getOutline(View view, Outline outline) {
        Rect rect = new Rect(0, 0, view.getMeasuredWidth(), view.getMeasuredHeight());
        outline.setRoundRect(rect, radius);
    }
});
setClipToOutline(true);
}
```

Modify the corner radius and refresh the outline whenever needed:

```java
public void setRadius(int radius) {
    this.radius = radius;
}

public void turnRound() {
    invalidateOutline();
}
```

These methods update the corner radius of the view to the configured value. When the view is a square and the radius is half of its side length, what is displayed is a circle.
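A minimal usage sketch, assuming the square view is an instance of the class above named roundTextureView (the name is ours): once it has been laid out, set the radius to half its side length to get a circle.

```java
// Sketch: turn the square preview view into a circle after layout.
// "roundTextureView" is a hypothetical instance of the rounded view described above.
roundTextureView.post(() -> {
    roundTextureView.setRadius(roundTextureView.getWidth() / 2);
    roundTextureView.turnRound();
});
```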

II. Implementing a Square Preview

1. Devices that support a 1:1 preview size

First, a simple but rather limited approach: set both the camera preview size and the preview view to a 1:1 aspect ratio.

Android devices generally support multiple preview sizes. Take the Samsung Tab S3 as an example.

  • With the Camera API, the supported preview sizes are:

```
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1920x1080
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1280x720
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1440x1080
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1088x1088
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1056x864
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 960x720
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 720x480
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 640x480
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 352x288
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 320x240
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 176x144
```

Among these, the only 1:1 preview size is 1088x1088.

  • With the Camera2 API, the supported preview sizes (which in fact also include the picture sizes) are:

```
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 4128x3096
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 4128x2322
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3264x2448
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3264x1836
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3024x3024
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2976x2976
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2880x2160
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2592x1944
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1920
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1440
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1080
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2160x2160
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2048x1536
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2048x1152
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1936x1936
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1920x1080
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1440x1080
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1280x960
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1280x720
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 960x720
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 720x480
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 640x480
2019-08-02 13:19:24.982 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 320x240
2019-08-02 13:19:24.982 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 176x144
```

Among these, the 1:1 preview sizes are 3024x3024, 2976x2976, 2160x2160 and 1936x1936.

As long as we pick a 1:1 preview size and make the preview view a square, we get a square preview. Setting the view's corner radius to half of its side length then turns it into a circular preview.
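As a hedged sketch of the size selection (old Camera API, assuming camera is an already-opened android.hardware.Camera instance), one way to pick the largest supported 1:1 preview size looks like this:

```java
// Pick the largest supported preview size whose width equals its height.
Camera.Size square = null;
for (Camera.Size size : camera.getParameters().getSupportedPreviewSizes()) {
    if (size.width == size.height && (square == null || size.width > square.width)) {
        square = size;
    }
}
if (square != null) {
    Camera.Parameters parameters = camera.getParameters();
    parameters.setPreviewSize(square.width, square.height);
    camera.setParameters(parameters);
}
```

If no such size exists, fall back to the approach in the next subsection.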

2. When the device does not support a 1:1 preview size

  • Drawbacks of choosing a 1:1 preview size

  • Limited resolution choices

As mentioned above, we can pick a 1:1 preview size, but this approach is quite restrictive: the range of available sizes is small, and if the camera does not support a 1:1 preview size at all, the approach simply does not work.

  • Resource consumption

Take the Samsung Tab S3 as an example: with the Camera2 API, the supported square preview sizes are all very large, so image processing and other per-frame work will consume a lot of system resources.

  • Handling devices without a 1:1 preview size

  • Add a ViewGroup with a 1:1 aspect ratio

  • Put the TextureView inside that ViewGroup

  • Set the TextureView's margins so that only the central square region is shown

Sample code:

```java
// Keep the preview view's aspect ratio identical to the preview size to avoid stretching
{
    FrameLayout.LayoutParams textureViewLayoutParams = (FrameLayout.LayoutParams) textureView.getLayoutParams();
    int newHeight = 0;
    int newWidth = textureViewLayoutParams.width;
    // Landscape
    if (displayOrientation % 180 == 0) {
        newHeight = textureViewLayoutParams.width * previewSize.height / previewSize.width;
    }
    // Portrait
    else {
        newHeight = textureViewLayoutParams.width * previewSize.width / previewSize.height;
    }
```

When the preview is not square, insert an extra ViewGroup to limit the visible region of the view:

```java
    if (newHeight != textureViewLayoutParams.height) {
        insertFrameLayout = new RoundFrameLayout(CoverByParentCameraActivity.this);
        int sideLength = Math.min(newWidth, newHeight);
        FrameLayout.LayoutParams layoutParams = new FrameLayout.LayoutParams(sideLength, sideLength);
        insertFrameLayout.setLayoutParams(layoutParams);
        FrameLayout parentView = (FrameLayout) textureView.getParent();
        parentView.removeView(textureView);
        parentView.addView(insertFrameLayout);
        insertFrameLayout.addView(textureView);
        FrameLayout.LayoutParams newTextureViewLayoutParams = new FrameLayout.LayoutParams(newWidth, newHeight);
        // Landscape
        if (displayOrientation % 180 == 0) {
            newTextureViewLayoutParams.leftMargin = ((newHeight - newWidth) / 2);
        }
        // Portrait
        else {
            newTextureViewLayoutParams.topMargin = -(newHeight - newWidth) / 2;
        }
        textureView.setLayoutParams(newTextureViewLayoutParams);
    }
}
```
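Since insertFrameLayout is a RoundFrameLayout, which we assume exposes the same setRadius/turnRound methods as the rounded view shown earlier, the square container itself can then be clipped into a circle:

```java
// Sketch: round the 1:1 container so the clipped preview becomes circular.
insertFrameLayout.setRadius(sideLength / 2);
insertFrameLayout.turnRound();
```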

III. A More Customizable Preview Using GLSurfaceView

The approaches above are enough for square and circular previews, but they only apply to the native camera. What if the data source is not the native camera? Next we introduce a solution that displays NV21 data with a GLSurfaceView, drawing the preview frames entirely by ourselves.

1. GLSurfaceView usage flow

The key part is writing the renderer (Renderer). The Renderer interface is documented as follows:

```java
/**
 * A generic renderer interface.
 *
 * The renderer is responsible for making OpenGL calls to render a frame.
 *
 * GLSurfaceView clients typically create their own classes that implement
 * this interface, and then call {@link GLSurfaceView#setRenderer} to
 * register the renderer with the GLSurfaceView.
 *
 * Developer Guides
 * For more information about how to use OpenGL, read the OpenGL developer guide.
 *
 * Threading
 * The renderer will be called on a separate thread, so that rendering
 * performance is decoupled from the UI thread. Clients typically need to
 * communicate with the renderer from the UI thread, because that's where
 * input events are received. Clients can communicate using any of the
 * standard Java techniques for cross-thread communication, or they can
 * use the {@link GLSurfaceView#queueEvent(Runnable)} convenience method.
 *
 * EGL Context Lost
 * There are situations where the EGL rendering context will be lost. This
 * typically happens when device wakes up after going to sleep. When
 * the EGL context is lost, all OpenGL resources (such as textures) that are
 * associated with that context will be automatically deleted. In order to
 * keep rendering correctly, a renderer must recreate any lost resources
 * that it still needs. The {@link #onSurfaceCreated(GL10, EGLConfig)} method
 * is a convenient place to do this.
 *
 * @see #setRenderer(Renderer)
 */
public interface Renderer {
    /**
     * Called when the surface is created or recreated.
     *
     * Called when the rendering thread starts and whenever the EGL context
     * is lost. The EGL context will typically be lost when the Android device
     * awakes after going to sleep.
     *
     * Since this method is called at the beginning of rendering, as well as
     * every time the EGL context is lost, this method is a convenient place to put
     * code to create resources that need to be created when the rendering
     * starts, and that need to be recreated when the EGL context is lost.
     * Textures are an example of a resource that you might want to create here.
     *
     * Note that when the EGL context is lost, all OpenGL resources associated
     * with that context will be automatically deleted. You do not need to call
     * the corresponding "glDelete" methods such as glDeleteTextures to
     * manually delete these lost resources.
     *
     * @param gl the GL interface. Use instanceof to
     *           test if the interface supports GL11 or higher interfaces.
     * @param config the EGLConfig of the created surface. Can be used
     *           to create matching pbuffers.
     */
    void onSurfaceCreated(GL10 gl, EGLConfig config);

    /**
     * Called when the surface changed size.
     *
     * Called after the surface is created and whenever
     * the OpenGL ES surface size changes.
     *
     * Typically you will set your viewport here. If your camera
     * is fixed then you could also set your projection matrix here:
     *
     * void onSurfaceChanged(GL10 gl, int width, int height) {
     *     gl.glViewport(0, 0, width, height);
     *     // for a fixed camera, set the projection too
     *     float ratio = (float) width / height;
     *     gl.glMatrixMode(GL10.GL_PROJECTION);
     *     gl.glLoadIdentity();
     *     gl.glFrustumf(-ratio, ratio, -1, 1, 1, 10);
     * }
     *
     * @param gl the GL interface. Use instanceof to
     *           test if the interface supports GL11 or higher interfaces.
     * @param width
     * @param height
     */
    void onSurfaceChanged(GL10 gl, int width, int height);

    /**
     * Called to draw the current frame.
     *
     * This method is responsible for drawing the current frame.
     * The implementation of this method typically looks like this:
     *
     * void onDrawFrame(GL10 gl) {
     *     gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
     *     //... other gl calls to render the scene ...
     * }
     *
     * @param gl the GL interface. Use instanceof to
     *           test if the interface supports GL11 or higher interfaces.
     */
    void onDrawFrame(GL10 gl);
}
```

  • void onSurfaceCreated(GL10 gl, EGLConfig config)

Called when the Surface is created or recreated.

  • void onSurfaceChanged(GL10 gl, int width, int height)

Called when the size of the Surface changes.

  • void onDrawFrame(GL10 gl)

This is where the drawing is done. When renderMode is set to RENDERMODE_CONTINUOUSLY, this method is called continuously; when renderMode is set to RENDERMODE_WHEN_DIRTY, it is called only once the surface is created and then whenever requestRender is called. We usually choose RENDERMODE_WHEN_DIRTY to avoid drawing more often than necessary.

In general we implement a Renderer ourselves and set it on the GLSurfaceView, so writing the Renderer is the core step of the whole flow: initialization is done in void onSurfaceCreated(GL10 gl, EGLConfig config) and per-frame drawing in void onDrawFrame(GL10 gl).
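As a minimal setup sketch (the view id and the GLRenderer class name are placeholders, not from the article): attach the renderer and use RENDERMODE_WHEN_DIRTY so a frame is drawn only when requestRender is called.

```java
GLSurfaceView glSurfaceView = findViewById(R.id.gl_surface_view); // hypothetical id
glSurfaceView.setEGLContextClientVersion(2);          // the shaders below target OpenGL ES 2.0
GLSurfaceView.Renderer renderer = new GLRenderer();   // hypothetical Renderer implementation
glSurfaceView.setRenderer(renderer);
glSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
// Later, whenever a new NV21 frame has been processed:
glSurfaceView.requestRender();
```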

2. Implementation details

  • Coordinate systems

Unlike the Android View coordinate system, the OpenGL coordinate system is a Cartesian coordinate system. The Android View coordinate system has its origin at the top-left corner, with x increasing to the right and y increasing downward, while the OpenGL coordinate system has its origin at the center, with x increasing to the right and y increasing upward.

  • Writing the shaders

```java
/**
 * Vertex shader
 */
private static String VERTEX_SHADER =
        "    attribute vec4 attr_position;\n" +
        "    attribute vec2 attr_tc;\n" +
        "    varying vec2 tc;\n" +
        "    void main() {\n" +
        "        gl_Position = attr_position;\n" +
        "        tc = attr_tc;\n" +
        "    }";

/**
 * Fragment shader
 */
private static String FRAG_SHADER =
        "    varying vec2 tc;\n" +
        "    uniform sampler2D ySampler;\n" +
        "    uniform sampler2D uSampler;\n" +
        "    uniform sampler2D vSampler;\n" +
        "    const mat3 convertMat = mat3( 1.0, 1.0, 1.0, -0.001, -0.3441, 1.772, 1.402, -0.7141, -0.58060);\n" +
        "    void main()\n" +
        "    {\n" +
        "        vec3 yuv;\n" +
        "        yuv.x = texture2D(ySampler, tc).r;\n" +
        "        yuv.y = texture2D(uSampler, tc).r - 0.5;\n" +
        "        yuv.z = texture2D(vSampler, tc).r - 0.5;\n" +
        "        gl_FragColor = vec4(convertMat * yuv, 1.0);\n" +
        "    }";
```

  • Built-in variables

  • gl_Position

gl_Position in VERTEX_SHADER is the position of the vertex being drawn. Since we are drawing in 2D, we simply pass in the four corners of the OpenGL 2D coordinate system: bottom-left (-1,-1), bottom-right (1,-1), top-left (-1,1) and top-right (1,1), i.e. {-1,-1, 1,-1, -1,1, 1,1}.

  • gl_FragColor

gl_FragColor in FRAG_SHADER is the color of a single fragment.

  • Other variables

  • ySampler, uSampler, vSampler

The texture samplers for the Y, U and V planes respectively.

  • convertMat

From the following formulas:

R = Y + 1.402 (V - 128)
G = Y - 0.34414 (U - 128) - 0.71414 (V - 128)
B = Y + 1.772 (U - 128)

we obtain a YUV-to-RGB conversion matrix (listed here in the column-major order used by the GLSL mat3 constructor):

1.0,    1.0,    1.0,
0,     -0.344,  1.77,
1.403, -0.714,  0
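Written out as an ordinary row-major matrix product (with U and V normalized to [0, 1] in the shader, so 0.5 plays the role of 128), the conversion reads:

$$
\begin{pmatrix} R \\ G \\ B \end{pmatrix}
=
\begin{pmatrix}
1.0 & 0 & 1.403 \\
1.0 & -0.344 & -0.714 \\
1.0 & 1.77 & 0
\end{pmatrix}
\begin{pmatrix} Y \\ U - 0.5 \\ V - 0.5 \end{pmatrix}
$$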

  • Some types and functions

  • vec3, vec4

Three-component and four-component vectors respectively.

  • vec4 texture2D(sampler2D sampler, vec2 coord)

Samples the texture bound to the sampler at the given coordinates and returns the color value. For example:

texture2D(ySampler, tc).r returns the Y value,
texture2D(uSampler, tc).r returns the U value,
texture2D(vSampler, tc).r returns the V value.
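Before the Java-side initialization below, these two shader sources have to be compiled and linked into an OpenGL program. The article calls GLUtil.createShaderProgram() for this later on, but its body is not shown in this excerpt; here is a hedged sketch of what such a helper typically looks like (an assumed implementation, not the article's code):

```java
// Assumed helper: compile both shaders and link them into a program,
// returning the program handle, or -1 on failure.
private static int loadShader(int type, String source) {
    int shader = GLES20.glCreateShader(type);
    GLES20.glShaderSource(shader, source);
    GLES20.glCompileShader(shader);
    return shader;
}

public static int createShaderProgram() {
    int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, VERTEX_SHADER);
    int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, FRAG_SHADER);
    int program = GLES20.glCreateProgram();
    GLES20.glAttachShader(program, vertexShader);
    GLES20.glAttachShader(program, fragmentShader);
    GLES20.glLinkProgram(program);
    int[] linkStatus = new int[1];
    GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linkStatus, 0);
    return linkStatus[0] == GLES20.GL_TRUE ? program : -1;
}
```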

  • Initialization in Java code

Create the ByteBuffers holding the Y, U and V texture data according to the frame width and height, and pick the matching texture-coordinate array according to whether the image is mirrored and how it is rotated:

```java
public void init(boolean isMirror, int rotateDegree, int frameWidth, int frameHeight) {
    if (this.frameWidth == frameWidth
            && this.frameHeight == frameHeight
            && this.rotateDegree == rotateDegree
            && this.isMirror == isMirror) {
        return;
    }
    dataInput = false;
    this.frameWidth = frameWidth;
    this.frameHeight = frameHeight;
    this.rotateDegree = rotateDegree;
    this.isMirror = isMirror;
    yArray = new byte[this.frameWidth * this.frameHeight];
    uArray = new byte[this.frameWidth * this.frameHeight / 4];
    vArray = new byte[this.frameWidth * this.frameHeight / 4];

    int yFrameSize = this.frameHeight * this.frameWidth;
    int uvFrameSize = yFrameSize >> 2;
    yBuf = ByteBuffer.allocateDirect(yFrameSize);
    yBuf.order(ByteOrder.nativeOrder()).position(0);
    uBuf = ByteBuffer.allocateDirect(uvFrameSize);
    uBuf.order(ByteOrder.nativeOrder()).position(0);
    vBuf = ByteBuffer.allocateDirect(uvFrameSize);
    vBuf.order(ByteOrder.nativeOrder()).position(0);

    // Vertex coordinates
    squareVertices = ByteBuffer
            .allocateDirect(GLUtil.SQUARE_VERTICES.length * FLOAT_SIZE_BYTES)
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer();
    squareVertices.put(GLUtil.SQUARE_VERTICES).position(0);

    // Texture coordinates
    if (isMirror) {
        switch (rotateDegree) {
            case 0:
                coordVertice = GLUtil.MIRROR_COORD_VERTICES;
                break;
            case 90:
                coordVertice = GLUtil.ROTATE_90_MIRROR_COORD_VERTICES;
                break;
            case 180:
                coordVertice = GLUtil.ROTATE_180_MIRROR_COORD_VERTICES;
                break;
            case 270:
                coordVertice = GLUtil.ROTATE_270_MIRROR_COORD_VERTICES;
                break;
            default:
                break;
        }
    } else {
        switch (rotateDegree) {
            case 0:
                coordVertice = GLUtil.COORD_VERTICES;
                break;
            case 90:
                coordVertice = GLUtil.ROTATE_90_COORD_VERTICES;
                break;
            case 180:
                coordVertice = GLUtil.ROTATE_180_COORD_VERTICES;
                break;
            case 270:
                coordVertice = GLUtil.ROTATE_270_COORD_VERTICES;
                break;
            default:
                break;
        }
    }
    coordVertices = ByteBuffer
            .allocateDirect(coordVertice.length * FLOAT_SIZE_BYTES)
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer();
    coordVertices.put(coordVertice).position(0);
}
```
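For context, here is a hedged sketch of how a single NV21 frame could be copied into the buffers allocated above (the method name refreshFrame is ours, not the article's). In NV21 the full-resolution Y plane is followed by an interleaved VU plane, V first, with U and V each at quarter resolution:

```java
// Hypothetical helper: split one NV21 frame into the Y/U/V buffers created in init().
public void refreshFrame(byte[] nv21) {
    int ySize = frameWidth * frameHeight;
    // Y plane: the first width * height bytes.
    System.arraycopy(nv21, 0, yArray, 0, ySize);
    // VU plane: interleaved V,U pairs following the Y plane.
    for (int i = 0; i < ySize / 4; i++) {
        vArray[i] = nv21[ySize + 2 * i];
        uArray[i] = nv21[ySize + 2 * i + 1];
    }
    yBuf.put(yArray).position(0);
    uBuf.put(uArray).position(0);
    vBuf.put(vArray).position(0);
    dataInput = true;
    // With RENDERMODE_WHEN_DIRTY, trigger a redraw: glSurfaceView.requestRender();
}
```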

Initialize the renderer once the Surface has been created:

```java
private void initRenderer() {
    rendererReady = false;
    createGLProgram();

    // Enable texturing
    GLES20.glEnable(GLES20.GL_TEXTURE_2D);

    // Create the Y, U and V textures
    createTexture(frameWidth, frameHeight, GLES20.GL_LUMINANCE, yTexture);
    createTexture(frameWidth / 2, frameHeight / 2, GLES20.GL_LUMINANCE, uTexture);
    createTexture(frameWidth / 2, frameHeight / 2, GLES20.GL_LUMINANCE, vTexture);

    rendererReady = true;
}
```
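The createTexture helper is not shown in this excerpt either; here is a sketch under the assumption that yTexture, uTexture and vTexture are int[1] arrays holding the texture handles:

```java
// Assumed helper: allocate a single-channel (GL_LUMINANCE) texture of the given size
// and set filtering/wrapping suitable for video frames.
private void createTexture(int width, int height, int format, int[] textureId) {
    GLES20.glGenTextures(1, textureId, 0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, format, width, height, 0,
            format, GLES20.GL_UNSIGNED_BYTE, null);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
}
```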

The createGLProgram call above creates the OpenGL program and links the variables declared in the shader code:

```java
private void createGLProgram() {
    int programHandleMain = GLUtil.createShaderProgram();
    if (programHandleMain != -1) {
        // Use the shader program
        GLES20.glUseProgram(programHandleMain);
        // Get the vertex shader attributes
        int glPosition = GLES20.glGetAttribLocation(programHandleMain, "attr_position");
        int textureCoord = GLES20.glGetAttribLocation(programHandleMain, "attr_tc");
        // Get the fragment shader uniforms
        int ySampler = GLES20.glGetUniformLocation(programHandleMain, "ySampler");
        int uSampler = GLES20.glGetUniformLocation(programHandleMain, "uSampler");
        int vSampler = GLES20.glGetUniformLocation(programHandleMain, "vSampler");
```
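The listing is cut off at this point in the source. As a hedged sketch (the variable names follow the snippet above; the rest is an assumption, not the author's code), program setup typically continues by binding each sampler to a fixed texture unit and enabling the vertex attributes:

```java
        // Bind each sampler uniform to a fixed texture unit.
        GLES20.glUniform1i(ySampler, 0);   // Y plane -> texture unit 0
        GLES20.glUniform1i(uSampler, 1);   // U plane -> texture unit 1
        GLES20.glUniform1i(vSampler, 2);   // V plane -> texture unit 2
        // Feed the vertex and texture coordinates prepared in init().
        GLES20.glEnableVertexAttribArray(glPosition);
        GLES20.glVertexAttribPointer(glPosition, 2, GLES20.GL_FLOAT, false, 8, squareVertices);
        GLES20.glEnableVertexAttribArray(textureCoord);
        GLES20.glVertexAttribPointer(textureCoord, 2, GLES20.GL_FLOAT, false, 8, coordVertices);
    }
}
```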


