Camera2 Tutorial 4: Implementing Camera Preview via TEXTURE_OES

青橙相机

What is GLES11Ext.GL_TEXTURE_EXTERNAL_OES for?
In the previous article, every preview frame was delivered in YUV format. The purpose of this extension texture is to convert YUV to RGB automatically, so we no longer have to write YUV-to-RGB conversion code ourselves.
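
On the Java side the change is mainly the texture target used when binding the texture id (the shader side is covered in section 5). A minimal illustrative sketch, where textureId is just a placeholder name:

// With a plain 2D texture, each YUV frame would have to be uploaded and
// converted to RGB manually before sampling:
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);

// With the external-OES target, the camera writes frames into the texture through
// a SurfaceTexture and the driver converts YUV to RGB when the shader samples it:
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);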

1. Create the Layout XML and the Renderer

//xml
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <com.jdf.camera.ui.CameraV2GLSurfaceView
        android:id="@+id/glsurfaceView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</RelativeLayout>
//Activity
   @Override
    protected void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera2_oes);
        //Get the GLSurfaceView
        glSurfaceView = findViewById(R.id.glsurfaceView);
        //Wraps the Camera2 operations
        mCameraLoader = new Camera2OESLoader(this);
        //Instantiates the Renderer and binds it to the GLSurfaceView
        glSurfaceView.init(mCameraLoader, false, Camera2OESActivity.this);
        findViewById(R.id.saveImage).setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                ImageUtils.saveBitmap = true;
            }
        });
    }
//GlSurfaceView
public class CameraV2GLSurfaceView extends GLSurfaceView {
    public static final String TAG = "Filter_CameraV2GLSurfaceView";
    private JImageRender mCameraV2Renderer;

    public void init(Camera2OESLoader camera, boolean isPreviewStarted, Context context) {
        setEGLContextClientVersion(2);

        mCameraV2Renderer = new JImageRender(new JImageFilter(context));
        mCameraV2Renderer.init(this, camera, isPreviewStarted, context);

        setRenderer(mCameraV2Renderer);
    }

    public CameraV2GLSurfaceView(Context context, AttributeSet attributes) {
        super(context, attributes);
    }
}
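
init() does not call setRenderMode(), so the GLSurfaceView default, RENDERMODE_CONTINUOUSLY, is in effect; continuous rendering is what lets onDrawFrame run before the first camera frame arrives and kick off the preview (see section 3). A sketch of making the default explicit (the call must come after setRenderer()):

        // Optional: state the default explicitly; continuous rendering drives
        // onDrawFrame even before the camera delivers its first frame.
        setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);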

2. Open the Camera

In the Activity's onResume, once the GLSurfaceView has been laid out, obtain the camera sensor orientation and pick a suitable preview size based on the GLSurfaceView's dimensions.

//Activity
    @Override
    protected void onResume() {
        super.onResume();
        boolean laidOut = ViewCompat.isLaidOut(glSurfaceView);
        boolean b = !glSurfaceView.isLayoutRequested();
        JLog.d(TAG, "onResume.....laidOut[%b],[%b]",laidOut,b);
        if (laidOut && b) {
            mCameraLoader.onResume(glSurfaceView.getWidth(), glSurfaceView.getHeight());
        } else {
            glSurfaceView.addOnLayoutChangeListener(new View.OnLayoutChangeListener() {
                @Override
                public void onLayoutChange(View v, int left, int top, int right, int bottom, int oldLeft, int oldTop,
                                           int oldRight, int oldBottom) {
                    JLog.d(TAG, "onResume.....onLayoutChange");
                    glSurfaceView.removeOnLayoutChangeListener(this);
                    mCameraLoader.onResume(glSurfaceView.getWidth(), glSurfaceView.getHeight());
                }
            });
        }
    }

Camera2OESLoader.java

public void onResume(int width, int height) {
        JLog.d(TAG, "onResume[%d,%d]...", width, height);
        mViewWidth = width;
        mViewHeight = height;
        setUpCamera();
    }


    protected void setUpCamera() {
        try {
            //Determine the display orientation, camera sensor orientation, and preview size
            setUpCameraOutputs();
            if (ActivityCompat.checkSelfPermission(mActivity, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                JLog.e(TAG, "Dont have CAMERA permission");
                return;
            }
            mCameraManager.openCamera(mCameraId, mCameraDeviceCallback, mCameraHandler);
        } catch (CameraAccessException e) {
            JLog.e(TAG, "Opening camera (ID: " + mCameraId + ") failed.");
            e.printStackTrace();
        }
    }
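
setUpCameraOutputs() is not reproduced in this article; a sketch of what it might do, assuming a back-facing camera, a hypothetical chooseOptimalSize() helper, and an assumed mSensorOrientation field:

    // Sketch only: pick a camera id, read the sensor orientation, and choose a
    // preview size close to the GLSurfaceView dimensions.
    private void setUpCameraOutputs() throws CameraAccessException {
        for (String cameraId : mCameraManager.getCameraIdList()) {
            CameraCharacteristics characteristics = mCameraManager.getCameraCharacteristics(cameraId);
            Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                continue; // skip the front camera in this sketch
            }
            StreamConfigurationMap map =
                    characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                continue;
            }
            mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
            //SurfaceTexture is the preview target, so choose from its supported output sizes
            mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class), mViewWidth, mViewHeight);
            mCameraId = cameraId;
            return;
        }
    }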

mCameraHandler is created from a HandlerThread inside Camera2OESLoader. Since the preview will later be started from the Renderer thread, the same Handler has to be specified in both places, so that opening the camera and starting the preview are handled on the same thread.
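
A minimal sketch of how that Handler can be created inside Camera2OESLoader (the thread name and the mCameraThread field are assumptions):

    // Sketch: a dedicated camera thread; its Looper backs mCameraHandler, so the
    // openCamera() and capture-session callbacks run on the same thread.
    private void startCameraThread() {
        mCameraThread = new HandlerThread("CameraThread");
        mCameraThread.start();
        mCameraHandler = new Handler(mCameraThread.getLooper());
    }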

Once the camera has opened successfully, we go to the Renderer to create the OES texture id and the SurfaceTexture, and to start the preview.
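
mCameraDeviceCallback, which was passed to openCamera() above, is not shown either; a sketch of the part that matters here, assuming it simply stores the opened device for startPreview():

    // Sketch: remember the opened CameraDevice; startPreview() uses it once the
    // Renderer has created the SurfaceTexture.
    private final CameraDevice.StateCallback mCameraDeviceCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            mCameraDevice = camera;
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            camera.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            camera.close();
            mCameraDevice = null;
        }
    };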

3. Start the Preview

  • JImageRender.onSurfaceCreated

Create the OES texture id

@Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        JLog.i(TAG, "onSurfaceCreated......");
        mOESTextureId = Utils.createOESTextureObject();
        mFilterEngine.ifNeedInit();

    }
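
Utils.createOESTextureObject() is not listed in this article; a minimal sketch of what such a helper usually does, namely generate a texture id and configure it for the external-OES target:

    // Sketch: create a texture bound to GL_TEXTURE_EXTERNAL_OES so a
    // SurfaceTexture can attach to it and deliver camera frames.
    public static int createOESTextureObject() {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
        return tex[0];
    }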
  • JImageRender.onDrawFrame
    Then, on the first call to onDrawFrame, initialize the SurfaceTexture and start the preview.
@Override
    public void onDrawFrame(GL10 gl) {

        long t1 = System.currentTimeMillis();
        if (mSurfaceTexture != null) {
            mSurfaceTexture.updateTexImage();
            mSurfaceTexture.getTransformMatrix(transformMatrix);
        }

        if (!bIsPreviewStarted) {
            //On the first frame, create the SurfaceTexture and start the camera preview
            bIsPreviewStarted = initSurfaceTexture();
            return;
        }

        glClearColor(1.0f, 0.0f, 0.0f, 0.0f);

        mFilterEngine.onDraw(transformMatrix,mOESTextureId,glCubeBuffer,glTextureBuffer);

        long t2 = System.currentTimeMillis();
        long t = t2 - t1;
        Log.i(TAG, "onDrawFrame: time: " + t);
    }
  • JImageRender.initSurfaceTexture
public boolean initSurfaceTexture() {
     
        mSurfaceTexture = new SurfaceTexture(mOESTextureId);
        mCameraLoaer.setPreviewTexture(mSurfaceTexture);
        mCameraLoaer.startPreview();
        return true;
    }
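
On the loader side, setPreviewTexture() is typically just a setter; a sketch, assuming the mSurfaceTexture field that startPreview() below uses:

    // Sketch: keep the Renderer's SurfaceTexture so startPreview() can wrap it in a Surface
    public void setPreviewTexture(SurfaceTexture surfaceTexture) {
        mSurfaceTexture = surfaceTexture;
    }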
  • Camera2OESLoader.startPreview
public void startPreview() {
       
        mSurfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        final Surface surface = new Surface(mSurfaceTexture);

        try {

            mCameraDevice.createCaptureSession(Arrays.asList( surface), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        CaptureRequest.Builder builder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
                        builder.addTarget(surface);
                        mCaptureRequest = builder.build();
                        mCameraCaptureSession = session;
                        mCameraCaptureSession.setRepeatingRequest(mCaptureRequest, null, mCameraHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {

                }
            }, mCameraHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

4. Draw and Render

Once the preview has started successfully, each frame updates the SurfaceTexture and the texture transform matrix that goes with the texture data, and the OES texture id backing the SurfaceTexture is then handed to OpenGL for drawing.

@Override
    public void onDrawFrame(GL10 gl) {

        long t1 = System.currentTimeMillis();
        if (mSurfaceTexture != null) {
            //Update the SurfaceTexture with the latest camera frame
            mSurfaceTexture.updateTexImage();
            mSurfaceTexture.getTransformMatrix(transformMatrix);
        }

        if (!bIsPreviewStarted) {
            //On the first frame, create the SurfaceTexture and start the camera preview
            bIsPreviewStarted = initSurfaceTexture();
            return;
        }

        glClearColor(1.0f, 0.0f, 0.0f, 0.0f);
        //Draw the texture content referenced by mOESTextureId with OpenGL
        mFilterEngine.onDraw(transformMatrix, mOESTextureId, glCubeBuffer, glTextureBuffer);
  
    }

Note: updateTexImage() may only be called on the thread that holds the OpenGL ES context, i.e., the dedicated thread created for the Renderer. It is normally called in onDrawFrame to bind the latest frame data to the corresponding OpenGL ES texture object.
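
If GL work ever needs to be triggered from another thread (a button click, for instance), GLSurfaceView.queueEvent() is the standard way to get onto the renderer thread; a small illustrative sketch using the glSurfaceView field from the Activity above:

// Sketch: the Runnable executes on the renderer thread, where OpenGL ES calls
// (including SurfaceTexture.updateTexImage()) are safe.
glSurfaceView.queueEvent(new Runnable() {
    @Override
    public void run() {
        // OpenGL ES work goes here
    }
});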

JImageFilter.onDraw

public void onDraw(float[] transformMatrix, int textureId, FloatBuffer cubeBuffer, FloatBuffer textureBuffer) {
      
        GLES30.glUseProgram(glProgId);
           ....
        if (textureId != OpenGlUtils.NO_TEXTURE) {
            ...
           glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
           ...
           glUniformMatrix4fv(uTextureMatrixLocation, 1, false, transformMatrix, 0);
        }
         ...
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);

    }

The onDraw flow is the same as the camera preview in the previous article; there are two main differences (a binding sketch follows the list):

  1. The texture target used to bind and unbind the texture id is GL_TEXTURE_EXTERNAL_OES
  2. The transform matrix produced by the SurfaceTexture must additionally be passed to the shader through uTextureMatrixLocation
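
The elided parts of onDraw() are not reproduced here, but a typical external-OES binding sequence looks like the sketch below; uTextureSamplerLocation is an assumed name for the location of the inputImageTexture uniform, the other names appear in the code and shaders of this article:

// Sketch only: bind the OES texture to unit 0 and feed the two uniforms
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
// Point the samplerExternalOES uniform (inputImageTexture) at texture unit 0
GLES20.glUniform1i(uTextureSamplerLocation, 0);
// Pass the SurfaceTexture transform matrix to uTextureMatrix in the vertex shader
GLES20.glUniformMatrix4fv(uTextureMatrixLocation, 1, false, transformMatrix, 0);
// ... issue the draw call, then unbind
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);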

5. Define the OES Shaders

Because the texture id uses the OES target, the texture type declared in the shaders has to change accordingly. Compared with the previous camera preview implementation, the shaders differ in two ways.

In the vertex shader, the transform matrix uTextureMatrix produced by the SurfaceTexture has to be passed in and used to adjust the texture coordinates.

attribute vec4 position;
uniform mat4 uTextureMatrix;
attribute vec4 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main()
{
  textureCoordinate = (uTextureMatrix * inputTextureCoordinate).xy;
  gl_Position = position;
}

In the fragment shader, the sampler type must be declared as samplerExternalOES instead of sampler2D, and the use of samplerExternalOES must be declared at the top of the file (the GL_OES_EGL_image_external extension); otherwise loading the shader will fail.

#extension GL_OES_EGL_image_external : require
precision mediump float;
uniform samplerExternalOES inputImageTexture;
varying vec2 textureCoordinate;
void main()
{
  gl_FragColor = texture2D(inputImageTexture, textureCoordinate);
}
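
On the Java side, the attribute and uniform names declared in these two shaders have to be looked up on the linked program before onDraw() can use them; a minimal sketch, where glProgId is the program id seen in onDraw() and the local variable names are only illustrative:

// Sketch: query the handles that onDraw() needs (names match the GLSL above)
int positionLocation = GLES20.glGetAttribLocation(glProgId, "position");
int inputTextureCoordinateLocation = GLES20.glGetAttribLocation(glProgId, "inputTextureCoordinate");
int uTextureMatrixLocation = GLES20.glGetUniformLocation(glProgId, "uTextureMatrix");
int inputImageTextureLocation = GLES20.glGetUniformLocation(glProgId, "inputImageTexture");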

6. Preview Result

(Screenshot: preview result)

7. Flow Summary

(Flow diagram: 相机OES预览.png)

The main flow is:

  1. Define the GLSurfaceView and the shaders that use the OES texture type

  2. Bind the GLSurfaceView to the Renderer and set the render mode

  3. Once the GLSurfaceView has finished layout, open the camera

  4. In the Renderer's onSurfaceCreated, create the OES texture id and initialize the OpenGL objects

  5. After the camera has opened successfully, start the preview in onDrawFrame; it is started there because at any earlier stage the camera may not have opened yet

  6. Once the preview is running, each Renderer frame updates the SurfaceTexture content and the corresponding OES texture, and the texture id is passed to the OpenGL objects for rendering

  7. After rendering, the preview frames are displayed on screen

8. Code Location

For the full implementation, see JCamera2OESFboActivity in the QCCamera project.
