Displaying camera filter effects on Android with OpenGL ES

  I suddenly found myself with a free afternoon and wanted something to do. I'm a beginner at both Android and OpenGL ES, and since so much of my experience has come from this platform, I'd like to share some code I put together: how to use OpenGL ES to display a filtered camera preview on Android. First, a post by a predecessor that this builds on, which mainly covers YUV-to-RGB conversion inside the shader: http://blog.csdn.net/eastlhu/article/details/9382431


Prerequisites

 Tools: Android Studio; Android 4.0; OpenGL ES 2.0


Approach:

Use OpenGL ES to draw the camera frames and apply a filter to them. Drawing on the GPU reduces CPU usage.


Preview



The first half shows the raw camera feed; the second half shows the same data drawn with OpenGL ES with a champagne filter applied.


1. Getting camera data

 By using the Camera class that Android provides, you can open the camera and display its frames on a SurfaceView. The Camera class defines a PreviewCallback interface for receiving the YUV data the camera captures (phones generally capture YUV420SP, i.e. NV21). I wrote my own camera-capture class; its rough skeleton is as follows:


// Camera capture and preview drawing
public class CameraView extends SurfaceView implements SurfaceHolder.Callback, Camera.PreviewCallback
{
    // Source frame width/height
    private int srcFrameWidth  = 640;
    private int srcFrameHeight = 480;

    // Capture state
    private int curCameraIndex = 1;
    private Camera camera = null;
    // Holder of the preview surface
    private SurfaceHolder surfaceHolder;

    public CameraView(Context context)
    {
        super(context);

        if (Camera.getNumberOfCameras() > 1)
        {
            curCameraIndex = 1;
        }
        else
        {
            curCameraIndex = 0;
        }

        surfaceHolder = getHolder();
        surfaceHolder.addCallback(this);
        surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceCreated(SurfaceHolder sh)
    {
        stopCamera();
        startCamera(curCameraIndex);
    }

    @Override
    public void surfaceChanged(SurfaceHolder sh, int format, int width, int height)
    {


    }

    @Override
    public void surfaceDestroyed(SurfaceHolder sh)
    {
        stopCamera();
    }

    // Initialize the camera for the given index
    public void initCamera(int cameraIndex)
    {
        // Initialize and open the camera
        if (camera == null)
        {
            try
            {
                camera = Camera.open(cameraIndex);
            }
            catch (Exception e)
            {
                return;
            }

            Camera.Parameters params = camera.getParameters();
            if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE))
            {
                // Continuous auto-focus
                params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
            }

            try
            {
                params.setPreviewFormat(ImageFormat.NV21);
                params.setPreviewSize(srcFrameWidth, srcFrameHeight);
                camera.setParameters(params);

                params = camera.getParameters();
                params.setPreviewFpsRange(15*1000, 30*1000);
                camera.setParameters(params);
            }
            catch (Exception e)
            {
                // Some devices reject these parameters; keep the defaults
            }

            try
            {
                camera.setPreviewDisplay(surfaceHolder);
            }
            catch(Exception ex)
            {
                ex.printStackTrace();
            }

            camera.startPreview();
            camera.setPreviewCallback(this);
            // Display rotation is device-specific; 180 suited my test phone
            camera.setDisplayOrientation(180);
        }
    }

    // Open the camera
    private void startCamera(int cameraIndex)
    {
        initCamera(cameraIndex);
    }

    // Stop and release the camera
    public void stopCamera()
    {
        if (camera != null)
        {
            camera.setPreviewCallback(null);
            camera.stopPreview();
            camera.release();
            camera = null;
        }

    }
    // Receives each camera frame
    @Override
    public void onPreviewFrame(byte[] data, Camera camera)
    {
        synchronized (this)
        {
            // Hand the NV21 frame off to its consumer here
        }
    }
  
}
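As a side note on the buffer delivered to onPreviewFrame(): an NV21 frame is width*height bytes of Y followed by width*height/2 bytes of interleaved VU, i.e. 1.5 bytes per pixel. A small sketch of the size math (Nv21Size is a hypothetical helper, not part of the project above):

```java
public class Nv21Size
{
    // NV21 (YUV420SP): a full-resolution Y plane plus a half-size
    // interleaved VU plane, 1.5 bytes per pixel in total
    public static int bufferSize(int width, int height)
    {
        int ySize  = width * height; // one byte per pixel
        int vuSize = ySize / 2;      // one VU pair per 2x2 pixel block
        return ySize + vuSize;
    }

    public static void main(String[] args)
    {
        System.out.println(bufferSize(640, 480)); // prints "460800"
    }
}
```

This matches the 640x480 buffers the CameraView above is configured to produce.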




2. Drawing with OpenGL ES

 In Android development the camera preview is normally displayed via a SurfaceView, with the actual drawing done by low-level APIs, so there is little room for OpenGL ES to work with a plain SurfaceView. Where, then, should the OpenGL ES output be displayed? If you don't mind the hassle, you could draw the pixels yourself with a View's Canvas. Fortunately, Android provides a dedicated class, GLSurfaceView, for displaying OpenGL output. GLSurfaceView defines a Renderer interface where all the drawing happens; it has three methods to implement:

    public class MyGL20Renderer implements GLSurfaceView.Renderer
    {

        @Override
        public void onSurfaceCreated(GL10 unused, EGLConfig config)
        {
           
        }

        @Override
        public void onDrawFrame(GL10 unused)
        {
        
        }

        @Override
        public void onSurfaceChanged(GL10 unused, int width, int height)
        {
          
        }
    }
The three methods are self-explanatory: onSurfaceCreated handles one-time initialization, such as setting the clear color and preparing textures; onDrawFrame does the actual drawing; onSurfaceChanged is called when the surface size changes.


Before drawing with OpenGL ES, a little preparation is needed:

1. Declare that the app requires OpenGL ES 2.0 by adding the following to AndroidManifest.xml:

<uses-feature android:glEsVersion="0x00020000" android:required="true" />

2. Enable OpenGL ES 2.0 drawing with the calls below: set the Renderer and the render mode

setEGLContextClientVersion(2);
// Set the Renderer on the GLSurfaceView
setRenderer(new MyGL20Renderer());
// Only redraw the view when the data changes (requestRender() is called)
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);


3. Drawing details (the code)

 MainActivity is as follows:

public class MainActivity extends Activity
{
    private CameraGLSurfaceView mGLView = null;
    private DrawYUVView mDrawView = null;
    private CameraView cameraView = null;
    private RelativeLayout previewLayout = null,GLpreviewLayout = null,DrawpreviewLayout;
    @Override
    protected void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
    
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.activity_main);
        RelativeLayout.LayoutParams layoutParams = null;

        // GL drawing window
        GLpreviewLayout = (RelativeLayout)findViewById(R.id.GLpreviewLayout);
        layoutParams = new RelativeLayout.LayoutParams(640,480);
        mGLView = new CameraGLSurfaceView(this);
        GLpreviewLayout.addView(mGLView, layoutParams);

        // Camera preview window
        previewLayout = (RelativeLayout)findViewById(R.id.previewLayout);
        layoutParams = new RelativeLayout.LayoutParams(640,480 );
        cameraView = new CameraView(this);
        cameraView.setSaveFrameCallback(mGLView);
        previewLayout.addView(cameraView, layoutParams);

    }
}

CameraGLSurfaceView is a class I wrote that extends GLSurfaceView; all of the OpenGL ES drawing is done inside it. It also implements a SaveFrameCallback interface so that it receives camera frames through a callback, which is wired up by the cameraView.setSaveFrameCallback(mGLView) call in the code above.
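The SaveFrameCallback interface itself is not shown in this post; a minimal sketch of what it could look like, with the method signature assumed from the onSaveFrames() implementation further below:

```java
// Hypothetical sketch of the callback interface that CameraView exposes;
// the single method mirrors the onSaveFrames() implementation below.
interface SaveFrameCallback
{
    // Receives one NV21 frame and its length from onPreviewFrame()
    void onSaveFrames(byte[] data, int length);
}
```

CameraView would hold a reference set by setSaveFrameCallback() and invoke onSaveFrames(data, data.length) inside onPreviewFrame().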


CameraGLSurfaceView 

public class CameraGLSurfaceView extends GLSurfaceView implements CameraView.SaveFrameCallback
{
    // Source frame width/height
    private int srcFrameWidth  = 640;
    private int srcFrameHeight = 480;
    private int viewWidth = 0, viewHeight = 0;
    private int frameWidth = 640, frameHeight = 480;


    private ByteBuffer yBuf = null, uBuf = null, vBuf = null;
    private  int yuvFrameSize = 640*480;
    // Texture ids
    private int[] Ytexture = new int[1];
    private int[] Utexture = new int[1];
    private int[] Vtexture = new int[1];
    private int[] Mtexture = new int[1];
    private int aPositionMain = 0, aTexCoordMain = 0,  uYTextureMain = 0, uUTextureMain = 0, uVTextureMain = 0,uMTextureMain = 0;
    private int programHandleMain = 0;
    private static final int FLOAT_SIZE_BYTES = 4;

    private FloatBuffer squareVertices = null;
    private FloatBuffer coordVertices = null;
    private boolean mbpaly = false;


    public CameraGLSurfaceView(Context context)
    {
        super(context);
        setEGLContextClientVersion(2);
        // Set the Renderer on the GLSurfaceView
        setRenderer(new MyGL20Renderer());
        // Only redraw the view when the data changes (requestRender() is called)
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);

        int qtrFrameSize = yuvFrameSize >> 2;
        yBuf = ByteBuffer.allocateDirect(yuvFrameSize);
        yBuf.order(ByteOrder.nativeOrder()).position(0);

        uBuf = ByteBuffer.allocateDirect(qtrFrameSize);
        uBuf.order(ByteOrder.nativeOrder()).position(0);

        vBuf = ByteBuffer.allocateDirect(qtrFrameSize);
        vBuf.order(ByteOrder.nativeOrder()).position(0);

        // Vertex coordinates
        squareVertices = ByteBuffer.allocateDirect(util.squareVertices.length * FLOAT_SIZE_BYTES).order(ByteOrder.nativeOrder()).asFloatBuffer();
        squareVertices.put(util.squareVertices).position(0);
        // Texture coordinates
        coordVertices = ByteBuffer.allocateDirect(util.coordVertices.length * FLOAT_SIZE_BYTES).order(ByteOrder.nativeOrder()).asFloatBuffer();
        coordVertices.put(util.coordVertices).position(0);

    }

    public class MyGL20Renderer implements GLSurfaceView.Renderer
    {

        public void onSurfaceCreated(GL10 unused, EGLConfig config)
        {
            mbpaly = false;
            // Set the clear color
            GLES20.glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
            // In ES 2.0 texturing is controlled entirely by the shaders,
            // so no glEnable(GL_TEXTURE_2D) call is needed (it is invalid here)
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
            GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
            GLES20.glActiveTexture(GLES20.GL_TEXTURE3);

            changeFilterShader(0);
            // Create the Y/U/V and mark-layer textures
            createTexture(frameWidth, frameHeight, GLES20.GL_LUMINANCE, Ytexture);
            createTexture(frameWidth>>1, frameHeight>>1, GLES20.GL_LUMINANCE, Utexture);
            createTexture(frameWidth >> 1, frameHeight >> 1, GLES20.GL_LUMINANCE, Vtexture);
            createTexture(frameWidth, frameHeight, GLES20.GL_RGBA, Mtexture);


            mbpaly = true;
        }

        private void changeFilterShader( int filterId )
        {
            programHandleMain = util.createShaderProgram();
            if ( programHandleMain != -1 )
            {
                // Look up the vertex shader attributes
                aPositionMain = getShaderHandle(programHandleMain, "vPosition");
                aTexCoordMain = getShaderHandle(programHandleMain, "a_texCoord");
                // Look up the fragment shader samplers
                uYTextureMain = getShaderHandle(programHandleMain, "SamplerY");
                uUTextureMain = getShaderHandle(programHandleMain, "SamplerU");
                uVTextureMain = getShaderHandle(programHandleMain, "SamplerV");
                uMTextureMain = getShaderHandle(programHandleMain, "SamplerM");

                // Use the filter shader program
                GLES20.glUseProgram(programHandleMain);

                // Bind each sampler uniform to its texture unit
                GLES20.glUniform1i(uYTextureMain, 0);
                GLES20.glUniform1i(uUTextureMain, 1);
                GLES20.glUniform1i(uVTextureMain, 2);
                GLES20.glUniform1i(uMTextureMain, 3);
                GLES20.glEnableVertexAttribArray(aPositionMain);
                GLES20.glEnableVertexAttribArray(aTexCoordMain);

                // Upload the vertex and texture-coordinate data
                squareVertices.position(0);
                GLES20.glVertexAttribPointer(aPositionMain, 2, GLES20.GL_FLOAT, false, 0, squareVertices);
                coordVertices.position(0);
                GLES20.glVertexAttribPointer(aTexCoordMain, 2, GLES20.GL_FLOAT, false, 0, coordVertices);
            }
        }

        // Create a texture of the given size and format
        private void createTexture(int width, int height, int format, int[] textureId)
        {
            // Generate the texture id
            GLES20.glGenTextures(1, textureId, 0);
            // Bind it
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
            // Set wrap/filter parameters and allocate storage
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, format, width, height, 0, format, GLES20.GL_UNSIGNED_BYTE, null);
        }
        public void onDrawFrame(GL10 unused)
        {
            // Clear to the background color
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

            if ( yBuf != null )
            {
                //y
                GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, Ytexture[0]);
                GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D,
                        0,
                        0,
                        0,
                        frameWidth,
                        frameHeight,
                        GLES20.GL_LUMINANCE,
                        GLES20.GL_UNSIGNED_BYTE,
                        yBuf);

                //u
                GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, Utexture[0]);
                GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D,
                        0,
                        0,
                        0,
                        frameWidth >> 1,
                        frameHeight >> 1,
                        GLES20.GL_LUMINANCE,
                        GLES20.GL_UNSIGNED_BYTE,
                        uBuf);

                //v
                GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, Vtexture[0]);
                GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D,
                        0,
                        0,
                        0,
                        frameWidth >> 1,
                        frameHeight >> 1,
                        GLES20.GL_LUMINANCE,
                        GLES20.GL_UNSIGNED_BYTE,
                        vBuf);

                // mark (overlay) layer: upload real RGBA pixels here when a
                // mark bitmap is available; glTexSubImage2D must not be passed
                // a null buffer, so the upload is skipped for now
                GLES20.glActiveTexture(GLES20.GL_TEXTURE3);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, Mtexture[0]);
            }

            // Draw
            GLES20.glViewport(0, 0, viewWidth, viewHeight);


            GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        }

        public void onSurfaceChanged(GL10 unused, int width, int height)
        {
            viewWidth  = width;
            viewHeight = height;
            GLES20.glViewport(0, 0, viewWidth, viewHeight);
        }
    }


    public void onSaveFrames(byte[] data, int length)
    {
        if (  length != 0 && mbpaly )
        {
            yBuf.clear();
            uBuf.clear();
            vBuf.clear();
            rotateYUV(data, srcFrameWidth, srcFrameHeight);
            requestRender();
        }
    }

    // Look up a shader variable by name: attributes first, then uniforms
    public int getShaderHandle(int programHandle, String name)
    {
        int handle = GLES20.glGetAttribLocation(programHandle, name);
        if (handle == -1)
        {
            handle = GLES20.glGetUniformLocation(programHandle, name);
        }
        return handle;
    }

    // Despite its name this does not rotate; it splits the NV21 frame
    // into separate Y, U and V planes for upload as textures
    public void rotateYUV(byte[] src, int width, int height)
    {
        byte [] yArray = new  byte[yBuf.limit()];
        byte [] uArray = new  byte[uBuf.limit()];
        byte [] vArray = new  byte[vBuf.limit()];
        int nFrameSize = width * height;
        int k          = 0;
        int uvCount    = nFrameSize>>1;

        // Copy the Y plane
        for(int i = 0;i < height*width;i++ )
        {
            yArray[ k ] = src[ i ];
            k++;
        }

        k = 0;

        // De-interleave the VU pairs (NV21 stores V before U)
        for( int i = 0;i < uvCount ;i+=2 )
        {
            vArray[ k ] = src[ nFrameSize +  i ]; //v
            uArray[ k ] = src[ nFrameSize +  i + 1 ];//u
            k++;
        }

        yBuf.put(yArray).position(0);
        uBuf.put(uArray).position(0);
        vBuf.put(vArray).position(0);
    }
}

1. onSurfaceCreated

This method performs the one-time OpenGL ES setup: changeFilterShader() creates the vertex and fragment shaders, looks up their variables and gives them initial values, and then the Y/U/V and mark-layer textures are created.

2. rotateYUV()

This method extracts the Y, U and V planes from the frame. If the colors look wrong on your device, try swapping the u and v assignments inside the VU de-interleaving loop. requestRender() is then called to trigger a redraw.
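To make the VU ordering concrete, here is a pure-Java sketch of the same plane split as rotateYUV(), run on a tiny hypothetical 2x2 NV21 frame (values chosen for illustration):

```java
public class Nv21SplitDemo
{
    // Split an NV21 buffer (Y plane followed by interleaved VU pairs)
    // into separate Y, U and V arrays, mirroring rotateYUV() above
    public static byte[][] split(byte[] src, int width, int height)
    {
        int frameSize = width * height;
        byte[] y = new byte[frameSize];
        byte[] u = new byte[frameSize >> 2];
        byte[] v = new byte[frameSize >> 2];

        // Y plane is a straight copy
        System.arraycopy(src, 0, y, 0, frameSize);

        // De-interleave the VU pairs (V comes first in NV21, then U)
        int k = 0;
        for (int i = 0; i < (frameSize >> 1); i += 2)
        {
            v[k] = src[frameSize + i];
            u[k] = src[frameSize + i + 1];
            k++;
        }
        return new byte[][] { y, u, v };
    }

    public static void main(String[] args)
    {
        // 2x2 frame: 4 Y bytes, then one VU pair
        byte[] nv21 = { 10, 11, 12, 13, /* V */ 20, /* U */ 30 };
        byte[][] planes = split(nv21, 2, 2);
        System.out.println(planes[1][0] + " " + planes[2][0]); // prints "30 20"
    }
}
```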


The vertex and fragment shader source:

   public static String  VERTEX_SHADER =
        "attribute vec4 vPosition;    \n"+
        "attribute vec2 a_texCoord;   \n"+
        "varying vec2 tc;             \n"+
        "void main()                  \n"+
        "{                            \n"+
        "   gl_Position = vPosition;  \n"+
        "   tc = a_texCoord;          \n"+
        "}                            \n";

public static String FRAG_SHADER =
    "varying lowp vec2 tc;      \n"+
    "uniform sampler2D SamplerY;\n"+
    "uniform sampler2D SamplerU;\n"+
    "uniform sampler2D SamplerV;\n"+
    "void main(void)            \n"+
    "{                          \n"+
    "mediump vec3 yuv;          \n"+
    "lowp vec3 rgb;             \n"+
    "yuv.x = texture2D(SamplerY, tc).r;     \n"+
    "yuv.y = texture2D(SamplerU, tc).r - 0.5;\n"+
    "yuv.z = texture2D(SamplerV, tc).r - 0.5;\n"+
    "rgb = mat3( 1,1,1,0,-0.39465,2.03211,1.13983,-0.58060,0) * yuv;\n"+
    "gl_FragColor = vec4(rgb, 1);\n"+
    "}                          \n";
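The mat3 in the fragment shader holds the BT.601 YUV-to-RGB coefficients written in GLSL's column-major order. A CPU-side Java sketch of the same arithmetic, handy for sanity-checking shader output:

```java
public class YuvToRgb
{
    // Same BT.601 coefficients as the fragment shader's mat3
    // (GLSL mat3 is column-major); u and v are already centred (-0.5..0.5)
    public static float[] convert(float y, float u, float v)
    {
        float r = y + 1.13983f * v;
        float g = y - 0.39465f * u - 0.58060f * v;
        float b = y + 2.03211f * u;
        return new float[] { r, g, b };
    }

    public static void main(String[] args)
    {
        // A neutral grey pixel (u = v = 0) stays grey
        float[] rgb = convert(0.5f, 0.0f, 0.0f);
        System.out.println(rgb[0] + " " + rgb[1] + " " + rgb[2]); // prints "0.5 0.5 0.5"
    }
}
```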


That covers the main drawing work. My demo project is linked below; since I don't have many CSDN points, the download costs a few points.


http://download.csdn.net/detail/chylove5/9240905


