I suddenly found myself with a free afternoon and felt restless, so I went looking for something to do. I am new to Android and also new to OpenGL ES, and since much of my experience has come from this community, today I'd like to share some code I put together: how to use OpenGL ES to display a filtered camera preview on Android. First, a post by a predecessor that this builds on, mainly for the YUV-to-RGB conversion done in the shader: http://blog.csdn.net/eastlhu/article/details/9382431
Prerequisites
Tools: Android Studio, Android 4.0, OpenGL ES 2.0
Approach:
Draw the camera frames with OpenGL ES and apply a filter effect to them, so the drawing is done on the GPU and CPU load is reduced.
Effect preview
The first half of the clip is the raw camera feed; the second half is the same feed drawn by OpenGL ES with a "champagne" filter effect applied.
1. Capturing camera data
Using the Camera class that Android provides, we can open the camera and display its frames in a SurfaceView. The Camera class exposes a PreviewCallback interface that receives the YUV data the camera captures (phones generally deliver yuv420sp, i.e. NV21). I wrote my own camera-capture class; its rough skeleton is as follows:
// Camera capture and preview drawing
public class CameraView extends SurfaceView implements SurfaceHolder.Callback, Camera.PreviewCallback
{
    // Source frame width/height
    private int srcFrameWidth = 640;
    private int srcFrameHeight = 480;
    // Capture state
    private int curCameraIndex = 1;
    private Camera camera = null;
    // Holder used to display the preview frames
    private SurfaceHolder surfaceHolder;

    public CameraView(Context context)
    {
        super(context);
        if (Camera.getNumberOfCameras() > 1)
        {
            curCameraIndex = 1;
        }
        else
        {
            curCameraIndex = 0;
        }
        surfaceHolder = getHolder();
        surfaceHolder.addCallback(this);
        surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceCreated(SurfaceHolder sh)
    {
        stopCamera();
        startCamera(curCameraIndex);
    }

    @Override
    public void surfaceChanged(SurfaceHolder sh, int format, int width, int height)
    {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder sh)
    {
        stopCamera();
    }

    // Initialize the camera with the given index
    public void initCamera(int cameraIndex)
    {
        // Open and configure the camera
        if (camera == null)
        {
            try
            {
                camera = Camera.open(cameraIndex);
            }
            catch (Exception e)
            {
                return;
            }
            Camera.Parameters params = camera.getParameters();
            if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE))
            {
                // Continuous auto-focus
                params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
            }
            try
            {
                params.setPreviewFormat(ImageFormat.NV21);
                params.setPreviewSize(srcFrameWidth, srcFrameHeight);
                camera.setParameters(params);
                params = camera.getParameters();
                params.setPreviewFpsRange(15 * 1000, 30 * 1000);
                camera.setParameters(params);
            }
            catch (Exception e)
            {
            }
            try
            {
                camera.setPreviewDisplay(surfaceHolder);
            }
            catch (Exception ex)
            {
                ex.printStackTrace();
            }
            camera.startPreview();
            camera.setPreviewCallback(this);
            camera.setDisplayOrientation(180);
        }
    }

    // Start the camera
    private void startCamera(int cameraIndex)
    {
        initCamera(cameraIndex);
    }

    // Stop and release the camera
    public void stopCamera()
    {
        if (camera != null)
        {
            camera.setPreviewCallback(null);
            camera.stopPreview();
            camera.release();
            camera = null;
        }
    }

    // Receive camera preview frames
    @Override
    public void onPreviewFrame(byte[] data, Camera camera)
    {
        synchronized (this)
        {
            // frame handling goes here
        }
    }
}
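A quick note on sizing the buffer that onPreviewFrame delivers: an NV21 frame is a full-resolution Y plane followed by one interleaved V/U plane subsampled 2x2, so one frame occupies width * height * 3/2 bytes. A minimal sketch of that arithmetic (the class and method names here are my own, for illustration only):

```java
// Helper for the NV21 (yuv420sp) buffer layout: a full-size Y plane,
// then V and U interleaved at half resolution in each dimension.
public class Nv21Layout {
    // Total bytes of one NV21 frame: Y plane + interleaved VU plane.
    public static int frameSize(int width, int height) {
        return width * height * 3 / 2;
    }

    // Byte offset where the interleaved V/U plane starts (right after Y).
    public static int vuPlaneOffset(int width, int height) {
        return width * height;
    }
}
```

For the 640x480 preview used in this post, that is 460,800 bytes per frame, which matches the length of the byte[] the callback receives.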
2. Drawing with OpenGL ES
In Android development, camera frames are normally displayed through a SurfaceView, where the actual drawing is done by the platform's lower-level APIs, so there is not much room for OpenGL ES to work inside a plain SurfaceView. Where, then, should the data drawn by OpenGL ES be displayed? If you don't mind the trouble, you can paint the pixels yourself with a View's Canvas. But Android provides a dedicated class, GLSurfaceView, that displays OpenGL output much better. GLSurfaceView has a Renderer interface, and that is where the drawing happens. The interface has three methods to implement:
public class MyGL20Renderer implements GLSurfaceView.Renderer
{
    @Override
    public void onSurfaceCreated(GL10 unused, EGLConfig config)
    {
    }

    @Override
    public void onDrawFrame(GL10 unused)
    {
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height)
    {
    }
}
The three methods are self-explanatory: onSurfaceCreated is for initialization work, such as setting the clear color and enabling texturing; onDrawFrame performs the actual drawing; onSurfaceChanged is called whenever the surface size changes.
Before OpenGL ES drawing can begin, a little preparation is needed:
1. Declare in AndroidManifest.xml that the app requires OpenGL ES support, so devices that cannot draw with it are filtered out.
2. Enable OpenGL ES 2.0 rendering, attach the Renderer, and set the render mode:
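The manifest snippet itself appears to have been lost from the post; the standard declaration for requiring OpenGL ES 2.0, which I assume is what was intended here, is:

```xml
<!-- In AndroidManifest.xml: require OpenGL ES 2.0 so the Play Store
     filters out devices that cannot run the app -->
<uses-feature android:glEsVersion="0x00020000" android:required="true" />
```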
setEGLContextClientVersion(2);
// Attach the Renderer to the GLSurfaceView
setRenderer(new MyGL20Renderer());
// Only redraw the view when the data changes
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
3. The drawing in detail (on to the code)
MainActivity is as follows:
public class MainActivity extends Activity
{
    private CameraGLSurfaceView mGLView = null;
    private DrawYUVView mDrawView = null;
    private CameraView cameraView = null;
    private RelativeLayout previewLayout = null, GLpreviewLayout = null, DrawpreviewLayout;

    @Override
    protected void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.activity_main);
        RelativeLayout.LayoutParams layoutParams = null;
        // GL drawing window
        GLpreviewLayout = (RelativeLayout) findViewById(R.id.GLpreviewLayout);
        layoutParams = new RelativeLayout.LayoutParams(640, 480);
        mGLView = new CameraGLSurfaceView(this);
        GLpreviewLayout.addView(mGLView, layoutParams);
        // Camera preview window
        previewLayout = (RelativeLayout) findViewById(R.id.previewLayout);
        layoutParams = new RelativeLayout.LayoutParams(640, 480);
        cameraView = new CameraView(this);
        cameraView.setSaveFrameCallback(mGLView);
        previewLayout.addView(cameraView, layoutParams);
    }
}
CameraGLSurfaceView is a class I wrote that extends GLSurfaceView; all the OpenGL ES drawing is done inside it. The class also implements a SaveFrameCallback interface, receiving the camera's frames through a callback that is registered by the cameraView.setSaveFrameCallback(mGLView) call in the code above.
CameraGLSurfaceView
public class CameraGLSurfaceView extends GLSurfaceView implements CameraView.SaveFrameCallback
{
    // Source frame width/height
    private int srcFrameWidth = 640;
    private int srcFrameHeight = 480;
    private int viewWidth = 0, viewHeight = 0;
    private int frameWidth = 640, frameHeight = 480;
    private ByteBuffer yBuf = null, uBuf = null, vBuf = null;
    private int yuvFrameSize = 640 * 480;
    // Texture ids
    private int[] Ytexture = new int[1];
    private int[] Utexture = new int[1];
    private int[] Vtexture = new int[1];
    private int[] Mtexture = new int[1];
    private int aPositionMain = 0, aTexCoordMain = 0, uYTextureMain = 0,
            uUTextureMain = 0, uVTextureMain = 0, uMTextureMain = 0;
    private int programHandleMain = 0;
    private static final int FLOAT_SIZE_BYTES = 4;
    private FloatBuffer squareVertices = null;
    private FloatBuffer coordVertices = null;
    private boolean mbpaly = false;

    public CameraGLSurfaceView(Context context)
    {
        super(context);
        setEGLContextClientVersion(2);
        // Attach the Renderer to the GLSurfaceView
        setRenderer(new MyGL20Renderer());
        // Only redraw the view when the data changes
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
        int qtrFrameSize = yuvFrameSize >> 2;
        yBuf = ByteBuffer.allocateDirect(yuvFrameSize);
        yBuf.order(ByteOrder.nativeOrder()).position(0);
        uBuf = ByteBuffer.allocateDirect(qtrFrameSize);
        uBuf.order(ByteOrder.nativeOrder()).position(0);
        vBuf = ByteBuffer.allocateDirect(qtrFrameSize);
        vBuf.order(ByteOrder.nativeOrder()).position(0);
        // Vertex coordinates
        squareVertices = ByteBuffer.allocateDirect(util.squareVertices.length * FLOAT_SIZE_BYTES)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        squareVertices.put(util.squareVertices).position(0);
        // Texture coordinates
        coordVertices = ByteBuffer.allocateDirect(util.coordVertices.length * FLOAT_SIZE_BYTES)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        coordVertices.put(util.coordVertices).position(0);
    }
    public class MyGL20Renderer implements GLSurfaceView.Renderer
    {
        public void onSurfaceCreated(GL10 unused, EGLConfig config)
        {
            mbpaly = false;
            // Set the background clear color
            GLES20.glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
            // Enable texturing
            GLES20.glEnable(GLES20.GL_TEXTURE_2D);
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
            GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
            GLES20.glActiveTexture(GLES20.GL_TEXTURE3);
            changeFilterShader(0);
            // Create the Y/U/V textures plus the mark (overlay) texture
            createTexture(frameWidth, frameHeight, GLES20.GL_LUMINANCE, Ytexture);
            createTexture(frameWidth >> 1, frameHeight >> 1, GLES20.GL_LUMINANCE, Utexture);
            createTexture(frameWidth >> 1, frameHeight >> 1, GLES20.GL_LUMINANCE, Vtexture);
            createTexture(frameWidth, frameHeight, GLES20.GL_RGBA, Mtexture);
            mbpaly = true;
        }

        private void changeFilterShader(int filterId)
        {
            programHandleMain = util.createShaderProgram();
            if (programHandleMain != -1)
            {
                // Look up the vertex shader attributes
                aPositionMain = getShaderHandle(programHandleMain, "vPosition");
                aTexCoordMain = getShaderHandle(programHandleMain, "a_texCoord");
                // Look up the fragment shader uniforms
                uYTextureMain = getShaderHandle(programHandleMain, "SamplerY");
                uUTextureMain = getShaderHandle(programHandleMain, "SamplerU");
                uVTextureMain = getShaderHandle(programHandleMain, "SamplerV");
                uMTextureMain = getShaderHandle(programHandleMain, "SamplerM");
                // Use the filter shader program
                GLES20.glUseProgram(programHandleMain);
                // Bind each sampler to its texture unit
                GLES20.glUniform1i(uYTextureMain, 0);
                GLES20.glUniform1i(uUTextureMain, 1);
                GLES20.glUniform1i(uVTextureMain, 2);
                GLES20.glUniform1i(uMTextureMain, 3);
                GLES20.glEnableVertexAttribArray(aPositionMain);
                GLES20.glEnableVertexAttribArray(aTexCoordMain);
                // Feed the vertex data
                squareVertices.position(0);
                GLES20.glVertexAttribPointer(aPositionMain, 2, GLES20.GL_FLOAT, false, 0, squareVertices);
                coordVertices.position(0);
                GLES20.glVertexAttribPointer(aTexCoordMain, 2, GLES20.GL_FLOAT, false, 0, coordVertices);
            }
        }

        // Create a texture
        private void createTexture(int width, int height, int format, int[] textureId)
        {
            // Generate and bind the texture
            GLES20.glGenTextures(1, textureId, 0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
            // Set texture parameters
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, format, width, height, 0,
                    format, GLES20.GL_UNSIGNED_BYTE, null);
        }

        public void onDrawFrame(GL10 unused)
        {
            // Clear to the background color
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
            if (yBuf != null)
            {
                // Y plane
                GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, Ytexture[0]);
                GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, frameWidth, frameHeight,
                        GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, yBuf);
                // U plane
                GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, Utexture[0]);
                GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, frameWidth >> 1, frameHeight >> 1,
                        GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, uBuf);
                // V plane
                GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, Vtexture[0]);
                GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, frameWidth >> 1, frameHeight >> 1,
                        GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, vBuf);
                // Mark (overlay) layer
                GLES20.glActiveTexture(GLES20.GL_TEXTURE3);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, Mtexture[0]);
                GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, frameWidth, frameHeight,
                        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
            }
            // Draw
            GLES20.glViewport(0, 0, viewWidth, viewHeight);
            GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        }

        public void onSurfaceChanged(GL10 unused, int width, int height)
        {
            viewWidth = width;
            viewHeight = height;
            GLES20.glViewport(0, 0, viewWidth, viewHeight);
        }
    }
    public void onSaveFrames(byte[] data, int length)
    {
        if (length != 0 && mbpaly)
        {
            yBuf.clear();
            uBuf.clear();
            vBuf.clear();
            rotateYUV(data, srcFrameWidth, srcFrameHeight);
            requestRender();
        }
    }

    public int getShaderHandle(int programHandle, String name)
    {
        int handle = GLES20.glGetAttribLocation(programHandle, name);
        if (handle == -1)
        {
            handle = GLES20.glGetUniformLocation(programHandle, name);
        }
        return handle;
    }

    public void rotateYUV(byte[] src, int width, int height)
    {
        byte[] yArray = new byte[yBuf.limit()];
        byte[] uArray = new byte[uBuf.limit()];
        byte[] vArray = new byte[vBuf.limit()];
        int nFrameSize = width * height;
        int k = 0;
        int uvCount = nFrameSize >> 1;
        // Copy the Y plane
        for (int i = 0; i < height * width; i++)
        {
            yArray[k] = src[i];
            k++;
        }
        k = 0;
        // De-interleave the U and V values
        for (int i = 0; i < uvCount; i += 2)
        {
            vArray[k] = src[nFrameSize + i];       // V
            uArray[k] = src[nFrameSize + i + 1];   // U
            k++;
        }
        yBuf.put(yArray).position(0);
        uBuf.put(uArray).position(0);
        vBuf.put(vArray).position(0);
    }
}
1. onSurfaceCreated
This method mainly performs the OpenGL ES initialization work. For example, changeFilterShader creates the vertex and fragment shaders and looks up and initializes the corresponding handles, and the Y/U/V textures and the mark-layer texture are created.
2. rotateYUV()
This method extracts the Y, U and V planes. If the colors look wrong on your device, try swapping the U and V assignments inside the chroma loop. requestRender() is then called to trigger the draw.
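The U/V swap suggested above comes down to which chroma byte comes first in the interleaved plane: NV21 stores V before U, while NV12 stores U before V. A standalone sketch of the de-interleave step (the class name is mine; the loop mirrors the one in rotateYUV):

```java
// De-interleave the VU plane of an NV21 frame into separate U and V arrays.
// In NV21 the chroma bytes are ordered V, U, V, U, ...; if the colors come
// out wrong (e.g. the source is actually NV12), swap the two assignments.
public class ChromaSplit {
    public static byte[][] split(byte[] nv21, int width, int height) {
        int ySize = width * height;
        int uvCount = ySize / 2;          // bytes in the interleaved plane
        byte[] u = new byte[uvCount / 2];
        byte[] v = new byte[uvCount / 2];
        for (int i = 0, k = 0; i < uvCount; i += 2, k++) {
            v[k] = nv21[ySize + i];       // V comes first in NV21
            u[k] = nv21[ySize + i + 1];   // then U
        }
        return new byte[][] { u, v };
    }
}
```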
The vertex shader and fragment shader code:
public static String VERTEX_SHADER =
"attribute vec4 vPosition;\n"+
"attribute vec2 a_texCoord;\n"+
"varying vec2 tc;\n"+
"void main()\n"+
"{\n"+
" gl_Position = vPosition;\n"+
" tc = a_texCoord;\n"+
"}\n";

public static String FRAG_SHADER =
"varying lowp vec2 tc;\n"+
"uniform sampler2D SamplerY;\n"+
"uniform sampler2D SamplerU;\n"+
"uniform sampler2D SamplerV;\n"+
"void main(void)\n"+
"{\n"+
"mediump vec3 yuv;\n"+
"lowp vec3 rgb;\n"+
"yuv.x = texture2D(SamplerY, tc).r;\n"+
"yuv.y = texture2D(SamplerU, tc).r - 0.5;\n"+
"yuv.z = texture2D(SamplerV, tc).r - 0.5;\n"+
"rgb = mat3( 1,1,1,0,-0.39465,2.03211,1.13983,-0.58060,0) * yuv;\n"+
"gl_FragColor = vec4(rgb, 1);\n"+
"}\n";
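GLSL mat3 constructors are column-major, so the matrix above computes r = y + 1.13983*v, g = y - 0.39465*u - 0.58060*v, b = y + 2.03211*u, which is the usual BT.601 YUV-to-RGB conversion with u and v already shifted to be centered on zero. The same arithmetic in plain Java, as a sanity check (the class and method names are mine):

```java
// The same BT.601 conversion the fragment shader performs, in plain Java.
// Input: y in [0, 1]; u and v already shifted into [-0.5, 0.5].
public class YuvMath {
    public static float[] toRgb(float y, float u, float v) {
        float r = y + 1.13983f * v;
        float g = y - 0.39465f * u - 0.58060f * v;
        float b = y + 2.03211f * u;
        return new float[] { r, g, b };
    }
}
```

With u = v = 0 all three channels equal y, so a frame with zeroed chroma renders as grayscale; that is a handy first test that the texture uploads are working.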
That covers essentially all of the main drawing work. I am attaching my demo below; since my CSDN points are running low, the download costs a few points.
http://download.csdn.net/detail/chylove5/9240905