In an AR cloud-rendering project, we needed to hardware-decode video in Unity using Android's MediaCodec. We know MediaCodec can render a decoded video stream onto a Surface, but how do we get the texture from that video stream into Unity?
One approach is to read the image buffer back after MediaCodec finishes decoding, load it into a texture in Unity via Texture2D.LoadRawTextureData, and then convert the YUV data to RGB. This is very expensive: the data travels GPU -> CPU -> GPU, which costs a lot of time.
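To make the cost concrete, here is a minimal sketch of the per-pixel YUV-to-RGB conversion that CPU path has to run for every pixel of every frame (plain Java, assuming BT.601 video-range coefficients; the actual buffer layout depends on the decoder's output format):

```java
// Per-pixel YUV -> RGB conversion (BT.601 video-range coefficients assumed).
// The CPU path must run this for every pixel of every decoded frame.
public class YuvToRgb {
    static int clamp(int v) {
        return Math.max(0, Math.min(255, v));
    }

    // Converts one YUV pixel to a packed 0xAARRGGBB value.
    public static int yuvToArgb(int y, int u, int v) {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        // Black and white reference points.
        System.out.println(Integer.toHexString(yuvToArgb(16, 128, 128)));  // ff000000
        System.out.println(Integer.toHexString(yuvToArgb(235, 128, 128))); // ffffffff
    }
}
```

At 1080p and 30 fps this is roughly 60 million pixel conversions per second, on top of the readback itself, which is why keeping the frames on the GPU matters.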
Our current approach: after decoding, MediaCodec renders directly onto a Surface; we then take the texture id backing that Surface and hand it to a Texture2D in Unity. The data stays on the GPU the whole way, so performance improves considerably.
We know Unity uses multiple threads to support coroutines, and rendering itself can be single-threaded or multithreaded. The approach in this post requires the single-threaded rendering model; after I have tried it in practice, I will extend this to a multithreaded-rendering solution in a later post. To disable the multithreaded model, go to Edit -> Project Settings -> Player -> Other Settings and uncheck Multithreaded Rendering.
On the Java side, we need to create a texture, create a Surface, and bind the texture to that Surface, as shown below.
import android.graphics.SurfaceTexture;
import android.opengl.EGL14;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.util.Log;
import android.view.Surface;

import java.nio.IntBuffer;

public class HwSurface implements SurfaceTexture.OnFrameAvailableListener {
    private static final String TAG = HwSurface.class.getSimpleName();

    private static EGLContext unityContext = EGL14.EGL_NO_CONTEXT;
    private static EGLDisplay unityDisplay = EGL14.EGL_NO_DISPLAY;
    private static EGLSurface unityDrawSurface = EGL14.EGL_NO_SURFACE;
    private static EGLSurface unityReadSurface = EGL14.EGL_NO_SURFACE;

    private int mTextureId;
    private Surface mSurface;
    private SurfaceTexture mSurfaceTexture;
    // Written from the frame-available callback thread and read from the
    // render thread, so it must be volatile.
    private volatile boolean mNewFrameAvailable = false;

    // SurfaceTexture requires a GL_TEXTURE_EXTERNAL_OES texture rather than
    // a plain GL_TEXTURE_2D.
    public int glCreateExternalTexture() {
        int[] texId = new int[1];
        GLES20.glGenTextures(1, IntBuffer.wrap(texId));
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texId[0]);
        ShaderUtil.checkGlError(TAG, "glCreateExternalTexture");
        return texId[0];
    }

    public int getTextureId() {
        return mTextureId;
    }

    // Must be called on Unity's render thread so the texture is created in
    // Unity's EGL context.
    public void createSurface(int width, int height, int texId) {
        unityContext = EGL14.eglGetCurrentContext();
        unityDisplay = EGL14.eglGetCurrentDisplay();
        unityDrawSurface = EGL14.eglGetCurrentSurface(EGL14.EGL_DRAW);
        unityReadSurface = EGL14.eglGetCurrentSurface(EGL14.EGL_READ);
        if (unityContext == EGL14.EGL_NO_CONTEXT) {
            Log.e(TAG, "Unity EGLContext is invalid -> most probably the wrong thread");
        }
        EGL14.eglMakeCurrent(unityDisplay, unityDrawSurface, unityReadSurface, unityContext);

        mTextureId = texId;
        if (mTextureId == 0) {
            mTextureId = glCreateExternalTexture();
        }
        Log.d(TAG, "textureId: " + mTextureId);

        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureId);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

        mSurfaceTexture = new SurfaceTexture(mTextureId);
        mSurfaceTexture.setDefaultBufferSize(width, height);
        mSurfaceTexture.setOnFrameAvailableListener(this);
        mSurface = new Surface(mSurfaceTexture);

        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
    }

    public Surface getSurface() {
        return mSurface;
    }

    // Must also be called on Unity's render thread, once per frame.
    public void updateTexImage() {
        if (mNewFrameAvailable) {
            if (!Thread.currentThread().getName().equals("UnityMain")) {
                Log.e(TAG, "Not called from the render thread, so updating the texture will fail");
            }
            mSurfaceTexture.updateTexImage();
            mNewFrameAvailable = false;
        }
    }

    public long getTimestamp() {
        return mSurfaceTexture.getTimestamp();
    }

    public float[] getTransformMatrix() {
        float[] textureTransform = new float[16];
        mSurfaceTexture.getTransformMatrix(textureTransform);
        return textureTransform;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        mNewFrameAvailable = true;
    }
}
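A note on getTransformMatrix: SurfaceTexture hands back a column-major 4x4 matrix that should be applied to the UV coordinates before sampling, because decoders often deliver frames vertically flipped. A minimal sketch of applying it (plain Java; the flip matrix below is a typical example, not something this class guarantees):

```java
// Applies a SurfaceTexture-style 4x4 transform (column-major, as returned by
// SurfaceTexture.getTransformMatrix) to a UV coordinate.
public class UvTransform {
    public static float[] transformUv(float[] m, float u, float v) {
        // Treat the UV as (u, v, 0, 1); column-major indexing: m[col * 4 + row].
        float tu = m[0] * u + m[4] * v + m[12];
        float tv = m[1] * u + m[5] * v + m[13];
        return new float[] { tu, tv };
    }

    public static void main(String[] args) {
        // A matrix a video decoder commonly returns: identity plus a vertical flip.
        float[] flipY = {
            1,  0, 0, 0,
            0, -1, 0, 0,
            0,  0, 1, 0,
            0,  1, 0, 1,
        };
        float[] uv = transformUv(flipY, 0.25f, 0.75f);
        System.out.println(uv[0] + ", " + uv[1]); // 0.25, 0.25
    }
}
```

In practice you would pass the matrix into the shader and do this multiply per fragment or per vertex rather than on the CPU.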
Then, when creating the MediaCodec decoder, configure it with the Surface we created:
videoCodec = android.media.MediaCodec.createDecoderByType(mimeType);
videoCodec.configure(trackFormat, surface, null, 0);
I won't reproduce the full decoding code here; it is easy to find elsewhere.
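For completeness, the decode loop typically looks roughly like the following sketch (synchronous MediaCodec API; `running` and `extractor` are assumed to exist in the surrounding player code, and error, format-change, and end-of-stream handling are omitted):

```java
// Minimal sketch of a synchronous decode loop: feed compressed samples in,
// release decoded output buffers with render=true so each frame is sent to
// the Surface backing our SurfaceTexture.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
while (running) {
    int inIndex = videoCodec.dequeueInputBuffer(10_000);
    if (inIndex >= 0) {
        ByteBuffer inBuf = videoCodec.getInputBuffer(inIndex);
        int size = extractor.readSampleData(inBuf, 0);
        if (size > 0) {
            videoCodec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
            extractor.advance();
        }
    }
    int outIndex = videoCodec.dequeueOutputBuffer(info, 10_000);
    if (outIndex >= 0) {
        // true = render this frame to the configured Surface.
        videoCodec.releaseOutputBuffer(outIndex, true);
    }
}
```

Because the output is rendered to the Surface, there is no need to touch the output buffer contents at all on the CPU.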
Then, in Unity, fetch the texture id through the interface exposed by the Java plugin and create a Texture2D from it (and remember to call the Java updateTexImage once per frame so new video frames are latched into the texture):
int texId = JAVAInterface.getTextureId();
Texture2D videoTexture = Texture2D.CreateExternalTexture(
    _imageSize.Width, _imageSize.Height, TextureFormat.RGBA32, false, false, new System.IntPtr(texId));
_material.SetTexture("_MainTex", videoTexture);
In the shader, the texture must be declared as uniform samplerExternalOES _MainTex:
Shader "Unlit/PlanerLSR"
{
    Properties
    {
        _MainTex("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue" = "Transparent" "IgnoreProjector" = "True" "RenderType" = "Transparent" }
        Pass
        {
            Cull Off
            ZTest Always
            ZWrite On
            Lighting Off
            LOD 100
            Tags { "LightMode" = "ForwardBase" }
            // Alpha blending
            Blend SrcAlpha OneMinusSrcAlpha

            GLSLPROGRAM
            #pragma only_renderers gles3
            #include "UnityCG.glslinc"

            #ifdef SHADER_API_GLES3
            #version 320 es
            #extension GL_OES_EGL_image_external_essl3 : require
            #endif // SHADER_API_GLES3

            #ifdef VERTEX
            out vec2 textureCoord;
            void main()
            {
            #ifdef SHADER_API_GLES3
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
                textureCoord = gl_MultiTexCoord0.xy;
            #endif // SHADER_API_GLES3
            }
            #endif // VERTEX

            #ifdef FRAGMENT
            in vec2 textureCoord;
            layout(binding = 0) uniform samplerExternalOES _MainTex;
            out vec4 fragColor;
            void main()
            {
            #ifdef SHADER_API_GLES3
                fragColor = texture(_MainTex, textureCoord);
            #endif // SHADER_API_GLES3
            }
            #endif // FRAGMENT
            ENDGLSL
        }
    }
    FallBack Off
}
In practice, this method works for me. Some other blogs instead create a Texture in Unity first, pass its texture id into Java, and bind it to the surface there; I could not get that variant to work, though others may have.
The solution discussed so far only works with Unity's single-threaded rendering model. To support multithreaded rendering, we need JNI together with GL.IssuePluginEvent. In the multithreaded rendering context, GL.IssuePluginEvent guarantees that the callback handed to it always runs on Unity's current render thread. So, in JNI, we need to attach the appropriate Java plugin method to that render thread and execute it there. The solution outlined here also works fine in a multithreaded context, provided we write suitable JNI methods and invoke them through GL.IssuePluginEvent. Since this requires a deeper understanding of JNI, I will try it in practice and cover the multithreaded model in a separate post; please stay tuned.
References:
https://medium.com/xrpractices/external-texture-rendering-with-unity-and-android-b844bb7a35da
https://github.com/gcschrader/MediaSurfacePlugin
https://stackoverflow.com/questions/36257388/unity-android-surfacetexture-update-error
https://www.coder.work/article/1495701
https://github.com/Unity-Technologies/NativeRenderingPlugin
https://stackoverflow.com/questions/45971282/drawing-arcore-camera-feed-onto-unity-texture