Using OES textures + GLSurfaceView + JNI to implement OpenGL ES-based player frame processing

Preface:

        On Android, the SurfaceView + SurfaceTexture + Surface combination for video playback or camera preview only ever shows the picture as decoded. Many scenarios, however, need the picture to be processed further, e.g. adjusting the ratio of the red, green and blue channels, applying filters, or denoising. By using a GLSurfaceView together with the OpenGL Shading Language (GLSL), that processing can run on the GPU in real time, both fast and power-efficient. The result looks like this:

[Demo video: adjusting white balance, contrast and brightness of MediaPlayer playback on Android with GLSurfaceView + an OpenGL fragment shader]

Continuing from the previous article, 一种基于FBO实现渲染流水线的思路 (an FBO-based render-pipeline design): https://blog.csdn.net/cjzjolly/article/details/123412651?spm=1001.2014.3001.5501

Part One: Prerequisites

0. What is an OES texture?

1. What is a BufferQueue?

//todo

2. What is a Surface?

//todo

3. What is a SurfaceTexture?

        Purpose: whether you are grabbing camera-preview frames or player output on Android, an OES texture is one of the easiest routes. Once the OES texture is fed into a fragment shader, all kinds of image processing can happen at sampling time, such as denoising, beautification, or filters; this is the usual entry point for non-trivial real-time image processing on Android. Combined with the NDK, the image-processing and OpenGL code can be written in C++ so it is portable: usable on Android and reusable on other platforms. In fact, Android's GLES30, GLES20 and similar classes are themselves just JNI wrappers over the native GL functions. Once the GLSurfaceView has been created, the process's EGLContext exists, and from then on calling the GLES20 classes from Java or calling the C GL functions directly through JNI is essentially equivalent, as long as everything stays in the same process and the same EGLContext.
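As a minimal sketch of that wiring, assuming a MediaPlayer instance named mediaPlayer and code running on the GL thread (the full version appears in the GLSurfaceView code later in this article):

    int[] tex = new int[1];
    GLES30.glGenTextures(1, tex, 0); //create the OES texture
    GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
    SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]); //wrap the texture id as a frame consumer
    Surface surface = new Surface(surfaceTexture); //producer endpoint handed to MediaPlayer or the camera
    mediaPlayer.setSurface(surface); //decoded frames now land in the OES texture
    //then, once per rendered frame, back on the GL thread:
    surfaceTexture.updateTexImage(); //latch the newest frame from the BufferQueue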

4. How do frames get from the player into the OES texture?

       //todo

5. Why use JNI?

        Because GL syntax is essentially identical on every platform, a single C/C++ copy of the OpenGL processing logic can be shared across platforms. Code written against Android's bundled GLES20/GLES30 JNI wrapper libraries, by contrast, cannot be reused directly on other platforms such as PCs. Writing the GL code on the JNI side therefore strikes me as the more reasonable choice.

Part Two: Implementation

0. Creating the GLSurfaceView and the GLSurfaceView.Renderer:

        First, create the GLSurfaceView. Creating this instance automatically sets up the EGLContext, after which you can freely issue GL API calls for all kinds of drawing on the draw-call thread. To use a GLSurfaceView, you must supply it with a set of callbacks that tell it how to render, so we implement GLSurfaceView.Renderer and install it as the renderer of our GLSurfaceView instance. The main methods to implement are:

onSurfaceCreated: called once the GLSurfaceView's surface has been created. Initialization goes here: I create an OES texture, wrap its id in a SurfaceTexture, and wrap that again in a Surface. The Surface then acts as the bridge that MediaPlayer or the camera API writes into, so the camera or player frames end up recorded in the OES texture. OpenGL can use the same texture id as a shader input (declared with the external extension), and the fragment shader can then perform all kinds of picture processing: denoising, convolution, filters, color adjustment, and so on. I also create a MediaPlayer instance here for the experiment.
onSurfaceChanged: called when the GLSurfaceView's size changes. Use the incoming width and height to update the GL viewport and any other size-dependent state.
onDrawFrame: called on the GL thread that owns the EGLContext whenever a frame needs to be drawn; the drawing code you put here is executed then. All drawing must happen on the GL thread, so this is the starting point of every draw operation. Issuing GL calls from another thread is not allowed: it compiles fine, but none of the output you expect will appear. Also, SurfaceTexture fetches frames from the BufferQueue via updateTexImage, and the right place for that call is this per-frame callback.

The full GLSurfaceView code is as follows:

package com.opengldecoder.jnibridge;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.opengl.GLES11Ext;
import android.opengl.GLES30;
import android.opengl.GLSurfaceView;
import android.util.AttributeSet;
import android.util.Log;
import android.view.MotionEvent;
import android.view.Surface;

import java.nio.ByteBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class NativeGLSurfaceView extends GLSurfaceView {
    private Bitmap mTestBmp;
    private Renderer mRenderer;
    /**native pointer to the layer**/
    private long mLayer = Long.MIN_VALUE;
    private long mRenderOES = Long.MIN_VALUE;
    private long mRenderNoiseReduction = Long.MIN_VALUE;
    private long mRenderConvolutionDemo = Long.MIN_VALUE;
    private long mRenderLut = Long.MIN_VALUE;
    private long mRenderDeBackground = Long.MIN_VALUE;

    //Surface that Android frame data is written into
    private Surface mDataInputSurface = null;
    //texture that receives the Android frame data
    private int[] mDataInputTexturesPointer = null;
    private SurfaceTexture mInputDataSurfaceTexture;
    private Player mDemoPlayer;
    private float mdx, mdy;
    private float mPrevX, mPrevY;


    public NativeGLSurfaceView(Context context) {
        super(context);
        init(); //also initialize when constructed from code, not just from XML
    }

    public NativeGLSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        init();
        Log.i("cjztest", "NativeGLSurfaceView222");
    }

    private void init() {
        this.setEGLContextClientVersion(3); //must be set to 3 to use OpenGL ES 3.0
        mRenderer = new Renderer(); //create the Renderer instance
        this.setRenderer(mRenderer);    //install the renderer
        this.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
    }

    public Surface getSurface() {
        Log.i("cjztest", "GLRenderer.getSurface:" + mDataInputSurface.toString());
        return mDataInputSurface;
    }

    /**brightness adjustment**/
    public void setRenderBrightness(float brightness) {
        if (mRenderOES != Long.MIN_VALUE) {
            JniBridge.setBrightness(mRenderOES, brightness);
        }
    }

    /**contrast adjustment**/
    public void setRenderContrast(float contrast) {
        if (mRenderOES != Long.MIN_VALUE) {
            JniBridge.setContrast(mRenderOES, contrast);
        }
    }


    /**white-balance adjustment**/
    public void setRenderWhiteBalance(float rWeight, float gWeight, float bWeight) {
        if (mRenderOES != Long.MIN_VALUE) {
            JniBridge.setWhiteBalance(mRenderOES, rWeight, gWeight, bWeight);
        }
    }

    /**noise-reduction render pass on/off**/
    public void setRenderNoiseReductionOnOff(boolean sw) {
        if (mLayer != Long.MIN_VALUE) {
            if (mRenderNoiseReduction != Long.MIN_VALUE) {
                if (sw) {
                    JniBridge.addRenderToLayer(mLayer, mRenderNoiseReduction);
                } else {
                    JniBridge.removeRenderForLayer(mLayer, mRenderNoiseReduction);
                }
            }
        }
    }

    /**LUT filter pass on/off**/
    public void setRenderLutOnOff(boolean sw) {
        if (mLayer != Long.MIN_VALUE && mRenderLut != Long.MIN_VALUE) {
            if (sw) {
                JniBridge.addRenderToLayer(mLayer, mRenderLut);
            } else {
                JniBridge.removeRenderForLayer(mLayer, mRenderLut);
            }
        }
    }

    /**background-removal pass on/off**/
    public void setRenderDeBackgroundOnOff(boolean sw) {
        if (mLayer != Long.MIN_VALUE && mRenderDeBackground != Long.MIN_VALUE) {
            if (sw) {
                JniBridge.addRenderToLayer(mLayer, mRenderDeBackground);
            } else {
                JniBridge.removeRenderForLayer(mLayer, mRenderDeBackground);
            }
        }
    }

    /**width/height scaling**/
    public void setScale(float sx, float sy) {
        if (mLayer != Long.MIN_VALUE) {
            JniBridge.layerScale(mLayer, sx, sy);
        }
    }

    /**translation**/
    public void setTrans(float x, float y) {
        if (mLayer != Long.MIN_VALUE) {
            JniBridge.layerTranslate(mLayer, x, y);
        }
    }

    /**rotation**/
    public void setRotate(float angle) {
        if (mLayer != Long.MIN_VALUE) {
            JniBridge.layerRotate(mLayer, angle);
        }
    }

    /**load a LUT filter bitmap**/
    public void setLut(Bitmap lutBMP) {
        if (mLayer != Long.MIN_VALUE && mRenderLut != Long.MIN_VALUE) {
            byte b[] = new byte[lutBMP.getByteCount()];
            ByteBuffer bb = ByteBuffer.wrap(b);
            lutBMP.copyPixelsToBuffer(bb);
            JniBridge.renderLutTextureLoad(mRenderLut, b, lutBMP.getWidth(), lutBMP.getHeight(), lutBMP.getWidth());
            Log.i("cjztest", "lut pixels size:" + lutBMP.getByteCount());
        }
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                mPrevX = event.getX();
                mPrevY = event.getY();
                break;
            case MotionEvent.ACTION_MOVE:
                mdx += (float) (event.getX() - mPrevX) / getWidth();
                mdy -= (float) (event.getY() - mPrevY) / getHeight();
                setTrans(mdx, mdy);
                mPrevX = event.getX();
                mPrevY = event.getY();
                break;
        }
        return true;
    }

    private class Renderer implements GLSurfaceView.Renderer {

        private int mWidth;
        private int mHeight;
        private int mVideoWidth;
        private int mVideoHeight;
        private boolean mIsFirstFrame = true;


        @Override
        public void onSurfaceCreated(GL10 gl, EGLConfig config) {
            Log.i("cjztest", String.format("NativeGlSurfaceView.onSurfaceCreated"));
            mWidth = 0;
            mHeight = 0;
            mVideoWidth = 0;
            mVideoHeight = 0;
            mIsFirstFrame = true;
            //create an OES texture and its companion objects
            if (mDataInputSurface == null) {
                //generate the OES texture
                mDataInputTexturesPointer = new int[1];
                GLES30.glGenTextures(1, mDataInputTexturesPointer, 0);
                GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mDataInputTexturesPointer[0]);
                //set min/mag filtering and clamp-to-edge wrapping
                GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                        GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
                GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                        GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
                GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                        GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
                GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                        GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
                mInputDataSurfaceTexture = new SurfaceTexture(mDataInputTexturesPointer[0]);
                mDataInputSurface = new Surface(mInputDataSurfaceTexture);
            }
            //create a demo player
            if (mDemoPlayer == null) {
                mDemoPlayer = new Player(getContext(), getSurface(), new MediaPlayer.OnVideoSizeChangedListener() {
                    @Override
                    public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
                        /**record the size of the OES layer's content**/
                        if ((width != mVideoWidth || height != mVideoHeight) && width > 0 && height > 0) {
                            Log.i("cjztest", String.format("onVideoSizeChanged: w:%d, h:%d", width, height));
                            mVideoWidth = width;
                            mVideoHeight = height;
                        }
                    }
                });
            }
        }

        @Override
        public void onSurfaceChanged(GL10 gl, int width, int height) {
            if ((width != mWidth || height != mHeight) && width > 0 && height > 0) {
                this.mWidth = width;
                this.mHeight = height;
                Log.i("cjztest", String.format("NativeGlSurfaceView.onSurfaceChanged:width:%d, height:%d", mWidth, mHeight));
                JniBridge.nativeGLInit(width, height);
                mIsFirstFrame = true;
            }
        }

        @Override
        public void onDrawFrame(GL10 gl) {
            if (mIsFirstFrame) {  //GL calls cannot run off the GL thread, so layer creation has to be deferred to the first frame (or another GLRenderer callback, but we also need onVideoSizeChanged to have fired)
                if (mVideoWidth > 0 && mVideoHeight > 0) {
                    //remove the layer used last time
                    if (mLayer != Long.MIN_VALUE) {
                        JniBridge.removeLayer(mLayer);
                    }
                    //create a layer (this use case has no raw buffer data, only the OES texture, so dataPointer is 0)
                    mLayer = JniBridge.addFullContainerLayer(mDataInputTexturesPointer[0], new int[]{mVideoWidth, mVideoHeight}, 0, new int[]{0, 0}, GLES30.GL_RGBA);  //texture id, texture width/height, data pointer (if any), data width/height, pixel format
                    //create an OES render pass
                    mRenderOES = JniBridge.makeRender(JniBridge.RENDER_PROGRAM_KIND.RENDER_OES_TEXTURE.ordinal()); //OES texture renderer

//                    mRenderConvolutionDemo = JniBridge.addRenderForLayer(mLayer, JniBridge.RENDER_PROGRAM_KIND.RENDER_CONVOLUTION.ordinal()); //convolution image-processing demo
                    mRenderNoiseReduction = JniBridge.makeRender(JniBridge.RENDER_PROGRAM_KIND.NOISE_REDUCTION.ordinal()); //noise-reduction renderer
                    mRenderLut = JniBridge.makeRender(JniBridge.RENDER_PROGRAM_KIND.RENDER_LUT.ordinal()); //LUT renderer
                    mRenderDeBackground = JniBridge.makeRender(JniBridge.RENDER_PROGRAM_KIND.DE_BACKGROUND.ordinal()); //background-removal render program

                    JniBridge.addRenderToLayer(mLayer, mRenderOES);
                    JniBridge.addRenderToLayer(mLayer, mRenderNoiseReduction);
                    mIsFirstFrame = false;
                }
            }
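            //pull the newest decoded frame from the BufferQueue into the OES texture before drawing the layer chain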
            mInputDataSurfaceTexture.updateTexImage();
            JniBridge.renderLayer(0, mWidth, mHeight);
        }
    }
}

The MediaPlayer instance is created like this:

package com.opengldecoder.jnibridge;

import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.view.Surface;

import java.io.IOException;

public class Player {

    private MediaPlayer mMediaPlayer;

    public Player(Context context, Surface surface, MediaPlayer.OnVideoSizeChangedListener sizeChangedListener) {
        initMediaPlayer(context, surface, sizeChangedListener);
    }

    private void initMediaPlayer(Context context, Surface surface, MediaPlayer.OnVideoSizeChangedListener sizeChangedListener) {
        mMediaPlayer = new MediaPlayer();
        try {
            AssetFileDescriptor afd = context.getAssets().openFd("car_race.mp4");
            mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
//            String path = "http://192.168.1.254:8192";
//            mediaPlayer.setDataSource(path);
//            mediaPlayer.setDataSource(TextureViewMediaActivity.videoPath);
        } catch (IOException e) {
            e.printStackTrace();
        }
        mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mMediaPlayer.setLooping(true);
        mMediaPlayer.setOnVideoSizeChangedListener(sizeChangedListener);
        mMediaPlayer.setSurface(surface);
        mMediaPlayer.prepareAsync();
        mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mediaPlayer) {
                mediaPlayer.start();
            }
        });
    }

}

1. Rendering and processing the OES texture in JNI:

At this point the JNI code and the GLSurfaceView share the same process address space and the same EGLContext, so texture ids are common to both and refer to exactly the same objects. In onDrawFrame I therefore pass the OES texture created earlier down to the JNI method below, and the C code can use it directly:

                    mLayer = JniBridge.addFullContainerLayer(mDataInputTexturesPointer[0], new int[]{mVideoWidth, mVideoHeight}, 0, new int[]{0, 0}, GLES30.GL_RGBA);  //texture id, texture width/height, data pointer (if any), data width/height, pixel format

This is the JNI bridge file I wrote:

package com.opengldecoder.jnibridge;

import android.graphics.Bitmap;
import android.view.Surface;

public class JniBridge {

    static {
        System.loadLibrary("opengl_decoder");
    }

    /**render-program kind enum. todo: this is the Java copy of the native enum, needed so Java can make the call; the two must stay in sync**/
    public enum RENDER_PROGRAM_KIND {
        RENDER_OES_TEXTURE, //renders an OES texture
        RENDER_YUV, //renders YUV data or a YUV texture
        RENDER_CONVOLUTION, //convolution processing
        NOISE_REDUCTION, //noise reduction
        RENDER_LUT, //LUT filter processing
        DE_BACKGROUND, //background removal
    }

    public static native void nativeGLInit(int viewPortWidth, int viewPortHeight);

//    public static native void drawRGBABitmap(Bitmap bmp, int bmpW, int bmpH);

    public static native void drawToSurface(Surface surface, int color);

    public static native void drawBuffer();

    public static native long addFullContainerLayer(int texturePointer, int textureWidthAndHeight[], long dataPointer,
                                                    int dataWidthAndHeight[],
                                                    int dataPixelFormat);

    public static native void removeLayer(long layerPointer);

    /**create a render program
     @param renderProgramKind the render-program kind, see RENDER_PROGRAM_KIND**/
    public static native long makeRender(int renderProgramKind);

    public static native void addRenderToLayer(long layerPointer, long renderPointer);

    public static native void removeRenderForLayer(long layerPointer, long renderPointer);

    public static native void setRenderAlpha(long renderPointer, float alpha);

    /**渲染器亮度调整**/
    public static native void setBrightness(long renderPointer, float brightness);

    /**渲染器对比度调整**/
    public static native void setContrast(long renderPointer, float contrast);

    /**白平衡调整**/
    public static native void setWhiteBalance(long renderPointer, float redWeight, float greenWeight, float blueWeight);

    public static native void renderLayer(int fboPointer, int fboWidth, int fboHeight);

    public static native void layerScale(long layerPointer, float scaleX, float scaleY);

    public static native void layerTranslate(long layerPointer, float dx, float dy);

    public static native void layerRotate(long layerPointer, float angle);


    /******************************************layer-specific, non-generic feature settings*************************************************/
    public static native void renderLutTextureLoad(long lutRenderPointer, byte lutPixels[], int w, int h, int unitLen);
}

I have experimented with several render programs in this demo, but since the input is an OES texture, the OES render program is used first:

                    mRenderOES = JniBridge.makeRender(JniBridge.RENDER_PROGRAM_KIND.RENDER_OES_TEXTURE.ordinal()); //OES texture renderer
                    JniBridge.addRenderToLayer(mLayer, mRenderOES);

As you can see, this calls my makeRender method, which creates an OES texture render-program instance according to the enum argument passed in and returns the C-side object pointer (a pointer is, in essence, just a memory address). The code:

   /**create a render program
     @param renderProgramKind the render-program kind**/
    JNIEXPORT jlong JNICALL
    Java_com_opengldecoder_jnibridge_JniBridge_makeRender(JNIEnv *env, jobject activity,
                                                          jint renderProgramKind) {

        RenderProgram *resultProgram = nullptr;
        switch (renderProgramKind) {
            default:
                break;
            //create the OES texture render program
            case RENDER_OES_TEXTURE: {
                RenderProgramOESTexture *renderProgramOesTexture = new RenderProgramOESTexture();
                renderProgramOesTexture->createRender(-1, -mRatio, 0, 2,
                                                      mRatio * 2,
                                                      mWidth,
                                                      mHeight);
                resultProgram = renderProgramOesTexture;
                break;
            }
            case NOISE_REDUCTION: {
                RenderProgramNoiseReduction *renderProgramNoiseReduction = new RenderProgramNoiseReduction();
                renderProgramNoiseReduction->createRender(-1, -mRatio, 0, 2,
                                                      mRatio * 2,
                                                      mWidth,
                                                      mHeight);
                resultProgram = renderProgramNoiseReduction;
                break;
            }
            case RENDER_YUV: {
                //todo: not implemented yet
                break;
            }
            //create the convolution render program
            case RENDER_CONVOLUTION: {
                float kernel[] = {
                        1.0, 1.0, 1.0,
                        1.0, -7.0, 1.0,
                        1.0, 1.0, 1.0
                };
                RenderProgramConvolution *renderProgramConvolution = new RenderProgramConvolution(
                        kernel);
                renderProgramConvolution->createRender(-1, -mRatio, 0, 2,
                                                       mRatio * 2,
                                                       mWidth,
                                                       mHeight);
                resultProgram = renderProgramConvolution;
                break;
            }
            //create the LUT filter render program:
            case RENDER_LUT: {
                RenderProgramFilter *renderProgramFilter = new RenderProgramFilter();
                renderProgramFilter->createRender(-1, -mRatio, 0, 2,
                                                  mRatio * 2,
                                                  mWidth,
                                                  mHeight);
                resultProgram = renderProgramFilter;
                break;
            }
            //todo: background removal:
            case DE_BACKGROUND: {
                RenderProgramDebackground *renderProgramDebackground = new RenderProgramDebackground();
                renderProgramDebackground->createRender(-1, -mRatio, 0, 2,
                                                        mRatio * 2,
                                                        mWidth,
                                                        mHeight);
                resultProgram = renderProgramDebackground;
                break;
            }
        }
        return (jlong) resultProgram;
    }

For now we only need to look at how the RENDER_OES_TEXTURE program is created:

//
// Created by jiezhuchen on 2021/6/21.
//

#include <GLES3/gl3.h>
#include <GLES3/gl3ext.h>
#include <GLES2/gl2ext.h> //for GL_TEXTURE_EXTERNAL_OES

#include <string.h>
#include <jni.h>
#include "RenderProgramOESTexture.h"
#include "android/log.h"


using namespace OPENGL_VIDEO_RENDERER;
static const char *TAG = "nativeGL";
#define LOGI(fmt, args...) __android_log_print(ANDROID_LOG_INFO,  TAG, fmt, ##args)
#define LOGD(fmt, args...) __android_log_print(ANDROID_LOG_DEBUG, TAG, fmt, ##args)
#define LOGE(fmt, args...) __android_log_print(ANDROID_LOG_ERROR, TAG, fmt, ##args)

RenderProgramOESTexture::RenderProgramOESTexture() {
    vertShader = GL_SHADER_STRING(
            \n
            uniform mat4 uMVPMatrix; //combined rotation/translation/scale matrix; multiplying an object by it applies the transform
            attribute vec3 objectPosition; //object position; used in the computation but not passed to the fragment stage

            attribute vec4 objectColor; //object color
            attribute vec2 vTexCoord; //texture coordinate
            varying vec4 fragObjectColor; //processed color handed to the fragment shader
            varying vec2 fragVTexCoord; //processed texture coordinate handed to the fragment shader

            void main() {
                    gl_Position = uMVPMatrix * vec4(objectPosition, 1.0); //place the object
                    fragVTexCoord = vTexCoord; //no processing by default; pass the sampling coordinate straight through
                    fragObjectColor = objectColor; //no processing by default; pass the color straight through to the fragment stage
            }
    );
    fragShader = GL_SHADER_STRING(
            $#extension GL_OES_EGL_image_external : require\n
            precision highp float;
            uniform samplerExternalOES oesTexture; //the OES texture input
            uniform int funChoice;
            uniform float frame; //frame counter
            uniform float brightness; //brightness
            uniform float contrast; //contrast
            uniform vec3 rgbWeight; //white balance
            uniform vec2 resolution; //resolution of the container
            uniform vec2 videoResolution; //resolution of the video itself
            varying vec4 fragObjectColor; //color received from the vertex shader
            varying vec2 fragVTexCoord; //texture coordinate received from the vertex shader

            float fakeRandom(vec2 st) {
                return fract(sin(dot(st.xy, vec2(12.9898, 78.233))) * 43758.5453123 * frame / 1000.0);
            }

            //noise generator used for testing
            vec3 getNoise(vec2 st) {
                float rnd = fakeRandom(st);
                return vec3(rnd);
            }

            void main() {
                vec2 xy = vec2(fragVTexCoord.s, 1.0 - fragVTexCoord.t);
                vec3 rgbWithBrightness = texture2D(oesTexture, xy).rgb * rgbWeight + brightness; //white balance and brightness
                vec3 rgbWithContrast = rgbWithBrightness + (rgbWithBrightness - 0.5) * contrast;  //contrast adjustment, after https://blog.csdn.net/yuhengyue/article/details/103856476
                gl_FragColor = vec4(rgbWithContrast, fragObjectColor.a);
                //cjztest noise test
//                gl_FragColor = vec4(getNoise(fragVTexCoord) + rgbWithContrast.rgb, 1.0);
            }
    );

    float tempTexCoord[] =   //texture-space sampling coordinates, similar to canvas coordinates //note: this layout is problematic: it makes two framebuffers that sample each other's textures come out mutually flipped
            {
                    1.0, 0.0,
                    0.0, 0.0,
                    1.0, 1.0,
                    0.0, 1.0
            };
    memcpy(mTexCoor, tempTexCoord, sizeof(tempTexCoord));
    float tempColorBuf[] = {
            1.0, 1.0, 1.0, 1.0,
            1.0, 1.0, 1.0, 1.0,
            1.0, 1.0, 1.0, 1.0,
            1.0, 1.0, 1.0, 1.0
    };
    memcpy(mColorBuf, tempColorBuf, sizeof(tempColorBuf));
}

RenderProgramOESTexture::~RenderProgramOESTexture() {
    destroy();
}

void RenderProgramOESTexture::createRender(float x, float y, float z, float w, float h, int windowW,
                                      int windowH) {
    mWindowW = windowW;
    mWindowH = windowH;
    initObjMatrix(); //initialize the object matrix to the identity matrix, otherwise the following matrix operations would multiply by zero and have no effect
    float vertxData[] = {
            x + w, y, z,
            x, y, z,
            x + w, y + h, z,
            x, y + h, z,
    };
    memcpy(mVertxData, vertxData, sizeof(vertxData));
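    //note: the "+ 1" below appears to skip the leading placeholder character that the GL_SHADER_STRING macro leaves at the start of each stringified shader source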
    mImageProgram = createProgram(vertShader + 1, fragShader + 1);
    //fetch the location ("pointer") of the vertex position attribute
    mObjectPositionPointer = glGetAttribLocation(mImageProgram.programHandle, "objectPosition");
    //texture sampling coordinate attribute
    mVTexCoordPointer = glGetAttribLocation(mImageProgram.programHandle, "vTexCoord");
    //fetch the location of the vertex color attribute
    mObjectVertColorArrayPointer = glGetAttribLocation(mImageProgram.programHandle, "objectColor");
    //fetch the location of the combined transform matrix uniform
    muMVPMatrixPointer = glGetUniformLocation(mImageProgram.programHandle, "uMVPMatrix");
    //render-mode selector: 0 for lines, 1 for textures
    mGLFunChoicePointer = glGetUniformLocation(mImageProgram.programHandle, "funChoice");
    //frame counter uniform
    mFrameCountPointer = glGetUniformLocation(mImageProgram.programHandle, "frame");
    //brightness uniform
    mBrightnessPointer = glGetUniformLocation(mImageProgram.programHandle, "brightness");
    //contrast uniform
    mContrastPointer = glGetUniformLocation(mImageProgram.programHandle, "contrast");
    //white-balance uniform
    mRGBWeightPointer = glGetUniformLocation(mImageProgram.programHandle, "rgbWeight");
    //resolution uniform, telling the shader the current resolution
    mResoulutionPointer = glGetUniformLocation(mImageProgram.programHandle, "resolution");
}

void RenderProgramOESTexture::setAlpha(float alpha) {
    if (mColorBuf != nullptr) {
        for (int i = 3; i < sizeof(mColorBuf) / sizeof(float); i += 4) {
            mColorBuf[i] = alpha;
        }
    }
}

void RenderProgramOESTexture::setBrightness(float brightness) {
    mBrightness = brightness;
}

void RenderProgramOESTexture::setContrast(float contrast) {
    mContrast = contrast;
}

void RenderProgramOESTexture::setWhiteBalance(float redWeight, float greenWeight, float blueWeight) {
    mRedWeight = redWeight;
    mGreenWeight = greenWeight;
    mBlueWeight = blueWeight;
}

void RenderProgramOESTexture::loadData(char *data, int width, int height, int pixelFormat, int offset) {
    //no-op: this render program does not take raw buffer data
}

/**@param textures the textures to be rendered/processed; may be the result of a previous pass, e.g. a processed FBO texture **/
void RenderProgramOESTexture::loadTexture(Textures textures[]) {
    mInputTexturesArray = textures[0].texturePointers;
    mInputTextureWidth = textures[0].width;
    mInputTextureHeight = textures[0].height;
}

/**@param outputFBOPointer the framebuffer to draw into; the system default is usually 0 **/
void RenderProgramOESTexture::drawTo(float *cameraMatrix, float *projMatrix, DrawType drawType, int outputFBOPointer, int fboW, int fboH) {
    if (mIsDestroyed) {
        return;
    }
    glUseProgram(mImageProgram.programHandle);
    glUniform1f(mBrightnessPointer, mBrightness);
    glUniform1f(mContrastPointer, mContrast);
    float whiteBalanceWeight[3] = {mRedWeight, mGreenWeight, mBlueWeight};
    glUniform3fv(mRGBWeightPointer, 1, whiteBalanceWeight);
    //bind the target framebuffer and set the viewport size and position
    glBindFramebuffer(GL_FRAMEBUFFER, outputFBOPointer);
    glViewport(0, 0, mWindowW, mWindowH);
    glUniform1i(mGLFunChoicePointer, 1);
    glUniform1f(mFrameCountPointer, mframeCount++);
    //pass in the transform matrices
    locationTrans(cameraMatrix, projMatrix, muMVPMatrixPointer);
    //start rendering:
    if (mVertxData != nullptr && mColorBuf != nullptr) {
        //feed the vertex position data into the pipeline
        glVertexAttribPointer(mObjectPositionPointer, 3, GL_FLOAT, false, 0, mVertxData); //3-component vectors, so size is 3
        //feed the vertex color data into the pipeline
        glVertexAttribPointer(mObjectVertColorArrayPointer, 4, GL_FLOAT, false, 0, mColorBuf);
        //feed the texture coordinate data into the pipeline
        glVertexAttribPointer(mVTexCoordPointer, 2, GL_FLOAT, false, 0, mTexCoor);  //2-component vectors, so size is 2
        glEnableVertexAttribArray(mObjectPositionPointer); //enable the position attribute
        glEnableVertexAttribArray(mObjectVertColorArrayPointer);  //enable the color attribute
        glEnableVertexAttribArray(mVTexCoordPointer);  //enable the texture-coordinate attribute
        float resolution[2];

        switch (drawType) {
            case OPENGL_VIDEO_RENDERER::RenderProgram::DRAW_DATA:
                break;
            case OPENGL_VIDEO_RENDERER::RenderProgram::DRAW_TEXTURE:
                glActiveTexture(GL_TEXTURE0); //activate texture unit 0
                glBindTexture(GL_TEXTURE_EXTERNAL_OES, mInputTexturesArray); //bind the OES texture to unit 0; samplerExternalOES expects the external target (the author found GL_TEXTURE_2D also happened to work on the test device, but that is not guaranteed)
                glUniform1i(glGetUniformLocation(mImageProgram.programHandle, "oesTexture"), 0); //point the "oesTexture" sampler in the compiled shader program at unit 0
                resolution[0] = (float) mInputTextureWidth;
                resolution[1] = (float) mInputTextureHeight;
                glUniform2fv(mResoulutionPointer, 1, resolution);
                break;
        }
        glDrawArrays(GL_TRIANGLE_STRIP, 0, /*mPointBufferPos / 3*/ 4); //draw the quad; the number of floats added / 3 gives the vertex count (each vertex consists of x, y, z)
        glDisableVertexAttribArray(mObjectPositionPointer);
        glDisableVertexAttribArray(mObjectVertColorArrayPointer);
        glDisableVertexAttribArray(mVTexCoordPointer);
    }
}

void RenderProgramOESTexture::destroy() {
    if (!mIsDestroyed) {
        //free the video memory held by the texture
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, 0);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 0, 0, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, nullptr);
        //delete the shader program that is no longer needed
        destroyProgram(mImageProgram);
    }
    mIsDestroyed = true;
}

The most important part is the fragment shader's main function:

            void main() {
                vec2 xy = vec2(fragVTexCoord.s, 1.0 - fragVTexCoord.t);
                vec3 rgbWithBrightness = texture2D(oesTexture, xy).rgb * rgbWeight + brightness; //white balance and brightness
                vec3 rgbWithContrast = rgbWithBrightness + (rgbWithBrightness - 0.5) * contrast;  //contrast adjustment, after https://blog.csdn.net/yuhengyue/article/details/103856476
                gl_FragColor = vec4(rgbWithContrast, fragObjectColor.a);
                //cjztest noise test
//                gl_FragColor = vec4(getNoise(fragVTexCoord) + rgbWithContrast.rgb, 1.0);
            }

 The idea is to take the texture id that was passed in, bind it with glBindTexture to the texture unit activated by glActiveTexture, then use glGetUniformLocation to obtain the location of the sampler named "oesTexture" in the compiled shader program and point that sampler at the same unit. The fragment shader can then sample the frames coming from the player with texture2D (the GLES 2D-texture sampling function), yielding vec4 (RGBA) pixel values. Assigning that value directly to the built-in gl_FragColor would texture the enclosed vertex shape with the unmodified video frame; here I additionally apply the rgbWeight white-balance vector, the brightness offset and the contrast factor, so the red/green/blue channel balance, brightness and contrast of the playback picture all become adjustable.
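For intuition, here is a minimal CPU-side sketch of the same per-pixel math in plain Java (the method name and signature are mine, purely illustrative); it can be used to sanity-check what the shader does to a single RGB pixel:

    //reference version of the shader's color math for one RGB pixel, components in [0, 1]
    static float[] adjustPixel(float[] rgb, float[] rgbWeight, float brightness, float contrast) {
        float[] out = new float[3];
        for (int i = 0; i < 3; i++) {
            float withBrightness = rgb[i] * rgbWeight[i] + brightness; //white balance, then brightness offset
            float withContrast = withBrightness + (withBrightness - 0.5f) * contrast; //push values away from mid-gray
            out[i] = Math.min(1.0f, Math.max(0.0f, withContrast)); //the GPU clamps on write-out
        }
        return out;
    }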

Source code:

        learnOpengl (my OpenGL practice repo): https://gitee.com/cjzcjl/learnOpenGLDemo/tree/main/app/src/main

Summary:

        For Android audio and video work, the plain SurfaceView/TextureView + SurfaceTexture + Surface combination makes complex transformation and adjustment of the playback picture hard to achieve. With GLSurfaceView and GLSL, you can do anything from texturing 3D models down to the white-balance adjustments found in ordinary players, which greatly increases the flexibility of picture processing. And because OpenGL is hardware-accelerated on the GPU, the processing is far faster, and less power-hungry, than conventional per-pixel CPU work in Java or C. GLSurfaceView + GLSL is an excellent combination for processing video frames.

