Views that can be used to play video on Android

Overview

Android offers several Views that can be used to play video: SurfaceView, VideoView, TextureView, and GLSurfaceView. Below we summarize the strengths and weaknesses of each and show basic usage. (VideoView is not demonstrated here, since it is just a SurfaceView combined with a MediaPlayer.)
This is only a summary of pros and cons; we will not dig into the source code.

SurfaceView

Key points:
SurfaceView works by creating a separate window placed behind the application's window.
The Surface's content can be drawn from a dedicated thread, so rendering does not have to occupy the main thread.
SurfaceView uses double buffering, which makes video playback smoother. (When updating its content, SurfaceView works with two buffers, a front buffer and a back buffer. What is displayed is always the front buffer, while the next frame is drawn into the back buffer; when that frame is ready, the buffers are swapped. While one frame is on screen the following frame is already being prepared, which is why playback looks smooth.)
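The double-buffering idea can be sketched off-device in plain Java. The toy FrameBuffers class below is purely illustrative (it is not part of the Android API): drawing always targets the back buffer, and a swap promotes the finished frame to the front buffer in one step.

```java
import java.util.Arrays;

// Toy illustration of double buffering: render into the back buffer,
// then swap so the finished frame becomes the front (displayed) buffer.
public class FrameBuffers {
    private int[] front = new int[4];   // what the "screen" shows
    private int[] back  = new int[4];   // where the next frame is drawn

    public void drawNextFrame(int color) {
        Arrays.fill(back, color);       // draw off-screen
    }

    public void swap() {                // present the finished frame
        int[] tmp = front;
        front = back;
        back = tmp;
    }

    public int[] displayed() {
        return front.clone();
    }

    public static void main(String[] args) {
        FrameBuffers fb = new FrameBuffers();
        fb.drawNextFrame(0xFF0000);            // frame is prepared...
        System.out.println(fb.displayed()[0]); // ...but not visible yet: 0
        fb.swap();
        System.out.println(fb.displayed()[0]); // now visible: 16711680
    }
}
```

The swap is just a pointer exchange, so presenting a frame is cheap regardless of how long the frame took to draw.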

TextureView

TextureView was introduced in Android 4.0 (API level 14). It is a View designed to display a content stream, such as the preview stream captured by the local Camera or a live video stream decoded from network packets. TextureView can only be used in a hardware-accelerated window.
The stock Android Camera app went through exactly this evolution: it previewed with SurfaceView in 4.0, switched to GLSurfaceView around 4.2, and has used TextureView since 4.4.

GLSurfaceView

Provides and manages a dedicated Surface.
Provides and manages an EGL display, which lets OpenGL render into that Surface.
Supports a user-defined renderer: pass a custom Renderer via setRenderer().
Runs the renderer on a dedicated GLThread, separate from the UI thread.
Supports both on-demand and continuous rendering modes.
GPU acceleration: SurfaceView draws with a Canvas, while GLSurfaceView renders through the GPU, which raises drawing efficiency considerably (figures like "30x faster" are often quoted, but the actual gain depends on the workload).
A View's onDraw(Canvas canvas) is rendered by the Skia engine, whereas GLSurfaceView's Renderer renders in onDrawFrame(GL10 gl) through OpenGL.

Because GLSurfaceView puts the rendering under our own control, display modes (full screen, centered with blank borders, and so on) are easy to implement, which adds flexibility.
Below is very basic usage of each View. Video decoding uses IjkPlayer (GitHub link). The point is to show which classes are involved when each View serves as the video rendering target.

1. Playing video with SurfaceView

Layout file

Just a single SurfaceView:

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".demo.SurfaceViewActivity">

    <SurfaceView
        android:id="@+id/play_sfv"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</RelativeLayout>

Add the network permission

To play network video, one permission is needed:

<uses-permission android:name="android.permission.INTERNET" />

Activity

public class SurfaceViewActivity extends AppCompatActivity implements IMediaPlayer.OnPreparedListener,SurfaceHolder.Callback{
    private static final String VIDEO_URL = "http://cnvod.cnr.cn/audio2017/ondemand/transcode/l_target/wcm_system/video/20190403/xw0219xwyt22_56/index.m3u8";

    private IjkMediaPlayer mPlayer;
    private SurfaceView surfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.activity_surface_view);

        initView();

        initPlayer();
    }

    private void initPlayer() {
        mPlayer = new IjkMediaPlayer();
        mPlayer.setOnPreparedListener(this);
        try {
            mPlayer.setDataSource(VIDEO_URL);
            mPlayer.prepareAsync();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void initView() {
        surfaceView = findViewById(R.id.play_sfv);

        //Get the SurfaceHolder and register a callback
        SurfaceHolder surfaceHolder = surfaceView.getHolder();
        surfaceHolder.addCallback(this);
    }

    @Override
    public void onPrepared(IMediaPlayer iMediaPlayer) {
        //The player is prepared; start playback
        mPlayer.start();
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        //The Surface has been created; hand it to the player for rendering
        mPlayer.setDisplay(holder);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        //The Surface is going away; detach it from the player (a real app should also release mPlayer when playback is finished)
        mPlayer.setDisplay(null);
    }
}

Summary

The key classes are SurfaceView and SurfaceHolder. We do not implement the rendering itself; we only need to listen for Surface changes and handle the callbacks.

2. Playing video with TextureView

Layout file

<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".demo.TextureViewActivity">

    <TextureView
        android:id="@+id/play_ttv"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</android.support.constraint.ConstraintLayout>

Activity

public class TextureViewActivity extends AppCompatActivity implements IMediaPlayer.OnPreparedListener,TextureView.SurfaceTextureListener{
    private static final String VIDEO_URL = "http://cnvod.cnr.cn/audio2017/ondemand/transcode/l_target/wcm_system/video/20190403/xw0219xwyt22_56/index.m3u8";

    private IjkMediaPlayer mPlayer;
    private TextureView textureView;
    private Surface surface;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.activity_texture_view);

        initView();

        initPlayer();
    }

    private void initPlayer() {
        mPlayer = new IjkMediaPlayer();
        mPlayer.setOnPreparedListener(this);
        try {
            mPlayer.setDataSource(VIDEO_URL);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void initView() {
        textureView = findViewById(R.id.play_ttv);

        //Register the SurfaceTexture listener
        textureView.setSurfaceTextureListener(this);
    }

    @Override
    public void onPrepared(IMediaPlayer iMediaPlayer) {
        mPlayer.start();
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
        initSurfaceAndPlay(surfaceTexture);
    }

    private void initSurfaceAndPlay(SurfaceTexture surfaceTexture) {
        //Create a Surface from the SurfaceTexture, hand it to the player, then prepare
        surface = new Surface(surfaceTexture);
        mPlayer.setSurface(surface);
        mPlayer.prepareAsync();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
        //Returning true tells the framework to release the SurfaceTexture
        if (surface != null) {
            surface.release();
        }
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {

    }
}

Summary

The key classes are TextureView and Surface. A SurfaceTexture is used here to construct the Surface, but we never have to manipulate it directly. Again we do not implement the rendering itself; we only listen for SurfaceTexture changes and react accordingly.

Below is the result: in full-screen mode the picture gets stretched or squashed. That is not handled here because it is not the point.
(Screenshot omitted)

3. Playing video with GLSurfaceView

A previous article covered OpenGL ES compatibility, and with the earlier series of OpenGL ES basics as groundwork, playing video here is fairly straightforward. Pay attention to the details in the renderer class; they differ little from rendering a single image as before. Both the 3.0 and 2.0 versions of the shader files are listed for reference; the rest of the code is identical. The example below does not yet correct the display mode, so the result looks the same as with SurfaceView and TextureView above; a properly adjusted version follows at the end.

3.1 Shader files

GLSL ES 3.0 shaders

Vertex shader

#version 300 es
in vec4 aPosition;//vertex position
in vec2 aTexCoord;//(s, t) texture coordinates
out vec2 vTexCoord;
void main() {
    vTexCoord = aTexCoord;
    gl_Position = aPosition;
}

Fragment shader

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision mediump float;
in vec2 vTexCoord;
uniform samplerExternalOES sTexture;
out vec4 vFragColor;
void main() {
    vFragColor=texture(sTexture, vTexCoord);
}

GLSL ES 2.0 shaders

Vertex shader

attribute vec4 aPosition;//vertex position
attribute vec2 aTexCoord;//(s, t) texture coordinates
varying vec2 vTexCoord;
void main() {
    vTexCoord = aTexCoord;
    gl_Position = aPosition;
}

Fragment shader

#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;
void main() {
    gl_FragColor=texture2D(sTexture, vTexCoord);
}

3.2 Activity and layout file

public class GLSurfaceViewActivity extends AppCompatActivity {
    private static final String VIDEO_URL = "http://cnvod.cnr.cn/audio2017/ondemand/transcode/l_target/wcm_system/video/20190403/xw0219xwyt22_56/index.m3u8";
    //private static final String VIDEO_URL = "http://video.newsapp.cnr.cn/data/video/2019/27675/index.m3u8";

    private IjkMediaPlayer mPlayer;
    private GLSurfaceView glSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.activity_glsurface_view);

        initView();
    }

    private void initView() {
        glSurfaceView = findViewById(R.id.play_glsv);
        glSurfaceView.setEGLContextClientVersion(3);
        MyGLVideoRenderer glVideoRenderer = new MyGLVideoRenderer(glSurfaceView,VIDEO_URL);//create the renderer
        glSurfaceView.setRenderer(glVideoRenderer);//set the renderer
        glSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);//render only when requested
    }
}

The layout file:

<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".demo.GLSurfaceViewActivity">

    <android.opengl.GLSurfaceView
        android:id="@+id/play_glsv"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</android.support.constraint.ConstraintLayout>

3.3 The renderer (the key part)

public class MyGLVideoRenderer implements GLSurfaceView.Renderer,SurfaceTexture.OnFrameAvailableListener{
    //Native buffers for vertex and texture coordinates
    private FloatBuffer vertexBuffer, mTexVertexBuffer;
    //Draw-order index buffer
    private ShortBuffer mVertexIndexBuffer;
    //Program id
    private int mProgram;
    //Components per vertex coordinate (x, y)
    private static final int COORDS_PER_VERTEX = 2;
    //A float takes 4 bytes
    private final int vertexStride = COORDS_PER_VERTEX * 4;
    //Components per texture coordinate (s, t)
    private static final int COORDS_PER_TEXTURE = 2;
    //A float takes 4 bytes
    private final int textureStride = COORDS_PER_TEXTURE * 4;

    //Texture id
    private int textureId;
    //The texture the player decodes frames into
    private SurfaceTexture surfaceTexture;

    //Attribute locations
    //vertex position
    private int aPositionLocation;
    //texture coordinate
    private int aTextureLocation;
    //IjkMediaPlayer instance
    private IjkMediaPlayer ijkMediaPlayer;

    /**
     * Vertex coordinates
     * (x, y)
     */
    private static final float[] POSITION_VERTEX = new float[]{
            -1.0f,  1.0f,
            -1.0f, -1.0f,
            1.0f, -1.0f,
            1.0f,  1.0f,
    };

    /**
     * Texture coordinates
     * (s, t)
     */
    private static final float[] TEX_VERTEX = {
            0.0f, 0.0f,
            0.0f, 1.0f,
            1.0f, 1.0f,
            1.0f, 0.0f,
    };

    /**
     * Draw-order indices
     */
    private static final short[] VERTEX_INDEX = {
            0, 1, 2,
            0, 2, 3
    };

    private String videoPath;
    private GLSurfaceView glSurfaceView;

    public MyGLVideoRenderer(GLSurfaceView glSurfaceView,String videoPath) {
        this.glSurfaceView = glSurfaceView;
        this.videoPath = videoPath;

        vertexBuffer = ByteBuffer.allocateDirect(POSITION_VERTEX.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        vertexBuffer.put(POSITION_VERTEX);
        vertexBuffer.position(0);

        mTexVertexBuffer = ByteBuffer.allocateDirect(TEX_VERTEX.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(TEX_VERTEX);
        mTexVertexBuffer.position(0);

        mVertexIndexBuffer = ByteBuffer.allocateDirect(VERTEX_INDEX.length * 2)
                .order(ByteOrder.nativeOrder())
                .asShortBuffer()
                .put(VERTEX_INDEX);
        mVertexIndexBuffer.position(0);

        initMediaPlayer();
    }

    //Initialize the IjkMediaPlayer
    private void initMediaPlayer() {
        Mlog.e("==initMediaPlayer");
        ijkMediaPlayer = new IjkMediaPlayer();
        try {
            ijkMediaPlayer.setDataSource(videoPath);
        } catch (IOException e) {
            e.printStackTrace();
        }
        ijkMediaPlayer.setLooping(true);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Mlog.e("==onSurfaceCreated");
        //Initialize the OpenGL program
        initGLProgram();
        //Look up attribute locations
        initLocation();
        //Create the texture id
        initTextureId();
        //Bind the player to the texture
        attachTexture();
        //Start playback
        preparePlay();
    }

    private void initGLProgram() {
        //Compile the shaders
        final int vertexShaderId = ShaderUtils.compileVertexShader(ResReadUtils.readResource(R.raw.vetext_sharder));
        final int fragmentShaderId = ShaderUtils.compileFragmentShader(ResReadUtils.readResource(R.raw.fragment_sharder));
        //Link them into a program
        mProgram = ShaderUtils.linkProgram(vertexShaderId, fragmentShaderId);
        //Use the program in the OpenGL ES context
        GLES30.glUseProgram(mProgram);
    }

    private void preparePlay() {
        ijkMediaPlayer.prepareAsync();
        ijkMediaPlayer.setOnPreparedListener(new IMediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(IMediaPlayer iMediaPlayer) {
                Mlog.e("==onPrepared");
                iMediaPlayer.start();
            }
        });
    }

    private void attachTexture() {
        Mlog.e("==attachTexture");
        Mlog.e("textureId=="+textureId);
        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this);//listen for new frames arriving
        Surface surface = new Surface(surfaceTexture);
        ijkMediaPlayer.setSurface(surface);
        surface.release();
    }

    private void initTextureId() {
        int[] textures = new int[1];

        GLES30.glGenTextures(1, textures, 0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MIN_FILTER,GL10.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);

        textureId = textures[0];
    }

    private void initLocation() {
        aPositionLocation = GLES30.glGetAttribLocation(mProgram, "aPosition");
        aTextureLocation = GLES30.glGetAttribLocation(mProgram, "aTexCoord");
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Mlog.e("==onSurfaceChanged");
        //Set the viewport
        GLES30.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        //Clear the background to white
        GLES30.glClearColor(1.0f,1.0f,1.0f,1.0f);
        GLES30.glClear(GLES30.GL_DEPTH_BUFFER_BIT | GLES30.GL_COLOR_BUFFER_BIT);
        //Update the texture with the latest decoded frame
        surfaceTexture.updateTexImage();

        //Activate and bind the external texture
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);

        //Enable the vertex coordinate attribute
        GLES30.glEnableVertexAttribArray(aPositionLocation);
        GLES30.glVertexAttribPointer(aPositionLocation, COORDS_PER_VERTEX, GLES30.GL_FLOAT, false, vertexStride, vertexBuffer);

        //Enable the texture coordinate attribute
        GLES30.glEnableVertexAttribArray(aTextureLocation);
        GLES30.glVertexAttribPointer(aTextureLocation, COORDS_PER_TEXTURE, GLES30.GL_FLOAT, false, textureStride, mTexVertexBuffer);

        // Draw
        GLES30.glDrawElements(GLES30.GL_TRIANGLES, VERTEX_INDEX.length, GLES30.GL_UNSIGNED_SHORT, mVertexIndexBuffer);

        //Disable the attribute arrays
        GLES30.glDisableVertexAttribArray(aPositionLocation);
        GLES30.glDisableVertexAttribArray(aTextureLocation);
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        Mlog.e("==onFrameAvailable");
        //Request a render pass
        glSurfaceView.requestRender();
    }
}
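The buffer setup in the renderer's constructor is plain java.nio, so it can be exercised off the device. This standalone sketch mirrors that code and highlights why the position(0) reset matters: put() advances the buffer's position, and OpenGL reads from the current position onward.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferSetup {
    // Mirrors the renderer's vertex data: (x, y) pairs for a full-screen quad.
    static final float[] POSITION_VERTEX = {
            -1.0f,  1.0f,
            -1.0f, -1.0f,
             1.0f, -1.0f,
             1.0f,  1.0f,
    };

    public static FloatBuffer makeVertexBuffer(float[] data) {
        // Direct allocation + native byte order: required so GL can read
        // the memory without a copy and with the platform's endianness.
        FloatBuffer fb = ByteBuffer.allocateDirect(data.length * 4) // 4 bytes per float
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        fb.put(data);      // put() advances position to data.length...
        fb.position(0);    // ...so rewind, or GL would read past the data
        return fb;
    }

    public static void main(String[] args) {
        FloatBuffer fb = makeVertexBuffer(POSITION_VERTEX);
        System.out.println(fb.position());  // 0
        System.out.println(fb.capacity());  // 8
        System.out.println(fb.get(3));      // -1.0
    }
}
```

Forgetting the rewind is a classic cause of a blank GLSurfaceView: the draw call succeeds but reads from the end of the buffer.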

Summary

Here we render the video ourselves: we take the media stream from IjkPlayer and draw it with OpenGL.
Key pieces:

The shader files
Surface and SurfaceTexture: we build a SurfaceTexture from a texture id, build a Surface from that SurfaceTexture, and bind the Surface to IjkPlayer to complete the rendering pipeline
Incidentally, the video above is actually a VR resource rendered here as an ordinary flat video, which is why it looks a little odd. A later article will show how to play VR video; with the groundwork of the previous dozen or so articles, that turns out to be fairly simple as well.

Addendum

The video above is displayed full screen by default, which stretches or squashes the picture; let's fix that now. The fix is fairly simple: a view (camera) matrix and a projection matrix do the job.

Vertex shader
attribute vec4 aPosition;//vertex position
attribute vec2 aTexCoord;//(s, t) texture coordinates
uniform mat4 u_Matrix;
varying vec2 vTexCoord;
void main() {
    vTexCoord = aTexCoord;
    gl_Position = u_Matrix*aPosition;
}

Fragment shader
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;
void main() {
    gl_FragColor=texture2D(sTexture, vTexCoord);
}

The renderer
public class MyGLVideoRenderer implements GLSurfaceView.Renderer,SurfaceTexture.OnFrameAvailableListener, IMediaPlayer.OnVideoSizeChangedListener {
    //Native buffers for vertex and texture coordinates
    private FloatBuffer vertexBuffer, mTexVertexBuffer;
    //Draw-order index buffer
    private ShortBuffer mVertexIndexBuffer;
    //Program id
    private int mProgram;
    //Components per vertex coordinate (x, y, z)
    private static final int COORDS_PER_VERTEX = 3;
    //A float takes 4 bytes
    private final int vertexStride = COORDS_PER_VERTEX * 4;
    //Components per texture coordinate (s, t, padded with 0)
    private static final int COORDS_PER_TEXTURE = 3;
    //A float takes 4 bytes
    private final int textureStride = COORDS_PER_TEXTURE * 4;

    //Texture id
    private int textureId;
    //The texture the player decodes frames into
    private SurfaceTexture surfaceTexture;

    //Attribute/uniform locations
    //vertex position
    private int aPositionLocation;
    //texture coordinate
    private int aTextureLocation;
    //transform matrix
    private int uMatrixLocation;

    //View (camera) matrix
    private final float[] mViewMatrix = new float[16];
    //Projection matrix
    private final float[] mProjectMatrix = new float[16];
    //Final combined transform matrix
    private final float[] mMVPMatrix = new float[16];

    //Screen width and height
    private int screenWidth , screenHeight;

    //Video width and height
    private int videoWidth, videoHeight;

    //IjkMediaPlayer instance
    private IjkMediaPlayer ijkMediaPlayer;

    /**
     * Vertex coordinates
     * (x, y, z)
     */
    private static final float[] POSITION_VERTEX = new float[]{
            -1.0f,  1.0f, 0.0f,
            -1.0f, -1.0f, 0.0f,
            1.0f, -1.0f, 0.0f,
            1.0f,  1.0f, 0.0f,
    };

    /**
     * Texture coordinates
     * (s, t, 0)
     */
    private static final float[] TEX_VERTEX = {
            0.0f, 0.0f,0.0f,
            0.0f, 1.0f,0.0f,
            1.0f, 1.0f,0.0f,
            1.0f, 0.0f,0.0f,
    };

    /**
     * Draw-order indices
     */
    private static final short[] VERTEX_INDEX = {
            0, 1, 2,
            0, 2, 3
    };

    private String videoPath;
    private GLSurfaceView glSurfaceView;

    public MyGLVideoRenderer(GLSurfaceView glSurfaceView,String videoPath) {
        this.glSurfaceView = glSurfaceView;
        this.videoPath = videoPath;

        vertexBuffer = ByteBuffer.allocateDirect(POSITION_VERTEX.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        vertexBuffer.put(POSITION_VERTEX);
        vertexBuffer.position(0);

        mTexVertexBuffer = ByteBuffer.allocateDirect(TEX_VERTEX.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(TEX_VERTEX);
        mTexVertexBuffer.position(0);

        mVertexIndexBuffer = ByteBuffer.allocateDirect(VERTEX_INDEX.length * 2)
                .order(ByteOrder.nativeOrder())
                .asShortBuffer()
                .put(VERTEX_INDEX);
        mVertexIndexBuffer.position(0);

        initMediaPlayer();
    }

    //Initialize the IjkMediaPlayer
    private void initMediaPlayer() {
        Mlog.e("==initMediaPlayer");
        ijkMediaPlayer = new IjkMediaPlayer();
        try {
            ijkMediaPlayer.setDataSource(videoPath);
        } catch (IOException e) {
            e.printStackTrace();
        }
        ijkMediaPlayer.setLooping(true);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Mlog.e("==onSurfaceCreated");
        //Initialize the OpenGL program
        initGLProgram();
        //Look up attribute locations
        initLocation();
        //Create the texture id
        initTextureId();
        //Bind the player to the texture
        attachTexture();
        //Start playback
        preparePlay();
    }

    private void initGLProgram() {
        //Compile the shaders
        final int vertexShaderId = ShaderUtils.compileVertexShader(ResReadUtils.readResource(R.raw.vetext_sharder));
        final int fragmentShaderId = ShaderUtils.compileFragmentShader(ResReadUtils.readResource(R.raw.fragment_sharder));
        //Link them into a program
        mProgram = ShaderUtils.linkProgram(vertexShaderId, fragmentShaderId);
        //Use the program in the OpenGL ES context
        GLES30.glUseProgram(mProgram);
    }

    private void preparePlay() {
        ijkMediaPlayer.prepareAsync();
        ijkMediaPlayer.setOnPreparedListener(new IMediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(IMediaPlayer iMediaPlayer) {
                Mlog.e("==onPrepared");
                iMediaPlayer.start();
            }
        });
        ijkMediaPlayer.setOnVideoSizeChangedListener(this);
    }

    private void attachTexture() {
        Mlog.e("==attachTexture");
        Mlog.e("textureId=="+textureId);
        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this);//listen for new frames arriving
        Surface surface = new Surface(surfaceTexture);
        ijkMediaPlayer.setSurface(surface);
        surface.release();
    }

    private void initTextureId() {
        int[] textures = new int[1];

        GLES30.glGenTextures(1, textures, 0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MIN_FILTER,GL10.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);

        textureId = textures[0];
    }

    private void initLocation() {
        aPositionLocation = GLES30.glGetAttribLocation(mProgram, "aPosition");
        aTextureLocation = GLES30.glGetAttribLocation(mProgram, "aTexCoord");
        uMatrixLocation = GLES30.glGetUniformLocation(mProgram, "u_Matrix");
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Mlog.e("==onSurfaceChanged");

        //Set the viewport
        GLES30.glViewport(0, 0, width, height);

        screenWidth = width;
        screenHeight = height;

        if (videoWidth!=0&&videoHeight!=0){
            setMvpMatrix();
        }
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        //Clear the background to black
        GLES30.glClearColor(0.0f,0.0f,0.0f,1.0f);
        GLES30.glClear(GLES30.GL_DEPTH_BUFFER_BIT | GLES30.GL_COLOR_BUFFER_BIT);
        //Pass the transform matrix to the vertex shader
        GLES30.glUniformMatrix4fv(uMatrixLocation,1,false,mMVPMatrix,0);
        //Update the texture with the latest decoded frame
        surfaceTexture.updateTexImage();

        //Activate and bind the external texture
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);

        //Enable the vertex coordinate attribute
        GLES30.glEnableVertexAttribArray(aPositionLocation);
        GLES30.glVertexAttribPointer(aPositionLocation, COORDS_PER_VERTEX, GLES30.GL_FLOAT, false, vertexStride, vertexBuffer);

        //Enable the texture coordinate attribute
        GLES30.glEnableVertexAttribArray(aTextureLocation);
        GLES30.glVertexAttribPointer(aTextureLocation, COORDS_PER_TEXTURE, GLES30.GL_FLOAT, false, textureStride, mTexVertexBuffer);

        // Draw
        GLES30.glDrawElements(GLES30.GL_TRIANGLES, VERTEX_INDEX.length, GLES30.GL_UNSIGNED_SHORT, mVertexIndexBuffer);

        //Disable the attribute arrays
        GLES30.glDisableVertexAttribArray(aPositionLocation);
        GLES30.glDisableVertexAttribArray(aTextureLocation);
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        //Mlog.e("==onFrameAvailable");
        //Request a render pass
        glSurfaceView.requestRender();
    }

    @Override
    public void onVideoSizeChanged(IMediaPlayer iMediaPlayer, int width, int height, int sarNum, int sarDen) {
        Mlog.e("width==" + width + "--height==" + height + "--sarNum==" + sarNum + "--sarDen==" + sarDen);
        videoWidth = width;
        videoHeight = height;
        setMvpMatrix();
    }

    private void setMvpMatrix() {
        Mlog.e("screenWidth=="+screenWidth+"-----screenHeight=="+screenHeight);
        float videoWHRatio=videoWidth/(float)videoHeight;
        float displayWHRatio=screenWidth/(float)screenHeight;
        if(screenWidth>screenHeight){
            if(videoWHRatio>displayWHRatio){
                Matrix.orthoM(mProjectMatrix, 0, -displayWHRatio*videoWHRatio,displayWHRatio*videoWHRatio, -1,1, 3, 7);
            }else{
                Matrix.orthoM(mProjectMatrix, 0, -displayWHRatio/videoWHRatio,displayWHRatio/videoWHRatio, -1,1, 3, 7);
            }
        }else{
            if(videoWHRatio>displayWHRatio){
                Matrix.orthoM(mProjectMatrix, 0, -1, 1, -1/displayWHRatio*videoWHRatio, 1/displayWHRatio*videoWHRatio,3, 7);
            }else{
                Matrix.orthoM(mProjectMatrix, 0, -1, 1, -videoWHRatio/displayWHRatio, videoWHRatio/displayWHRatio,3, 7);
            }
        }
        //Set the camera position
        Matrix.setLookAtM(mViewMatrix, 0, 0, 0, 4.0f, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
        //Compute the combined transform matrix
        Matrix.multiplyMM(mMVPMatrix,0,mProjectMatrix,0,mViewMatrix,0);
    }
}

The code above is commented in detail, so no further explanation is needed here. When the device rotates, the picture also adjusts automatically to the new screen size.
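The branching in setMvpMatrix is pure arithmetic, so it can be unit-tested off-device. The helper below is hypothetical (extracted here only for illustration); it returns the {left, right, bottom, top} bounds that setMvpMatrix passes to Matrix.orthoM:

```java
public class OrthoBounds {
    /**
     * Computes {left, right, bottom, top} for Matrix.orthoM so the video
     * keeps its aspect ratio inside the screen (same branches as setMvpMatrix).
     */
    public static float[] compute(int videoW, int videoH, int screenW, int screenH) {
        float v = videoW / (float) videoH;   // video width/height ratio
        float s = screenW / (float) screenH; // screen width/height ratio
        if (screenW > screenH) {             // landscape screen
            if (v > s) {
                return new float[]{-s * v, s * v, -1, 1};
            }
            return new float[]{-s / v, s / v, -1, 1};
        }                                    // portrait screen
        if (v > s) {
            return new float[]{-1, 1, -1 / s * v, 1 / s * v};
        }
        return new float[]{-1, 1, -v / s, v / s};
    }

    public static void main(String[] args) {
        // Same aspect ratio as the screen: bounds stay at +/-1, no letterboxing.
        float[] b = compute(1280, 720, 1920, 1080);
        System.out.println(b[0] + " " + b[1]); // -1.0 1.0
    }
}
```

A quick sanity check: when the video and screen share the same aspect ratio, the bounds collapse to ±1, so the quad fills the screen exactly.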

Below are screenshots of the result.
Portrait: (screenshot omitted)
Landscape: (screenshot omitted)
