Advanced OpenGL ES Tutorial (4): Rendering Video with OpenGL ES + MediaPlayer, Plus a Filter Effect

I've previously written about playing video with SurfaceView/TextureView + MediaPlayer, and about decoding AVI with ffmpeg and playing it back in a SurfaceView. Today let's play video with OpenGL ES + MediaPlayer. I once spent nearly a year on a camera development team, but back then I hadn't developed the habit of blogging, so I didn't leave much material behind to share.

Here's the effect first:

[Screenshot: video playback through the OpenGL ES renderer]

The black-and-white (grayscale) effect is implemented in the OpenGL shader, using the classic CRT grayscale model with the weights 0.299, 0.587, 0.114.
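Concretely, each output pixel's gray value is a weighted sum of its color channels:

Gray = 0.299 × R + 0.587 × G + 0.114 × B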

[Image: the grayscale conversion model]

Now let's walk through the implementation.

If you have ever done texture mapping with OpenGL, this will be much easier to follow. Unlike a still image, video has to refresh continuously: every time a new frame arrives, we update the texture and redraw. Playing video with OpenGL is essentially mapping the video onto the screen as a texture.
New to OpenGL? Start here first: graphics fundamentals you must know to learn OpenGL

1. Write the vertex shader and fragment shader first (that's my habit; you can also write them later as needed)

Vertex shader:

attribute vec4 aPosition; // vertex position
attribute vec4 aTexCoord; // S/T texture coordinates
varying vec2 vTexCoord;
uniform mat4 uMatrix;
uniform mat4 uSTMatrix;
void main() {
    vTexCoord = (uSTMatrix * aTexCoord).xy;
    gl_Position = uMatrix*aPosition;
}

Fragment shader:

#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;
void main() {
    gl_FragColor=texture2D(sTexture, vTexCoord);
}

If the shading language is new to you, see: http://blog.csdn.net/king1425/article/details/71425556

samplerExternalOES replaces the sampler2D you would use for an ordinary image; it works together with SurfaceTexture to update the texture and convert its format.
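These two shaders are compiled and linked at runtime through a small ShaderUtils helper that the renderer below references (readRawTextFile, createProgram, checkGlError) but whose source the post doesn't include. Here is a minimal sketch of what such a helper could look like, assuming standard GLES20 calls; this is my reconstruction, not the author's file:

import android.content.Context;
import android.opengl.GLES20;
import android.util.Log;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ShaderUtils {
    // Read a raw resource (e.g. R.raw.simple_vertex_shader) into a String.
    public static String readRawTextFile(Context context, int resId) {
        StringBuilder sb = new StringBuilder();
        try {
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(context.getResources().openRawResource(resId)));
            String line;
            while ((line = reader.readLine()) != null) {
                sb.append(line).append('\n');
            }
            reader.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return sb.toString();
    }

    // Compile both shaders and link them into a program; returns 0 on failure.
    public static int createProgram(String vertexSource, String fragmentSource) {
        int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexSource);
        int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);
        int program = GLES20.glCreateProgram();
        GLES20.glAttachShader(program, vertexShader);
        GLES20.glAttachShader(program, fragmentShader);
        GLES20.glLinkProgram(program);
        int[] linkStatus = new int[1];
        GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linkStatus, 0);
        if (linkStatus[0] == 0) {
            Log.e("ShaderUtils", "link failed: " + GLES20.glGetProgramInfoLog(program));
            GLES20.glDeleteProgram(program);
            return 0;
        }
        return program;
    }

    private static int loadShader(int type, String source) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, source);
        GLES20.glCompileShader(shader);
        return shader;
    }

    // Log any pending GL errors under the given label.
    public static void checkGlError(String op) {
        int error;
        while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
            Log.e("ShaderUtils", op + ": glError 0x" + Integer.toHexString(error));
        }
    }
}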

2. MediaPlayer output

Initialize the MediaPlayer in GLVideoRenderer's constructor:

mediaPlayer = new MediaPlayer();
try {
    mediaPlayer.setDataSource(context, Uri.parse(videoPath));
} catch (IOException e) {
    e.printStackTrace();
}
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setLooping(true);

mediaPlayer.setOnVideoSizeChangedListener(this);

In onSurfaceCreated we use a SurfaceTexture to set up MediaPlayer's output: we create a Surface from the SurfaceTexture and hand that Surface to MediaPlayer as its output surface.
SurfaceTexture's job is to pull each new frame from a video stream or camera stream; the call that fetches the new data is updateTexImage.
Note that MediaPlayer's output is usually not RGB (typically it is YUV), while GLSurfaceView needs RGB to display correctly.
So first change the texture-generation code in onSurfaceCreated to this:

textureId = textures[0];
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
ShaderUtils.checkGlError("ws-------glBindTexture mTextureID");

GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
        GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
        GLES20.GL_LINEAR);

What is GLES11Ext.GL_TEXTURE_EXTERNAL_OES for?
As mentioned above, the decoder's output format is YUV (most likely YUV420). This external-texture extension converts YUV to RGB automatically, so we don't have to write that conversion ourselves.

Then add the following at the end of onSurfaceCreated:

surfaceTexture = new SurfaceTexture(textureId);
surfaceTexture.setOnFrameAvailableListener(this); // get notified when a new frame arrives

Surface surface = new Surface(surfaceTexture);
mediaPlayer.setSurface(surface);
surface.release();

if (!playerPrepared) {
    try {
        mediaPlayer.prepare();
        playerPrepared = true;
        mediaPlayer.start(); // start only once prepare() has succeeded
    } catch (IOException t) {
        Log.e(TAG, "media player prepare failed");
    }
}
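One caveat: prepare() blocks the calling GL thread until the data source is ready. For network or large sources, a non-blocking variant might look like this (my suggestion, not the post's code; playerPrepared and mediaPlayer are the fields of the renderer):

mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        playerPrepared = true;
        mp.start(); // start once the player reports it is ready
    }
});
mediaPlayer.prepareAsync();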

In onDrawFrame:

synchronized (this) {
    if (updateSurface) {
        surfaceTexture.updateTexImage(); // fetch the new frame
        surfaceTexture.getTransformMatrix(mSTMatrix); // keep texture coords aligned with the new frame
        updateSurface = false;
    }
}

When new data is available, updateTexImage updates the texture. The point of getTransformMatrix is to make the new texture and the texture coordinate system line up correctly; mSTMatrix is declared exactly the same way as projectionMatrix (a float[16]).

// Full-screen quad in normalized device coordinates, ordered for GL_TRIANGLE_STRIP
private final float[] vertexData = {
        1f, -1f, 0f,   // bottom right
        -1f, -1f, 0f,  // bottom left
        1f, 1f, 0f,    // top right
        -1f, 1f, 0f    // top left
};

// Texture coordinates paired with the vertices above
private final float[] textureVertexData = {
        1f, 0f,
        0f, 0f,
        1f, 1f,
        0f, 1f
};

vertexData holds the coordinates of the quad we draw into the viewport; textureVertexData holds the video texture coordinates that map onto those screen positions.

Next we wire these coordinates up to the shaders.
In onSurfaceCreated, look up the attribute and uniform locations:

aPositionLocation = GLES20.glGetAttribLocation(programId, "aPosition");
uMatrixLocation = GLES20.glGetUniformLocation(programId, "uMatrix");
uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
uTextureSamplerLocation = GLES20.glGetUniformLocation(programId, "sTexture");
aTextureCoordLocation = GLES20.glGetAttribLocation(programId, "aTexCoord");

Then feed them in onDrawFrame:

GLES20.glUseProgram(programId);
GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, projectionMatrix, 0);
GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, mSTMatrix, 0);

vertexBuffer.position(0);
GLES20.glEnableVertexAttribArray(aPositionLocation);
GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
        12, vertexBuffer);

textureVertexBuffer.position(0);
GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
GLES20.glVertexAttribPointer(aTextureCoordLocation, 2, GLES20.GL_FLOAT, false, 8, textureVertexBuffer);

GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);

GLES20.glUniform1i(uTextureSamplerLocation, 0);
GLES20.glViewport(0, 0, screenWidth, screenHeight);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);

The complete GLVideoRenderer source:

package com.ws.openglvideoplayer;

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.net.Uri;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import android.util.Log;
import android.view.Surface;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

/**
 * Created by Shuo.Wang on 2017/3/19.
 */

public class GLVideoRenderer implements GLSurfaceView.Renderer
        , SurfaceTexture.OnFrameAvailableListener, MediaPlayer.OnVideoSizeChangedListener  {


    private static final String TAG = "GLRenderer";
    private Context context;
    private int aPositionLocation;
    private int programId;
    private FloatBuffer vertexBuffer;
    private final float[] vertexData = {
            1f,-1f,0f,
            -1f,-1f,0f,
            1f,1f,0f,
            -1f,1f,0f
    };

    private final float[] projectionMatrix=new float[16];
    private int uMatrixLocation;

    private final float[] textureVertexData = {
            1f,0f,
            0f,0f,
            1f,1f,
            0f,1f
    };
    private FloatBuffer textureVertexBuffer;
    private int uTextureSamplerLocation;
    private int aTextureCoordLocation;
    private int textureId;

    private SurfaceTexture surfaceTexture;
    private MediaPlayer mediaPlayer;
    private float[] mSTMatrix = new float[16];
    private int uSTMMatrixHandle;

    private boolean updateSurface;
    private boolean playerPrepared;
    private int screenWidth,screenHeight;
    public GLVideoRenderer(Context context,String videoPath) {
        this.context = context;
        playerPrepared=false;
        synchronized(this) {
            updateSurface = false;
        }
        vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(vertexData);
        vertexBuffer.position(0);

        textureVertexBuffer = ByteBuffer.allocateDirect(textureVertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(textureVertexData);
        textureVertexBuffer.position(0);

        mediaPlayer=new MediaPlayer();
        try{
            mediaPlayer.setDataSource(context, Uri.parse(videoPath));
        }catch (IOException e){
            e.printStackTrace();
        }
        mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mediaPlayer.setLooping(true);

        mediaPlayer.setOnVideoSizeChangedListener(this);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        String vertexShader = ShaderUtils.readRawTextFile(context, R.raw.simple_vertex_shader);
        String fragmentShader= ShaderUtils.readRawTextFile(context, R.raw.simple_fragment_shader);
        programId=ShaderUtils.createProgram(vertexShader,fragmentShader);
        aPositionLocation= GLES20.glGetAttribLocation(programId,"aPosition");

        uMatrixLocation=GLES20.glGetUniformLocation(programId,"uMatrix");
        uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
        uTextureSamplerLocation=GLES20.glGetUniformLocation(programId,"sTexture");
        aTextureCoordLocation=GLES20.glGetAttribLocation(programId,"aTexCoord");


        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);

        textureId = textures[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        ShaderUtils.checkGlError("glBindTexture mTextureID");
        /* What is GLES11Ext.GL_TEXTURE_EXTERNAL_OES for?
           As mentioned, the decoder's output is YUV (likely YUV420); this external texture
           converts YUV to RGB automatically, so we don't have to write that conversion. */
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
                GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
                GLES20.GL_LINEAR);

        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this); // get notified when a new frame arrives

        Surface surface = new Surface(surfaceTexture);
        mediaPlayer.setSurface(surface);
        surface.release();

        if (!playerPrepared) {
            try {
                mediaPlayer.prepare();
                playerPrepared = true;
                mediaPlayer.start(); // start only once prepare() has succeeded
            } catch (IOException t) {
                Log.e(TAG, "media player prepare failed");
            }
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.d(TAG, "onSurfaceChanged: "+width+" "+height);
        screenWidth=width; screenHeight=height;
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        GLES20.glClear( GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
        synchronized (this){
            if (updateSurface){
                surfaceTexture.updateTexImage(); // fetch the new frame
                surfaceTexture.getTransformMatrix(mSTMatrix); // keep texture coords aligned with the new frame
                updateSurface = false;
            }
        }
        GLES20.glUseProgram(programId);
        GLES20.glUniformMatrix4fv(uMatrixLocation,1,false,projectionMatrix,0);
        GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, mSTMatrix, 0);

        vertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aPositionLocation);
        GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
                12, vertexBuffer);

        textureVertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
        GLES20.glVertexAttribPointer(aTextureCoordLocation,2,GLES20.GL_FLOAT,false,8,textureVertexBuffer);

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,textureId);

        GLES20.glUniform1i(uTextureSamplerLocation,0);
        GLES20.glViewport(0,0,screenWidth,screenHeight);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }

    @Override
    synchronized public void onFrameAvailable(SurfaceTexture surface) {
        updateSurface = true;
    }

    @Override
    public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
        Log.d(TAG, "onVideoSizeChanged: "+width+" "+height);
        updateProjection(width,height);
    }

    private void updateProjection(int videoWidth, int videoHeight) {
        // Assumes onSurfaceChanged has already run, so screenWidth/screenHeight are non-zero.
        float screenRatio = (float) screenWidth / screenHeight;
        float videoRatio = (float) videoWidth / videoHeight;
        if (videoRatio > screenRatio) {
            // Video is wider than the screen: widen the vertical range (bars top and bottom).
            Matrix.orthoM(projectionMatrix, 0, -1f, 1f,
                    -videoRatio / screenRatio, videoRatio / screenRatio, -1f, 1f);
        } else {
            // Video is taller than the screen: widen the horizontal range (bars left and right).
            Matrix.orthoM(projectionMatrix, 0,
                    -screenRatio / videoRatio, screenRatio / videoRatio, -1f, 1f, -1f, 1f);
        }
    }

    public MediaPlayer getMediaPlayer() {
        return mediaPlayer;
    }
}
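The post doesn't show how the renderer gets hooked up to a view. A minimal sketch of the wiring in an Activity might look like this (the class name and R.raw.demo are hypothetical placeholders, not from the post):

import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;

public class VideoPlayerActivity extends Activity {
    private GLSurfaceView glSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        glSurfaceView = new GLSurfaceView(this);
        glSurfaceView.setEGLContextClientVersion(2); // the shaders above are ES 2.0
        String path = "android.resource://" + getPackageName() + "/" + R.raw.demo; // hypothetical raw video
        glSurfaceView.setRenderer(new GLVideoRenderer(this, path));
        setContentView(glSurfaceView);
    }
}

Note that GLSurfaceView's default RENDERMODE_CONTINUOUSLY keeps onDrawFrame running; since onFrameAvailable only sets a flag, switching to RENDERMODE_WHEN_DIRTY would also require calling requestRender() there.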


To get the filtered video shown above, all we need is the 0.299/0.587/0.114 CRT grayscale model. (Search around for plenty of other effects; this is just a starting point.)
Only the fragment shader needs to change:

#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;
void main() {
    //gl_FragColor = texture2D(sTexture, vTexCoord); // plain pass-through

    vec3 centralColor = texture2D(sTexture, vTexCoord).rgb;
    float gray = 0.299 * centralColor.r + 0.587 * centralColor.g + 0.114 * centralColor.b;
    gl_FragColor = vec4(vec3(gray), 1.0); // replicate the luma into RGB, keep alpha at 1.0
}

[Screenshot: the video rendered with the grayscale filter]
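As one more starting point beyond grayscale (my own example, not from the original post), a color-invert filter is an equally small change to the same fragment shader:

#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;
void main() {
    vec4 color = texture2D(sTexture, vTexCoord);
    gl_FragColor = vec4(1.0 - color.rgb, color.a); // invert each color channel, keep alpha
}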

And that's it: we have rendered video with OpenGL ES + MediaPlayer and applied a filter effect.
