OpenGL ES SDK for Android - 6

Instanced Tessellation

This application displays a rotating solid torus surrounded by a low-polygon wireframe mesh. The torus is drawn with OpenGL ES 3.0 using an instanced tessellation technique.

The application displays a rotating solid torus surrounded by a low-polygon wireframe mesh.

To perform instanced tessellation, we divide the model into several patches. Each patch is densely packed with triangles and approximates the curved surface. In the first stage of tessellation, a patch consists of vertices laid out as a square. Once passed to the shader, they are transformed into a Bezier surface based on control points stored in uniform blocks. Each instance of the draw call renders the next part of the torus.

The application instantiates two classes, which manage the solid torus model and the wireframe surrounding it. The first class is responsible for configuring a program with shaders capable of instanced drawing, initializing data buffers, and issuing the instanced draw call. To simplify the mathematics and satisfy the condition of C1 continuity between patches, we assume the torus is constructed from 12 circles, each of which is also defined by 12 points. In this way we can divide both the "big" and "small" circles of the torus into quadrants and construct Bezier surfaces that closely approximate true circles. For this purpose, the control points cannot be placed on the surface of the torus, but have to be distorted appropriately.
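The patch-to-surface conversion relies on cubic Bezier interpolation. As a rough CPU-side illustration (not part of the SDK code), a cubic Bezier curve through four control values can be evaluated like this:

```cpp
#include <cassert>
#include <cmath>

/* Evaluate a cubic Bezier curve defined by four control values p0..p3
 * at parameter t in [0, 1], one coordinate at a time. */
float cubicBezier(float p0, float p1, float p2, float p3, float t)
{
    float u = 1.0f - t;
    /* Bernstein basis polynomials of degree 3. */
    float b0 = u * u * u;
    float b1 = 3.0f * t * u * u;
    float b2 = 3.0f * t * t * u;
    float b3 = t * t * t;
    return b0 * p0 + b1 * p1 + b2 * p2 + b3 * p3;
}
```

The curve passes exactly through p0 at t = 0 and p3 at t = 1; the patches are stitched so these shared endpoints (and tangents) line up, which is where the C1 continuity condition comes from.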

The second class manages the components corresponding to the wireframe. It uses vertices placed on the torus and issues a simple draw call with the GL_LINES mode. The dimension of its "small circles" is slightly bigger than the corresponding dimension of the solid torus, so there is a gap between the two models.

The common elements of both classes are placed in an abstract Torus class.

Setup Graphics

First of all, we need to generate the coordinates of the models we are going to render. This is implemented in the constructors of the WireframeTorus and InstancedSolidTorus classes.

wireframeTorus = new WireframeTorus(torusRadius, circleRadius + distance);
solidTorus = new InstancedSolidTorus(torusRadius, circleRadius);

Note that we want the wireframe object to be slightly bigger than the solid one, which is why circleRadius is increased by distance. Thanks to this, we will see a solid object surrounded by a wireframe.

The coordinates for both models are generated in the same way, as shown below:

void TorusModel::generateVertices(float torusRadius, float circleRadius, unsigned int circlesCount, unsigned int pointsPerCircleCount, float* vertices)
{
    if (vertices == NULL)
    {
        LOGE("Cannot use null pointer while calculating torus vertices.");
        return;
    }
    /* Index variable. */
    unsigned int componentIndex = 0;
    for (unsigned int horizontalIndex = 0; horizontalIndex < circlesCount; ++horizontalIndex)
    {
        /* Angle in radians on XZ plane. */
        float xyAngle = (float) horizontalIndex * 2.0f * M_PI / circlesCount;
        for (unsigned int verticalIndex = 0; verticalIndex < pointsPerCircleCount; ++verticalIndex)
        {
            /* Angle in radians on XY plane. */
            float theta  = (float) verticalIndex * 2.0f * M_PI / pointsPerCircleCount;
            /* X coordinate. */
            vertices[componentIndex++] = (torusRadius + circleRadius * cosf(theta)) * cosf(xyAngle);
            /* Y coordinate. */
            vertices[componentIndex++] = circleRadius * sinf(theta);
            /* Z coordinate. */
            vertices[componentIndex++] = (torusRadius + circleRadius * cosf(theta)) * sinf(xyAngle);
            /* W coordinate. */
            vertices[componentIndex++] = 1.0f;
        }
    }
}
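The parametric formula above can be sanity-checked in isolation. A minimal standalone sketch (the TorusModel class itself is not reproduced here) that computes a single torus vertex from the same expressions:

```cpp
#include <cassert>
#include <cmath>

/* Compute one torus vertex using the same parametric formula as
 * TorusModel::generateVertices: xyAngle runs along the "big" circle,
 * theta along the "small" circle. */
void torusVertex(float torusRadius, float circleRadius,
                 float xyAngle, float theta, float out[4])
{
    out[0] = (torusRadius + circleRadius * cosf(theta)) * cosf(xyAngle); /* X */
    out[1] = circleRadius * sinf(theta);                                 /* Y */
    out[2] = (torusRadius + circleRadius * cosf(theta)) * sinf(xyAngle); /* Z */
    out[3] = 1.0f;                                                       /* W */
}
```

For both angles equal to zero the vertex lies on the positive X axis at distance torusRadius + circleRadius, which matches the picture of a torus centred at the origin.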

A separate program object is used for rendering each of the requested models. No lighting is applied to the wireframe model, nor any complex vertex translations, which makes things much easier. Please look at the shaders used when rendering the wireframe model.

Vertex shader source for the wireframe torus

/* Input vertex coordinates. */ 
in vec4 position;
/* Constant transformation matrices. */
uniform mat4 cameraMatrix;
uniform mat4 projectionMatrix;
uniform mat4 scaleMatrix;
/* Coefficients of rotation needed for configuration of rotation matrix. */
uniform vec3 rotationVector;
void main()
{
    mat4 modelViewMatrix;
    mat4 modelViewProjectionMatrix;
    
    /* Matrix rotating Model-View matrix around X axis. */
    mat4 xRotationMatrix = mat4(1.0,  0.0,                            0.0,                            0.0, 
                                0.0,  cos(radians(rotationVector.x)), sin(radians(rotationVector.x)), 0.0, 
                                0.0, -sin(radians(rotationVector.x)), cos(radians(rotationVector.x)), 0.0, 
                                0.0,  0.0,                            0.0,                            1.0);
    
    /* Matrix rotating Model-View matrix around Y axis. */
    mat4 yRotationMatrix = mat4( cos(radians(rotationVector.y)), 0.0, -sin(radians(rotationVector.y)), 0.0, 
                                 0.0,                            1.0,  0.0,                            0.0, 
                                 sin(radians(rotationVector.y)), 0.0,  cos(radians(rotationVector.y)), 0.0, 
                                 0.0,                            0.0,  0.0,                            1.0);
    
    /* Matrix rotating Model-View matrix around Z axis. */
    mat4 zRotationMatrix = mat4( cos(radians(rotationVector.z)), sin(radians(rotationVector.z)), 0.0, 0.0, 
                                -sin(radians(rotationVector.z)), cos(radians(rotationVector.z)), 0.0, 0.0, 
                                 0.0,                            0.0,                            1.0, 0.0, 
                                 0.0,                            0.0,                            0.0, 1.0);
    
    /* Model-View matrix transformations. */
    modelViewMatrix = scaleMatrix;
    modelViewMatrix = xRotationMatrix  * modelViewMatrix;
    modelViewMatrix = yRotationMatrix  * modelViewMatrix;
    modelViewMatrix = zRotationMatrix  * modelViewMatrix;
    modelViewMatrix = cameraMatrix     * modelViewMatrix;
    
    /* Configure Model-View-ProjectionMatrix. */
    modelViewProjectionMatrix = projectionMatrix * modelViewMatrix;
    
    /* Set vertex position in Model-View-Projection space. */
    gl_Position = modelViewProjectionMatrix * position;
}

Fragment shader source for the wireframe torus

precision mediump float;
uniform vec4 color;
/* Output variable. */
out vec4 fragColor;
void main()
{
    fragColor = color;
}

If we now want to render the wireframe torus on screen, it is enough to issue a glDrawElements() call with the GL_LINES mode. The proper program object and vertex array object should of course be used.

void WireframeTorus::draw(float* rotationVector)
{
    GLint rotationVectorLocation = GL_CHECK(glGetUniformLocation(programID, "rotationVector"));
    /* Set required elements to draw mesh torus. */
    GL_CHECK(glUseProgram(programID));
    GL_CHECK(glBindVertexArray(vaoID));
    GL_CHECK(glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indicesBufferID));
    /* Pass Model-View matrix elements to the shader. */
    GL_CHECK(glUniform3fv(rotationVectorLocation, 1, rotationVector));
    /* Draw lines described by previously determined indices. */
    GL_CHECK(glDrawElements(GL_LINES, indicesCount, GL_UNSIGNED_INT, 0));
}

Please look at the shader objects we use for rendering the solid torus. The situation here is more complex, as some lighting is applied and an instanced drawing technique is used to transform the vertices into Bezier surfaces.

Vertex shader source for the solid torus

/* Number of control points in one dimension for a patch. */
const uint patchDimension = 4u;
/* Total number of control points in a patch. */
const uint controlPointsPerPatchCount = patchDimension * patchDimension;
/* Number of quads in a patch. */
const uint quadsInPatchCount = (patchDimension - 1u) * (patchDimension - 1u);
/* Total number of vertices in a patch. */
const uint verticesCount = 144u;
/* Input patch vertex coordinates. */
in vec2 patchUVPosition;
/* Constant transformation matrices. */
uniform mat4 cameraMatrix;
uniform mat4 projectionMatrix;
uniform mat4 scaleMatrix;
/* Coefficients of rotation needed for configuration of rotation matrix. */
uniform vec3 rotationVector;
/* Uniform block that stores control mesh indices. */
uniform ControlPointsIndices
{
    uint indices[controlPointsPerPatchCount * verticesCount / quadsInPatchCount];
};
/* Uniform block that stores control mesh vertices. */
uniform ControlPointsVertices
{
    vec4 vertices[verticesCount];
};
/* Normal vector set in Model-View-Projection space. */
out vec3 modelViewProjectionNormalVector;
void main()
{
    const float pi = 3.14159265358979323846;
    
    mat4 modelViewMatrix;
    mat4 modelViewProjectionMatrix;
    
    /* Array storing control vertices of current patch. */
    vec4 controlVertices[controlPointsPerPatchCount];
    
    /* Initialize array of current control vertices. */
    for (uint i = 0u; i < controlPointsPerPatchCount; ++i)
    {
        controlVertices[i] = vertices[indices[uint(gl_InstanceID) * controlPointsPerPatchCount + i]];
    }
    
    /* Coefficients of Bernstein polynomials. */
    vec2 bernsteinUV0 = (1.0 - patchUVPosition) * (1.0 - patchUVPosition) * (1.0 - patchUVPosition);
    vec2 bernsteinUV1 =  3.0 * patchUVPosition  * (1.0 - patchUVPosition) * (1.0 - patchUVPosition);
    vec2 bernsteinUV2 =  3.0 * patchUVPosition  *        patchUVPosition  * (1.0 - patchUVPosition);
    vec2 bernsteinUV3 =        patchUVPosition  *        patchUVPosition  *        patchUVPosition ;
    
    /* Position of a patch vertex on Bezier surface. */
    vec3 position = bernsteinUV0.x * (bernsteinUV0.y * controlVertices[ 0].xyz + bernsteinUV1.y * controlVertices[ 1].xyz + bernsteinUV2.y * controlVertices[ 2].xyz + bernsteinUV3.y * controlVertices[ 3].xyz) +
                    bernsteinUV1.x * (bernsteinUV0.y * controlVertices[ 4].xyz + bernsteinUV1.y * controlVertices[ 5].xyz + bernsteinUV2.y * controlVertices[ 6].xyz + bernsteinUV3.y * controlVertices[ 7].xyz) +
                    bernsteinUV2.x * (bernsteinUV0.y * controlVertices[ 8].xyz + bernsteinUV1.y * controlVertices[ 9].xyz + bernsteinUV2.y * controlVertices[10].xyz + bernsteinUV3.y * controlVertices[11].xyz) +
                    bernsteinUV3.x * (bernsteinUV0.y * controlVertices[12].xyz + bernsteinUV1.y * controlVertices[13].xyz + bernsteinUV2.y * controlVertices[14].xyz + bernsteinUV3.y * controlVertices[15].xyz);
    
    /* Matrix rotating Model-View matrix around X axis. */
    mat4 xRotationMatrix = mat4(1.0,  0.0,                            0.0,                            0.0, 
                                0.0,  cos(radians(rotationVector.x)), sin(radians(rotationVector.x)), 0.0, 
                                0.0, -sin(radians(rotationVector.x)), cos(radians(rotationVector.x)), 0.0, 
                                0.0,  0.0,                            0.0,                            1.0);
                
    /* Matrix rotating Model-View matrix around Y axis. */
    mat4 yRotationMatrix = mat4( cos(radians(rotationVector.y)), 0.0, -sin(radians(rotationVector.y)), 0.0, 
                                 0.0,                            1.0,  0.0,                            0.0, 
                                 sin(radians(rotationVector.y)), 0.0,  cos(radians(rotationVector.y)), 0.0, 
                                 0.0,                            0.0,  0.0,                            1.0);
    
    /* Matrix rotating Model-View matrix around Z axis. */
    mat4 zRotationMatrix = mat4( cos(radians(rotationVector.z)), sin(radians(rotationVector.z)), 0.0, 0.0, 
                                -sin(radians(rotationVector.z)), cos(radians(rotationVector.z)), 0.0, 0.0, 
                                0.0,                             0.0,                            1.0, 0.0, 
                                0.0,                             0.0,                            0.0, 1.0);
    
    /* Model-View matrix transformations. */
    modelViewMatrix = scaleMatrix;
    modelViewMatrix = xRotationMatrix * modelViewMatrix;
    modelViewMatrix = yRotationMatrix * modelViewMatrix;
    modelViewMatrix = zRotationMatrix * modelViewMatrix;
    modelViewMatrix = cameraMatrix    * modelViewMatrix;
    /* Configure Model-View-ProjectionMatrix. */
    modelViewProjectionMatrix = projectionMatrix * modelViewMatrix;
    /* Set vertex position in Model-View-Projection space. */
    gl_Position = modelViewProjectionMatrix * vec4(position, 1.0);
    /* Angle on the "big circle" of torus. */
    float phi = (patchUVPosition.x + mod(float(gl_InstanceID), 4.0)) * pi / 2.0;
    /* Angle on the "small circle" of torus. */
    float theta = (patchUVPosition.y + mod(float(gl_InstanceID / 4), 4.0)) * pi / 2.0;
    /* Horizontal tangent to torus. */
    vec3 dBdu = vec3(-sin(phi), 0.0, cos(phi));
    /* Vertical tangent to torus. */
    vec3 dBdv = vec3(cos(phi) * (-sin(theta)), cos(theta), sin(phi) * (-sin(theta)));
    /* Calculate normal vector. */
    vec3 normalVector = normalize(cross(dBdu, dBdv));
    /* Calculate normal matrix. */
    mat3 normalMatrix = transpose(inverse(mat3x3(modelViewMatrix)));
    /* Transform normal vector to Model-View-Projection space. */
    modelViewProjectionNormalVector = normalize(normalMatrix * normalVector);
}

Fragment shader source for the solid torus

precision mediump float;
/* Input normal vector. */
in vec3 modelViewProjectionNormalVector;
/* Structure storing directional light parameters. */
struct Light
{
    vec3  lightColor;
    vec3  lightDirection;
    float ambientIntensity;
};
/* Color of the drawn torus. */
uniform vec4  color;
/* Uniform representing light parameters. */
uniform Light light;
/* Output variable. */
out vec4 fragColor;
void main()
{
    /* Calculate the value of diffuse intensity. */
    float diffuseIntensity = max(0.0, -dot(modelViewProjectionNormalVector, normalize(light.lightDirection)));
    /* Calculate the output color value considering the light. */
    fragColor = color * vec4(light.lightColor * (light.ambientIntensity + diffuseIntensity), 1.0);
}
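The Bernstein-based evaluation in the vertex shader can be mirrored on the CPU to convince ourselves of one key property: at the patch corners the surface passes exactly through the corner control points, which is what makes neighbouring patches meet. A hedged sketch (16 scalar control values stand in for the vec4 control vertices used by the shader):

```cpp
#include <cassert>
#include <cmath>

/* Evaluate a bicubic Bezier patch of 16 scalar control values cp[16]
 * (row-major, 4x4) at parameters (u, v), mirroring the shader math. */
float bezierPatch(const float cp[16], float u, float v)
{
    float bu[4], bv[4];
    float iu = 1.0f - u, iv = 1.0f - v;
    /* Bernstein coefficients in u and v, as in the shader. */
    bu[0] = iu * iu * iu;      bu[1] = 3.0f * u * iu * iu;
    bu[2] = 3.0f * u * u * iu; bu[3] = u * u * u;
    bv[0] = iv * iv * iv;      bv[1] = 3.0f * v * iv * iv;
    bv[2] = 3.0f * v * v * iv; bv[3] = v * v * v;
    float result = 0.0f;
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            result += bu[i] * bv[j] * cp[i * 4 + j];
    return result;
}
```

At (u, v) = (0, 0) all Bernstein weights except the first vanish, so only controlVertices[0] contributes; likewise the other three corners map to the other corner control points.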

If we now want to render the solid torus on screen, it is enough to issue a glDrawElementsInstanced() call with the GL_TRIANGLES mode. The proper program object and vertex array object should of course be used.

void InstancedSolidTorus::draw(float* rotationVector)
{
    /* Location of rotation vector. */
    GLint rotationVectorLocation = GL_CHECK(glGetUniformLocation(programID, "rotationVector"));
    /* Set required OpenGL ES state. */
    GL_CHECK(glUseProgram     (programID                                    ));
    GL_CHECK(glBindVertexArray(vaoID                                        ));
    GL_CHECK(glBindBuffer     (GL_ELEMENT_ARRAY_BUFFER, patchIndicesBufferID));
    if (rotationVectorLocation != -1)
    {
        /* Pass rotation parameters to the shader. */
        GL_CHECK(glUniform3fv(rotationVectorLocation, 1, rotationVector));
    }
    else
    {
        LOGE("Could not locate \"rotationVector\" uniform in program [%d]", programID);
    }
    /* Draw patchInstancesCount instances of patchTriangleIndicesCount triangles. */ 
    GL_CHECK(glDrawElementsInstanced(GL_TRIANGLES, patchTriangleIndicesCount, GL_UNSIGNED_INT, 0, patchInstancesCount));
}

Result

We want the models to rotate, which is why we need to calculate new rotation angles every frame. Once the rotation vector is generated, it is used to update the vertex positions of both the wireframe and the solid torus.

/* Increment rotation angles. */
angleX += 0.5;
angleY += 0.5;
angleZ += 0.5;
if(angleX >= 360.0f) angleX = 0.0f;
if(angleY >= 360.0f) angleY = 0.0f;
if(angleZ >= 360.0f) angleZ = 0.0f;
float rotationVector[] = {angleX, angleY, angleZ};
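The per-frame update above can be written as a small helper (hypothetical, not in the SDK) so the wrap-around behaviour is easy to check:

```cpp
#include <cassert>

/* Advance a rotation angle by delta degrees, wrapping back to the
 * [0, 360) range as the render loop above does. */
float advanceAngle(float angle, float delta)
{
    angle += delta;
    if (angle >= 360.0f)
    {
        angle = 0.0f;
    }
    return angle;
}
```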

The calculated values are then used while drawing the requested models.

wireframeTorus->draw(rotationVector);
solidTorus->draw(rotationVector);

Instancing

This sample presents the instanced drawing technique using OpenGL ES 3.0.

Each cube is an instance of the same object.

There is only one copy of the cube vertex data in memory, and each cube drawn is an instance of that data. This reduces the amount of memory that needs to be transferred to the GPU. By using gl_InstanceID in the shader, each cube can have a different position, rotation speed and colour. This technique can be used anywhere a scene contains repeated geometry.

Generating a Geometry

To render a cube (the most basic 3D shape), we need to generate the coordinates of its vertices. This is the first step we will focus on. Please look at the picture below.

Coordinates of the cube vertices.

As shown in the figure, the cube vertex coordinates are arranged around the point <0, 0, 0>, placing them within the range [<-1, -1, -1>, <1, 1, 1>]. This is not strictly necessary: you can translate the coordinates in any direction or scale the cube, but you have to be sure the cube will still be visible on screen. If you are not sure how to do that, please follow our suggestions. We have one more reason for using coordinates arranged around the centre of the screen (the point <0, 0, 0>): we will be generating copies of the cube, and each instance will be translated to a new position (so that the cubes move along a circular trajectory).

The cube vertices alone are not enough to draw a cube shape. The basic OpenGL ES rendering technique is based on drawing the triangles that make up the requested shape. It is worth mentioning here that when describing the vertices of the cube's triangles, a consistent clockwise or counter-clockwise order should be followed, otherwise OpenGL ES will have trouble distinguishing front and back faces. In this example we use clockwise (CW) order to describe the cube coordinates; keep in mind that OpenGL ES treats counter-clockwise (GL_CCW) winding as front-facing by default, so the chosen winding has to be taken into account once face culling is enabled.
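Winding can be verified numerically: for a triangle viewed from outside the cube, the cross product of its two edge vectors points into the cube when the vertices are listed clockwise. A small sketch using the top-face triangle A, B, C from the listing below:

```cpp
#include <cassert>

/* Cross product of the two edges (B - A) and (C - A) of the top-face
 * triangle A(-1,1,1), B(-1,1,-1), C(1,1,-1). For a clockwise triangle
 * seen from outside the cube (from +Y here), the result points
 * inward, i.e. along -Y. */
void topFaceEdgeCross(float result[3])
{
    const float a[3] = {-1.0f, 1.0f,  1.0f};
    const float b[3] = {-1.0f, 1.0f, -1.0f};
    const float c[3] = { 1.0f, 1.0f, -1.0f};
    float ab[3] = {b[0] - a[0], b[1] - a[1], b[2] - a[2]};
    float ac[3] = {c[0] - a[0], c[1] - a[1], c[2] - a[2]};
    result[0] = ab[1] * ac[2] - ab[2] * ac[1];
    result[1] = ab[2] * ac[0] - ab[0] * ac[2];
    result[2] = ab[0] * ac[1] - ab[1] * ac[0];
}
```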

Triangles that make up the cube shape.

    /* Please see header for the specification. */
    void CubeModel::getTriangleRepresentation(float** coordinatesPtrPtr,
                                              int*    numberOfCoordinatesPtr,
                                              int*    numberOfPointsPtr,
                                              float   scalingFactor)
    {
        ASSERT(coordinatesPtrPtr != NULL,
               "Cannot use null pointer while calculating coordinates");
        /* Index of an array we will put new point coordinates at. */
        int       currentIndex                    = 0;
        /* 6 faces of cube, 2 triangles for each face, 3 points of triangle, 3 coordinates for each point. */
        const int numberOfCubeTriangleCoordinates = NUMBER_OF_CUBE_FACES        *
                                                    NUMBER_OF_TRIANGLES_IN_QUAD *
                                                    NUMBER_OF_TRIANGLE_VERTICES *
                                                    NUMBER_OF_POINT_COORDINATES;
        /* Allocate memory for result array. */
        *coordinatesPtrPtr = (float*) malloc(numberOfCubeTriangleCoordinates * sizeof(float));
        /* Is allocation successful?. */
        ASSERT(*coordinatesPtrPtr != NULL,
               "Could not allocate memory for result array.");
        /* Example:
         * Coordinates for cube points:
         * A -1.0f,  1.0f,  1.0f
         * B -1.0f,  1.0f, -1.0f
         * C  1.0f,  1.0f, -1.0f
         * D  1.0f,  1.0f,  1.0f
         * E -1.0f, -1.0f,  1.0f
         * F -1.0f, -1.0f, -1.0f
         * G  1.0f, -1.0f, -1.0f
         * H  1.0f, -1.0f,  1.0f
         * Create 2 triangles for each face of the cube. Vertices are written in clockwise order.
         *       B ________ C
         *      / |     /  |
         *  A ......... D  |
         *    .   |   .    |
         *    .  F|_ _.___ |G
         *    . /     .  /
         *  E ......... H
         */
        const Vec3f pointA = {-1.0f,  1.0f,  1.0f};
        const Vec3f pointB = {-1.0f,  1.0f, -1.0f};
        const Vec3f pointC = { 1.0f,  1.0f, -1.0f};
        const Vec3f pointD = { 1.0f,  1.0f,  1.0f};
        const Vec3f pointE = {-1.0f, -1.0f,  1.0f};
        const Vec3f pointF = {-1.0f, -1.0f, -1.0f};
        const Vec3f pointG = { 1.0f, -1.0f, -1.0f};
        const Vec3f pointH = { 1.0f, -1.0f,  1.0f};
        /* Fill the array with coordinates. */
        /* Top face. */
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* B */
        (*coordinatesPtrPtr)[currentIndex++] = pointB.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* D */
        (*coordinatesPtrPtr)[currentIndex++] = pointD.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.z;
        /* Bottom face. */
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* E */
        (*coordinatesPtrPtr)[currentIndex++] = pointE.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.z;
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* G */
        (*coordinatesPtrPtr)[currentIndex++] = pointG.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.z;
        /* Back face. */
        /* G */
        (*coordinatesPtrPtr)[currentIndex++] = pointG.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* B */
        (*coordinatesPtrPtr)[currentIndex++] = pointB.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.z;
        /* G */
        (*coordinatesPtrPtr)[currentIndex++] = pointG.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.z;
        /* B */
        (*coordinatesPtrPtr)[currentIndex++] = pointB.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.z;
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* Front face. */
        /* E */
        (*coordinatesPtrPtr)[currentIndex++] = pointE.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.z;
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* D */
        (*coordinatesPtrPtr)[currentIndex++] = pointD.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.z;
        /* E */
        (*coordinatesPtrPtr)[currentIndex++] = pointE.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.z;
        /* D */
        (*coordinatesPtrPtr)[currentIndex++] = pointD.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.z;
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* Right face. */
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* D */
        (*coordinatesPtrPtr)[currentIndex++] = pointD.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* G */
        (*coordinatesPtrPtr)[currentIndex++] = pointG.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.z;
        /* Left face. */
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* B */
        (*coordinatesPtrPtr)[currentIndex++] = pointB.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.z;
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* E */
        (*coordinatesPtrPtr)[currentIndex++] = pointE.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.z;
        /* Calculate size of a cube. */
        if (scalingFactor != 1.0f)
        {
            for (int i = 0; i < numberOfCubeTriangleCoordinates; i++)
            {
                (*coordinatesPtrPtr)[i] *= scalingFactor;
            }
        }
        if (numberOfCoordinatesPtr != NULL)
        {
            *numberOfCoordinatesPtr = numberOfCubeTriangleCoordinates;
        }
        if (numberOfPointsPtr != NULL)
        {
            *numberOfPointsPtr = numberOfCubeTriangleCoordinates / NUMBER_OF_POINT_COORDINATES;
        }
    }
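The buffer size computed above follows from simple arithmetic; a quick check (with the constant values written out explicitly, as described by the comment in the listing):

```cpp
#include <cassert>

/* 6 faces of the cube, 2 triangles per face, 3 vertices per triangle,
 * 3 coordinates per vertex - as the comment in the listing states. */
int cubeCoordinateCount()
{
    const int numberOfCubeFaces        = 6;
    const int numberOfTrianglesInQuad  = 2;
    const int numberOfTriangleVertices = 3;
    const int numberOfPointCoordinates = 3;
    return numberOfCubeFaces * numberOfTrianglesInQuad *
           numberOfTriangleVertices * numberOfPointCoordinates;
}
```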

Now we want to tell OpenGL ES to take the data and draw the cube.

First of all, we need a buffer object that will store the coordinates of the triangles that make up the cube.

Generate the buffer objects (in the code we generate 3 buffer objects, because we will use them in later steps, but at this point only one of them is needed: the cubeCoordinatesBufferObjectId buffer object):

/* Generate buffers. */
GL_CHECK(glGenBuffers(numberOfBufferObjectIds, bufferObjectIds));
cubeCoordinatesBufferObjectId  = bufferObjectIds[0];
cubeColorsBufferObjectId       = bufferObjectIds[1];
uniformBlockDataBufferObjectId = bufferObjectIds[2];

We need to call the function (already described above) to retrieve the cube's coordinate data.

/* Get triangular representation of a cube. Save data in cubeTrianglesCoordinates array. */
CubeModel::getTriangleRepresentation(&cubeTrianglesCoordinates,
                                     &numberOfCubeTriangleCoordinates,
                                     &numberOfCubeVertices,
                                      cubeSize);

The next step is to copy the retrieved data into the buffer object.

/* Buffer holding coordinates of triangles which create a cube. */
GL_CHECK(glBindBuffer(GL_ARRAY_BUFFER,
                  cubeCoordinatesBufferObjectId));
GL_CHECK(glBufferData(GL_ARRAY_BUFFER,
                  numberOfCubeTriangleCoordinates * sizeof(GLfloat),
                  cubeTrianglesCoordinates,
                  GL_STATIC_DRAW));

The next thing to do is to set up the vertex attrib array for the cube's coordinates. This function operates on the array buffer object that is currently bound. We rebind buffer objects many times in the application, which is why we need to do it once more here. However, if you are sure that the buffer object storing the vertex coordinates is currently bound to the GL_ARRAY_BUFFER target, this is not needed. Please note that all the functions below should be called for the active program object.

GL_CHECK(glBindBuffer             (GL_ARRAY_BUFFER,
                                   cubeCoordinatesBufferObjectId));
GL_CHECK(glEnableVertexAttribArray(positionLocation));
GL_CHECK(glVertexAttribPointer    (positionLocation,
                                   NUMBER_OF_POINT_COORDINATES,
                                   GL_FLOAT,
                                   GL_FALSE,
                                   0,
                                   0));

At this point you may wonder about the positionLocation variable: what does it represent? It is an attribute location retrieved from the program object we use for rendering. Once you are familiar with program objects and have properly initialized that value, you can call

glDrawArrays(GL_TRIANGLES, 0, numberOfCubeVertices);

to render a single cube on screen. Note that the GL_TRIANGLES mode is used, which corresponds to the triangle representation of the cube coordinates we generated.

Program Object

To start using a program object, we first have to generate its ID.

renderingProgramId = GL_CHECK(glCreateProgram());

A program object needs fragment and vertex shaders attached to it. Let us now focus on generating and setting up the shader objects.

Shader::processShader(&vertexShaderId,   VERTEX_SHADER_FILE_NAME,   GL_VERTEX_SHADER);
Shader::processShader(&fragmentShaderId, FRAGMENT_SHADER_FILE_NAME, GL_FRAGMENT_SHADER);

The basic mechanism is as follows:

  1. Create a shader object:
*shaderObjectIdPtr = GL_CHECK(glCreateShader(shaderType));
  2. Set the shader source:
GL_CHECK(glShaderSource(*shaderObjectIdPtr, 1, strings, NULL));

Note that the strings variable stores the shader source read from a file.

strings[0] = loadShader(filename);
  3. Compile the shader:
GL_CHECK(glCompileShader(*shaderObjectIdPtr));

It is always a good idea to verify that compilation succeeded by checking GL_COMPILE_STATUS (GL_TRUE is expected).

GL_CHECK(glGetShaderiv(*shaderObjectIdPtr, GL_COMPILE_STATUS, &compileStatus));

Once these functions have been called for both the fragment and vertex shaders, attach both to the program object,

GL_CHECK(glAttachShader(renderingProgramId, vertexShaderId));
GL_CHECK(glAttachShader(renderingProgramId, fragmentShaderId));

link the program object,

GL_CHECK(glLinkProgram(renderingProgramId));

and set the program object to be used (active).

GL_CHECK(glUseProgram(renderingProgramId));

The shader objects we use in the application are more advanced; however, if you only want to render a single cube, defining the shaders with the code below is sufficient.

Vertex shader:

#version 300 es
in      vec4 attributePosition;
uniform vec3 cameraVector;
uniform vec4 perspectiveVector;
void main()
{
    float fieldOfView = 1.0 / tan(perspectiveVector.x * 0.5);
                      
    mat4 cameraMatrix = mat4 (1.0,            0.0,            0.0,           0.0, 
                              0.0,            1.0,            0.0,           0.0, 
                              0.0,            0.0,            1.0,           0.0, 
                              cameraVector.x, cameraVector.y, cameraVector.z, 1.0);
    mat4 perspectiveMatrix = mat4 (fieldOfView/perspectiveVector.y, 0.0,         0.0,                                                                                             0.0, 
                                   0.0,                             fieldOfView, 0.0,                                                                                             0.0, 
                                   0.0,                             0.0,        -(perspectiveVector.w + perspectiveVector.z) / (perspectiveVector.w - perspectiveVector.z),      -1.0, 
                                   0.0,                             0.0,        (-2.0 * perspectiveVector.w * perspectiveVector.z) / (perspectiveVector.w - perspectiveVector.z), 0.0);
  
    /* Return gl_Position. */
    gl_Position = perspectiveMatrix * cameraMatrix * attributePosition;
}
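The projection built in this vertex shader is the standard perspective matrix. It can be mirrored on the CPU to check it maps the near and far planes to NDC z = -1 and z = +1; a sketch, assuming perspectiveVector = (fovy in radians, aspect, near, far) as the shader does:

```cpp
#include <cassert>
#include <cmath>

/* Column-major perspective matrix matching the shader above, where
 * perspectiveVector = (fovy [radians], aspect, near, far). */
void perspectiveMatrix(float fovy, float aspect, float nearZ, float farZ,
                       float m[16])
{
    float fieldOfView = 1.0f / tanf(fovy * 0.5f);
    for (int i = 0; i < 16; ++i) m[i] = 0.0f;
    m[0]  = fieldOfView / aspect;
    m[5]  = fieldOfView;
    m[10] = -(farZ + nearZ) / (farZ - nearZ);
    m[11] = -1.0f;
    m[14] = (-2.0f * farZ * nearZ) / (farZ - nearZ);
}
```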

Fragment shader:

#version 300 es
precision mediump float;
out vec4 fragmentColour;
void main()
{
    fragmentColour = vec4(0.3, 0.2, 0.8, 1.0); //Please use any colour you want
}

Now you can see what the positionLocation variable represents: it is the location of the attribute named attributePosition. How do we obtain an attribute location?

Once the program object is linked and active, you can call

positionLocation = GL_CHECK(glGetAttribLocation   (renderingProgramId, "attributePosition"));

Remember that it is always a good idea to check whether the retrieved value is valid. If the attribute name is not found in the shader, or is inactive (unused), -1 is returned (which is treated as an invalid attribute location).

ASSERT(positionLocation != -1,  "Could not retrieve attribute location: attributePosition");

If a valid value was returned, it can be used in the steps described earlier.

Instanced Drawing

The main idea of this application is to present the instanced drawing technique. Once you are familiar with the previous sections, you are ready to do that.

First of all, we have to know how many instances of the cube object we want rendered.

/* Number of cubes that are drawn on a screen. */
#define NUMBER_OF_CUBES (10)

The next thing to do is to adjust the draw command so that all the cubes are rendered. Note that this time we call glDrawArraysInstanced() instead of glDrawArrays().

/* Draw cubes on a screen. */
GL_CHECK(glDrawArraysInstanced(GL_TRIANGLES,
                               0,
                               numberOfCubeVertices,
                               NUMBER_OF_CUBES));

We now have NUMBER_OF_CUBES cubes drawn on screen. The problem is that all of them are rendered at the same position, so we cannot see them all: each cube needs its own position. The cubes should also have different colours, so we will use a uniform block for this purpose.

Two new things have been added to our vertex shader:

/*
 * We use uniform block in order to reduce amount of memory transfers to minimum. 
 * The uniform block uses data taken directly from a buffer object. 
 */
uniform CubesUniformBlock
{
    float startPosition[numberOfCubes];
    vec4  cubeColor[numberOfCubes];
};

where numberOfCubes is defined as

const int   numberOfCubes = 10;

Then, if we want to take an element from one of these arrays, we need to use gl_InstanceID as the index: it indicates the index of the instance currently being rendered (in our case, a value from the range [0, NUMBER_OF_CUBES - 1]).

On the API side, we need to retrieve the index of the uniform block,

uniformBlockIndex = GL_CHECK(glGetUniformBlockIndex(renderingProgramId, "CubesUniformBlock"));

verify that the returned value is valid,

ASSERT(uniformBlockIndex != GL_INVALID_INDEX, "Could not retrieve uniform block index: CubesUniformBlock");

and set the data.

/* Set binding point for uniform block. */
GL_CHECK(glUniformBlockBinding(renderingProgramId,
                               uniformBlockIndex,
                               0));
GL_CHECK(glBindBufferBase     (GL_UNIFORM_BUFFER,
                               0,
                               uniformBlockDataBufferObjectId));

The program object will use the data stored in the buffer object named uniformBlockDataBufferObjectId.

/* Buffer holding coordinates of start positions of cubes and RGBA values of colors. */
GL_CHECK(glBindBuffer(GL_ARRAY_BUFFER,
                      uniformBlockDataBufferObjectId));
GL_CHECK(glBufferData(GL_ARRAY_BUFFER,
                      sizeof(startPosition) + sizeof(cubeColors),
                      NULL,
                      GL_STATIC_DRAW));
GL_CHECK(glBufferSubData(GL_ARRAY_BUFFER,
                         0,
                         sizeof(startPosition),
                         startPosition));
GL_CHECK(glBufferSubData(GL_ARRAY_BUFFER,
                         sizeof(startPosition),
                         sizeof(cubeColors),
                         cubeColors));
void generateStartPosition()
{
    float spaceBetweenCubes = (2 * M_PI) / (NUMBER_OF_CUBES);
    /* Fill array with startPosition data. */
    for (int allCubes = 0; allCubes < NUMBER_OF_CUBES; allCubes++)
    {
        startPosition[allCubes] = allCubes * spaceBetweenCubes;
    }
}
void fillCubeColorsArray()
{
    for (int allComponents = 0;
             allComponents < numberOfValuesInCubeColorsArray;
             allComponents++)
    {
        /* Get random value from [0.0, 1.0] range. */
        cubeColors[allComponents] = (float)rand() / (float)RAND_MAX;
    }
}
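The two helpers above reference arrays and constants declared elsewhere in the sample. A self-contained sketch of the assumed declarations (the sizes are assumptions derived from numberOfCubes = 10 and 4 RGBA components per colour):

```c
#include <math.h>
#include <stdlib.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define NUMBER_OF_CUBES 10

/* 4 colour components (RGBA) per cube. */
static const int numberOfValuesInCubeColorsArray = 4 * NUMBER_OF_CUBES;

static float startPosition[NUMBER_OF_CUBES];
static float cubeColors[4 * NUMBER_OF_CUBES];

/* Place the cubes' start angles evenly around a full circle. */
static void generateStartPosition(void)
{
    float spaceBetweenCubes = (2.0f * (float)M_PI) / NUMBER_OF_CUBES;

    for (int i = 0; i < NUMBER_OF_CUBES; i++)
    {
        startPosition[i] = i * spaceBetweenCubes;
    }
}

/* Fill every colour component with a random value from [0.0, 1.0]. */
static void fillCubeColorsArray(void)
{
    for (int i = 0; i < numberOfValuesInCubeColorsArray; i++)
    {
        cubeColors[i] = (float)rand() / (float)RAND_MAX;
    }
}
```

With this setup, cube i starts at angle i * 2π/10 on the circular trajectory, which keeps the cubes evenly spaced as they orbit.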

We want our cubes to move along a circular trajectory, rotate at different speeds, and have different colours.

Vertex shader code

/* [Define number of cubes] */
const int   numberOfCubes = 10;
/* [Define number of cubes] */
const float pi            = 3.14159265358979323846;
const float radius        = 20.0;
in      vec4 attributeColor;
in      vec4 attributePosition;
out     vec4 vertexColor;
uniform vec3 cameraVector;
uniform vec4 perspectiveVector;
uniform float time; /* Time value used for determining positions and rotations. */
/* [Define uniform block] */
/*
 * We use uniform block in order to reduce amount of memory transfers to minimum. 
 * The uniform block uses data taken directly from a buffer object. 
 */
uniform CubesUniformBlock
{
    float startPosition[numberOfCubes];
    vec4  cubeColor[numberOfCubes];
};
/* [Define uniform block] */
void main()
{
    float fieldOfView = 1.0 / tan(perspectiveVector.x * 0.5);
    
    /* Vector data used for translation of cubes (each cube is placed on and moving around a circular curve). */
    vec3 locationOfCube = vec3(radius * cos(startPosition[gl_InstanceID] + (time/3.0)),
                               radius * sin(startPosition[gl_InstanceID] + (time/3.0)),
                               1.0);
    /* 
     * Vector data used for setting rotation of cube. Each cube has different speed of rotation,
     * first cube has the slowest rotation, the last one has the fastest. 
     */
    vec3 rotationOfCube = vec3 (float(gl_InstanceID + 1) * 5.0 * time);
    
    /* 
     * Set different random colours for each cube. 
     * There is one colour passed in per cube set for each cube (cubeColor[gl_InstanceID]).
     * There are also different colours per vertex of a cube (attributeColor).
     */
    vertexColor = attributeColor * cubeColor[gl_InstanceID];
    
    /* Create transformation matrices. */
    mat4 translationMatrix = mat4 (1.0,              0.0,              0.0,              0.0, 
                                   0.0,              1.0,              0.0,              0.0, 
                                   0.0,              0.0,              1.0,              0.0, 
                                   locationOfCube.x, locationOfCube.y, locationOfCube.z, 1.0);
                                  
    mat4 cameraMatrix = mat4 (1.0,            0.0,            0.0,            0.0, 
                              0.0,            1.0,            0.0,            0.0, 
                              0.0,            0.0,            1.0,            0.0, 
                              cameraVector.x, cameraVector.y, cameraVector.z, 1.0);
    
    mat4 xRotationMatrix = mat4 (1.0,  0.0,                                0.0,                                0.0, 
                                 0.0,  cos(pi * rotationOfCube.x / 180.0), sin(pi * rotationOfCube.x / 180.0), 0.0, 
                                 0.0, -sin(pi * rotationOfCube.x / 180.0), cos(pi * rotationOfCube.x / 180.0), 0.0, 
                                 0.0,  0.0,                                0.0,                                1.0);
                                
    mat4 yRotationMatrix = mat4 (cos(pi * rotationOfCube.y / 180.0), 0.0, -sin(pi * rotationOfCube.y / 180.0), 0.0, 
                                 0.0,                                1.0,  0.0,                                0.0, 
                                 sin(pi * rotationOfCube.y / 180.0), 0.0,  cos(pi * rotationOfCube.y / 180.0), 0.0, 
                                 0.0,                                0.0,  0.0,                                1.0);
                                
    mat4 zRotationMatrix = mat4 ( cos(pi * rotationOfCube.z / 180.0), sin(pi * rotationOfCube.z / 180.0), 0.0, 0.0, 
                                 -sin(pi * rotationOfCube.z / 180.0), cos(pi * rotationOfCube.z / 180.0), 0.0, 0.0, 
                                  0.0,                                0.0,                                1.0, 0.0, 
                                  0.0,                                0.0,                                0.0, 1.0);
                                 
    mat4 perspectiveMatrix = mat4 (fieldOfView/perspectiveVector.y, 0.0,        0.0,                                                                                              0.0, 
                                   0.0,                            fieldOfView, 0.0,                                                                                              0.0, 
                                   0.0,                            0.0,        -(perspectiveVector.w + perspectiveVector.z) / (perspectiveVector.w - perspectiveVector.z),        -1.0, 
                                   0.0,                            0.0,        (-2.0 * perspectiveVector.w * perspectiveVector.z) / (perspectiveVector.w - perspectiveVector.z), 0.0);
    /* Compute rotation. */
    mat4 tempMatrix = xRotationMatrix;
    
    tempMatrix = yRotationMatrix * tempMatrix;
    tempMatrix = zRotationMatrix * tempMatrix;
    
    /* Compute translation. */
    tempMatrix = translationMatrix * tempMatrix;
    tempMatrix = cameraMatrix      * tempMatrix;
                
    /* Compute perspective. */
    tempMatrix = perspectiveMatrix * tempMatrix;
                
    /* Return gl_Position. */
    gl_Position = tempMatrix * attributePosition;
}

Fragment shader code

in vec4 vertexColor;
out vec4 fragmentColour;
void main()
{
    fragmentColour = vertexColor;
}

In the API, we query the locations of all the uniforms used in the vertex shader by calling

uniformLocation = glGetUniformLocation(renderingProgramId, "uniformName");
ASSERT(uniformLocation != -1, "Could not retrieve uniform location: uniformName");

Of course, uniformName stands for the actual name of a uniform used in the shader.

Then, depending on the uniform type (float, vec3, vec4), we use a different OpenGL ES call to set the uniform's value.

The camera position and the perspective vector are constant during rendering, so it is enough to issue the calls shown below only once.

GL_CHECK(glUniform4fv(perspectiveMatrixLocation,
                      1,
                      (GLfloat*)&perspectiveVector));
GL_CHECK(glUniform3fv(cameraPositionLocation,
                      1,
                      (GLfloat*)&cameraVector));

The time value should be updated every frame, so the call

GL_CHECK(glUniform1f(timeLocation, time));

should be issued for every frame being rendered (placed inside the renderFrame() function).
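A minimal sketch of the per-frame flow this implies (names such as timeLocation and numberOfCubeVertices are assumptions; the GL calls appear only in comments so the sketch stays runnable without a GL context):

```c
/* Ever-growing time value fed to the shader each frame. */
static float timeValue = 0.0f;

/* Advance the animation clock; call once per frame from renderFrame(). */
static float advanceTime(float deltaSeconds)
{
    timeValue += deltaSeconds;

    /* In the real renderFrame() the updated value would then be uploaded:
     *     GL_CHECK(glUniform1f(timeLocation, timeValue));
     * followed by the single instanced draw call that renders all cubes:
     *     GL_CHECK(glDrawArraysInstanced(GL_TRIANGLES,
     *                                    0,
     *                                    numberOfCubeVertices,
     *                                    numberOfCubes));
     */
    return timeValue;
}
```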

After completing all of the steps described above, we get the final result.
