Playing with WebGL (Part 1) -- Environment Setup

Introduction

   Most of the fundamentals of real-time rendering have been reviewed by now. I have a fairly complete picture of the real-time rendering pipeline, and I have read through several open-source engines carefully, so I would say I have at least glimpsed the core of the field. The next step is to organize code implementations of the algorithms commonly used in production real-time rendering. Some of them were already implemented during my Vulkan studies; I have implemented the rendering equation as well, but that was a long time ago. For now the focus stays on real-time rendering, in particular material-related algorithms and real-time ray tracing (RTRT).
  To start, I am working through GAMES202, which is also a good opportunity to pick up WebGL. My previous day job was Java full-stack development, so JavaScript comes naturally. I will hide the parts of the homework that contain actual solutions, since the instructor's wishes should be respected.

WebGL – Environment Setup on Windows

  • Download and install Node.js: http://nodejs.cn/download/
    • After installation, run PowerShell as administrator:
      • Enter set-ExecutionPolicy RemoteSigned and press Enter
      • Enter A and press Enter
      • Enter npm -v; if a version number is printed, the installation succeeded
  • Download and install VSCode: https://code.visualstudio.com/docs/?dv=win
    • Install the Live Server extension
    • Ctrl+Shift+P -> type Live Server -> Open with Live Server
  • Set up a local server with Node.js (a quick verification page follows this list)
    • Run PowerShell as administrator:
      • npm install http-server -g
      • cd <project root>
      • http-server . -p 8000
      • Open 127.0.0.1:8000 in the browser (I personally still prefer 8080)
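
To check that the server and WebGL itself work before touching the homework framework, a minimal page can be dropped into the served directory (my own smoke test, not part of the assignment; the file name test.html is arbitrary):

<!DOCTYPE html>
<html>
<body>
    <canvas id="test" width="400" height="300"></canvas>
    <script>
        // Grab a WebGL context and clear the canvas to a solid color.
        const gl = document.getElementById('test').getContext('webgl');
        if (!gl) {
            alert('WebGL is not available in this browser.');
        } else {
            gl.clearColor(0.2, 0.4, 0.6, 1.0); // any visible color will do
            gl.clear(gl.COLOR_BUFFER_BIT);
        }
    </script>
</body>
</html>

With http-server (or Live Server) running, opening 127.0.0.1:8000/test.html should show a solid colored rectangle.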

Homework0

index.html

  • lib folder: third-party libraries referenced by the project
  • src folder: code written ourselves
<!DOCTYPE html>
<html>
<head>
    <style>
        html,
        body {
            margin: 0;
            background-color: black;
            height: 100%;
            width: 100%;
            overflow: hidden;
        }
        /* size the drawing canvas */
        #glcanvas {
            top: 0;
            width: 100%;
            height: 100%;
        }
    </style>
    <!-- three.js rendering engine -->
    <script src="lib/three.js" defer></script>
    <!-- OrbitControls: camera controls -->
    <script src="lib/OrbitControls.js" defer></script>
    <!-- library for loading .mtl materials -->
    <script type="text/javascript" src="lib/MTLLoader.js" defer></script>
    <!-- library for loading .obj models -->
    <script type="text/javascript" src="lib/OBJLoader.js" defer></script>
    <!-- dat.GUI: graphical user interface library -->
    <script type="text/javascript" src="lib/dat.gui.js" defer></script>
    <!-- gl-matrix: matrix/vector math library -->
    <script src="https://cdnjs.cloudflare.com/ajax/libs/gl-matrix/2.8.1/gl-matrix-min.js"
        integrity="sha512-zhHQR0/H5SEBL3Wn6yYSaTTZej12z0hVZKOv3TwCUXT1z5qeqGcXJLLrbERYRScEDDpYIJhPC1fk31gqR783iQ=="
        crossorigin="anonymous" defer> </script>
    <!-- built-in shader source strings -->
    <script src="src/shaders/InternalShader.js" defer></script>
    <!-- script for compiling shaders -->
    <script src="src/shaders/Shader.js" defer></script>
    <!-- material class: samplers, uniforms and shader sources -->
    <script src="src/materials/Material.js" defer></script>
    <!-- creates a sampler (texture) from an image -->
    <script src="src/textures/Texture.js" defer></script>
    <!-- a few built-in geometries -->
    <script src="src/objects/Mesh.js" defer></script>
    <!-- loads models via OBJLoader and creates their GPU resources -->
    <script src="src/loads/loadOBJ.js" defer></script>
    <!-- shader loading -->
    <script src="src/loads/loadShader.js" defer></script>
    <!-- light class; for now it looks like a directional light -->
    <script src="src/lights/Light.js" defer></script>
    <!-- point light -->
    <script src="src/lights/PointLight.js" defer></script>
    <!-- renderer class -->
    <script src="src/renderers/WebGLRenderer.js" defer></script>
    <!-- binds a model's GPU resources and draws it -->
    <script src="src/renderers/MeshRender.js" defer></script>
    <!-- engine entry point -->
    <script src="src/engine.js" defer></script>
</head>

<body>
    <!-- the drawing canvas, i.e. the render window -->
    <canvas id="glcanvas"></canvas>

</body>

</html>

engine.js

  • engine.js is the program entry point
//global variable: camera position
var cameraPosition = [-20, 180, 250];
//run immediately once the script has loaded
GAMES202Main();

function GAMES202Main() {
	//document.querySelector() is standard DOM API; here it looks up the canvas element with id glcanvas
	const canvas = document.querySelector('#glcanvas');
	//size the canvas's drawing buffer to the screen resolution
	canvas.width = window.screen.width;
	canvas.height = window.screen.height;
	//get the WebGL context
	const gl = canvas.getContext('webgl');
	if (!gl) {
		alert('Unable to initialize WebGL. Your browser or machine may not support it.');
		return;
	}
	//create the camera and its controls; this is boilerplate, so no detailed comments
	const camera = new THREE.PerspectiveCamera(75, gl.canvas.clientWidth / gl.canvas.clientHeight, 0.1, 1000);
	const cameraControls = new THREE.OrbitControls(camera, canvas);
	cameraControls.enableZoom = true;
	cameraControls.enableRotate = true;
	cameraControls.enablePan = true;
	cameraControls.rotateSpeed = 0.3;
	cameraControls.zoomSpeed = 1.0;
	cameraControls.panSpeed = 2.0;
	//helper for refreshing the camera projection
	function setSize(width, height) {
		camera.aspect = width / height;
		camera.updateProjectionMatrix();
	}
	setSize(canvas.clientWidth, canvas.clientHeight);
	//reset the camera projection matrix on resize (note: the canvas drawing buffer itself is not resized; see the sketch after this file)
	window.addEventListener('resize', () => setSize(canvas.clientWidth, canvas.clientHeight));
	//camera eye
	camera.position.set(cameraPosition[0], cameraPosition[1], cameraPosition[2]);
	//camera target
	cameraControls.target.set(0, 1, 0);
	//create a point light
	const pointLight = new PointLight(250, [1, 1, 1]);
	//create the renderer
	const renderer = new WebGLRenderer(gl, camera);
	renderer.addLight(pointLight);
	//load the model
	loadOBJ(renderer, 'assets/mary/', 'Marry');

	var guiParams = {
		modelTransX: 0,
		modelTransY: 0,
		modelTransZ: 0,
		modelScaleX: 52,
		modelScaleY: 52,
		modelScaleZ: 52,
	}
	function createGUI() {
		const gui = new dat.gui.GUI();
		const panelModel = gui.addFolder('Model properties');
		const panelModelTrans = panelModel.addFolder('Translation');
		const panelModelScale = panelModel.addFolder('Scale');
		panelModelTrans.add(guiParams, 'modelTransX').name('X');
		panelModelTrans.add(guiParams, 'modelTransY').name('Y');
		panelModelTrans.add(guiParams, 'modelTransZ').name('Z');
		panelModelScale.add(guiParams, 'modelScaleX').name('X');
		panelModelScale.add(guiParams, 'modelScaleY').name('Y');
		panelModelScale.add(guiParams, 'modelScaleZ').name('Z');
		panelModel.open();
		panelModelTrans.open();
		panelModelScale.open();
	}
	//create the GUI
	createGUI();
	//render loop
	function mainLoop(now) {
		cameraControls.update();

		renderer.render(guiParams);
		requestAnimationFrame(mainLoop);
	}
	requestAnimationFrame(mainLoop);
}
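
One thing worth noting: the resize listener above only updates the camera's aspect ratio; the canvas drawing buffer stays at the initial screen resolution and the GL viewport is never adjusted. A possible refinement (my own sketch, not part of the homework framework):

	//Sketch: also resize the drawing buffer and the viewport when the window changes.
	function onResize() {
		canvas.width = canvas.clientWidth;
		canvas.height = canvas.clientHeight;
		gl.viewport(0, 0, canvas.width, canvas.height);
		camera.aspect = canvas.width / canvas.height;
		camera.updateProjectionMatrix();
	}
	window.addEventListener('resize', onResize);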

loadOBJ.js


function loadOBJ(renderer, path, name) {
	//create the loading manager
	const manager = new THREE.LoadingManager();
	//log loading progress
	manager.onProgress = function (item, loaded, total) {
		console.log(item, loaded, total);
	};
	//progress callback; presumably meant to drive a progress bar
	function onProgress(xhr) {
		if (xhr.lengthComputable) {
			const percentComplete = xhr.loaded / xhr.total * 100;
			console.log('model ' + Math.round(percentComplete, 2) + '% downloaded');
		}
	}
	function onError() { }
	//load the material (.mtl) file
	new THREE.MTLLoader(manager)
		.setPath(path)
		.load(name + '.mtl', function (materials) {
			materials.preload();
			//load the model (.obj) file
			new THREE.OBJLoader(manager)
				.setMaterials(materials)
				.setPath(path)
				.load(name + '.obj', function (object) {
					object.traverse(function (child) {
						if (child.isMesh) {
							let geo = child.geometry;
							let mat;
							if (Array.isArray(child.material)) mat = child.material[0];
							else mat = child.material;

							var indices = Array.from({ length: geo.attributes.position.count }, (v, k) => k);
							//for now only positions, normals and uv are read from the model
							let mesh = new Mesh({ name: 'aVertexPosition', array: geo.attributes.position.array },
								{ name: 'aNormalPosition', array: geo.attributes.normal.array },
								{ name: 'aTextureCoord', array: geo.attributes.uv.array },
								indices);

							let colorMap = null;
							//create the texture resource
							if (mat.map != null) colorMap = new Texture(renderer.gl, mat.map.image);
							// MARK: You can change the myMaterial object to your own Material instance

							let textureSample = 0;
							let myMaterial;
							if (colorMap != null) {
								textureSample = 1;
								//create the material: samplers, uniforms and shader sources
								myMaterial = new Material({
									'uSampler': { type: 'texture', value: colorMap },
									'uTextureSample': { type: '1i', value: textureSample },
									'uKd': { type: '3fv', value: mat.color.toArray() }
								},[],VertexShader, FragmentShader);
							}else{
								myMaterial = new Material({
									'uTextureSample': { type: '1i', value: textureSample },
									'uKd': { type: '3fv', value: mat.color.toArray() }
								},[],VertexShader, FragmentShader);
							}
							//the actual rendering of the mesh is factored out into MeshRender
							let meshRender = new MeshRender(renderer.gl, mesh, myMaterial);
							renderer.addMesh(meshRender);
						}
					});
				}, onProgress, onError);
		});
}
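
The trivial 0..N-1 index list built above exists because THREE's OBJLoader returns non-indexed BufferGeometry. If a geometry did carry an index attribute, it could be reused instead; a small sketch (an assumption about the input, not something the homework assets require):

	//Sketch: prefer the geometry's own index buffer when it exists.
	let indices;
	if (geo.index !== null) {
		indices = Array.from(geo.index.array);
	} else {
		indices = Array.from({ length: geo.attributes.position.count }, (v, k) => k);
	}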

Mesh.js

//a mesh holds vertices, triangle indices, normals and uv
class Mesh {
	constructor(verticesAttrib, normalsAttrib, texcoordsAttrib, indices) {
		this.indices = indices;
		this.count = indices.length;
		this.hasVertices = false;
		this.hasNormals = false;
		this.hasTexcoords = false;
		let extraAttribs = [];

		if (verticesAttrib != null) {
			this.hasVertices = true;
			this.vertices = verticesAttrib.array;
			this.verticesName = verticesAttrib.name;
		}
		if (normalsAttrib != null) {
			this.hasNormals = true;
			this.normals = normalsAttrib.array;
			this.normalsName = normalsAttrib.name;
		}
		if (texcoordsAttrib != null) {
			this.hasTexcoords = true;
			this.texcoords = texcoordsAttrib.array;
			this.texcoordsName = texcoordsAttrib.name;
		}
	}
	//a built-in cube
	static cube() {
		const positions = [
			// Front face
			-1.0, -1.0, 1.0,
			1.0, -1.0, 1.0,
			1.0, 1.0, 1.0,
			-1.0, 1.0, 1.0,

			// Back face
			-1.0, -1.0, -1.0,
			-1.0, 1.0, -1.0,
			1.0, 1.0, -1.0,
			1.0, -1.0, -1.0,

			// Top face
			-1.0, 1.0, -1.0,
			-1.0, 1.0, 1.0,
			1.0, 1.0, 1.0,
			1.0, 1.0, -1.0,

			// Bottom face
			-1.0, -1.0, -1.0,
			1.0, -1.0, -1.0,
			1.0, -1.0, 1.0,
			-1.0, -1.0, 1.0,

			// Right face
			1.0, -1.0, -1.0,
			1.0, 1.0, -1.0,
			1.0, 1.0, 1.0,
			1.0, -1.0, 1.0,

			// Left face
			-1.0, -1.0, -1.0,
			-1.0, -1.0, 1.0,
			-1.0, 1.0, 1.0,
			-1.0, 1.0, -1.0,
		];
		const indices = [
			0, 1, 2, 0, 2, 3,    // front
			4, 5, 6, 4, 6, 7,    // back
			8, 9, 10, 8, 10, 11,   // top
			12, 13, 14, 12, 14, 15,   // bottom
			16, 17, 18, 16, 18, 19,   // right
			20, 21, 22, 20, 22, 23,   // left
		];
		return new Mesh({ name: 'aVertexPosition', array: new Float32Array(positions) }, null, null, indices);
	}
}
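
For reference, the built-in cube can be drawn through the same path as the loaded model. A usage sketch (assuming gl and camera are set up as in engine.js and the LightCube shaders from InternalShader.js are available):

	//Sketch: render the built-in cube with the light-cube shaders.
	const cubeMesh = Mesh.cube();
	const cubeMaterial = new Material({
		'uLigIntensity': { type: '1f', value: 1.0 },
		'uLightColor': { type: '3fv', value: [1.0, 1.0, 1.0] }
	}, [], LightCubeVertexShader, LightCubeFragmentShader);
	const cubeRender = new MeshRender(gl, cubeMesh, cubeMaterial);
	cubeRender.draw(camera, new TRSTransform([0, 0, 0], [10, 10, 10]));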

Texture.js

class Texture {
    constructor(gl, img) {
    	//texture creation is almost identical to OpenGL
    	//texture id
        this.texture = gl.createTexture();
        gl.bindTexture(gl.TEXTURE_2D, this.texture);

        // Because images have to be download over the internet
        // they might take a moment until they are ready.
        // Until then put a single pixel in the texture so we can
        // use it immediately. When the image has finished downloading
        // we'll update the texture with the contents of the image.
        const level = 0;
        const internalFormat = gl.RGBA;
        const width = 1;
        const height = 1;
        const border = 0;
        const srcFormat = gl.RGBA;
        const srcType = gl.UNSIGNED_BYTE;
        const pixel = new Uint8Array([0, 0, 255, 255]); // opaque blue
        //start with a 1x1 default texture
        gl.texImage2D(gl.TEXTURE_2D, level, internalFormat,
            width, height, border, srcFormat, srcType,
            pixel);

        gl.bindTexture(gl.TEXTURE_2D, this.texture);
        gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
        //upload the loaded image
        gl.texImage2D(gl.TEXTURE_2D, level, internalFormat,
            srcFormat, srcType, img);

        // WebGL1 has different requirements for power of 2 images
        // vs non power of 2 images so check if the image is a
        // power of 2 in both dimensions.
        if (isPowerOf2(img.width) && isPowerOf2(img.height)) {
            // Yes, it's a power of 2. Generate mips.
            //generate mipmaps
            gl.generateMipmap(gl.TEXTURE_2D);
        } else {
            // No, it's not a power of 2. Turn off mips and set
            // wrapping to clamp to edge.
            // Question: why are the filter/wrap parameters only set when mipmaps are NOT
            // generated? Don't all textures need them? (answered after this file)
            // Note: WebGL1 requires CLAMP_TO_EDGE for non-power-of-two textures,
            // and gl.REPEATE (as originally written here) does not exist.
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
        }
        gl.bindTexture(gl.TEXTURE_2D, null);
    }
}

function isPowerOf2(value) {
    return (value & (value - 1)) == 0;
}
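
To answer the question left in the comment above: WebGL's default TEXTURE_MIN_FILTER is NEAREST_MIPMAP_LINEAR, which requires a complete mipmap chain, so the power-of-two branch is already valid after generateMipmap; a non-power-of-two texture has no mipmaps (and WebGL1 does not allow REPEAT wrapping for it), so its filter and wrap modes must be overridden. Setting the state explicitly in both branches is arguably clearer; a sketch (my own variant, not the framework code):

        //Sketch: explicit sampler state for both the POT and NPOT case.
        if (isPowerOf2(img.width) && isPowerOf2(img.height)) {
            gl.generateMipmap(gl.TEXTURE_2D);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
        } else {
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
        }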

Material.js

class Material {
    #flatten_uniforms;
    #flatten_attribs;
    #vsSrc;
    #fsSrc;
    // Uniforms is a map, attribs is a Array
    constructor(uniforms, attribs, vsSrc, fsSrc) {
        this.uniforms = uniforms;
        this.attribs = attribs;
        this.#vsSrc = vsSrc;
        this.#fsSrc = fsSrc;
        //the material keeps a flattened list of uniform names (the built-ins plus its own)
        this.#flatten_uniforms = ['uModelViewMatrix', 'uProjectionMatrix', 'uCameraPos', 'uLightPos'];
        for (let k in uniforms) {
            this.#flatten_uniforms.push(k);
        }
        this.#flatten_attribs = attribs;
    }

    setMeshAttribs(extraAttribs) {
        for (let i = 0; i < extraAttribs.length; i++) {
            this.#flatten_attribs.push(extraAttribs[i]);
        }
    }

    compile(gl) {
    	//compile the shaders; the material's uniforms are passed in so their locations can be resolved, along with any extra attributes
        return new Shader(gl, this.#vsSrc, this.#fsSrc,
            {
                uniforms: this.#flatten_uniforms,
                attribs: this.#flatten_attribs
            });
    }
}

Shader.js

class Shader {
	//shader compilation is almost the same as in OpenGL
    constructor(gl, vsSrc, fsSrc, shaderLocations) {
        this.gl = gl;
        //compile the shaders
        const vs = this.compileShader(vsSrc, gl.VERTEX_SHADER);
        const fs = this.compileShader(fsSrc, gl.FRAGMENT_SHADER);
		//resolve uniform and attribute locations
        this.program = this.addShaderLocations({
        	//link the shaders
            glShaderProgram: this.linkShader(vs, fs),
        }, shaderLocations);
    }

    compileShader(shaderSource, shaderType) {
        const gl = this.gl;
        //create the shader object
        var shader = gl.createShader(shaderType);
        //supply the shader source string
        gl.shaderSource(shader, shaderSource);
        //compile
        gl.compileShader(shader);
		//report compile errors
        if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
            console.error(shaderSource);
            console.error(('shader compiler error:\n' + gl.getShaderInfoLog(shader)));
        }

        return shader;
    };

    linkShader(vs, fs) {
        const gl = this.gl;
        //link the shaders into a program
        var prog = gl.createProgram();
        gl.attachShader(prog, vs);
        gl.attachShader(prog, fs);
        gl.linkProgram(prog);
		//report link errors (note: abort() is not a standard browser function; console.error or throw would be more appropriate)
        if (!gl.getProgramParameter(prog, gl.LINK_STATUS)) {
            abort('shader linker error:\n' + gl.getProgramInfoLog(prog));
        }
        return prog;
    };
	//look up and store the uniform and attribute locations
    addShaderLocations(result, shaderLocations) {
        const gl = this.gl;
        result.uniforms = {};
        result.attribs = {};

        if (shaderLocations && shaderLocations.uniforms && shaderLocations.uniforms.length) {
            for (let i = 0; i < shaderLocations.uniforms.length; ++i) {
                result.uniforms = Object.assign(result.uniforms, {
                    [shaderLocations.uniforms[i]]: gl.getUniformLocation(result.glShaderProgram, shaderLocations.uniforms[i]),
                });
                //console.log(gl.getUniformLocation(result.glShaderProgram, 'uKd'));
            }
        }
        if (shaderLocations && shaderLocations.attribs && shaderLocations.attribs.length) {
            for (let i = 0; i < shaderLocations.attribs.length; ++i) {
                result.attribs = Object.assign(result.attribs, {
                    [shaderLocations.attribs[i]]: gl.getAttribLocation(result.glShaderProgram, shaderLocations.attribs[i]),
                });
            }
        }
        
        return result;
    }
}
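
Material.compile() above is the only place a Shader gets constructed; building one directly would look like this (a sketch reusing the InternalShader strings, assuming a gl context is available):

    //Sketch: construct and use a Shader without going through Material.
    const shader = new Shader(gl, LightCubeVertexShader, LightCubeFragmentShader, {
        uniforms: ['uModelViewMatrix', 'uProjectionMatrix', 'uLightColor'],
        attribs: ['aVertexPosition']
    });
    gl.useProgram(shader.program.glShaderProgram);
    console.log(shader.program.uniforms, shader.program.attribs); // resolved locations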

MeshRender.js


class MeshRender {

	#vertexBuffer;
	#normalBuffer;
	#texcoordBuffer;
	#indicesBuffer;
	
	constructor(gl, mesh, material) {
		this.gl = gl;
		this.mesh = mesh;
		this.material = material;
		//create the buffers
		this.#vertexBuffer = gl.createBuffer();
		this.#normalBuffer = gl.createBuffer();
		this.#texcoordBuffer = gl.createBuffer();
		this.#indicesBuffer = gl.createBuffer();

		let extraAttribs = []
		if (mesh.hasVertices) {
			//upload vertex positions
			extraAttribs.push(mesh.verticesName);
			gl.bindBuffer(gl.ARRAY_BUFFER, this.#vertexBuffer);
			gl.bufferData(gl.ARRAY_BUFFER, mesh.vertices, gl.STATIC_DRAW);
			gl.bindBuffer(gl.ARRAY_BUFFER, null);
		}

		if (mesh.hasNormals) {
			//upload normals
			extraAttribs.push(mesh.normalsName);
			gl.bindBuffer(gl.ARRAY_BUFFER, this.#normalBuffer);
			gl.bufferData(gl.ARRAY_BUFFER, mesh.normals, gl.STATIC_DRAW);
			gl.bindBuffer(gl.ARRAY_BUFFER, null);
		}

		if (mesh.hasTexcoords) {
			//upload uv coordinates
			extraAttribs.push(mesh.texcoordsName);
			gl.bindBuffer(gl.ARRAY_BUFFER, this.#texcoordBuffer);
			gl.bufferData(gl.ARRAY_BUFFER, mesh.texcoords, gl.STATIC_DRAW);
			gl.bindBuffer(gl.ARRAY_BUFFER, null);
		}
		//upload indices
		gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.#indicesBuffer);
		gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(mesh.indices), gl.STATIC_DRAW);
		gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, null);
		//register the extra attribute names with the material
		this.material.setMeshAttribs(extraAttribs);
		//the shader is compiled after the model data has been uploaded
		this.shader = this.material.compile(gl);
	}
	//draw call for this mesh
	draw(camera, transform) {
		const gl = this.gl;

		let modelViewMatrix = mat4.create();
		let projectionMatrix = mat4.create();
		//update the model-view and projection matrices
		camera.updateMatrixWorld();
		mat4.invert(modelViewMatrix, camera.matrixWorld.elements);
		mat4.translate(modelViewMatrix, modelViewMatrix, transform.translate);
		mat4.scale(modelViewMatrix, modelViewMatrix, transform.scale);
		mat4.copy(projectionMatrix, camera.projectionMatrix.elements);
		//bind vertex positions
		if (this.mesh.hasVertices) {
			const numComponents = 3;
			const type = gl.FLOAT;
			const normalize = false;
			const stride = 0;
			const offset = 0;
			gl.bindBuffer(gl.ARRAY_BUFFER, this.#vertexBuffer);
			gl.vertexAttribPointer(
				this.shader.program.attribs[this.mesh.verticesName],
				numComponents,
				type,
				normalize,
				stride,
				offset);
			gl.enableVertexAttribArray(
				this.shader.program.attribs[this.mesh.verticesName]);
		}
		//bind normals
		if (this.mesh.hasNormals) {
			const numComponents = 3;
			const type = gl.FLOAT;
			const normalize = false;
			const stride = 0;
			const offset = 0;
			gl.bindBuffer(gl.ARRAY_BUFFER, this.#normalBuffer);
			gl.vertexAttribPointer(
				this.shader.program.attribs[this.mesh.normalsName],
				numComponents,
				type,
				normalize,
				stride,
				offset);
			gl.enableVertexAttribArray(
				this.shader.program.attribs[this.mesh.normalsName]);
		}
		//bind uv coordinates
		if (this.mesh.hasTexcoords) {
			const numComponents = 2;
			const type = gl.FLOAT;
			const normalize = false;
			const stride = 0;
			const offset = 0;
			gl.bindBuffer(gl.ARRAY_BUFFER, this.#texcoordBuffer);
			gl.vertexAttribPointer(
				this.shader.program.attribs[this.mesh.texcoordsName],
				numComponents,
				type,
				normalize,
				stride,
				offset);
			gl.enableVertexAttribArray(
				this.shader.program.attribs[this.mesh.texcoordsName]);
		}
		//bind the index buffer
		gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.#indicesBuffer);
		//use the shader program
		gl.useProgram(this.shader.program.glShaderProgram);
		//upload the MVP matrices
		gl.uniformMatrix4fv(
			this.shader.program.uniforms.uProjectionMatrix,
			false,
			projectionMatrix);
		gl.uniformMatrix4fv(
			this.shader.program.uniforms.uModelViewMatrix,
			false,
			modelViewMatrix);

		// Specific the camera uniforms
		gl.uniform3fv(
			this.shader.program.uniforms.uCameraPos,
			[camera.position.x, camera.position.y, camera.position.z]);
		//upload the material uniforms
		for (let k in this.material.uniforms) {
			if (this.material.uniforms[k].type == 'matrix4fv') {
				gl.uniformMatrix4fv(
					this.shader.program.uniforms[k],
					false,
					this.material.uniforms[k].value);
			} else if (this.material.uniforms[k].type == '3fv') {
				gl.uniform3fv(
					this.shader.program.uniforms[k],
					this.material.uniforms[k].value);
			} else if (this.material.uniforms[k].type == '1f') {
				gl.uniform1f(
					this.shader.program.uniforms[k],
					this.material.uniforms[k].value);
			} else if (this.material.uniforms[k].type == '1i') {
				gl.uniform1i(
					this.shader.program.uniforms[k],
					this.material.uniforms[k].value);
			} else if (this.material.uniforms[k].type == 'texture') {
				gl.activeTexture(gl.TEXTURE0);
				gl.bindTexture(gl.TEXTURE_2D, this.material.uniforms[k].value.texture);
				gl.uniform1i(this.shader.program.uniforms[k], 0);
			}
		}

		{
			const vertexCount = this.mesh.count;
			const type = gl.UNSIGNED_SHORT;
			const offset = 0;
			//draw the triangles
			gl.drawElements(gl.TRIANGLES, vertexCount, type, offset);
		}
	}
}

WebGLRenderer.js

// Rotation is still missing; only translation and scale are handled.
class TRSTransform {
	//translate holds the object's position (the light position for light gizmos, or the model translation)
    constructor(translate = [0, 0, 0], scale = [1, 1, 1]) {
        this.translate = translate;
        this.scale = scale;
    }
}

class WebGLRenderer {
    meshes = [];
    lights = [];
	//store the gl context and the camera
    constructor(gl, camera) {
        this.gl = gl;
        this.camera = camera;
    }
	//Why is a MeshRender created when a light is added? So the light itself can be drawn as a small gizmo mesh (light.mesh / light.mat).
    addLight(light) { this.lights.push({ entity: light, meshRender: new MeshRender(this.gl, light.mesh, light.mat) }); }

    addMesh(mesh) { this.meshes.push(mesh); }
	//render loop
    render(guiParams) {
        const gl = this.gl;
		//same as OpenGL
        gl.clearColor(0.0, 0.0, 0.0, 1.0); // Clear to black, fully opaque
        gl.clearDepth(1.0); // Clear everything
        gl.enable(gl.DEPTH_TEST); // Enable depth testing
        gl.depthFunc(gl.LEQUAL); // Near things obscure far things

        gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

        // Handle light
        //the light moves on its own: its position is derived from the current time
        const timer = Date.now() * 0.00025;
        let lightPos = [ Math.sin(timer * 6) * 100, 
                         Math.cos(timer * 4) * 150, 
                         Math.cos(timer * 2) * 100 ];

        if (this.lights.length != 0) {
            for (let l = 0; l < this.lights.length; l++) {
                let trans = new TRSTransform(lightPos);
                //Why does the number of draw calls scale with the lights? Each mesh is drawn once per light (forward multi-pass); see the note after this file.
                this.lights[l].meshRender.draw(this.camera, trans);

                for (let i = 0; i < this.meshes.length; i++) {
                    const mesh = this.meshes[i];
					//read the translation and scale the user set in the GUI (personally I'd also add a rotation matrix here)
                    const modelTranslation = [guiParams.modelTransX, guiParams.modelTransY, guiParams.modelTransZ];
                    const modelScale = [guiParams.modelScaleX, guiParams.modelScaleY, guiParams.modelScaleZ];
                    let meshTrans = new TRSTransform(modelTranslation, modelScale);
                    //use the shader program
                    this.gl.useProgram(mesh.shader.program.glShaderProgram);
                    //upload the light position uniform
                    this.gl.uniform3fv(mesh.shader.program.uniforms.uLightPos, lightPos);
                    mesh.draw(this.camera, meshTrans);
                }
            }
        } else {
            // Handle meshes (no light)
            //with no lights, just draw the meshes directly
            for (let i = 0; i < this.meshes.length; i++) {
                const mesh = this.meshes[i];
                let trans = new TRSTransform();
                mesh.draw(this.camera, trans);
            }
        }
    }
}
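
On the two questions left in the comments: a MeshRender is created per light so the light itself can be drawn as a small emissive cube, and nesting the mesh loop inside the light loop effectively gives forward multi-pass lighting (every mesh is drawn once per light with that light's uLightPos bound). With a single point light this is equivalent to two flat loops; a sketch of that flatter structure (my own rewrite, only equivalent for one light):

        //Sketch: flattened render loop, equivalent to the nested version for a single light.
        for (let l = 0; l < this.lights.length; l++) {
            this.lights[l].meshRender.draw(this.camera, new TRSTransform(lightPos));
        }
        for (let i = 0; i < this.meshes.length; i++) {
            const mesh = this.meshes[i];
            const meshTrans = new TRSTransform(
                [guiParams.modelTransX, guiParams.modelTransY, guiParams.modelTransZ],
                [guiParams.modelScaleX, guiParams.modelScaleY, guiParams.modelScaleZ]);
            gl.useProgram(mesh.shader.program.glShaderProgram);
            gl.uniform3fv(mesh.shader.program.uniforms.uLightPos, lightPos);
            mesh.draw(this.camera, meshTrans);
        }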

InternalShader.js

  • These are shader source strings: one pair for shading the light cube and one for the model (the model FS simply outputs the texture sample, or uKd, as gl_FragColor)
//VS for the light cube
const LightCubeVertexShader = `
attribute vec3 aVertexPosition;

uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;


void main(void) {

  gl_Position = uProjectionMatrix * uModelViewMatrix * vec4(aVertexPosition, 1.0);

}
`;
//FS for the light cube
const LightCubeFragmentShader = `
#ifdef GL_ES
precision mediump float;
#endif

uniform float uLigIntensity;
uniform vec3 uLightColor;

void main(void) {
    
  //gl_FragColor = vec4(1,1,1, 1.0);
  gl_FragColor = vec4(uLightColor, 1.0);
}
`;
//VS for the model
const VertexShader = `
attribute vec3 aVertexPosition;
attribute vec3 aNormalPosition;
attribute vec2 aTextureCoord;

uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;

varying highp vec3 vFragPos;
varying highp vec3 vNormal;
varying highp vec2 vTextureCoord;

void main(void) {

  vFragPos = aVertexPosition;
  vNormal = aNormalPosition;

  gl_Position = uProjectionMatrix * uModelViewMatrix * vec4(aVertexPosition, 1.0);

  vTextureCoord = aTextureCoord;

}
`;
//FS for the model; the color is just the sampled texture (or uKd)
const FragmentShader = `
#ifdef GL_ES
precision mediump float;
#endif

uniform int uTextureSample;
uniform vec3 uKd;
uniform sampler2D uSampler;
uniform vec3 uLightPos;
uniform vec3 uCameraPos;

varying highp vec3 vFragPos;
varying highp vec3 vNormal;
varying highp vec2 vTextureCoord;

void main(void) {
  
  if (uTextureSample == 1) {
    gl_FragColor = texture2D(uSampler, vTextureCoord);
  } else {
    gl_FragColor = vec4(uKd,1);
  }

}
`;

lightShader

  • VS. As you can see, attributes bind the VBO data and uniforms work just like in OpenGL.
attribute vec3 aVertexPosition;

uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;


void main(void) {

  gl_Position = uProjectionMatrix * uModelViewMatrix * vec4(aVertexPosition, 1.0);

}
  • FS.
#ifdef GL_ES
//use medium-precision floats
precision mediump float;
#endif

uniform float uLigIntensity;
uniform vec3 uLightColor;

void main(void) {
    
  //gl_FragColor = vec4(1,1,1, 1.0);
  gl_FragColor = vec4(uLightColor, 1.0);
}

phongShader

  • VS
attribute vec3 aVertexPosition;
attribute vec3 aNormalPosition;
attribute vec2 aTextureCoord;

uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;
//varyings pass values from the vertex shader to the fragment shader
//declared here with high precision
varying highp vec2 vTextureCoord;
varying highp vec3 vFragPos;
varying highp vec3 vNormal;


void main(void) {

  vFragPos = aVertexPosition;
  vNormal = aNormalPosition;

  gl_Position = uProjectionMatrix * uModelViewMatrix * vec4(aVertexPosition, 1.0);

  vTextureCoord = aTextureCoord;

}
  • FS
#ifdef GL_ES
precision mediump float;
#endif
uniform sampler2D uSampler;
uniform vec3 uKd;
uniform vec3 uKs;
uniform vec3 uLightPos;
uniform vec3 uCameraPos;
uniform float uLightIntensity;
uniform int uTextureSample;

varying highp vec2 vTextureCoord;
varying highp vec3 vFragPos;
varying highp vec3 vNormal;
//The most basic of basics; note that with reflect() this is actually classic Phong rather than Blinn-Phong (see the sketch after this shader), so I will skip the explanation
void main(void) {
  vec3 color;
  if (uTextureSample == 1) {
    color = pow(texture2D(uSampler, vTextureCoord).rgb, vec3(2.2));
  } else {
    color = uKd;
  }
  
  vec3 ambient = 0.05 * color;

  vec3 lightDir = normalize(uLightPos - vFragPos);
  vec3 normal = normalize(vNormal);
  float diff = max(dot(lightDir, normal), 0.0);
  float light_atten_coff = uLightIntensity / length(uLightPos - vFragPos);
  vec3 diffuse =  diff * light_atten_coff * color;

  vec3 viewDir = normalize(uCameraPos - vFragPos);
  float spec = 0.0;
  vec3 reflectDir = reflect(-lightDir, normal);
  spec = pow (max(dot(viewDir, reflectDir), 0.0), 35.0);
  vec3 specular = uKs * light_atten_coff * spec;  
  
  gl_FragColor = vec4(pow((ambient + diffuse + specular), vec3(1.0/2.2)), 1.0);

}
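
As the comment notes, the specular term above uses reflect(), which is classic Phong. The Blinn-Phong variant replaces the reflection vector with the half vector; a sketch of the swapped-in lines (same exponent kept for comparison):

  // Blinn-Phong alternative (sketch): half vector instead of the reflection vector.
  vec3 halfDir = normalize(lightDir + viewDir);
  spec = pow(max(dot(normal, halfDir), 0.0), 35.0);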

Summary

   The WebGL development environment is easy to set up. This post was mainly about getting familiar with the homework's WebGL framework, which relies on Three.js for model loading and camera control, while the gl calls themselves are almost identical to OpenGL. Having looked at a few open-source rendering engines before, I find this framework a bit odd, but for learning purposes it is simple and easy to follow!
