Result:
This time I ran into two main difficulties:
1. How to do the tangent-space transform so that the normals stay correct even when the mesh is transformed. (In the screenshot the plane has been rotated 90° around the x-axis.)
2. Computing the lighting correctly.
Exploration
At first there seemed to be three possible approaches:
1. Pass the tangents in as an attribute; this is the method used by the learn-opengl tutorial.
2. Do it in the vertex shader; there is a blog post about this (http://www.zwqxin.com/archives/shaderglsl/review-normal-map-bump-map.html) but I couldn't follow it.
3. Do it in the fragment shader:
https://github.com/mrdoob/three.js/blob/6e89128f1ae239f29f2124a43133bb3d767b19bf/src/renderers/shaders/ShaderChunk/normalmap_pars_fragment.glsl
The underlying idea is explained here: http://hacksoflife.blogspot.com/2009/11/per-pixel-tangent-space-normal-mapping.html
https://github.com/mrdoob/three.js/issues/7094 is the thread discussing three.js's tangent-space support; it existed until around r70 but has since been removed.
At first the attribute approach felt too heavyweight: to do it (in the style of the learn-opengl tutorial) I would have to read the geometry's face data, compute the tangents, and then build a new BufferGeometry with a custom attribute.
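As a rough sketch of what that would look like (plain JavaScript; `computeTangent` is a hypothetical helper, not a three.js API), the per-face tangent can be solved from the triangle's edge vectors and UV deltas:

```javascript
// Solve T from the standard relation  E1 = dU1*T + dV1*B,  E2 = dU2*T + dV2*B,
// where E1/E2 are triangle edges and (dU, dV) are the matching UV deltas.
// This is the per-face computation the learn-opengl tutorial performs before
// packing the result into a vertex attribute.
function computeTangent(p0, p1, p2, uv0, uv1, uv2) {
  const e1 = [p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]];
  const e2 = [p2[0] - p0[0], p2[1] - p0[1], p2[2] - p0[2]];
  const du1 = uv1[0] - uv0[0], dv1 = uv1[1] - uv0[1];
  const du2 = uv2[0] - uv0[0], dv2 = uv2[1] - uv0[1];
  const f = 1.0 / (du1 * dv2 - du2 * dv1); // determinant of the 2x2 UV matrix
  return [
    f * (dv2 * e1[0] - dv1 * e2[0]),
    f * (dv2 * e1[1] - dv1 * e2[1]),
    f * (dv2 * e1[2] - dv1 * e2[2]),
  ];
}
```

Each face's tangent would then be written into a custom BufferAttribute, averaging at shared vertices.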
After a couple of days of digging, though, it turns out three.js uses the third approach: it computes the TBN matrix in the fragment shader using the built-in GLSL derivative functions dFdx and dFdy.
This is the code I started out with:
// Per-Pixel Tangent Space Normal Mapping
// http://hacksoflife.blogspot.ch/2009/11/per-pixel-tangent-space-normal-mapping.html
vec3 perturbNormal2Arb( vec3 eye_pos, vec3 surf_norm ) {
// Workaround for Adreno 3XX dFd*( vec3 ) bug. See #9988
vec3 q0 = vec3( dFdx( eye_pos.x ), dFdx( eye_pos.y ), dFdx( eye_pos.z ) );
vec3 q1 = vec3( dFdy( eye_pos.x ), dFdy( eye_pos.y ), dFdy( eye_pos.z ) );
vec2 st0 = dFdx( vUv.st );
vec2 st1 = dFdy( vUv.st );
float scale = sign( st1.t * st0.s - st0.t * st1.s ); // we do not care about the magnitude
vec3 S = normalize( ( q0 * st1.t - q1 * st0.t ) * scale );
vec3 T = normalize( ( - q0 * st1.s + q1 * st0.s ) * scale );
vec3 N = normalize( surf_norm );
mat3 tsn = mat3( S, T, N );
vec3 mapN = texture2D( normalMap, vUv ).xyz * 2.0 - 1.0;
mapN.xy *= normalScale;
mapN.xy *= ( float( gl_FrontFacing ) * 2.0 - 1.0 );
return normalize( tsn * mapN );
}
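To convince myself the S/T formulas do what they claim, here is a CPU re-derivation in plain JavaScript. The derivative inputs are hand-picked values that an axis-aligned, uv-aligned plane would produce (in the real shader they come from dFdx/dFdy), so this is a sanity check, not the shader itself:

```javascript
// CPU re-derivation of the S/T formulas from perturbNormal2Arb, checking that
// S ends up along the u direction and T along the v direction for a plane
// whose position and uv axes coincide.
const scale3 = (v, s) => [v[0] * s, v[1] * s, v[2] * s];
const sub3 = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];

function tangentFrame(q0, q1, st0, st1) {
  // sign of the uv-area term; the magnitude is discarded, only orientation matters
  const scale = Math.sign(st1[1] * st0[0] - st0[1] * st1[0]);
  const S = scale3(sub3(scale3(q0, st1[1]), scale3(q1, st0[1])), scale); // ( q0*st1.t - q1*st0.t ) * scale
  const T = scale3(sub3(scale3(q1, st0[0]), scale3(q0, st1[0])), scale); // ( -q0*st1.s + q1*st0.s ) * scale
  return { S, T };
}
```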
Searching the GitHub repo, the code that calls this function is at
https://github.com/mrdoob/three.js/blob/6e89128f1ae239f29f2124a43133bb3d767b19bf/src/renderers/shaders/ShaderChunk/normal_fragment_maps.glsl#L23
normal = perturbNormal2Arb( -vViewPosition, normal );
where vViewPosition = -modelViewPosition:
// vertex shader
modelViewPosition = modelViewMatrix * vec4(position, 1.0);
At this point I hadn't yet discovered the lighting bug, and since I couldn't really follow this method, I (somewhat haphazardly) found another piece of code:
https://github.com/mrdoob/three.js/blob/f0936b0c3e4d050dc412b5b922e38400d54f4010/examples/js/ShaderSkin.js#L426
It implements a skin shader and is well worth reading in its own right: it shows how to compute lighting by hand and how to implement normal mapping manually, and it is what I based my version on. Its tangent-space code is also easier to understand, though for now I only have a rough picture of it; why -vViewPosition is passed in, and why the tangent is computed this way, still needs further study:
// normal mapping
"vec4 posAndU = vec4( -vViewPosition, vUv.x );",
"vec4 posAndU_dx = dFdx( posAndU ), posAndU_dy = dFdy( posAndU );",
"vec3 tangent = posAndU_dx.w * posAndU_dx.xyz + posAndU_dy.w * posAndU_dy.xyz;",
"vec3 normal = normalize( vNormal );",
"vec3 binormal = normalize( cross( tangent, normal ) );",
"tangent = cross( normal, binormal );", // no normalization required
"mat3 tsb = mat3( tangent, binormal, normal );",
"vec3 normalTex = texture2D( tNormal, vUv ).xyz * 2.0 - 1.0;",
"normalTex.xy *= uNormalScale;",
"normalTex = normalize( normalTex );",
"vec3 finalNormal = tsb * normalTex;",
"normal = normalize( finalNormal );",
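The tangent line above is a chain-rule trick: since posAndU packs (position, u) into one vec4, posAndU_dx.w is du/dx and posAndU_dy.w is du/dy, so the sum is the direction in which u grows fastest, i.e. the u-axis tangent. A small JavaScript check with hand-picked derivative values (assumed; the shader obtains them from dFdx/dFdy):

```javascript
// tangent = (du/dx) * dpos/dx + (du/dy) * dpos/dy
// posAndU_dx and posAndU_dy are [x, y, z, u] derivative vectors.
function skinTangent(posAndU_dx, posAndU_dy) {
  const t = [];
  for (let i = 0; i < 3; i++) {
    t[i] = posAndU_dx[3] * posAndU_dx[i] + posAndU_dy[3] * posAndU_dy[i];
  }
  return t;
}
```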
Note vNormal here: it is computed in the vertex shader as follows, so it lives in view space, and all subsequent computation happens in view space:
//normalMatrix = inverse transpose of modelViewMatrix
vNormal = normalize( normalMatrix * normal );
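Why the inverse transpose? Under a non-uniform scale, transforming a normal by the model-view matrix itself breaks its perpendicularity to the surface, while the inverse transpose preserves it. A minimal check in JavaScript, using diagonal matrices so the inverse transpose is just the reciprocal of each diagonal entry:

```javascript
// Under a non-uniform scale the plain matrix bends normals off the surface;
// the inverse transpose keeps them perpendicular.
const dot3 = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const mulDiag = (d, v) => [d[0] * v[0], d[1] * v[1], d[2] * v[2]];

const modelScale = [2, 1, 1];                    // non-uniform scale (matrix diagonal)
const normalMat = modelScale.map((s) => 1 / s);  // its "normalMatrix": inverse transpose

const tangent = [1, 1, 0];                       // a direction lying in the surface
const normal = [-1, 1, 0];                       // perpendicular to that tangent

const tScaled = mulDiag(modelScale, tangent);    // [2, 1, 0]
const nWrong = mulDiag(modelScale, normal);      // [-2, 1, 0]: no longer perpendicular
const nRight = mulDiag(normalMat, normal);       // [-0.5, 1, 0]: still perpendicular
```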
The light data three.js passes in is in view space. I didn't notice this and kept using world-space lighting code, which is why it was wrong for so long. Once I finally spotted it, changing the original view position (which I had set to the camera position) to vec3(0.0, 0.0, 0.0) made it correct, since that is where the camera sits in view space.
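The reason vec3(0.0) works: viewMatrix is the inverse of the camera's world transform, so the camera's own position always maps to the origin in view space. A minimal sketch with a translation-only camera (an assumed simplification; a real viewMatrix also rotates):

```javascript
// viewMatrix undoes the camera's world transform, so applying it to the
// camera's own position gives the origin. With a translation-only camera the
// view transform is just a subtraction.
function toViewSpace(cameraPos, worldPoint) {
  return worldPoint.map((c, i) => c - cameraPos[i]);
}

const cameraPos = [3, 2, 5];
const cameraInView = toViewSpace(cameraPos, cameraPos); // [0, 0, 0]

// Hence the specular view direction is just "origin minus fragment position":
const fragView = toViewSpace(cameraPos, [0, 0, 0]);     // a fragment at the world origin
const viewDir = fragView.map((c) => -c);                // vec3(0.0) - fragView, before normalize
```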
Although I still don't fully understand the math, all this tinkering has given me a first real grasp of things like ShaderChunk that used to baffle me.
Code
Only a simplified point light is implemented here, with no distance attenuation.
vertex shader:
varying vec2 vUv;
varying vec3 viewPos;
varying vec3 worldPos;
varying vec3 vNormal;
varying vec3 vViewPosition;
uniform mat4 transform; // unused in this program; it is an identity matrix
// uniform vec3 cameraPosition;
// uniform mat3 normalMatrix; // = inverse transpose of modelViewMatrix
// uniform mat4 viewMatrix;
// uniform mat4 projectionMatrix;
// uniform mat4 modelViewMatrix;
// uniform mat4 modelMatrix;
void main() {
vUv = uv;
worldPos = (viewMatrix*transform*modelMatrix*vec4( position, 1.0 )).xyz; // despite the name, this is the view-space position
// viewPos = cameraPosition; // world-space lighting
viewPos = vec3(0.0,0.0,0.0); // view-space lighting
vNormal = normalize( normalMatrix * normal );
vec4 mvPosition = viewMatrix*transform*modelMatrix*vec4( position, 1.0 );
vViewPosition = -mvPosition.xyz;
gl_Position = projectionMatrix*mvPosition;
}
fragment shader:
The lighting is done by hand, which makes the shader fairly long; one could instead splice in the relevant modules from ShaderChunk:
https://github.com/mrdoob/three.js/blob/dev/src/renderers/shaders/ShaderChunk/lights_fragment_begin.glsl
varying vec2 vUv;
varying vec3 vNormal;
uniform float time;
varying vec3 viewPos;
varying vec3 worldPos;
uniform sampler2D diffuseMap;
uniform sampler2D normalMap;
varying vec3 vViewPosition;
#if NUM_POINT_LIGHTS > 0
struct PointLight {
vec3 color;
vec3 position; // light position, in camera coordinates
float distance; // used for attenuation. Since we're writing our own shader,
                // it can really be anything we want, as long as we assign it
                // to our light's "distance" field.
};
uniform PointLight pointLights[NUM_POINT_LIGHTS];
// uniform vec3 pointLightColor[NUM_POINT_LIGHTS];
// uniform vec3 pointLightPosition[NUM_POINT_LIGHTS];
// uniform float pointLightDistance[NUM_POINT_LIGHTS];
#endif
//
#if NUM_DIR_LIGHTS > 0
uniform vec3 directionalLightColor[NUM_DIR_LIGHTS];
uniform vec3 directionalLightDirection[NUM_DIR_LIGHTS];
struct DirectionalLight {
vec3 direction;
vec3 color;
int shadow;
float shadowBias;
float shadowRadius;
vec2 shadowMapSize;
};
uniform DirectionalLight directionalLights[ NUM_DIR_LIGHTS ];
#endif
uniform vec3 ambientLightColor;
// -----------------------------------
vec3 perturbNormal2Arb( vec3 eye_pos, vec3 surf_norm ) {
vec2 normalScale = vec2(1.0,1.0);
// Workaround for Adreno 3XX dFd*( vec3 ) bug. See #9988
vec3 q0 = vec3( dFdx( eye_pos.x ), dFdx( eye_pos.y ), dFdx( eye_pos.z ) );
vec3 q1 = vec3( dFdy( eye_pos.x ), dFdy( eye_pos.y ), dFdy( eye_pos.z ) );
vec2 st0 = dFdx( vUv.st );
vec2 st1 = dFdy( vUv.st );
float scale = sign( st1.t * st0.s - st0.t * st1.s ); // we do not care about the magnitude
vec3 S = normalize( ( q0 * st1.t - q1 * st0.t ) * scale );
vec3 T = normalize( ( - q0 * st1.s + q1 * st0.s ) * scale );
vec3 N = normalize( surf_norm );
mat3 tsn = mat3( S, T, N );
vec3 mapN = texture2D( normalMap, vUv ).xyz * 2.0 - 1.0;
// mapN.xy *= normalScale;
// mapN.xy *= ( float( gl_FrontFacing ) * 2.0 - 1.0 );
return normalize( tsn * mapN );
// return ( tsn * mapN );
}
mat3 tangentTransform(vec3 vViewPosition) {
// normal mapping
vec4 posAndU = vec4( -vViewPosition, vUv.x );
// tangent is along the u-axis (the horizontal axis of the texture)
vec4 posAndU_dx = dFdx( posAndU ), posAndU_dy = dFdy( posAndU );
vec3 tangent = posAndU_dx.w * posAndU_dx.xyz + posAndU_dy.w * posAndU_dy.xyz;
vec3 normal = normalize( vNormal );
vec3 binormal = normalize( cross( tangent, normal ) );
tangent = cross( normal, binormal ); // no normalization required
mat3 tsb = mat3( tangent, binormal, normal );
return tsb;
}
// -----------------------------------
void main() {
vec4 diffuse = texture2D(diffuseMap, vUv);
vec3 samNorm = texture2D(normalMap, vUv).xyz;
samNorm = samNorm * 2.0 - 1.0;
vec3 normal = 1.0 * samNorm;
// option1
// normal = perturbNormal2Arb( -vViewPosition, normal ); // this also works
// option2
mat3 tsb = tangentTransform( vViewPosition );
// normal.xy *= vNormalScale;
normal = normalize(tsb * normal);
vec4 addedLights = vec4(0.0,0.0,0.0, 1.0);
for(int l = 0; l < NUM_POINT_LIGHTS; l++) {
vec3 lightPos = pointLights[l].position;
vec3 lightColor = pointLights[l].color;
// lightPos = vec3(-10.0,2.0,10.0); // debugging
// lightColor = vec3(0.0,5.0,1.0); // debugging
vec3 lightDir = normalize(lightPos - worldPos);
// diffuse lighting
addedLights.rgb += clamp(dot(lightDir, normal), 0.0, 1.0) * lightColor;
// specular lighting
float specularStrength = 0.8;
vec3 viewDir = normalize(viewPos - vec3(worldPos));
vec3 inlight = -lightDir;
vec3 reflectDir = reflect(inlight, normal);
float spec = pow(max(dot(viewDir, reflectDir), 0.0), 16.0);
vec3 specular = specularStrength * spec * lightColor;
addedLights.rgb += specular;
}
gl_FragColor = mix(vec4(diffuse.x, diffuse.y, diffuse.z, 1.0), addedLights, 0.5);
// gl_FragColor = vec4(0.5,0.5,0.5,1.0);
// gl_FragColor = vec4(pointLights[0].position / length(pointLights[0].position), 1.0);
// gl_FragColor = diffuse;
// gl_FragColor = vec4(normal,1.0);
// gl_FragColor = addedLights;
// gl_FragColor = vec4( directionalLights[0].color, 1.0);
}
As the comments say, both options work, but when light comes in from the side, Option 1 looks like this; you can see it still differs from the plane next to it:
Main program; only the parts unrelated to basic scene setup are shown.
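Attenuation could be added later using the distance field that the PointLight struct above already declares. A hedged sketch of one common falloff in JavaScript (illustrative only; not the exact formula three.js uses internally):

```javascript
// Full intensity at the light, fading to zero at the cutoff distance. The
// same expression could be ported into the point-light loop in the fragment
// shader, with pointLights[l].distance as the cutoff.
function attenuation(lightDistance, cutoff) {
  if (cutoff <= 0.0) return 1.0;                        // 0 means "no cutoff" for three.js lights
  const a = 1.0 - Math.min(lightDistance / cutoff, 1.0);
  return a * a;                                         // squared for a softer falloff
}
```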
function addObjs() {
var plgeo = new THREE.PlaneBufferGeometry(5,10);
var brickmap = new THREE.TextureLoader().load( "images/brickwall.jpg",
(texture)=>{
texture.wrapS = THREE.RepeatWrapping;
texture.wrapT = THREE.RepeatWrapping;
myUniforms.diffuseMap.value = texture;
});
var normalmap = new THREE.TextureLoader().load( "images/brickwall_normal.jpg",
(texture)=>{
texture.wrapS = THREE.RepeatWrapping;
texture.wrapT = THREE.RepeatWrapping;
myUniforms.normalMap.value = texture;
});
myUniforms = THREE.UniformsUtils.merge([
THREE.UniformsLib['lights'], // must merge this before set .lights = true
// THREE.UniformsLib['normalmap'],
{
time: { value: 1.0 },
// diffuse: {type: 'c', value: new THREE.Color(0xffffff)},
diffuseMap: {
type: 't',
value: brickmap
},
normalMap: {
type: 't',
value: normalmap
},
transform: {
type: "m4", value: new THREE.Matrix4()
}
, updatedNormalMatrix: {
type: "m3", value: new THREE.Matrix3()
}
}]);
var plmat = new THREE.ShaderMaterial( {
uniforms: myUniforms,
vertexShader: document.getElementById( 'vertexShader' ).textContent,
fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
lights: true,
derivatives: true // must be enabled to use dFdx / dFdy in the shader
} );
var pln = new THREE.Mesh(plgeo, plmat);
scene.add(pln);
pln.position.x = -2.5;
pln.rotation.x = -Math.PI/2;
// the plane on the right
var sph2 = new THREE.Mesh(plgeo.clone(), new THREE.MeshPhongMaterial({
color:0xdddddd,
map: brickmap,
normalMap: normalmap
}));
scene.add(sph2);
sph2.position.x = 2.5;
sph2.rotation.x = -Math.PI/2;
}