A while back, Unity's official channel on Bilibili published a tutorial video on building a water shader in URP. I tried porting it to ShaderLab for the built-in render pipeline and added reflection, refraction, and a few other effects.
Depth Detection
First, depth detection.
If the water material is rendered as-is, there is no brightness variation between deep and shallow water.
To create this variation, we read the camera depth texture to recover scene depth, then compare it with the current fragment's position to determine how deep the water is.
First, enable the camera depth texture from a C# script:
camera.depthTextureMode |= DepthTextureMode.Depth;
With that enabled, the shader can access the camera depth texture by declaring
uniform sampler2D_float _CameraDepthTexture;
Note that to get a usable depth texture, the camera must first finish rendering all opaque objects before the water surface is processed. Refraction will later be implemented with a GrabPass, so the water itself does not need to be transparent. The shader tags are therefore:
Tags { "Queue"="Transparent" "RenderType"="Opaque" }
In the vertex shader:
Once we have the depth texture, we need UVs to sample it. To find where the current vertex lands in the depth texture, Unity provides an inline helper that takes a clip-space position:
o.scrPos = ComputeScreenPos ( o.vertex );
For the comparison we also need the depth of the water surface itself; Unity again provides an inline macro:
COMPUTE_EYEDEPTH(o.scrPos.z);
//Definition: #define COMPUTE_EYEDEPTH(o) o = -UnityObjectToViewPos( v.vertex ).z
Since scrPos.z is not needed later, we simply reuse it to store this value.
In the fragment shader:
fixed depth =LinearEyeDepth(SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.scrPos)));
float diff = 1-saturate(((depth-i.scrPos.z) -_Depth)*_Strenght);
fixed4 col = lerp(_DepthCol, _ShallowCol, diff);
After sampling the depth texture, the code above converts the value to linear view-space depth. Subtracting scrPos.z then gives the depth difference, which is used to lerp between the shallow-water and deep-water colors. Once you see an image like this, depth detection is working:
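To make the math above concrete, here is a small standalone sketch (plain Python, outside Unity) of what LinearEyeDepth and the depth-difference lerp factor compute. The helper names and the near/far values are my own; only the formulas mirror Unity's _ZBufferParams convention (non-reversed Z) and the shader lines above.

```python
# Standalone sketch of LinearEyeDepth: convert a raw non-linear
# depth-buffer value into view-space distance via _ZBufferParams.

def make_zbuffer_params(near, far):
    # Unity (non-reversed Z): x = 1 - far/near, y = far/near, z = x/far, w = y/far
    x = 1.0 - far / near
    y = far / near
    return (x, y, x / far, y / far)

def linear_eye_depth(raw_depth, zbp):
    # LinearEyeDepth(z) = 1 / (z * _ZBufferParams.z + _ZBufferParams.w)
    return 1.0 / (raw_depth * zbp[2] + zbp[3])

def water_tint_factor(scene_eye_depth, water_eye_depth, depth_param, strength):
    # Mirrors: diff = 1 - saturate(((depth - scrPos.z) - _Depth) * _Strenght)
    diff = (scene_eye_depth - water_eye_depth - depth_param) * strength
    return 1.0 - min(max(diff, 0.0), 1.0)

zbp = make_zbuffer_params(near=0.3, far=100.0)
print(linear_eye_depth(0.0, zbp))  # near plane -> 0.3
print(linear_eye_depth(1.0, zbp))  # far plane  -> 100.0
print(water_tint_factor(10.0, 5.0, 0.5, 1.0))  # deep water    -> 0.0
print(water_tint_factor(5.0, 5.0, 0.5, 1.0))   # shallow water -> 1.0
```

A tint factor of 1 selects the shallow color and 0 the deep color, which is exactly how the lerp between _ShallowCol and _DepthCol behaves.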
Refraction
Here I use the water refraction approach from the book *Unity Shader入门精要*.
Before the main pass, use GrabPass{"_RefractionTex"} to capture the already-rendered pool bottom. Inside the pass, add
sampler2D _RefractionTex;
float4 _RefractionTex_TexelSize;
to access the GrabPass texture and its texel size. Just like the depth texture above, the GrabPass texture needs UVs of its own, which the following inline function computes:
o.grabPos = ComputeGrabScreenPos(o.vertex);
This function looks very similar to the ComputeScreenPos() used earlier, and their implementations are indeed nearly identical; their outputs differ only in that the y value is flipped in a few cases (see the article in the references for details). Here is the source of both:
inline float4 ComputeNonStereoScreenPos(float4 pos) {
    float4 o = pos * 0.5f;
    o.xy = float2(o.x, o.y*_ProjectionParams.x) + o.w;
    o.zw = pos.zw;
    return o;
}
inline float4 ComputeScreenPos(float4 pos) {
    float4 o = ComputeNonStereoScreenPos(pos);
    #if defined(UNITY_SINGLE_PASS_STEREO)
    o.xy = TransformStereoScreenSpaceTex(o.xy, pos.w);
    #endif
    return o;
}
inline float4 ComputeGrabScreenPos (float4 pos) {
    #if UNITY_UV_STARTS_AT_TOP
    float scale = -1.0;
    #else
    float scale = 1.0;
    #endif
    float4 o = pos * 0.5f;
    o.xy = float2(o.x, o.y*scale) + o.w;
    #ifdef UNITY_SINGLE_PASS_STEREO
    o.xy = TransformStereoScreenSpaceTex(o.xy, pos.w);
    #endif
    o.zw = pos.zw;
    return o;
}
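To see what both helpers are doing, here is a small numeric sketch (Python, my own illustration rather than Unity code) of the common remapping, assuming _ProjectionParams.x = 1 and no single-pass stereo: clip-space xy lies in [-w, w], and the helpers shift it into [0, w] so that the perspective divide in the fragment shader produces UVs in [0, 1].

```python
# Numeric sketch of the remapping performed by ComputeScreenPos
# (assumptions: _ProjectionParams.x = 1, no single-pass stereo).

def compute_screen_pos(clip):
    x, y, z, w = clip
    # o = pos * 0.5; o.xy += o.w  -- shifts xy from [-w, w] into [0, w]
    return (0.5 * x + 0.5 * w, 0.5 * y + 0.5 * w, z, w)

def to_uv(scr):
    # The perspective divide done later in the fragment shader
    return (scr[0] / scr[3], scr[1] / scr[3])

# Clip-space point at the top-right corner of the screen (xy == w):
print(to_uv(compute_screen_pos((2.0, 2.0, 0.5, 2.0))))   # -> (1.0, 1.0)
# Center of the screen (xy == 0):
print(to_uv(compute_screen_pos((0.0, 0.0, 0.5, 2.0))))   # -> (0.5, 0.5)
```

ComputeGrabScreenPos does the same thing, with the y sign chosen by UNITY_UV_STARTS_AT_TOP instead of _ProjectionParams.x.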
The refraction code in the fragment shader is also quite simple:
// Refraction
float2 offset = 10*normal.xz*_RefrDistortion * _RefractionTex_TexelSize.xy;
float2 offsetPos = offset+i.grabPos.xy;
fixed3 refrCol = tex2D(_RefractionTex,offsetPos/i.grabPos.w).rgb;
Here the sampling UVs are offset by the water-surface normal, and the sample gives the refraction color.
How you process the water normal is up to you. One thing worth noting: for a flat, horizontal body of water you can sample a tangent-space normal map and use it directly as a world-space normal.
The image below shows the refraction rendered normally. The book also suggests adding a vertical offset along y to make the refraction look more convincing, but it breaks down near the water's edge, so I did not use that effect.
Reflection
As far as I know, there are three common ways to implement water reflection. The simplest uses Unity's reflect function to look up a cubemap. The second renders the reflection in real time with a mirror camera created from a script. The third is screen-space reflection (SSR) as a post effect, which simulates real reflection most faithfully.
I implemented the first two.
Cubemap reflection is the easiest: just declare a cubemap property,
_Cubemap("Reflect Cubemap",Cube) = "_Skybox" {}
then call the built-in reflect function:
// Reflection
float3 viewVector = normalize(UnityWorldSpaceViewDir(worldPos));
float3 worldRefl = reflect(-viewVector,normal);
float3 reflCol = texCUBE(_Cubemap,worldRefl ).rgb;
Passing the world-space view direction and normal to reflect gives the lookup direction, and sampling the cubemap with it yields the reflection color.
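The reflect intrinsic itself is just a mirror of the incident direction about the normal: reflect(i, n) = i - 2·dot(i, n)·n, with n assumed unit length. A quick standalone check (Python, not shader code); note the shader passes -viewVector, i.e. the direction from the eye toward the surface:

```python
# Check of the formula HLSL's reflect() implements:
# reflect(i, n) = i - 2 * dot(i, n) * n, n assumed unit length.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(i, n):
    d = dot(i, n)
    return tuple(ix - 2.0 * d * nx for ix, nx in zip(i, n))

# A ray going down at 45 degrees hitting a flat water surface (n = up)
# bounces up at 45 degrees: the y component flips, x is unchanged.
incident = (1.0, -1.0, 0.0)
normal = (0.0, 1.0, 0.0)
print(reflect(incident, normal))  # -> (1.0, 1.0, 0.0)
```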
This is the reflection drawn with the skybox used directly as the cubemap. To reflect other objects you would need to author a dedicated cubemap or use a Reflection Probe. There are also C# scripts that generate a cubemap in real time, but they are very expensive. For ordinary objects, cubemap reflection is cheap and often good enough; on large planes such as the ground or a water surface, however, anything other than the infinitely distant skybox is easily misregistered. That brings us to the second approach: planar reflection.
Source of this approach: a Tieba post.
The approach uses a C# script to create a camera with an oblique view frustum. The script is quite old and many of the APIs it used have since been removed from Unity, so the C# here is my repaired version of the Mirror script; the full source is given at the end of the article. After attaching the script to the water plane, the shader only needs to sample the texture _Reflection produced by the mirror camera, using screen UVs offset by the normal:
float2 screenUV = i.scrPos.xy/i.scrPos.w + normal.xz * _ReflDistortion*_Reflection_TexelSize.xy;
fixed3 reflCol = tex2D(_Reflection, screenUV).rgb;
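For reference, the CalculateReflectionMatrix function in the script builds a standard reflection (Householder) matrix for the plane n·x + d = 0; the camera position mirrored through it is what newpos holds. A standalone sketch (Python, my own) of the same arithmetic:

```python
# Check of the transform built by the script's CalculateReflectionMatrix:
# reflecting a point across the plane nx*x + ny*y + nz*z + d = 0.
# For a horizontal water surface at height h, the plane is (0, 1, 0, -h).

def reflect_point(plane, p):
    nx, ny, nz, d = plane
    px, py, pz = p
    # Same terms as the m00..m23 entries of the matrix, applied as an
    # affine transform (the bottom row is 0 0 0 1).
    dist = nx * px + ny * py + nz * pz + d
    return (px - 2.0 * nx * dist,
            py - 2.0 * ny * dist,
            pz - 2.0 * nz * dist)

# A camera at height 3 above a water plane at y = 1 mirrors to height -1:
print(reflect_point((0.0, 1.0, 0.0, -1.0), (4.0, 3.0, 7.0)))  # -> (4.0, -1.0, 7.0)
```

Reflecting the mirrored point again returns the original, which is the defining property of a reflection matrix.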
The final reflection effect is shown below:
As you can see, this approach produces very good reflections. With reflection, refraction, and depth detection all in place, what remains is lighting, which I won't go into here. Below are my final water effect and the full C# and shader source:
The GIF is compressed, so the pool bottom is hard to make out; in practice the transparency of shallow and deep water can each be adjusted independently.
Here is the C# script for the oblique-frustum camera:
using UnityEngine;
using System.Collections;

[ExecuteInEditMode]
public class Mirror : MonoBehaviour
{
    public bool m_DisablePixelLights = true;
    public int m_TextureSize = 256;
    public float m_ClipPlaneOffset = 0.07f;
    public bool m_IsFlatMirror = true;
    public LayerMask m_ReflectLayers = -1;
    private Hashtable m_ReflectionCameras = new Hashtable();
    private RenderTexture m_ReflectionTexture = null;
    private int m_OldReflectionTextureSize = 0;
    private static bool s_InsideRendering = false;
    private Renderer renderer;

    public void Awake()
    {
        if (!renderer)
        {
            renderer = GetComponent<Renderer>();
        }
    }

    public void OnWillRenderObject()
    {
        if (!enabled || !renderer || !renderer.sharedMaterial || !renderer.enabled)
            return;
        Camera cam = Camera.current;
        if (!cam)
            return;
        if (s_InsideRendering)
            return;
        s_InsideRendering = true;
        Camera reflectionCamera;
        CreateMirrorObjects(cam, out reflectionCamera);
        Vector3 pos = transform.position;
        Vector3 normal;
        if (m_IsFlatMirror)
        {
            normal = transform.up;
        }
        else
        {
            normal = transform.position - cam.transform.position;
            normal.Normalize();
        }
        int oldPixelLightCount = QualitySettings.pixelLightCount;
        if (m_DisablePixelLights)
            QualitySettings.pixelLightCount = 0;
        UpdateCameraModes(cam, reflectionCamera);
        float d = -Vector3.Dot(normal, pos) - m_ClipPlaneOffset;
        Vector4 reflectionPlane = new Vector4(normal.x, normal.y, normal.z, d);
        Matrix4x4 reflection = Matrix4x4.zero;
        CalculateReflectionMatrix(ref reflection, reflectionPlane);
        Vector3 oldpos = cam.transform.position;
        Vector3 newpos = reflection.MultiplyPoint(oldpos);
        reflectionCamera.worldToCameraMatrix = cam.worldToCameraMatrix * reflection;
        Vector4 clipPlane = CameraSpacePlane(reflectionCamera, pos, normal, 1.0f);
        Matrix4x4 projection = cam.projectionMatrix;
        CalculateObliqueMatrix(ref projection, clipPlane);
        reflectionCamera.projectionMatrix = projection;
        reflectionCamera.cullingMask = ~(1 << 4) & m_ReflectLayers.value;
        reflectionCamera.targetTexture = m_ReflectionTexture;
        //GL.SetRevertBackfacing(true);
        GL.invertCulling = true;
        reflectionCamera.transform.position = newpos;
        Vector3 euler = cam.transform.eulerAngles;
        reflectionCamera.transform.eulerAngles = new Vector3(0, euler.y, euler.z);
        reflectionCamera.Render();
        reflectionCamera.transform.position = oldpos;
        //GL.SetRevertBackfacing(false);
        GL.invertCulling = false;
        Material[] materials = renderer.sharedMaterials;
        foreach (Material mat in materials)
        {
            if (mat.HasProperty("_Reflection"))
                mat.SetTexture("_Reflection", m_ReflectionTexture);
        }
        if (m_DisablePixelLights)
            QualitySettings.pixelLightCount = oldPixelLightCount;
        s_InsideRendering = false;
    }

    void OnDisable()
    {
        if (m_ReflectionTexture)
        {
            DestroyImmediate(m_ReflectionTexture);
            m_ReflectionTexture = null;
        }
        foreach (DictionaryEntry kvp in m_ReflectionCameras)
            DestroyImmediate(((Camera)kvp.Value).gameObject);
        m_ReflectionCameras.Clear();
    }

    private void UpdateCameraModes(Camera src, Camera dest)
    {
        if (dest == null)
            return;
        dest.clearFlags = src.clearFlags;
        dest.backgroundColor = src.backgroundColor;
        if (src.clearFlags == CameraClearFlags.Skybox)
        {
            Skybox sky = src.GetComponent<Skybox>();
            Skybox mysky = dest.GetComponent<Skybox>();
            if (!sky || !sky.material)
            {
                mysky.enabled = false;
            }
            else
            {
                mysky.enabled = true;
                mysky.material = sky.material;
            }
        }
        dest.farClipPlane = src.farClipPlane;
        dest.nearClipPlane = src.nearClipPlane;
        dest.orthographic = src.orthographic;
        dest.fieldOfView = src.fieldOfView;
        dest.aspect = src.aspect;
        dest.orthographicSize = src.orthographicSize;
        dest.renderingPath = src.renderingPath;
    }

    private void CreateMirrorObjects(Camera currentCamera, out Camera reflectionCamera)
    {
        reflectionCamera = null;
        if (!m_ReflectionTexture || m_OldReflectionTextureSize != m_TextureSize)
        {
            if (m_ReflectionTexture)
                DestroyImmediate(m_ReflectionTexture);
            m_ReflectionTexture = new RenderTexture(m_TextureSize, m_TextureSize, 16);
            m_ReflectionTexture.name = "__MirrorReflection" + GetInstanceID();
            m_ReflectionTexture.isPowerOfTwo = true;
            m_ReflectionTexture.hideFlags = HideFlags.DontSave;
            m_OldReflectionTextureSize = m_TextureSize;
        }
        reflectionCamera = m_ReflectionCameras[currentCamera] as Camera;
        if (!reflectionCamera)
        {
            GameObject go = new GameObject("CubemapCamera", typeof(Camera), typeof(Skybox));
            reflectionCamera = go.GetComponent<Camera>();
            reflectionCamera.enabled = false;
            reflectionCamera.transform.position = transform.position;
            reflectionCamera.transform.rotation = transform.rotation;
            reflectionCamera.gameObject.AddComponent<FlareLayer>();
            go.hideFlags = HideFlags.HideAndDontSave;
            m_ReflectionCameras[currentCamera] = reflectionCamera;
        }
    }

    private static float sgn(float a)
    {
        if (a > 0.0f) return 1.0f;
        if (a < 0.0f) return -1.0f;
        return 0.0f;
    }

    private Vector4 CameraSpacePlane(Camera cam, Vector3 pos, Vector3 normal, float sideSign)
    {
        Vector3 offsetPos = pos + normal * m_ClipPlaneOffset;
        Matrix4x4 m = cam.worldToCameraMatrix;
        Vector3 cpos = m.MultiplyPoint(offsetPos);
        Vector3 cnormal = m.MultiplyVector(normal).normalized * sideSign;
        return new Vector4(cnormal.x, cnormal.y, cnormal.z, -Vector3.Dot(cpos, cnormal));
    }

    private static void CalculateObliqueMatrix(ref Matrix4x4 projection, Vector4 clipPlane)
    {
        Vector4 q = projection.inverse * new Vector4(
            sgn(clipPlane.x),
            sgn(clipPlane.y),
            1.0f,
            1.0f
        );
        Vector4 c = clipPlane * (2.0F / (Vector4.Dot(clipPlane, q)));
        projection[2] = c.x - projection[3];
        projection[6] = c.y - projection[7];
        projection[10] = c.z - projection[11];
        projection[14] = c.w - projection[15];
    }

    private static void CalculateReflectionMatrix(ref Matrix4x4 reflectionMat, Vector4 plane)
    {
        reflectionMat.m00 = (1F - 2F * plane[0] * plane[0]);
        reflectionMat.m01 = (-2F * plane[0] * plane[1]);
        reflectionMat.m02 = (-2F * plane[0] * plane[2]);
        reflectionMat.m03 = (-2F * plane[3] * plane[0]);
        reflectionMat.m10 = (-2F * plane[1] * plane[0]);
        reflectionMat.m11 = (1F - 2F * plane[1] * plane[1]);
        reflectionMat.m12 = (-2F * plane[1] * plane[2]);
        reflectionMat.m13 = (-2F * plane[3] * plane[1]);
        reflectionMat.m20 = (-2F * plane[2] * plane[0]);
        reflectionMat.m21 = (-2F * plane[2] * plane[1]);
        reflectionMat.m22 = (1F - 2F * plane[2] * plane[2]);
        reflectionMat.m23 = (-2F * plane[3] * plane[2]);
        reflectionMat.m30 = 0F;
        reflectionMat.m31 = 0F;
        reflectionMat.m32 = 0F;
        reflectionMat.m33 = 1F;
    }
}
Here is the water shader source:
Shader "sence/Water"
{
    Properties
    {
        _NormalTex ("NormalTex", 2D) = "white" {}
        _HeightTex ("HeightTex", 2D) = "white" {}
        _NormalStrenght("NormalStrenght",Range(0,2)) = 1
        _WaterDir("WaterDir",Range(0,6.28)) = 0
        _WaterSpeed("WaterSpeed",Range(0,1)) = 1
        // Depth detection
        _DepthCol("Depth Water Color",Color) = (1,1,1,1)
        _ShallowCol("Shallow Water Color",Color) = (1,1,1,1)
        _Depth("Depth" , Range(0 , 5)) = 0.5       // depth offset before tinting starts
        _Strenght("Strenght" , Range(0 , 5)) = 0.5 // strength of the depth gradient
        // Lighting
        _FresnelScale("FresnelScale",Range(0,1)) = 1        // Fresnel parameter
        _Roughness("Roughness",Range(0,1)) = 0.5            // roughness
        _Specular("Specular",Range(0,1)) = 0.5              // specular intensity
        _SpecularSmooth("SpecularSmooth",Range(0,10)) = 0.5 // specular sharpness
        _RefrDistortion("RefrDistortion",Range(0,100)) = 10 // refraction distortion
        _ReflDistortion("ReflDistortion",Range(0,100)) = 10 // reflection distortion
        // Pick one of the two reflection schemes
        //_Cubemap("Reflect Cubemap",Cube) = "_Skybox" {}   // reflection cubemap
        _Reflection ("Reflection", 2D) = "white" {}         // planar mirror reflection
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Opaque" }
        LOD 100
        GrabPass{"_RefractionTex"} // grabs the rendered pool bottom
        Pass
        {
            //ZWrite Off
            Cull Off
            Blend SrcAlpha OneMinusSrcAlpha
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"
            #include "AutoLight.cginc"
            #include "Lighting.cginc"
            #pragma multi_compile LIGHTMAP_OFF LIGHTMAP_ON // lightmap support

            uniform sampler2D_float _CameraDepthTexture;
            sampler2D _NormalTex;
            sampler2D _HeightTex;
            sampler2D _RefractionTex;        // GrabPass texture
            sampler2D _Reflection;           // planar reflection texture
            //samplerCUBE _Cubemap;
            float4 _RefractionTex_TexelSize; // texel size of the GrabPass texture
            float4 _Reflection_TexelSize;    // texel size of the reflection texture
            float4 _NormalTex_ST;
            float4 _HeightTex_ST;
            half _Depth;
            half _Strenght;
            half _NormalStrenght;
            half _FresnelScale;
            half _Roughness;
            half _RefrDistortion;
            half _ReflDistortion;
            half _WaterDir;
            half _WaterSpeed;
            half _Specular;
            half _SpecularSmooth;
            fixed4 _DepthCol;
            fixed4 _ShallowCol;

            struct appdata
            {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
                float4 tangent : TANGENT;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float4 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
                float4 scrPos : TEXCOORD1;
                float4 grabPos : TEXCOORD2;
                float4 TtoW0 : TEXCOORD3;
                float4 TtoW1 : TEXCOORD4;
                float4 TtoW2 : TEXCOORD5;
                float2 uvDir : TEXCOORD6;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.uv.xy = TRANSFORM_TEX(v.uv, _NormalTex);
                o.uv.zw = TRANSFORM_TEX(v.uv, _HeightTex);
                o.uvDir = float2(cos(_WaterDir),sin(_WaterDir))*_WaterSpeed*_Time.y;
                fixed height0 = tex2Dlod(_HeightTex,float4(o.uv.zw+o.uvDir,0,0)).r;
                fixed height1 = tex2Dlod(_HeightTex,float4(o.uv.zw-o.uvDir,0,0)).r;
                v.vertex.y += (height0+height1)*_NormalStrenght*0.01;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.scrPos = ComputeScreenPos(o.vertex);
                o.grabPos = ComputeGrabScreenPos(o.vertex);
                COMPUTE_EYEDEPTH(o.scrPos.z); // definition: #define COMPUTE_EYEDEPTH(o) o = -UnityObjectToViewPos( v.vertex ).z
                // World-space position and tangent-space basis vectors
                float3 worldPos = mul(unity_ObjectToWorld,v.vertex).xyz;
                fixed3 worldNormal = UnityObjectToWorldNormal(v.normal);
                fixed3 worldTangent = UnityObjectToWorldDir(v.tangent.xyz);
                fixed3 worldBinormal = cross(worldNormal,worldTangent)*v.tangent.w;
                o.TtoW0 = float4(worldTangent.x,worldBinormal.x,worldNormal.x,worldPos.x);
                o.TtoW1 = float4(worldTangent.y,worldBinormal.y,worldNormal.y,worldPos.y);
                o.TtoW2 = float4(worldTangent.z,worldBinormal.z,worldNormal.z,worldPos.z);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                fixed depth = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.scrPos)));
                // Linear view-space scene depth, compared with the water-surface depth stored in scrPos.z
                float diff = 1-saturate(((depth-i.scrPos.z) -_Depth)*_Strenght);
                fixed4 col = lerp(_DepthCol, _ShallowCol, diff);
                // Unpack TtoW and compute the world-space normal
                float3 worldPos = float3(i.TtoW0.w,i.TtoW1.w,i.TtoW2.w);
                fixed3 normal0 = UnpackNormal(tex2D(_NormalTex, i.uv.xy+i.uvDir));
                fixed3 normal1 = UnpackNormal(tex2D(_NormalTex, i.uv.xy+i.uvDir/3));
                fixed3 normal = normalize(normal0+normal1);
                normal.xy *= lerp(_NormalStrenght,_NormalStrenght/3,diff);
                normal.z = sqrt(1.0-saturate(dot(normal.xy,normal.xy)));
                normal = normalize(half3(dot(i.TtoW0.xyz,normal),dot(i.TtoW1.xyz,normal),dot(i.TtoW2.xyz,normal)));
                // Lightmap
                #ifdef LIGHTMAP_ON
                // Sample the lightmap if one is applied.
                fixed4 lightcol = float4(DecodeLightmap(UNITY_SAMPLE_TEX2D(unity_Lightmap, i.uv.xy)), 1);
                #else
                fixed4 lightcol = float4(0,1,0,1);
                #endif
                fixed3 lightDir = normalize(UnityWorldSpaceLightDir(worldPos));
                fixed3 ambient = lightcol.rgb * col.rgb;
                fixed3 diffuse = lightcol.rgb*col.rgb*(0.5*saturate(dot(lightDir, normal)+0.5));
                //UNITY_LIGHT_ATTENUATION(atten, i, worldPos);
                // Specular
                float3 viewVector = normalize(UnityWorldSpaceViewDir(worldPos));
                float3 halfVector = normalize(viewVector+normalize(_WorldSpaceLightPos0.xyz));
                float3 fresnel = _FresnelScale+(1-_FresnelScale)*pow(1-dot(viewVector,normal),4);
                float roughnessFactor = (_Roughness + 8.0f)*pow(max(dot(halfVector, normal), 0.0f), _Roughness) / 8.0f;
                //float nh = max(0,dot(normal,halfVector));
                //float roughnessFactor = pow(nh, _Specular*128.0) * _Roughness;
                float3 specularColor = lightcol.rgb*pow(saturate(dot(lightDir, normal))*fresnel*roughnessFactor+_Specular,_SpecularSmooth);
                // Refraction
                float2 offset = normal.xz*_RefrDistortion * _RefractionTex_TexelSize.xy;
                float2 offsetPos = offset + i.grabPos.xy/i.grabPos.w;
                fixed3 refrCol = tex2D(_RefractionTex,offsetPos).rgb;
                // Reflection
                //float3 worldRefl = reflect(-viewVector,normal);
                //float3 reflCol = texCUBE(_Cubemap,worldRefl ).rgb;
                float2 screenUV = i.scrPos.xy/i.scrPos.w - normal.xz * _ReflDistortion*_Reflection_TexelSize.xy;
                fixed3 reflCol = tex2D(_Reflection, screenUV).rgb;
                fixed3 combinCol = lerp(refrCol,diffuse,col.a);
                return fixed4(reflCol*fresnel+specularColor + combinCol*(1-fresnel),1);
            }
            ENDCG
        }
    }
}
References:
Unity Shader – Reflection Effects
On the difference between ComputeScreenPos and ComputeGrabScreenPos
Shader Examples (26): Water (reflection, normals, transparency)
Making a Water Shader with Unity URP (tutorial)