Unity Atmospheric Scattering

Atmospheric scattering is a huge boost to a game's visual quality. This article explains atmospheric scattering mainly from the code side.

Single Scattering

We look through the atmosphere along the path AB and solve for the atmospheric color at point B, assuming the light is scattered only once inside the atmosphere, at the scattering point P.

Sunlight enters the atmosphere at C, is attenuated along CP, scatters at P, and is then attenuated along PA before reaching the camera at A.

I_{CP} = I_{s}\,T(CP)

I_{P} = I_{CP}\,S(\lambda,\theta,h)

I_{PA} = I_{P}\,T(PA) = I_{s}\,T(CP)\,S(\lambda,\theta,h)\,T(PA)

T is the transmittance term: how much the light is attenuated over a given path segment.

S is the scattering term: how much light is scattered into the angle θ; λ is the wavelength and h is the altitude.

In reality there are infinitely many points P along the path AB (which can be an arbitrary slanted ray), so the final result accumulates the lighting contribution of every point on AB (an integral; the density differs at different altitudes).

\int_{A}^{B} I_{s}\,T(CP)\,S(\lambda,\theta,h)\,T(PA)\,ds
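Numerically this integral is evaluated by ray marching: split the path AB into N segments of length Δs and sum the contribution at each segment midpoint P_i, which is exactly what the pseudocode further below does:

I_A \approx \sum_{i=1}^{N} I_{s}\,T(CP_i)\,S(\lambda,\theta,h_i)\,T(P_iA)\,\Delta s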

There are two types of scattering

Rayleigh Scattering

Scattering off particles much smaller than the wavelength of light. Because the particles are so much smaller than the wavelength, the wavelength itself strongly affects how much the light is scattered.

Mie Scattering

Scattering off particles much larger than the wavelength of light. Because the wavelength is negligible compared with the particle size, Mie scattering is treated as wavelength-independent. Mie scattering is the main cause of the halo around the sun.

Scattering equation

S(\lambda,\theta,h) = \frac{\pi^2(n^2-1)^2}{2}\frac{\rho(h)}{N}\frac{1}{\lambda^4}(1+\cos^2\theta)

λ is the wavelength of light (R ≈ 680 nm, G ≈ 550 nm, B ≈ 440 nm), θ is the scattering angle, h is the altitude, n is the refractive index of air, N is a constant (the molecular number density of air at sea level), and ρ(h) is the relative density ratio: 1 at sea level, decreasing exponentially with altitude.

The scattering equation breaks down into three parts: (1) the scattering coefficient, (2) the phase function, and (3) the relative atmospheric density.

1 Scattering coefficient

S(\lambda,h) = \frac{8\pi^2(n^2-1)^2}{3}\frac{\rho(h)}{N}\frac{1}{\lambda^4}

S(\lambda,0) = \frac{8\pi^2(n^2-1)^2}{3}\frac{1}{N}\frac{1}{\lambda^4}

At sea level, this is a constant:

For Rayleigh: 0.005802f, 0.013558f, 0.033100f (RGB)

For Mie: 0.003996f, 0.003996f, 0.003996f

2 Phase function

P(\theta) = \frac{3}{16\pi}(1+\cos^2\theta)

3 Relative atmospheric density

\rho(h) = \exp\left(-\frac{h}{H}\right)

For Rayleigh, H = 8500 m

For Mie, H = 1200 m (a code sketch of these three parts follows below)
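Putting the three parts into code, a minimal sketch with hypothetical helper names matching the pseudocode used later (RayleighCoefficient, MieCoefficient, RayleighPhase). It assumes heights are in kilometers (H_R = 8.5 km, H_M = 1.2 km) so that the per-kilometer sea-level coefficients listed above can simply be scaled by the density ratio:

static const float  PI = 3.14159265;
static const float3 RayleighScatterSeaLevel = float3(0.005802, 0.013558, 0.033100); // per km, RGB
static const float3 MieScatterSeaLevel      = float3(0.003996, 0.003996, 0.003996); // per km
static const float  RayleighScaleHeight = 8.5;  // km
static const float  MieScaleHeight      = 1.2;  // km

// Relative density: rho(h) = exp(-h / H)
float DensityRatio(float h, float H) { return exp(-h / H); }

// Scattering coefficient at altitude h: S(lambda, h) = S(lambda, 0) * rho(h)
float3 RayleighCoefficient(float h) { return RayleighScatterSeaLevel * DensityRatio(h, RayleighScaleHeight); }
float3 MieCoefficient(float h)      { return MieScatterSeaLevel * DensityRatio(h, MieScaleHeight); }

// Rayleigh phase function: P(theta) = 3 / (16 pi) * (1 + cos^2(theta))
float RayleighPhase(float cosTheta) { return 3.0 / (16.0 * PI) * (1.0 + cosTheta * cosTheta); }

Note that Mie extinction additionally includes an absorption term (see sampleMediumRGB further below), and the Mie phase function needs an anisotropy parameter that is not listed here.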

Visualizing the Rayleigh scattering coefficient as a color:

Absorption

Besides scattering, the atmosphere also absorbs light, mainly through ozone and methane.

Ozone (O3): ozone mainly absorbs short-wavelength ultraviolet light.

Methane (CH4): methane is a gas that absorbs infrared light and reflects blue light.

Rayleigh scattering has no absorption term.

Visualizing the absorption as a color:

Formula summary table

Pseudocode

float3 Transmittance(float3 A, float3 B)
{
    const int SAMPLE = 64;                // number of samples
    float  dist = length(B - A);
    float3 dir  = (B - A) / dist;         // normalized march direction
    float  ds   = dist / SAMPLE;          // step length
    float3 p    = A + dir * (ds * 0.5);   // midpoint of the first segment
    float3 opticalDepth = 0;
    // integrate the total extinction (optical depth) along the path
    for (int i = 0; i < SAMPLE; i++)
    {
        float h = length(p);              // height of the sample point (subtract the planet radius if p is planet-centered)
        float3 extinctionMie = MieExtinction(h);
        float3 extinctionRay = RayleighCoefficient(h);   // Rayleigh has no absorption, so scattering == extinction
        float3 extinctionOzo = OzoneAbsorption(h);
        opticalDepth += (extinctionMie + extinctionRay + extinctionOzo) * ds;
        p += dir * ds;                    // advance to the next segment
    }
    return exp(-opticalDepth);
}
float3 eyePos;    // camera position A
float3 viewDir;   // view direction
float3 lightDir;  // direction toward the sun
float3 P = eyePos + (viewDir * ds) * 0.5;      // a sample point on the view path (ds as above)
float  h = length(P);                          // height of the sample point
float  cosTheta = dot(lightDir, viewDir);
float3 RayleighScattering = RayleighCoefficient(h) * RayleighPhase(cosTheta);
float3 MieScattering = MieCoefficient(h) * MiePhase(cosTheta);
// C is the point where the sun ray toward P enters the atmosphere
float3 SingleScattering = Transmittance(C, P) * (RayleighScattering + MieScattering) * Transmittance(eyePos, P);
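Putting these pieces together, here is a minimal sketch of the full single-scattering ray march along the view ray. Transmittance, RayleighCoefficient, MieCoefficient, RayleighPhase and MiePhase are the helpers from above; IntersectAtmosphereTop (returning where a ray leaves the atmosphere) and HeightAboveGround are hypothetical helpers assumed for this sketch:

float3 SingleScatteringRayMarch(float3 eyePos, float3 viewDir, float3 lightDir, float3 sunIntensity)
{
    const int SAMPLE = 64;
    float3 B  = IntersectAtmosphereTop(eyePos, viewDir);   // far end of the view path AB
    float  ds = length(B - eyePos) / SAMPLE;
    float  cosTheta = dot(lightDir, viewDir);
    float3 result = 0;
    for (int i = 0; i < SAMPLE; i++)
    {
        float3 P = eyePos + viewDir * ((i + 0.5) * ds);    // midpoint of segment i
        float  h = HeightAboveGround(P);
        float3 C = IntersectAtmosphereTop(P, lightDir);    // where the sun ray toward P enters the atmosphere
        // scattering at P toward the eye
        float3 scattering = RayleighCoefficient(h) * RayleighPhase(cosTheta)
                          + MieCoefficient(h) * MiePhase(cosTheta);
        // T(CP) * S * T(PA) * ds, accumulated over the path: the integral from the text
        result += Transmittance(C, P) * scattering * Transmittance(eyePos, P) * ds;
    }
    return sunIntensity * result;
}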

Multi Scattering

Differential solid angle

Before discussing multiple scattering, we need a basic understanding of the differential solid angle in order to reason about energy conservation.

The solid angle of the full sphere is 4π.

The differential solid angle is A/(r*r): the subtended area divided by the squared radius.
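Written out in spherical coordinates (included for reference):

d\omega = \frac{dA}{r^2} = \sin\theta\,d\theta\,d\varphi, \qquad \int_{0}^{2\pi}\!\int_{0}^{\pi}\sin\theta\,d\theta\,d\varphi = 4\pi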

Radiometry

Radiant intensity is the power emitted by a light source per unit solid angle.

Irradiance is the incident power projected onto a unit area (the received radiant flux per area).

Radiance is the fundamental field quantity describing how light is distributed in an environment: the power per unit solid angle, per unit projected area.

To summarize: suppose the total power leaving the light source is 1. The irradiance on a sphere of radius 2 is then 1/(4π·2²),

and dividing by the differential solid angle dω gives the radiance, i.e., the energy each differential solid angle receives.

For example, to compute how much light a hemisphere receives, we can simply integrate over it; computing lighting this way conserves energy.
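For instance, if a constant radiance L arrives from every direction of the hemisphere, the irradiance integral evaluates to:

E = \int_{\Omega} L\cos\theta\,d\omega = L\int_{0}^{2\pi}\!\int_{0}^{\pi/2}\cos\theta\,\sin\theta\,d\theta\,d\varphi = \pi L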

The multiple-scattering process

In short: (a) compute single scattering, (b) add the light reflected from the ground, (c) integrate over the sphere, and (d) integrate along the entire view path to find how much light reaches the eye.

Bruneton2017

https://github.com/ebruneton/precomputed_atmospheric_scattering

This part references Bruneton's code, which is feature-complete; only the shadow sampling needs to be modified.

Pseudocode
ComputeTransmittanceLUT()
ComputeDirectIrradianceLUT()
ComputeSingleScatteringLUT()
for (uint order = 2; order <= numScatteringOrders; order++) {
	ComputeScatteringDensity()    // order >= 2: indirect (in-scattered) light; order > 2: also light reflected off the ground
	ComputeIndirectIrradiance()   // light reflected from the Earth's surface
	ComputeMultipleScattering()   // integrate along PA
}

Its advantage is that everything is computed in one pass over all combinations of r (altitude), mu (view zenith angle), mu_s (sun zenith angle), and nu (angle between the view direction and the sun),

with mu_s and nu packed into a single texture coordinate:

Number frag_coord_nu = floor(frag_coord.x / Number(SCATTERING_TEXTURE_MU_S_SIZE));
Number frag_coord_mu_s = fmod(frag_coord.x, Number(SCATTERING_TEXTURE_MU_S_SIZE));

The results are stored in a 3D texture that can then be sampled directly at runtime.

The drawback is that any change to the default parameters requires recomputing the LUTs, which is very expensive: it already stutters on PC, and mobile cannot afford it at all.

LUT precomputation in UnrealEngineSkyAtmosphere

GitHub - sebh/UnrealEngineSkyAtmosphere: Unreal Engine Sky Atmosphere Rendering Technique

Four LUTs need to be computed:

TransmittanceLUT MultipleScatteringLUT Sky-ViewLUT AerialPerspectiveLUT

TransmittanceLUT

The UV axes encode altitude (r) and view zenith angle (mu).

[numthreads(8, 8, 1)]
void IntergalTransmittanceLUT(uint3 id: SV_DispatchThreadID)
{
    float2 pixPos = id.xy + 0.5f;

    AtmosphereParameters Atmosphere = GetAtmosphereParameters();

    // Compute camera position from LUT coords
    float2 uv = (pixPos) / float2(TRANSMITTANCE_TEXTURE_WIDTH, TRANSMITTANCE_TEXTURE_HEIGHT);
    float viewHeight;
    float viewZenithCosAngle;
    UvToLutTransmittanceParams(Atmosphere, viewHeight, viewZenithCosAngle, uv);

    //  A few extra needed constants
    float3 WorldPos = float3(0.0f, viewHeight, 0.0f);
    float3 WorldDir = float3(sqrt(1.0 - viewZenithCosAngle * viewZenithCosAngle), viewZenithCosAngle, 0.0f);

    const bool ground = false;
    const float SampleCountIni = 40.0f; // Can go a low as 10 sample but energy lost starts to be visible.
    const float DepthBufferValue = -1.0;
    const bool VariableSampleCount = false;
    const bool MieRayPhase = false;
    float3 transmittance = exp(-IntegrateScatteredLuminance(uv, WorldPos, WorldDir, sun_direction, Atmosphere,
                                                            ground, SampleCountIni, DepthBufferValue,
                                                            VariableSampleCount, MieRayPhase).OpticalDepth);

    _TransmittanceLUT[id.xy] = float4(transmittance, 1.0f);
}

The code is simple: iterate over altitude and view zenith angle; with the sun direction known, everything else can be derived, and the transmittance is written into the texture.

void UvToLutTransmittanceParams(AtmosphereParameters Atmosphere, out float viewHeight, out float viewZenithCosAngle, in float2 uv)
{
	//uv = float2(fromSubUvsToUnit(uv.x, TRANSMITTANCE_TEXTURE_WIDTH), fromSubUvsToUnit(uv.y, TRANSMITTANCE_TEXTURE_HEIGHT)); // No real impact so off
	float x_mu = uv.x;
	float x_r = uv.y;

	float H = sqrt(Atmosphere.TopRadius * Atmosphere.TopRadius - Atmosphere.BottomRadius * Atmosphere.BottomRadius);
	float rho = H * x_r;
	viewHeight = sqrt(rho * rho + Atmosphere.BottomRadius * Atmosphere.BottomRadius);

	float d_min = Atmosphere.TopRadius - viewHeight;
	float d_max = rho + H;
	float d = d_min + x_mu * (d_max - d_min);
	viewZenithCosAngle = d == 0.0 ? 1.0f : (H * H - rho * rho - d * d) / (2.0 * viewHeight * d);
	viewZenithCosAngle = clamp(viewZenithCosAngle, -1.0, 1.0);
}

UvToLutTransmittanceParams remaps r and mu into their maximal valid ranges to make full use of the texture precision.

For mu, the corresponding ray-length range is [d_min, d_max]:

 
float H = sqrt(Atmosphere.TopRadius * Atmosphere.TopRadius - Atmosphere.BottomRadius * Atmosphere.BottomRadius);

Note that this H is not the same point as the H in the figure: it denotes remapping into a coordinate frame centered at (0,0), so that the full precision range is used.

Inside IntegrateScatteredLuminance, the main quantity being accumulated is medium.extinction * dt:

...
MediumSampleRGB sampleMediumRGB(in float3 WorldPos, in AtmosphereParameters Atmosphere)
{
	const float viewHeight = length(WorldPos) - Atmosphere.BottomRadius;

	const float densityMie = exp(Atmosphere.MieDensityExpScale * viewHeight);
	const float densityRay = exp(Atmosphere.RayleighDensityExpScale * viewHeight);
	const float densityOzo = saturate(viewHeight < Atmosphere.AbsorptionDensity0LayerWidth ?
		Atmosphere.AbsorptionDensity0LinearTerm * viewHeight + Atmosphere.AbsorptionDensity0ConstantTerm :
		Atmosphere.AbsorptionDensity1LinearTerm * viewHeight + Atmosphere.AbsorptionDensity1ConstantTerm);

	MediumSampleRGB s;

	s.scatteringMie = densityMie * Atmosphere.MieScattering;
	s.absorptionMie = densityMie * Atmosphere.MieAbsorption;
	s.extinctionMie = densityMie * Atmosphere.MieExtinction;

	s.scatteringRay = densityRay * Atmosphere.RayleighScattering;
	s.absorptionRay = 0.0f;
	s.extinctionRay = s.scatteringRay + s.absorptionRay;

	s.scatteringOzo = 0.0;
	s.absorptionOzo = densityOzo * Atmosphere.AbsorptionExtinction;
	s.extinctionOzo = s.scatteringOzo + s.absorptionOzo;

	s.scattering = s.scatteringMie + s.scatteringRay + s.scatteringOzo;
	s.absorption = s.absorptionMie + s.absorptionRay + s.absorptionOzo;
	s.extinction = s.extinctionMie + s.extinctionRay + s.extinctionOzo;
	s.albedo = getAlbedo(s.scattering, s.extinction);

	return s;
}
...
MediumSampleRGB medium = sampleMediumRGB(P, Atmosphere);
const float3 SampleOpticalDepth = medium.extinction * dt;
const float3 SampleTransmittance = exp(-SampleOpticalDepth);
OpticalDepth += SampleOpticalDepth;

The result is the following LUT.

Horizontal axis: view zenith angle (mu); vertical axis: altitude (r); each texel stores the transmittance.
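At runtime the LUT is looked up with the inverse mapping of UvToLutTransmittanceParams. A minimal sketch of that inverse (a hypothetical helper, derived by inverting the code above): given r and mu, d is the distance to the top of the atmosphere along the view ray, remapped back into [d_min, d_max]:

float2 LutTransmittanceParamsToUv(AtmosphereParameters Atmosphere, float viewHeight, float viewZenithCosAngle)
{
	float H = sqrt(Atmosphere.TopRadius * Atmosphere.TopRadius - Atmosphere.BottomRadius * Atmosphere.BottomRadius);
	float rho = sqrt(max(0.0, viewHeight * viewHeight - Atmosphere.BottomRadius * Atmosphere.BottomRadius));

	// distance to the top atmosphere boundary along the ray: solve |P + d * dir| = TopRadius
	float discriminant = viewHeight * viewHeight * (viewZenithCosAngle * viewZenithCosAngle - 1.0) + Atmosphere.TopRadius * Atmosphere.TopRadius;
	float d = max(0.0, -viewHeight * viewZenithCosAngle + sqrt(discriminant));

	float d_min = Atmosphere.TopRadius - viewHeight;
	float d_max = rho + H;
	float x_mu = (d - d_min) / (d_max - d_min);
	float x_r = rho / H;
	return float2(x_mu, x_r);
}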

MultipleScatteringLUT

[numthreads(1, 1, 64)]
void NewMultiScattCS(uint3 ThreadId : SV_DispatchThreadID)
{
    float2 pixPos = float2(ThreadId.xy) + 0.5f;
    float2 uv = pixPos / MultiScatteringLUTRes;


    uv = float2(fromSubUvsToUnit(uv.x, MultiScatteringLUTRes), fromSubUvsToUnit(uv.y, MultiScatteringLUTRes));

    AtmosphereParameters Atmosphere = GetAtmosphereParameters();

    float cosSunZenithAngle = uv.x * 2.0 - 1.0;
    float3 sunDir = float3(sqrt(saturate(1.0 - cosSunZenithAngle * cosSunZenithAngle)), cosSunZenithAngle, 0.0);
    // We adjust again viewHeight according to PLANET_RADIUS_OFFSET to be in a valid range.
    float viewHeight = Atmosphere.BottomRadius + saturate(uv.y + PLANET_RADIUS_OFFSET) * (Atmosphere.TopRadius -
        Atmosphere.BottomRadius - PLANET_RADIUS_OFFSET);

    float3 WorldPos = float3(0.0f, viewHeight, 0.0f);
    float3 WorldDir = float3(0.0f, 1.0f, 0.0f);


    const bool ground = true;
    const float SampleCountIni = 20; // a minimum set of step is required for accuracy unfortunately
    const float DepthBufferValue = -1.0;
    const bool VariableSampleCount = false;
    const bool MieRayPhase = false;

    const float SphereSolidAngle = 4.0 * PI;
    const float IsotropicPhase = 1.0 / SphereSolidAngle;


    // Reference. Since there are many sample, it requires MULTI_SCATTERING_POWER_SERIE to be true for accuracy and to avoid divergences (see declaration for explanations)
    #define SQRTSAMPLECOUNT 8
    const float sqrtSample = float(SQRTSAMPLECOUNT);
    float i = 0.5f + float(ThreadId.z / SQRTSAMPLECOUNT);
    float j = 0.5f + float(ThreadId.z - float((ThreadId.z / SQRTSAMPLECOUNT) * SQRTSAMPLECOUNT));
    {
        float randA = i / sqrtSample;
        float randB = j / sqrtSample;
        float theta = 2.0f * PI * randA;
        float phi = acos(1.0f - 2.0f * randB);
        // uniform distribution https://mathworld.wolfram.com/SpherePointPicking.html
        //phi = PI * randB;						// bad non uniform
        float cosPhi = cos(phi);
        float sinPhi = sin(phi);
        float cosTheta = cos(theta);
        float sinTheta = sin(theta);
        WorldDir.x = cosTheta * sinPhi;
        WorldDir.y = sinTheta * sinPhi;
        WorldDir.z = cosPhi;
        SingleScatteringResult result = IntegrateScatteredLuminance(uv, WorldPos, WorldDir, sunDir, Atmosphere,
                                                                    ground, SampleCountIni, DepthBufferValue,
                                                                    VariableSampleCount, MieRayPhase);

        MultiScatAs1SharedMem[ThreadId.z] = result.MultiScatAs1 * SphereSolidAngle / (sqrtSample * sqrtSample);
        LSharedMem[ThreadId.z] = result.L * SphereSolidAngle / (sqrtSample * sqrtSample);
    }
    #undef SQRTSAMPLECOUNT

    GroupMemoryBarrierWithGroupSync();

    // 64 to 32
    if (ThreadId.z < 32)
    {
        MultiScatAs1SharedMem[ThreadId.z] += MultiScatAs1SharedMem[ThreadId.z + 32];
        LSharedMem[ThreadId.z] += LSharedMem[ThreadId.z + 32];
    }
    GroupMemoryBarrierWithGroupSync();

    // 32 to 16
    if (ThreadId.z < 16)
    {
        MultiScatAs1SharedMem[ThreadId.z] += MultiScatAs1SharedMem[ThreadId.z + 16];
        LSharedMem[ThreadId.z] += LSharedMem[ThreadId.z + 16];
    }
    GroupMemoryBarrierWithGroupSync();

    // 16 to 8 (16 is thread group min hardware size with intel, no sync required from there)
    if (ThreadId.z < 8)
    {
        MultiScatAs1SharedMem[ThreadId.z] += MultiScatAs1SharedMem[ThreadId.z + 8];
        LSharedMem[ThreadId.z] += LSharedMem[ThreadId.z + 8];
    }
    GroupMemoryBarrierWithGroupSync();
    if (ThreadId.z < 4)
    {
        MultiScatAs1SharedMem[ThreadId.z] += MultiScatAs1SharedMem[ThreadId.z + 4];
        LSharedMem[ThreadId.z] += LSharedMem[ThreadId.z + 4];
    }
    GroupMemoryBarrierWithGroupSync();
    if (ThreadId.z < 2)
    {
        MultiScatAs1SharedMem[ThreadId.z] += MultiScatAs1SharedMem[ThreadId.z + 2];
        LSharedMem[ThreadId.z] += LSharedMem[ThreadId.z + 2];
    }
    GroupMemoryBarrierWithGroupSync();
    if (ThreadId.z < 1)
    {
        MultiScatAs1SharedMem[ThreadId.z] += MultiScatAs1SharedMem[ThreadId.z + 1];
        LSharedMem[ThreadId.z] += LSharedMem[ThreadId.z + 1];
    }
    GroupMemoryBarrierWithGroupSync();
    if (ThreadId.z > 0)
        return;

    float3 MultiScatAs1 = MultiScatAs1SharedMem[0] * IsotropicPhase; // Equation 7 f_ms
    float3 InScatteredLuminance = LSharedMem[0] * IsotropicPhase; // Equation 5 L_2ndOrder

    // MultiScatAs1 represents the amount of luminance scattered as if the integral of scattered luminance over the sphere would be 1.
    //  - 1st order of scattering: one can ray-march a straight path as usual over the sphere. That is InScatteredLuminance.
    //  - 2nd order of scattering: the inscattered luminance is InScatteredLuminance at each of samples of fist order integration. Assuming a uniform phase function that is represented by MultiScatAs1,
    //  - 3nd order of scattering: the inscattered luminance is (InScatteredLuminance * MultiScatAs1 * MultiScatAs1)
    //  - etc.
    #if	MULTI_SCATTERING_POWER_SERIE==0
	float3 MultiScatAs1SQR = MultiScatAs1 * MultiScatAs1;
	float3 L = InScatteredLuminance * (1.0 + MultiScatAs1 + MultiScatAs1SQR + MultiScatAs1 * MultiScatAs1SQR + MultiScatAs1SQR * MultiScatAs1SQR);
    #else
    // For a serie, sum_{n=0}^{n=+inf} = 1 + r + r^2 + r^3 + ... + r^n = 1 / (1.0 - r), see https://en.wikipedia.org/wiki/Geometric_series 
    const float3 r = MultiScatAs1;
    const float3 SumOfAllMultiScatteringEventsContribution = 1.0f / (1.0 - r);
    float3 L = InScatteredLuminance * SumOfAllMultiScatteringEventsContribution; // Equation 10 Psi_ms
    #endif

    OutputTexture[ThreadId.xy] = float4(MultipleScatteringFactor * L, 1.0f);
}

Here the UV axes encode altitude (r) and sun zenith angle (mu_s).

WorldPos is derived from r and sunDir from mu_s; 64 directions uniformly distributed over the sphere are then sampled to evaluate the spherical integral:

        WorldDir.x = cosTheta * sinPhi;
        WorldDir.y = sinTheta * sinPhi;
        WorldDir.z = cosPhi;

For how to sample points uniformly on a sphere, see https://mathworld.wolfram.com/SpherePointPicking.html

Compared with Bruneton2017, the multiple-scattering computation is simplified with two assumptions:

    const float SphereSolidAngle = 4.0 * PI;
    const float IsotropicPhase = 1.0 / SphereSolidAngle;

1. The scattering is isotropic (a uniform phase function).

2. Every point in the sampled neighborhood receives the same amount of light.

So the whole multiple-scattering contribution reduces to a single attenuation factor (see the geometric series below).

    float3 MultiScatAs1 = MultiScatAs1SharedMem[0] * IsotropicPhase; // Equation 7 f_ms
    float3 InScatteredLuminance = LSharedMem[0] * IsotropicPhase; // Equation 5 L_2ndOrder
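With these assumptions, each extra scattering order just multiplies by the same factor f_ms (MultiScatAs1 above), so the total multiple scattering collapses to a geometric series, which is exactly what the shader evaluates as Equation 10:

\Psi_{ms} = L_{2ndOrder}\,(1 + f_{ms} + f_{ms}^2 + \dots) = \frac{L_{2ndOrder}}{1 - f_{ms}}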

The per-thread results are then summed with a parallel reduction; for what GroupMemoryBarrierWithGroupSync does, see:

Compute Shader中的Parallel Reduction和Parallel Scan (Zhihu)

The per-step integration inside IntegrateScatteredLuminance uses Frostbite's analytic approach:

        float3 MS = medium.scattering * 1;
        float3 MSint = (MS - MS * SampleTransmittance) / medium.extinction;
        result.MultiScatAs1 += throughput * MSint;
        ...
        float3 Sint = (S - S * SampleTransmittance) / medium.extinction; // integrate along the current step segment 
        L += throughput * Sint; // accumulate and also take into account the transmittance from previous steps
        throughput *= SampleTransmittance;
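The (S - S * SampleTransmittance) / medium.extinction terms come from analytically integrating the in-scattering over one step of length dt, assuming the scattering S and extinction σ_t are constant within the step:

\int_{0}^{dt} S\,e^{-\sigma_t x}\,dx = \frac{S\,(1 - e^{-\sigma_t\,dt})}{\sigma_t} = \frac{S - S\,T_{step}}{\sigma_t}

where T_step = e^{-σ_t dt} is SampleTransmittance.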

Sky-ViewLUT

[numthreads(8, 8, 1)]
void IntergalSkyViewLutPS(uint3 id: SV_DispatchThreadID)
{
    float2 pixPos = id.xy + 0.5f;
    AtmosphereParameters Atmosphere = GetAtmosphereParameters();

    float3 WorldPos = camera + float3(0, Atmosphere.BottomRadius, 0);
    float2 uv = pixPos / float2(192.0, 108.0);
    float viewHeight = length(WorldPos);
    float viewZenithCosAngle;
    float lightViewCosAngle;
    UvToSkyViewLutParams(Atmosphere, viewZenithCosAngle, lightViewCosAngle, viewHeight, uv);


    float3 SunDir;
    {
        float3 UpVector = WorldPos / viewHeight;
        float sunZenithCosAngle = dot(UpVector, sun_direction);
        SunDir = normalize(float3(sqrt(1.0 - sunZenithCosAngle * sunZenithCosAngle), sunZenithCosAngle, 0.0));
    }
    
    
    WorldPos = float3(0.0f, viewHeight, 0.0f);
    
    float viewZenithSinAngle = sqrt(1 - viewZenithCosAngle * viewZenithCosAngle);
    float3 WorldDir = float3(viewZenithSinAngle * lightViewCosAngle,
                             viewZenithCosAngle,
                             viewZenithSinAngle * sqrt(1.0 - lightViewCosAngle * lightViewCosAngle));


    // Move to top atmospehre
    if (!MoveToTopAtmosphere(WorldPos, WorldDir, Atmosphere.TopRadius))
    {
        // Ray is not intersecting the atmosphere
        _SkyViewLUT[id.xy] = float3(0, 0, 0);
        return;
    }

    const bool ground = false;
    const float SampleCountIni = 30;
    const float DepthBufferValue = -1.0;
    const bool VariableSampleCount = true;
    const bool MieRayPhase = true;
    SingleScatteringResult ss = IntegrateScatteredLuminance(uv, WorldPos, WorldDir, SunDir, Atmosphere, ground,
                                                            SampleCountIni, DepthBufferValue, VariableSampleCount,
                                                            MieRayPhase);
     _SkyViewLUT[id.xy] = ss.L;
}

viewZenithCosAngle and lightViewCosAngle take values in [-1, 1] (angles from 0° to 180°).

UvToSkyViewLutParams(Atmosphere, viewZenithCosAngle, lightViewCosAngle, viewHeight, uv);

WorldPos, UpVector, and WorldDir differ from the UE version because of the up axis: UE is Z-up while Unity is Y-up.

AerialPerspectiveLUT

[numthreads(8, 8, 1)]
void IntergalCameraVolumeLUT(uint3 id: SV_DispatchThreadID)
{
    float2 pixPos = id.xy + 0.5f;
    float w, h, d;
    _CameraVolumeLUT.GetDimensions(w, h, d);
    AtmosphereParameters Atmosphere = GetAtmosphereParameters();
    float2 uv = pixPos / float2(w, h);
    float3 ClipSpace = float3(uv * float2(2.0, -2.0) - float2(1.0, -1.0), 0.5);
    float4 HViewPos = mul(gSkyInvProjMat, float4(ClipSpace, 1.0));
    float3 WorldDir = normalize(mul((float3x3)gSkyInvViewMat, HViewPos.xyz / HViewPos.w));

    float earthR = Atmosphere.BottomRadius;
    float3 camPos = camera + float3(0, earthR, 0);
    float3 SunDir = sun_direction;
    
    float Slice = ((id.z + 0.5f) / AP_SLICE_COUNT);
    Slice *= Slice; // squared distribution
    Slice *= AP_SLICE_COUNT;

    float3 WorldPos = camPos;

    // Compute position from froxel information
    float tMax = AerialPerspectiveSliceToDepth(Slice);
    float3 newWorldPos = WorldPos + tMax * WorldDir;

    // If the voxel is under the ground, make sure to offset it out on the ground.
    float viewHeight = length(newWorldPos);
    if (viewHeight <= (Atmosphere.BottomRadius + PLANET_RADIUS_OFFSET))
    {
        // Apply a position offset to make sure no artefact are visible close to the earth boundaries for large voxel.
        newWorldPos = normalize(newWorldPos) * (Atmosphere.BottomRadius + PLANET_RADIUS_OFFSET + 0.001f);
        WorldDir = normalize(newWorldPos - camPos);
        tMax = length(newWorldPos - camPos);
    }
    float tMaxMax = tMax;

    const bool ground = false;
    const float SampleCountIni = max(1.0, float(id.z + 1.0) * 2.0f);
    const float DepthBufferValue = -1.0;
    const bool VariableSampleCount = false;
    const bool MieRayPhase = true;
    SingleScatteringResult ss = IntegrateScatteredLuminance(uv, WorldPos, WorldDir, SunDir, Atmosphere, ground,
                                                            SampleCountIni, DepthBufferValue, VariableSampleCount,
                                                            MieRayPhase, tMaxMax);


    const float Transmittance = dot(ss.Transmittance, float3(1.0f / 3.0f, 1.0f / 3.0f, 1.0f / 3.0f));
    _CameraVolumeLUT[id] = float4(ss.L, 1.0 - Transmittance);
}

 
#define AP_SLICE_COUNT 32.0f    // number of slices
#define AP_KM_PER_SLICE 3.0f    // kilometers covered by each slice
...
float tMax = AerialPerspectiveSliceToDepth(Slice);    // recover the maximum ray distance from the slice index
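A sketch of what AerialPerspectiveSliceToDepth amounts to, assuming the linear slice-to-kilometer mapping implied by the constants above:

float AerialPerspectiveSliceToDepth(float slice)
{
    // each slice (after the squared distribution has been applied) covers AP_KM_PER_SLICE kilometers
    return slice * AP_KM_PER_SLICE;
}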

It is claimed that all four LUTs can be rendered in about 1 ms on an iPhone 6s.

Finally, aerial perspective and shadow handling are applied in a post-process pass (see the sketch below).
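A minimal sketch of compositing the aerial-perspective LUT over the scene in that post-process. Assumptions: depthKm is the camera-to-scene distance in kilometers, _CameraVolumeLUT is the texture written above bound for reading, sampler_LinearClamp is a hypothetical sampler, and the shadow handling is omitted:

Texture3D<float4> _CameraVolumeLUT;   // written by IntergalCameraVolumeLUT above
SamplerState sampler_LinearClamp;

float4 ApplyAerialPerspective(float3 sceneColor, float2 uv, float depthKm)
{
    // invert the squared slice distribution used when the LUT was built
    float slice = depthKm / AP_KM_PER_SLICE;
    float w = sqrt(saturate(slice / AP_SLICE_COUNT));   // depth (w) coordinate of the 3D LUT

    float4 AP = _CameraVolumeLUT.SampleLevel(sampler_LinearClamp, float3(uv, w), 0);

    // AP.rgb = in-scattered luminance, AP.a = 1 - mean transmittance (as stored above)
    float3 color = sceneColor * (1.0 - AP.a) + AP.rgb;
    return float4(color, 1.0);
}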

Conclusion:

SlightlyMad, Sebh, and Bruneton each implement atmospheric scattering differently; the effect should be adapted to the actual needs of your project.

On mobile, performance requires trade-offs. For example, Genshin Impact's sky is not fully physically based atmospheric rendering, yet it still looks great.

My knowledge is limited and there may be mistakes; corrections are welcome.

Project source code

https://github.com/wujuju/UnitySkyAtmosphere

References:

GAMES104 - 现代游戏引擎:从入门到实践 (Bilibili)

GAMES101 - 现代计算机图形学入门, 闫令琪 (Bilibili)

GitHub - sebh/UnrealEngineSkyAtmosphere: Unreal Engine Sky Atmosphere Rendering Technique

GitHub - SlightlyMad/AtmosphericScattering: Atmospheric Scattering for Unity

https://github.com/ebruneton/precomputed_atmospheric_scattering

Volumetric Atmospheric Scattering - Alan Zucconi

https://sebh.github.io/publications/egsr2020.pdf
