Rendering 18: Realtime GI

https://catlikecoding.com/unity/tutorials/rendering/part-18/

Reference: https://docs.unity3d.com/Manual/GIIntro.html

This is part 18 of a tutorial series about rendering. After wrapping up baked global illumination in part 17, we move on to supporting realtime GI. After that, we will also support light probe proxy volumes and cross-fading LOD groups.

From now on, this tutorial series is made with Unity 2017.1.0f3. It will not work with older versions, because we will end up using a new shader function.

1 Realtime Global Illumination

Baking light works very well for static geometry, and also pretty well for dynamic geometry thanks to light probes. However, it cannot deal with dynamic lights. Lights in mixed mode can get away with some realtime adjustments, but too much change makes it obvious that the baked indirect light does not adapt. So when you have an outdoor scene, the sun has to be unchanging. It cannot travel across the sky like it does in real life, because that would require gradually changing GI. The scene has to be frozen in time.

To make indirect lighting work with something like a moving sun, Unity uses the Enlighten system to calculate realtime global illumination. It works like baked indirect lighting, except that the lightmaps and probes are computed at runtime.

Figuring out indirect light requires knowledge of how light could bounce between static surfaces. The question is which surfaces are potentially affected by which other surfaces, and to what degree; in other words, which surfaces influence the current surface.

Figuring out these relationships is a lot of work, which cannot be done in real time. So this data is processed by the editor and stored for use at runtime. Enlighten then uses it to compute the realtime lightmaps and probe data. Even then, it is only feasible with low-resolution lightmaps.

1.1 Enabling Realtime GI

Realtime global illumination can be enabled independently of baked lighting; you can have neither, one, or both active at the same time. It is enabled via the checkbox in the Realtime Lighting section of the Lighting window.

To see realtime GI in action, set the mode of the main light in our test scene to Realtime. As we have no other lights, this effectively turns off baked lighting, even when it is enabled.

Make sure that all objects in the scene use our white material. Like last time, the spheres are all dynamic and everything else is static geometry.

It turns out that only the dynamic objects benefit from realtime GI. The static objects have become darker. That is because the light probes automatically incorporate the realtime GI. Static objects have to sample the realtime lightmaps, which are not the same as the baked lightmaps. Our shader does not do this yet.

1.2 Baking Realtime GI

Unity already generates the realtime lightmaps while in edit mode, so you can always see the realtime GI contribution. These maps are not retained when switching between edit and play mode, but they end up the same. You can inspect the realtime lightmaps via the Object Maps tab of the Lighting window, with a lightmap-static object selected. Choose the Realtime Intensity visualization to see the realtime lightmap data.


So realtime GI, like baked GI, is computed for static objects; dynamic objects still receive it via light probes.

Although realtime lightmaps are already baked, and they might appear correct, our meta pass actually uses the wrong coordinates. Realtime GI has its own lightmap coordinates, which can end up being different from those of the static lightmaps. Unity generates these coordinates automatically, based on the lightmap and object settings. They are stored in the third mesh UV channel. So add this data to VertexData in My Lightmapping.

struct VertexData
{
	float4 vertex : POSITION;
	float4 uv : TEXCOORD0;
	float4 uv1 : TEXCOORD1;
	float4 uv2 : TEXCOORD2;
};

Now MyLightmappingVertexProgram has to use either the second or third UV set, together with either the static or the dynamic lightmap's scale and offset. We can rely on the UnityMetaVertexPosition function to use the right data.

Interpolators MyLightmappingVertexProgram (VertexData v) {
	Interpolators i;
	i.pos = UnityMetaVertexPosition(
		v.vertex, v.uv1, v.uv2, unity_LightmapST, unity_DynamicLightmapST
	);
	i.uv.xy = TRANSFORM_TEX(v.uv, _MainTex);
	i.uv.zw = TRANSFORM_TEX(v.uv, _DetailTex);
	return i;
}

Note that the meta pass is used for both baked and realtime lightmapping. So when realtime GI is used, it will also be included in builds.

1.3 Sampling Realtime Lightmaps

To actually sample the realtime lightmaps, we also have to add the third UV set to VertexData in My Lighting.

struct VertexData {
	float4 vertex : POSITION;
	float3 normal : NORMAL;
	float4 tangent : TANGENT;
	float2 uv : TEXCOORD0;
	float2 uv1 : TEXCOORD1;
	float2 uv2 : TEXCOORD2;
};

When a realtime lightmap is used, we have to add its lightmap coordinates to our interpolators. The standard shader combines both lightmap coordinate sets in a single interpolator, multiplexed with some other data, but we can get away with separate interpolators for both. We know that there is dynamic lightmap data when the DYNAMICLIGHTMAP_ON keyword is defined. It is part of the keyword list of the multi_compile_fwdbase compiler directive.

struct Interpolators
{
	…

	#if defined(DYNAMICLIGHTMAP_ON)
		float2 dynamicLightmapUV : TEXCOORD7;
	#endif

	…
};

Fill these coordinates just like the static lightmap coordinates, except with the dynamic lightmap's scale and offset, made available via unity_DynamicLightmapST.

Interpolators MyVertexProgram (VertexData v) {
	…

	#if defined(LIGHTMAP_ON) || ADDITIONAL_MASKED_DIRECTIONAL_SHADOWS
		i.lightmapUV = v.uv1 * unity_LightmapST.xy + unity_LightmapST.zw;
	#endif

	#if defined(DYNAMICLIGHTMAP_ON)
		i.dynamicLightmapUV =
			v.uv2 * unity_DynamicLightmapST.xy + unity_DynamicLightmapST.zw;
	#endif

	…
}

Sampling the realtime lightmap is done in our CreateIndirectLight function. Duplicate the #if defined(LIGHTMAP_ON) code block and make a few changes. First, the new block is based on the DYNAMICLIGHTMAP_ON keyword. Also, it should use DecodeRealtimeLightmap instead of DecodeLightmap, because the realtime maps use a different color format. Because this data might be added on top of baked lighting, do not immediately assign it to indirectLight.diffuse; store it in an intermediate variable and add it at the end, using += rather than =. Finally, we should only sample spherical harmonics when neither a baked nor a realtime lightmap is used.

#if defined(LIGHTMAP_ON)
			indirectLight.diffuse =
				DecodeLightmap(UNITY_SAMPLE_TEX2D(unity_Lightmap, i.lightmapUV));
			
			#if defined(DIRLIGHTMAP_COMBINED)
				float4 lightmapDirection = UNITY_SAMPLE_TEX2D_SAMPLER(
					unity_LightmapInd, unity_Lightmap, i.lightmapUV
				);
				indirectLight.diffuse = DecodeDirectionalLightmap(
					indirectLight.diffuse, lightmapDirection, i.normal
				);
			#endif

			ApplySubtractiveLighting(i, indirectLight);
//		#else
//			indirectLight.diffuse += max(0, ShadeSH9(float4(i.normal, 1)));
		#endif

		#if defined(DYNAMICLIGHTMAP_ON)
			float3 dynamicLightDiffuse = DecodeRealtimeLightmap(
				UNITY_SAMPLE_TEX2D(unity_DynamicLightmap, i.dynamicLightmapUV)
			);

			#if defined(DIRLIGHTMAP_COMBINED)
				float4 dynamicLightmapDirection = UNITY_SAMPLE_TEX2D_SAMPLER(
					unity_DynamicDirectionality, unity_DynamicLightmap,
					i.dynamicLightmapUV
				);
				indirectLight.diffuse += DecodeDirectionalLightmap(
					dynamicLightDiffuse, dynamicLightmapDirection, i.normal
				);
			#else
				indirectLight.diffuse += dynamicLightDiffuse;
			#endif
		#endif

		#if !defined(LIGHTMAP_ON) && !defined(DYNAMICLIGHTMAP_ON)
			indirectLight.diffuse += max(0, ShadeSH9(float4(i.normal, 1)));
		#endif

Now realtime lightmaps are used by our shader. Initially, it might look the same as baked lighting with a mixed light using Distance Shadowmask mode. The difference becomes obvious when turning off the light while in play mode.

After disabling a mixed light, its indirect light remains. In contrast, the indirect contribution of a realtime light disappears and reappears, as it should. However, it might take a while before the new situation is fully baked. Enlighten incrementally adjusts the lightmaps and probes. How quickly this happens depends on the complexity of the scene and the Realtime GI CPU Usage quality tier setting.

(In other words, the Realtime GI CPU Usage option in the quality settings controls how much CPU time the Enlighten runtime is allowed to spend updating the realtime GI, so higher tiers converge faster after a lighting change.)

All realtime lights contribute to realtime GI. However, its typical use is with the main directional light only, representing the sun as it moves through the sky. Realtime GI is fully functional for directional lights. Point lights and spotlights work too, but only when unshadowed. So when using shadowed point lights or spotlights, you can end up with incorrect indirect lighting.

In other words, the typical scenario is a single main directional light set to realtime. Point lights and spotlights only produce correct realtime GI when they cast no shadows.

If you want to exclude a realtime light from realtime GI, you can do so by setting its Indirect Multiplier for light intensity to zero.
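The Indirect Multiplier is normally changed in the light's inspector. If you ever need to do the same from a script, a minimal sketch could look like the following; the component name is made up for illustration, but Light.bounceIntensity is the scripting counterpart of the Indirect Multiplier.

using UnityEngine;

// Illustrative only: zero a light's indirect multiplier so it no longer
// contributes to realtime GI.
public class ExcludeFromRealtimeGI : MonoBehaviour {

	void Awake () {
		Light lightComponent = GetComponent<Light>();
		// Light.bounceIntensity corresponds to the Indirect Multiplier
		// shown in the inspector; zero disables the bounced contribution.
		lightComponent.bounceIntensity = 0f;
	}
}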

1.4 Emissive Light

Realtime GI can also be used for static objects that emit light. This makes it possible to vary their emission, with matching realtime indirect light. Let's try this out. Add a static sphere to the scene and give it a material that uses our shader, with a black albedo and a white emission color. Initially, we can only see the indirect effects of the emitted light via the static lightmaps.

To bake emissive light into the static lightmap, we had to set the material's global illumination flags in our shader GUI. As we always set the flags to BakedEmissive, the light ends up in the baked lightmap. This is fine when the emissive light is constant, but it does not allow us to animate it.

To support both baked and realtime lighting for the emission, we have to make this configurable. We can do so by adding a choice for it to MyLightingShaderGUI, via the MaterialEditor.LightmapEmissionProperty method. Its single parameter is the property's indentation level.
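A minimal sketch of where that call could go, assuming a DoEmission method like the one built up in earlier parts; the real GUI has more properties, and clearing the EmissiveIsBlack flag is one way to keep Unity from skipping emission it assumes to be black.

using UnityEngine;
using UnityEditor;

// Sketch only: a stripped-down shader GUI showing where
// LightmapEmissionProperty fits; the actual class has many more properties.
public class MyLightingShaderGUI : ShaderGUI {

	MaterialEditor editor;

	public override void OnGUI (MaterialEditor editor, MaterialProperty[] properties) {
		this.editor = editor;
		DoEmission();
	}

	void DoEmission () {
		EditorGUI.BeginChangeCheck();

		// … emission color and map properties go here …

		// Lets the user choose Baked or Realtime GI for the emission.
		// The argument is the indentation level of the property.
		editor.LightmapEmissionProperty(2);

		if (EditorGUI.EndChangeCheck()) {
			foreach (Material m in editor.targets) {
				// Keep Unity from ignoring emission it assumes to be black.
				m.globalIlluminationFlags &=
					~MaterialGlobalIlluminationFlags.EmissiveIsBlack;
			}
		}
	}
}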

The visual difference between baked and realtime GI is that the realtime lightmap usually has a much lower resolution than the baked one. So when the emission does not change, prefer baked GI anyway, to take advantage of its higher resolution.

1.5 Animating Emission

Realtime GI for emission is only possible for static objects. While the objects are static, the emission properties of their materials can be animated and will be picked up by the global illumination system. Let's try this out with a simple component that oscillates between a white and a black emission color.

using UnityEngine;

public class EmissiveOscillator : MonoBehaviour {

	Material emissiveMaterial;

	void Start () {
		emissiveMaterial = GetComponent<MeshRenderer>().material;
	}
	
	void Update () {
		Color c = Color.Lerp(
			Color.white, Color.black,
			Mathf.Sin(Time.time * Mathf.PI) * 0.5f + 0.5f
		);
		emissiveMaterial.SetColor("_Emission", c);
	}
}

Add this component to our emissive sphere. When in play mode, its emission will animate, but the indirect light is not affected yet. We have to notify the realtime GI system that it has work to do. This can be done by invoking the Renderer.UpdateGIMaterials method of the appropriate mesh renderer.

MeshRenderer emissiveRenderer;
	Material emissiveMaterial;

	void Start () {
		emissiveRenderer = GetComponent<MeshRenderer>();
		emissiveMaterial = emissiveRenderer.material;
	}
	
	void Update () {
		…
		emissiveMaterial.SetColor("_Emission", c);
		emissiveRenderer.UpdateGIMaterials();
	}

Invoking UpdateGIMaterials triggers a complete update of the object's emission, rendering it with its meta pass. This is necessary when the emission is more complex than a solid color, for example when a texture is used. If a solid color suffices, we can get away with a shortcut: invoke DynamicGI.SetEmissive with the renderer and the emission color. This is quicker than rendering the object with the meta pass, so take advantage of it when you can.

//		emissiveRenderer.UpdateGIMaterials();
		DynamicGI.SetEmissive(emissiveRenderer, c);

2 Light Probe Proxy Volumes

Both baked and realtime GI are applied to dynamic objects via light probes. An object's position is used to interpolate light probe data, which is then used to apply GI to it. This works for fairly small objects, but is too crude for large ones.

As an example, add a long stretched cube to the test scene, so that it is subject to varying lighting conditions. It should use our white material. As it is a dynamic cube, it ends up using a single point to determine its GI contribution. If you position it so that this point ends up shadowed, the entire cube becomes dark, which is obviously wrong. To make this very obvious, use a baked main light, so all lighting comes from the baked and realtime GI data.

To make light probes work for cases like this, we can use a light probe proxy volume, or LPPV for short.
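The tutorial configures the proxy volume via the inspector. Purely as an illustration, here is roughly how the same setup could be done from a script; the component and property names are Unity's actual API, while the grid resolution values are arbitrary.

using UnityEngine;
using UnityEngine.Rendering;

// Illustrative sketch: give a large dynamic renderer a Light Probe Proxy Volume,
// so it samples a grid of interpolated probes instead of a single point.
public class AddProxyVolume : MonoBehaviour {

	void Awake () {
		LightProbeProxyVolume volume = gameObject.AddComponent<LightProbeProxyVolume>();
		// Use an explicit grid instead of the automatically chosen resolution.
		volume.resolutionMode = LightProbeProxyVolume.ResolutionMode.Custom;
		volume.gridResolutionX = 8;
		volume.gridResolutionY = 2;
		volume.gridResolutionZ = 2;

		// Tell the renderer to use the proxy volume instead of one blended probe sample.
		GetComponent<MeshRenderer>().lightProbeUsage = LightProbeUsage.UseProxyVolume;
	}
}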
