1. Introduction
HLSL Shader Model 6.5 introduces a new feature called Sampler Feedback. It lets the hardware return information about which texels a shader sampled, and software can put that information to use in two main scenarios: texture streaming and texture-space shading. This article focuses on the latter, Texture-Space Shading (TSS).
2. Texture-Space Shading (TSS)
Prerequisites
Two texture concepts matter most here: mip levels and the filtering implementation.
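A minimal HLSL sketch of how the two interact (resource names are illustrative): the hardware derives a mip level from the UV derivatives, and the filter then blends texels from the selected level(s).

```hlsl
// Sketch only: explicit LOD query followed by an explicit-LOD sample.
Texture2D    g_tex     : register(t0);
SamplerState g_sampler : register(s0);

float4 PSMain(float2 uv : TEXCOORD0) : SV_Target
{
    // The hardware chooses a mip level from the UV derivatives.
    float lod = g_tex.CalculateLevelOfDetail(g_sampler, uv);
    // The filter mode on g_sampler decides how texels (and, for
    // trilinear filtering, two adjacent mips) are blended together.
    return g_tex.SampleLevel(g_sampler, uv, lod);
}
```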
TSS Motivation
- A motive for this texture-space shading algorithm is the reduction in visual instability-over-time for shiny objects that are far away. Because lighting for objects is computed from the perspective of a close observer, the lighting results are likely to be temporally stable.
- Another motive is the de-coupling of the lighting step from the rasterization step. Many lighting computations do not need to be re-computed frame-over-frame; there could be a real performance savings by computing lighting, say, every other frame, while the observable locations of objects in screen space are still updated every frame.
TSS Algorithm
TSS is a two-pass algorithm: a shading pass followed by a rasterization pass.
Without Sampler Feedback
- Consider a 3-D object or 3-D scene element which should be shaded in texture space.
- Allocate a target texture of suitable resolution for how close the object will tend to be relative to the camera.
- Determine a scheme for mapping locations on the surface of that object, in world space, to areas of that target texture. Fortunately, real scenarios often have the notion of {U, V} co-ordinates per object, and a {U, V} unwrapping map to act as this scheme.
- Draw the scene, targeting the target texture. For this pass, it may be desirable to simply run a compute shader instead of a conventional graphics render, using a pre-canned mapping of geometry-to-target-space with no notion of a “camera”. This pass would be the pass in which expensive lighting operations are used.
- Draw the scene once again, targeting the final target. The object is rasterized to the screen. Shading the object is a simple texture lookup which already contains the result of the scene’s lighting computations. This is a far less expensive rendering operation compared to the previous step.
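The shading pass above can be sketched as a compute shader; everything here is illustrative (the resource names and `ComputeExpensiveLighting` are placeholders, not part of any API), with each texel mapped back to the surface via the {U, V} unwrap:

```hlsl
// Sketch of the texture-space shading pass, assuming a {U, V} unwrap
// where each texel of the target corresponds to a surface location.
RWTexture2D<float4> g_shadingTarget : register(u0);

float4 ComputeExpensiveLighting(float2 uv)
{
    // Placeholder for the costly lighting evaluated in texture space.
    return float4(uv, 0.0, 1.0);
}

[numthreads(8, 8, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    uint width, height;
    g_shadingTarget.GetDimensions(width, height);
    if (id.x >= width || id.y >= height)
        return;

    // Texel center -> {U, V} on the object's surface.
    float2 uv = (id.xy + 0.5) / float2(width, height);
    g_shadingTarget[id.xy] = ComputeExpensiveLighting(uv);
}
```

The subsequent rasterization pass then only samples `g_shadingTarget`, which is why it is so much cheaper.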
With Sampler Feedback
- Draw objects straightforwardly to the final target in screen space. For each object with which texture-space shading will be used, keep a feedback map of which areas of objects’ target texture would be updated.
- For objects with which texture-space-shading will be used, draw the scene targeting the objects’ target texture. This pass would be the pass in which expensive lighting operations are used. But, do not shade areas of the target texture not included in the feedback map.
- Draw the scene once again, targeting the final target. The object is rasterized to the screen. Shading the object is a simple texture lookup which already contains the result of the scene’s lighting computations.
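The feedback-recording step can be sketched as a pixel shader in the screen-space pass; the resource names are illustrative. Per the spec, `WriteSamplerFeedback` uses implicit derivatives and therefore belongs in a pixel shader, and TSS-style scenarios may prefer the `MIP_REGION_USED` map type:

```hlsl
// Sketch: record which texels of the object's target texture this
// pixel would read, so the next shading pass can skip unused areas.
Texture2D    g_shadedTexture : register(t0);
SamplerState g_sampler       : register(s0);
FeedbackTexture2D<SAMPLER_FEEDBACK_MIP_REGION_USED> g_feedback : register(u1);

float4 PSMain(float2 uv : TEXCOORD0) : SV_Target
{
    // Mark the sampled footprint in the feedback map.
    g_feedback.WriteSamplerFeedback(g_shadedTexture, g_sampler, uv);
    // Shading itself is just a lookup into the pre-shaded texture.
    return g_shadedTexture.Sample(g_sampler, uv);
}
```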
3. API
Applications can choose between two sampler feedback map types: MinMip and MipRegionUsed.
- MinMip
  MinMip, also sometimes called MinLOD, stores “what’s the highest-detailed mip that got sampled”. If no mip got sampled, you’ll get a value of 0xFF when you decode. For streaming systems, this is the representation you’re most likely to use, since it will easily tell you which mip should be loaded next.
- MipRegionUsed
  MipRegionUsed acts like a bitfield of mip levels. It tells you exactly which mip levels were requested, not just “what was the most detailed one?” It is strictly possible to derive a MinMip representation from a MipRegionUsed one, but it would be rather cumbersome, so the API provides both. Non-streaming applications such as texture-space-shading rendering scenarios may choose to use MipRegionUsed, since details about exactly which mips were requested could be used to inform level-of-detail settings in rendering.
// HLSL new types (a given feedback map uses one of the two formats)
FeedbackTexture2D<SAMPLER_FEEDBACK_MIN_MIP> g_feedback : register(u3);
FeedbackTexture2D<SAMPLER_FEEDBACK_MIP_REGION_USED> g_feedback : register(u3);
// HLSL New Intrinsics
WriteSamplerFeedback
WriteSamplerFeedbackBias
WriteSamplerFeedbackGrad
WriteSamplerFeedbackLevel
Workflow:
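Per the spec, the four intrinsics mirror the familiar `Sample` variants: implicit LOD, biased LOD, explicit gradients, and explicit LOD. A sketch with illustrative resource names (the two implicit-derivative variants, `WriteSamplerFeedback` and `WriteSamplerFeedbackBias`, are pixel-shader-only):

```hlsl
// Sketch: the four feedback-writing intrinsics side by side.
Texture2D    g_tex  : register(t0);
SamplerState g_samp : register(s0);
FeedbackTexture2D<SAMPLER_FEEDBACK_MIN_MIP> g_fb : register(u3);

void RecordFeedback(float2 uv, float bias, float2 dx, float2 dy, float lod)
{
    g_fb.WriteSamplerFeedback(g_tex, g_samp, uv);             // implicit LOD
    g_fb.WriteSamplerFeedbackBias(g_tex, g_samp, uv, bias);   // implicit LOD + bias
    g_fb.WriteSamplerFeedbackGrad(g_tex, g_samp, uv, dx, dy); // explicit gradients
    g_fb.WriteSamplerFeedbackLevel(g_tex, g_samp, uv, lod);   // explicit LOD
}
```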
References
- https://microsoft.github.io/DirectX-Specs/d3d/SamplerFeedback.html#hlsl-constructs-for-writing-to-feedback-maps
- https://gpuopen.com/learn/texel-shading/
- http://amd-dev.wpengine.netdna-cdn.com/wordpress/media/2013/12/TexelShading_EG2016_AuthorVersion.pdf
- https://graphinesoftware.com/Texture-Space-Shading
- https://software.intel.com/content/dam/develop/external/us/en/documents/pdf/july-gdc-2021-sampler-feedback-texture-space-shading-direct-storage.pdf