Rendering pipeline references

Godot: https://godotengine.org/article/godot-3-renderer-design-explained

Another article that explains the Godot architecture well: http://bearcoder.codes/blog/2018/05/26/godot-1/

RENDERING SEQUENCE

Godot efficiently adds or removes steps according to how rendering is configured, so the list presented below should be interpreted as the "worst case scenario". As you will probably notice, because these effects are hardcoded into the rendering pipeline, they are extremely efficient; Godot with all effects enabled runs nicely on low-end hardware.

1. Depth buffer pre-pass

The list of geometry to be rendered is evaluated once, and the opaque render list is filled once. As most geometry is just opaque, Godot optimizes by assigning the same material and shader for all objects that don't use alpha scissor or write to VERTEX.
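A minimal sketch of what such a depth-only pre-pass can look like at the OpenGL level. The RenderItem struct, the render list, and depth_only_program are illustrative placeholders, not Godot's actual types:

    #include <vector>
    // Assumes an OpenGL loader header (e.g. glad or GLEW) has been included.

    struct RenderItem {
        GLuint vao;          // vertex array of the mesh surface
        GLsizei index_count; // number of indices to draw
        bool writes_vertex;  // shader writes VERTEX -> needs its own shader
        bool alpha_scissor;  // alpha-tested -> needs its own shader
    };

    void depth_prepass(const std::vector<RenderItem> &opaque_list, GLuint depth_only_program) {
        glEnable(GL_DEPTH_TEST);
        glDepthFunc(GL_LESS);
        glDepthMask(GL_TRUE);
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); // depth only, no color writes

        glUseProgram(depth_only_program); // one shared shader for "plain" opaque geometry
        for (const RenderItem &item : opaque_list) {
            if (item.writes_vertex || item.alpha_scissor)
                continue; // these need their own material shader; handled separately
            glBindVertexArray(item.vao);
            glDrawElements(GL_TRIANGLES, item.index_count, GL_UNSIGNED_INT, nullptr);
        }

        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE); // restore color writes
    }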

2. Light setup

All lights visible in a frame are configured and set up into UBO/TBO. Indices are assigned to each. In pure forward rendering, these indices are passed per object drawn. In clustered rendering, they are obtained via 3D texture + indices.
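A sketch of what uploading the visible lights to a UBO might look like. The LightData layout, the MAX_LIGHTS cap, and upload_lights are illustrative assumptions, not Godot's real structures:

    #include <vector>
    #include <algorithm>
    // Assumes an OpenGL loader header (e.g. glad or GLEW) has been included.

    // Hypothetical std140-style light record; real engines pad and pack differently.
    struct LightData {
        float position[4];   // xyz = world position, w = range
        float color[4];      // rgb = color * energy, a = specular amount
        float direction[4];  // xyz = direction (spot/directional), w = spot angle
        float params[4];     // attenuation, shadow index, etc.
    };

    static const int MAX_LIGHTS = 256; // illustrative cap

    // Upload all visible lights once per frame; each light's position in this
    // array is the index that objects (forward) or clusters (clustered) use later.
    void upload_lights(GLuint light_ubo, const std::vector<LightData> &visible_lights) {
        GLsizeiptr count = std::min<GLsizeiptr>(visible_lights.size(), MAX_LIGHTS);
        glBindBuffer(GL_UNIFORM_BUFFER, light_ubo);
        glBufferSubData(GL_UNIFORM_BUFFER, 0, count * sizeof(LightData), visible_lights.data());
        glBindBuffer(GL_UNIFORM_BUFFER, 0);
    }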

3. Opaque sort

Godot uses a 64-bit integer to identify drawing order. This integer mixes render priority, flags, material index, geometry index, and many other variables. Sorting by this key minimizes state changes in the render pass.
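To make the idea concrete, here is a sketch of packing and sorting such a key. The bit layout below is made up for illustration; Godot's actual layout differs:

    #include <cstdint>
    #include <algorithm>
    #include <vector>

    // Illustrative bit layout (not Godot's real one): higher bits dominate the sort.
    // | priority (8) | shader id (16) | material id (16) | geometry id (24) |
    uint64_t make_opaque_sort_key(uint8_t priority, uint16_t shader_id,
                                  uint16_t material_id, uint32_t geometry_id) {
        return (uint64_t(priority)    << 56) |
               (uint64_t(shader_id)   << 40) |
               (uint64_t(material_id) << 24) |
               (uint64_t(geometry_id & 0xFFFFFF));
    }

    struct DrawCommand { uint64_t key; int element_index; };

    // Sorting by the packed key groups identical shaders and materials together,
    // which minimizes state changes in the subsequent render pass.
    void sort_opaque(std::vector<DrawCommand> &commands) {
        std::sort(commands.begin(), commands.end(),
                  [](const DrawCommand &a, const DrawCommand &b) { return a.key < b.key; });
    }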

4. Opaque render pass

Opaque objects are rendered in a way where state changes (and calls to OpenGL) are minimized. In this pass, the MSAA buffers are used.
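A sketch of the resulting draw loop: because the list is sorted by the key above, shader and material bindings only change when the corresponding bits change. bind_shader, bind_material, and draw_item are placeholder hooks into the real renderer, and the types come from the sketches above:

    // Placeholder hooks; in a real renderer these issue the GL state changes and draw calls.
    void bind_shader(uint16_t shader_id);
    void bind_material(uint16_t material_id);
    void draw_item(const RenderItem &item);

    void render_opaque(const std::vector<DrawCommand> &sorted,
                       const std::vector<RenderItem> &items) {
        uint16_t current_shader = 0xFFFF, current_material = 0xFFFF;
        for (const DrawCommand &cmd : sorted) {
            uint16_t shader_id   = uint16_t((cmd.key >> 40) & 0xFFFF);
            uint16_t material_id = uint16_t((cmd.key >> 24) & 0xFFFF);
            if (shader_id != current_shader)     { bind_shader(shader_id);     current_shader = shader_id; }
            if (material_id != current_material) { bind_material(material_id); current_material = material_id; }
            draw_item(items[cmd.element_index]); // the actual draw call
        }
    }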

5. Sky rendering

If a panorama sky is present, it's rendered in this step.

6. Perform SSAO

The depth and normal buffers are resolved and set for reading. Godot processes full-screen SSAO, which is later saved to an R8 buffer, then ping-pong blurred using separable convolution.

Finally, the diffuse + ambient buffer is resolved, and ambient occlusion is applied based on the "ambient scale" component, which stores the ratio between diffuse and ambient light. On objects that provide their own baked AO, this value is hardcoded to 0 to prevent SSAO from interfering with them. This allows beautiful mixing between baked and real-time ambient occlusion.

Up to this point, keep in mind that only diffuse + ambient light has been resolved.
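The separable convolution mentioned above is simply a 2D Gaussian blur split into a horizontal and a vertical pass. A CPU-side sketch of the idea, with a plain float array standing in for the R8 AO buffer:

    #include <vector>
    #include <cmath>
    #include <algorithm>

    // Separable Gaussian blur: one horizontal pass into a temporary buffer,
    // then one vertical pass back ("ping-pong"). Cost is O(2k) taps per pixel
    // instead of O(k*k) for a full 2D kernel.
    void separable_blur(std::vector<float> &ao, int width, int height, int radius, float sigma) {
        std::vector<float> kernel(2 * radius + 1);
        float sum = 0.0f;
        for (int i = -radius; i <= radius; ++i) {
            kernel[i + radius] = std::exp(-(i * i) / (2.0f * sigma * sigma));
            sum += kernel[i + radius];
        }
        for (float &w : kernel) w /= sum; // normalize

        std::vector<float> tmp(ao.size());
        for (int y = 0; y < height; ++y)          // horizontal pass
            for (int x = 0; x < width; ++x) {
                float acc = 0.0f;
                for (int i = -radius; i <= radius; ++i) {
                    int sx = std::clamp(x + i, 0, width - 1);
                    acc += ao[y * width + sx] * kernel[i + radius];
                }
                tmp[y * width + x] = acc;
            }
        for (int y = 0; y < height; ++y)          // vertical pass
            for (int x = 0; x < width; ++x) {
                float acc = 0.0f;
                for (int i = -radius; i <= radius; ++i) {
                    int sy = std::clamp(y + i, 0, height - 1);
                    acc += tmp[sy * width + x] * kernel[i + radius];
                }
                ao[y * width + x] = acc;
            }
    }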

7. Perform Subsurface Scattering

A separable convolution is performed to blur the diffuse + ambient buffers, using the subsurface scattering buffer as a reference for where the effect is applied. This effect looks correct in Godot because it is only applied to diffuse and ambient light.

Again, keep in mind that only diffuse + ambient light has been resolved here.

 

8. Perform Screen Space Reflection

Screen space reflection is performed at half resolution and stored in a half-resolution buffer. Godot uses an accurate raymarching approach for this, using Bresenham-style line drawing to avoid processing redundant pixels. For roughness, a blurred version of the source scene is used. This approach is not strictly correct, but it's pretty fast.

Keep in mind that only diffuse lighting is used as the source for SSR; otherwise, specular blobs would be wrongly reflected, which looks really bad.

Finally, the specular buffer is resolved and mixed with SSR depending on the metallic amount. This is not completely correct, as it can make some specular blobs lose strength, but in practice it looks fine.
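A simplified sketch of screen-space raymarching against a depth buffer. This is a generic per-pixel march, not Godot's exact Bresenham implementation, and it assumes linear depth values that grow away from the camera:

    #include <vector>
    #include <cmath>

    // March from 'start' toward 'end' in screen space, one pixel-sized step at a
    // time, and stop when the ray dips behind the depth buffer. Returns true and
    // the hit pixel on success.
    bool screen_space_march(const std::vector<float> &depth, int width, int height,
                            float start_x, float start_y, float start_z,
                            float end_x, float end_y, float end_z,
                            int &hit_x, int &hit_y) {
        float dx = end_x - start_x, dy = end_y - start_y, dz = end_z - start_z;
        int steps = int(std::fmax(std::fabs(dx), std::fabs(dy))); // ~one step per pixel
        if (steps == 0) return false;
        float sx = dx / steps, sy = dy / steps, sz = dz / steps;

        float x = start_x, y = start_y, z = start_z;
        for (int i = 0; i < steps; ++i) {
            x += sx; y += sy; z += sz;
            int px = int(x), py = int(y);
            if (px < 0 || py < 0 || px >= width || py >= height) return false; // left the screen
            float scene_depth = depth[py * width + px];
            if (z > scene_depth) { hit_x = px; hit_y = py; return true; }      // ray went behind geometry
        }
        return false;
    }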

 

9. Compositing

Both specular and diffuse are mixed back, obtaining the final image for opaque materials. If Godot detects that any shader reads from the screen, it will generate mipmaps of this composition and store it in the effect buffer, applying a Gaussian blur at each level. This allows reading a blurred version of the screen, which works great for materials that fake rough refraction.

Finally, everything is also copied back into the diffuse + ambient buffer to continue rendering.

10. Transparent sort

Objects that were sent to the transparency render list are sorted by depth, from back to front. This rendering step happens on the diffuse + ambient MSAA buffer alone.
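Back-to-front ordering is a plain depth sort. A minimal sketch, assuming each transparent element stores its view-space depth (TransparentItem is an illustrative type):

    #include <vector>
    #include <algorithm>

    struct TransparentItem {
        float view_depth;   // distance from the camera along the view direction
        int element_index;  // index into the engine's element list (illustrative)
    };

    // Farthest objects are drawn first so nearer transparent surfaces blend over them.
    void sort_back_to_front(std::vector<TransparentItem> &items) {
        std::sort(items.begin(), items.end(),
                  [](const TransparentItem &a, const TransparentItem &b) {
                      return a.view_depth > b.view_depth;
                  });
    }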

11. Transparent rendering

Finally, objects that use transparent materials are all rendered to the diffuse + ambient MSAA buffer. More state changes happen, so this is obviously a less efficient pass.

12. Resolve

The final image is again resolved into the effects buffer.

13. DOF blur, FAR

A two-pass separable convolution takes place to blur objects beyond a specific depth, with interpolation during the transition area.
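The interpolation amounts to a blur weight that ramps from 0 to 1 across the transition range; the blurred and sharp images are then mixed per pixel using this weight. A sketch with illustrative parameter names (transition is assumed to be greater than zero):

    #include <algorithm>

    // Far DOF blur weight: 0 before 'far_begin', ramping up to 1 over 'transition'.
    float far_dof_amount(float depth, float far_begin, float transition) {
        return std::clamp((depth - far_begin) / transition, 0.0f, 1.0f);
    }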

 

14. DOF blur, NEAR

Again, a two-pass separable convolution takes place to blur objects closer than a specific depth, with interpolation during the transition area. The interesting part of this algorithm is that it is actually done in two passes, without any extra buffers, in a somewhat novel and unique way. The quality is not the best, but the difference is hardly noticeable and the performance gains are big.

 

15. Exposure

The image is converted to luminance and repeatedly shrunk by a factor of 3x3 until it's a single pixel. For auto exposure, this value is used as the interpolation target.
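A sketch of that auto-exposure loop: average the luminance down to one value, then ease the current exposure toward the value it implies. The 3x3 reduction matches the description above, while the target formula and smoothing constant are illustrative:

    #include <vector>
    #include <cmath>
    #include <algorithm>

    // Reduce a luminance image by averaging 3x3 blocks until one value remains.
    float average_luminance(std::vector<float> lum, int width, int height) {
        while (width > 1 || height > 1) {
            int nw = std::max(1, (width + 2) / 3), nh = std::max(1, (height + 2) / 3);
            std::vector<float> next(nw * nh, 0.0f);
            for (int y = 0; y < nh; ++y)
                for (int x = 0; x < nw; ++x) {
                    float acc = 0.0f; int count = 0;
                    for (int sy = y * 3; sy < std::min(height, y * 3 + 3); ++sy)
                        for (int sx = x * 3; sx < std::min(width, x * 3 + 3); ++sx) {
                            acc += lum[sy * width + sx]; ++count;
                        }
                    next[y * nw + x] = acc / count;
                }
            lum = std::move(next); width = nw; height = nh;
        }
        return lum[0];
    }

    // Each frame, ease the exposure toward the value implied by the average
    // luminance; 'speed' and the target formula are illustrative, not Godot's.
    float update_exposure(float current, float avg_lum, float target_gray, float speed, float dt) {
        float target = target_gray / std::max(avg_lum, 1e-4f);
        return current + (target - current) * (1.0f - std::exp(-speed * dt));
    }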

16. Bloom

The image in the effect buffer is also Gaussian blurred using the mipmap buffers. Godot allows combining up to 8 mipmap levels to go from a tight glow to a broad blurred bloom. Mipmaps are also upsampled using bicubic filtering for better (less blocky) quality.
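For reference, bicubic upsampling boils down to weighting a 4x4 neighborhood with cubic weights. A generic cubic B-spline weight function is sketched below; Godot's shader may differ in the details:

    // Cubic B-spline weights for the four taps around a sample, given the
    // fractional offset t in [0, 1). The four weights always sum to 1.
    void bspline_weights(float t, float &w0, float &w1, float &w2, float &w3) {
        float t2 = t * t, t3 = t2 * t;
        w0 = (1.0f - 3.0f * t + 3.0f * t2 - t3) / 6.0f;
        w1 = (4.0f - 6.0f * t2 + 3.0f * t3) / 6.0f;
        w2 = (1.0f + 3.0f * t + 3.0f * t2 - 3.0f * t3) / 6.0f;
        w3 = t3 / 6.0f;
    }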

 

17. Final compositing

In a single, final compositing stage, the following happens:

  • Bloom stage application
  • Exposure
  • Tonemapping
  • Linear -> sRGB conversion
  • Final adjustments (BCS, color correction, etc.)
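A compact sketch of that final chain applied to a single RGB value. The Reinhard tonemap and the BCS formulas here are illustrative stand-ins for whichever operators are actually selected:

    #include <cmath>
    #include <algorithm>

    struct Color { float r, g, b; };

    static float to_srgb(float c) { // linear -> sRGB transfer function
        return c <= 0.0031308f ? 12.92f * c : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
    }

    // Final composite for one pixel: bloom add, exposure, tonemap, sRGB, BCS.
    Color final_composite(Color scene, Color bloom, float bloom_strength,
                          float exposure, float brightness, float contrast, float saturation) {
        float rgb[3] = { scene.r + bloom.r * bloom_strength,
                         scene.g + bloom.g * bloom_strength,
                         scene.b + bloom.b * bloom_strength };
        for (float &c : rgb) {
            c *= exposure;                                  // exposure
            c = c / (1.0f + c);                             // Reinhard tonemap (illustrative choice)
            c = to_srgb(c);                                 // linear -> sRGB
            c = (c * brightness - 0.5f) * contrast + 0.5f;  // brightness / contrast
        }
        // Saturation: pull each channel toward (or away from) the pixel's luma.
        float luma = 0.299f * rgb[0] + 0.587f * rgb[1] + 0.114f * rgb[2];
        for (float &c : rgb)
            c = std::clamp(luma + (c - luma) * saturation, 0.0f, 1.0f);
        return { rgb[0], rgb[1], rgb[2] };
    }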

And that's it!

Unreal: https://unrealartoptimization.github.io/book/pipelines/

Engine’s rendering pipeline

The engine’s pipeline includes both CPU and GPU-based operations. It includes the whole setup of multiple passes mentioned before. The chapter about passes lists over 20 of them done by Unreal Engine every frame. The resulting information is provided as an input for subsequent passes’ shaders.

There's much more to Unreal's pipeline, though. For example, occlusion culling takes care of discarding meshes that are not visible from the camera's perspective. It's done at an early stage of the pipeline, before the meshes are even sent to be processed by shaders. The positions and properties of light sources are provided to the appropriate passes, depending on the choice between deferred and forward shading.

Only after the last pass is finished can the image be displayed on the screen. If vertical sync (VSync) is enabled, the image may also be delayed or discarded to achieve the required frame rate (for example, 60 frames per second).

 

A good reference on Unreal rendering: https://neil3d.github.io/assets/pdf/2016-vr-summit-ue4.pdf
