Post Processing Bloom

https://catlikecoding.com/unity/tutorials/custom-srp/post-processing/
Git reference: https://bitbucket.org/catlikecodingunitytutorials/custom-srp-11-post-processing/src/master/

most of the time a rendered image is not displayed as-is. the image is post-processed, getting various effects (FX for short) applied to it. common FX include bloom, color grading, depth-of-field, motion blur, and tone mapping.
these fx are applied as a stack, one on top of the other.
in this tutorial we will create a simple post-FX stack that initially only supports bloom.
1.1 settings asset
a project could require multiple post-FX stack configurations, so we begin by creating a PostFXSettings asset type to store the settings for a stack.

using UnityEngine;

[CreateAssetMenu(menuName = "Rendering/Custom Post FX Settings")]
public class PostFXSettings : ScriptableObject { }

we will use a single stack in this tutorial, which we will make available to the RP by adding a configuration option for it to CustomRenderPipelineAsset, which passes it to the RP's constructor.

[SerializeField]
	PostFXSettings postFXSettings = default;

	protected override RenderPipeline CreatePipeline () {
		return new CustomRenderPipeline(
			useDynamicBatching, useGPUInstancing, useSRPBatcher,
			useLightsPerObject, shadows, postFXSettings
		);
	}
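
On the pipeline side, CustomRenderPipeline has to store these settings and forward them to the camera renderer; that is where the postFXSettings used by CameraRenderer below comes from. A sketch of that plumbing, assuming the per-camera Render loop and renderer field from the earlier tutorials (not shown in this post, so names are illustrative):

PostFXSettings postFXSettings;

	public CustomRenderPipeline (
		bool useDynamicBatching, bool useGPUInstancing, bool useSRPBatcher,
		bool useLightsPerObject, ShadowSettings shadowSettings,
		PostFXSettings postFXSettings
	) {
		this.postFXSettings = postFXSettings;
		…
	}

	protected override void Render (
		ScriptableRenderContext context, Camera[] cameras
	) {
		foreach (Camera camera in cameras) {
			// hand the post-FX settings to the camera renderer along with
			// everything it already receives
			renderer.Render(
				context, camera, useDynamicBatching, useGPUInstancing,
				useLightsPerObject, shadowSettings, postFXSettings
			);
		}
	}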

1.2 stack object
we will use the same approach for the stack that we used for Lighting and Shadows.
we create a class for it that keeps track of a buffer, context, camera, and post-FX settings, with a public Setup method to initialize them.

using UnityEngine;
using UnityEngine.Rendering;

public class PostFXStack {

	const string bufferName = "Post FX";

	CommandBuffer buffer = new CommandBuffer {
		name = bufferName
	};

	ScriptableRenderContext context;
	
	Camera camera;

	PostFXSettings settings;

	public void Setup (
		ScriptableRenderContext context, Camera camera, PostFXSettings settings
	) {
		this.context = context;
		this.camera = camera;
		this.settings = settings;
	}
}

next, add a public property to indicate whether the stack is active, which is only the case if there are settings for it.
the idea is that if no settings are provided post-processing should be skipped.

public bool IsActive => settings != null;

and the last part we need is a public Render method that renders the stack.
applying an effect to the entire image is done by simply drawing a rectangle that covers the entire image,
using an appropriate shader.
right now we have no shader, so we will simply copy whatever’s rendered up to this point to the camera’s frame buffer.
that can be done by invoking Blit on the command buffer, passing it identifiers for the source and destination.

these identifiers can be provided in multiple formats.
we will use an integer for a source, for which we will add a parameter, and BuiltinRenderTextureType.CameraTarget
for the destination. then we execute and clear the buffer.

public void Render (int sourceId) {
		buffer.Blit(sourceId, BuiltinRenderTextureType.CameraTarget);
		context.ExecuteCommandBuffer(buffer);
		buffer.Clear();
	}

1.3 using the stack
CameraRenderer now needs a stack instance and has to invoke Setup on it in Render, just like it does for its Lighting object.

Lighting lighting = new Lighting();

	PostFXStack postFXStack = new PostFXStack();

	public void Render () {
		…
		lighting.Setup(
			context, cullingResults, shadowSettings, useLightsPerObject
		);
		postFXStack.Setup(context, camera, postFXSettings);
		buffer.EndSample(SampleName);
		Setup();
		…
	}

up to this point we always rendered directly to the camera's frame buffer, which is either the one used for the display or a configured render texture.
we have no direct control over those and are only supposed to write to them.
so to provide a source texture for an active stack we have to use a render texture as an intermediate frame buffer for the camera.
Getting one and setting it as the render target works like for shadow maps, except that we’ll use the RenderTextureFormat.Default format. Do this before we clear the render target.

static int frameBufferId = Shader.PropertyToID("_CameraFrameBuffer");

void Setup () {
		context.SetupCameraProperties(camera);
		CameraClearFlags flags = camera.clearFlags;

		if (postFXStack.IsActive) {
			buffer.GetTemporaryRT(
				frameBufferId, camera.pixelWidth, camera.pixelHeight,
				32, FilterMode.Bilinear, RenderTextureFormat.Default
			);
			buffer.SetRenderTarget(
				frameBufferId,
				RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store
			);
		}

		buffer.ClearRenderTarget(…);
		buffer.BeginSample(SampleName);
		ExecuteBuffer();
	}

also add a Cleanup method to release the texture if we have an active stack.

void Cleanup () {
		lighting.Cleanup();
		if (postFXStack.IsActive) {
			buffer.ReleaseTemporaryRT(frameBufferId);
		}
	}

invoke Cleanup at the end of Render, before submitting. Directly before that, render the stack if it is active.

public void Render () {
		…
		DrawGizmos();
		if (postFXStack.IsActive) {
			postFXStack.Render(frameBufferId);
		}
		Cleanup();
		//lighting.Cleanup();
		Submit();
	}

1.4 force clearing
when drawing to an intermediate frame buffer we render to a texture filled with arbitrary data.
you can see this when the frame debugger is active. unity makes sure the frame debugger gets a clear frame buffer at
the start of each frame, but we sidestep this when rendering to our own texture.
it usually results in us drawing on top of the previous frame’s result, but this is not guaranteed.
this does not matter if the camera’s Clear Flags is set to the sky box or a solid color, as we are guaranteed to
completely cover the previous data.
but the other two options, depth-only and don't-clear, do not work.

to prevent random results, when a stack is active always clear depth and also clear color unless a sky box is used.

CameraClearFlags flags = camera.clearFlags;

		if (postFXStack.IsActive) {
			if (flags > CameraClearFlags.Color) {
				flags = CameraClearFlags.Color;
			}
			…
		}

		buffer.ClearRenderTarget(…);
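
For reference, the ClearRenderTarget call elided above is the one set up in the earlier camera tutorial; it likely looks as follows with the adjusted flags (a sketch, since this post never shows it):

		// clear depth unless nothing should be cleared; clear color only for
		// the solid-color flag, using the camera's background in linear space
		buffer.ClearRenderTarget(
			flags <= CameraClearFlags.Depth,
			flags == CameraClearFlags.Color,
			flags == CameraClearFlags.Color ?
				camera.backgroundColor.linear : Color.clear
		);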

1.6 custom drawing
the Blit method that we currently use draws a quad mesh (two triangles) that covers the entire screen space.
but we could get the same result by drawing only a single triangle, which is a bit less work.
we do not even need to send a single-triangle mesh to the GPU; we can generate it procedurally.

create a PostFXStackPasses.hlsl file in our RP’s shaders folder.
we will put all passes of our stack in there.
the first thing we will define in it is the Varyings struct, which only needs to contain the clip-space position and FX UV coordinates.

#ifndef CUSTOM_POST_FX_PASSES_INCLUDED
#define CUSTOM_POST_FX_PASSES_INCLUDED

struct Varyings {
	float4 positionCS : SV_POSITION;
	float2 fxUV : VAR_FX_UV;
};

#endif

next, create a default vertex pass, with only a vertex identifier as a parameter.
it is an unsigned integer——uint——with the SV_VertexID semantic.
Use the ID to generate the vertex position and UV coordinates. The X coordinates are −1, −1, 3. The Y coordinates are −1, 3, −1. To make the visible UV coordinates cover the 0–1 range use 0, 0, 2 for U and 0, 2, 0 for V.

Varyings DefaultPassVertex (uint vertexID : SV_VertexID) {
	Varyings output;
	output.positionCS = float4(
		vertexID <= 1 ? -1.0 : 3.0,
		vertexID == 1 ? 3.0 : -1.0,
		0.0, 1.0
	);
	output.fxUV = float2(
		vertexID <= 1 ? 0.0 : 2.0,
		vertexID == 1 ? 2.0 : 0.0,
	);
	return output;
}
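
On the C# side nothing has to be uploaded for this triangle: the stack's Draw method in the full listing further below issues it procedurally, passing a vertex count of three so the vertex IDs 0, 1, and 2 reach the function above. In essence:

	// no mesh: the GPU invokes DefaultPassVertex three times with
	// SV_VertexID 0, 1, and 2, producing one clip-space triangle
	buffer.SetGlobalTexture(fxSourceId, from);
	buffer.SetRenderTarget(
		to, RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store
	);
	buffer.DrawProcedural(
		Matrix4x4.identity, settings.Material, (int)pass,
		MeshTopology.Triangles, 3
	);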

2 bloom
the bloom post effect is used to make things glow.
this has a basis in physics, but the classical bloom effect is artistic rather than realistic.
the nonrealistic bloom is very obvious and is thus a good effect to demonstrate that our post FX stack works. we will look at more realistic bloom in the next tutorial, when we will cover HDR rendering. for now we will aim for an LDR bloom glow effect.

2.1 bloom Pyramid
bloom represents the scattering of color, which can be done by blurring the image.
bright pixels will bleed into adjacent darker pixels and thus appear to glow.
the simplest and fastest way to blur a texture is by copying it to another texture that has half the width and height. each sample of the copy pass ends up sampling in between four source pixels.
with bilinear filtering this averages blocks of 2×2 pixels.
Bilinear downsampling 4×4 to 2×2.

Doing this a single time only blurs a little. So we repeat this process, progressively downsampling until a desired level, effectively building a pyramid of textures.
Pyramid with four textures, halving dimensions each level.

we need to keep track of the textures in the stack, but how many there are depends on how many levels there are in the pyramid, which depends on the source image size.
let us define a maximum of sixteen levels in PostFXStack, which would be enough to scale a 65536x65536 texture all the way down to a single pixel.

const int maxBloomPyramidLevels = 16;

to keep track of the textures in the pyramid we need texture identifiers.
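
Rather than storing an array of sixteen (later thirty-two) identifiers, the constructor in the full PostFXStack listing below exploits the fact that Shader.PropertyToID allocates IDs sequentially per session, so only the first one has to be kept and the rest can be addressed by offset:

	int bloomPyramidId;

	public PostFXStack () {
		bloomPyramidId = Shader.PropertyToID("_BloomPyramid0");
		// the remaining names are requested in order so their IDs directly
		// follow bloomPyramidId; two per level once the horizontal and
		// vertical blur passes each get their own texture
		for (int i = 1; i < maxBloomPyramidLevels * 2; i++) {
			Shader.PropertyToID("_BloomPyramid" + i);
		}
	}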

2.7 Threshold
Bloom is often used artistically to make only some things glow, but our effect currently applies to everything, no matter how bright it is. Although it makes no physical sense, we can limit what contributes to the effect by introducing a brightness threshold.

we can not suddenly eliminate colors from the effect as that would introduce sharp boundaries where a gradual transition is expected.
instead we will multiply the color by a weight

	w = max(0, b - t) / max(b, 0.00001)

where b is its brightness and t is the configured threshold.
we will use the maximum of the color’s RGB channels for b.
The result is always 1 when the threshold is zero, which leaves the color unchanged. As the threshold increases, the weight curve will bend downward so it becomes zero where b≤t. Because of the curve’s shape, it’s known as a knee curve.
Thresholds 0.25, 0.5, 0.75, and 1.

this curve reaches zero at an angle, which means that although the transition is smoother than a clamp there is still an abrupt cutoff point. this is why it is also known as a hard knee. we can control the shape of the knee by changing the weight to

	w = max(s, b - t) / max(b, 0.00001)

with

	s = clamp(b - t + tk, 0, 2tk)² / (4tk + 0.00001)

where tk = t·k and k is a 0–1 knee slider.
Threshold 1 with knee 0, 0.25, 0.5, 0.75, and 1.
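
As a concrete check of how these values are packed into the _BloomThreshold vector in DoBloom below (ignoring the tiny epsilon and the gamma-to-linear conversion of t): with threshold t = 1 and knee k = 0.5 we get tk = 0.5, so the vector (t, tk - t, 2tk, 0.25 / tk) = (1, -0.5, 1, 0.5). A pixel with brightness b = 1.5 then gets s = clamp(b - t + tk, 0, 2tk)² · 0.5 = 0.5 and weight max(s, b - t) / b = 0.5 / 1.5 ≈ 0.33, while a pixel at b = 0.5 gets weight 0.

The complete PostFXStackPasses.hlsl listing: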

#ifndef CUSTOM_POST_FX_PASSES_INCLUDED
#define CUSTOM_POST_FX_PASSES_INCLUDED

#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Filtering.hlsl"

TEXTURE2D(_PostFXSource);
TEXTURE2D(_PostFXSource2);
SAMPLER(sampler_linear_clamp);

float4 _PostFXSource_TexelSize;

float4 GetSourceTexelSize () {
	return _PostFXSource_TexelSize;
}

float4 GetSource(float2 fxUV) {
	return SAMPLE_TEXTURE2D(_PostFXSource, sampler_linear_clamp, fxUV);
}

float4 GetSourceBicubic (float2 fxUV) {
	return SampleTexture2DBicubic(
		TEXTURE2D_ARGS(_PostFXSource, sampler_linear_clamp), fxUV,
		_PostFXSource_TexelSize.zwxy, 1.0, 0.0
	);
}

float4 GetSource2(float2 fxUV) {
	return SAMPLE_TEXTURE2D(_PostFXSource2, sampler_linear_clamp, fxUV);
}

struct Varyings {
	float4 positionCS : SV_POSITION;
	float2 fxUV : VAR_FX_UV;
};

Varyings DefaultPassVertex (uint vertexID : SV_VertexID) {
	Varyings output;
	output.positionCS = float4(
		vertexID <= 1 ? -1.0 : 3.0,
		vertexID == 1 ? 3.0 : -1.0,
		0.0, 1.0
	);
	output.fxUV = float2(
		vertexID <= 1 ? 0.0 : 2.0,
		vertexID == 1 ? 2.0 : 0.0
	);
	if (_ProjectionParams.x < 0.0) {
		output.fxUV.y = 1.0 - output.fxUV.y;
	}
	return output;
}

bool _BloomBicubicUpsampling;
float _BloomIntensity;

float4 BloomCombinePassFragment (Varyings input) : SV_TARGET {
	float3 lowRes;
	if (_BloomBicubicUpsampling) {
		lowRes = GetSourceBicubic(input.fxUV).rgb;
	}
	else {
		lowRes = GetSource(input.fxUV).rgb;
	}
	float3 highRes = GetSource2(input.fxUV).rgb;
	return float4(lowRes * _BloomIntensity + highRes, 1.0);
}

float4 BloomHorizontalPassFragment (Varyings input) : SV_TARGET {
	float3 color = 0.0;
	float offsets[] = {
		-4.0, -3.0, -2.0, -1.0, 0.0, 1.0, 2.0, 3.0, 4.0
	};
	float weights[] = {
		0.01621622, 0.05405405, 0.12162162, 0.19459459, 0.22702703,
		0.19459459, 0.12162162, 0.05405405, 0.01621622
	};
	for (int i = 0; i < 9; i++) {
		float offset = offsets[i] * 2.0 * GetSourceTexelSize().x;
		color += GetSource(input.fxUV + float2(offset, 0.0)).rgb * weights[i];
	}
	return float4(color, 1.0);
}

float4 _BloomThreshold;

float3 ApplyBloomThreshold (float3 color) {
	float brightness = Max3(color.r, color.g, color.b);
	float soft = brightness + _BloomThreshold.y;
	soft = clamp(soft, 0.0, _BloomThreshold.z);
	soft = soft * soft * _BloomThreshold.w;
	float contribution = max(soft, brightness - _BloomThreshold.x);
	contribution /= max(brightness, 0.00001);
	return color * contribution;
}

float4 BloomPrefilterPassFragment (Varyings input) : SV_TARGET {
	float3 color = ApplyBloomThreshold(GetSource(input.fxUV).rgb);
	return float4(color, 1.0);
}

float4 BloomVerticalPassFragment (Varyings input) : SV_TARGET {
	float3 color = 0.0;
	float offsets[] = {
		-3.23076923, -1.38461538, 0.0, 1.38461538, 3.23076923
	};
	float weights[] = {
		0.07027027, 0.31621622, 0.22702703, 0.31621622, 0.07027027
	};
	for (int i = 0; i < 5; i++) {
		float offset = offsets[i] * GetSourceTexelSize().y;
		color += GetSource(input.fxUV + float2(0.0, offset)).rgb * weights[i];
	}
	return float4(color, 1.0);
}

float4 CopyPassFragment (Varyings input) : SV_TARGET {
	return GetSource(input.fxUV);
}

#endif

PostFXStack.cs (full listing):

using UnityEngine;
using UnityEngine.Rendering;

public partial class PostFXStack {

	enum Pass {
		BloomCombine,
		BloomHorizontal,
		BloomPrefilter,
		BloomVertical,
		Copy
	}

	const string bufferName = "Post FX";

	const int maxBloomPyramidLevels = 16;

	int
		bloomBucibicUpsamplingId = Shader.PropertyToID("_BloomBicubicUpsampling"),
		bloomIntensityId = Shader.PropertyToID("_BloomIntensity"),
		bloomPrefilterId = Shader.PropertyToID("_BloomPrefilter"),
		bloomThresholdId = Shader.PropertyToID("_BloomThreshold"),
		fxSourceId = Shader.PropertyToID("_PostFXSource"),
		fxSource2Id = Shader.PropertyToID("_PostFXSource2");

	CommandBuffer buffer = new CommandBuffer {
		name = bufferName
	};

	ScriptableRenderContext context;

	Camera camera;

	PostFXSettings settings;

	int bloomPyramidId;

	public bool IsActive => settings != null;

	public PostFXStack () {
		bloomPyramidId = Shader.PropertyToID("_BloomPyramid0");
		for (int i = 1; i < maxBloomPyramidLevels * 2; i++) {
			Shader.PropertyToID("_BloomPyramid" + i);
		}
	}

	public void Setup (
		ScriptableRenderContext context, Camera camera, PostFXSettings settings
	) {
		this.context = context;
		this.camera = camera;
		this.settings =
			camera.cameraType <= CameraType.SceneView ? settings : null;
		ApplySceneViewState();
	}

	public void Render (int sourceId) {
		DoBloom(sourceId);
		context.ExecuteCommandBuffer(buffer);
		buffer.Clear();
	}

	void DoBloom (int sourceId) {
		buffer.BeginSample("Bloom");
		PostFXSettings.BloomSettings bloom = settings.Bloom;
		int width = camera.pixelWidth / 2, height = camera.pixelHeight / 2;
		
		if (
			bloom.maxIterations == 0 || bloom.intensity <= 0f ||
			height < bloom.downscaleLimit * 2 || width < bloom.downscaleLimit * 2
		) {
			Draw(sourceId, BuiltinRenderTextureType.CameraTarget, Pass.Copy);
			buffer.EndSample("Bloom");
			return;
		}

		Vector4 threshold;
		threshold.x = Mathf.GammaToLinearSpace(bloom.threshold);
		threshold.y = threshold.x * bloom.thresholdKnee;
		threshold.z = 2f * threshold.y;
		threshold.w = 0.25f / (threshold.y + 0.00001f);
		threshold.y -= threshold.x;
		buffer.SetGlobalVector(bloomThresholdId, threshold);

		RenderTextureFormat format = RenderTextureFormat.Default;
		buffer.GetTemporaryRT(
			bloomPrefilterId, width, height, 0, FilterMode.Bilinear, format
		);
		Draw(sourceId, bloomPrefilterId, Pass.BloomPrefilter);
        width /= 2;
        height /= 2;

        int fromId = bloomPrefilterId, toId = bloomPyramidId + 1;
        int i;
        for (i = 0; i < bloom.maxIterations; i++)
        {
            if (height < bloom.downscaleLimit || width < bloom.downscaleLimit)
            {
                break;
            }
            int midId = toId - 1;
            buffer.GetTemporaryRT(
                midId, width, height, 0, FilterMode.Bilinear, format
            );
            buffer.GetTemporaryRT(
                toId, width, height, 0, FilterMode.Bilinear, format
            );
            Draw(fromId, midId, Pass.BloomHorizontal);
            Draw(midId, toId, Pass.BloomVertical);
            fromId = toId;
            toId += 2;
            width /= 2;
            height /= 2;
        }

        buffer.ReleaseTemporaryRT(bloomPrefilterId);
        buffer.SetGlobalFloat(
            bloomBucibicUpsamplingId, bloom.bicubicUpsampling ? 1f : 0f
        );
        buffer.SetGlobalFloat(bloomIntensityId, 1f);
        if (i > 1)
        {
            buffer.ReleaseTemporaryRT(fromId - 1);
            toId -= 5;
            for (i -= 1; i > 0; i--)
            {
                buffer.SetGlobalTexture(fxSource2Id, toId + 1);
                Draw(fromId, toId, Pass.BloomCombine);
                buffer.ReleaseTemporaryRT(fromId);
                buffer.ReleaseTemporaryRT(toId + 1);
                fromId = toId;
                toId -= 2;
            }
        }
        else
        {
            buffer.ReleaseTemporaryRT(bloomPyramidId);
        }
        buffer.SetGlobalFloat(bloomIntensityId, bloom.intensity);
        buffer.SetGlobalTexture(fxSource2Id, sourceId);
        Draw(fromId, BuiltinRenderTextureType.CameraTarget, Pass.BloomCombine);
        buffer.ReleaseTemporaryRT(fromId);
        buffer.EndSample("Bloom");
	}

	void Draw (
		RenderTargetIdentifier from, RenderTargetIdentifier to, Pass pass
	) {
		buffer.SetGlobalTexture(fxSourceId, from);
		buffer.SetRenderTarget(
			to, RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store
		);
		buffer.DrawProcedural(
			Matrix4x4.identity, settings.Material, (int)pass,
			MeshTopology.Triangles, 3
		);
	}
}
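
One dependency the listing above does not show: Draw uses settings.Material, which the empty PostFXSettings asset from section 1.1 does not provide yet. A sketch of how PostFXSettings can expose a material created on demand from a serialized shader reference, following the tutorial's approach of keeping the material out of the saved asset:

using UnityEngine;

[CreateAssetMenu(menuName = "Rendering/Custom Post FX Settings")]
public class PostFXSettings : ScriptableObject {

	[SerializeField]
	Shader shader = default;

	// created on demand and hidden so it is never saved with the asset
	[System.NonSerialized]
	Material material;

	public Material Material {
		get {
			if (material == null && shader != null) {
				material = new Material(shader);
				material.hideFlags = HideFlags.HideAndDontSave;
			}
			return material;
		}
	}
}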

2.8 intensity
we wrap up this tutorial by adding an intensity slider to control the overall strength of the bloom. we will not give it a limit, so it's possible to blow out the entire image if desired.

[Min(0f)]
public float intensity;
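
This field lives inside the bloom settings block of PostFXSettings, which this post never shows in full. A sketch inferred from the members DoBloom reads via settings.Bloom (maxIterations, downscaleLimit, bicubicUpsampling, threshold, thresholdKnee, intensity); the exact attribute ranges are assumptions:

	[System.Serializable]
	public struct BloomSettings {

		[Range(0f, 16f)]
		public int maxIterations;

		[Min(1f)]
		public int downscaleLimit;

		public bool bicubicUpsampling;

		[Min(0f)]
		public float threshold;

		[Range(0f, 1f)]
		public float thresholdKnee;

		[Min(0f)]
		public float intensity;
	}

	[SerializeField]
	BloomSettings bloom = default;

	public BloomSettings Bloom => bloom;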

if the intensity is set to zero we can skip bloom, so check for that at the start of DoBloom.

if (
			bloom.maxIterations == 0 || bloom.intensity <= 0f ||
			height < bloom.downscaleLimit * 2 || width < bloom.downscaleLimit * 2
		) {
			Draw(sourceId, BuiltinRenderTextureType.CameraTarget, Pass.Copy);
			buffer.EndSample("Bloom");
			return;
		}

Otherwise pass the intensity to the GPU, using a new identifier for _BloomIntensity. We’ll use it to weight the low-resolution image during the combine pass, so we don’t need to create an extra pass. Set it to 1 for all draws except the final draw to the camera target.

buffer.SetGlobalFloat(bloomIntensityId, 1f);
		if (i > 1) {
			…
		}
		else {
			buffer.ReleaseTemporaryRT(bloomPyramidId);
		}
		buffer.SetGlobalFloat(bloomIntensityId, bloom.intensity);
		buffer.SetGlobalTexture(fxSource2Id, sourceId);
		Draw(fromId, BuiltinRenderTextureType.CameraTarget, Pass.BloomCombine);

now we just need to multiply the low-resolution color in BloomCombinePassFragment with the intensity.

bool _BloomBicubicUpsampling;
float _BloomIntensity;

float4 BloomCombinePassFragment (Varyings input) : SV_TARGET {
	…
	return float4(lowRes * _BloomIntensity + highRes, 1.0);
}