Compute Shader basics and an overdraw reference

This post takes a closer look at Compute Shaders in Unity, explains the thread-group concept, and shows a practical application: measuring overdraw. It walks through a Compute Shader based tool that collects overdraw information for the scene, analyses its precision error and where that error comes from, and also shows how to use a Compute Shader to convert an image to grayscale. It is aimed at developers who want to learn and use Unity Compute Shaders.

Understanding the basics

References

A quick read-through of the compute shader that Unity generates by default

A deeper dive into how it works and into the thread-group concept, with many impressive examples

A decent explanation
Another decent explanation

Notes

A single thread group can contain at most 1024 threads; any workload larger than that has to be split across multiple thread groups when you call Dispatch.
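A minimal sketch of that math on the C# side, assuming a kernel named CSMain declared with [numthreads(32, 32, 1)] (exactly the 1024-thread maximum per group) that writes to a texture bound as "Result" (these names are placeholders, not from any of the linked posts):

using UnityEngine;

public class DispatchExample : MonoBehaviour
{
	// Assumed assets: a compute shader whose CSMain kernel is declared [numthreads(32, 32, 1)]
	// and a render texture created with enableRandomWrite = true.
	public ComputeShader shader;
	public RenderTexture target;

	void Start()
	{
		int kernel = shader.FindKernel("CSMain");
		shader.SetTexture(kernel, "Result", target);

		// Each group covers a 32x32 block of pixels (1024 threads, the per-group maximum),
		// so round the group counts up to cover the whole texture with a single Dispatch call.
		int xGroups = (target.width + 31) / 32;
		int yGroups = (target.height + 31) / 32;
		shader.Dispatch(kernel, xGroups, yGroups, 1);
	}
}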

Examples

References

overdrawMonitor, Compute Shader version
It has one small bug: the extra camera is not removed after measuring stops. There is also a harmless error in the console that seems to be a Unity bug.
Someone else's notes on studying overdrawMonitor

My notes

A refresher on events and delegates
On EditorWindow and the lifecycle of the event functions

A small exercise
Convert an image to grayscale with a compute shader and apply it to an object in the scene.
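Here is a rough sketch of the C# side of that exercise, assuming a compute shader asset with a CSMain kernel declared [numthreads(8, 8, 1)] that reads an "Input" texture and writes a grayscale "Result" texture (all of these names are placeholders, not from the original exercise):

using UnityEngine;

public class GrayscaleExample : MonoBehaviour
{
	public ComputeShader grayscaleShader; // assumed kernel: CSMain with [numthreads(8, 8, 1)]
	public Texture2D source;              // the picture to convert
	public Renderer targetRenderer;       // the scene object that should display the result

	void Start()
	{
		// Writable render texture that receives the grayscale image.
		var result = new RenderTexture(source.width, source.height, 0);
		result.enableRandomWrite = true;
		result.Create();

		int kernel = grayscaleShader.FindKernel("CSMain");
		grayscaleShader.SetTexture(kernel, "Input", source);
		grayscaleShader.SetTexture(kernel, "Result", result);

		// Round up so pixels that don't fill a complete 8x8 group are still covered.
		grayscaleShader.Dispatch(kernel, (source.width + 7) / 8, (source.height + 7) / 8, 1);

		// Show the converted image on the object's material.
		targetRenderer.material.mainTexture = result;
	}
}

The full OverdrawMonitor source follows below, with my notes kept as comments.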

using System;
using UnityEngine;

/// <summary> This is a singleton component that is responsible for measuring overdraw information
/// on the main camera. You shouldn't add this component manually, but use the Instance getter to
/// access it.
/// 
/// The measurement process is done in two passes. First a new camera is created that renders the
/// scene into a high precision texture called overdrawTexture. This texture contains information
/// about how many times each pixel has been overdrawn. After this step a compute shader is used
/// to add up all the pixels in the overdrawTexture and store the result in this component.
/// 
/// The tool measures the amount of overdraw exactly only in certain cases; in the other cases the
/// error margin is very small. This comes from the nature of compute shaders: they should operate
/// in batches in order to be efficient, and here the compute shader batch that sums up the results
/// of the first render pass has a size of 32x32. This means that if the pixel size of the camera
/// is not divisible by 32, the edge pixels that don't fit into a full batch won't be processed.
/// But since render targets are usually huge compared to 32x32 pixel blocks, and the error comes
/// from an unimportant part of the image, this is acceptable.
/// </summary>
[ExecuteInEditMode]
public class OverdrawMonitor : MonoBehaviour
{
	// Singleton pattern
	private static OverdrawMonitor instance;
	public static OverdrawMonitor Instance
	{
		get
		{
			if (instance == null)
			{
				instance = GameObject.FindObjectOfType<OverdrawMonitor>();
				if (instance == null)
				{
					var go = new GameObject("OverdrawMonitor");
					instance = go.AddComponent<OverdrawMonitor>();
				}
			}

			return instance;
		}
	}

	private new Camera camera;
	private RenderTexture overdrawTexture;

	private ComputeShader computeShader;

	private const int dataSize = 128 * 128;// This is the maximum length of the compute shader's RWStructuredBuffer array
	private int[] inputData = new int[dataSize];
	private int[] resultData = new int[dataSize];
	private ComputeBuffer resultBuffer;
	private Shader replacementShader;

	// ========= Results ========
	// Last measurement
	/// <summary> The number of shaded fragments in the last frame. </summary>
	public long TotalShadedFragments { get; private set; }
	/// <summary> The overdraw ratio in the last frame. </summary>
	public float OverdrawRatio { get; private set; }

	// Sampled measurement
	/// <summary> Number of shaded fragments in the measured time span. </summary>
	public long IntervalShadedFragments { get; private set; }
	/// <summary> The average number of shaded fragments in the measured time span. </summary>
	public float IntervalAverageShadedFragments { get; private set; }
	/// <summary> The average overdraw in the measured time span. </summary>
	public float IntervalAverageOverdraw { get; private set; }
	public float AccumulatedAverageOverdraw { get { return accumulatedIntervalOverdraw / intervalFrames; } }

	// Extremes
	/// <summary> The maximum overdraw measured. </summary>
	public float MaxOverdraw { get; private set; }

	private long accumulatedIntervalFragments;
	private float accumulatedIntervalOverdraw;
	private long intervalFrames;

	private float intervalTime = 0;
	public float SampleTime = 1;

	/// <summary> An empty method that can be used to initialize the singleton. </summary>
	public void Touch() { }

	#region Measurement magic

	public void Awake()
	{
#if UNITY_EDITOR
		// Graphics emulation is turned on by default when targeting a mobile platform, and with
		// emulation enabled this tool won't work, so switch it off.
		UnityEditor.EditorApplication.ExecuteMenuItem("Edit/Graphics Emulation/No Emulation");
		SubscribeToPlayStateChanged();
#endif
		
		if (Application.isPlaying) DontDestroyOnLoad(gameObject);
		gameObject.hideFlags = HideFlags.DontSave | HideFlags.HideInInspector;

		// Prepare the camera that is going to render the scene with the initial overdraw data.
		replacementShader = Shader.Find("Debug/OverdrawInt");

		camera = GetComponent<Camera>();
		if (camera == null) camera = gameObject.AddComponent<Camera>();
		camera.CopyFrom(Camera.main);
		camera.SetReplacementShader(replacementShader, null);

		RecreateTexture(Camera.main);
		RecreateComputeBuffer();

		computeShader = Resources.Load<ComputeShader>("OverdrawParallelReduction");

		for (int i = 0; i < inputData.Length; i++) inputData[i] = 0;
	}

#if UNITY_EDITOR
	public void SubscribeToPlayStateChanged()
	{
		UnityEditor.EditorApplication.playmodeStateChanged -= OnPlayStateChanged;
		UnityEditor.EditorApplication.playmodeStateChanged += OnPlayStateChanged;
	}

	private static void OnPlayStateChanged()
	{
		if (!UnityEditor.EditorApplication.isPlayingOrWillChangePlaymode && UnityEditor.EditorApplication.isPlaying)
		{
			if (instance != null) instance.OnDisable();
		}
	}
#endif

	private bool disabled = true;// Toggles the measurement on and off; used together with the button in the GUI

	public void OnEnable()
	{
		disabled = false;
	}

	public void OnDisable()
	{
		disabled = true;
		OnDestroy();
	}

	public void LateUpdate()// Runs after Update
	{
		if (disabled) return;

		Camera main = Camera.main;
		camera.CopyFrom(main);
		camera.clearFlags = CameraClearFlags.SolidColor;
		camera.backgroundColor = Color.black;// Clear the background with a solid color; note this also clears away the skybox
		camera.targetTexture = overdrawTexture;
		camera.SetReplacementShader(replacementShader, null);// Render everything this camera sees with replacementShader instead of each object's own shader

		transform.position = main.transform.position;
		transform.rotation = main.transform.rotation;

		RecreateTexture(main);

		intervalTime += Time.deltaTime;// Time.deltaTime is the time between frames; the frame rate isn't constant, which is why it's needed e.g. for constant-speed movement
		if (intervalTime > SampleTime)
		{
			IntervalShadedFragments = accumulatedIntervalFragments;
			IntervalAverageShadedFragments = (float)accumulatedIntervalFragments / intervalFrames;
			IntervalAverageOverdraw = (float)accumulatedIntervalOverdraw / intervalFrames;

			intervalTime -= SampleTime;

			accumulatedIntervalFragments = 0;
			accumulatedIntervalOverdraw = 0;
			intervalFrames = 0;
		}// This looks like it computes a per-interval (roughly per-second) average overdraw, but that isn't what ends up being displayed... it feels a bit messy and I don't fully follow it
	}

	/// <summary> Checks if the overdraw texture should be updated. This needs to happen if the main camera
	/// configuration changes. </summary>
	private void RecreateTexture(Camera main)
	{
		if (overdrawTexture == null)
		{
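			// Single-channel float target (RFloat) so the per-pixel overdraw counts keep full precision.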
			overdrawTexture = new RenderTexture(camera.pixelWidth, camera.pixelHeight, 24, RenderTextureFormat.RFloat);
			overdrawTexture.enableRandomWrite = true;// Enable random write; probably not required when the texture is only used as an input, but it is required for outputs
			camera.targetTexture = overdrawTexture;
		}

		if (main.pixelWidth != overdrawTexture.width || main.pixelHeight != overdrawTexture.height)
		{
			overdrawTexture.Release();
			overdrawTexture.width = main.pixelWidth;
			overdrawTexture.height = main.pixelHeight;
		}
	}

	private void RecreateComputeBuffer()
	{
		if (resultBuffer != null) return;
		resultBuffer = new ComputeBuffer(resultData.Length, 4);
	}

	public void OnDestroy()
	{
		if (camera != null)
		{
			camera.targetTexture = null;
		}
		if (resultBuffer != null) resultBuffer.Release();// Remember to release the compute buffer
	}

	public void OnPostRender()
	{
		if (disabled) return;

		int kernel = computeShader.FindKernel("CSMain");

		RecreateComputeBuffer();

		// Setting up the data
		resultBuffer.SetData(inputData);
		computeShader.SetTexture(kernel, "Overdraw", overdrawTexture);
		computeShader.SetBuffer(kernel, "Output", resultBuffer);

		int xGroups = (overdrawTexture.width / 32);
		int yGroups = (overdrawTexture.height / 32);
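		// Integer division: edge pixels that don't fill a complete 32x32 group are skipped,
		// which is the source of the small error margin described in the class summary.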

		// Summing up the fragments
		computeShader.Dispatch(kernel, xGroups, yGroups, 1);// Set the number of thread groups and run the kernel
		resultBuffer.GetData(resultData);// Read the results back into the resultData array

		// Getting the results
		TotalShadedFragments = 0;
		for (int i = 0; i < resultData.Length; i++)
		{
			TotalShadedFragments += resultData[i];
		}

		OverdrawRatio = (float)TotalShadedFragments / (xGroups * 32 * yGroups * 32);
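		// In other words, the average number of times each processed pixel was shaded this frame.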

		accumulatedIntervalFragments += TotalShadedFragments;
		accumulatedIntervalOverdraw += OverdrawRatio;
		intervalFrames++;

		if (OverdrawRatio > MaxOverdraw) MaxOverdraw = OverdrawRatio;
	}

	#endregion
	#region Measurement control methods

	public void StartMeasurement()
	{
		enabled = true;
		camera.enabled = true;
	}
	
	public void Stop()
	{
		enabled = false;
		camera.enabled = false;
	}

	public void SetSampleTime(float time)
	{
		SampleTime = time;
	}

	public void ResetSampling()
	{
		accumulatedIntervalOverdraw = 0;
		accumulatedIntervalFragments = 0;
		intervalTime = 0;
		intervalFrames = 0;
	}

	public void ResetExtreemes()
	{
		MaxOverdraw = 0;
	}

	#endregion
}
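For reference, here is a minimal sketch of how an EditorWindow might drive this monitor and display the numbers. It is only an illustration built on the public API above (StartMeasurement, Stop, OverdrawRatio, IntervalAverageOverdraw, MaxOverdraw, ResetExtreemes); the window name and layout are made up:

using UnityEditor;
using UnityEngine;

// Hypothetical editor window, only meant to illustrate how the OverdrawMonitor API can be used.
public class OverdrawMonitorWindow : EditorWindow
{
	[MenuItem("Window/Overdraw Monitor (Sketch)")]
	private static void Open() => GetWindow<OverdrawMonitorWindow>("Overdraw");

	private bool measuring;

	private void OnGUI()
	{
		// First access creates the hidden OverdrawMonitor object with its measurement camera.
		var monitor = OverdrawMonitor.Instance;

		if (GUILayout.Button(measuring ? "Stop" : "Start"))
		{
			measuring = !measuring;
			if (measuring) monitor.StartMeasurement();
			else monitor.Stop();
		}

		EditorGUILayout.LabelField("Last frame overdraw", monitor.OverdrawRatio.ToString("0.000"));
		EditorGUILayout.LabelField("Interval average", monitor.IntervalAverageOverdraw.ToString("0.000"));
		EditorGUILayout.LabelField("Max overdraw", monitor.MaxOverdraw.ToString("0.000"));

		if (GUILayout.Button("Reset max")) monitor.ResetExtreemes();

		Repaint(); // keep the numbers refreshing while the window is open
	}
}

Because an EditorWindow's OnGUI only runs on GUI events, the Repaint() call at the end keeps the displayed values updating while the window is visible.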