<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[coty.tips]]></title><description><![CDATA[programming stuff]]></description><link>https://coty.tips/</link><image><url>https://coty.tips/favicon.png</url><title>coty.tips</title><link>https://coty.tips/</link></image><generator>Ghost 3.42</generator><lastBuildDate>Sat, 06 Sep 2025 00:16:04 GMT</lastBuildDate><atom:link href="https://coty.tips/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Raytracing in Unity's built-in render pipeline]]></title><description><![CDATA[Getting started with Raytracing in Unity, using the Raytracing API directly.
]]></description><link>https://coty.tips/raytracing-in-unity/</link><guid isPermaLink="false">5fb215f4882de638e3645857</guid><category><![CDATA[raytracing]]></category><category><![CDATA[rtx]]></category><dc:creator><![CDATA[Coty Getzelman]]></dc:creator><pubDate>Thu, 03 Dec 2020 17:22:17 GMT</pubDate><media:content url="https://coty.tips/content/images/2020/11/Unity_GKqZOSxtwj-1.png" medium="image"/><content:encoded><![CDATA[<img src="https://coty.tips/content/images/2020/11/Unity_GKqZOSxtwj-1.png" alt="Raytracing in Unity's built-in render pipeline"><p></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://coty.tips/content/images/2020/11/Unity_GKqZOSxtwj.png" class="kg-image" alt="Raytracing in Unity's built-in render pipeline" width="1578" height="850" srcset="https://coty.tips/content/images/size/w600/2020/11/Unity_GKqZOSxtwj.png 600w, https://coty.tips/content/images/size/w1000/2020/11/Unity_GKqZOSxtwj.png 1000w, https://coty.tips/content/images/2020/11/Unity_GKqZOSxtwj.png 1578w" sizes="(min-width: 720px) 720px"><figcaption>Raytraced Reflections and Shadows</figcaption></figure><hr><p>Unity 2019 added an API for DX12's RTX stuff. Nice, finally!</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://coty.tips/content/images/2020/11/brave_AjqZilvHsd.png" class="kg-image" alt="Raytracing in Unity's built-in render pipeline" width="530" height="74"><figcaption>Oh wait.. guess I'll come back to this later.</figcaption></figure><!--kg-card-begin: html--><center><img src="https://uploads.coty.tips/7TaK4G8TA.gif" width="200" height="200" alt="Raytracing in Unity's built-in render pipeline"> </center><!--kg-card-end: html--><p></p><p>It's later! Unity 2020.1 now has skinned mesh renderer support! 
Time to play around a bit.</p><hr><h2 id="what-s-new">What's New?</h2><p>There are a few juicy new additions to Unity's scripting API.</p><ul><li><a href="https://docs.unity3d.com/2020.2/Documentation/ScriptReference/Experimental.Rendering.RayTracingShader.html">RayTracingShader</a> - a special new type of shader, used to generate rays and query geometry intersections. </li><li><a href="https://docs.unity3d.com/2020.2/Documentation/ScriptReference/Experimental.Rendering.RayTracingAccelerationStructure.html">RayTracingAccelerationStructure</a> - a GPU memory representation of the geometry to ray trace against. </li></ul><h2 id="getting-started">Getting Started</h2><p>If you are starting from scratch, the very first thing you need to do is to generate a <a href="https://docs.unity3d.com/2020.2/Documentation/ScriptReference/Experimental.Rendering.RayTracingAccelerationStructure.html">RayTracingAccelerationStructure</a>. Here's a simple example to get started.</p><figure class="kg-card kg-code-card"><pre><code class="language-c#">var settings = new RayTracingAccelerationStructure.RASSettings();
settings.layerMask = UpdateLayers;
settings.managementMode = RayTracingAccelerationStructure.ManagementMode.Automatic;
settings.rayTracingModeMask = RayTracingAccelerationStructure.RayTracingModeMask.Everything;

_AccelerationStructure = new RayTracingAccelerationStructure(settings);</code></pre><figcaption>maybe somewhere in a manager's OnEnable(), for example</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-c#">_AccelerationStructure.Update();
_AccelerationStructure.Build(); </code></pre><figcaption>in some manager's Update(), for example</figcaption></figure><p>Once this has been set up, you're ready to generate rays! </p><h3 id="generating-rays">Generating Rays</h3><p>RTX rays are generated from a ray generation shader. If you've used <a href="https://docs.unity3d.com/Manual/class-ComputeShader.html">Compute Shaders</a> in Unity, things will look pretty familiar to you. </p><pre><code class="language-hlsl">#include "HLSLSupport.cginc"
#include "UnityRaytracingMeshUtils.cginc"
#include "UnityShaderVariables.cginc"

RaytracingAccelerationStructure _RaytracingAccelerationStructure : register(t0);

float4x4 _CameraToWorld;
float4x4 _InverseProjection;

[shader("raygeneration")]
void MyRayGenerationShader()
{
    uint3 dispatchId = DispatchRaysIndex();
    uint3 dispatchDim = DispatchRaysDimensions();

    // somehow get the world space position of the pixel we care about
    // somehow generate a ray from that pixel in a meaningful Direction
    // color the pixel we care about, with the information that ray gathered
}</code></pre><p>For the sake of example, let's create something meaningful: we'll dispatch a ray from every pixel on the screen, which will eventually intersect the geometry in the acceleration structure we defined earlier. </p><p>Also, note that the <a href="https://docs.microsoft.com/en-us/windows/win32/direct3d12/raytracingaccelerationstructure">RaytracingAccelerationStructure </a>needs to be defined somewhere, too. t0 is the register that Unity uses for this structure. </p><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">float2 texcoord = (dispatchId.xy + float2(0.5, 0.5)) / float2(dispatchDim.x, dispatchDim.y);
</code></pre><figcaption>assuming we dispatch our generation shader such that this kernel is run for every pixel on the screen, we'll want to do some math to convert that to a 0 to 1 range.</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">float3 viewPosition = float3(texcoord * 2.0 - float2(1.0, 1.0), 0.0);</code></pre><figcaption>after that, we can convert it to be in clip space</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">float4 clip = float4(viewPosition.xyz, 1.0);
float4 viewPos = mul(_InverseProjection, clip);</code></pre><figcaption>next is converting from our clip space, to camera space</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">viewPos.xyz /= viewPos.w;</code></pre><figcaption>don't forget that perspective divide 🔪</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">float3 worldPos = mul(_CameraToWorld, viewPos);
float3 worldDirection = worldPos - _WorldSpaceCameraPos;</code></pre><figcaption>and finally, we can get our world position (and world direction)!</figcaption></figure><p>Now that we know from where and to where we want to cast rays, let's start building a ray and its payload. In the raytracing API provided to us, we need to build both a <a href="https://docs.microsoft.com/en-us/windows/win32/direct3d12/raydesc">RayDesc </a>and a custom payload struct, which are passed into a TraceRay() call. </p><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">RayDesc ray;
ray.Origin = _WorldSpaceCameraPos; 
ray.Direction = worldDirection; 
ray.TMin = 0;
ray.TMax = 10000;</code></pre><figcaption>RayDesc is an intrinsic structure to the RTX API</figcaption></figure><p>You can use any payload that you wish. Keep in mind that you may run into issues with high memory footprint payloads, so try to keep them as small as possible. In this example, we'll just use color. </p><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">struct MyPayload
{
	float4 color;
};</code></pre><figcaption>define this somewhere</figcaption></figure><pre><code class="language-hlsl">MyPayload payload;
payload.color = float4(0, 0, 0, 0);</code></pre><pre><code class="language-hlsl">TraceRay(_RaytracingAccelerationStructure, 0, 0xFF, 0, 1, 0, ray, payload);</code></pre><p><a href="https://docs.microsoft.com/en-us/windows/win32/direct3d12/traceray-function">TraceRay</a>() is another magic call we can do with the raytracing API. I highly recommend clicking on the previous link, to read up on some of the finer details. At the least, you'll need an acceleration structure to pass in, some flags for intersection rules, an instance mask (conceptually similar to a <a href="https://docs.unity3d.com/ScriptReference/LayerMask.html">LayerMask</a>, though only the lowest 8 bits are used), the RayDesc, and the custom payload.</p><p>After you've traced a ray, you need to do something with the payload result. </p><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">RWTexture2D&lt;float4&gt; RenderTarget;</code></pre><figcaption>define a RenderTexture to write to, outside of this kernel</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">RenderTarget[dispatchId.xy] = payload.color;</code></pre><figcaption>write the color into the render texture. assuming that you have dispatched an array of rays equal to the width and height of your input render texture, you can just use the dispatchId as the position in your texture to write into.</figcaption></figure><p>This traced ray may or may not intersect with geometry. If it does not, you'll probably want to provide a "miss" kernel. </p><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">[shader("miss")]
void MyMissShader(inout MyPayload payload : SV_RayPayload)
{
    payload.color = 0;
}</code></pre><figcaption>a miss shader could look something like this</figcaption></figure><p>This miss kernel would be called if the ray does not hit anything, as the name implies.</p><p><em>Keep note of that <strong><a href="https://docs.microsoft.com/en-us/windows/win32/direct3dhlsl/dx-graphics-hlsl-function-parameters">inout </a></strong>keyword. For our purposes, you can assume it works similarly to the <strong>ref </strong>keyword in c# (although it's a bit different: inout actually does a copy in and a copy out; it's not a reference being passed around). </em></p><p>On the c# side, you'll need some component which can reference the ray generation shader, and then execute it. </p><figure class="kg-card kg-code-card"><pre><code class="language-c#"> // define a RT, to see what's going on 
 // note, if it's public, 
 //   you can double click it in the inspector to preview it,
 //   when the game view is focused 
 public RenderTexture _RenderTexture;
 
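 // note: if you instead create the texture from code, it needs
 // random-access writes enabled before the raygen shader can
 // write into it - something like:
 //   _RenderTexture = new RenderTexture(512, 512, 0);
 //   _RenderTexture.enableRandomWrite = true;
 //   _RenderTexture.Create();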
 </code></pre><figcaption>need to define a render texture to write to</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-c#">public RayTracingShader MyRayGenerationShader;</code></pre><figcaption>need to reference the raytracing shader somehow</figcaption></figure><p>I recommend using a <a href="https://docs.unity3d.com/ScriptReference/Rendering.CommandBuffer.html">CommandBuffer </a>to dispatch the ray generation shader, but you can also call the equivalent methods on the RayTracingShader directly. </p><figure class="kg-card kg-code-card"><pre><code class="language-c#">command.SetRayTracingTextureParam(MyRayGenerationShader, "RenderTarget", _RenderTexture); </code></pre><figcaption>don't forget to assign the render texture you want to write to</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-c#">command.SetRayTracingShaderPass(MyRayGenerationShader, "MyRaytracingPass");
command.SetRayTracingAccelerationStructure(MyRayGenerationShader, "_RaytracingAccelerationStructure", _AccelerationStructure);</code></pre><figcaption>take note of "MyRaytracingPass", going to talk about it soon</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-c#">var projection = GL.GetGPUProjectionMatrix(camera.projectionMatrix, false);
var inverseProjection = projection.inverse;

command.SetRayTracingMatrixParam(MyRayGenerationShader, "_InverseProjection", inverseProjection);
command.SetRayTracingMatrixParam(MyRayGenerationShader, "_CameraToWorld", camera.cameraToWorldMatrix);
command.SetRayTracingVectorParam(MyRayGenerationShader, "_WorldSpaceCameraPos", camera.transform.position);</code></pre><figcaption>you'll need to set up those variables for the ray generation shader to actually calculate the stuff it needs</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-c#">command.DispatchRays(MyRayGenerationShader, "MyRayGenerationShader", (uint)_RenderTexture.width, (uint)_RenderTexture.height, 1u, camera);</code></pre><figcaption>the string passed in matches the raygeneration kernel name, in the RaytracingShader</figcaption></figure><p>Once this dispatch is called, you'll be generating rays! However, they won't do anything when they hit geometry yet. </p><h3 id="hit-">Hit!</h3><p>To do anything interesting, you'll need to provide a hit kernel. In Unity, you can do this in the .shader file of any shader by adding a <a href="https://docs.unity3d.com/Manual/SL-Pass.html">Pass{} block.</a> When you dispatch rays, you can use <a href="https://docs.unity3d.com/ScriptReference/Rendering.CommandBuffer.SetRayTracingShaderPass.html">SetRayTracingShaderPass </a>to specify a name, and when a ray intersects an object, Unity will check for any <a href="https://docs.unity3d.com/Manual/SL-Pass.html">Passes </a>that match that name. </p><p>As an example, let's create a new shader. </p><figure class="kg-card kg-image-card"><img src="https://coty.tips/content/images/2020/11/p8xUWwtBDU.png" class="kg-image" alt="Raytracing in Unity's built-in render pipeline" width="768" height="210" srcset="https://coty.tips/content/images/size/w600/2020/11/p8xUWwtBDU.png 600w, https://coty.tips/content/images/2020/11/p8xUWwtBDU.png 768w" sizes="(min-width: 720px) 720px"></figure><p>Open it up, and add in something like this, after the <a href="https://docs.unity3d.com/Manual/SL-SubShaderTags.html">SubShader</a> block.</p><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">SubShader
{
    Pass
    {
        Name "MyRaytracingPass"

        HLSLPROGRAM

        #pragma raytracing MyHitShader

        struct MyPayload
        {
            float4 color; 
        };
        
        struct AttributeData
        {
            float2 barycentrics; 
        };

        [shader("closesthit")]
        void MyHitShader(inout MyPayload payload : SV_RayPayload,
          AttributeData attributes : SV_IntersectionAttributes)
        {
            payload.color = 1;
        }

        ENDHLSL
    }
}</code></pre><figcaption>a very simple hit shader</figcaption></figure><p>HLSL has a new attribute: [shader("")]. You can define what type of raytracing shader this function is meant for. The most useful for us is "<a href="https://docs.microsoft.com/en-us/windows/win32/direct3d12/closest-hit-shader">closesthit</a>" - which runs when the ray intersection nearest to the ray's origin is found. </p><p>Here is a list of available DXR shader types, with their appropriate method signatures.</p><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">[shader("raygeneration")]
void RayGeneration() {} 

[shader("intersection")]
void Intersection() {}

[shader("miss")]
void Miss(inout MyPayload payload : SV_RayPayload) {}

[shader("closesthit")]
void ClosestHit(inout MyPayload payload : SV_RayPayload, MyAttributes attributes : SV_IntersectionAttributes) {}

[shader("anyhit")]
void AnyHit(inout MyPayload payload : SV_RayPayload, MyAttributes attributes : SV_IntersectionAttributes) {}
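
// DXR also defines a "callable" shader stage - it's invoked from other
// raytracing shaders via CallShader() instead of TraceRay(), and its
// parameter is another user defined struct
[shader("callable")]
void Callable(inout MyParams params) {}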
</code></pre><figcaption>SV_RayPayload and SV_IntersectionAttributes are both user defined structs.&nbsp;</figcaption></figure><p>Now, our ray generation shader will output black for any pixels that are not overlapping geometry, and white for any pixels that are. We're raytracing!</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://coty.tips/content/images/2020/11/image-1.png" class="kg-image" alt="Raytracing in Unity's built-in render pipeline" width="672" height="611" srcset="https://coty.tips/content/images/size/w600/2020/11/image-1.png 600w, https://coty.tips/content/images/2020/11/image-1.png 672w"><figcaption>VERY advanced raytracing going on here</figcaption></figure><hr><h2 id="what-can-we-do-with-this">What can we do with this? </h2><p>The raytracing API that DX12 provides unlocks some great stuff. </p><p>Here's a screenshot of some raytraced reflections I've thrown together.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://coty.tips/content/images/2020/11/Unity_NvJa5yCt2I.png" class="kg-image" alt="Raytracing in Unity's built-in render pipeline" width="808" height="519" srcset="https://coty.tips/content/images/size/w600/2020/11/Unity_NvJa5yCt2I.png 600w, https://coty.tips/content/images/2020/11/Unity_NvJa5yCt2I.png 808w" sizes="(min-width: 720px) 720px"><figcaption>the "closesthit" kernel is using TraceRay() to keep searching for more intersections, to keep adding on layers of reflections. rays are jiggled around to simulate roughness of materials.&nbsp;</figcaption></figure><p>This can be taken a step further, too. Why not have shadows in our reflections? 
</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://coty.tips/content/images/2020/11/Unity_czoqNZYi64.png" class="kg-image" alt="Raytracing in Unity's built-in render pipeline" width="856" height="519" srcset="https://coty.tips/content/images/size/w600/2020/11/Unity_czoqNZYi64.png 600w, https://coty.tips/content/images/2020/11/Unity_czoqNZYi64.png 856w" sizes="(min-width: 720px) 720px"><figcaption>I've added an additional ray which sets a flag in the payload to identify as a shadow ray check, to darken areas where rays do not run the miss kernel</figcaption></figure><p>And if our reflections can have raytraced shadows.. why not replace unity's shadows completely with raytraced shadows?</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://coty.tips/content/images/2020/11/Unity_9QEULsWqWE.png" class="kg-image" alt="Raytracing in Unity's built-in render pipeline" width="683" height="481" srcset="https://coty.tips/content/images/size/w600/2020/11/Unity_9QEULsWqWE.png 600w, https://coty.tips/content/images/2020/11/Unity_9QEULsWqWE.png 683w"><figcaption>rays are cast from the world position of the object the pixel is drawing, towards the light. if the ray misses, it's white, if it hits, it's black. casting multiple rays in semi-random directions can be used to increase the quality of this effect. 
this is generating a shadow map, which I'm using to replace unity's shadowmap</figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><blockquote class="twitter-tweet" data-width="550"><p lang="en" dir="ltr">its about time voxelgame got some <a href="https://twitter.com/hashtag/rtx?src=hash&amp;ref_src=twsrc%5Etfw">#rtx</a> support<a href="https://twitter.com/hashtag/gamedev?src=hash&amp;ref_src=twsrc%5Etfw">#gamedev</a> <a href="https://twitter.com/hashtag/madewithunity?src=hash&amp;ref_src=twsrc%5Etfw">#madewithunity</a> <a href="https://t.co/w2KuLFFcf0">pic.twitter.com/w2KuLFFcf0</a></p>&mdash; Coty Getzelman (@cotycrg) <a href="https://twitter.com/cotycrg/status/1332761311466442757?ref_src=twsrc%5Etfw">November 28, 2020</a></blockquote>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<figcaption>Here's one more example, it's a bit more real-world. I've been working on implementing raytraced effects into my side project.</figcaption></figure><hr><h3 id="getting-useful-data-from-intersections">Getting Useful Data From Intersections</h3><pre><code class="language-hlsl">struct AttributeData
{
    float2 barycentrics; 
};</code></pre><p>Using a user defined struct, you can request some data from intersections. You can use the provided barycentric coordinates to get useful interpolated data from the triangle's vertices. For example, you could use this to get the albedo of the triangle with proper uv mapping, or you could even shade the triangle like normal.</p><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">#include "UnityRaytracingMeshUtils.cginc"</code></pre><figcaption>Unity has recently added some helper functions to their built-in shaders, which makes this pretty trivial to do. Be sure to include it!</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">struct Vertex
{
    float2 texcoord;
};</code></pre><figcaption>Let's make a new struct, for getting some vertex data.</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">uint primitiveIndex = PrimitiveIndex();</code></pre><figcaption>this is a new intrinsic function for hlsl, to fetch a ray's intersected primitive data</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">uint3 triangleIndices = UnityRayTracingFetchTriangleIndices(primitiveIndex);</code></pre><figcaption>UnityRayTracingFetchTriangleIndices is a new unity built-in function, borrowed from that include earlier</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">Vertex v0, v1, v2;</code></pre><figcaption>to get interpolated data, we're going to need the three vertices that make up the triangle our ray intersected</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">v0.texcoord = UnityRayTracingFetchVertexAttribute2(triangleIndices.x, kVertexAttributeTexCoord0);

v1.texcoord = UnityRayTracingFetchVertexAttribute2(triangleIndices.y, kVertexAttributeTexCoord0);

v2.texcoord = UnityRayTracingFetchVertexAttribute2(triangleIndices.z, kVertexAttributeTexCoord0);</code></pre><figcaption>UnityRayTracingFetchVertexAttribute2/3/4 are helper functions from unity, which we can use to get vertex attribute data given the data from earlier</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">float3 barycentrics = float3(1.0 - attributeData.barycentrics.x - attributeData.barycentrics.y, attributeData.barycentrics.x, attributeData.barycentrics.y);</code></pre><figcaption>it's time to use those barycentric coordinates!</figcaption></figure><pre><code class="language-hlsl">Vertex vInterpolated;

vInterpolated.texcoord = v0.texcoord * barycentrics.x + v1.texcoord * barycentrics.y + v2.texcoord * barycentrics.z;</code></pre><p>From here, you can sample from textures in the normal way. </p><figure class="kg-card kg-code-card"><pre><code class="language-hlsl">Texture2D&lt;float4&gt; _MainTex;
SamplerState sampler_MainTex;

[shader("closesthit")]
void MyHitShader(inout MyPayload payload : SV_RayPayload, AttributeData attributes : SV_IntersectionAttributes)
{
    // interpolate texcoord from the triangle's vertex data using the
    // barycentric coordinates, as shown earlier
    
    payload.color = _MainTex.SampleLevel(sampler_MainTex, texcoord, 0);
}</code></pre><figcaption>using the texture coordinates</figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://coty.tips/content/images/2020/12/image.png" class="kg-image" alt="Raytracing in Unity's built-in render pipeline" width="495" height="492"><figcaption>rays returning albedos</figcaption></figure><p>Some useful data! You can expand this further by using TraceRay() inside of the closesthit shader passes. Keep in mind that you need to raise <em>#pragma max_recursion_depth</em> above 1 when using TraceRay() inside of a hit function. You'll also need to track your recursion depth manually, or you may crash!</p><hr><h2 id="self-promo">Self Promo</h2><p>If you just want to download some source code, I'm providing it all in the form of a Unity Asset Store package. The raytraced reflections and raytraced shadows are both packaged with it, of course. I've also included a lot of helpful macros, for quickly making your own raytracing shaders. </p><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://assetstore.unity.com/packages/slug/184088"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Corgi Raytracing - Built-in Render Pipeline (Forward and Deferred!) | VFX Shaders | Unity Asset Store</div><div class="kg-bookmark-description">Add depth to your next project with Corgi Raytracing - Built-in Render Pipeline (Forward and Deferred!) from Wandering Corgi. 
Find this &amp; more VFX Shaders on the Unity Asset Store.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://unity-assetstorev2-prd.storage.googleapis.com/cdn-origin/images/favicons/apple-touch-icon.png?v=1" alt="Raytracing in Unity's built-in render pipeline"><span class="kg-bookmark-publisher">Unity Asset Store</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://assetstorev1-prd-cdn.unity3d.com/key-image/a6836fab-7699-4309-ac31-774f28c90eaa.png" alt="Raytracing in Unity's built-in render pipeline"></div></a><figcaption>my asset includes all of its source code, so it is easily modifiable</figcaption></figure><hr><h2 id="further-reading">Further Reading</h2><p>I highly recommend the following for further reading.</p><ul><li><a href="https://on-demand.gputechconf.com/siggraph/2019/pdf/sig940-getting-started-with-directx-ray-tracing-in-unity.pdf">https://on-demand.gputechconf.com/siggraph/2019/pdf/sig940-getting-started-with-directx-ray-tracing-in-unity.pdf</a></li><li><a href="http://intro-to-dxr.cwyman.org/presentations/IntroDXR_RaytracingShaders.pdf">http://intro-to-dxr.cwyman.org/presentations/IntroDXR_RaytracingShaders.pdf</a></li><li><a href="https://microsoft.github.io/DirectX-Specs/d3d/Raytracing.html">https://microsoft.github.io/DirectX-Specs/d3d/Raytracing.html</a></li></ul>]]></content:encoded></item><item><title><![CDATA[Wave Break’s Waves]]></title><description><![CDATA[<p><code>Technical Breakdown</code></p><hr><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/yD36ZzEb2Le47_Iw_hNsbpqRGh7F0S6upzUR5LbLbIfBu4XkwaTczTJ2ovF1AQ0-nNmITw88qCfaXQI13TgqEPL8BUa-qf81khZO-8qmAjDjP3tBdkO-gusYrtDidWoPwivrbfFV" class="kg-image" alt></figure><p>Hey, I’m Coty Getzelman, the lead engineer on <em>Wave Break</em>, the world’s first skateboating game, made in Unity! 
<em>Wave Break</em> is inspired by classic arcade skateboarding games, with a splash of boating, explosions, and cold-blooded murder.</p><hr><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/lgoSMsW_DVm6Kk3u1hXJT0GNnPJdENIB4A2WegbJ37h9XJuGYHG7Pth_wJmyPEOQv4PcSnK4vudAUlxmkZkllBdBbbutFCJEDGLgGo3iqzy2W-_0b0srXv6GTn7M0KIXqzfhEwV2" class="kg-image" alt></figure><p>Inspired by the water physics from <em>Wave Race 64</em>, I</p>]]></description><link>https://coty.tips/wave-breaks-waves/</link><guid isPermaLink="false">5ef233540a535243807ee465</guid><dc:creator><![CDATA[Coty Getzelman]]></dc:creator><pubDate>Tue, 23 Jun 2020 16:53:05 GMT</pubDate><content:encoded><![CDATA[<p><code>Technical Breakdown</code></p><hr><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/yD36ZzEb2Le47_Iw_hNsbpqRGh7F0S6upzUR5LbLbIfBu4XkwaTczTJ2ovF1AQ0-nNmITw88qCfaXQI13TgqEPL8BUa-qf81khZO-8qmAjDjP3tBdkO-gusYrtDidWoPwivrbfFV" class="kg-image" alt></figure><p>Hey, I’m Coty Getzelman, the lead engineer on <em>Wave Break</em>, the world’s first skateboating game, made in Unity! <em>Wave Break</em> is inspired by classic arcade skateboarding games, with a splash of boating, explosions, and cold-blooded murder.</p><hr><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/lgoSMsW_DVm6Kk3u1hXJT0GNnPJdENIB4A2WegbJ37h9XJuGYHG7Pth_wJmyPEOQv4PcSnK4vudAUlxmkZkllBdBbbutFCJEDGLgGo3iqzy2W-_0b0srXv6GTn7M0KIXqzfhEwV2" class="kg-image" alt></figure><p>Inspired by the water physics from <em>Wave Race 64</em>, I started reading a bunch of fluid sim physics papers and tried out many methods. However, the simplicity of <em>Wave Race’s</em> design still holds up to this day. Let’s talk about how we could scale this idea up for modern hardware.</p><p>I wanted the resolution of the wave sim to be such that a single “wave” could be smaller than a player, so that the wakes created by their driving would be in high detail. 
Getting this level of detail would require a lot of memory, and even now in 2020 this is way too much data to process on a CPU. Instead, we can take advantage of the massively parallel nature of a modern GPU. Unity has Compute Shaders, which will allow us to read and write to arbitrary buffers for this effect.</p><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/FtYkPNzE43q_MdHBys6yRYIx2AM3ke77TGP7CJ1NClW_-k3YbCYuzqANBJAKijCEt8sUBFfLwGs9g-gcEoZu0v9k9rfSZIe48QklhyNbrudWbqH0-7mkmDBVasuuXz6paNhGSMwM" class="kg-image" alt></figure><p>The core idea is to treat a water plane as a set of interconnected springs. For <em>Wave Break</em>, I set up a 2D RenderTexture, in which each pixel represents a spring state. With some simple math, you can get some spring-like behavior by passing the texture through a Compute Shader. In this example gif, the Compute Shader has each pixel sample its neighbor pixels and spring accordingly; this alone is enough to generate these nice ripple patterns.</p><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/IFUbYhWk-9UrI-yi_MD4wYiGjuIuJixfVBS0p-LctS6GbwvnMSj9ztksWQI7vkiB0RurPs4MIHoqXPRjWvq4G9RHgZZrUIH79hXdK9xkbutEIjbfh04wXAUvnWwh7gTyB2RGgBVk" class="kg-image" alt></figure><p>Take that texture and blur it to get nice, smooth ripples. Spring + Blur starts looking like some convincing shallow waves!</p><hr><figure class="kg-card kg-image-card"><img src="https://lh5.googleusercontent.com/UtrS3UlhtDIFfix7MEDV3FPSN2e7OvJ9cY1YZJ6zduyV8bge2UmsCFqNPbc_b0UhRRY9Nr2LZ4630iAC0x2lelxyDWBJcWpcJFfTKrfXG12iFwAdKmvrNuTik1nxrRqDeut3Az5b" class="kg-image" alt></figure><p>The very first pass of this looked like this! 
All that’s done here is combining that RenderTexture with a simple custom transparent shader, which samples the texture to offset vertices, and uses the length of the (current - previous) spring states of the pixel to detect “foam”.</p><p>Obvious issue: the water is too still for oceans/lakes. We need to do a wind pass, and for that we’ll need the world position of the vertex we want to apply wind to. To get it: take the current pixel’s (x, y) position and divide by the width and height of the wave texture to normalize it into a 0-1 range, shift that into a (-1, 1) range so it acts like a position in the water plane’s transform’s local space, then use the transform’s localToWorld matrix to convert it to world space. This allows shifting, rotating, and scaling while still getting the correct world position of a pixel in the RenderTexture. We can then pass the world position into a noise function. However, calling that noise function on every fragment would be pretty slow. Instead, it’s a better idea to create a noise texture!</p><hr><p>Ripples are great for making wakes, but to make a really convincing ocean, you need wind, both by the shoreline and out to sea. So I made this simple unity editor tool for generating noise textures.</p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/QRRshxgjoBfwDZDVNXlqcO5bodCuD4G1vuYikC7wl6VsWHWrY-KQMKUb3m40qV8WzHA_43tqy7llLlBB3V30TAnlqb1TMyd93fnt-OZWYHhgQYk081iCx1oqIecqHUM5XbJ6PKYP" class="kg-image" alt></figure><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/xmG3Yny_IKXIEZImurOAcRZ1Nevf4WhvhK4C1lqb9JbO91XPvvTo0jsxt9fAEi4I6aS_SoD0FQsZzQduCtJJ9zh5gwlx_5oJ_Lt31ibyQ9SPcbOYnaPt1iJdstMIHIzAKW811el2" class="kg-image" alt></figure><p>Combining the noise texture into the ripple texture results in this texture. Starting to look good!</p><p>With blurred ripples and wind, it looks like waves. Once we start rendering it, though, it is obvious that the foam is missing. 
I tried generating foam dynamically in a fragment shader, but I wanted the foam to stick around, and it is kind of wasteful to do it for every fragment at higher resolutions (4K?). Instead: allocate a dedicated foam RenderTexture and create two Compute Shader kernels: one for generating foam from the wave delta, one for fading foam over time.</p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/8omBwxE7kB7T7hX6QlicW-P71Ez3neYg89VA4LWWKEyPDERepj2H4OcVqMOdnLqBz8US01cGdDdWNQ0Q4Zzk9xZWgLdICRD22cTLNFEURXHbFBgy7glJwsv6rhLf0d3lTyiBhPA4" class="kg-image" alt></figure><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/bHOFFQKu1RK-dF4t7QPyxmMRgGmeBVya4cxfDKVsPTR4_dKGdKF2ibbgrII2-fSlbUIcDeSD01dzwVvbsJ5lzICa-e4fX5c2lXQEtYx6msZL6jfABn5j-X2LROfhoAVzsoDXlEtv" class="kg-image" alt></figure><p>On top of those two kernels passively generating foam based on the waves, we should also stamp to the foam texture when we write to the wave texture. For each wave sim step, I’m diffusing the foam while the ripple is happening. Afterwards, I combine these two textures, using the R channel for the waves and the G channel for the foam.</p><p>Water planes are placed in a grid, using the blurred data for rendering. Unfortunately, this has very visible seams. Our blur samples neighbor pixels, but edge pixels were sampling themselves, because they have no info about their neighboring textures!</p><hr><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/WItRfeOgNPuXVYijWgTIrz1NciWA_zp899hz9Jd8w4BpZGb9-5lz4vgTXhy4V9VpPrJW31axu52muuYwZvdyFcObZUVY_ANvl2z7BUXKhgImU-NbFgiGBci17RJgYwRMKuD3Qf_S" class="kg-image" alt></figure><p>The next challenge is to get these textures in-game without seams. The blur effect is tricky, because if you sample only one texture, you will generate seams around the edges, which looks ugly in-game. 
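</p><p>The two foam kernels can be sketched like this. A Python stand-in for the Compute Shader logic, with made-up gain and fade constants, and flat lists standing in for the texture:</p>

```python
# Illustrative sketch of the two foam kernels described above; the real
# versions run on the GPU over the foam RenderTexture. Constants are made up.

def generate_foam(foam, wave, prev_wave, gain=4.0):
    """Kernel 1: a big wave delta deposits foam, clamped to 0..1."""
    for i in range(len(foam)):
        delta = abs(wave[i] - prev_wave[i])
        foam[i] = min(1.0, foam[i] + delta * gain)

def fade_foam(foam, fade=0.95):
    """Kernel 2: foam decays a little every sim step."""
    for i in range(len(foam)):
        foam[i] *= fade

# A fast-moving pixel gains foam; a still one only fades.
foam = [0.0, 0.2]
generate_foam(foam, wave=[0.5, 0.2], prev_wave=[0.0, 0.2])
fade_foam(foam)
```

<p>Stamping on impact is the same write, just triggered by gameplay instead of the wave delta.</p><p>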
Each fetch could check if it needs to use the neighbor textures, but that’s just way too slow. One solution is splitting the one blur kernel into five: one for the inside block and four for the outside edges. Fixed!</p><p>A similar problem occurs when rendering, as the fragment shader is bilinearly interpolating the blur texture with no info about neighboring textures. So the shader will need to use a manual bilinear filter, but.. that’s also slow! To avoid shading this way for most of the screen, I split the water plane mesh into two meshes: an inside chunk and an outside strip of triangles. The two meshes share the same shader and material, but the outside edge has a material setting that toggles a shader keyword, selecting a shader variant that does a manual bilinear filter combining neighboring textures. Whew!</p><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/4aR5-WQzge1xeoOwEMzUMT6gR1Y9NrbthpQvDNDVgp837_qn48lV5Wut3d_tRQBP6bbyT_8S9ubQbQAyA-uZbSPenOLDMVC4bzbBv23Cueb4sPQ82I9JkszfM5rCgelbkenYmSZn" class="kg-image" alt></figure><p>Now that we have fixes, we can start rendering. I’m just doing a normal vert/frag shader, where the vertex pass moves verts based on the blurred wave texture. But.. moved verts mean broken normals! To deal with this, I sample the neighboring wave heights to calculate the new vertex normal, which gets passed to the fragment shader.</p><p>Lighting is typical N dot L diffuse, with some normal scaling and specular adjustments for that stylised feel. The fragment’s normals are shifted a bit by a “micro normals” texture, for added detail. Artists can adjust these micro normals to create the sparkles you see in the gif. 
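</p><p>Circling back to the seam fix for a moment: the manual bilinear filter on the edge strip can be sketched like this. An illustrative Python stand-in, with tiles held in a dict keyed by grid coordinate (all names are my own):</p>

```python
import math

# Illustrative sketch of the edge-strip filter: a point fetch that rolls over
# into the neighboring plane's texture, plus a manual bilinear blend on top.

def fetch(tiles, tile_xy, px, py, size):
    """Point-fetch that falls through to the neighbor tile past the edges."""
    tx, ty = tile_xy
    if px < 0:
        tx, px = tx - 1, px + size
    elif px >= size:
        tx, px = tx + 1, px - size
    if py < 0:
        ty, py = ty - 1, py + size
    elif py >= size:
        ty, py = ty + 1, py - size
    tile = tiles.get((tx, ty))
    return tile[py][px] if tile else 0.0

def bilinear(tiles, tile_xy, u, v, size):
    """Manual bilinear interpolation built from four neighbor-aware fetches."""
    x, y = u * size - 0.5, v * size - 0.5
    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0
    s00 = fetch(tiles, tile_xy, x0, y0, size)
    s10 = fetch(tiles, tile_xy, x0 + 1, y0, size)
    s01 = fetch(tiles, tile_xy, x0, y0 + 1, size)
    s11 = fetch(tiles, tile_xy, x0 + 1, y0 + 1, size)
    top = s00 * (1 - fx) + s10 * fx
    bottom = s01 * (1 - fx) + s11 * fx
    return top * (1 - fy) + bottom * fy

# Two adjacent 2x2 tiles; sampling on the shared edge blends across both:
tiles = {(0, 0): [[1.0, 1.0], [1.0, 1.0]], (1, 0): [[3.0, 3.0], [3.0, 3.0]]}
print(bilinear(tiles, (0, 0), 1.0, 0.5, 2))
```

<p>The shader variant on the outside strip does exactly this, while the inside chunk uses plain hardware filtering.</p><p>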
The foam RenderTexture is sampled as a mask over an artist-made repeating foam texture.</p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/hscNF0ULZZPverL6wiYne_k4r94C3CtnkS3KE4T_Pkqk2VaWBzl-TiGqP7FFNX4hzSIuhG8yrT3JXiuKlrzf0zCSttLkCRFwJS649XErF3qRs1aIptpI3jMXJMkOb5GEU8OL_53u" class="kg-image" alt></figure><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/99iKbnZFRxTrnDJW7wnO7mg2eF5C4Au4ncq9zx0qIqlbqsKVDyStMtohUQ-2uPJOlpEiOtMkBgFUAvJaGk9HNV48dm3V-pekPOzkgJV2AkGlKZXuxohbroNGwWmhgUAYr5-PXAmm" class="kg-image" alt></figure><hr><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/cyDJxbiZqWWuTARYcKiE_3nD1R6-cErmn72rEXLjMqxyuyXSaVk_dwzGW_-JHwl8Dm4AvoCh00J-ct51l_VJ6B5aA-5-xH9H8LN70f1N9ZmiSi6X0ibY3pvjBAOXA-Wpu-5XeTqC" class="kg-image" alt></figure><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/7fdgWGgtrDs-emy-GcPU_ufcXf4XCQ9S6r-HFVy1K2VXcq6ioxIHketUEKAf8esfM8o5olNyoDqt1s6aNlKnRrMAkSHTiUw4or4qlS7k4fsu7Ap9NPhuS_eXsvRTc92QtNC1duES" class="kg-image" alt></figure><p>Here are some stylised examples of what is possible with this shader.</p><p>The wireframe is a typical wireframe shader: fragments know their distance from the edge of the triangle. The wire is lit up based on the foam mask RenderTexture, and lighting is applied to entire triangles at once.</p><hr><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/1kg27CYHI58oJP-WPtOEqkZ7ksWNkq6j4a_RsmtCpVvhlPR1en3QQTADgIsWwq3qEeHjlzL9qFWwhXsSqeaUCJTqBVpNdeiA8vSEETlPkssPPTlfpRdVww1gBfZS09u2BXIeC8ub" class="kg-image" alt></figure><p>For gameplay, this water is only really useful if there’s two-way coupling. This means I want Unity’s Rigidbodies to affect the water, and the water to affect the Rigidbodies!</p><p>To ramp off waves, I need to be able to get the wave height from the wave texture. The problem is, it’s not just some array to sample; it’s memory on the GPU. 
Latency from GPU -&gt; CPU is no joke, so we need to be clever.</p><p>The water plane objects have a trigger to collect a list of touched “floatable” objects that need to query wave height. A List is used to build a buffer of texture coordinates (made by taking world position -&gt; local position -&gt; uv position) to pass to a Compute Shader, which samples the texture at each coordinate and fills an output buffer with wave heights and calculated wave normals.</p><p>Passing a buffer of positions to a Compute Shader, doing the handful of texture samples on the GPU, and returning only those results to the CPU gives us the data we need to calculate buoyancy for hundreds of objects, if necessary, without the significant performance impact of waiting on a whole texture to come back to the CPU.</p><p>However, processing buoyancy math for hundreds of objects would still be slow; C# is bad at math. This is a good use of Unity’s Job System with the Burst Compiler. Buoyancy for multiple objects can be calculated in parallel, since they do not directly affect one another. 
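</p><p>The whole round trip can be sketched end-to-end. In this illustrative Python stand-in, one function plays the Compute Shader (one texture fetch per requested UV) and one plays the Burst job. The buoyancy formula and constants here are made up for illustration; <em>Wave Break</em>’s actual buoyancy math is more involved:</p>

```python
# Illustrative sketch of the height-query round trip: UVs in, heights out,
# then a per-object buoyancy force. Names and constants are my own.

def sample_heights(wave_tex, uvs):
    """Compute Shader stand-in: one texture fetch per requested UV."""
    size = len(wave_tex)
    heights = []
    for u, v in uvs:
        x = min(int(u * size), size - 1)
        y = min(int(v * size), size - 1)
        heights.append(wave_tex[y][x])
    return heights

def buoyancy_forces(object_heights, wave_heights, strength=9.8):
    """Burst-job stand-in: push up when an object sits below the wave."""
    return [max(0.0, wave - obj) * strength
            for obj, wave in zip(object_heights, wave_heights)]

# One object under a wave crest, one floating over flat water:
wave_tex = [[0.0, 0.0], [0.0, 1.0]]
heights = sample_heights(wave_tex, [(0.9, 0.9), (0.1, 0.1)])
forces = buoyancy_forces([0.5, 0.5], heights)
```

<p>Only the small heights buffer ever crosses the GPU/CPU boundary, never the whole texture.</p><p>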
The slowest part now is iterating over the results of the job and calling the Rigidbody.AddForce functions.</p><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/DQUVn25Ir1hm4_hTms7JOtnAUVTkL23qWhoXMP27P_jfm8Pf9uQtcC0XvgroykarRPOahZsQPMKZ52sFPK8GhE4_221cLpQVSq02BSVTEejlp4p1-uiF_C1gv-zc6viJFbIvpWCO" class="kg-image" alt></figure><hr><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/1nSbDu6cByXAhPPFJ0mDyzihpMH29_-fcpsuq_7lbGVK6GQpMGoKbbkSvJk4i7m-XYUH3gMcvk5Is89ise1P4ZRoZ3J6hOMslIcmbU5yRKf3ESbnwaQQQ0IF4Ksmk1NM-IBVcoOP" class="kg-image" alt></figure><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/BpbopjIOTM3PvJiC_lYNkitVeiibb69eL0TUpxEYrfykzydG57bwuEcpqaYQ-ZF6vsFgTlvedOK0jA4hLlr0X3qQY6Z07VIL-KbaMxE7SJ1WcjDWB-MZo-wg2UK81--OpdPhbXtZ" class="kg-image" alt></figure><p>The Burst job is managed by another script which checks each water plane’s distance from the nearest cameras to pick an LOD, used for both physics and rendering.</p><p>Notice those high-detail strips of triangles between the LODs? Remember when I mentioned we would need a strip of triangles for the manual bilinear filter which samples nearby textures along the edges? 
That’s what that is!</p><p>The core of the wave system is complete; the rest is polish!</p><hr><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/xmG3Yny_IKXIEZImurOAcRZ1Nevf4WhvhK4C1lqb9JbO91XPvvTo0jsxt9fAEi4I6aS_SoD0FQsZzQduCtJJ9zh5gwlx_5oJ_Lt31ibyQ9SPcbOYnaPt1iJdstMIHIzAKW811el2" class="kg-image" alt></figure><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/CkfGW0xGRV1qiQR7oRjjTzJqxp2p6MBgM24OwaUADRcHm0dD8wf_OnJ9kzOfKAWhM-SKm2w8TCgNdpNxHYWxGIiyB8sPahc9OsOn2M0nWNKpwRSAm1qXHfP2r6Sx4VkzkbPpQ3cq" class="kg-image" alt></figure><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/r1zKajamvVUnB8zRBz28q-9SOKeaxaFjap6ZP5k36CS3Oio-ACEXEXBZTH_ZyAyTCJILYbXmPKPZxAhBZWBDPigQXnqoMPjZ1FzBrwDgTNYNSEVL-GDzum8xkApp1LgzteqtIWtV" class="kg-image" alt></figure><p>Here’s a question: at what time step do we want to run the physics sim? Stepping every frame makes the water move super fast (and framerate-dependent), but a fixed timestep makes it look like a slideshow.</p><p>Solution: time travel?!</p><p>I keep the previous wave texture around and linearly interpolate between the last physics tick’s texture and the current one, based on how much time has passed since the last physics step. Because our method is so resolution-dependent, we can adjust our timescale based on quality settings in-game. Halve the wave sim resolution in quality settings? Then we can also halve the timestep! 
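</p><p>The interpolation itself is tiny. An illustrative Python sketch, with single floats standing in for whole textures (the real blend happens per-pixel in the shader):</p>

```python
# Illustrative sketch of the between-tick blend: the renderer mixes the
# previous and current sim states by how far we are into the current step.

def render_height(prev_state, curr_state, time_since_tick, timestep):
    t = min(time_since_tick / timestep, 1.0)  # clamp at the next tick
    return prev_state * (1.0 - t) + curr_state * t

# Halfway between two physics ticks, render the midpoint of the two states:
print(render_height(0.0, 1.0, time_since_tick=0.025, timestep=0.05))
```

<p>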
Huge performance savings for lower quality settings!</p><p>Bonus: this lets us have slow-motion waves by just changing the timescale variable in Unity!</p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/aYd-pvSsA2Zb4P08AZauQKbpwDrXeyTvrZ2j1UzXC7orjPbEsBroHQGsIjc2CYjCW_tas78fY8QzXk0cP7RB2OCKfMmKr8QVLF__mj8aQYyGf5cMf6M6GmcxLsxrwQ1DEZvXRl3g" class="kg-image" alt></figure><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/xhkkamLtDqU19oafZzF0fQDnRxz1GNOvGlzTxy_PSk2EVnLXkWzHl6qK3SpiY1fNuib0AUPS89hEb5JAzM9D_K4EkNx67TranC4u4R90tIQv3lDEjEC8vz_sEnVPaW3yQTuIHKHH" class="kg-image" alt></figure><p>The post process, for most games, is just comparing the screen position converted to world position against the water plane’s position/normal. More complex water in most games just calculates the world position of the water using the same wave function the vertex pass is using. Ours isn’t so simple; we need to go from screen space to world space to local space to texture space, get the value, then back to world space for comparison. We generate one matrix by multiplying all these matrices together on the CPU and pass it to the post process shader, so it’s only one multiply away from the correct position.</p><hr><figure class="kg-card kg-image-card"><img src="https://lh5.googleusercontent.com/Wi2DIXAjFO7kC5i0ioLIjkjxngAsHTOdvZSi9VhdbHjhcqSP_qejUPs-zMk2fV6g80LAg7TDUdG3kfemC1iB8bPW_CHceNpO4tL0OGoSCyYU3oLRFJo1MEibVEeTPQSZZ1WEYRYW" class="kg-image" alt></figure><p>That’s it! Any questions? 
You can hit me up.</p><ul><li>Email: <a href="mailto:coty@funktroniclabs.com">coty@funktroniclabs.com</a></li><li>Twitter: <a href="https://twitter.com/cotycrg">@cotycrg</a></li></ul><p>And be sure to check out our game!</p><ul><li>Website: <a href="http://wavebreakgame.com/">wavebreakgame.com</a></li><li>Twitter: <a href="https://twitter.com/WaveBreakGame">@WaveBreakGame</a></li><li>Discord: <a href="https://discordapp.com/invite/funktroniclabs">discord.gg/funktroniclabs</a></li></ul><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/u29IyXxbtPsr_JIse1ZjoD0nT7SrjgObwg38BeCB8MY8wMbhi4Je8IJeAevYHvLPElpMFPt_nZIH0XruQj4ZFWI3q0LcpwKoqMzuxRiSJkKjz-g4dMsdAFe52p4ttJ6UEz9LADpi" class="kg-image" alt></figure><p>Thanks for reading!</p>]]></content:encoded></item></channel></rss>