GameDev Blog: Goblin Rules Football #20: CRT Screen Effect with Shader Graph

While I wait for Goblin Rules Football to release on October 20 (add it to your Steam wishlist now!), I was thinking of things I could add to the game that wouldn’t impact gameplay much. It’s too close to release to start adding new game mechanics!

A month or so ago I discovered Acerola’s videos on shaders/technical art and remembered their video on “Loop Hero’s CRT Effect.” In the video, Acerola creates a shader that warps the image and pinches it in at the corners, as well as adds some “scan lines.” The end result looks like this:

I first created a new test project and got everything working right away. Just copy and paste Acerola’s code! Then, when I tried to do it in GRF, I ran into the following problem: The Universal Render Pipeline I was using didn’t support Acerola’s shader.

I’m guessing that I, or at least a real gamedev/graphics programmer, could have found a way to make it work without too much trouble, but after maybe two Google searches it seemed like I needed to try to implement it in Shader Graph instead.

I had previously made a dynamic sprite outline effect using shader graph. That was pretty easy though. All I had to do was copy what someone else did in a video! For this CRT effect thing, though, it was going to be different. I couldn’t find someone else’s video/write-up of doing it with shader graph (or at least didn’t find one that I liked), so I decided to be an “adult” and try to figure out how to do it myself.

At first, it was surprisingly easy. The shader code from Acerola isn’t particularly complicated and I was able to follow along well enough to create it in shader graph. I had a few issues (a lot, if I’m being honest) that took me a few days to resolve, but in the end I think I got the effect down. Here’s what the end result looks like:

And here’s it in motion!

The shader graph “code” can be found on GitHub here. As mentioned on GitHub, it requires a script called “Blit” that you can find on Cyan’s GitHub.

You might notice that none of the UI is affected by the shader. That’s because my canvas’s Render Mode is set to “Screen Space – Overlay” instead of “Screen Space – Camera.” The canvas is rendered after all post-processing, so the shader has no idea it exists. “Why not change the canvas Render Mode?” you might ask. Well, the issue there is that because I am using pixel perfect cameras, a canvas set to “Screen Space – Camera” causes the UI to “jitter” as the screen moves. It looks really bad, so I avoided it by using “Screen Space – Overlay” instead. The CRT effect doesn’t look as nice as it could since the UI is still “normal.” Oh well.

So, how did I make it? Let me tell you!

Trying to Graph Out Acerola’s Shader

The code for Acerola’s CRT.shader can be found on GitHub. Most of what will be created in Shader Graph will be in the fixed4 fp(v2f i) : SV_Target {} function/method/I-don’t-know-enough-about-shaders-to-know-what-this-is-actually-called. I think most of the stuff at the beginning can be ignored or is taken care of already by URP/shader graph. However, some variables do need to be declared.

sampler2D _MainTex;
float _Curvature;
float _VignetteWidth;

To add these to your Shader Graph, create the following properties:

Add the above variables

All of these variables will be used later. Now, let’s get started creating the graph! The first bit of code is the following:

float2 uv = i.uv * 2.0f - 1.0f;

In shader graph, that looks like this:

First, create a new UV node. Then, drag its “Out” value onto the graph and create a “Multiply” node that multiplies it by a float of 2. That output then goes into a “Subtract” node that subtracts a float value of 1. As Acerola describes it, this will “functionally center the UV coordinate.”
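A quick sanity check on what that remapping actually does (assuming the UV starts out in the usual 0-to-1 range):

// uv = 0.0  ->  0.0 * 2.0 - 1.0 = -1.0   (left/bottom edge)
// uv = 0.5  ->  0.5 * 2.0 - 1.0 =  0.0   (center of the screen)
// uv = 1.0  ->  1.0 * 2.0 - 1.0 =  1.0   (right/top edge)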

Next is to create an offset value for warping the edges of the image.

float2 offset = uv.yx / _Curvature;

First, take the output from that “Subtract” node and put it into a Split node. Then, feed the G value into a Vector2 node’s X value. That Vector2 output goes into the A input of a Divide node, and finally the Curvature variable goes into the B input of the Divide.

Acerola’s code says to use the “.yx” values of the UV here, but for some reason trying to do that, or just putting the UV results directly into the divide node, didn’t “work” quite right for me. After testing it out a bit, using the y/Green value from the split gave me the effect most closely resembling what I wanted. I have no idea why. ¯\_(ツ)_/¯

Next, CRT.shader does the following:

uv = uv + uv * offset * offset;

So, the graph for this looks like this:

The Subtract node holds the UV data. The Divide node contains the offset value. So, first multiply the UV by the offset (the Subtract node and the Divide node both go into that middle Multiply node). Then, the result of that multiplication is multiplied by the offset again (the middle Multiply feeds the next Multiply, along with the Divide node). Finally, the UV value is sent to an Add node, along with the result of the uv * offset * offset calculation from the second Multiply node.

Is this getting confusing? Well, it was confusing making this too!
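If it helps, here is a quick worked example of what that line ends up doing, using the Curvature value of 10 that I set on the material later:

// At the center: uv = (0, 0), so offset = (0, 0) and the UV doesn't move at all.
// At a corner:   uv = (1, 1), so offset = uv.yx / 10 = (0.1, 0.1), and
//                uv + uv * offset * offset = (1, 1) + (1, 1) * (0.01, 0.01) = (1.01, 1.01)
// The corners get pushed slightly past the edge of the texture, which is what
// "pinches" the image in at the corners once everything is mapped back to 0-1.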

Anyway, the next step in the code is this:

uv = uv * 0.5f + 0.5f;

In shader graph:

The UV values are in the Add node from before. First, multiply the UV by 0.5f. Then, add the 0.5f value. The UV coordinates should now be warped! Wow!

The next bit of code is:

fixed4 col = tex2D(_MainTex, uv);

To do this in Shader Graph, you will get something like this:

Add the MainTex variable to the graph (note: the name needs to specifically be MainTex so that its reference is set to _MainTex and URP knows what to do with it). You then output that to the Texture value of a Sample Texture 2D node.

Next bit of code is:

if (uv.x <= 0.0f || 1.0f <= uv.x || uv.y <= 0.0f || 1.0f <= uv.y)
	col = 0;

I could not, for the life of me, figure out how to do this in Shader Graph. According to the Acerola video, this checks whether any UV values are greater than 1 or less than 0, and if so, sets the color at those coordinates to 0 (black). This seems to be there to prevent the final texture from repeating around the edges that are warped away from the edge of the screen. For whatever reason, even though I ignored this section of the shader code completely in the end, my textures didn’t seem to repeat or cause issues. ¯\_(ツ)_/¯
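For what it’s worth, here is one branch-free way that same check could probably be written (and, in theory, rebuilt with Step and Multiply nodes). To be clear, this is just a sketch of an alternative, not what I actually did, since skipping it worked fine for me:

// 1 if the uv is inside the 0-1 range on both axes, 0 otherwise
float inBounds = step(0.0f, uv.x) * step(uv.x, 1.0f) * step(0.0f, uv.y) * step(uv.y, 1.0f);
col *= inBounds; // blacks out anything sampled from outside the texture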

Next shader code:

uv = uv * 2.0f - 1.0f;

Shader Graph:

This is similar to a previous step. The UV info is currently in the Add node. Take that result and multiply it by 2.0f, then subtract 1.0f.

The next bit of code is for creating a vignette around the edges of the texture (basically darkening and softening the edges so they look better).

float2 vignette = _VignetteWidth / _ScreenParams.xy;

Shader graph:

This is where the VignetteWidth variable is used. It’s used twice: once divided by the screen width and once divided by the screen height. The results of those two divisions are then stored in a Vector2 node.
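As a concrete example (assuming a 1920x1080 game window and the VignetteWidth of 30 that I set on the material later):

// vignette = 30 / (1920, 1080) = (0.0156, 0.0278)
// i.e. the vignette fade only covers a thin strip, a few percent of the screen, on each edge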

Next is to do a “Smoothstep” of the vignette. Shader code:

vignette = smoothstep(0.0f, vignette, 1.0f - abs(uv));

Shader Graph:

First, the UV value is sent to an Absolute Value node. Then, that is sent as the B value of a Subtract node, and is subtracted from 1.0f. That is then sent to the “In(2)” value of a Smoothstep node. “Edge1(2)” is kept at 0, and then the Vignette Vector2 is sent to “Edge2(2)”.

Next, Saturate is called on the Vignette value. Shader code:

vignette = saturate(vignette);

Shader Graph:

All you should have to do is take the result of Smoothstep and send it to a Saturate node.

The next few steps are going to look a bit more convoluted in shader graph. The next bit of shader code is:

col.g *= (sin(i.uv.y * _ScreenParams.y * 2.0f) + 1.0f) * 0.15f + 1.0f;

Shader Graph:

This seems like a good time to use the phrase “clear as mud” to me. This starts with the “split” node in the top left, and then ends with the Multiply node in the bottom right, going in a kind of snake flow.

First, the code takes the sine of i.uv.y * _ScreenParams.y * 2.0f. It’s important to note that it is the “i.uv” value, not just uv; i.uv is the unmodified UV from the very first UV node. So, drag the result of that very first UV node all the way over to the Split node. Then, the y/G value of i.uv is multiplied by the screen height and by 2.0f, then fed into the Sine node. That value then has 1 added to it, is multiplied by 0.15, and finally has 1.0 added to it.

Finally there is the multiply node. What is it doing? Well, the line of code starts with col.g *= . So, the “G” value from the sample texture needs to be dragged down to this multiply node, and then multiplied by all this Sine math stuff.
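If the numbers help, that whole sine expression just bounces the green multiplier between 1.0 and 1.3 as you move down the screen:

// sin(...)          ranges from -1.0 to 1.0
// sin(...) + 1.0f   ranges from  0.0 to 2.0
// (...) * 0.15f     ranges from  0.0 to 0.3
// (...) + 1.0f      ranges from  1.0 to 1.3
// So the green channel gets boosted by up to 30% in thin horizontal bands (the "scan lines").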

Hopefully that makes sense. If it doesn’t, well, bad news. The next step is pretty similar. Shader code:

col.rb *= (cos(i.uv.y * _ScreenParams.y * 2.0f) + 1.0f) * 0.135f + 1.0f; 

Shader Graph:

Again, start at the Split in the upper left and end on the (two) Multiply nodes on the right. This is almost exactly the same as before, but instead it affects the SampleTexture’s R(ed) and B(lue) values, hence there being two Multiply nodes.

One thing I added that isn’t in Acerola’s shader is a kind of moving “scan line” effect that goes down the screen. You can see between the multiply and add nodes highlighted below that they aren’t connected.

The shader code would tell us to connect those. Instead, I added the following to create a moving scanline effect/screen flicker effect:

First, a time node is created and multiplied by 0.5f. The 0.5f will affect how fast the scan lines move down, so you can adjust as you see fit. Then, that is added to the G value of i.uv (the original UV value). That is output to a “simple noise” node. The noise is then multiplied by the result of the Multiply node from above, the one missing the connection to the Add node. Finally, the last multiply is fed to that disconnected Add node.
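Written out as shader code, I think my addition looks roughly like the snippet below. This is just my best guess at an HLSL equivalent of the graph, with SimpleNoise standing in for whatever Shader Graph’s Simple Noise node actually generates, and using the cosine line from the previous step as the example:

float flicker = SimpleNoise(i.uv.y + _Time.y * 0.5f); // SimpleNoise is a placeholder for the Simple Noise node
col.rb *= (cos(i.uv.y * _ScreenParams.y * 2.0f) + 1.0f) * 0.135f * flicker + 1.0f;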

You can always just remove this section. This probably isn’t the “right” way to do the effect I want either, but it gave me something that looked close to what I wanted so I kept it. It’s also a bit more subtle than other moving scanline shaders I saw online, which I was happy with.

Finally, all this work that’s been done needs to be sent to our game. The final line of the shader code is:

return saturate(col) * vignette.x * vignette.y;

Shader graph:

There are a lot of lines here that I imagine make this look very confusing. First off, the “col” value needs to be saturated. The col values have all been modified by the Sine and Cosine math from before; remember, those chains ended in Multiply nodes. So, send the results of all those last Multiply nodes to a Vector4, matching their RGB values to the XYZ values (R = X, G = Y, B = Z). The Sine function modified the G/Y value, and the Cosine function modified the R/X and B/Z values. Then, that Vector4 is sent to a Saturate node.

The Split at the bottom is splitting the Vignette value from waaaayyyy back. vignette.x (the R of the Split) is multiplied by the result of the Saturate node. Then the result of that is multiplied by vignette.y (the G of the Split node). Finally, that result is sent to the “Base Color” of the shader graph’s Fragment output. The A(lpha) value of the Sample Texture 2D node is sent to the Alpha value of the fragment.

The final graph in all its (unintelligible because it is too small) glory:

The shader graph is complete! Save the asset!

But wait, there are a few more things to do. First, create a material from the Shader Graph asset: right-click on the shader graph asset and create a material from it.

Select the new material and set the Curvature value to 10 and the VignetteWidth to 30 (or whatever the heck you want!):

Finally, you need a way to actually use the material. Thankfully, someone named Cyanilux (or simply Cyan) has already created something for that, called Blit. You don’t need the whole repo. Just download the Blit.cs file and add it to your Unity project.

You will need to add Blit as a render feature to your render pipeline asset. First, to find your render pipeline asset, go to Edit -> Project Settings -> Graphics. Double-click on the “Scriptable Render Pipeline Settings”:

In your Inspector the Render Pipeline Asset should be selected. There, you can find the Renderer. Double-click it:

With the Renderer selected in the inspector, you should see “Add Renderer Feature” at the bottom. Click that and select Blit.

Then, under the Blit settings, set the Blit Material to the material created from the new Shader Graph:

You’re done now! Even when the game isn’t running, you should be able to see the screen effect in the Game view tab of the Unity Editor (though the moving scanlines don’t really show up well).

Turning the Effect On/Off During Runtime

I personally really like the effect, but some players may want to turn it off. To give them the option, I simply created a “RenderFeaturesManager” with the following code that allows it to be turned on and off:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class RenderFeaturesManager : MonoBehaviour
{
    // The Blit renderer feature that applies the CRT material (assigned in the Inspector)
    [SerializeField] ScriptableRendererFeature feature;

    // Called (e.g., by a settings menu toggle) to turn the CRT render feature on or off
    public void EnableRetroCRT(bool enable)
    {
        Debug.Log("EnableRetroCRT: " + enable.ToString());
        feature.SetActive(enable);
    }
}

Then, add this script to whatever GameObject you want to use to control the feature, and have some way for the EnableRetroCRT function to be called to toggle it on and off. I just used a checkbox/toggle in my settings menu, but you can handle that however you want, really. Just make sure that “Blit” is added as the “feature” to be turned on and off:

Next Steps…

Uh, well, wait for Goblin Rules Football to be released on October 20, of course! I don’t know what else to do right now. I’m sure I will find something…

Smell ya later, nerds!