In this article, I'll take a look at what Shader Graph is, how to use it, its features, how to create a simple shader with it, some useful nodes with examples, and some things to look out for in shaders created with older versions of Shader Graph. With this information, you should be ready to create some basic Shader Graph node effects!
I will leave you with some additional resources that you can check out — these can help you create and customize your own desired visual effects that couldn’t be covered in the scope of this article.
Here’s what you’ll learn:
- Shaders and why they are important
- What is Shader Graph?
- Using Shader Graph
- The Shader Graph window’s seven components
- Useful inbuilt Shader Graph features
- Creating your first shader
- How to use popular nodes (with examples)
- Differences between new and old Shader Graph versions
- Additional resources
Shaders and why they are important
Before I get into Shader Graph, we need to know what shaders are and how they affect us.
Shaders are mini programs that run on the GPU and are used for texture mapping, lighting, or coloring objects. Everything that gets displayed on screen (whether by a computer program or a game, including on gaming consoles) goes through some sort of shader before it is displayed.
For most programs, this is handled automatically behind the scenes. In 3D modeling software, shaders are usually added to the model before it is finally rendered. Film studios use them to render effects for movies, and games use them to display everything.
If you use Unity, you are using them without even realizing it: everything displayed has some sort of material that uses a shader.
This image illustrates some of the different types of shaders available by default in Unity.
Now you may be asking yourself, if Unity already provides shaders, why do we care about writing our own? The simple answer is that the default ones are bland and do not add anything special, like water, holographic, ghost, glow, or dissolve effects, just to name a few.
We can get some of these effects by changing settings like the Surface Type (Opaque or Transparent), specifying Metallic or Specular, adjusting Smoothness, or adding an Emission. The problem is that these are all static. What we want are dynamic effects or custom settings that we can use to add some polish to the look and feel of our game or application.
Different effects that we can obtain with shaders
Let’s take a look at some of the different shader effects. These are just a small fraction of what is possible. Images are from an Internet image search for “Shader Graph Unity effects”:
From a popular YouTube programming channel:
From the Unity blog:
What is Shader Graph?
Now that we know what shaders are and see some of the different possible effects that we can make from shaders, let’s take a look at how we can create those shaders.
Shaders used to be created through code; over the years, different software companies have been adding tools that allow the creation of shaders through visual node-based systems. Unity is no different; they have given us Shader Graph.
Refer to Unity’s Shader Graph features for the highlights: “Shader Graph enables you to build shaders visually. Instead of writing code, you create and connect nodes in a graph framework. Shader Graph gives instant feedback that reflects your changes, and it’s simple enough for users who are new to shader creation.”
Shader Graph was designed for artists, but programmers that are not shader programmers can also use it for easy shader creation. Hey, not all of us work with AAA studios that have the budget for a dedicated team of shader programmers/artists. Also, shaders can require a lot of knowledge of complex math and algorithms to hand-code.
For example, here is a snapshot of the code generated for a new shader created in Unity, which does basic lighting of a model with a texture for the color. This is lines 1571 to 1580 of the 1,685 lines of code:
What do I need to use Shader Graph?
Unity notes these requirements to use Shader Graph:
Use Shader Graph with either of the Scriptable Render Pipelines (SRPs) available in Unity version 2018.1 and later:
As of Unity version 2021.2, you can also use Shader Graph with the Built-In Render Pipeline.
This means that as long as we are using one of the SRPs, we can use Shader Graph out of the box.
Even though we can install it with the Package Manager and use it with the Built-In Render Pipeline, Unity goes on to state, “It’s recommended to use Shader Graph with the Scriptable Render Pipelines.”
The first step is to create a new project in Unity Hub using the URP or HDRP template (or install one of the SRPs into an existing project).
Create a project using the URP or HDRP template.
Once the scene loads, I add a capsule to my scene that I can use to display my effects (feel free to use any 3D object that you want, including your own model). I also create a material that I can apply my custom shader to and add it to my capsule.
Using Shader Graph
Now that we have a very basic test scene, let’s create a new Shader Graph. Right click > Create > Shader Graph > SRP you want to use > type of shader.
In my case, I am going to use URP > Lit Shader Graph.
The Shader Graph menu will always contain Blank Shader Graph (a completely blank shader graph, no target is selected, and no blocks are added to the Master Stack) and Sub Graph (a blank Sub Graph asset, a reusable graph that can be used in other graphs) options.
There should be a submenu for each installed render pipeline that contains template stacks. In my case, I have URP and Built-In submenus. The template creates a new Shader Graph that has a Master Stack with default Blocks and a Target selected. You can always change these settings in the Shader Graph window later.
Now that we have a Shader Graph created, let’s set the material that we are using for this shader.
There are two ways to do this:
- In the Inspector Window for the material (all Shader Graphs are in the Shader Graph submenu)
- Drag and drop the shader onto the material
Basic shader graph definitions and terms
Now is a good time to pause and go over some useful terms, some of which I have already used; i.e., Master Stack, Blocks, Target.
- Spaces — the coordinate systems that nodes expect their inputs and outputs to be in
- Object space — the positions of the object's vertices relative to the object's center/pivot point
- World space — the positions of the object's vertices relative to a point in the world. In HDRP, this position is relative to the camera's position (camera-relative rendering), excluding the camera's rotation, and there is an additional absolute world space, which is a fixed point in the world. In URP, world space and absolute world space are the same
- Tangent space — relative to the vertex and its normal
- View/eye space — relative to the camera’s forward direction, takes into account the rotation of the camera
- Clip space — relative to the screen, once the view space is projected
Let’s also cover some more complicated terms in depth.
Graph Target (target)
This is the render pipeline that the Shader Graph is for; you must have the render pipeline installed in your project for it to be available in the list. Not all blocks are compatible with all targets.
You can have multiple targets: this allows for easy creation of Shader Graphs that can be used in all render pipelines without having to do duplicate work. This can be changed in the Graph Settings.
Properties and keywords
Properties
These are variables that we can use to modify the shader's values after it has compiled.
For more info on properties, see the Unity docs.
All properties have the following settings. Other settings are available depending on the data type:
- Name — the name that gets displayed for the property, i.e., in the Shader Graph
- Reference — the name used internally by the shader. It must begin with an underscore (_) and is automatically set to the Display Name with spaces converted to underscores, but you can change it. This name is used to access the property from C# scripts with the Material.Set and Material.Get methods (see the sketch after this list)
- Precision — for details, see Precision Modes | Shader Graph | 12.1.7
- Exposed — if true, the property will be exposed in the Material Inspector
- Override Property Declaration — if enabled, lets you choose the Shader Declaration, i.e., how the property is declared in the generated shader code
- Default — the default value for the property. This value depends on the data type of the property
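To make the Reference setting concrete, here is a minimal sketch of reading and writing exposed properties from a C# script. The _BaseTint and _Glow references are placeholders for whatever Reference names you set in the Blackboard:

```csharp
using UnityEngine;

// Minimal sketch: accessing exposed Shader Graph properties by their Reference names.
// "_BaseTint" and "_Glow" are placeholder references; use the values shown in the
// Node Settings of your own properties.
public class PropertyAccessExample : MonoBehaviour
{
    private void Start()
    {
        // .material returns a per-renderer copy, so edits only affect this object
        Material mat = GetComponent<Renderer>().material;

        mat.SetColor("_BaseTint", Color.cyan);  // Material.Set* writes a property
        float glow = mat.GetFloat("_Glow");     // Material.Get* reads one back
        Debug.Log($"Current glow value: {glow}");
    }
}
```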
Keywords
Keywords are used to create different variants for your Shader Graph (the shader behaves differently depending on the value of the keyword).
Keywords are an advanced feature and beyond the scope of this article. At the time of writing, Material Quality is the only keyword type whose settings cannot be changed.
If you want to seek out more about keywords on your own time, the Unity docs are a good place to start.
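As a taste of how keyword variants are driven from the C# side, here is a minimal sketch for a Boolean keyword; the reference _USE_RIM_LIGHT is a made-up example, so substitute your own keyword's Reference:

```csharp
using UnityEngine;

// Minimal sketch: toggling a Boolean keyword defined on the Blackboard.
// "_USE_RIM_LIGHT" is a placeholder reference; use your keyword's actual Reference.
public class KeywordToggleExample : MonoBehaviour
{
    public bool useRimLight;

    private Material mat;

    private void Start() => mat = GetComponent<Renderer>().material;

    private void Update()
    {
        if (useRimLight)
            mat.EnableKeyword("_USE_RIM_LIGHT");
        else
            mat.DisableKeyword("_USE_RIM_LIGHT");
    }
}
```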
Block node
Block nodes are part of the Master Stack and are what the outputs of the shader nodes are connected to. Different shader stages (Vertex or Fragment) have different Block nodes.
Vertex Stage
The position, normal, and tangent of the vertices.
- Position — the position of the vertex after being moved by the shader
- Normal — the direction the vertex points
- Tangent — typically lies perpendicular to the vertex normal; it is recommended to change this if you change the vertex normal
Fragment Stage
The Fragment Stage operates on the pixels after the Vertex Stage. This is the color and lighting.
Which Blocks are available depends on the Target and Material settings.
- Base Color (albedo) — the color of the object without lighting affecting it. This is a Vector3 color, meaning it only holds the red, green, and blue values
- Normal — this is used for Unity's inbuilt lighting calculations
- Emission — the light that the object emits
- Metallic — takes a float from 0–1 defining how metallic an object is; only used with the Metallic workflow
- Specular — takes a color that affects the specular highlights; only used with the Specular workflow
- Smoothness — a 0–1 value for how smooth an object is, from rough (0) to a polished mirror (1)
- Ambient Occlusion — a 0–1 value for how much ambient lighting reaches the pixel, from fully blocked (0, lighting is artificially reduced) to not blocked (1)
- Alpha — a 0–1 value for how transparent a pixel is. For sprites, this is typically the alpha of the color sampled from the sprite's texture
- Alpha Clip Threshold — pixels with an alpha below this threshold get culled
- Sprite Mask — not defined in the documentation; grayed out for all materials except the Sprite types
Node categories
Shader Graph has over two hundred different nodes that can be used to create a shader; refer to Unity's Node Library for details on all of them.
The nodes are organized by categories in the Create Node menu:
- Artistic: Colors, color channels, and texture operations
- Channel: The order and value of each component of a vector
- Input: Basic primitive types, sampling textures, and getting information about the mesh
- Math: Math operations
- Procedural: Procedural operations like noise
- Utility: Utility nodes, like Preview, Custom Shader Function, and Logic
- UV: Transform the UVs used to sample textures
- Block: The nodes of the Master Stack; represents a part of the surface used for the final shader output. The Unity docs have more info
To access the Create Node menu, you can right-click in the Shader Graph view and select Create Node or press the spacebar. Nodes can be found by looking through the submenus or typing them in the search bar.
Nodes have input ports and output ports. The different ports accept different data types depending on the node. Not all nodes have both kinds of port; for instance, the Block nodes have no output port.
Unity tried to make Shader Graph extremely user friendly, which also helps with debugging your effect: you can see the results of an effect along the way as much as possible. You do not have to wait for the final output to be connected to see how things change.
The Shader Graph window’s seven components
The Shader Graph window has seven main components. Let's go over them so you can become familiar with these windows.
The main window
This is the Shader Graph window itself. It contains (listed in order by view precedence) the Toolbar, Blackboard, Graph Inspector, Main Preview, Master Stack, and Nodes.
You can zoom in and out with the scroll wheel, pan with the middle mouse button, and drag-select and move items around with the left mouse button. To open this window, all we have to do is double-click on a Shader Graph Asset in the Project view.
Blackboard
This is where we can add the properties and keywords. The Blackboard can be moved anywhere within the main view but always stays within it, so we can never lose it. It can be turned on and off.
Main preview
This is a preview of what the final output of the shader will look like. The Main Preview can be moved anywhere within the main view but always stays within it, so we can never lose it. It can be turned on and off. You can select one of the inbuilt meshes to use or use a custom mesh (any mesh object that is in the project).
Nodes
The nodes of the graph. They can be moved anywhere within the main view, and they can be added and deleted.
Master Stack
The difference between this node and all the others is that we cannot add or delete it. It will always be displayed above all other nodes.
Graph Inspector
The Graph Inspector contains the Graph Settings and Node Settings tab. The Node Settings tab changes depending on which node/block/parameter/keyword we have selected.
This window can be moved anywhere within the main view but always stays within it, so we can never lose it. It can be turned on and off.
Toolbar
Fixed at the top of the window, the Toolbar contains the File buttons on the left and the Color Mode selection.
Useful inbuilt Shader Graph features
The Shader Graph is packed full of useful features. These are a handful you’ll be using frequently; let’s see how you can take advantage of them.
Accessing documentation
One helpful item is the ability to access the documentation straight from a node. With the node selected, press F1 on the keyboard.
Please note that even though some of the other items in the window have the same option, the link does not always resolve correctly. If you run into that issue, delete everything in the address bar after /manual/ and look for one of the keywords using Filter Content.
Smart connections
Connections cannot be made between nodes with incompatible types. When you try, instead of making the connection, Shader Graph brings up the Create Node window, which displays all of the nodes that have an input of the type you are using as an output and an output of the type you are trying to use for the input.
For example, I have a Texture Asset that I want to use for the Base Color Block. The Texture2D Asset has an output Texture2D, but the Base Color takes an input of Vector3. When I try to connect the output of the Texture2D Asset to the input of the Base Color Block, I get a Create Node menu that has all of the nodes that have an input of Texture2D and an output that is compatible with a Vector3.
When I choose Input > Texture > Sample Texture2D, I get a new Sample Texture2D node that has a connection to the Texture2D asset node. Now all I need to do is pick which output of the Sample Texture2D I want to use (I will use the RGBA output, which is a Vector4).
Grouping
You can group nodes together by clicking and dragging to select them.
Sticky Notes
Notes that can be used to add details, a to-do list, comments, etc.
Sub Graphs
Take a group of nodes that perform a specific function and turn them into a Sub Graph. These Sub Graphs can be reused in other graphs like any other node. Find yourself repeating the same steps over and over in multiple shaders? Turn it into a Sub Graph.
Notice how it left the properties in the main graph. Double-clicking on the Sub Graph opens it in a new Sub Graph window. The properties got duplicated, but the defaults changed back to Unity defaults.
Multiple Shader Graphs
You are able to have multiple Shader Graphs open at one time. Each Shader Graph or Sub Graph will open in its own Shader Graph window. These windows can be moved around just like any other window in Unity. You can even copy nodes from one Shader Graph to another.
Viewing the code for a node
In addition to the node library documentation, you can view the generated code for each node: right-click the node and select Show Generated Code. This will open the code for that node in your code editor.
You can also view the code for the entire shader by selecting it in the Project view and selecting one of the view/show code buttons.
Creating your first shader
Now we have everything needed to create a simple shader. Let's make a Lit shader that can be used in both the Built-In Render Pipeline and URP and that has a color that can be set in the Inspector. If you want to include HDRP as well, just ensure that HDRP is added to your project.
Project setup
To start, create a Shader Graph asset and open it in the Shader Graph window. I have already created a Lit Shader Graph, so I just double-click on it to open it. If you have not created one yet, right-click > Create > Shader Graph > SRP you want to use > type of shader.
In my case, I am going to use URP > Lit Shader Graph.
In the Graph Settings, make sure that the Shader Graph has all the targets listed. In my case, I use Built-In and Universal (feel free to use just one, e.g., Universal). Then make sure each of them is set to Lit.
Next, create a new node. I want to be able to change the color, so I need a color node.
Connect the node and set the output
Now the Color node needs to be connected to the Master Stack. To do this, I click on the output of the Color node and drag it to the input of the Base Color Block. Notice that the connection updated the Main Preview.
Next, I changed the output color of the Color node to a blue. Notice that this was updated in the Main Preview, too.
Note that I could have created the node and connected it to the Base Color Block in one step. To do this, click on the input of the Base Color Block and drag to an empty space in the Shader Graph main window.
Save the graph
For this to be used in my scene, I have to make sure to save the graph.
Of course, if it doesn’t appear on my object in the scene, then the first step is to make sure that the Render Component of the GameObject is using a Material that is using the shader.
Convert the color to a property
The last requirement is to make this color settable in the Inspector. For this, I need a property.
I can create a property in the Blackboard and add a new Color Property.
Then either drag and drop it or use the Create Node menu and select it in the Properties submenu.
I can also simplify this step and convert the Color node that is already in the graph to a property.
The most important thing to note is the Node Settings of the property. In order for it to be exposed in the Inspector, we need to make sure the Exposed setting is set. The other setting that we need to take note of is the Reference setting; this is the name to use in C# scripts.
Don’t forget to save the asset.
The shader is complete
Now we can change the color in the Material’s Inspector.
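Because the property is exposed with a Reference name, the same color can also be driven from a script at runtime. Here is a minimal sketch; I am assuming the Reference is _Color, so check the Node Settings of your property and adjust:

```csharp
using UnityEngine;

// Minimal sketch: cycling the exposed color at runtime.
// "_Color" is an assumed Reference; use the one shown in the property's Node Settings.
public class ColorCycler : MonoBehaviour
{
    private static readonly int ColorId = Shader.PropertyToID("_Color");
    private Material mat;

    private void Start() => mat = GetComponent<Renderer>().material;

    private void Update()
    {
        float hue = Mathf.Repeat(Time.time * 0.1f, 1f); // slowly walk around the color wheel
        mat.SetColor(ColorId, Color.HSVToRGB(hue, 0.8f, 1f));
    }
}
```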
This is a very simple shader, so let’s take a look at a side-by-side comparison of the code.
Using Shader Graph is definitely easier and quicker. Technically, I got two shaders because it works on two different render pipelines.
How to use some of the popular nodes (with examples)
In this section, I will try to show some of the common things that shaders are used for and the nodes that are used to make them. I will show different nodes in each example, but I will not be able to go through all 200+ nodes that are available.
If you want to know what a different node does that I don’t show here, I suggest checking the documentation, adding it to your Shader Graph, and seeing how it works. Remember: most nodes have their own preview, so you do not have to connect it to anything to see its effects.
Most of these nodes are reusable and can be connected together, so I will convert them to Sub Graphs, and the effect can be added to any other Shader Graph. Every example will use the Property node (probably more than one). The Property node works the same as its base type node with the difference being that the Property node can be easily accessed from C# scripts and the Material’s Inspector.
Texture and color effects
One of the most basic things that you will want to achieve is to have a texture that is associated with your model and have it rendered as a color. Typically, the base texture is combined with a color and then applied to the final output. Both the texture and color are usually settable from within the Inspector.
New nodes you’ll learn for the color and texture effects
Color Node
Color outputs a Vector4 representing an RGBA value. The Mode is Default or HDR.
Texture 2D Asset Node
Allows you to select a Texture2D from the project’s assets. Used in conjunction with Sample Texture2D [LOD] node types. Allows a Texture2D to be loaded once and sampled multiple times.
The Texture2D property provides two additional settings that are not available on the Texture2D Asset node:
- Mode — what gets output if the Texture is not set. You can choose White, Black, Grey, Normal Map, Linear Grey, or Red
- Use Tiling and Offset — set this to false to manipulate the scale and offset separately from other texture properties, for example with the Split Texture Transform Node | Shader Graph | 12.1.7
Sample Texture2D Node
Sample Texture2D takes a Texture2D and returns a Vector4 color (RGBA). This can only be used in the Fragment Shader stage. For the Vertex Shader stage, use Sample Texture2D LOD node instead.
Some settings to remember:
- Node Settings > Use Global Mip Bias: Enables the automatic global mip bias (set during certain algorithms to improve detail reconstruction) imposed by the runtime
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
- Type: Default, Normal — the texture type
- Space: Tangent, Object — the space of the normal (Type must = Normal)
These are Sample Texture2D’s inputs:
- Texture: Texture2D to Sample
- UV: (UV) Vector2 representing the UV coordinates. Use one of the mesh's UV channels (UV0–UV3) or provide your own
- Sampler: (Default sampler state) The Sampler to use. For more details, see Sampler State Node | Shader Graph | 12.1.7
And its outputs:
- RGBA: Vector4 color
- R: Float red (x) component of RGBA
- G: Float green (y) component of RGBA
- B: Float blue (z) component of RGBA
- A: Float alpha (w) component of RGBA
The Preview is a visual representation of the RGBA output.
Sample Texture2D LOD Node
Sample Texture2D LOD takes a Texture2D and returns a Vector4 color (RGBA). This is the node to use in the Vertex Shader stage. On unsupported platforms, it may return opaque black instead.
Some settings to remember:
- Node Settings > Use Global Mip Bias: Enables the automatic global mip bias (set during certain algorithms to improve detail reconstruction) imposed by the runtime
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
- Type: Default, Normal — the texture type
- Space: Tangent, Object — the space of the normal (Type must = Normal)
Sample Texture 2D LOD’s inputs:
- Texture: Texture2D to sample
- UV: (UV) Vector2 representing the UV coordinates. Use one of the mesh's UV channels (UV0–UV3) or provide your own
- Sampler: (Default sampler state) The sampler to use. For more details, see Sampler State Node | Shader Graph | 12.1.7
- LOD: The Level of Detail to sample
Sample Texture2D LOD outputs:
- RGBA: Vector4 color
- R: Float red (x) component of RGBA
- G: Float green (y) component of RGBA
- B: Float blue (z) component of RGBA
- A: Float alpha (w) component of RGBA
Multiply Node
As the name suggests, the Multiply node multiplies input A by input B (a small sketch of the math follows the list below).
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
Multiply inputs:
- A: Vector or Matrix
- B: Vector or Matrix
Multiply outputs:
- Vector or Matrix (depends on the types of Input A and B)
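Since this is the node that tints the sampled texture with the color, here is roughly what it computes per channel, sketched in C# with made-up color values (Unity's Color multiplication is component-wise, like the node's vector path):

```csharp
using UnityEngine;

public static class MultiplySketch
{
    // Rough illustration of the texture-times-color tint from the example above.
    // Unity's Color * Color operator multiplies the channels component-wise.
    public static Color Tint(Color sampled, Color tint)
    {
        return sampled * tint; // (r1*r2, g1*g2, b1*b2, a1*a2)
    }
}
```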
Rim light effect
For this, we’ll build a rim light that has a Color and Power exposed in the Inspector.
The new node you’ll learn for the rim light effect
Fresnel Effect
This node approximates a Fresnel Effect by calculating the angle between the surface normal and the view direction. This is often used to achieve rim lighting, common in many art styles.
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
Inputs:
- Normal: Vector3 — the space to use for the normal
- View Dir: Vector3 — the view direction to use
- Power: Exponent of the power calculation
Outputs:
- Float, representing a Fresnel Effect
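To give a feel for what the node approximates, here is a rough C# mirror of the math (one minus the saturated dot product of the normal and view direction, raised to Power); this is an illustrative sketch, not the node's generated code:

```csharp
using UnityEngine;

public static class FresnelSketch
{
    // Rough C# mirror of the rim term the Fresnel Effect node approximates.
    public static float Fresnel(Vector3 normal, Vector3 viewDir, float power)
    {
        float nDotV = Mathf.Clamp01(Vector3.Dot(normal.normalized, viewDir.normalized));
        return Mathf.Pow(1f - nDotV, power); // near 0 facing the camera, approaching 1 at grazing angles
    }
}
```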
Dissolve effect
This is a really cool effect that is used all the time. Some examples are bringing a game object into/out of a scene, destroying things that are on fire, or in a character editor changing the character (you dissolve the old character out and a new character in).
If you increase the DissolveScale while decreasing the DissolveAmount, you can get an interesting effect.
Be sure to enable Alpha Clipping in the Graph Settings and set the Alpha Block to 0.5.
If you want the inside of the model to render, set Render Face to Both, else set to Front.
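Because DissolveAmount and DissolveScale are exposed properties, a small script can animate the dissolve at runtime. A minimal sketch, assuming the DissolveAmount property's Reference is _DissolveAmount (check your Blackboard and adjust):

```csharp
using System.Collections;
using UnityEngine;

// Minimal sketch: dissolving an object out over a few seconds.
// "_DissolveAmount" is an assumed Reference name for the exposed property.
public class DissolveOverTime : MonoBehaviour
{
    [SerializeField] private float duration = 2f;

    private static readonly int DissolveAmount = Shader.PropertyToID("_DissolveAmount");

    private IEnumerator Start()
    {
        Material mat = GetComponent<Renderer>().material; // per-object material instance
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            mat.SetFloat(DissolveAmount, t / duration);
            yield return null;
        }
        mat.SetFloat(DissolveAmount, 1f); // fully dissolved
    }
}
```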
New nodes you’ll learn with the dissolve effect
Simple Noise Node
Simple Noise generates value noise based on the UV input, scaled by the Scale input. There are a couple of other noise nodes that can be used depending on the type of noise you want (e.g., the Gradient Noise Node generates Perlin noise, and the Voronoi Node generates Worley noise).
Inputs:
- UV: (UV) Vector2 UV value
- Scale: Float amount to scale the Input UV by
Outputs:
- Float, representing the noise
Step Node
Step returns 1 (true, white) if the value of input In is greater than or equal to the value of input Edge, and 0 (false, black) otherwise.
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
Step Node inputs:
- Edge: Dynamic Vector representing the Step value
- In: Dynamic Vector representing the Input value
Step Node outputs:
- Dynamic Vector: 1 if the input value (In) is greater than or equal to the step value (Edge), otherwise 0
Position Node
The Position Node provides access to the position of the vertex of the mesh; the position is relative to the selected space.
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
- Space: This is the space of the coordinate. It can be Object, View, World, Tangent, or Absolute World
As an example, here’s what happens when the Position Node is connected to the Simple Noise Node’s UV input in the Dissolve Shader.
This is the result if Space = World:
And this is the result if Space = View:
Twirl Node
The Twirl Node applies a black hole–like warping effect to the value of input UV.
Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
Inputs:
- UV: (UV) Vector2, the Input UV value
- Center: Vector2, reference point to the center
- Strength: Float, how strong the effect is
- Offset: Vector2, individual channel offsets
Outputs:
- Vector2, the new UV value
Preview: Visual Representation of what is outputted.
Checkerboard Node
This generates a checkerboard of the colors provided; the scale is defined by the Frequency. This node is used to apply a checkerboard texture to a mesh, and it's also useful in Shader Graph as a visual aid for seeing what effect changing the UVs by some value has.
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
Inputs:
- UV: (UV) Vector2, the UV value
- Color A: Vector3, color of first checker
- Color B: Vector3, color of second checker
- Frequency: Scale per UV axis
Outputs:
- Out: Vector3, the checkerboard color
Subtract Node
This node subtracts Input B from Input A
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
Inputs:
- A: Dynamic Vector, first input value
- B: Dynamic Vector, second input value
Outputs:
- Out: Dynamic Vector, result
Preview:
- Black for results <= 0 through white for results >= 1
Negate Node
Negate flips the sign of the value of In. Positive values become negative and negative values become positive.
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
Inputs:
- In: Dynamic Vector, value to flip
Outputs:
- Out: Dynamic Vector, result of flipping sign
Preview:
- Black for results <= 0 through white for results >= 1
Dissolve with emission effect
With the addition of one more node, we can have the dissolve effect emit light.
To help keep things neat and easy to read, I made the dissolve effect a Sub Graph. I output the Step Node result (this is what we had attached to the Alpha Threshold) and the Emission (we will need this for the Emission Block).
Dissolve with emission nodes
Add Node
Adds the two inputs together.
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
Inputs:
- A: Dynamic Vector, first input value
- B: Dynamic Vector, second input value
Outputs:
- Out: Dynamic Vector, result
Preview:
- Black for results <= 0 through white for results >= 1
Sub Graph
This allows you to add any of your custom created Sub Graphs.
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
You can include any dropdown properties defined on the Blackboard. For more info, see Subgraph Dropdown in the docs.
Inputs:
- The properties that you have defined in the Blackboard.
Outputs:
- What you defined in the Sub Graph Output Node. Must have at least one output
Combine them together
Now it is time to combine all of these into one Shader Graph.
Here are each of the previous effects as a Sub Graph:
And the Shader Graph, using all of them together:
Nodes for combining these all together
Comparison Node
This node compares two values against a condition. It's one of the many logic test nodes available; others are All, And, Any, Is Front Face, Is Infinite, Is NaN, Nand, Not, and Or.
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
- Dropdown: Equal, NotEqual, Less, LessOrEqual, Greater, GreaterOrEqual — the condition to use for comparison
Inputs:
- A: Float to compare to
- B: Float to compare with
Outputs:
- Out: Boolean result of comparison
The Preview is a visual representation of what is outputted.
Branch Node
The Branch Node returns a value based on a true/false condition. The other Branching Node is BranchOnInputConnection.
- Node Settings > Preview: Inherit, Preview 2D, Preview 3D — the type for the Preview area
Inputs:
- Predicate: Boolean, determines which input to use
- True: Dynamic Vector, value to use if Predicate is true
- False: Dynamic Vector, value to use if Predicate is false
Outputs:
- Out: either the True or False Input
The Preview is a visual representation of what is outputted.
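Because the Comparison node's A input is typically fed by an exposed Float property, you can switch between the combined effects from a script. A minimal sketch, assuming that property's Reference is _EffectIndex (yours may differ) and the legacy Input Manager is in use:

```csharp
using UnityEngine;

// Minimal sketch: cycling the float that drives the Comparison node so the
// Branch node picks a different effect. "_EffectIndex" is an assumed Reference.
public class EffectSwitcher : MonoBehaviour
{
    private static readonly int EffectIndex = Shader.PropertyToID("_EffectIndex");
    private Material mat;
    private int current;

    private void Start() => mat = GetComponent<Renderer>().material;

    private void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space)) // legacy Input Manager
        {
            current = (current + 1) % 3;     // three example effects
            mat.SetFloat(EffectIndex, current);
        }
    }
}
```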
Differences between new and old Shader Graph versions
There are some things to take note of in versions of Shader Graph older than 10.0.x, whether you are upgrading from an older version or come across a shader someone created in one and want to replicate it.
The big difference is the use of Master Stack instead of Master Nodes. What Blocks are available in the Master Stack depends on your Graph Settings. The use of multiple Master Nodes needs special attention. If you are upgrading, see Unity’s upgrade guide.
The other big difference between the two: all of the settings are now found in the Graph Inspector. The Graph Settings tab contains all of the graph-wide settings and the Node Settings tab contains all of the property settings and per-node settings.
There are some minor changes in the Graph Settings; for example, Two Sided is no longer a checkbox, and you use the Render Face enum (Front, Back, and Both) instead.
One last difference is the input for the Color Block of a Fragment Stack. This tip actually comes from Code Monkey’s video.
In the older version, the fragment output had a Color input that took a Vector4 value, which is RGBA. A newly created Shader Graph has a Color Block that takes a Vector3 value, which is just the RGB color, plus a separate Block for the alpha.
To get the alpha, you just add a Split Node and connect its alpha output to the Alpha Block's input. He points out that this matters specifically for sprites: with shaders in general, if you are using the alpha of a color, you are usually explicit about what that alpha value represents, whereas with sprites the expectation is that the alpha of the color controls the transparency.
Additional resources
Unity has several more resources if you’d like to keep learning:
- Art That Moves: Creating Animated Materials with Shader Graph: Talks about moving the vertices, using masks, and has some applied examples (a waving flag, windy grass, etc.)
- GitHub – UnityTechnologies/ShaderGraph_ExampleLibrary: A bunch of different effects created using Shader Graph
- Custom Function Node docs: A special node that you can use to inject your own custom HLSL code
- The Unity manual for understanding shader performance