When I see Ben's new video published, I always give a thumbs up before watching!
Very kind - thank you!
Love the video! These videos are very good; I'm extremely happy to have found them and I've been binging. Based on what I've seen, you know exactly what you're doing in these videos, but one minor detail (math geek here): the vectors don't have to be the same length (magnitude); they just have to have the same number of dimensions when you use the dot product. In Unreal or mathematically, there's nothing wrong with taking the dot product of a normalized vector and one that isn't. The dot product of two normalized vectors is a scalar ranging from 1 to -1, when they are parallel and anti-parallel respectively. When one of the vectors isn't normalized, the result can be any real 'float' value Unreal accepts. When plugged into the Base Color input, Unreal automatically multiplies your scalar by the identity vector (broadcasting it to all channels) and then clamps it from 0 to 1.
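A quick way to sanity-check this comment's point outside the engine (a plain-Python sketch; the `dot` and `normalize` helpers are my own, not Unreal's):

```python
def dot(a, b):
    # The dot product only requires that both vectors have the same
    # number of dimensions -- not the same length (magnitude).
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return [x / length for x in v]

a = normalize([1.0, 0.0, 0.0])
b = normalize([-2.0, 0.0, 0.0])    # anti-parallel to a
print(dot(a, b))                   # -1.0 (two normalized vectors stay in [-1, 1])

c = [5.0, 0.0, 0.0]                # not normalized
print(dot(a, c))                   # 5.0 (result can be any real value)

# Roughly what Unreal does when a scalar feeds Base Color:
# broadcast to all three channels, then clamp into 0..1.
s = dot(a, c)
rgb = [min(max(s, 0.0), 1.0)] * 3
print(rgb)                         # [1.0, 1.0, 1.0]
```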
I'm glad you're still working with Unreal even though you were hired by Unity. Your channel is gold!
Absolute gold!!
@@Slapdash86 This is not gold. This is uranium.
Nice meeting you today! Awesome channel!
Oh hey, thanks!!
3 years later these videos are still solid! Have there been any big changes to Shader Graph in Unity since 2021 that a beginner should be aware of? Thanks again. I like the intro!
The vector directions are backwards. According to the documentation, CameraVector points from the fragment to the camera (otherwise your render preview would look different). Also, when you subtract two positions, A - B, the result is a vector pointing from B to A (the opposite of the operand order). Thus, CameraPosition - AbsoluteWorldPos points from the object/fragment to the camera.
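The subtraction rule (A - B points from B toward A) is easy to verify with a tiny sketch (plain Python; the positions are made-up numbers):

```python
camera_pos = [0.0, 0.0, 10.0]
fragment_pos = [0.0, 0.0, 2.0]

# A - B produces a vector that points from B toward A, so
# CameraPosition - WorldPosition points from the fragment to the camera.
to_camera = [a - b for a, b in zip(camera_pos, fragment_pos)]
print(to_camera)  # [0.0, 0.0, 8.0]
```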
Great input - thanks a lot!
In Unity, the camera node's direction is backwards, so when we negate the camera direction we are in fact re-negating it 😉
Sir, the way you explain is much more understandable than any other tutorial. Thank you, Sir. Please keep the series going.
These kinds of videos help us a lot more than the Unreal docs (which are hell to look at).
I have a question: when doing the math for an object that appears black when close and white when pulled away, why do we use two math operations, subtract 500 and divide by 5000?
Exactly what I wanted. And super excited for next video about spaces :D
Yep - on its way next Thursday!
best shader tutorial ever!!
comment comment comment comment comment comment this is a comment this is a comment this is a comment yo yo this is a comment man dude thas a comment
...for the algorithm. Thanks for the infos and the series, I appreciate this!!
LOL! Thanks - glad you like the video!
It seems to me that the definition of a vector lacked the length when you use the direction. And I think that at minute 14:36, if you do the operation "Camera Position" - "Absolute World Position", you get a vector from "Absolute World Position" to "Camera Position": A - B gives the direction from B to A.
Indeed! And I think it's important: camera position - absolute world position means from the object to the camera, and that's exactly what we need to feed into the dot function, not camera to object. UE defines the camera direction as object to camera, but Unity defines it as camera to object, so you need to reverse it with a Negate node.
This is like a college course. Really high quality. (What do you do at Unity, Ben?)
Thank you! I'm a senior technical artist on the Shader Graph team.
Thank You
Thanks for Lessons!
Hi Ben, can you talk about the Custom node, how to use it, and the HLSL knowledge involved? Thank you for the course; I learned a lot from it. It would be better if there were Chinese subtitles. Sorry for my bad English.
Hi Ben! I love your tutorials. Are you perhaps thinking about making a book with all this shader knowledge? I'd really appreciate having it for quick reviews and study. Thanks!
Godot folks, here's how to do the second (Fresnel) shader (both in the fragment function):
Using world space coordinates:
```
vec3 VertexNormalWS = mat3(INV_VIEW_MATRIX) * NORMAL;
vec3 vertex_position_world = (INV_VIEW_MATRIX * vec4(VERTEX, 1.0)).xyz; // positions need the full 4x4 transform; mat3 would drop the translation
vec3 camera_vector_unnormalized = CAMERA_POSITION_WORLD - vertex_position_world;
vec3 vertex_normal_ws_normalized = normalize(VertexNormalWS);
vec3 camera_vector = normalize(camera_vector_unnormalized);
float color = dot(vertex_normal_ws_normalized, camera_vector);
color = pow(max(color, 0.0), 8.0); // pow is undefined for a negative base in GLSL
ALBEDO = clamp(vec3(color), 0.0, 1.0);
```
Using view space coordinates:
```
float color = dot(VIEW, NORMAL);
color = pow(max(color, 0.0), 8.0); // pow is undefined for a negative base in GLSL
ALBEDO = clamp(vec3(color), 0.0, 1.0);
```
I'm using "clamp" because in Godot (HLSL?) values >1 are treated as emissive.
Feeding the algorithm. Have a good weekend!
Thanks, great tutorial!
Fantastic job. Keep going.
Great Work!!!
Thanks!
So, by checking how close you are to the camera in a shader, is that how we would implement dithering?
I'd really appreciate it if you could answer this: why do we subtract the absolute world position from the camera position? If we want to increase the length of the camera vector, shouldn't we use a Multiply node to do that instead?
To create a vector between two positions in space, you subtract the position of one point from the other; the result is a vector between them. So if we subtract the world position of the point we're rendering from the absolute world position of the camera, we get a vector between those two points. Then if we measure the length of that vector, we can find how far our point is from the camera.
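That explanation in code form (a plain-Python sketch with made-up positions):

```python
import math

camera_pos = (100.0, 50.0, 30.0)   # absolute world position of the camera
point_pos = (40.0, 50.0, 30.0)     # world position of the point being rendered

# Subtracting the two positions gives the vector between them...
between = tuple(c - p for c, p in zip(camera_pos, point_pos))
# ...and the length of that vector is the distance from the camera.
distance = math.sqrt(sum(x * x for x in between))
print(distance)  # 60.0
```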
Oh, so this is just like your explanation of how water depth is calculated. Thanks a lot for the answer! I really love watching your videos; I learn a lot from them.
I've tried the third shader with exactly the same nodes/values, and it didn't work; I only see the dark material (UE 5.1).
Hey, I have an off-topic question. I have a large tile map (40+ tiles). When I change the landscape material, some of the tiles take the new material and others don't. The person I bought the map from said I need to bake them on, but when I do I lose all the grass etc. and it's blurry. Am I missing something?
In the case of the distance mask, how would one calculate only the nearest normal so that the mask uniformly 'fades out' the entire surface as you near it? The example in mind being those games that have objects that become 'transparent' when placed between the camera and a character in 3rd person and disappear entirely before intersecting with the camera.
The mask we made goes from white close to camera to black further away. What you need is a mask that is inverted - that goes from black close to the camera to white further away. So you would need to pass the mask we made in this tutorial into a OneMinus node to invert it.
@@BenCloward Not exactly what I meant. I meant how would you calculate off of only the nearest normal of the object so that even those surfaces further away but still attached to the same texture fade out uniformly (so you don't get that 'clipped' look).
@@Dilligff Oh - sorry I misunderstood. In order to do that, you would need to create a vector between the camera position and the object's position instead of between the camera and the current pixel. That way, the vector will be the same length for the entire object. So in the example in this video, we use the Absolute World Position node and you'd replace that with the ObjectPositionWS node instead (in Unreal). Hope that helps.
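A sketch of why swapping the per-pixel position for the per-object position makes the fade uniform (plain Python with made-up coordinates; `length_between` is my own helper):

```python
import math

def length_between(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

camera = (0.0, 0.0, 0.0)
object_center = (0.0, 0.0, 600.0)        # stands in for ObjectPositionWS
surface_points = [(0.0, -50.0, 550.0),   # stand in for per-pixel world positions
                  (0.0, 50.0, 650.0)]

# Per-pixel distances differ across the surface, which is what
# produces the clipped-looking, non-uniform fade:
per_pixel = [length_between(camera, p) for p in surface_points]

# The per-object distance is identical for every pixel, so a mask
# built from it fades the whole object uniformly:
per_object = [length_between(camera, object_center) for _ in surface_points]
print(per_object)  # [600.0, 600.0]
```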
Why is saturate considered free on the graphics card?
It's an operation that the compiler can usually figure out how to get done without costing any instructions or requiring additional computation.
Hi Ben, I'm a little confused. For the camera vector, isn't the start point at the surface and the end point at the camera position? So is it correct to say that the camera vector goes from the surface to the camera position, and not the other way round? (ua-cam.com/video/lrc-j7ub28U/v-deo.html)
Please, could you show how to reduce the large size for mobile in Unreal Engine?
Too hard but pretty good =))