Single Sample Seamless Tri-Planar Mapping (UE5)

  • Published 15 Nov 2024

COMMENTS • 168

  • @PrismaticaDev
    @PrismaticaDev 1 year ago +23

    I'm late to the party on this one, but this is fantastic! I'd been thinking of ways to use UV manipulation for mappings like this but it seems you've done all the thinking for us.. Awesome stuff :)

  • @megac0ffee
    @megac0ffee 10 months ago +8

    When I first learned of triplanar texturing I just implemented it without thinking about the details of the actual problem it solves: it looked good and that was good enough for me. This video just made my brain do a somersault when it realized that the traditional method is a ham-fisted solution that circumvents the problems that the UV solution (which I'd never even thought of but now seems like the more intuitive of the two) creates. Fantastic video. The YT algorithm blessed me today.

    • @TechArtAlex
      @TechArtAlex  10 months ago

      Thank you! It's always interesting seeing how different two ways of achieving the same goal can look.

  • @mathy6918
    @mathy6918 3 months ago +2

    Simply put, the most useful UE5 tutorial around.

  • @Simonschreibt
    @Simonschreibt 1 month ago +2

    This is very cool! Thank you so much for making such a detailed video.
    One thing you might find useful (I mention it because I saw that you often use right-click to preview a certain material node): you can assign a shortcut (I usually use SPACE) in "Editor Preferences" > "Keyboard Shortcuts" > "Start Previewing Node", and then whenever you press Space, the currently selected node gets previewed (or the preview is disabled when it's already previewed).

  • @marcusjohnston3138
    @marcusjohnston3138 10 months ago +3

    What a brilliant technique! Thanks for collating and improving this from the GDC talk, so useful! ❤

  • @pawelb2682
    @pawelb2682 7 months ago +1

    A super "Foxtrot Uniform Charlie Kilo India November Golf" nifty approach! Thx for sharing this insight! I'd been struggling with this for a while and never thought of integrating "Temp. AA". Big Hug!

  • @FluteboxFan
    @FluteboxFan 9 months ago +1

    Oh my god I just added this to my array landscape that already had a tiled height blend that completely removes tiling and it just works with 300 instructions and 2 texture samplers (not including the random texture selection & Dither texture obviously). This is just insane, thank you so much!

  • @LookItsCollin
    @LookItsCollin 1 year ago +16

    Your content is insanely valuable; it's a shame that content like this struggles to get views while the 80th "how to add jumping to your Unreal project" video gets all the attention.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      I really appreciate it. It's definitely slow going.

    • @americanbigbig
      @americanbigbig 1 year ago +3

      UE5 brought in a lot of new people to Unreal development, but give it time. As people learn more and more, they'll move on to content like this when they get out of the beginner stage. This is awesome content.

    • @MonsterJuiced
      @MonsterJuiced 1 year ago +3

      So true. Tri-planar is extremely valuable functionality that can save quite literally days or weeks of time :)

  • @Stygmire
    @Stygmire 1 year ago +4

    Just wanted to say thank you for the insight. Picking this apart was very educational. I was able to combine this with stochastic pattern-breaking techniques for a VERY efficient, non-tiling material, on 3 axes... Pretty much the holy grail right there. And it's very fast:
    Your material:
    Base pass shader: 156 instructions
    Base pass vertex shader: 46 instructions
    Texture samplers: 3/16
    Texture Lookups (Est.): VS(0), PS(4)
    User interpolators: 2/4 Scalars (1/4 Vectors) (TexCoords: 2, Custom: 0)
    vs the pattern-breaking added in:
    Base pass shader: 238 instructions
    Base pass vertex shader: 46 instructions
    Texture samplers: 3/16
    Texture Lookups (Est.): VS(0), PS(5)
    User interpolators: 2/4 Scalars (1/4 Vectors) (TexCoords: 2, Custom: 0)
    This is with BaseColor, PackedNormal_Height, and GlossRoughnessAO (3 map base).
    Since it's all UV trickery, additional layers are very cheap compared to what you might otherwise have to do.
    I think you've done it. :D Aside from the very first video, this might be the single most valuable stream I've watched on materials.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      Thanks! Yes I too have tested this method with tile-breaking techniques with great effect. Glad it inspired you.

    • @Stygmire
      @Stygmire 1 year ago +1

      @@TechArtAlex I've managed to get height-blending working with the Texture Array as well! Thanks again.

    • @Stygmire
      @Stygmire 1 year ago +1

      @@TechArtAlex Hello! I wonder if I might be able to trouble you with a follow-up question or two?
      Very appreciative of people like you and all the free knowledge I've been able to partake of, so thank you already for just the vid.
      To wit: I'm having a challenge with the nature of the alpha for multiple layers. Since the 3rd texture coordinate is linear (1 → 2 → 3, etc.) and the nature of an alpha for a blend is just 0 → 1, I can get multiple layers to show, but it seems the alphas for successive layers always 'pass through' the previous layers (à la LERP) and you end up with a border around later layers...
      E.g. going from layer 1 to 2 is fine, but from 1 to 3 requires 'passing through' 2; I'm unsure how to skip/bridge that alpha divide...
      Any thoughts, direction? Suggestions? I tried quite a bit to work it myself but I've been staring at it for too long...
      ref - imgur.com/a/8fTkI6b

    • @MonsterJuiced
      @MonsterJuiced 1 year ago

      Hey, how did you manage only 3 texture samplers in your stats? I have exactly the same setup as in the video and I'm still getting 9 samplers (colour, packed roughness, normal).

    • @Stygmire
      @Stygmire 1 year ago +1

      @@MonsterJuiced Make sure you set them to Shared: Wrap if they are using the same coordinates. This can reduce the use of sampler slots, since different textures can share UVs.

  • @sierrafoxtrot1331
    @sierrafoxtrot1331 1 year ago +2

    This video solved what I have spent the last two weeks trying to figure out. Great work! Anyone in game development can really benefit from understanding this information.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      Thanks for watching! Glad it helped.

    • @sierrafoxtrot1331
      @sierrafoxtrot1331 1 year ago

      @@TechArtAlex I keep trying to post a link to the Unreal dev forum but I guess YouTube won't allow it. If you get some time, do you mind helping me with part of the triplanar equation? Also, the video includes the solution to the array sample. Cheers!
      ua-cam.com/video/_XieCNlz_og/v-deo.html

  • @limarest761
    @limarest761 1 year ago +2

    This is huge, thanks for the video Alex

  • @jonludwig1632
    @jonludwig1632 1 year ago +2

    I'm very late to finding this one but that dither is absolutely brilliant. Targeting mobile hardware, I'm always looking to save a texture sampler when I can, and I'll definitely be employing this method.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      Better late than never! Thanks for watching.

  • @vaillecgart1854
    @vaillecgart1854 1 year ago +1

    It's an amazing find! Thank you so much for the explanation. As other comments have already stated, your videos are insanely valuable!!!

  • @jakebaxter7235
    @jakebaxter7235 1 year ago +3

    Thank you for this great video. One thing I can't quite figure out. What are the nodes labeled "R Vertex Normal", "Z Plane", "R Normal Map", etc.?

    • @TechArtAlex
      @TechArtAlex  1 year ago +1

      Those are named reroute nodes. Very useful for keeping larger materials tidy.

    • @jakebaxter7235
      @jakebaxter7235 1 year ago +2

      @Alex Are you kidding me?! I've been using the standard reroute nodes to organize forever now. I had no idea that "named reroute" nodes existed. Thank you for the edification!

    • @TechArtAlex
      @TechArtAlex  1 year ago

      @@jakebaxter7235 It's relatively new. Added sometime post UE5 if I'm not mistaken.

  • @yamiprincess
    @yamiprincess 7 months ago +1

    How does one go about combining this method with distance-based tiling repetition break-up? Would the tiling break-up code go where the "Tri-Planar Coordinates" comment is in your material?
    I would like to use triplanar projection for cliffs & caves but have code that makes them not repeat anymore. I'm not sure how to achieve that, because the only way I know to edit the UV coords is when the texture is sampled, which seems to be a one-time deal. I only know how to do it for either the triplanar projection by itself or the distance-based tiling repetition break-up. Not both at once.
    I found your video trying to learn how people think about triplanar projection & texture sampling in depth, but I can't find anywhere that brings up the specific problem I'm trying to solve. I learned a lot from your video either way though, and would love to use this method. Is there a way to combine it with distance-based tiling repetition break-up?

    • @TechArtAlex
      @TechArtAlex  7 months ago +1

      Yes. Just like how we use non-continuous UVs for the projection of different planes, we can use non-continuous UVs for the same plane to break up tiling. In the part of the material where you are creating a plane's coordinate system, instead create two coordinate systems, offset from each other. Then you can use noise to drive the blend between the coordinates instead of the surface's normal direction.
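
An illustrative Python sketch of the reply above (hypothetical helper name and offset values, not the video's material graph): two copies of the same plane's coordinates offset from each other, with per-pixel noise deciding which copy gets sampled.

```python
# Hypothetical sketch of noise-driven tile breaking: two offset copies of one
# plane's UVs; noise (instead of the surface normal) picks which copy a pixel
# samples. Each pixel still performs a single, discrete texture lookup.
def tile_break_uv(u, v, noise, offset=(0.37, 0.61), threshold=0.5):
    if noise < threshold:                      # noise in [0, 1)
        return (u, v)                          # first coordinate system
    return (u + offset[0], v + offset[1])      # offset coordinate system

print(tile_break_uv(0.20, 0.30, 0.1))  # low noise -> original UVs
```

With dither-style animated noise and TAA, the hard switch between the two coordinate systems averages into a soft, non-repeating blend over time.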

  • @PitPalmer
    @PitPalmer 1 year ago +1

    Great video. I wonder if with this we can achieve a tri-planar material that contains bump offset or parallax occlusion. Do you think it would be possible?

    • @TechArtAlex
      @TechArtAlex  1 year ago +1

      Triplanar and POM don't play nicely together, but it is technically possible. With this technique the samples needed would be much more reasonable.

  • @unumpolum
    @unumpolum 1 year ago +1

    Clever and elegant !
    I am curious what results we could get using this AnimatedBlueNoiseDitherAA to mimic the transparency...

    • @TechArtAlex
      @TechArtAlex  1 year ago +2

      Thanks! I have already tested this with masked transparency to mimic translucency. It seems to provide more even results and no blotchy orange peel compared to default noise. However, you can see a slight "crawling" effect from the noise as it animates. Unfortunately it does nothing to help against the high amounts of ghosting you get with dithered transparency, which is perhaps the biggest issue limiting its usability.

  • @dannys_85
    @dannys_85 1 year ago +1

    Very clever way to optimize. How can I get this shader, can you give a link? Thanks.

    • @TechArtAlex
      @TechArtAlex  1 year ago +1

      I don't currently have a download link for it, but the full material is shown in the video, so it can be recreated by anyone.

  • @cshainer1
    @cshainer1 1 year ago +1

    Can you expand on why you perform the round to create the banding? I'm not really getting what that's doing; probably a lack of knowledge on dithering.
    Great video overall and great technique!

    • @TechArtAlex
      @TechArtAlex  1 year ago

      Since we are trying to alternate between two discrete coordinates, rounding ensures that we are sampling from one position or the other and not a blend between the two. Sorry for the delayed response, I didn't get notified.
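
The effect of the round can be sketched numerically (an illustrative Python stand-in, not the actual node graph): adding per-pixel dither to the blend alpha and rounding snaps every pixel to exactly one of the two coordinate sets.

```python
# Sketch of "round to create the banding": dither shifts the blend alpha per
# pixel, and round() snaps it to 0 or 1, so each pixel samples one discrete
# coordinate set instead of a smeared mix of the two.
def pick_uv(uv_a, uv_b, blend_alpha, dither):
    t = round(max(0.0, min(1.0, blend_alpha + dither - 0.5)))  # exactly 0 or 1
    return uv_b if t else uv_a

# With alpha = 0.3, roughly 30% of dither values land on uv_b and the rest on
# uv_a, so the spatial/temporal average still matches the intended blend.
print(pick_uv((0.1, 0.2), (0.9, 0.8), 0.3, 0.1))  # low dither -> uv_a
```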

  • @weightednormal3682
    @weightednormal3682 11 months ago +1

    Thanks for clarifying.

  • @devjitpaul1191
    @devjitpaul1191 1 year ago +2

    This is insanely good, ** a PROPER AAA shader **. I can't believe this video gets overshadowed by slightly more popular videos about the topic which basically just use the standard projection technique. If possible, make more shader tutorials that cover more AAA topics, and make a Patreon; I'll support you on Patreon as much as I can.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      Thanks for your kind words and support. I do plan to keep sharing things like this when I can.

  • @arvinmoses
    @arvinmoses 1 year ago +1

    Awesome walkthrough. Thank you!!

    • @arvinmoses
      @arvinmoses 1 year ago +1

      Saw the quick flash over to the array graph too. Smart using dithering to blend between the uvs!

    • @TechArtAlex
      @TechArtAlex  1 year ago

      Thanks! It's amazing what a bit of dithering can do.

    • @arvinmoses
      @arvinmoses 1 year ago +1

      @@TechArtAlex Hahaha true true. Ooc with the arrays were you able to set up mips? This was a bottleneck for me in the past.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      @@arvinmoses Yes, arrays now support mips. Not sure when it was added, but you are correct that it wasn't supported in the past. You will need to manually set them in the asset though; I believe they are off by default when you create the array asset.

    • @arvinmoses
      @arvinmoses 1 year ago +1

      ​@@TechArtAlex Wow... super interesting... Thank you. Making me reconsider some pipelines now lol

  • @YoanJallais
    @YoanJallais 1 day ago

    Great stuff! Everything worked so well on my side until I tried to use this technique in a layered material. Layered materials apparently cannot have DDY and DDX in them, and I'm not too sure why, or if there's a workaround. Do you maybe have an idea?

    • @TechArtAlex
      @TechArtAlex  1 day ago

      I just gave the layering system a test and didn't find any issues with ddx/ddy. Are you using this workflow?
      dev.epicgames.com/documentation/en-us/unreal-engine/using-material-layers-in-unreal-engine

    • @YoanJallais
      @YoanJallais 22 hours ago +1

      @@TechArtAlex I just found the culprit! I use the opacity pin in my layered material to transfer a custom mask between my layers, even though the material is opaque in the end. Not using the pin seems to resolve the error. Another weird thing I found: if we don't use the ddy and ddx for mip maps and instead use the None mip mode, we get the blurry effect as you've shown, but if I activate Nanite on the mesh, the blurriness disappears and the result looks great. So yeah, I'll try to find another pin to transfer information without having this issue. Thank you again for the video, it is really amazing stuff!
      Edit: Adding a Break and then a Make Material Attributes node right before the output node of the layered material, and making sure opacity doesn't connect, solves the whole problem. That seems to solve a lot of other issues I had, actually!

  • @medmel2160
    @medmel2160 1 year ago +1

    Tri-planar seamless, nice, let's take a look. Single Sample?!!! TAKE MY MONEY

  • @kacperszwajka8163
    @kacperszwajka8163 1 year ago

    Great stuff! Interested in the way you implement hashed sampling. How do you avoid problems with anisotropic hashing and also with sampling noise as a texture?

    • @TechArtAlex
      @TechArtAlex  1 year ago +2

      Thanks! The hashing and noise are always in screen space instead of UV space so they should always be isotropic. The animation of the hashing and noise helps even out pretty much all problems, but inevitably leads to ghosting.
      The noise I essentially sample like a flip book. So while any single frame of noise may have less than ideal visual qualities, over multiple frames they likely cannot be noticed.
      The noise is ultimately optional, just the dithering alone can be enough but I think it is worth the low cost, especially since we probably have a good noise texture in memory at all times anyway. Since our eyes are very good at pattern recognition, I find without it the trickery is more easily spotted.
      The basic premise of the Ubisoft paper (and I suppose stochastic sampling in general) is that any single pixel sampled in space (or time in my implementation) is technically wrong - but on average across time and space they are right.
      This also means that the higher the resolution and the better the frame rate, the better the effect works.

    • @kacperszwajka8163
      @kacperszwajka8163 1 year ago +1

      @@TechArtAlex When I was prototyping this in Unity, screen-space noise seemed too unstable. Even when animating and using TAA it was easy to notice. I found some implementations on Shadertoy, but problems occur at hard angles. Maybe I did something wrong. Anyway... thank you for such a complete answer.
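
The "wrong per pixel, right on average" premise from the longer reply above can be demonstrated numerically (illustrative Python, with stratified values standing in for the animated noise):

```python
# Each simulated frame shows one of two texel values (never a mix); the
# temporal accumulator (standing in for TAA/DLSS) averages them, converging
# on the correct linear blend. Stratified "noise" keeps the demo deterministic.
def temporal_average(value_a, value_b, alpha, frames):
    total = 0.0
    for i in range(frames):
        dither = (i + 0.5) / frames                      # per-frame dither
        total += value_b if dither < alpha else value_a  # one discrete sample
    return total / frames

print(temporal_average(0.0, 1.0, 0.25, 8))  # prints 0.25, the true blend
```

This also illustrates why higher frame rates help: more frames in the accumulation window means the average settles closer to the intended value.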

  • @cosmotect
    @cosmotect 1 year ago

    Why of course.. dithering! Thanks for sharing!

    • @TechArtAlex
      @TechArtAlex  1 year ago +1

      It's kind of incredible that a 60 year old technique can still be used in new and interesting ways. Thanks for watching!

  • @marijanfabris9983
    @marijanfabris9983 7 months ago

    Any way to blend it more after dither? Like you have the dither transition and then you somehow average those pixels?

    • @TechArtAlex
      @TechArtAlex  7 months ago

      The averaging/blending is done by utilizing TAA, DLSS or any similar temporal accumulation technique.

  • @matteckenrodt
    @matteckenrodt 1 year ago +1

    Your channel is my new favorite UE advanced tech art channel. I'm a junior technical artist and I'm currently working on separating a large terrain into ~12 biomes. I'm interested in using this method for it. How would you handle separating the different terrain textures in a more natural way (instead of your linear gradient showcase)? Currently I'm using tons of Perlin noise textures and height lerps to get nice blends between things like rocks and dirt. I'd love a bit more explanation of your thought process on terrain texturing and RVTs. Maybe it's just using a more complicated masking system in the UV input of the texture sample (where you have the linear gradient logic). It could probably be a new video on its own (: Thanks!

    • @TechArtAlex
      @TechArtAlex  1 year ago

      That's high praise - thanks! Height lerp is a good option as, like you said, it allows for sharp transitions while maintaining a natural look. You can honestly get away with just a traditional linear blend too. But if the materials contrast sharply, I would keep the transition area as small as possible.
      After posting this video I noticed that the game "Grounded" appears to use a similar dithering for their landscape material. I noticed it easily but I doubt many players would.
      But you're right, while it could be a video on its own and I might make one on that, it ultimately is just a matter of masking. I've also experimented with using this method for terrain anti-tiling with some success.

  • @FORESTTELFORD
    @FORESTTELFORD 10 months ago +1

    Very well explained. Thanks for putting this together!

    • @TechArtAlex
      @TechArtAlex  10 months ago

      Sure thing, thanks for watching!

  • @enpremi
    @enpremi 1 year ago +2

    Interesting approach.
    I've tried this out (in UE 5.1, DX11, TSR) on a 3-planar material (created a basic one as you show at the beginning, and then a stochastic version with an animated noise like yours).
    Using it on a standard material (3 tri-planar: 1 for BaseC, one for masks, one for normals), TSamples went up by one (because of the noise I assume), TLookups went up by 3 on the VertexShader, but down by 6 on the PS. [ Basic-3sample-triplanar: TLookups: VS(0) PS(12) || Stochastic-1sample-triplanar: TLookups: VS(3) PS(6) ]
    Instruction-wise: Base3planar has about 195 || Stochastic3planar around 205
    So the cost seems to be negligible considering the save on Lookups.
    However, I noticed that if a plane using these materials fills up my screen, the Basic3planar costs around 0.5ms in Basepass, but the stochastic one spikes up to over 0.9ms.
    Also there is significant visual difference when applied to a larger flat surface. The stochastic instance is constantly noisy (regardless if there is any dithering applied or not). I narrowed it down to the mipmap value mode being derivative. (I've tried setting the mips to be Blur5, which helped a tiny bit, but obviously my mips looked horrible at that point)
    And because of this constant noise being present, the ghosting (especially from shadows) is just too much.
    I'll try to switch to DX12 and see if it makes any difference.

    • @TechArtAlex
      @TechArtAlex  1 year ago +3

      Thanks! Interesting results. I didn't find any increase in frame time with my testing on DX12.
      I also didn't notice any noise or mip issues, but I didn't do a lot of testing with larger surfaces. I did some on a landscape for both material blending and tile-breaking techniques with promising results. Importantly, there shouldn't be any noise applied in areas that aren't blending.
      But if the blend happens slowly over a large surface area, such as a terrain with subtle texture transitions, then I expect the ghosting will be too extreme. Any pixel being temporally blended will exhibit ghosting.
      You can also experiment with the frame blending vars. The ghosting will be directly impacted by the blending and weights, as well as the frame rate. It may be possible to further tune this.
      I believe I had a 4 frame blend, at 120hz.
      I do plan to revisit this in more detail down the road to test a fully featured terrain material though and see what the total visual and performance impact ends up being.

    • @medmel2160
      @medmel2160 1 year ago +1

      @@TechArtAlex would love to see that

  • @GTexperience_Channel
    @GTexperience_Channel 6 months ago

    I wonder, can you combine this with parallax?

    • @TechArtAlex
      @TechArtAlex  6 months ago

      It's technically possible but notoriously bothersome to combine POM with triplanar mapping. The issue is that POM relies on tangent space math, but triplanar mapping is done in world space. So there is extra math needed to deal with this (much like how the normal map needs a bunch of fixes). The other issue is that you would end up dithering between two different height values.
      Unlike color values, which can simply blend together, having POM flicker between two height values would not work. So the heightmap sampling would probably need to be done with more traditional triplanar methods - while the base texture could use this method.
      Dithering can be used with POM in other ways though, like dithering the number of steps.
      Personally I'm looking forward to Nanite tessellation as a much simpler and more flexible alternative.

  • @enriquemunoz6148
    @enriquemunoz6148 1 year ago

    What node is being used to make the DDX and DDY nodes? Thank you so much for making this video.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      DDX and DDY nodes can be added directly. The input for them is typically just a set of unaltered texture coordinates.

    • @shannenmr
      @shannenmr 1 year ago

      @@TechArtAlex Just be aware these DDX DDY nodes are not supported on Mobile as far as I am aware.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      Any hardware that does not support screen space derivatives for mip calculation probably wouldn't exhibit the artifact in the first place.
      But there are other ways to bias mips if needed, like distance or fresnel.

  • @PrivacyEnt
    @PrivacyEnt 1 year ago +1

    awesome technique!

  • @fuglong
    @fuglong 1 year ago +1

    You are a real G

  • @radivarig
    @radivarig 6 months ago

    Can you tell me how to make the Plane nodes, please?

    • @TechArtAlex
      @TechArtAlex  6 months ago

      I think you're referring to the "named reroute nodes". These can allow you to take a node path and treat it like a variable, which can be referenced elsewhere in the material, and named anything you like.

    • @radivarig
      @radivarig 6 months ago +1

      @@TechArtAlex Yep, thanks!

  • @cocopops733
    @cocopops733 1 year ago

    Thanks so much for making this, a really fantastic tutorial. Late in the video you mentioned using arrays and temporal AA on landscapes. I'm really interested in pursuing this. I set up a system to write out a mat ID map that's used to control which index of the texture array is used, and this works great. However, I'm struggling to work out how to set up the dithering for a system like this (I'm using a 64-texture array and it all seems a bit complex).

    • @TechArtAlex
      @TechArtAlex  1 year ago

      I made a working prototype using UE's built-in landscape material painting system. It passed through a layer ID and an alpha value. The layer was mapped to an index, and the alpha value was used to control the dithering.
      There are still some tricky parts to keep in mind. Like if you want to blend more than two materials in a single pixel, you will need a more complex dither setup. You also can't blend more materials than your TAA frame blend length. Increasing the period of the blend will increase ghosting. I believe the default is 4 frames, so in theory you could show one material on a given pixel for each of them and blend up to 4.

    • @cocopops733
      @cocopops733 1 year ago

      Sounds interesting... I'm working on a test level, so for now ghosting isn't a huge issue. Could you possibly show me your setup? I'd love to see how it works. "An alpha value was used to control dithering" is the part I'm unclear on.
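
One way to read "the alpha value was used to control the dithering" from the reply above (an illustrative Python sketch with hypothetical names, not the actual prototype): the painted alpha sets how often a pixel's dither picks the new layer's array index.

```python
# Hypothetical illustration of the layer-ID / alpha scheme: the painted alpha
# is the probability that a pixel shows the painted layer's array index; the
# dither picks exactly one index per pixel, and TAA averages across frames.
def pick_array_index(base_index, painted_index, alpha, dither):
    return painted_index if dither < alpha else base_index

print(pick_array_index(3, 7, 0.75, 0.5))  # dither below alpha -> new layer, 7
```

Each pixel still samples a single array slice per frame; the blend only appears after temporal accumulation.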

  • @clement8181
    @clement8181 10 months ago

    This is awesome, but it took me almost a day to get it working in GLSL. I found this piece of code which really helped me understand the method so I could write my own implementation. I get some dithering, but I'm not using any kind of anti-aliasing:

    float ScreenSpaceDither(vec2 screenPos, float iTime) {
        const vec2 ditherPattern = vec2(171.0, 231.0);
        float dither = dot(ditherPattern + iTime, screenPos - iTime);
        return fract(dither / 103.0);
    }

    vec2 calculateUniplanarUV(vec3 worldPos, vec3 normal, vec2 screenPos, float contrast, float iTime) {
        vec3 powAlpha = pow(abs(normal), vec3(contrast));
        vec3 alpha = powAlpha / (powAlpha.x + powAlpha.y + powAlpha.z); // i.e. divide by dot(powAlpha, vec3(1.0))
        float ditherValue = ScreenSpaceDither(screenPos, iTime);
        ditherValue = clamp(ditherValue, 0.01, 0.99);
        vec2 uv = alpha.x > ditherValue ? worldPos.yz :
                  (1.0 - alpha.z) > ditherValue ? worldPos.xz :
                  worldPos.xy;
        return uv;
    }

    // this helped me too: www.shadertoy.com/view/cdBfD3

  • @SirWintercrazy
    @SirWintercrazy 1 year ago +2

    Awesome video! It's great to see content like this. One little thing I noticed - I think there is a slight error in the x plane of your normal transform. I think you want the vector to be ZYX, but as you have it (at ~13:29 in) your x-plane vector is ZXY. I think you'd want to swizzle the R and G channels of your Normal Map.

    • @TechArtAlex
      @TechArtAlex  1 year ago +1

      Thanks! Just to make sure I went back and re-tested every side and it appears to be reacting to light as expected from all sides.

    • @단비-w8n
      @단비-w8n 1 year ago

      @@TechArtAlex Thank you so so much! Your video helps me a lot.😍 Is there any node to fix in the video? I saw this comment but I don't know how to fix it😂

    • @TechArtAlex
      @TechArtAlex  1 year ago

      @@단비-w8n
      ua-cam.com/video/VUoI_IESK7U/v-deo.htmlsi=LpLUkGnuZkSIaKim
      When using triplanar normals, you must "swizzle" or switch places of some of the vectors. If you switch them into the wrong place, the shading will react incorrectly. Here's a video that goes into detail.

    • @DaroxxFR
      @DaroxxFR 1 year ago +1

      ​@@TechArtAlex I can confirm that there is a problem, normals are quite strange compared to standard triplanar. Very great job nonetheless

    • @TechArtAlex
      @TechArtAlex  1 year ago +1

      @@DaroxxFR Weird that I somehow missed it - I thought I tested pretty thoroughly. But it's a simple fix. Thanks!

  • @monsterinsane2228
    @monsterinsane2228 1 year ago +1

    This is so awesome, again, thank you a lot for this tutorial!
    btw, I just got a weird result with the normal when trying to do it like yours; my normal is very dark/black, like it's not showing correctly. I've checked very carefully that I haven't done anything wrong, I'm not sure why...

    • @TechArtAlex
      @TechArtAlex  1 year ago

      Thanks for watching. You'll want to use the buffer visualizer to see which normal vectors are wrong for which projection so you can make the necessary adjustments. Or check out a more comprehensive tutorial on tri-planar normal correction since I know I gloss past it on this one. There are some videos that explain exactly which vectors need to be swizzled.

    • @monsterinsane2228
      @monsterinsane2228 1 year ago +1

      @@TechArtAlex I'll do that! Thanks a lot, master!

    • @MarcoMariaRossiArte
      @MarcoMariaRossiArte 1 year ago

      @@monsterinsane2228 what was it?

    • @monsterinsane2228
      @monsterinsane2228 1 year ago

      @@MarcoMariaRossiArte Heyy! I've fixed it already; I had connected the wrong channel and was mixing it wrong, like add/multiply on the red/blue channel. Now it works perfectly. Thanks for asking!

    • @DaroxxFR
      @DaroxxFR 11 months ago

      @@monsterinsane2228 So you first made a mistake, and doing it exactly like in the video works properly? :)

  • @mindped
    @mindped 6 months ago

    Question for you. In the Unreal material editor stats, I noticed that no matter how many samplers I use, as long as they are on the same texture (even with different UV modifications per usage) the stats still say they only use the same 1 sampler. Obviously if I assign different textures it adds to the samplers... I thought that every time you sample the same texture, it should use another sampler... I am using 5.4.1.
    What's the deal with texture samplers? What's the performance hit?

    • @TechArtAlex
      @TechArtAlex  6 months ago

      Unless something big has changed without me noticing, yes each sample node with distinct UVs requires a new sampler. Not sure why the interface would suggest otherwise.
      As far as the performance hit goes - anywhere from relatively low to massive. It all depends on the texture settings and if the GPU has other math it can be doing while it waits many cycles for the fetch to complete. There are also factors like cache coherency that can impact performance. When you sample the same texture more than once, sometimes the data may already be on the GPU cache which saves time.

    • @mindped
      @mindped 6 months ago

      @@TechArtAlex Looks like texture lookups for the pixel shader go up with each additional sample, not actual samplers.

    • @TechArtAlex
      @TechArtAlex  6 months ago

      Are you using Shared: Wrap/Clamp samplers? This setting is common in landscape materials to allow for more sample nodes than would be possible without shared sampling.
      In any case, the lookup is generally the slow part anyway.

    • @mindped
      @mindped 6 months ago

      @@TechArtAlex No, just using the default settings... not sure what a Shared: Wrap/Clamp sampler even is.

    • @mindped
      @mindped 6 months ago

      @@TechArtAlex I just tested Shared: Wrap... no difference. 2/16 samplers either way, and a separate pixel shader lookup for every texture sample node with Shared: Wrap as well.
      Though Shared: Wrap does do something rather interesting: it now keeps different textures at 2/16 rather than adding additional samplers, so the textures don't even have to match. Thanks for that, it's rather interesting.

  • @PitPalmer
    @PitPalmer 1 year ago

    I've been using a material with a mapping based on this tutorial, and it works great, with one exception: when applied to landscapes it gets all blurry. Somehow the DDX and DDY get all messy with landscapes. If I change the texture sample mode to "None (use computed mip levels)" the thing fixes itself, but I get the problem of blurry seams again. Do you know how to fix it?

    • @PitPalmer
      @PitPalmer 1 year ago

      It only happens when I have multiple layers. With a single one it doesn't seem to happen.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      I'll have to look into it. Off the top of my head, I'd try to bias the mip level being calculated by the DDX/DDY to make it use a higher-resolution mip than it calculated, to see if that corrects it. I'll see if I can replicate the issue.

    • @TechArtAlex
      @TechArtAlex  1 year ago +1

      @@PitPalmer Unfortunately I was not able to replicate this issue. Make sure you're using the unaltered texture coordinates for the DDX and DDY inputs. If you're using world coordinates as the input to DDX and DDY you'll probably need to divide the world coords by something like 100 to bias it into a range that is comparable to texcoord.

  • @VaikNay
    @VaikNay 1 year ago +1

    Could you sell this material for download?

  • @mindped
    @mindped 1 year ago

    I'm curious, what's the point of using ddx and ddy for mip levels? I don't quite understand the point of these nodes or when it's appropriate to use them on a texture.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      DDX/DDY are screen-space derivatives. Derivatives as in the calculus definition, meaning they measure the rate of change of their input. More specifically, they measure how a value changes from one pixel to its neighbors in screen space, along the X and Y axes. This is relevant for mipmapping, because if a texture coordinate changes rapidly in screen space, you know the surface is far away or at a steep angle to the camera. So you can calculate what mip should be used per pixel this way.
      The issue comes into play if you are using non-linear texture coordinates. If your coordinate system jumps all over the place, so will the rate of change. The pixels will be assumed to be farther away than they actually are due to the high rate of change, which causes the sampler to use a very low-resolution mip when shading those pixels.
      You should use them any time you make changes to the UV input of a texture sampler that do not uniformly affect all pixels. For example, if you add a constant value to offset your UVs, the whole image will move - but the rate of change in coordinates from one pixel to the next will not change.
      If you multiply your UVs, the rate of change will be higher, but this is okay because the texture is now smaller and the lower mip level is appropriate.
      Now imagine instead that you add a random number to every pixel. The rate of change could be anything, so the mip level will also randomly vary despite the fact that the texture may not have been scaled at all.
      This is why we feed the DDX/DDY *unaltered* coordinates. We're making it do the calculations based on a coordinate system that isn't all jumbled.
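      That derivative-to-mip relationship can be sketched in plain Python (a toy model, not engine code; the texture size and per-pixel footprints here are made-up values):

```python
import math

def mip_level(ddx_uv, ddy_uv, tex_size):
    """Mip selection from screen-space UV derivatives (per-pixel rate of change)."""
    # Convert the UV-space derivatives into texel-space footprints.
    fx = math.hypot(ddx_uv[0] * tex_size, ddx_uv[1] * tex_size)
    fy = math.hypot(ddy_uv[0] * tex_size, ddy_uv[1] * tex_size)
    # The hardware picks the mip whose texel size matches the larger footprint.
    return max(0.0, math.log2(max(fx, fy)))

# UVs advancing one texel per screen pixel -> mip 0 (full resolution).
print(mip_level((1/256, 0), (0, 1/256), 256))  # 0.0
# UVs jumping 4 texels per pixel -> mip 2 (quarter resolution).
print(mip_level((4/256, 0), (0, 4/256), 256))  # 2.0
```

      Feeding in a jumbled coordinate system inflates those footprints, which is exactly why the sampler drops to a blurry mip.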

  • @mindped
    @mindped 4 months ago +1

    Should I be worried that your method is making the texture upside down? Feels like it may cause problems with the normal map later, unless that's already factored in.

    • @TechArtAlex
      @TechArtAlex  4 months ago

      Most tri-planar mapping techniques will mirror the texture depending on which side of the object you are looking at, due to how projecting world coordinates work. Is that what you mean? This is true even for the built in WorldAlignedTexture material function. In most cases, these textures are surfaces where directionality is irrelevant - grass, rocks, so on - so this is a non-issue. Since fixing it requires extra instructions, and most of the time is irrelevant, it is usually left off.
      If there are elements like text within the image that require a specific directionality, then you need to mirror the coordinates using a dot product to determine which direction the face is pointing.
      If you un-mirror the texture coordinates, then you would need to add additional logic to flip the normal vectors in that direction as well. You can use the buffer visualizer to validate normal vectors to ensure the correct orientation.
      The normals in the video are correct for how this material was set up, but if you change stuff it will be up to you to validate them and make any necessary corrections.
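      A minimal sketch of that dot-product idea (plain Python, illustrative only; the axis indices and sign convention are assumptions, not the video's exact node setup):

```python
def triplanar_plane_and_sign(n):
    """Pick the dominant projection plane and which side of the object we're on.

    n is a world-space normal (nx, ny, nz). The sign can be used to
    un-mirror the U coordinate and to flip the corresponding channel of
    the tangent-space normal so lighting stays correct.
    """
    mags = [abs(c) for c in n]
    plane = mags.index(max(mags))        # 0 = X projection, 1 = Y, 2 = Z
    sign = 1.0 if n[plane] >= 0 else -1.0
    return plane, sign

print(triplanar_plane_and_sign((0.0, 0.0, 1.0)))   # (2, 1.0)  - top face
print(triplanar_plane_and_sign((-0.9, 0.1, 0.2)))  # (0, -1.0) - negative-X face
```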

    • @mindped
      @mindped 4 months ago

      @@TechArtAlex No, I mean the texture is literally upside down regardless of the rotation direction. It's a smiley face and it's upside down when normal UVs have it right side up. And it doesn't matter if I rotate it 180 degrees or not in this triplanar.
      If that's what triplanar does.. then that's what triplanar does. I won't worry about it. Was just curious.

    • @TechArtAlex
      @TechArtAlex  4 months ago

      @@mindped Are you scaling the texture or your mesh by a negative number? The texture's facing direction is determined by the UV coordinates. Multiplying them by a negative value will flip them. This can easily be changed with simple math like any other UV coordinate.

    • @mindped
      @mindped 4 months ago

      @@TechArtAlex Nope... this is literally your tutorial in this video. No difference other than I scale it at like 0.001.

    • @mindped
      @mindped 4 months ago

      @@TechArtAlex No, both the texture and mesh are scaled in positive values... the texture orientation stays the same regardless of the scale of the mesh.
      Also, this texture isn't rotated 180.. it's scaled -1 in only the one axis direction... I noticed the right side of the mouth is on the right side whether it's upside down (triplanar) or right side up (non-triplanar).
      If I try to set the texture to scale at negative 1, it makes it right side up.. but the right mouth is on the left now. So scaling the UV negative scales in all axes.

  • @MonsterJuiced
    @MonsterJuiced 1 year ago +1

    "take the planes and construct the UV's" at 9:14
    I knew you'd skip over those weird nodes there. Can you please tell me what they are? I can't find any info on them because I don't know what they're called :/
    UPDATE: never mind, I know now from finding a comment asking the same thing. They're named reroute nodes and I never knew they existed at all. This is actually game-changing for me. Thank you!!!

    • @TechArtAlex
      @TechArtAlex  1 year ago +1

      Yeah the named re-route nodes are a huge deal. One of my favorite new features in UE5.

  • @mindped
    @mindped 9 months ago

    Curious.. I'm using texture arrays... is there an appropriate way to set up the ddx/ddy? Because when I scale a mesh really thin and long, it starts blurring the texture with your DDX and DDY setup from tex coord 0... like it's using the wrong mip level. I was able to make it sharp by linking up the triplanar coordinates to ddx and ddy, but I'm not sure that's how I'm supposed to do that...

    • @TechArtAlex
      @TechArtAlex  9 months ago

      You can alter the DDX & DDY based on additional factors like scale if you need. Basically the way I show it assumes you want the same mip level for the triplanar material as you would a traditionally UV mapped material, which may not always be the case. If you multiplied the texture coordinates x and/or y before running DDX and DDY as the object changes size, you could keep the coordinates size constant, regardless of object scale.

    • @mindped
      @mindped 9 months ago

      @@TechArtAlex Not sure you are following. With texture arrays you can have multiple textures in the same array. To switch out the texture you just append the UV coordinates one more time with a number: 0 for the first texture, 1 for the second, 2 for the third, etc.
      I have that working with this triplanar material.. the problem is DDX and DDY with texcoord 0 break it so it doesn't run... Appending that third numeric value to DDX and DDY sort of works, but for stretched-out objects it uses the wrong mip level...
      I don't know how to make it work with the correct mip level for texture arrays and this triplanar...

    • @mindped
      @mindped 9 months ago

      @@TechArtAlex I have been trying to set up the mips manually.. not very good at this.. but it does seem it is also the scale of the object... any examples on how to set up mips so they are the same across multiple objects regardless of scale? Right now my ddx and ddy come from texcoord 0... I'm not sure that's the correct way of doing this in this circumstance.

    • @mindped
      @mindped 4 months ago

      @@TechArtAlex I'm back to this, and I'm noticing something... ddx/ddy from texcoord 0... My texture is scaled so it's really large. My object scale is 1,1,1... it is using blurry mips pretty close, very noticeably. And if I scale it to be larger in one axis or all axes, the mip stops being blurry... Any ideas? The way you have it set up for the ddx/ddy calculation doesn't seem to work for me very well.

    • @TechArtAlex
      @TechArtAlex  4 months ago

      ​@@mindped If you are multiplying/dividing to scale up the texture, you simply need to multiply/divide the texture coordinates used for DDX and DDY by an amount that results in a consistent rate of change.
      For most cases it should look something like this ((WorldCoords/100)/ScaleFactor) = UVs
      DDX(Texcoords/ScaleFactor) = DDX
      DDY(Texcoords/ScaleFactor) = DDY
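      To illustrate why the two scale factors have to match, here's a toy finite-difference stand-in for the GPU's ddx (Python; `world_per_pixel` is an arbitrary made-up rate, not an engine value):

```python
def ddx(f, x, h=1.0):
    # Finite-difference stand-in for the GPU's per-pixel ddx.
    return f(x + h) - f(x)

scale_factor = 4.0
world_per_pixel = 100.0  # world units crossed per screen pixel (arbitrary)

# UVs as built in the material: (WorldCoords / 100) / ScaleFactor
uv = lambda px: (px * world_per_pixel / 100.0) / scale_factor
# Texcoords that advance 1 UV unit per 100 world units, matching the projection.
texcoord = lambda px: px * world_per_pixel / 100.0

d_real = ddx(uv, 0.0)                                   # true rate of change
d_fed = ddx(lambda px: texcoord(px) / scale_factor, 0.0)  # what we feed the sampler
print(d_real, d_fed)  # equal rates -> the sampler selects the correct mip
```

      If the texcoords fed to DDX/DDY were left unscaled, `d_fed` would be 4x too large here and the sampler would pick a mip two levels too blurry.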

  • @mindped
    @mindped 1 year ago

    In Unreal Engine 5.1 it is adding 3 texture lookups instead of the 1 you claim :/... unless I'm confused... are texture samples the same as texture lookups?

    • @TechArtAlex
      @TechArtAlex  1 year ago +2

      The "texture sampler" line is to keep track of the max SM5 limit for non-shared samplers. Lookups are what generally matter from a performance perspective. But people often use the terms interchangeably.
      Even a completely empty material will show one texture sampler and 3 lookups. The second extra sample is from the noise texture on the dither TAA node, which could be removed in theory but is very cheap and low resolution.
      With traditional tri-planar you will have 3 texture lookups per map type. Assuming a separate color, roughness and normal map this would be 9 lookups in total, although only 4/16 samplers will be used because each of the 3 planes can share a sampler.
      With my technique you will have 1 texture lookup per map type, plus optionally one shared noise texture. Assuming again separate CRN maps you would have 4 lookups but you'd see 5/16 texture samplers being used due to the addition of the noise texture.
      The stock dither TAA node uses a 64x64 grayscale noise image, which is much cheaper to sample than most textures. The technique works fine without any noise too if need be which would make it the full 3 to 1 reduction claimed.

  • @Ethan-gu9hm
    @Ethan-gu9hm 1 year ago

    I attempted to replicate the material as shown here, but the result is extremely noisy. Tested with the Carbon Fiber textures from Epic's Automotive materials set, UE 5.3.2. It is particularly bad when the texture scale number is low, like anything 8 or less. But this material is not meant to be used at high texture scale value. Any ideas what could have been the cause of this?

    • @TechArtAlex
      @TechArtAlex  1 year ago

      Sounds like aliasing. If you are scaling the texture, you will need to scale the texture coordinates going into the DDX and DDY of the sampler too, otherwise it will calculate mipmapping based on the original texture scale.

    • @Ethan-gu9hm
      @Ethan-gu9hm 1 year ago

      @@TechArtAlex I did that; it doesn't seem to make any visible improvement. There is also another issue that I forgot to mention, which is that some of the sides appear darker than others.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      @@Ethan-gu9hm I would need to see the material itself to troubleshoot much more than that. Unusual darkness is usually a result of incorrect normal re-projection. Some comments say I may have made a mistake on the X-plane normal vector re-projection so take a look at that comment thread and see if the information there fixes it.

    • @null643
      @null643 6 months ago

      @@TechArtAlex I have the same issue on UE 5.4; using Quixel textures or even UE starter content is very noisy with texture scale set to 64. It gets less noticeable with increasing texture scale. Is there a fix for this?

    • @TechArtAlex
      @TechArtAlex  6 months ago

      @@null643 have you already tried adjusting the texture coordinates that are used for the DDX/DDY calculation to compensate for the texture scale? If the scale of those texture coordinates are not the same as the coordinates being used by the actual world aligned coordinate system, then it will select the wrong mipmap, which can cause noise and blur.

  • @LoopSkaify
    @LoopSkaify 6 months ago

    or you just use a shared sampler.

    • @LoopSkaify
      @LoopSkaify 6 months ago

      But Unreal already has this by default as the WorldAlignedTexture function

    • @TechArtAlex
      @TechArtAlex  6 months ago

      Shared sampling will still require 3 separate texture fetches per pixel with the default World Aligned material function, because it will interpolate between all 3 values. This method only requires one texture fetch per pixel.

  • @mindped
    @mindped 4 months ago

    In your current setup the mipmaps do not come in correctly. They get blurry too quickly. The simple Texcoord[0] into ddx and ddy doesn't work correctly.
    My triplanar coordinates have an add and multiply before masking while yours uses a divide. Outside of that I see no difference in the setup.
    With this current method, if I scale a plane in one axis to make it a rectangle instead of a square, it starts changing the mipmap... mips shouldn't change based on the scale of the object. (It also does this when scaling in all 3 axes at the same time.)

    • @TechArtAlex
      @TechArtAlex  4 months ago

      I don't experience any change in mip level by changing the scale of the mesh using tex coords. There must be something else in your material causing that. You are right, however, that the tex coord ddx/ddy is too simple; as mentioned in our other conversation, it also needs to be scaled to match the texture scale of the triplanar UVs / 100.

    • @mindped
      @mindped 4 months ago

      @@TechArtAlex I found that multiplying or dividing the UVs (for scaling) after the 2 lerps seems to work better for mipmaps, not using a texcoord 0 modification.
      The problem is that doing it this way blurs the dither areas... the only way to get the dither areas decent is to use the texcoord 0 method you use in the video.
      So you either have horrible mips fading in and out based on the model's overall scale with good transitions between the triplanar directions, or you can choose to have horrible transitions with the distance-based mips working properly.
      My guess is you don't realize you're experiencing this problem because your current setup keeps the largest mip loaded at way too long a range distance-wise.

  • @Extile00
    @Extile00 1 year ago +1

    I have tested this and my conclusion is that it works great, with one huge downside: it's noisy as hell. Also, the shader complexity view shows that it's actually less performant. But that view is known to be inaccurate, so I'm not really sure which is better. All in all, not really usable.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      When set up correctly, the noise/ghosting should be limited to exclusively the transition zones. You will want to avoid using it for halfway blends over large areas, because then temporal artifacts will be visible at a large scale. The shorter your blend transition, the better this will work.
      Also make sure TAA, TSR, or better yet DLSS are enabled. These eliminate noise but in exchange can introduce ghosting artifacts.
      The shader complexity view mode only counts the total number of instructions, it makes no consideration towards the actual cost of the instructions, despite certain instructions like texture samples or trigonometric functions having much higher costs.
      Thanks for sharing your results though, hopefully this information is useful.
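      The core trick of trading a blend for a per-pixel stochastic pick (which TAA/TSR/DLSS then averages over frames) can be sketched like this (plain Python; the values and blend weight are arbitrary):

```python
import random

def full_blend(a, b, w):
    # Traditional approach: sample both planes and interpolate (2 fetches).
    return a * (1 - w) + b * w

def single_sample(a, b, w, rng):
    # Dithered approach: sample only ONE plane, chosen with probability w,
    # so the temporal average converges to the full blend (1 fetch).
    return b if rng.random() < w else a

rng = random.Random(0)
w = 0.3
frames = 100_000
avg = sum(single_sample(10.0, 20.0, w, rng) for _ in range(frames)) / frames
print(round(avg, 1), full_blend(10.0, 20.0, w))  # temporal average ~= 13.0
```

      The per-frame noise is exactly why keeping the transition zone short matters: the variance of the pick is largest at w = 0.5, so wide half-blended areas show the most visible ghosting before the average settles.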

  • @gehtsiegarnixan
    @gehtsiegarnixan 1 year ago

    I think you calculate your derivatives wrong from the texture UVs, since your mesh could have any scale and your world coords can be scaled any other way. You should construct your ddx and ddy from the world coordinates: generating the ddx and ddy of the world coords and then calculating an undithered cubemapping from them should give you the correct values.

    • @TechArtAlex
      @TechArtAlex  1 year ago +1

      It would be better to just add a scale factor if needed to the texture coordinates method I show, because the world coordinates are V3 but the derivatives need a V2. You would need to correct this first by setting up another triplanar coordinates system, and would still get a seam at the points where you transition between projection planes as a result.

    • @gehtsiegarnixan
      @gehtsiegarnixan 1 year ago +1

      @@TechArtAlex The artists on my team have trouble reading the tooltips, so manual scaling is not an option for me XD. Yes, you get edges if you cubemap the world coordinates and then use the derivatives. If you use DDX and DDY on the 3D coords and cubemap the result, you get no seams and the mips seem to be correct. I have an example on Shadertoy under my name.

    • @TechArtAlex
      @TechArtAlex  1 year ago

      @@gehtsiegarnixan Lol. Ah, I understand now. That's clever, good solution.

    • @MarcoMariaRossiArte
      @MarcoMariaRossiArte 1 year ago

      I'm really struggling to port this to unreal, got any tips?

    • @gehtsiegarnixan
      @gehtsiegarnixan 1 year ago

      @@MarcoMariaRossiArte I don't understand. It works exactly the same as in my Shadertoy example called "Uniplanar"; you don't even have to use custom nodes or HLSL code. Every single line is commented, I have two versions, and you can use Ninja's as a reference as well. Unfortunately I can't link screenshots or anything on YouTube.
      I think you have to be more specific about your problem.