Low Poly Model Alignment for PBR Texture Baking in Blender

  • Published Oct 4, 2024

COMMENTS • 46

  • @outdoorstours
    @outdoorstours 3 months ago +1

    I also work with Blender. Your video is very helpful. I hope you can post similar videos like this one often. Thanks for sharing, dear. All the best to you. Have a wonderful day and see you next time. Many greetings from Germany ♥

  • @manmadeartists
    @manmadeartists 11 months ago +1

    6:00 that was an astonishing example to describe the difficulty of getting the scale of an object😍

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago +1

      thanks :) [21:00] ends the video :D but I guess you were talking about these rocks at [5:51], weren't you?

    • @manmadeartists
      @manmadeartists 11 months ago +1

      Oh nooo, that would have been a roast hidden behind a compliment, like "pls end that video"🙈😂 But yes! You're right😅 I will change that

  • @cobeer1768
    @cobeer1768 11 months ago +1

    Think I'm first. Sweet.
    Love your vids. Great content.

  • @lawrence9713
    @lawrence9713 11 months ago +1

    Really well explained video. Thx
    I haven't found a 3D scan program that works for me yet.
    I saw Substance Alchemist now has a 3D scan feature as well.
    I might check it out some day.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago

      Thanks. Surely give Sampler a try, just bear in mind that it has limited reconstruction capacity as it utilises video memory for reconstruction. So while it's super fast, it isn't designed to process heavy reconstructions... unless something has changed recently.

  • @Barnyz
    @Barnyz 11 months ago +2

    Excellent video. Really great quality editing, diagrams, and images to help explain everything 👍👍👍

  • @CreativeShrimp
    @CreativeShrimp 11 months ago +1

    Brilliant! As others have been saying, it's great to see your photogrammetry workflow within Blender! And the extra effort in editing and explaining is above and beyond! Awesome on every bake level! :D

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago +1

      Thanks dude, veeeeery appreciated :). Yeah, Blender has incredible potential and if nothing changes, it is just a matter of time before it becomes the main 3D tool on the market.

  • @dvogiatzis
    @dvogiatzis 11 months ago +1

    Great material. Thanks for sharing!

  • @H0w3r
    @H0w3r 11 months ago +2

    Not first, but happy to see you back to posting videos! I have a workflow question. I used to use Unity's AI-based ArtEngine for a lot of my photogrammetry texture processing, especially for tiling a texture with a specific pattern (like a brick wall or stone tiles). Nowadays I struggle to get results that are as good using manual techniques like painting seams and tiling my materials in Painter, for example. It's never as good and it takes so much longer. What do you use today to remove seams and make materials tileable? Would love to hear more about this. Looking forward to more videos from you

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago +3

      Thanks! Yeah, it has been almost a year since my last full video release on my channel and I hope to change that. To answer your question, I still use Unity ArtEngine on a daily basis, as Unity gifted a license to all their users until 2030. Unfortunately there is still no tool available which can replace ArtEngine. Manual seam removal is still a very good but painful option - it can be done in PS/Affinity/Painter. It's the option which guarantees high quality though. Other automated options I have tested so far suck. I was planning to test the new Sampler's seam removal algorithm, but didn't have much time yet. Who knows, I might drop a dedicated video about seam removal at some point - especially as it's been kind of in progress for a while - but we'll see how it goes. Cheers!
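The manual seam-removal loop described above typically starts by offsetting the texture by half its size, which moves the tiling seams into the middle of the image where they can be painted out (the Offset filter in Photoshop does the same thing). A minimal NumPy sketch of that one step; the function name is illustrative:

```python
import numpy as np

def cross_offset(tex: np.ndarray) -> np.ndarray:
    """Shift a texture by half its height and width so the tiling
    seams land in the middle of the image, ready to be painted out."""
    h, w = tex.shape[:2]
    return np.roll(tex, (h // 2, w // 2), axis=(0, 1))
```

Because the texture wraps, applying the offset twice returns the original image, so the shift can be undone after the seams are cleaned up.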

    • @H0w3r
      @H0w3r 11 months ago

      @@GrzegorzBaranArt Unity granted licenses until 2030?? I've missed that news. Is it OK if I DM you for some more info about that? Perhaps on ArtStation?

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago +1

      @@H0w3r As far as I know they sent an email to every existing user with a seat code to redeem on their Unity account. If you were a commercial user you should have got one. I would suggest contacting their customer support about this.

    • @milk697
      @milk697 11 months ago +2

      Substance 3D Sampler's AI image-to-material features are great for this sort of thing, and it also has height-based tiling which can get you pretty good results for removing seams

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago

      @@milk697 Thanks for the info. I'm gonna take a closer look at it when I have some time. Last time I checked (before the recent big update) the seam removal feature in Substance Sampler was still based on simple masking, causing a significant quality drop (poor opacity transition) in side-cross-offset sections. The content awareness node was very helpful for fixing some of these issues, but the quality drop was too significant to let Sampler do the job. Who knows, I was hoping that with the recent update - the one which improved 'Single Image to Material' reconstruction - the seam removal tool also got some special treatment from the developers, but unfortunately I didn't have an opportunity to give it a try yet.

  • @abdelkarimkeraressi1418
    @abdelkarimkeraressi1418 10 months ago +1

    Blender can flip UVs in any direction you like. Go into the UV editor, hit U and you have Flip there, horizontal or vertical.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  10 months ago

      I was referring to ZBrush, which used to flip/mirror UVs by itself for no reason when it actually shouldn't.

  • @HerrWaffell
    @HerrWaffell 11 months ago +2

    Great video. Seeing you give Blender a chance in your photogrammetry workflow is excellent.
    I have a very similar workflow for processing surface scans, except I made a geometry node setup which aligns an unwrapped plane based on the average normal of the subject, projects it on the subject and scales it based on the rulers in the scan. This speeds up the whole process. Generally, it takes longer to export the scan from Metashape than to prepare the projection plane.
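The alignment step described above (take the scan's average normal, then rotate the unwrapped plane to face it) can be sketched outside of geometry nodes with plain NumPy. This is only an illustration of the underlying math, not the commenter's actual node setup; the function names are hypothetical:

```python
import numpy as np

def average_normal(verts, faces):
    """Area-weighted average normal of a triangle mesh
    (cross products weight each face by twice its area)."""
    v = np.asarray(verts, dtype=float)
    f = np.asarray(faces, dtype=int)
    n = np.cross(v[f[:, 1]] - v[f[:, 0]], v[f[:, 2]] - v[f[:, 0]])
    total = n.sum(axis=0)
    return total / np.linalg.norm(total)

def rotation_to(normal):
    """Rotation matrix taking the +Z axis onto the given unit
    normal (Rodrigues' rotation formula)."""
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(z, normal)
    c = float(np.dot(z, normal))
    if np.allclose(v, 0.0):
        # normal is parallel to +-Z: identity or a 180-degree flip
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    vx = np.array([[0, -v[2], v[1]],
                   [v[2], 0, -v[0]],
                   [-v[1], v[0], 0]])
    return np.eye(3) + vx + vx @ vx * ((1 - c) / np.dot(v, v))
```

Applying `rotation_to(average_normal(...))` to the plane orients it like the node setup does, after which it can be projected onto the scan and scaled against the rulers.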

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago

      Thanks, yeah, Blender totally replaced ZBrush at this stage of my workflow. I plan to give it a chance in manual texture creation at some point and get rid of ZBrush completely if it can do the job. So far ZBrush is still the king when it comes to high-poly subtool-based sculpting in super heavy scenes :D. I am kind of in love with Blender now and it's clearly the future. I consider any time spent on 3ds Max, Maya etc. a waste in this regard. Shamefully it needs an RTX card for Cycles, which I still don't have - I am still on a GF1080 :D

    • @ChillieGaming
      @ChillieGaming 7 months ago +2

      @@GrzegorzBaranArt You can get the 4070 Ti if it's in your budget.
      It's faster than the 3090 Ti.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  7 months ago

      @@ChillieGaming Thanks for the advice. I actually bought a 4070 Ti last month and indeed it makes a significant difference when it comes to RTX-based features. Unfortunately I had to upgrade my power supply unit too. Cheers!

    • @ChillieGaming
      @ChillieGaming 7 months ago +1

      @@GrzegorzBaranArt Hey, I have another question. How big of a difference do you get with your upgrade in, let's say, processing time?

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  7 months ago +1

      @@ChillieGaming Anything that utilises RTX tech on the GPU speeds up significantly. So a render in Cycles which would take me 20 minutes takes just 2. But the GF1080Ti is a very powerful and quick card. I feel like it was quicker when it comes to overall data transfer (large texture sets) and the 4070Ti lags in comparison. I would risk saying the GF1080Ti was slightly quicker and performed better in things which don't utilise RTX technology, but everything that does leaves the GF1080Ti behind. And recently almost everything does.
      So with all the pros and cons of the GF1080Ti, I consider the upgrade to the 4070Ti a big and significant one which makes a difference.
      When it comes to photogrammetry I didn't compare processing times, but I don't feel like there is a big difference. Maybe because I usually work with large, over-100-million-poly meshes and hundreds of images, and either of these cards has enough VRAM to handle that, so a lot still depends on the motherboard, RAM and CPU here.
      From a Blender Cycles point of view: after the upgrade I can finally tweak lights live instead of waiting minutes after any parameter change, which is awesome

  • @peterallely5417
    @peterallely5417 11 months ago +1

    I’d love to know how people are creating roughness maps from their photogrammetry scans, since by default the roughness is uniform (obviously not ideal)

    • @sleuthman864
      @sleuthman864 11 months ago

      Depending on what it is, usually just inverting the albedo and playing with the levels. You can then colour select different areas to change bits, add overlays etc..

    • @matterfield
      @matterfield 11 months ago

      Don't invert the albedo. Instead, convert the RGB to HSV in Substance Designer. Use the inverted greyscale saturation map as the basis for roughness as it's vastly superior.
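As a rough illustration of the inverted-saturation approach suggested above, here is a NumPy sketch rather than a Substance Designer graph; the function name is hypothetical and assumes a float albedo in the 0..1 range:

```python
import numpy as np

def roughness_from_albedo(albedo: np.ndarray) -> np.ndarray:
    """First-pass roughness map as inverted HSV saturation.
    `albedo` is an H x W x 3 float array with values in 0..1."""
    cmax = albedo.max(axis=-1)
    cmin = albedo.min(axis=-1)
    # HSV saturation: (max - min) / max, defined as 0 where max == 0
    sat = np.where(cmax > 0, (cmax - cmin) / np.where(cmax > 0, cmax, 1.0), 0.0)
    return 1.0 - sat
```

Desaturated (grey) areas come out rough and strongly coloured areas come out smooth, giving a non-uniform starting point that can then be levelled and masked per-material as both commenters describe.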

  • @logred--5257
    @logred--5257 11 months ago +1

    Hi Grzegorz! Great video. I have two quick questions. In this video, when you take photos on location, it looks like your camera isn't parallel to the ground but is at about a 45-degree angle. Is that the case?
    Regarding the equipment itself, I have a full-frame sensor camera and a Canon 50mm lens that I use in many situations. If I use it for photogrammetry, does this simply mean I would need to take more photos to cover the area? Or would you recommend photographing with, say, a 35mm lens (which I don't have)?

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago +1

      I also use a full frame camera for photogrammetry. The focal length depends on many factors - like distance to the subject, use of flash and its power (falloff), quality of the lens (vignetting and distortion), the subject's structure and depth (grass? concrete? brick?), sensor/image resolution, time to capture, expected reconstruction quality, coverage, stabilisation, level of light, and many more. I covered some of these in my previous videos, but I plan to record a separate video about captures at some point, if this form of video kicks in and finds interest.
      I use a kit zoom lens for this reason and I change the focal length depending on these factors. Yes, if you use a 50mm lens you will need to capture more photos so you have enough image overlap.
      Regarding the 45-degree angle... it's not really 45 degrees :) but it is angled indeed. It's angled because I had to avoid having tripod legs in the frame somehow :D so I consider it a compromise. The perfect camera position wouldn't be angled at all :D
      I plan to cover all these details and nuances in my upcoming book :). Hope that makes sense

    • @logred--5257
      @logred--5257 11 months ago +2

      @@GrzegorzBaranArt Very comprehensive response, thank you! (:

  • @Gringottone
    @Gringottone 11 months ago +1

    Your videos are pure gold. What do you think about Nvidia NeRF?
    Your explanation is so clear, I love it! Why did you settle on 2 m²?

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago

      I haven't tested Nvidia NeRF. 2x2m coverage is the standard coverage for computer games - due to texture size restrictions. It is wide enough to avoid obvious repetition but small enough to preserve all visual detail. In 3D art all these 2x2m materials blend with each other using masks, vertex colors... you name it. This coverage works best when you create environments seen from a human perspective. While it is considered the standard coverage, it's not fixed and depends on the purpose. For example fabrics (often used for characters or visualisations) often cover just 50x50cm. If you produce CG environments seen from a larger distance, a larger coverage makes sense - as far as I remember World of Tanks uses 4x4 meters. If you make a product seen from a long distance - a flight simulator - you might even use 100x100 meters etc. Btw, it's one of the subjects I was planning to cover in detail in one of my upcoming videos about 'Single Image based' reconstruction for materials :). So there is no fixed texture coverage, but 2x2 is the most common. In the end it is important to know the coverage so you can set a proper tiling setting and keep coverage consistency in a CG scene. Cheers!
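The consistency point at the end of that reply boils down to simple arithmetic: knowing a material's real-world coverage gives you the tiling factor for any surface. A hypothetical helper:

```python
def tiling_factor(surface_size_m: float, texture_coverage_m: float) -> float:
    """How many times a material of a given real-world coverage must
    repeat to span a surface while keeping world scale consistent."""
    return surface_size_m / texture_coverage_m

# A 10 m wall with a standard 2x2 m material tiles 5 times;
# a 2 m prop with a 50x50 cm fabric tiles 4 times.
```

Swapping in World of Tanks' 4x4 m coverage or a flight simulator's 100x100 m works the same way, which is why the coverage has to be known before the tiling setting can be chosen.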

  • @anficyon
    @anficyon 11 months ago +1

    When will your new book be available?

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago +1

      Hey, it's still in progress. I don't rush it since I really want it to be one of the most comprehensive books on the subject, and I don't want to release it if it's not 100% ready. If nothing changes, I plan to release it next year.

  • @1982Jonte
    @1982Jonte 10 months ago +1

    Hi, I'm trying to follow the video and I got a little confused: when you almost have the correct scale on the added plane in Blender (2x2 meters, almost 180x180 cm) and you then scale the plane to match the imported HP, don't you lose the scale if you do that? Or is this because you need to match the low poly to the insanely high poly that you can't import? Hmm, I might have answered my own question...

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  10 months ago +1

      Not sure what the question is, but yeah, I basically use a medium poly mesh (decimated in the photogrammetry app) at a size I can load without crashing and navigate around in a 3D app like Blender. Because this medium poly is basically a decimated version of the super high one, it carries all the features needed for alignment - like position in 3D space, size, coverage... just without the super tiny details which aren't needed for positioning. So yeah, as you said, I do that to match a low poly model to the high poly I can't import as it's too heavy :D

    • @1982Jonte
      @1982Jonte 10 months ago +1

      @GrzegorzBaranArt Perfect, thanks for the answer! I first thought, why does he not match the HP at the world origin and scale it to match 180 cm? But if you need the origin from the super dense mesh, I understand why you move the low poly instead. :)

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  10 months ago +1

      @@1982Jonte Yeah, the loaded med-poly model is a proxy version of the super heavy high poly one which cannot even be loaded :). I use this med-poly model only at this stage and only for that purpose. It takes just a minute to decimate a heavy 150mln-poly model into a light 6mln one in Metashape, for example, and with that used as a proxy, alignment is smooth and easy. No probs, anytime :)

  • @matterfield
    @matterfield 11 months ago

    You should *not* subdivide the plane (unless you have a non-organic surface with structure/lines). No matter your subdivisions, when you conform/project, edges will appear in your 32-bit floating point depth map and normal map. Instead, high-pass the depth and the blue channel of the normal with a very large radius to eliminate those low-frequency "hills." Then run some basic math to reset the height to its natural range. All of this can be done in Substance Designer.
    You also should be using a set of ground control points via fiducial markers (AprilTags) when scanning so you don't need a ruler. Great tutorial otherwise!
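The high-pass step described above can be sketched in NumPy. A separable box blur stands in here for whatever large-radius filter Substance Designer uses, and the final rescale is the "basic math" resetting the height to its natural range; the function name is illustrative:

```python
import numpy as np

def high_pass(height: np.ndarray, radius: int) -> np.ndarray:
    """Remove low-frequency 'hills' from a height/depth map by
    subtracting a large-radius blur, then rescale to 0..1."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    # edge-pad, then blur rows and columns separately (separable box blur)
    pad = np.pad(height, radius, mode="edge")
    blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, "valid"), 1, pad)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, kernel, "valid"), 0, blurred)
    # keep only the high-frequency residual, renormalised to 0..1
    residual = height - blurred
    lo, hi = residual.min(), residual.max()
    return (residual - lo) / (hi - lo) if hi > lo else np.zeros_like(residual)
```

The same call works on the normal map's blue channel; only the low-frequency bulge from the projection is removed, while per-pixel surface detail survives.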

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  11 months ago

      Hey, thanks, are you really sure about that? I made some tests - I even shared one in this video - and smoothing smoothes the projection angle by averaging vertices and eliminates 'faceting' on the normal map (I use 16 bits for normals). I can't say about 32-bit depth maps as I didn't try them. This is something I have done on a daily basis for years and I've never had faceting on any of my models. I use different subdivision levels from low to crazy dense depending on the subject.
      Regarding rulers, they also help me maintain even camera coverage and move in even rows during scanning. The results I get without them are much worse. I never used fiducial markers as I never needed them; I guess I should try them at some point. Thanks for the comment and nice words, very appreciated

    • @matterfield
      @matterfield 11 months ago

      @@GrzegorzBaranArt 100%. The faceting is less but still evident in the video. Bring the channel into Substance Designer and increase the levels to see it. I suppose it has to do with the baker you're using, but we've used sbsbaker.exe as well as Reality Capture's and the faceting exists. Keeping the surface coplanar and parallel with the broader surface is ideal; then handling the variance on a per-pixel basis produces perfect results.