Unreal Engine 5.2 Metahuman Path Tracing

  • Published 29 Aug 2024
  • Big thanks to vimeo.com/user... and Lee Adams for collaborating with me on this
    Join this channel to get access to perks:
    / @jsfilmz
    Grab my new Unreal Engine 5.1 Course here! Be sure to share it with everyone!
    www.artstation...
    jsfilmz.gumroa...
    Sign up with Artlist and get two extra months free when using my link below.
    Artlist
    artlist.io/art...
    Artgrid
    artgrid.io/Art...
    @UnrealEngine @FACEGOOD #unrealengine5 #metahuman #facegood #pathtracing

COMMENTS • 96

  • @Jsfilmz · 1 year ago +15

    I couldn't fix his eyelashes; they're messed up due to Mesh to MetaHuman, but overall I'm pretty happy with this test. 1300 frames in 37 hours.

    • @starwarz8479 · 1 year ago +1

      How long does it take per frame?

    • @Jsfilmz · 1 year ago +1

      @@starwarz8479 1300 frames

    • @starwarz8479 · 1 year ago +4

      @@Jsfilmz I see, 1.7 mins per frame then

    • @joannot6706 · 1 year ago +1

      Pretty good

    • @mikaelpettersson5916 · 1 year ago +1

      Does it depend on the physical features of the actor? (Notice he's got glasses.) Some have heavier eyelids than others; does the metadata compensate…
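The render-time arithmetic in this thread is easy to verify; a minimal sketch, using only the 1300-frame / 37-hour figures the video author reports above:

```python
# Per-frame render time for the path-traced sequence discussed above.
total_hours = 37     # total render time reported by the author
frame_count = 1300   # frames in the rendered sequence

minutes_per_frame = total_hours * 60 / frame_count
print(f"{minutes_per_frame:.1f} minutes per frame")  # ~1.7, matching the reply
```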

  • @Romeo615Videos · 1 year ago +16

    this is much better than what we had, and no crazy camera and head-mount price... this will be my main reason to upgrade to 5.2

    • @Jsfilmz · 1 year ago +3

      yea bro, some of those helmets are like $30k plus software :( Facegood is a $500 helmet with free software.

    • @jelowang4941 · 1 year ago

      MOCAP-level and DESKTOP-level mean two different ways to capture, for different scenarios.

  • @smepable · 1 year ago +4

    The accuracy is amazing

  • @NBWDOUGHBOY · 1 year ago +2

    Yeah The accuracy is pretty crazy! This is great for pushing graphics forward.

  • @florianschmoldt8659 · 1 year ago +5

    Great job! And props to Lee Adams. Pretty challenging expressions to test the limits :D

  • @EdVizenor · 1 year ago +2

    So Excited. Thanks, Unreal Man.

  • @charlytutors · 1 year ago +3

    Wow! Awesome!

  • @digital0785 · 1 year ago +2

    if they add wrinkles and SSS as default on the teeth this will be scary good

  • @vartannazarian3451 · 1 year ago +2

    He should see a shrink 😂 (joke). Superb technological advancement, impressive. Thank you for sharing.

  • @ak_fx · 1 year ago +7

    Can we get a Lumen vs path tracing comparison for MetaHumans?

  • @ManuelOrbeaOtaola · 1 year ago +3

    Omg man!! This is amazing!!

  • @TR-707 · 1 year ago +2

    wow insane

  • @rabih1978 · 1 year ago +1

    This is amazing

  • @stevesween3744 · 1 year ago +1

    OK, so many questions: 1. What hardware did you use, a single RTX 4090? 2. Facegood Seattle or Facegood free? 3. What does path tracing do for MetaHuman facial animation? 4. Optional: how about a tutorial on fixing teeth? This looks SO GOOD JS!

    • @Jsfilmz · 1 year ago

      Facegood D2, the $500 60fps version, on one 4090. You can do this with Facegood free. Path tracing looks more realistic on skin; teeth are still questionable, so not yet.

    • @stevesween3744 · 1 year ago

      @@Jsfilmz Thank you. "you can do this with face good free"...Did YOU do this on face good free? Or are you doing testing with Jello/Adams using a more-advanced-than-released version of Seattle? Because I haven't seen anything with this fidelity from Facegood unless it's the free version with tons of those tedious keyframes. But maybe you put in the time.

    • @Jsfilmz · 1 year ago +1

      @@stevesween3744 this was a manual solve, so nothing free works the same. Yup, tedious keyframes, but Facegood still beats hand-animating from scratch. Nothing else free out there beats this, ain't no way, for now at least.

  • @FrostbiteCinematics · 1 year ago +4

    Nice!! :)
    I'm assuming this was done with the Facegood D2 as I saw a few comments below about it.
    If so, is it done using the *free* license where you need to solve the data through Maya? Or is it maybe done with the Metahuman retargeting model in Maya (or done via live link in UE)?
    Also, how much time was spent on cleaning up the data, if any? Or is this just the raw data from the initial solve? I have the D2 and the free version of Avatary myself; I just haven't used it yet, so I was wondering what methods were used to get the anim data seen in the final render. The results are really good, so I'm remaining hopeful that I can get some good use out of the helmet, even before Metahuman Animator releases.

    • @RichardRiegel · 1 year ago +2

      As I tried it, it always needs to be solved by a 3rd-party app such as Maya (and that's a free feature of Facegood). However, the iPhone via Live Link Face has better live results, but it's mostly usable for non-motion-body animation... such as torso only. The iPhone is too heavy to hang on a helmet, so it's better to use the Facegood helmet, for example, because it's really lightweight.

    • @Jsfilmz · 1 year ago +2

      yes D2

    • @jelowang4941 · 1 year ago +1

      This is raw data solved by the Avatary free version with a MoCap-level performance pipeline (HMC & manual retargeting).

  • @PeterLeban · 1 year ago +2

    Crazy! Did they fix demon eyes or do we still need to hack the material?

  • @3Dgiants · 1 year ago +4

    Hello JSFILMZ. Does this mean that in MetaHuman 5.2 path tracing works without having to turn off the occlusion setting in the eye section material? Is it set up by default? Thanks

    • @Jsfilmz · 1 year ago

      no, but Substrate might fix that tbh

  • @emrearslan9572 · 1 year ago +5

    It looks so good. Right now I am more into Blender and UE5, but this stuff looks cool. I just don't know where to begin lol; it's like a whole new area to learn, and I don't know what kind of equipment I need. I have a good workstation with an RTX 4090, but the rest?

  • @thufailbasalamah3342 · 1 year ago +1

    Unreal Engine, it's just unreal

  • @DruuzilTechGames · 1 year ago +2

    Impressive.

    • @Jsfilmz · 1 year ago +2

      yea, for free software not bad at all

  • @DeadLoya · 1 year ago +2

    Is this using an iPhone Face ID selfie cam like the recent tech demo at GDC, or is this an infrared camera?

    • @Jsfilmz · 1 year ago +1

      nope just camera

  • @DLVRYDRYVR · 1 year ago +5

    Something looks different about you 🤔

  • @smepable · 1 year ago +1

    Is this with the iPhone? The teeth look much better now; is that because of path tracing?

    • @Jsfilmz · 1 year ago

      nah, these are not the default teeth, this is with my tweaks

    • @smepable · 1 year ago +1

      Looks really good

    • @Jsfilmz · 1 year ago

      @@smepable thanks man, I'm tryin hahaha

  • @christiandebney1989 · 1 year ago +1

    Can this animation be exported to another package like 3ds Max, to render with V-Ray?

    • @Jsfilmz · 1 year ago

      can't render MetaHumans outside of Unreal, per the ToS

  • @damienlemongolien5303 · 1 year ago

    Wow, amazing result! What was your workflow and gear to capture the facial performance? You got a new subscriber.

  • @sikliztailbunch · 1 year ago +2

    So if this took 37 hours to render, path tracing is not really a thing for games, I suppose. But then it's useless, because Octane Render is already a great free path tracer for Unreal Engine and much faster. What's the point then?

    • @AnimeBadBoi · 1 year ago +1

      With Octane, in most cases I've experienced multiple crashes. The path tracer is a built-in renderer, designed with materials, volumetrics, and shaders in mind, and especially now that splines were added in the newest update, it's an easier way to just jump into a render than working with Octane, where you had to add its own components to set up the scene.
      Path tracing was never meant nor built for games lol; it's designed for cinematic/offline rendering.
      That's why there are Lit (game) and path tracing (offline/high-quality render) modes.

    • @AnimeBadBoi · 1 year ago

      I'll give Octane a test when I get home, but from before, it was a whole process to set up and wasn't compatible with certain things in Sequencer.

    • @AndersHaalandverby · 1 year ago

      In Toy Story 1, some of the individual frames took 30 HOURS to render. 1 frame, 30 hours. And I just saw it the other day, and honestly it looks like crap now, compared to even the least impressive games these days.

    • @AndersHaalandverby · 1 year ago +1

      My point is, Toy Story looked completely insanely good when it came out; real-time rendering will always be behind prerendering in quality. But give it a couple of years... (then prerendered stuff will be even better, obviously)
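The "not for games" point in this thread can be put in numbers. A sketch comparing the per-frame path-tracing time reported in this video against a 60 fps real-time budget (the 37-hour / 1300-frame figures come from the author's comment; the comparison itself is mine):

```python
# Offline per-frame render time vs. a real-time frame budget.
seconds_per_frame = 37 * 3600 / 1300  # ~102 s per path-traced frame
realtime_budget = 1 / 60              # 60 fps leaves ~16.7 ms per frame

ratio = seconds_per_frame / realtime_budget
print(f"one offline frame is ~{ratio:,.0f}x over a 60 fps budget")
```

Roughly a factor of six thousand, which is why the engine keeps Lit mode for real-time and the Path Tracer for offline renders.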

  • @sahinerdem5496 · 1 year ago

    I couldn't find an answer to an issue I've had for a long time, which made me give up on MetaHumans: body parts far from the camera don't render correctly, they all spread out and look ugly. Does anyone know the solution? It seems simple, but not for me. I tried all types of LODs, etc.

  • @elvismorellidigitalvisuala6211

    Wow... The free version of the software lets users achieve this result? 'Cause I'm scared about an annual software fee!

    • @ulysse6916 · 1 year ago

      The capture helmet costs 10 grand though, if I understood correctly.
      Edit: The old version costs 10 grand on their website, the new one ... 469 dollars... I don't get it

    • @Jsfilmz · 1 year ago +1

      yea, you can do this with the free version

    • @elvismorellidigitalvisuala6211 · 1 year ago

      @@Jsfilmz amazing... I'll buy for sure next week or ASAP. Do you think it works even with the latest MetaHuman Animator, or do we need an iPhone?

    • @Jsfilmz · 1 year ago +1

      @@elvismorellidigitalvisuala6211 you can use any camera with it; the Facegood ones are infrared, which I like

  • @yiiarts6641 · 1 year ago +2

    Now please do Lumen ray tracing 🤙

  • @aliamirdivan8333 · 1 year ago

    I have two questions: is this a free upgrade? Also, is this a version of MetaHuman which is imported into UE on your PC?

  • @djcodsta · 1 year ago +1

    That's crazy, I thought it comes out in June

    • @Jsfilmz · 1 year ago

      what comes out in June?

    • @vincev4630 · 1 year ago +1

      @@Jsfilmz the new MetaHuman face tracking thingy they used for Hellblade 2

    • @Jsfilmz · 1 year ago +1

      no man, this is Facegood free software lol

  • @Broski-TheLlama · 1 year ago

    What are your temporal samples?

  • @Walm89 · 1 year ago

    Will we be able to use this feature on models made from scratch?

  • @apixel.content · 1 year ago +2

    how many 4090s did you use for this render?

    • @Jsfilmz · 1 year ago +1

      one

    • @apixel.content · 1 year ago +2

      @@Jsfilmz that electric bill just keeps on increasing every month :D

    • @Jsfilmz · 1 year ago +1

      @@apixel.content lol yea

    • @steedlei1 · 1 year ago +2

      I hope one day UE can use multiple GPUs in one PC... even use H100 AI GPUs for rendering... so far one 4090 still has a lot of limitations.

    • @apixel.content · 1 year ago

      @@steedlei1 I believe path tracing supports more than one GPU
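For a rough sense of the electric-bill joke in this thread: a ballpark sketch, assuming the single RTX 4090 sustains something near its 450 W rated board power for the whole 37-hour render (an assumption for illustration, not a measured figure):

```python
# Ballpark energy use for the 37-hour render on a single GPU.
gpu_watts = 450     # RTX 4090 rated board power (assumed sustained draw)
render_hours = 37   # total render time from the author's comment

kwh = gpu_watts * render_hours / 1000
print(f"~{kwh:.1f} kWh for the render (GPU alone)")
```

Actual draw varies with the workload, and the rest of the system adds on top, so treat this as an order-of-magnitude estimate.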

  • @jerrtinlsc · 1 year ago +1

    Is this with the D2 helmet?

  • @reubencf · 1 year ago

    Did you use Rokoko for the facial capture?

  • @NaughtyKlaus · 1 year ago

    now we just need a model of Markiplier, then we can all be Markiplier

    • @Jsfilmz · 1 year ago

      no idea who that is but sure 😂

  • @MarvelMaster · 1 year ago +1

    37 hours...

  • @user-uv8xl1qq8g · 1 year ago +1

    so uncanny valleyish...

  • @bobtruck1594 · 1 year ago +1

    Dayummmm. Lol