Unreal Engine 5.2 Metahuman Path Tracing
- Published 29 Aug 2024
Big thanks to vimeo.com/user... and Lee Adams for collaborating with me on this
Join this channel to get access to perks:
/ @jsfilmz
Grab my new Unreal Engine 5.1 Course here! Be sure to share it with everyone!
www.artstation...
jsfilmz.gumroa...
Sign up with Artlist and get two extra months free when using my link below.
Artlist
artlist.io/art...
Artgrid
artgrid.io/Art...
@UnrealEngine @FACEGOOD #unrealengine5 #metahuman #facegood #pathtracing
I couldn't fix his eyelashes; they're messed up due to Mesh to MetaHuman, but overall I'm pretty happy with this test. 1300 frames in 37 hours.
How long does it take per frame?
@starwarz8479 1300 frames
@Jsfilmz I see, 1.7 mins per frame then
Pretty good
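For anyone double-checking the per-frame figure quoted in the replies above, a minimal sketch of the arithmetic, using only the numbers stated in the video description (1300 frames rendered in 37 hours):

```python
# Render-time-per-frame check: 1300 frames in 37 hours total.
total_hours = 37
frames = 1300

minutes_per_frame = total_hours * 60 / frames
print(f"{minutes_per_frame:.2f} minutes per frame")  # ≈ 1.71
```

This agrees with the "1.7 mins per frame" estimate in the thread.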
Does it depend on the physical features of the actor? (Notice he's got glasses.) Some have heavier eyelids than others; does the metadata compensate…
This is much better than what we had, and no crazy camera and head-mount price... this will be my main reason to upgrade to 5.2
Yeah bro, some of those helmets are like $30k plus software :( The Facegood one is a $500 helmet with free software.
MOCAP-level and DESKTOP-level mean two different ways of capturing for different scenarios.
The accuracy is amazing
Yeah The accuracy is pretty crazy! This is great for pushing graphics forward.
Great job! And props to Lee Adams. Pretty challenging expressions to test the limits :D
So Excited. Thanks, Unreal Man.
Wow! Awesome!
If they add wrinkles and subsurface scattering (SSS) on the teeth by default, this will be scary good
He should see a shrink 😂(joke). Superb technological advancement impressive thank you for sharing.
Can we get a lumen vs path tracing for metahumans?
Omg man!! This is amazing!!
Thanks a ton!
wow insane
This is amazing
Ok, so many questions: 1. What hardware did you use, a single RTX 4090?; 2. Facegood Seattle or Facegood free?; 3. What does path tracing do for Metahuman facial animation?; 4. Optional: How about a tutorial on fixing teeth? This looks SO GOOD JS!
Facegood D2, the $500 60fps version, on one 4090. You can do this with Facegood free. Path tracing looks more realistic on skin; the teeth are still questionable, so not yet.
@Jsfilmz Thank you. "you can do this with Facegood free"... Did YOU do this on Facegood free? Or are you doing testing with Jello/Adams using a more-advanced-than-released version of Seattle? Because I haven't seen anything with this fidelity from Facegood unless it's the free version with tons of those tedious keyframes. But maybe you put in the time.
@stevesween3744 this was a manual solve, so nothing free works the same. Yup, tedious keyframes in Facegood still beat hand-animating from scratch. Nothing else free out there beats this, ain't no way, for now at least.
Nice!! :)
I'm assuming this was done with the Facegood D2 as I saw a few comments below about it.
If so, is it done using the *free* license where you need to solve the data through Maya? Or is it maybe done with the Metahuman retargeting model in Maya (or done via live link in UE)?
Also, how much time was spent on cleaning up the data, if any? Or is this just the raw data from the initial solve? I have the D2 and the free version of Avatary myself; I just haven't used it yet, so I was wondering what methods were used to get the anim data seen in the final render. The results are really good, so I'm remaining hopeful that I can get some good use out of the helmet, even before MetaHuman Animator releases.
In my experience, it always needs to be solved by a 3rd-party app such as Maya (and that's a free feature of Facegood). However, the iPhone via Live Link Face has better live results, but it's mostly usable for non-motion body animation, such as torso only. The iPhone is too heavy to hang on a helmet, so it's better to use something like a Facegood helmet, because it's really lightweight.
yes D2
This is raw data solved by the Avatary free version with a Mocap-level performance pipeline (HMC & manual retargeting).
Crazy! Did they fix demon eyes or do we still need to hack the material?
still need to hack
@Jsfilmz Thanks!
Hello JSFILMZ. Does this mean that in MetaHuman 5.2, path tracing works without having to turn off the occlusion setting in the eye section of the material? Is it set up by default? Thanks
no but substrate might fix that tbh
It looks so good. Right now I'm more into Blender and UE5, but this stuff looks cool. I just don't know where to begin lol; it's like a whole new section to learn, and I don't know what kind of equipment I need. I have a good workstation with an RTX 4090, but the rest?
Unreal Engine, it's just unreal
Impressive.
Yeah, for free software, not bad at all
Is this using an iPhone Face ID selfiecam like the recent tech demo at GDC or is this an infrared camera?
nope just camera
Something looks different about you 🤔
Is this with the iPhone? The teeth look much better now; is that because of path tracing?
Nah, these aren't the default teeth; this is with my tweaks
Looks really good
@smepable thanks man, I'm tryin hahaha
Can this animation be exported to another package like 3ds Max, to render with V-Ray?
Can't render MetaHumans outside of Unreal; it's against the ToS
Wow amazing result! What was your workflow and gear to capture the facial performance? You got a new subscriber
So if this took 37 hours to render, path tracing is not really a thing for games, I suppose. But then it's useless, because Octane is already a great free path tracer for Unreal Engine and much faster. What's the point then?
With Octane, in most cases there are multiple crashes that I've experienced. The path tracer is a built-in renderer with optimization already in mind when it comes to materials, volumetrics, and shaders, and especially since splines were added in the newest update, it's easier to just jump to a render than to work with Octane, especially since you had to add in its own components just to design the scene.
Path tracing was never meant nor built for games lol; it's designed for cinematic/offline rendering.
That's why there are Lit (game) and path tracing (offline/high-quality render) modes.
I'll give Octane a test when I get home, but from before, it was a whole process to set up and wasn't compatible with certain things in Sequencer.
In Toy Story 1, some of the individual frames took 30 HOURS to render. 1 frame, 30 hours. And I just saw it the other day, and honestly it looks like crap now, compared to even the least impressive games these days.
My point is, Toy Story looked completely, insanely good when it came out; real-time rendering will always be behind prerendering in quality. But give it a couple of years... (then prerendered stuff will be even better, obviously)
I couldn't find an answer to an issue I've had for a long time, which made me give up on MetaHumans: body parts far from the camera don't render correctly; they all spread out and look ugly. Does anyone know the solution? It seems simple, but not for me. I tried all types of LODs, etc.
Wow... The free version of the software lets users achieve this result? Because I'm scared of an annual software fee!
The capture helmet costs 10 grand though, if I understood correctly.
Edit: The old version cost 10 grand on their website, the new one ... 469 dollars... I don't get it
Yeah, you can do this with the free version
@Jsfilmz amazing... I'll buy for sure next week or ASAP. Do you think it works even with the latest MetaHuman Animator, or do we need an iPhone?
@elvismorellidigitalvisuala6211 you can use any camera with it; the Facegood ones are infrared, which I like
Now please do Lumen ray tracing 🤙
oof 😂
I have two questions: is this a free upgrade? Also, is this a version of MetaHuman that is imported into UE on your PC?
That’s crazy I thought it comes out in June
what comes out in june?
@Jsfilmz - the new MetaHuman face tracking thingy they used for Hellblade 2
No man, this is Facegood free software lol
What are your temporal samples?
Will we be able to use this feature on models made from scratch?
how many 4090 did you use for this render?
one
@Jsfilmz that electric bill just keeps on increasing every month :D
@apixel.content lol yea
I hope one day UE can use multiple GPUs in one PC, even H100 AI GPUs for rendering. So far, with only one 4090 there are still a lot of limitations.
@steedlei1 I believe path tracing supports more than one GPU
Is this with D2 Helmet?
yes
Did you use rokoko for the facial capture ?
You joking, right?
@Jsfilmz no
now we just need a model of markiplier, then we can all be markiplier
no idea who that is but sure 😂
37 hours...
so uncanny valleyish...
Dayummmm. Lol