Marc
Joined 15 Sep 2011
Videos
how to use Metahuman Animator to track custom videos
463 views · 14 days ago
Houdini brings light where Maya is dark
131 views · 21 days ago
Stop metahumans some time to learn some houdinis
imagine metahuman with a dynamic layer
654 views · 2 months ago
Adding a dynamic layer to get a fleshier look. The point is to try to make the MetaHuman look as similar as possible to a 4D scan animation and, if lucky, use upcoming tech like the Machine Learning deformer to export it to UE, but no idea if it's possible. Anyway, happy 78th bday Freddie Mercury 😅
Extracting old subtle emotions with MHA
575 views · 3 months ago
Jack is back
1.1K views · 4 months ago
A year ago this awesome Jack performance was one of my first tries with MHA. After a year of experiments... now using it again for a new double-resolution MetaHuman test.
May the 4th be with you - episode 2024
176 views · 6 months ago
Double resolution Metahuman with gaps using HD scan store textures coming to UE
1.3K views · 8 months ago
some more tips about this Wrap tool in maya
1.1K views · 8 months ago
metaAnimator need a rest after this... by Lee Sun-bin
483 views · 9 months ago
wrap metahuman heads inside Maya - Gumroad link
943 views · 10 months ago
History warning - this didnt happen either - happy new year 2024
87 views · 10 months ago
merry xmas! metaWrap updated (soon link will be updated with body wrap solution)
3.2K views · 10 months ago
MetaHumans custom topology and face muscles WIP
4.4K views · 11 months ago
pushing the limits what can be tracked and what meta animator can track
700 views · 1 year ago
"Gentlemen. You can't fight in here. This is the War Room!"
1.4K views · 1 year ago
Awesome!!!
Bro, it's cool and wonderful, but I can't find the way to export the depth Y map from Nuke. Can you explain it clearly? It's hard to understand when not all the nodes used to create it are visible.
He is not going to explain that easily. You have to do trial and error to figure it out 😂
love it ...
..thanks 🙂
This is super cool but I see a lot of ethical problems with this.
Love from Korea.
sometimes I like to go to Jeju Island to relax
It's almost torture that there aren't any courses made by you
Thank you ❤❤❤
Thank you very much for this!
Amazing and brilliant workflow. Thanks a lot
Thank you Marc ! You know you are the man and the goat !
Thank you!
I have figured out a way to do all of this without using KeenTools or Nuke. KeenTools helps with videos where too much head rotation is involved, but once the output video from KT is inserted into my tools, it breaks the video down into JPG and depth files for processing. Your hints helped me figure out some stuff along the way. Thanks for this video.
You can use whatever system; you just need an RGB and a depth Y... maybe soon AI will do this and Epic Games will include this custom-video option by default.
@@marc1137 What version of Nuke you are using?
@@MotuDaaduBhai You can use whatever version, 13, 14, 15; nothing special in the workflow.
Hello Marc @marc1137, great job as always! I also have a question, if you don't mind. I've already found a practical way to generate consistent depth maps and use them in MetaHuman Animator. My main problem is that after loading them, the normalized values of my depth map are in a range MHA is not expecting, which makes the face scaled 3 or 4 times in the depth axis (Z). I am using a MySlate_# folder created by my iPhone and replacing the Depth_Frames EXR files, but I can't find how to make the depth map fit the exact white-to-black range it's expecting. Would you mind helping me find a solution? I'm guessing you've been through all of this already. Thank you for posting your inspiring work on this channel; you should have more subscribers.
Normal map? Anyway, the whites should be greater than 1. I think soon I'll do a video covering some more details about the last steps.
@@marc1137 Thank you Marc for your reply. I meant depth map (sorry). Let me explain better. I get a temporally consistent, 32-bit black-and-white depth/height map from AI, very close to the original 3D shape. These depth-map sequences are 32-bit, and I have to convert them to EXR format using a Python script. When I do this I "normalize" the range of values into the 0-to-1 space. (I'm not referring to normal maps, sorry for the confusion.) This means the lowest point is represented by black (0) and the highest point, closest to the camera, by white (1) in the 0-to-1 color space of float 32-bit EXR. I also invert the range, because the iPhone uses inverted depth maps (1 farther from the camera, 0 closest to the camera). The problem is that MHA reads these depth maps in a way that stretches them in the MHA viewport and makes them unusable inside the MHA plugin in UE5. If you could share how to do this final step, I would appreciate it. You can also contact me at register.online.100 at gmail. It seems this step works for you, but I'm doing something wrong with the format that I can't figure out. How do you handle this particular range of values in the depth map and the 32-bit EXR format? Thank you again for reading this and for your kind reply.
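The remap the commenter seems to be missing can be sketched with numpy alone, before writing the EXR with whatever converter is already in use. This is only a sketch under assumptions, not the workflow from the video: the `white=1.5` target is a guessed value based solely on Marc's hint that the whites should be greater than 1, and would need tuning per footage.

```python
import numpy as np

def remap_depth(depth01, black=0.0, white=1.5):
    """Linearly remap an AI depth map in [0, 1] so the brightest
    (nearest) points land above 1.0, per the hint that MHA expects
    whites greater than 1.

    black/white are illustrative targets, NOT documented values:
    tune them until the face stops stretching along Z in the MHA
    viewport. Use the SAME constants for every frame; re-normalizing
    per frame would break the temporal consistency the depth
    sequence already has.
    """
    d = np.asarray(depth01, dtype=np.float32)
    return black + d * (white - black)
```

Whether the source needs flipping first (`d = 1.0 - d`) depends on its convention; the commenter already inverts his AI output to match the iPhone EXRs, so the flip would go before this remap.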
Wow
Silke Vennegerts. Dieterwergel ❤😅
😂😂😂😂
marry me
perfect! I have a question. Is it possible to transform the target model only if it is identical to the metahuman topology?
You always need a MetaHuman topology matching your custom model, but then you can keep the custom-topology model to export. The MetaHuman topology is needed to transfer skin weights and some other stuff.
Hello, can you please help me understand how to convert an EXR image with only a Y channel so Unreal can read it.. I've searched everywhere.. but can't understand how you do this..
Congratulations. The channel thing, I think, is just converting your new EXR to Y with a Shuffle node or something similar in whatever app, like Nuke or AE.
@@marc1137 Thank you so much, it really works with a Shuffle in Nuke. And thank you for giving the idea of what MetaHuman can use without an iPhone :)
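For anyone who wants to do that Shuffle step outside Nuke: at the pixel level it is just copying one channel of the image into a single-plane "Y" image before writing the EXR. A minimal numpy sketch; the assumption that the depth lives in the red channel is mine, so check your own renders:

```python
import numpy as np

def shuffle_to_y(rgb, channel=0):
    """Mimic a Nuke Shuffle node: pull one channel out of an
    (H, W, 3) RGB image as a single-plane 'Y' image.

    channel=0 (red) is an assumption; pick whichever channel
    your depth render actually carries.
    """
    rgb = np.asarray(rgb, dtype=np.float32)
    return rgb[..., channel]  # shape (H, W): the lone Y plane
```

The resulting array would then be written as a single-channel float EXR whose channel is named `Y`, using whatever EXR writer is already in the pipeline.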
No to this channel
That's really impressive, well done!
Or go even more crazy: do the cloth simulation in Unreal. BTW Epic apparently has a full 4D facial scan data ML training workflow. They mentioned several levels of complexity for advanced metahumans - 1 scan, 50 scans and 50,000 scans (ML rig, MH4D) - "Pushing Next-Gen Real-Time Technology in Marvel 1943: Rise of Hydra | GDC 2024".
For now I want to try building some fake muscle model under the skin to have more collision/sliding going on. My point is to do things in the simplest way possible so it can be scripted to automate the process for whatever head, then see how it could be exported to UE.
Really impressive work. Did you use Maya dynamics to simulate sticky lips as well?
Yes, using the normal nCloth stuff; old tech but very good. Since I just need some subtle movements for now, I'm using 3 steps, very low, for faster simulation. The next step is to try adding some real geometry under the face to act as muscle colliders.
Amazing. I'll look forward to it. Keep up the good work!
Is it similar to what Unreal did with the Matrix project? Using the ML deformer?
This is Maya dynamics, not ML, but that's the thing to try soon: export all this to UE using that tech.
@@marc1137 Cool, let us know if it exports well! Maya 2025 announced an ML deformer; maybe it can ease the export to Unreal's ML deformer.
@@wolfyrig1583 Autodesk has said the ML deformer should only be used for animation purposes. They suggest going back to the original dynamics for final render. This implies that ML deformer isn't top quality enough to use for display purposes.
Wild, that we are able to extract information from an old video and apply it to a 3D model. Amazing!
lot of previous tests showing part of the process
1:53
this is great! this is what I need! Is it possible to discuss cooperation in telegram?
Hi, thanks, but I don't have Telegram. You can send me an email, but to be honest I don't have much free time for freelance work.
@@marc1137 Is it possible to buy a guide from you?
Hello?
Tutorial?
Did you modify the MetaHuman expressions or leave them exactly as they are after processing in the Unreal Engine (MetaHuman Animator)?
I have my own tool to modify the original MetaHuman in Maya; I use double resolution and custom blendshapes. One of the things I want to try is making the MetaHuman look as much like a 4D scan as possible.
@@marc1137 If you make it look like a 4D scan then you could use it to train a neural network without having to use real 4D data and it will generalize well when the networks sees actual 4D.
amazing man, really clever :)
thank you! even if probably not many people will take a look, I'm a bit of the anti-social-media style... 😅
@@marc1137 i am curious if you ever tried face tools from reallusion?
Cool and very interesting how you got the depth map from the video. Can I get a hint?)
I just replied the same in another comment, and in some older videos I show the main process of tracking the face in Nuke.
Nice, which ML model do you use for depth estimation? I've been using MiDaS, but it's really slow.
I track the footage in Nuke with KeenTools, so since there's a 3D face there moving similarly to the video, I can export the depth.
@@marc1137 oh super smart! I'll try it out
Nice work!
thanks!
My brotha!! Your WeChat, please. We need to talk!
I'm trying to do the same thing you did, but with a cat. I have the 3D cat done and I can't ZWrap it to transfer its topology; it doesn't work.. Did you sculpt the lion head from the MetaHuman head?
ZWrap or FaceForm (Russian software) can do that, but even though they're like magic, at some point you need some manual work to fit the more extreme shapes.
how did you attach the hair to the head?
Hair done in Maya; export .abc, and in UE just attach it the normal way, creating a binding to the skeleton.
I love the lyrics; I listened to it in French. It's sad, but they're well chosen. She has no luck, and on top of that her happiness isn't coming back. I'm sending several ❤❤❤❤❤❤ to show my esteem. Peace to her. Alain
Pretty realistic, but when you look at the lips more closely there's an element of touch missing. If you clap your hands, you can tell the sound came from your hands because you saw them touch, the momentum and impact of the skin, that small reaction of the skin moving, a tiny bit of pink and then pale. When you move your lips, they change color with the force, and there is a slight change in shape when they touch, which we notice.
One effect I'm trying to achieve is the famous sticky lips, which in Maya can be done in a dynamic way; but in UE the MetaHuman controls are not enough, so I'm thinking of trying some blendshape tricks, though I'm not sure yet. I also need to test the shader better, with all those masks controlling the 3 main textures, but there's not much time. I just hope the next one will look better.
Dope
I question why youtube brought me here
Men of culture we meet again
So we can wait another videos ? are you back from hollidays ? :P
I never know when it's the last one; I don't post just to keep the channel alive... and these videos are done in a kind of hurried free time between company projects....... the future is always unpredictable.........
Great! Go on, the eyes need some blinks
You can say that to Jack Nicholson.. it's his tracked performance 🤦🏻
Hi Marc, excellent work! When do you think the product will be available?
For now, never, since the support is 0, and as I've said many times, I won't share what I have in the "dirty" state it's in..... so it's a kind of personal learning development that never finishes; there's always something to update, to fix, to test........
How scary 😮
Hello! Nice work. I have one question: how did you transfer the horse to a MetaHuman?
🤩 wow
I want the full tutorial, please.
So beautiful to see a xenomorph singing Whitney Houston
I had the song in the timeline from a previous work..... and felt the lyrics fit quite well... .........but if I come to you..... will you stay... will you run away...
very kissable lips 🤭
Ms. Konečná, I won't vote for you, but I want to tell you: be a communist liked by the West. I wish you that; I feel no hatred.
Please don't trust the Russians.
I'm writing, Ms. Konečná, democratically.
Klaus, the cause of all the evil, of the tunnelers, contacts with the Russian mafia.
Ms. Konečná, please don't trust the Russians.