WSJ did an amazing job covering this topic. What do you all think about this matter? P.S. I forgot to factor in texturing too, plus eyeballs and teeth, like yea it adds up.
As an experienced VFX artist, I think this is a great video and you brought up a lot of valid points! Got yourself a fan!
@@LonelionZK Thanks man, you think companies should be worried about the tech?
VFX houses will be using it, but I prefer your method, which is more convenient for independent filmmakers.
I honestly do not think MetaHuman will be able to replace traditional scan techniques anytime soon. This means Hollywood will always have an edge in AAA content. However, MetaHuman may be good enough for non-AAA content, especially for secondary characters. Epic claimed democratization of hyperreal character technology, but anything that has to do with realistic human characters is easier said than done.
I'm a musician myself, and the idea that I could potentially make my own music videos with UE5, an iPhone, and a ring light is just mind-blowing.
Music videos were the first use case I showed here a couple years back when MetaHumans came out, mang. It's the future.
In 2018, when I got my scans, the price was about $7K for a day of scanning, processing 20 expressions. This isn’t normally enough FACS scans to make a digital double face rig, but it’s a good beginning. This was at Pixel Light’s Beijing location so I’m positive it would have been more expensive if stateside.
This doesn’t include the rigging. The Snappers rig cost $10K. Shortly after, MetaHuman was introduced. After spending as much money as for a good used car, you know I was kicking myself. 😂
Most def it would have cost more here. Thanks for the input, man.
Unreal Engine rules bro, MetaHuman is state of the art.
Now a popular actor can make 20 movies in a year, leaving none for up-and-coming actors.
Alongside both, I can see an AI auto-rig being not too far away...
Hahah yea man, you add AI to the mix and companies like CAA are prolly sweating.
Black Mirror IRL.
IMO, the cost for CAA to do it in-house for the actors they represent would be a tiny fraction of what it would cost for someone on the outside to contract them to do it (which they probably wouldn't do anyway). I'm guessing if you went to a union studio to do it, soup to nuts, it would most likely be in the $75K range for the scan with rigging, etc. Then, on top of that, each film that used the character would have to do all of the mocap, which would be even more cost to integrate and actually use it. It's nothing for Disney to spend $500K-1 million per day on a Marvel movie, so it's a drop in the bucket to them.

At the end of the day, making an independent film using MetaHumans is an exciting possibility, but make no mistake, the Hollywood "gatekeepers" are in full control. So, unless someone is willing to make the sacrifice of making an independent film from start to finish with no or minimal funding (whether it's inside Unreal or live action) and without a distribution deal (which is also controlled by the gatekeepers), no one will ever see it except on YouTube, or perhaps the extremely remote possibility of selling the completed piece to Netflix or another streamer, where in most cases the licensing fee would most likely not cover the production costs on the first run. Then, of course, to even pitch to Netflix etc. you'd have to have an agent (even when you're a producer), which is its own nightmare.

This issue of making a film for the mass market "on one's own" has been around forever, and although I wish it would, I don't see it changing any time soon. The gatekeepers won't let it. Apologies for the long-winded post.
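To make the comparison in the comment above concrete, here is a rough back-of-the-envelope sketch in Python using the figures quoted in this thread; the per-film mocap cost is a pure placeholder assumption, not something anyone here quoted:

```python
# Back-of-the-envelope version of the cost comparison above. scan_and_rig and
# the Marvel daily spend come from the comment; mocap_per_film is an assumed
# placeholder to show how per-film costs stack on top of the one-time scan.

scan_and_rig = 75_000            # union studio, "soup to nuts" scan + rigging
mocap_per_film = 50_000          # assumed per-film mocap/integration cost
films = 3

digital_double_total = scan_and_rig + mocap_per_film * films

marvel_day_low, marvel_day_high = 500_000, 1_000_000  # quoted daily Marvel spend

print(f"Digital double across {films} films: ~${digital_double_total:,}")
print("Equivalent Marvel shooting days: "
      f"{digital_double_total / marvel_day_high:.2f} to "
      f"{digital_double_total / marvel_day_low:.2f}")
```

With those illustrative numbers, the whole digital-double pipeline lands at well under half a day of a big-budget shoot, which is the "drop in the bucket" point the commenter is making.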
bam!! nice post
Just speaking from experience. 🙂 @@Jsfilmz
You think these studios are startin' to sweat a little with new tech like MetaHuman? @@rjb7269
Absolutely! The traditional way to break into the industry has been what I did, which was to get a low-level gig at a network or production house. Do whatever you're told, strive to be the best at what you do, and meet and befriend as many people as you can in all disciplines, especially in the upper-management realm, which isn't easy since everyone is guarded and the competition is fierce. As time goes on and you want to advance, people will remember how you interacted with and treated them, and this works both ways, from bottom to top and from top to bottom. Everyone who is working is talented, and with minimal exception most are at an even keel from a talent standpoint, but the cream rises to the top. The industry isn't only about what one knows but more about who they know. The only thing that matters to the gatekeepers is "can this person make us a lot of money?"

On another, more positive note, I think the most effective use of the UE MetaHuman technology is making realistic-looking "pitch reels" (or shorts), which can then be used to pitch projects when one can get to the right person. I'm fortunate that my producing partner/wife does know some of the gatekeepers, but that's taken years of effort. No one gives anyone any money unless you've made something first that looks like it could have been made by a studio, and even then they'll put one of their own in to supervise.

IMHO your recent character work is outstanding. I could go on but don't want to bore you. Also, on another note... thank you for your service. 🙂 @@Jsfilmz
In 2017 I made a full-body 3D scanner using 80 Raspberry Pis. It cost me over $7K in parts and I coded everything myself. Back then I used photogrammetry. I used the scanner at Comic Con in Puerto Rico to offer 3D prints of people. After seeing this video I wish I still had it so I could go all-in on VFX and 3D movies.
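The commenter doesn't describe how the 80 Pis were wired together, but for anyone curious how a rig like that can fire every camera at (nearly) the same instant for photogrammetry, here is a minimal sketch of a capture node that could run on each Pi. The UDP trigger scheme, port, output path, and resolution are all assumptions, and it relies on the legacy picamera library, not the commenter's actual code:

```python
# Minimal sketch of a photogrammetry capture node for one Raspberry Pi.
# A controller machine broadcasts a UDP "CAPTURE <shot_id>" message and every
# Pi that receives it snaps a still. Port, message format, paths, and
# resolution are illustrative assumptions.
import os
import socket
from picamera import PiCamera  # legacy Raspberry Pi camera library

TRIGGER_PORT = 5005                      # assumed UDP port for the trigger
OUTPUT_DIR = "/home/pi/scans"            # assumed output path

os.makedirs(OUTPUT_DIR, exist_ok=True)

camera = PiCamera()
camera.resolution = (3280, 2464)         # full-resolution stills (Camera Module v2)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", TRIGGER_PORT))            # listen for broadcasts from the controller

print("Capture node ready, waiting for triggers...")
while True:
    data, addr = sock.recvfrom(64)
    msg = data.decode("utf-8", errors="ignore").strip()
    if msg.startswith("CAPTURE"):
        shot_id = msg.split()[-1]
        path = os.path.join(OUTPUT_DIR, f"shot_{shot_id}.jpg")
        camera.capture(path)             # each Pi fires within a few ms of the broadcast
        print(f"Saved {path} (trigger from {addr[0]})")
```

The controller side would just enable SO_BROADCAST on a UDP socket and send something like `CAPTURE 001` to the broadcast address; the collected stills then go into photogrammetry software the usual way.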
I really like your insight and agree that people need to realize their potential at home. This may also mean buying a few additional plugins, maps, etc., but I think I've spent less than $1000 on marketplace stuff (UDS, etc.) and have what would have cost millions a decade ago.
One other question: do you have a comprehensive video on the latest likeness, what you did, skinning, etc.? I have seen a lot of your videos, but it seems like this recent one just exploded with more realism, and I'm not sure if I missed another video about it.
I agree with you, MetaHuman is really impressive and it's only in its first stages. I can't imagine what Epic Games will achieve 4 or 5 years from now. By the way, great video buddy ♥
yea man time and money saver imo
How do you render green screen video, please, Sir? 😢😢😢😢😢
When I was in college, pre-2004, I said there would be a day when actors never die because they would all be CG, with royalties going to the surviving family members.
Dope video bro! I did a custom MetaHuman with dreads. I used my Sony A7S II to do the textures with a little morph mesh. It's scary how much it looks like me bro. So we gon get it fam!
Your ‘MetaHuman Carbon’ video last week says "$4K (IF YOU HAVE A HEAD SCULPT)", and I assume that means $4K for just your rigging service alone, with no sculpting, texturing, etc. I’m about to buy a Threadripper system + RTX 4090 with the intent to animate gorgeous sculpts of historical figures of similar quality to the sculpts on ArtStation, e.g. those of Hadi Karimi. It seems like you’re saying I’ll be unsatisfied with the rigs/animation results I get from the rig generated by the 'Mesh to Metahuman' plugin. 1. What problems will I see? 2. What will your $4K rigging process do to alleviate those problems?
Mesh to MetaHuman rounds off shapes so much, but it's best to try it; depending on the shape you might get lucky. Funny you said that, I'll upload a sculpt rigged with MetaHuman Carbon soon.
@@Jsfilmz Thanks Jae.
I just watched the 3D Scans video about their new MetaHuman packs and they are pretty awesome.
Yea man, I finally convinced him to make MetaHuman versions of his skins hahaha. He saw the potential of MetaHumans man, now he just has to scan more.
In that case, thank you for doing that! It opens up lots of possibilities for those of us who don’t have Hollywood resources 😊
I think blockchain would make the legal stuff easier: smart contracts attached to those MetaHumans, which could be NFTs, where the smart contract says that the company gets, e.g., 10% of everything the actor makes with those MetaHumans. This would mean that the way movies get published would need to change from a data perspective. Even though it would be a lot of work to migrate to this, it could give us the possibility to move even faster toward the future. Just a thought I had the first time I came in contact with OpenAI and all the other stuff.
then get scammed
@@Jsfilmz I ain't talking about crypto in the form of stocks, but blockchain technology and how NFTs were meant to be. Like how one can trade real estate via NFTs, for example.
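The royalty logic this thread imagines is easy to picture as plain code. Here is a toy sketch in ordinary Python (not an actual on-chain contract) of the kind of split such a contract would encode; the parties and the 90% remainder are invented for illustration, with only the 10% cut taken from the comment above:

```python
from dataclasses import dataclass

# Toy model of the royalty logic a MetaHuman "smart contract" might encode.
# Parties and shares are invented for illustration only.

@dataclass
class RoyaltyTerm:
    party: str
    share: float   # fraction of gross revenue, e.g. 0.10 for 10%

TERMS = [
    RoyaltyTerm("scanning_company", 0.10),  # the 10% cut mentioned in the comment
    RoyaltyTerm("actor", 0.90),             # assumed remainder
]

def split_revenue(gross: float) -> dict:
    """Return each party's payout; shares must sum to 1.0."""
    assert abs(sum(t.share for t in TERMS) - 1.0) < 1e-9
    return {t.party: round(gross * t.share, 2) for t in TERMS}

print(split_revenue(1_000_000))  # {'scanning_company': 100000.0, 'actor': 900000.0}
```

An actual on-chain version would add escrow, payment triggers, and identity checks, which is where the "lot of work to migrate" part comes in.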
Now I am curious how much it costs
Do you have any courses on the construction, rigging, and tweaking needed to get realistic MetaHumans? Love your stuff. Would like to see you go back to doing more tuts too. 👍👍
No, I'm done makin' courses tbh, pirates ruined it.
Really sorry to hear that. You have an amazing amount of knowledge and you love to share. I've learned a lot from you, as have many others. @@Jsfilmz
Even though I'm an ass for my comments, I wish you nothing but the best for next year broddy, thanks for all the videos you upload...
I agree, all these cameras and rigs are going to be obsolete within three years. The only hardware you will need is the hardware to run your software.
Modern Warfare 3 actors were scanned, and Autodesk Maya was used to rig and animate. Infinity Ward and Sledgehammer Games finished MW3 in record-breaking time thanks to photogrammetry. It alleviates complete reliance on human actors, who can be late to set or even miss shoots. I'm down to use this tech for game and animated filmmaking; I'm not so interested in a "live action" film application.
Hey! I'm starting my Unreal filmmaking journey and I would love to see/try a sample of your face and body mocap. I feel like there aren't a lot of samples that I can just import into Unreal to build confidence and experiment. 5 seconds would be great if possible! I have a pretty good computer with an RTX 3090, a Ryzen 5900 or whatever it's called, and 64 GB of RAM.
Thanks for your time.
Did you make a tutorial on how you created such a realistic MetaHuman? If not, WHY??? 😢
Thanks for the amazing educational content; you are right about everything.
At the same time, it makes no sense for an ordinary user to “revive” current pop and film stars, except for a portfolio, as in "see what I can do," because a 3D model of a celebrity cannot be used in large projects; that would result in millions in fines.
Question: what do you think about historical characters, such as Macedonsky, Charlie Chaplin, Bruce Lee, and thousands of interesting personalities of whom there are no photos, or only a couple of pictures somewhere? How do you create them in 3D? I don't mean ZBrush, Substance Painter, etc.; I'm talking about photogrammetry technology.
Creators! Just work man! Use what you got!
Unreal Engine to the moon guys!
Am I the only person who hopes that actors go the way of the dodo? Imagine a world with no celebrities. I love MetaHuman and hope it is a step toward removing actors from the main stage. I, for one, do not want to see them immortalized in 3D world space.
Sure, MetaHuman is definitely a game changer for being able to create a digital copy of yourself at home without having to spend thousands of dollars, but as far as I know this currently only applies to the face, and you still need supported, expensive hardware to achieve a good-to-perfect scan and to animate your MetaHuman.
I don't know if it's already possible to do a full-body scan of yourself within MetaHuman, but if not, then the traditional, more cost-intensive way is probably still the way to go.
However, I saw your video introducing your MetaHuman and it definitely looks amazing!
I would like to see how you were able to implement all the facial details like the birthmark, because the last time I tried to create a MetaHuman from a 3D scan of myself, details like this weren't captured by UE5/MetaHuman Creator.
The MetaHuman DIY route is still uncanny valley, mainly because of the low fps on microexpressions (I think you need to capture at 60-120 fps). One camera could do the face, but I think at least two will be needed, and that will still be the most expensive equipment one has to get (unless Unreal starts charging for its service). But what will be the cost of those cameras in the future? Will IR and 3D capture be included in consumer products (like webcams)? Then the price will come down fast, as people will not need Apple's products. $1000, I would say, is the cost now.
Agreed, I also have a FACEGOOD stereo cam for $1200, but Epic's process to ingest it is a clusterf***.
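To put rough numbers on the frame-rate point in the comment above: a brief expression lasting a fraction of a second is only a handful of frames at low capture rates, which is part of why it reads as uncanny. A quick illustration, where the 150 ms duration is an assumed figure, not a measured one:

```python
# How many frames does a brief microexpression span at different capture rates?
# The 150 ms duration is an illustrative assumption.
MICROEXPRESSION_MS = 150

for fps in (24, 30, 60, 120):
    frames = MICROEXPRESSION_MS / 1000 * fps
    print(f"{fps:4d} fps -> ~{frames:.1f} frames to describe the expression")
```

At 24-30 fps that expression is only about 4 frames of data, while 120 fps gives roughly 18, which is the gap the commenter is pointing at.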
They don't need rigging because they use deepfakes, since their concern is movies. And the day deepfakes become real-time, rigging becomes obsolete. This, and everything like NeRFs and Gaussian splats, will sooner or later stream and provide generative solutions. Rigging is good for games, but we have to watch the pace of AI and the potential future.
Bruh, for them to spend $20K+ to photoscan a person only to deepfake on top at the end would be the dumbest move in the history of VFX 😂
@@Jsfilmz It is not the dumbest move if it can produce higher quality. MetaHuman is accessible, but that means everyone can make the same stuff, so if you wanna stand out you gotta have a better solution. I'm a tech artist and I can only see amazing things this kind of tech promises.
In fact, the only bad thing with deepfakes right now is that they require too much training data; once they reach a certain level of quality, deepfakes will be better than polygons.
Even when skeletal meshes support Nanite, the data consumption will be too large. This is where neural nets shine: they are masters of compression, which is what a deepfake, or any other AI really, is. Unreal uses Oodle compression with AI-based techniques for a reason.
@@Madlionbruh that's what you're not understanding: you can't deepfake onto a statue that has only been photoscanned. Remember, the source will still have to have facial animation for you to deepfake on top of, so at that point just hire an actor who looks the same and deepfake on top.
@@Jsfilmz That is how deepfakes work today. How do you think MetaHuman Animator can take your facial mocap and drive any other MetaHuman? You gotta understand what a rig is and what neural nets can represent. What I'm saying is that in order to create high-fidelity systems you need high-fidelity input, something these guys have the tech to capture. This is also how MetaHuman was made: by analyzing facial muscle deformations and then simulating them inside the RigLogic node in Unreal. Without good data you cannot have good output. Now, everyone has a different muscle structure, facial asymmetry, and tendon movement, so if you want hyper-realistic you still need these machines. You gotta see things outside the box and understand how they work under the hood.
Maybe you don't quite see how things are connected and think this is just photoscanning/photogrammetry... but really it's the data that's hyper valuable.
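For readers unfamiliar with what "a rig" means in this exchange, here is a minimal conceptual sketch. It is not Epic's actual RigLogic; the control names, blendshape names, and mapping matrix are invented for illustration. The idea is that a facial rig maps a small vector of control values to blendshape weights that deform the mesh, and because the control space is shared, one mocap performance can drive many different faces:

```python
import numpy as np

# Hypothetical control and blendshape names; a real rig (e.g. RigLogic) uses
# hundreds of controls and a far richer, partly nonlinear mapping.
CONTROLS = ["jaw_open", "smile_left", "smile_right", "brow_raise"]
BLENDSHAPES = ["jawOpen", "mouthSmileLeft", "mouthSmileRight", "browInnerUp", "cheekSquintLeft"]

class ToyFaceRig:
    """Maps normalized control values (0..1) to blendshape weights.

    The mapping matrix is the 'rig'. Two characters with different meshes can
    share the same control space, which is why one performance can drive many faces.
    """
    def __init__(self, mapping: np.ndarray):
        assert mapping.shape == (len(BLENDSHAPES), len(CONTROLS))
        self.mapping = mapping

    def evaluate(self, controls: dict) -> dict:
        c = np.array([controls.get(name, 0.0) for name in CONTROLS])
        weights = np.clip(self.mapping @ c, 0.0, 1.0)
        return dict(zip(BLENDSHAPES, weights))

# Made-up mapping: e.g. 'smile_left' mostly drives mouthSmileLeft plus a bit of cheekSquintLeft.
mapping = np.array([
    [1.0, 0.0, 0.0, 0.0],   # jawOpen
    [0.0, 0.9, 0.0, 0.0],   # mouthSmileLeft
    [0.0, 0.0, 0.9, 0.0],   # mouthSmileRight
    [0.0, 0.0, 0.0, 1.0],   # browInnerUp
    [0.0, 0.3, 0.0, 0.0],   # cheekSquintLeft
])

rig = ToyFaceRig(mapping)
frame = {"jaw_open": 0.2, "smile_left": 0.8, "smile_right": 0.8}
print(rig.evaluate(frame))  # blendshape weights for one frame of "mocap"
```

The commenter's point, in these terms, is that the quality of the mapping depends on the quality of the captured deformation data used to build it, which is where the expensive scanning rigs still matter.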
Nice!
Black mirror anyone?
Go Jaayyy. UE5 to the world 🎉