Man this really makes me want another 2009 game, I love Spirits Unleashed but man I miss the solo story and those graphics would be incredible
Agreed, I want a REAL remaster of the game
There are also scenes that were recorded but cut from the game due to a lack of budget, time, and technology; it would be amazing to have those added back in.
@@Musicalmane True, plus MULTIPLAYER! For God's sake
I miss Pong
We need an open-world Ghostbusters game!
What is it about the music, the siren, the car, and characters that just works so goddamn well? Always puts a smile on my face.
A lot of the people saying "it doesn't look real" don't get it. This is new technology that lets a director see the post-production CGI, in a lower-quality version than the final, in real time on set during filming. That could have a major effect on how these tools get used, since they can be integrated with a performer's work as it happens. Not to mention that an actor can actually see where their animated or CGI scene partners are and what they're doing, which makes for better performances from everyone and gives the director a view of things that, until now, couldn't be seen until wrap and months of post-production afterwards. That delay is often the real cause of bad CGI, I think: trying to make subpar shots "just work."
I thought I was crazy reading these comments. It's like no one even watched the video or understood the point of the presentation. These aren't the actual effects for a movie; it's about creating moving concepts of what it could look like, without eating up time and resources at the start of a production.
It's literally a step in the wrong direction. There's nothing good about this.
@@wilmcl9209 Nah, you're right, let's go back to filming the whole movie, slapping CGI onto the raw footage, and having them "make it work" with lots of sloppy CGI whenever an issue arises, forcing footage to fit a new concept for the scene and spending potentially millions on reshoots if they can't use the existing footage. Or shooting scenes and only learning in post-production that they just don't work the way they were imagined, ending in scrapped script sequences. Not all technological advancements are negative. Maybe, just maybe, seeing a snapshot of what a scene will look like DURING the initial filming process, and not afterwards in post-production CGI, could not only prevent costly reshoots but also help marry the practical and the CGI like never before, giving any on-camera actors a direct visual representation of what's being attempted. CGI isn't something that should be wholly removed from movies, because without it we wouldn't get believable flying, or style-crossing films like "Cool World" or "Who Framed Roger Rabbit," or a hand-drawn character inserted with computer software like the cartoon cat cop from Last Action Hero. It would allow an on-set actor to "see" their digital co-star reacting with them in "real time" instead of having them added afterwards, after the actor spent hours and hours of filming talking to a ball on a stick with no real idea of how it would look. Too much CGI is an issue, but denying it has any place in movies is as idiotic as someone from the late 1920s complaining about too much long-winded dialogue in movies: "Movies don't need all that talking, just pop a dialogue text card up and get back to the action." That's probably something your great-grandfather would have said.
These are the people who think CGI means you press a button and the computer does all the work for you
@@whosmansisthis6624 Where was that stated? When specifically did they say any of that?
This is called a proof of concept. It’s meant to showcase their ability to display camera angles and VFX capabilities. This isn’t a finished sequence.
True enough; however, people expect a bit more polished, production-level quality in the result when the title has "movie" in it.
Seen on its own, it's no better than any standard video game. Motion, physics, volumetric lighting: all of it screams video game, and very little to none of it screams cinema.
pretty sure they're using a customized unreal engine 5 to shoot this
Exactly. It was part of a PlayStation Showcase. It has nothing to do with movies. Clickbait headline/title. @@cagneybillingsley2165
the title also says "test" lol
@@cagneybillingsley2165 They legit show UE5 being used; not "pretty sure," they are.
People trying to compare this to regular CGI are missing the point. Most CGI takes hours per frame; this plays out in real time, the way a real set does during a take. For the actual final shots they would move this into a rendering pipeline that produces far better quality, but the idea is that they can now plan out scenes very economically this way.
They can iterate faster, save money on physical shots that might get cut, and deliver movies faster.
Where was that stated? Never once was that said. They literally are talking about real time rendering. Btw, they actually already do what you're talking about, that isn't new. You're clearly missing the point here.
@captaincaptain2128 If you look closely, the beginning of the video has text at the bottom of the frame saying this is a real-time visual effects proof-of-concept video. It is a game changer.
@@MrGreenAKAguci00 Not really. The technology isn't quite powerful enough to make real time rendering look as good as traditional CGI. Give it a decade.
@captaincaptain2128 It doesn't have to be that good if it's only used during production, to let the crew work more efficiently and be better informed about what the end result will look like. A single She-Hulk episode, after all the CGI changes caused by the director's indecision and constant script changes, ended up costing around 25 million USD, mostly because of VFX. That show had many other issues, with the writers' room and other things, but what Sony shows here makes integrating VFX far easier than before, because changes can be implemented and verified in real time. Then, after everyone is happy, they render it for the big screen with all the bells and whistles. What you saw here was not the final release.
@@MrGreenAKAguci00 What you're describing already exists and has existed for nearly two decades. This isn't that, though. I suggest you do a little more research on this technology instead of just watching one video; there are dozens of articles and such about it. You're clearly misunderstanding the purpose of this tech. It's meant to replace post-production CGI work by essentially using digital puppets and rendering it all in real time. It's not meant to be a reference for the director, because that technology already exists and already lets them do just that. This tech is meant to replace post-production CGI and let directors edit the final shots that day, such as adjusting lighting. I understand not everyone knows how filmmaking works, but you must know that the amount of time and effort that went into this test alone runs into millions of dollars, right? The assets need to be made prior to filming and can take hundreds of hours and millions of dollars, and for what? To be a reference for the director and then get scrapped, so the CGI team can spend hundreds more hours and millions more dollars recreating the same stuff at slightly higher quality? You understand how crazy that is, right!? It's called real-time rendering for a reason: it's rendering in real time. Ultimately it's going to replace traditional CGI eventually, but the tech isn't there yet. I suggest you read up about it more, as it's very fascinating tech, but you clearly don't know how it works or its purpose.
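To put rough numbers on the "hours per frame versus real time" point raised at the top of this thread, here is a minimal back-of-the-envelope sketch. The per-frame cost and shot length are illustrative assumptions, not figures from this production.

```python
# Back-of-the-envelope comparison of offline vs. real-time rendering for one shot.
# All per-frame costs here are assumed, illustrative values.

FPS = 24                      # cinema frame rate
SHOT_SECONDS = 60             # a one-minute shot
OFFLINE_HOURS_PER_FRAME = 2   # assumed offline render cost per frame

frames = FPS * SHOT_SECONDS
offline_hours = frames * OFFLINE_HOURS_PER_FRAME
realtime_seconds = SHOT_SECONDS          # real-time output renders as fast as it plays back

print(f"{frames} frames in the shot")
print(f"Offline:   ~{offline_hours:,} machine-hours ({offline_hours / 24:.0f} machine-days)")
print(f"Real time: ~{realtime_seconds} seconds of wall-clock time")
```

Even with generous assumptions, the gap is several orders of magnitude, which is why iterating on set becomes practical.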
It has The Real Ghostbusters cartoon vibe to it.
Just came from a website where they said there are upcoming games & animated projects in the works for Ghostbusters, so this could very well be what they will be like.
Looks like a great video game. Doesn't look like a movie though.
Nope, it sure doesn't.
This was rendered in a day. Imagine what they could do in a month…
Ok boomer
Yup... is Sony only concept-testing 2018 technology now?
This is about real-time directing; it's not so much about the CG.
This is proof of why in camera practical effects are still the way to go.
For now. Give it a few more years and a AAA asset budget and you'll be wrong.
@@user-lz5vh9bb5w People have been saying that for decades. Nothing beats the weight, the tactile nature, the REALITY of practical effects. Even the best digital approximation is still only an approximation. Diminishing returns mean that getting it to perfect will be prohibitively costly.
@@alpinion323 100%. I remember video game designers telling me that their best tech would replace movies within two to three years. And that was 14 years ago.
@@alpinion323 Yes, but we haven't had such good AI prospects for decades. Things are about to change exponentially fast.
@@user-lz5vh9bb5w Maybe. But even if it could, would you want it to? What's more impressive: seeing an acrobat do a backflip into a glass of water, or seeing an animation? There used to be a sense of "wow, how did they do that?" But even if you're right, that will be replaced with "AI, make all my eyeball food. Nom nom nom."
This isn't about this footage being the "final product." This is about a filmmaker being on set, looking through the camera, and being able to adjust everything based on the real-time feedback from the monitors. This, combined with the StageCraft technology recently used on The Mandalorian and other shows, is pretty huge. We're getting to the point where directors can direct scenes organically and in person the way they would on location... but this time it's virtual.
Why is it called cameraless if the director will be using a camera?? Genuine question, not being facetious.
@@maymayman0 What you see in the video is entirely rendered in the computer, and the director directed it by standing in a warehouse-sized room full of equipment, computers, etc. He had a tablet-like device, a sort of "camera equivalent," but it wasn't a camera. On that device's screen, and on other monitors around the room, he and the other crew members could see what the virtual camera was pointing at. The director could move the device around and see what the computer was showing in real time. However, no actual cameras were used to make this short film; everything you see was created without a video camera recording any live picture. That's why it's called cameraless.
But the clip states that this shows you can have "final pixel quality in a fully synthetic environment".
@@andysmith1996 I don't know, dude. I'm not someone who worked on this. My guess is that this was a test and the final renders for an actual movie would be "final pixel quality."
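A minimal sketch of the "cameraless" virtual-camera idea described in this exchange, under the assumption that a tracked handheld device only supplies a transform, never a recorded image. FakeEngine and read_device_pose are invented placeholders, not a real Unreal or Sony API.

```python
# Sketch only: placeholder tracking and renderer, not any real stage software.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in scene units
    rotation: tuple  # (pitch, yaw, roll) in degrees

def read_device_pose() -> Pose:
    # On a real stage this would come from the tracking system following the handheld rig.
    return Pose((0.0, 150.0, 300.0), (0.0, 90.0, 0.0))

class FakeEngine:
    """Stub standing in for a real-time renderer."""
    def set_camera(self, position, rotation):
        self.camera = (position, rotation)

    def render(self):
        return f"synthetic frame seen from {self.camera}"

    def display(self, frame):
        print(frame)  # would go to the device screen and the stage monitors

def tick(engine):
    # One update: read the device transform, render a fully synthetic frame, show it.
    pose = read_device_pose()
    engine.set_camera(pose.position, pose.rotation)
    engine.display(engine.render())

tick(FakeEngine())
```

The point of the sketch is the data flow: the physical device contributes a pose each frame, and every pixel the crew sees is generated by the engine.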
I love it. They should use this as one of the previews for the new movie, ahead of the other previews, to get fans excited for the new movie coming out. I know I can't wait to see it.
The CGI looks great. That being said, Jurassic Park came out 30 years ago and still feels more convincing.
Because it was made with software called Softimage. That software was so good it almost destroyed the 3D software industry, which is why Autodesk bought it and killed it.
A lot of it was a mix of practical effects and CGI. What makes the effects believable to this day is that they knew how to combine the two and play to their strengths.
@@hopperhelp1 100%. The pure CG elements, like the wide Brachiosaurus shot and the herds of running dinos, look kind of dated, but the hybrid approach was the best!
Jurassic Park did everything right. They also made use of the dinos in the dark and mixed in practical effects. I hope Hollywood soon realizes it needs to go back to doing both.
JP had a 1:1 hydraulic monster T Rex. And the cgi was driven by a stop motion puppet operated by a stop motion master.
This is awesome. We need a follow up to the 2009 game but co-op this time around
It was co-op on the Wii 👍
🙌🏾
@@sideskroll Four-player story co-op and an open world is what we want.
@@trealsteve Aren't you tired of "open world" though? I can't play a single "open world" game anymore...
@@sideskroll No, I’m not. Too bad for you.
This has got to be the next Ghostbusters video game.🤞🙏🤞🙏🤞
The remastered Ray Parker Jr. song sounds really good; they should release it.
Oh come on, people comparing this with Jurassic Park, seriously? Understand this is not a finished product; it's a kind of sketch of what can be done, still to be worked on and developed, and the foundation for great and amazing things in the industry. In my opinion this looks and feels AMAZING: the rendering, the immersive setting (NYC streets), the clearly remastered soundtrack, the timeline touch of the rusted Ecto-1 from Ghostbusters 3 fiercely facing the Marshmallow Man, the amazing sound effects. I mean, this is actually very enjoyable. Thank you Ghost Corps, Pixomondo, PlayStation Studios and Epic Games for this incredible project; you can be sure it will be rewarded with plenty of support from Ghostbusters fans, whatever your final product(s) turn out to be.
Okay, I think everyone is missing the point here. The important part of this is "real-time." They're able to get high-quality effects in camera on the sound stage without waiting for a render, which can take days or weeks depending on the length of the shot and can still require compositing.
That said, I don't think they're going to use those real-time effects in a movie. First of all, I don't think you're seeing the real-time output here: if you look at the screens on the sound stage, they seem a little rougher than what we were shown at the start. I suspect this was put through a small render pass to sweeten the textures, effects and lighting. And even that isn't a finished scene yet. They'd need to put real people in the scene; those digital stick figures just ain't gonna fly on the big screen. There are very few particle effects, no dust or paper or debris flying. It's just not a movie-quality scene.
That said, real-time graphics have appeared on screen more than a few times, because real-time rendering is used in the Volume, which was created for The Mandalorian and has been used on other shows as well. But those are more or less static backgrounds, and they didn't take up the whole scene; there was usually a human actor and physical set pieces in front of them. Is it possible that real-time graphics could get real enough to make up a final scene? One day, maybe. But it would have to be a carefully constructed scene, one that doesn't betray the unreal-ness of what you're seeing. And I mean "unreal" both in the sense of the technology and in the sense of uncanniness.
Nothing will ever replace real character actors for a film set in a human world. The original set the bar way back then; we miss those days! 💙
I worked in the VFX industry for 5 years and I totally agree (I actually briefly worked on Ghostbusters Afterlife). Practical effects mixed with today's digital comping abilities is a match made in heaven, yet the default is still always CG. It's a sad world we live in today.
Heaven forbid storytelling ever become accessible to the common person. Filmmaking should cost millions of dollars and only be accessible to the elite.
Calm down boomer
@@A_Distant_Life
I am sorry, but what?
Storytelling is already accessible to everyone. Don't mix up storytelling with million-dollar CGI parties...
@@heckensteiner4713 The point of this test was to help VFX artists and actors get an idea of what they'll be doing, so they can limit the amount of reshooting, because that's expensive.
Ray or somebody’s been keeping the Ecto-1’s engine in great condition!
Ha ha, glad I'm not the only one who said out loud that Ecto would not drive like that.
Yep lol😂
Pretty sure that's Winston. ;)
Certainly we're not supposed to accept this as looking real...
It's bloody test footage, mate.
looks more like a game to me
Looks like a superbowl ad
Because it was made in Unreal Engine, which is usually used for video games.
Wow. I remember a few papers about the realtime rendering concept 15 years ago. They thought the technology for doing it this well was a couple of decades away. Got it in a few years early. Nice job.
you could say it was indeed a decade away
lol, the Marshmallow Man said "hmm, nah, I'll get that better car"
bro bouta play with it like a toy car😭🙏
The Stay Puft looks very, very similar to the Mini-Pufts.
That's exactly what I was thinking because he wasn't wearing his "uniform", only his hat.
"If there's something weird and it don't look good...." (Face Palm)
Wow, a videogame cutscene. The future is now!!
THIS IS WHAT WE WANT!!! A fully realized, sandbox style game where we can drive around NYC, NJ, maybe even upstate and PA as the Ghostbusters. Taking jobs in businesses (jewelry stores), subways, restaurants, haunted mansions, Central Park, amusement parks (Coney Island), and sports arenas/stadiums would be wild. Even constructing and eventually piloting Ecto-2 would be epic.
You know this isn’t a game trailer and a clip from an actual movie that will come out… right?
@@charliekill88 Shaddup!
My siblings and I lived and breathed Ghost Busters in the 1980's. Kids were drinking Ecto-Cooler.
So, I should note that a) this wasn’t for a new Ghostbusters movie at all… it was a stand-alone proof of concept. And b) the output of this process is never what would end up on screen. This is just a way to get a faster, on-set VFX capture that can be fully visualized as it is shot. Once all that was done, it would still go to post-production for performance, animation and timing tweaks, as well as some additional compositing, then would get a full render with ray-tracing (which Unreal’s live render engine is currently incapable of). The shots would then be mixed with (and matched to) actual shot footage, not ever delivered as a full, humanless scene like this.
In short, there’s no fear that movies will ever look like this. It’s just a method for making the VFX pipeline faster and more connected to principal photography.
Yeah people will believe anything they see on YT without researching it whatsoever!
Yeah, but they're not pitching this as previz or volume-capture material. As a proof of concept it's clearly intended to say: this is what we can do, final pixel in real time. I work with a lot of volume work, and they don't put this much effort into the output lol
Isn't "real-time raytracing" something they advertise? I see the term used all over their marketing material.
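As a sketch of the pipeline order described a couple of comments up (on-set real-time capture, post tweaks, compositing, a final ray-traced render, then matching to shot footage); the stage wording paraphrases that comment and the shot name is made up for illustration.

```python
# Illustrative only: stages paraphrase the comment above; "ecto1_chase_010" is invented.

PIPELINE = [
    "on-set real-time capture (scene fully visualized while it is shot)",
    "post: performance, animation and timing tweaks",
    "post: additional compositing",
    "final offline render with ray tracing",
    "match and mix with principal-photography plates",
]

def run(shot: str) -> None:
    for i, stage in enumerate(PIPELINE, start=1):
        print(f"{shot}: stage {i}/{len(PIPELINE)} -> {stage}")

run("ecto1_chase_010")
```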
It looks almost real. There are things about colour, contrast and physics that the thing just doesn't get right but it's bloody good!
This is literally why there are strikes going on right now.
Because of video games?! Mk.
It looked a lot like the UE 5 demo, and then they told us that is what it is.... with a twist!
I know this is more-so a tool for filmmakers but this looks simultaneously terrifying and amazing.
This is the thrill-a-minute, action-packed, special effects extravaganza I expect from Ghostbusters, without all the boring dialogue and outdated attempts at comedy. You've done it again, Sony.
I like how puff just looks around and drops the car like oh sorry 😂 1:13
So this is basically some form of HD layout? Or test-running an effects package in a real-time environment? I checked the link and it's in German; for whatever reason I don't have translation options available. There's a video embedded that explains the Stay Puft effect in detail, but I'm still a bit confused about what else is real time. Maybe some of the camera effects, and I guess the vehicle rig was moved around in real time? The video's name is "SCC - Visual effects using real time technology".
I'm still a bit puzzled, because real-time rig setups to preview mocap have been used many times before. But I'm guessing the 'tech demo' stuff here is the traffic sim, lighting, and fluid FX in real time? In the sizzle reel there's still an artist on set making adjustments to animation, just like on any regular mocap stage, so that stuff still takes 'time' even if it's closer to real time.
I'm not really sure how that would be used. I guess you'd use all this as some kind of springboard for the final product? In the demo it doesn't look very dynamic, especially the camera angles, so I'm just surprised they're trying to pass it off as some big, interesting innovation.
BASICALLY, any mocap you see on screen is highly, HIGHLY adjusted by animators, to the point of the mocap basically becoming a base for the final product after many, many iterations. Real-time performance capture is already a thing, so I still don't get why they did all this just to have the demo have... nice lighting? And some fluid that doesn't really look too great?
If you want to be a cameraman, AI took that too as far as Sony is concerned
3D animated series have been made for at least the last 25 years; after Pixar's Toy Story, everyone went into 3D animation. I have been using Unreal and Unity for the last 10 years, and I started with 3D in 1990. The question is what the production is aiming for. Today I work entirely in real time; I am not rendering anymore. The possibilities that real-time engines like Unreal provide are huge, because you can work almost like in the real world, only in a virtual one, without waiting hours for some usable results. But what makes me think here is the description: is it about VFX, or about a "new" production workflow for whole films and series?
Incredible that it is real time. CGI takes are now possible. The title is confusing though, as you still need to get the shots with a camera, even if it is virtual.
Might be fine for shorts. Reminds me of many of the Love Death and Robots clips, some of them go for this type of realism
This misses the spirit of Ghostbusters. It feels like a Fast and Furious movie.
Ecto 1 is family
This is a proof of concept rendering for Jason and the creators to work from. This is not the finished final edited cut of the scene. I think there will be so much more to film than just chases thru city streets.
@@neo529 Damn straight.
ecto1: speed i'm speed
ecto1: clear the path i have to hurry
RIP mini big puft
Please use this technology for something other than a movie
They are. This was a tech demo for Playstation.
@@Geeksmithing I hope so. Then we'd finally get an open-world GB game.
@@trealsteve that would be a ton of fun. GTA but Ghostbusters!
@@Geeksmithing 💯
We're getting closer. But we still need cameras and real people for movies.
This looks awesome
Why is everyone using the same Unreal Matrix demo city scene? I appreciate the work and art direction, but this wouldn't hold up as a sequence cut into a live-action film. It still has video game quality. Just because you can render the whole sequence in a day doesn't mean you should. All thoughts aside, still a fun short sequence though :)
I thought the exact same thing
Excellent work by all involved. I wonder if human characters were avoided to prevent the uncanny valley effect. Some parts appeared AI-generated, but for real time it's all amazing. I wonder what the result would be if they put all the settings on Epic and rendered non-real-time to a file in Unreal.
This is what is called pre-vis. It is intended only for the director to see, so they can line up the shots they will require. Pre-vis is the modern-day version of storyboards. This was never meant to be enjoyed as an actual finished product.
@@FablestoneSeries Wow, IDK. I know what pre-vis is, of course, although none of the productions I've ever worked on had a budget for it. What I didn't know is that they did so much work and spent so much just for pre-vis alone; the budget for the movie must be limitless. But this explains leaving out the humans for now.
Still curious about the Epic settings and using Unreal to do non-real-time rendering. Because hey, why not.
@@MaraldBes It is getting so fast and cheap to do this level of pre-vis in Unreal Engine. Most pre-vis doesn't even add music.
would be cool if this was an early 90s continuation of The Real Ghostbusters cartoon
Bro why is there a naked marshmallow walking around the city...
This SHOULD be the opening cutscene to a 4 player co-op Ghostbusters game…
It seems like this is an "in your face" to the people striking.
This is amazing
Ecto-1 driving to the scene gives me a Christine vibe. Anybody who saw the movie Christine, based on the Stephen King novel, knows what I mean.
Yeah, I was thinking myself that Ecto-1 looked like it was doing this itself, especially as you don't see a driver.
Me in GTA when I find the fast car:
If it's real-time rendering... it could be fun if they found a way to have some games where a few daily boss fights had a guy in a mo-cap suit reacting to players. I know there are a lot of issues with scaling that, but I think it would be funny if someone managed it. You'd need to hire a lot of people to be on standby in mo-cap suits for instancing.
This looks more like a really good pre-vis
This ecto-1 is such a fast car for how old it is
Wow, 95% of the comments don't even get what this is... That's exactly why the world gets dumber and dumber every day...
I wonder what the next Ghostbusters movie will be called?!
*Shows a trailer void of emotional impact* "Our goal at Sony is to fill the world with emotion..."
Looks very good.
It's tough to film in NY. This is a good solution for that. It shouldn't be used for everything, but this is a pretty good idea.
The tires of the taxi that gets hit look like they're floating over the road rather than carrying a heavy taxi.
I thought there had been a mix-up, as it looked like a computer game. The Stay Puft Marshmallow Man looked great, probably the only thing that was "cinema quality." But it is only a "proof of concept."
I was expecting it to say, "I am Groot!"
I wish Sony would make a sequel to Ghostbusters The Videogame 2009. Or even remake the original in this engine.
We are so close to photorealistic. This is still slightly uncanny (not taking anything away from it; it's technically the best I've ever seen). It's going to be scary when we can't distinguish life from fake. Scary times ahead, I'm afraid.
Dear Sony,
here are some topic recommendations for a stunning proof of concept:
* Gaussian splatting
* NeRF
* Photogrammetry
* Nanite
* Lumen
Looks like a great sequel to the Ghostbusters Playstation 3 game!
I guess one would shoot against a "draft mode" in real time, then follow that up with a "quality mode" re-render (and some consequent re-editing).
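That two-pass idea can be pictured as two presets applied to the same shot. The setting names and values below are assumptions for illustration, not a real engine's options.

```python
# Illustrative "draft vs. quality" presets; keys and values are invented, not real engine settings.

PRESETS = {
    "draft": {                     # must hold frame rate while the director iterates on set
        "resolution_scale": 0.7,
        "ray_tracing": False,
        "samples_per_pixel": 1,
    },
    "quality": {                   # offline re-render once the edit is locked
        "resolution_scale": 1.0,
        "ray_tracing": True,
        "samples_per_pixel": 64,
    },
}

def configure(shot: dict, mode: str) -> dict:
    # Return a copy of the shot with the chosen preset applied.
    return {**shot, **PRESETS[mode], "mode": mode}

on_set = configure({"name": "staypuft_vs_ecto1"}, "draft")
final = configure({"name": "staypuft_vs_ecto1"}, "quality")
print(on_set["ray_tracing"], final["ray_tracing"])   # False True
```

The same scene data feeds both passes; only the render budget changes between the take on the stage and the overnight re-render.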
This is an animated movie. If that's what you want, then great. But this is in no way going to replace live action filmmaking. Very videogamey.
It's literally from a video game engine. This clickbait title is clickbait.
Still a long way to go with the effects, though it's getting there... For me it was Stay Puft: in the original film, when he walked, you had reactive effects like fire hydrants exploding or cracks spreading along the road, giving weight to the character. That seemed missing in this sequence.
I want to hear this version of the theme song. Sounds soooo good.
Looks like a video game cutscene
I think the implications of this are pretty clear. Sony released this while in a labor dispute with actors and writers, to show what they are capable of using the Unreal video game engine.
It sends a clear message: get off the picket lines and back to work, or next time we're going to figure out how to do photorealistic people in the Unreal engine.
No. Give the industry employees a raise and screw your stockholders. The strike must continue, and fans should not watch any new productions until fair wages are achieved.
@grimacres It's not about the shareholders. It's about what was and will be no more.
For actors, they want to renegotiate contracts they signed that they are no longer happy with, and it generally revolves around streaming. When they made a TV show in, say, 2002, they were paid for the work and offered a residual for syndication. These contracts existed before Hulu, Disney+, Netflix, etc. Should they get paid for something not spelled out in their contracts? Most competent people would say no.
If you said yes, then how far back would you go, and who would you go after? Every show made over the past 60 or 70 years? Everybody knows the big streaming companies, but what about the little ones? The Roku channel store has dozens of little streamers that charge very little or are completely ad-supported and stream niche content such as old westerns. Should actors go after these guys too for not paying a residual when they stream Gunsmoke or Bonanza or even Zorro?
Another actor complaint is the number of episodes they make and the breaks between seasons. Network shows and established actors get their 20+ season shows like Law & Order: SVU, Blue Bloods, The Rookie, etc. When you act on a streaming show, you only get six, maybe eight episodes, and you're never guaranteed the role will come around again. If it does, it might be two years later. Should they get paid more for working less than, say, an established star who worked hard and earned that 20-episode role, which might have taken them 5 to 10 years to get?
The writers' complaint is similar. Established writers get longer episode runs and make more money. New writers working on streaming projects get paid, but their jobs may only last months, whereas established writers on shows like Law & Order or Blue Bloods work pretty much year-round to make those shows. Writers on these short-term assignments are calling the gigs mini-rooms instead of writers' rooms.
Are the actors and writers right or wrong? I don't know. I do know they took the wrong course of action. Striking never makes any sense, no matter the industry; it just emboldens the other side to dig in their heels and be less cooperative.
Hollywood can sit and wait for the desperation of not getting paid to sink in.
I'm not going to pay money to watch a 2 hour video game cutscene in the theater
@hokemoseley2934 That's not what they were trying to do. They were trying to show you that Unreal Engine technology has advanced to the point of photorealistic environments. Essentially, they could do a car chase without having to wreck any cars or endanger human life.
I believe the next step would be to create a believable digital actor that can get out of the car and fool the audience by moving around and delivering lines.
If people find this compelling, OK, but it's just slick bullshit for those without nuance. Human-first movies will always be better. These are just special effects, vacuous and corny; it's like a really dramatic Skip Intro.
This ought to be a video game instead, perhaps one with brand-new, movie-quality writing and cast.
Perhaps a midquel between 2 & Afterlife.
would be great to see one next year so it ties in with the 40th anniversary
I feel like I'm watching a car commercial
you really hit the uncanny valley when you see humans
I won't pay for this level of quality as a movie in a cinema-hall experience. As a gaming standard, it is what it is as an achievement. It would be best as an online streaming release, and there it would be a remarkable breakthrough in entertainment technology.
Rendering humans 100% realistically will always be the real litmus test
Oh my mistake, I thought this was a BTS of the production for a new Ghostbusters *video game*, my bad...
Is this better than what Jim did for Avatar?
Just kind of looks like a lot of the high end unreal 5 demos honestly.
Reminds me of a video game, and then it got me thinking: what if they had some first-person shots? Like Act of Valor but Ghostbusters. The new-era Ghostbuster kids could take it to the next level and train like SEALs, with special equipment, etc. Raids would be spooky.
Looks like CGI from the early 2010s
What? How can you not appreciate the technology?
CGI like this would have taken weeks or months to render back in 2010. This is real time. It's a big deal.
@@jess_n_atx It's good for real time. I guess this is more for the artists and actors making the film and less about what people will see in the final product; it makes on-the-fly changes easier and all that.
Haven't Ninja Theory been doing this for years now...?
I know it's just a demo, but I'm going to be honest: this looks like a cable TV commercial in terms of production quality.
Surf's Up was made like this in 2006-2007.
If you don't want to make movies, just say so. This is a little better than some amateur stuff I've seen, but only a little. It certainly is WAY below a real film. There is so much missing, and so much that is "not right." Like I said, if you don't want to make movies, just say so, and get out of the way of people who do want to make movies. There are many, many of them.
This is what is called pre-vis. It is intended only for the director to see, so they can line up the shots they will require. Pre-vis is the modern-day version of storyboards. This was never meant to be enjoyed as an actual finished product.
Not a fan of the new Ghostbusters movies but this is cool. I prefer the original cast.
Kewl. You get to see a video game version. But this is not final production quality, unless they’re making a cartoon version.
Nothing to do with a new movie. It's clearly stated that it was designed and created to push real-time graphics tech, i.e. in-game graphics. If it's anything more than that, it's related to an unannounced game, not a movie.
Apparently, it was just a fun little project that Jason did to let fans know that Ghostbusters is alive and well and to help tide us over till the next movie is released. There is talk of some games coming out in the future as well, however.
This uses the same assets as that Spider-Man 2 trailer we had a while back.
I think the effects in the original look better 😅 This could get more convincing over time, but there are some existential hurdles as well, such as the jobs and the missing "human element." I understand having an artistic vision and wanting it shaped perfectly, but I don't think any human is capable of micromanaging every detail. AI will be able to fill in those details, but will they make sense and feel real? And couldn't the details needed for realism just be added by capturing real objects and actors instead of generating everything? If you want to represent elements of the real world in a movie, you may as well use the real world when you can. In many cases, actors being filmed add character or expression that wasn't originally intended but works best for the story. This tech is really cool, but I don't think it should always be a replacement for real "filming," in the same way cartoons don't replace live action and vice versa; they should co-exist. These media companies are so greedy, though, that I think they want to avoid paying anyone they cannot completely control. It seems they want to separate IPs from real people, so the IPs can live on while the people who make it all possible can be replaced at any time. It's very concerning.
You don't know sh*t.
This is what is called pre-vis. It is intended only for the director to see, so they can line up the shots they will require. Pre-vis is the modern-day version of storyboards. This was never meant to be enjoyed as an actual finished product.
Good, but it still needs some work. For example, the Marshmallow Man moves too fast and too nimbly for its size.
I hope they don't try to insult us by passing that off as a movie. I've seen better CGI graphics on a Nintendo.
I don't know which needs more time or money: realizing a 'proof of concept' like this, or doing it the traditional way with a bit of imagination about what the film/shot could look like...
Dan Aykroyd's dream car. It's got some legs!
Looks like the studios they used to film the Horizon Forbidden West cutscenes.