I feel like a big part of these UE showcases that gets overlooked is what these features mean beyond the little box under your TV. Coming from a VFX and Visualization background, these things allow us to get closer to certain looks at a fraction of the offline costs. Which means that things like accumulation times etc aren't as big of a concern when we're looking at 1frame per second vs 1 frame per 1 hour. It also allows smaller teams to operate far more flexible and iterative if the production allows for it. So while I understand the disappointment of the actual performance, I think it's important to understand that UE isn't just for games anymore. We do a lot of things with it, and MegaLights is just one of those tools to get us in the ballpark of offline rendering while still leveraging realtime feedback.
THIS! A lot of tv shows use unreal to accelerate VFX work these days. I saw a video demonstrating how the show Superman & Lois uses it and now I'm convinced unreal in tandem with some very dedicated VFX artists is the reason that show's CGI looks as good as it does
For me as game dev the tech is not the problem, but being confronted with non game-devs expecting games to make use of the new tech is the hard thing. This creates unnecessary pressure.
True, but it's kind of lame to use a PS5 to advertise when it's so prohibitively expensive for a console's rendering budget that it's effectively useless.
Exactly! UE is becoming an absolute powerhouse in the VFX industry right now and I'm all here for it. As far as the gaming industry goes; games need to be optimized for consoles, and so developers are held back by the processing power of consoles. Rockstar Games would be 10 years ahead if they could optimize GTA6 for a 4090 card, and that's sadly not possible.
@@oktusprime3637 Except Megalights actually _improves_ lighting performance, even on low to mid-range cards you can get performance boosts of potentially double the frames per second. This isn't like Nanite where the performance improvements are more conditional (for the most part, anyway), Megalights actually allows for thousands of shadow-casting lights while basically having the same performance as one with almost no caveats aside from a couple of visual issues in regards to shadows.
My biggest gripe with these technologies is how bad they look in motion. The temporal accumulation methods that make RT viable right now creates so much ghosting and smudging. I feel like we'll get to a more acceptable image quality eventually, but it's going to take work on many fronts to solve it, chief of which is more rays, better sampling, and better denoising.
@@griffin5734 agree, wish it wasn´t that way but sadly for now it is true, it may look real in still frames but in motion to many filters and "solutions" you end up loosing a lot of detail.
That's a bit of an unfair comparison since DLAA is much higher quality than DLSS quality which has less than half the total resolution to work with than DLAA. Of course to gain image quality you have to give up performance.
I think it's interesting how many potentially Paradigm shifting technologies have come out over the decades that just died off yet Ray tracing and Improvement of video game lighting it's still going would we even have Ray tracing without nvidia's introduction of RTX
Matrix Demo wasn't all about hardware lumen, it was just about Lumen. In general, new UE5 rendering features are shipping in games by now, but triple A games take a while to develop. Black Myth Wukong is a great example of next gen game that shipped with many of these new rendering features.
@8:11 it's clear the lights in the front are BRIGHTER in this scene than the ones in the back, and in the PC screenshot, its a different lighting scheme where all the orbs have increased brightness. It's still possible the noise limitation is affecting the visibility of specular as you suggest, but it's worth noting these are not the same lighting scheme.
yup my thoughts exactly, in the shot at 7:50 the lights toward the back are simply dimmer hence them not giving off as much light + not having strong specular reflections
@@jamesgreen2495 Well. Baked lighting will always be better in a static lighting environment. So if they'd make more use of dynamic lights then, yes. (I recall the cat having a flashlight at some point, so that would benefit for sure)
I am happy to do architectural visualization with UE5 😁 this means I will leverage all of the latest and greates features a lot earlier than publicly available games.
Mega-Lights are probably a life saver for a very specific type of games (as well as VFX probably), Sandbox games. I think Sandbox games will benefit from Mega-Lights since it's almost impossible to prevent players to place thousands of lights in their world, most UE sandbox games currently have a forced cutoff of lights in the distance, sometimes for lights less than 50m away... With Mega-Lights that cutoff can be expanded to 200m or more, and it allows for more detailed lighting and things like screens properly emitting lights in more "modern" type sandboxes. It's stunning!
i dont really care for any unreal engine features as long as almost all the games made on them stutter. and while yes it may be a cost cut from devs or a knowledge deficit from devs on how to make the engine work better. epic should put more work into making sure games dont stutter on their engine on pc. be it by making sure devs know how or even better make devs dont need to know how. the best showcase from my pov would them show some tech that smoothes out rendering spikes
100% THIS. I see so many ue defenders saying " but its the devs fault" is it the devs fault when 95% of UE5 games have the same stupid stuttering, and eratic performance issues? With very few exceptions, UE5 games ALL have performance issues. The tech is amazing but its useless for such a terribly performing engine. Make it known HOW to fix the issues, or FIX the issues, the blame is almost 100% on EPIC
UE5 is the first engine to implement revolutionary technology. And the creator of these technologies said it themselves that it’s gonna take some time till the engine solicited. As of now, already released UE5 games are using old version of UE5, so they don’t properly utilize the new crucial features in UE5.
@@Vartazian360 epic created revolutionary technologies with UE5. But as always with introductions to new technologies, there is always gonna be a lot to fix. The creator of nanite made a present on his journey to creating this technology and as of the day that nanite was launched, there will still be a huge amount of things to work on with it. But that was like 4 ago already, UE5 had been fixing this issues, and in the next few years we’ll be seeing games that are leveraging UE5 to what it was meant to do. UE5 games now are just using old versions of UE5
One thing you guys are not realizing, this isn't just for games. Clearly they are aiming at VFX, which I'm a part of. This to us is a huge update that we can utilize immediately. A lot of us in the industry who use UE for vfx are drooling at this.
Fortnite IS the most important product for Epic and the first place to introduce or test all they features in Real condition. The first EU5 (beta 5, and 5.0) game was Fornite if you remember
7:36 hm I see a specular reflection vor every light in that time where it's bright enough to also get some bloom. You don't see reflektion of the back light in the inner circle because of the angle. Its reflection in placed on the dark ring on the ground. In fact on the next inner white slim ring you see specular reflections on the back lights too. EDIT: i just saw the PC screenshot. On that shot all the light are brigther, getting some some bloom and therefore are in the range where the specular reflection occur, as observed with the PS5 where the lights have to be bright enough ( in bloom range ) to make specular reflection on the ground.
While these new features are cool I'm sure everyone would prefer they did something to mitigate the horrendous stutter that's almost ubiquitous in UE5 games. Sure it might not be all on Epic as not all games have it (Hellblade 2), or have it at an acceptably rare frequency (Wukong), but it's clearly something they need to help devs sort out because it's getting ridiculous and giving their engine a bad reputation.
@@gameguy301 fingers crossed, but I think traversal stutter is even more egregious as most games have (thankfully) implemented shader compilation on launch. Have they done anything in regards to that? It's not only a UE issue though tbf, DS remake used Frostbite and was absolutely terrible in that regard.
Unreal Engine is trash... All these games run like crap and the graphics look worse with low res textures with stuttering... I see a pattern happening doesn't look good. Won't even bother buying anymore games that use Unreal engine they're all garbage. Even this trailer in the video looks like Fortnite graphics but run at 20fps.
@@SemperValor Lol. Unreal is one of the most advanced engines out there, if not the most advanced one. It has issues, sure. But "trash"? Yeah, you don't know what you are talking about. Tell the devs to optimize more, don't blame the engine like a dumbass.
@@SemperValor Unreal Engine is not trash just because bad developers don't use it properly for their games. Most stuttering/hitching is because of DirectX 12 and has also been a problem in other engines such as Frostbite. You can easily mitigate stuttering/hitching in DX12 by precompiling shaders as well as providing pso cache files, Epic has this in their documentation. Alternatively you can just use DX11 which has way less of the problems that DX12 has
I could see Cyberpunk 2 using this stuff as they move to UE5. It’s really intriguing stuff. With emerging technologies like this, I think anyone who claims we’re hitting diminishing returns on visuals in gaming are going to once again be mistaken (we hear this every gen, and we always surpass it). It really feels like we’re early stages into the next leap in graphics. And we just mignt get there.
I'd assume that Cyberpunk 2 will use Nvidia's UE5 branch, as delivering anything but path tracing (at least on the PC) would be a kinda ridiculous regression. Mega Lights, Virtual Shadow Maps, Lumen, and so on are impressive technologies, but at the end of the day they are just trying to recreate what path tracing is doing out of the box in higher quality.
@@FunctionalBreadMachine somehow i finished it on my old 2016 pc with gtx 1070, it was actually playable on the lowest setting with crashes here and there.
@@Emulator_Crossing Oh I believe it. Seems to be decently scalable. It's just extremely taxing. And I'm curious to see what the digital foundry Bois are cooking up. I'm loving the game, even with the performance issues I'd recommend buying it.
What’s the difference between this and ray tracing? Is this meant to be used together or separate with lumen? Or is it meant to be a way to increase the “rays” similar to path tracing?
"What’s the difference between this and ray tracing?" There's no difference because this _is_ raytracing. The current definition of raytracing in the video game industry pretty much encompasses any and all lighting techniques that simulate light as physical rays traveling through the scene, that don't rely solely on screen-space information (since if it could, screen-space reflections could be considered raytracing as you trace a ray through the screen-space depth buffer). This technique fits that definition, as it tries to solve direct lighting by tracing rays through the scene towards nearby light sources, using clever tricks and optimisations to be able to do so more efficiently. "Is this meant to be used together or separate with lumen?" Together. Lumen tries to solve indirect lighting by building a structure of probes and sampling ambient light from those probes by tracing rays outward from the probes, so Lumen is tackling a different aspect of lighting than this technique is. This technique could probably replace Lumen's area lighting functionality as this technique can support area lights with much higher quality, but the rest of Lumen can stay. "Or is it meant to be a way to increase the “rays” similar to path tracing?" Path tracing would replace both this _and_ Lumen. Path tracing is a specific lighting technique that solves all aspects of lighting (direct lighting, indirect lighting, reflections, refraction, subsurface scattering, etc) within a single unified pass, so path tracing would replace both this and Lumen as path tracing can support efficient direct lighting through ReSTIR, and it naturally solves indirect lighting just due to how it works.
@@jcm2606 Thanks for the great explanation! This was easy to follow and helped me understand the concept well, it was alway pretty confusing before lol. Looking foward to see how developers implement these different techniques.
If it's not fast enough for games, maybe for low budget films and series? Compared to rendering stuff in blender with cycles, this seems much more efficient and the PC screen shot looks fantastic.
Honestly I’d lay the blame the lack of hardware lumen on console squarely at AMD’s feet. It’ll be interesting to see if one of the PS5 Pro enhancements is hardware lumen on future titles.
these kinds of technology's are cutting edge, & honestly its crazy that you think current gen consoles WOULD EVER truly be able to handle this tech with a $500 price tag......... It has nothing to do with amd, consoles are designed to be cheap with decent gaming performance!!
It's not an AMD thing. Lumen needed a lot of additional optimization. In this Unreal Fest talk they say 60fps hardware lumen on consoles is doable in 5.5: ua-cam.com/users/lives1qdbJtjUI0?si=R_6eEqBqI5mgoSGM&t=5696
@@yegoat05 Yep, not sure even how the ps6 era will handle it in full aaa game with all the of npcs, physics, ray tracing and ai at 4k and 60fps at once.
@@Emulator_Crossingdo you even understand the entire point of these technologies? Can you not comprehend the fact that without these technologies, traditional method would be way to hardware intensive for console hardware to run anything near this. These technologies are made to make rendering easier both hardware and developers.
The Matrix demo had features like draw distance that didn't fade at a certain distance until it was too far away to see it, infinite resolution textures(? other demos I can't remember the names of, they promised a higher level of Nanite, procedural animations, water physics, etc. Many of those features were lost or came with too much downgrade.
@@roar104 These consoles are aimed at being 4K consoles. It's up to the developers to do better and target higher resolutions. The culprit here as they age is Unreal Engine 5, which massively overshot what these machines can do. Looking at first party Sony titles, you can get ~1440p and 40FPS even with a little ray tracing, and they look amazing and feel great to play. I would say 1440p is a decent target compromise resolution. Looking at badly built UE5 games you get 1080p on a good day with bad upscaling and shonky sub 30FPS framerate. That's not acceptable.
@@pgr3290 they're aimed at outputting 4k via a dynamic internal resolution on most games, not actually being 4k. Nowhere near good enough hardware for proper 4k. They know console users mostly don't know the difference so they can slap it on and people will buy it. That's not to say UE5 isn't horribly optimized with bad features like nanite on top of that though.
@@roar104 The developer determines how the hardware is used. The hardware is good enough for native 4K and 60FPS. Gran Turismo 7 for example. It's probably not the ideal target for most games intended for those machines, however 1080p is poor for the hardware and upscaling does not cut it. If they had DLSS that's another story, but they missed out on machine learning which has always been a big win for PC the last five years
@@pgr3290 " The hardware is good enough for native 4K and 60FPS" LMFAO even last gen games like Shdow of the Tomb Raider run 1920 * 2160 ( Checkerboard 4K ) and still drops into 50s. The only other option is 1080p for stable 60FPS. How delulu do you have to be to think games can actually do native 4K on this machine lol. Even Sony First party titles run 4K only at 30FPS ( maybe 40 in case of Uncharted legacy of thieves collection )
A game like the upcoming Routine would certainly benefit from Mega Lights given that it's a game that aiming for a photorealistic visual style with its horror sci-fi setting.
These kinds of videos serve only as marketing/publicity/awareness for whichever company's new product. Those CEO's couldn't be happier with the state of affairs where gaming/tech "news" sites/"creators" are just an extension of marketing announcements.
Its all cool, but devs really need to go back to games like half life 2, fear, deus ex...to realize that you not need to invest soo much money, time and resource on cutcenes/voice actors/trailers. just need good level design, story and pacing without stopping the player every 10 min.
All three of those games are PC games. The AAA PC industry died with the Great Consolization of 2008. I would love a AAA PC gaming renaissance, but as long as AAA devs conceive, design, and code only for console HW and the sensibilities and preferences of the console market, the best you and I can hope for is something like DOOM 2016. Which as good as it is, is nothing like Deus Ex, Doom 3, Half-Life 2, F.E.A.R., Dark Messiah of Might and Magic, BioShock**, Crysis, or I'm sure lots of other PC games that I look forward to playing for the first time. ** BioShock is interestingly a console game, but it is so well designed, in all aspects, and plays so well on PC, that I didn't realize it was a console game until several years after it released.
@@bricaaron3978 my fav in the series is bioshock 3 + dlcs, just love the story with elizabeth, and the all floating city fever dream vibe, even tough the game feels short like it was cut in the middle. nothing like this since.
Developers think that a game needs to be cutting edge to be AAA. If it gets more than 60fps on a 4060 at 1080p or 60fps on 4090 at 4k then.. they didn't do a good job. We don't need incredible graphics. Art style can go so far in a game. I thought Ghost of Sushi was quite beautiful and the graphics were absolutely very convincing for a AAA games and yet it runs better than most games in the last couple of years. Lazy devs or is Nvidia pushing heavy engines more than they should?
@@Emulator_Crossing Yes, I love playing through all three BioShock games. Infinite is my second favorite --- BioShock will always be my favorite. Believe it or not, I've played Infinite twice so far, but I haven't played the expansions yet, so I have those to look forward to.
Most of these techs you see with UE5 on PS5 have been present in the Decima Engine in some shape or form. I feel like Decima could become a general Game Engine for PS studios that don't use their own.
Thank you so much for this. I've been using lumens on a lot of big vfx projects. But I am definantly going to reload up some of those worlds with RTXDI and see how they fair in the render queue.
Going from a genuine high frame rate experience at 120-240fps down to 60 is disappointing, no matter the image quality bump. Hope they continue to push optimizations too.
That Matrix demo ran on PS5. So, what's the issue? One would guess there would be development happening during this time, and it would be viable to make a game with it. I'd accept 30 fps if my game looked like that on my console.
@@kanta32100 LMFAO no. Frame gen absolutely doesn't recommend 30FPS as baseline. It'd be horrible in both latency and visuals. It is much more useful for 45 to 90FPS or 60 to 120
Projector lights or whatever you whant to call them are a feature since ue4 called rect lights. Thats nothing new. The fact that they lifted the self imposed limit of 4 dynamic lights is amazing, really groundbreaking, almost worthy of DF to do a deep dive on (in?) but im guessing these mega lights are megaBS as usual, probably rendered in a separate pass and then composited together with the rest of the scene. As the guys noticed theres heavy time slicing and probably upsampling going on "behind the scene". I remember when deferred rendering started to pick up and how it promised huge numbers of dynamic lights but god forbid you enable shadows on them. This is probably similar, god forbid you have more than 30 FPS...
This sounds neat and all. However, it means fuck all if developers don't put the time and effort into optimization. There is ZERO excuse for doing the bare minimum. It seems nowadays AI upscaling tech is being used a crutch to make up for the poor performance at a baseline level.
i remember when the technology was announced everybody was excited thinking that it would help us achieve playable 4k at high frames and good graphics, instead we need it to berely play at 4k 30fps with stuttering
If graphics card companies didn’t hamstring their cards with VRAM for so long, maybe they could progress games faster. Now developers are playing catch-up because cards are getting faster. Games take years development so they expecting cards to get faster, but they aren’t except for the fastest cards. Everything below the 4070Ti is the same as two years ago. By optimization, you mean “make it look worse”.
Well no, hardware simply cannot keep up with the progress and technology. I've been saying this for years since 1997, full RT, and bounce path tracing etc will never become viable in real time as its just too expensive. That's why the most powerful computers, even today take hours/days to render a CGI movie using these things.
What about PS5 Pro with better Raytracing hardware, 45% better GPU performance and PSSR, it would benefit from this UE5 technology? I believe what we have seen until now for the PS5 Pro is was just current games improvement no games built for it. It was the same when PS4 pro and PS5 was lunched.
Nah, The Pro is a mid range GPU, 7700/7800 XT raster with 4070 ray tracing capabilities. The things they’re demo’ing here are going to require far more powerful hardware to pull off without shortcuts , not to mention build a game around which will take years . As they pointed out, there were several shortcuts taken in the demo that were hidden in the video feed at 1080. This certainly has next gen potential though.
Before the PS5 was released, it brought with it many expectations. Although this generation of machines can now read roughly 100 times faster than in the HDD era (said Mark Cerny), the actual difference is not as great as expressed in the numbers. I'm looking forward to seeing what Naughty Dog comes up with next, but they are still asleep at the wheel.
They got rid of all their actual creators and replaced them with feminist/gay activists. Not a recipe for success. They didn’t move to UE5 because it would be better, they did because I honestly think they kicked out all the dudes that actually know how to code an engine
Not sure if you guys read the comments here but we shipped Funko Fusion with all the bells and whistles and a 60fps mode that is fairly consistent on console.
@@AdamKiraly_3d I figured as much, I was just confused as to how the comment relased to the video. Now I see. But their analysis is still correct, they said it was rare not impossible. I'm glad you targeted 60fps, and I'm curious what sacrifices you made to achieve it. Internal resolution? High down sample factor on certain lumen effects? Regardless, good job. I'm currently working on a fork of UE5 to get better performance out of it.
I accept 30fps and 1080p for this quality. (This is why it's a bullshiʇ that there's no need for Performance/Quality Modes on PS5 Pro. No matter how powerful the hardware is, you can still sacrifice fps for quality.)
Looks a shit ton like LTCGI people use inside VRChat Worlds which is unity 2022 and has almost no performance impact at 90 FPS inside VR (Aka rendering each frame 3 times)
@@Mitchell-p4h These nerds have experience and insights about what they are saying. Which is kinda the OP thing about being a nerd in gaming and tech industry.
UE5 does not provide a good enough quality/performance level for current gen. Look at the best looking games and they all have something in common. Forbidden West Red Read Redemption 2 Battlefield 1 Arkham Knight Space Marine 2 Crysis 3 Last of Us 2 Demon Souls remake RE4 remake Doom Eternal Baked lighting
Crysis isn't baked lighting it uses Cryengine. Battlefield 1 isn't baked lighting it uses it's own engine as well Doom Eternal, who knows it uses it's own engine as well RDR2 isn't baked lighting it uses it's own engine as well Forbidden West isnt' baked lighting it uses it's own engine as well Last of Us 2 don't know, but not on UE5 either.
@@devonmarr9872 No that's not how it works there's no such thing as fake time of day in those games, what even is fake time of day. RDR and cryengine have tod changes you cant use baked lighiting then, neither can you use it on large map environments like Battlefield 1, technically you can but the shadows look garbage at those sizes, battlefield 2 used it, but very low res
This is cool technology thats going to push AAA forward but at the same time I really don't care because I mostly play retro, indies and street fighter 😅.
Sounds like MegaLight and Lumen are a stop-gap before we get to high performance path tracing. When path tracing is more viable (5-10 years?), I imagine those two technologies will no longer be needed.
I feel like DF would do well to maybe talk to some actual tech artists. Because every tech artist I know and/or follow on social media collectively lost their minds about this tech because it makes the amount of teethpulling hacky workarounds to light scenes and maintain performance basically go away completely.
Even if there is stutter, or compromises in other ways, it's still just so nice to see this incredible new tech being released. Things always improve over time, I'm just thankful for all this tech coming out this generation sooner rather than later. Since the PS5/XSX and UE5 launched, I've always said this is a BIG transition period where this tech is new and emerging, but next-gen in a few years is where we really are gonna get to see these techs live in action and greatly improved upon. PS6 I'm sure is going to be very powerful based on leaks/rumors and just comparisons from tech jumps from previous generations, it's very exciting to think about :D
Very little explanation of technology here for a video titled "Technology Explained". If you don't actually even know how it works, don't advertise your video like you do.
unreal is great but you can get into alot of trouble with all the gee wiz stuff. keeping things simple is usually a smarter way through it. alot of the perf issues are devs not making proper compromises with all this stuff and instead going full idiot on graphics because they think they can and maybe shouldnt. or should make more clever compromises and forgo some things. the pressure to look good kills rational thinking about whats holistically good for the whole game. i say this as an artist. often simpler lighting actually looks better btw. adding too many lights muddies the overall reads. again simplicity is often better. this is lost on alot of people.
Another proof of the Epic's UE strength and capability of their SW development magic. It's also another showcase what SW alone can do on most of the HW and that u don't have to pay extra for HW feature like with Nvidia's RTX asking ridiculous price for it also by not having any real competition. I give Epic BIG thumbs up for democratizing expensive feature for everybody even they might fully realize on the next gen consoles and more performant PCs👍👍
You realise that this can use hardware RT, which is what NVIDIA's RTX is, right? This, like Lumen, is just a more efficient way of using and managing rays. The underlying raytracer is a completely separate thing, so you could theoretically plug a voxel raytracer into this and have it work.
@@jcm2606 Yes, it can but it doesn't have to use Nvidia's or other proprietary GPU HW feature to run smoothly... of course dedicated/specific HW based solution will always deliver better results than just a SW one, but that's exactly why I like Epic's decision to not directly promote it and rather use PS5 even it's based on the AMD platform. AMD is clearly not as far as Nvidia with the GPU HW features so it's fair from Epic to show it will work there as well. Overall they sent clear message they've got SW solution which delivers results almost everywhere and that's right way to go for the company like Epic.
these technologies are all well and good and will simplify the development process for any applicable application, but they are not viable for games, at least not in the immediate future. the cost is simply to high. if it cannot run at 30fps at 1080p console hardware will always hold it back. maybe itll be a smooth experience on a pc with a 5090, but not sure who is going to target a game at 0.0001% of the market. lets get back to real improvements. this generation of shoehorning in more and more tech that needs to be multi sampled and upscaled into a blurry laggy nightmare needs to end.
The worse you understand why and for what but them as always you remenber aahh i saw somenthing like this somewhere... in this case Arkhan Knight lmao...
In all fairness, even in preview Megalights substantially improves lighting performance (even using it a minimal amount of lights can result in potentially doubling your FPS, based on the tests I've seen, and 1000 lights can potentially go from 20-30 FPS or less to possibly 100+ FPS, which is wild), and it basically makes it so you can functionally use as many lights as you possibly want in a scene without having to weigh up performance considerations (with one exception, it doesn't work as well when a bunch of lights are heavily overlapping, but that's easily fixed by spacing out lights in your scene design). That's practically the holy grail of lighting. Sure, it has drawbacks visually, but the moment 5.5 comes out of preview I'm turning on Megalights just for the performance boost alone.
y they had to do it on a high end PC cause atleast we would know what it will cost us to have it at a good state,why do u think there is no Path Tracing on Cyberpunk console versions?cause they will tank if they use it the console will prop shut down
@@christonchev9762 even with no ray tracing unreal engine games perform very poorly. I have played dozens of ue games. And apart from very few titles , they all suffer from performance issues. And I've played them on a pc that's way above the recommended system requirements. I like how ue games look , but I hate how they perform. IMO they are good for photo mode. Not actual gameplay.
we can always buy stronger gpus and new consoles but the reality is that i would be happy with the performance of my 4070 TiSuper for a decade or 2.. if games released that refined game engine tech ibstead of just meaninglessly increasing hardware demands for little benefit to image quality. The influence that nvidia has on graphics programming will continue to incease demanda of games. The 2024 games that have the best graphics look good enough for me... Forever if im being honest. Id be happy to see the creative art styles and clever techniques we would see from devs if a 32gb 4080 was the only gpu available. We need innovation. Not games using super demanding engines tham make developing easy and make games harder to run. Its about time for a new engine built for gpus instead of nvidia and developers.
It looks incredible in the demo, but from my experience playing other UE5 games that already end up looking pretty blurry and ghosty even on higher end PC hardware, the added distraction of noisy, temporally inconsistent lighting and shadows doesn't really seem like something I want in any sort of fast paced game. I hope these features are used responsibly by developers, and not to the detriment of the image quality of their games, but I don't have much faith after experiencing the current crop of ue5 games.
Can't we just go back to baked lights and SH probes for dynamic objects. Weary with all the noise & artifacts. It's refreshing to see clear and crisp native 4K rendering.
People were complaining about game worlds being too static for ages, and now that there is lighting technology that allows more dynamic worlds, people are complaining about that. How about just playing idk... older games until you have a better PC?
No, because shadowing is still a problem in that case. It doesn't matter if you have a hundred baked lights if you need to then render a shadow map for each of those 100 lights, to be able to have them cast a shadow originating from dynamic objects in the scene. That is the current state of rasterisation, which is why this technique is a big deal since it offers a much cheaper way to have hundreds of lights that can cast dynamic shadows.
while I understand that there was specific work done to make it run on ps5 and we probably won't see it on ps5 since it's a nascent technology, it's freaking cool that it's able to run on modern consoles. I hope it's implemented for ps6 by devs who use ue5.
I have use 12 Months a 7900 XTX and since 2 Months a 4080 Super. In most Games Raytracing is not good enougth, Software Raytracing or good Cubemaps looks very similair, only some Games like Cyberpunk really makes a quality difference with Raytracing On.
I thought UE5's ray-tracing works backwards. Instead of ray-tracing from the light source the ray is traced from what your eye sees. That makes sense. Why shoot a ray to something you can't see. But I don't know how they do that backwards.
Most ray tracing actually uses that method. There’s a lot of complex math behind it, and still a lot of Misses, but the fundamental idea is that going from the lights means there’s no ground truth for where the rays need to end up. Starting from the viewport, we send rays to objects, and then send rays from those objects and hope those rays hit a light source
@@456MrPeople Path tracing does not mean that you trace rays from light sources. Path tracing is just when you treat light as a single, unbroken path from camera to light source, and track the energy transferred across that path as light bounces from surface to surface. This can be done in either direction, and is even sometimes done in both directions simultaneously (bidirectional path tracing). No clue where this "path tracing starts from light sources" thing started from, but it's wrong.
I'm kind of sick of RT already. The massive performance cost for anything approaching decent visual quality is too much and it very much appears to be a brute force approach.
There is no doubt UE5 is the most realistic engine. The games same as hellblade 2 or black myth wokun proved it. Yes it needs high ends GPU but it's worth it. Right now 4000 series easily can handle the games andvin next 2 years everybody use 4000 or 5000 GPU. AMDs next GEN also supports raytracing with high frame rates as they said.
There are other engines that come close to the level of UE5 and in some ways might surpass it but there is no way Unreal is the most versatile engine used by so many different developers.
I feel like a big part of these UE showcases that gets overlooked is what these features mean beyond the little box under your TV. Coming from a VFX and Visualization background, these things allow us to get closer to certain looks at a fraction of the offline costs. Which means that things like accumulation times etc aren't as big of a concern when we're looking at 1frame per second vs 1 frame per 1 hour. It also allows smaller teams to operate far more flexible and iterative if the production allows for it. So while I understand the disappointment of the actual performance, I think it's important to understand that UE isn't just for games anymore. We do a lot of things with it, and MegaLights is just one of those tools to get us in the ballpark of offline rendering while still leveraging realtime feedback.
THIS! A lot of tv shows use unreal to accelerate VFX work these days. I saw a video demonstrating how the show Superman & Lois uses it and now I'm convinced unreal in tandem with some very dedicated VFX artists is the reason that show's CGI looks as good as it does
For me as game dev the tech is not the problem, but being confronted with non game-devs expecting games to make use of the new tech is the hard thing. This creates unnecessary pressure.
True, but it's kind of lame to use a PS5 to advertise when it's so prohibitively expensive for a console's rendering budget that it's effectively useless.
Exactly! UE is becoming an absolute powerhouse in the VFX industry right now and I'm all here for it. As far as the gaming industry goes; games need to be optimized for consoles, and so developers are held back by the processing power of consoles. Rockstar Games would be 10 years ahead if they could optimize GTA6 for a 4090 card, and that's sadly not possible.
@@oktusprime3637 Except Megalights actually _improves_ lighting performance, even on low to mid-range cards you can get performance boosts of potentially double the frames per second. This isn't like Nanite where the performance improvements are more conditional (for the most part, anyway), Megalights actually allows for thousands of shadow-casting lights while basically having the same performance as one with almost no caveats aside from a couple of visual issues in regards to shadows.
My biggest gripe with these technologies is how bad they look in motion. The temporal accumulation methods that make RT viable right now creates so much ghosting and smudging. I feel like we'll get to a more acceptable image quality eventually, but it's going to take work on many fronts to solve it, chief of which is more rays, better sampling, and better denoising.
RTX 10080
@@griffin5734 agree, wish it wasn´t that way but sadly for now it is true, it may look real in still frames but in motion to many filters and "solutions" you end up loosing a lot of detail.
That's a bit of an unfair comparison since DLAA is much higher quality than DLSS quality which has less than half the total resolution to work with than DLAA. Of course to gain image quality you have to give up performance.
This seems to be preventable for upscaling and AA, but accumulating light emissions and occlusions without those artifacts is a more complex problem.
I think it's interesting how many potentially Paradigm shifting technologies have come out over the decades that just died off yet Ray tracing and Improvement of video game lighting it's still going would we even have Ray tracing without nvidia's introduction of RTX
Can't wait for the "no more stutter Technology"
With zero lag!
That was 5.4. You're 6 months out of date.
@@LittleBlue42 right, when will we see it in a real game?
@@UhOhUmm by the speed management of most game companies allow such updates, probably till the next console generation.
Just stick to last gen games on next gen hardware.
Matrix Demo wasn't all about hardware lumen, it was just about Lumen. In general, new UE5 rendering features are shipping in games by now, but triple A games take a while to develop. Black Myth Wukong is a great example of next gen game that shipped with many of these new rendering features.
It's funny we still say "next-gen" while meaning "current-gen 4 years since launch, probably 3-4 years before the actual next-gen" 😀
@8:11 it's clear the lights in the front are BRIGHTER in this scene than the ones in the back, and in the PC screenshot, its a different lighting scheme where all the orbs have increased brightness. It's still possible the noise limitation is affecting the visibility of specular as you suggest, but it's worth noting these are not the same lighting scheme.
yup my thoughts exactly, in the shot at 7:50 the lights toward the back are simply dimmer hence them not giving off as much light + not having strong specular reflections
Stray would benefit from this.
So would halo in the interiors
Not really, since Stray was all baked lighting
@@Rivandu maybe I should reword it. Stray would look good using this lighting in that game instead of their current lighting.
god no, the smearing from the taa makes this technology frankly useless and ugly asf within any fast pace game with quick camera movements like stray
@@jamesgreen2495 Well. Baked lighting will always be better in a static lighting environment. So if they'd make more use of dynamic lights then, yes. (I recall the cat having a flashlight at some point, so that would benefit for sure)
I am happy to do architectural visualization with UE5 😁 this means I will leverage all of the latest and greates features a lot earlier than publicly available games.
damn right!
Welcome to DF, where everything is "bespoke".
Mega-Lights are probably a life saver for a very specific type of games (as well as VFX probably), Sandbox games. I think Sandbox games will benefit from Mega-Lights since it's almost impossible to prevent players to place thousands of lights in their world, most UE sandbox games currently have a forced cutoff of lights in the distance, sometimes for lights less than 50m away... With Mega-Lights that cutoff can be expanded to 200m or more, and it allows for more detailed lighting and things like screens properly emitting lights in more "modern" type sandboxes. It's stunning!
i dont really care for any unreal engine features as long as almost all the games made on them stutter. and while yes it may be a cost cut from devs or a knowledge deficit from devs on how to make the engine work better. epic should put more work into making sure games dont stutter on their engine on pc. be it by making sure devs know how or even better make devs dont need to know how. the best showcase from my pov would them show some tech that smoothes out rendering spikes
100% THIS. I see so many ue defenders saying " but its the devs fault" is it the devs fault when 95% of UE5 games have the same stupid stuttering, and eratic performance issues? With very few exceptions, UE5 games ALL have performance issues. The tech is amazing but its useless for such a terribly performing engine. Make it known HOW to fix the issues, or FIX the issues, the blame is almost 100% on EPIC
UE5 is the first engine to implement revolutionary technology. And the creator of these technologies said it themselves that it’s gonna take some time till the engine solicited. As of now, already released UE5 games are using old version of UE5, so they don’t properly utilize the new crucial features in UE5.
@@Vartazian360 epic created revolutionary technologies with UE5. But as always with introductions to new technologies, there is always gonna be a lot to fix. The creator of nanite made a present on his journey to creating this technology and as of the day that nanite was launched, there will still be a huge amount of things to work on with it. But that was like 4 ago already, UE5 had been fixing this issues, and in the next few years we’ll be seeing games that are leveraging UE5 to what it was meant to do. UE5 games now are just using old versions of UE5
@@balloonb0y677 revolutionarily badly optimized, yes. Lumen is trash
One thing you guys are not realizing, this isn't just for games. Clearly they are aiming at VFX, which I'm a part of. This to us is a huge update that we can utilize immediately. A lot of us in the industry who use UE for vfx are drooling at this.
what camera and lighting oliver uses? it looks so nice always.
It would be great to know which version of UE5 games launch with (when that info is available).
Fortnite IS the most important product for Epic and the first place to introduce or test all they features in Real condition.
The first EU5 (beta 5, and 5.0) game was Fornite if you remember
imo I don't think it matters since the devs have to implement these technologies, not considering reverse engineering or modding by community.
Gears of War: E-Day is going to truly showcase Unreal Engine 5s potential
😂😂😂
@@alpha6games751 don't underestimate the Coalitions mastery in Unreal Engine development, those that have seen gameplay were blown away
😭😭😭
@@sk8ermGs you must be a playstation fanboy who has no idea about the Coalitions track record on Unreal Engine development 🤣🤣🤣🤣
7:36 hm I see a specular reflection vor every light in that time where it's bright enough to also get some bloom. You don't see reflektion of the back light in the inner circle because of the angle. Its reflection in placed on the dark ring on the ground. In fact on the next inner white slim ring you see specular reflections on the back lights too. EDIT: i just saw the PC screenshot. On that shot all the light are brigther, getting some some bloom and therefore are in the range where the specular reflection occur, as observed with the PS5 where the lights have to be bright enough ( in bloom range ) to make specular reflection on the ground.
I'll wait for the GigaLights...
While these new features are cool I'm sure everyone would prefer they did something to mitigate the horrendous stutter that's almost ubiquitous in UE5 games.
Sure it might not be all on Epic as not all games have it (Hellblade 2), or have it at an acceptably rare frequency (Wukong), but it's clearly something they need to help devs sort out because it's getting ridiculous and giving their engine a bad reputation.
The did work addressing shader comp stutter in 5.3 and 5.4 but we need games using those new versions to see if they accomplished the job.
@@gameguy301 fingers crossed, but I think traversal stutter is even more egregious as most games have (thankfully) implemented shader compilation on launch. Have they done anything in regards to that?
It's not only a UE issue though tbf, DS remake used Frostbite and was absolutely terrible in that regard.
I've tried virtualised geometry on 3 different mid-high end PCs/laptops on Fortnite and they all have insane stutter... 🤦♂️
ok cool but when will they fix the stuttering? it looks very cool but theres no point if in game works as a stop motion movie.
They are fixing the stuttering. It’s just that games like wukong are made on old versions of UE5
Unreal Engine is trash... All these games run like crap and the graphics look worse with low res textures with stuttering... I see a pattern happening doesn't look good. Won't even bother buying anymore games that use Unreal engine they're all garbage. Even this trailer in the video looks like Fortnite graphics but run at 20fps.
@@SemperValorin motion as well most of the games have bad ghosting issues.
@@SemperValor Lol. Unreal is one of the most advanced engines out there, if not the most advanced one. It has issues, sure. But "trash"? Yeah, you don't know what you are talking about. Tell the devs to optimize more, don't blame the engine like a dumbass.
@@SemperValor Unreal Engine is not trash just because bad developers don't use it properly for their games. Most stuttering/hitching is because of DirectX 12 and has also been a problem in other engines such as Frostbite. You can easily mitigate stuttering/hitching in DX12 by precompiling shaders as well as providing pso cache files, Epic has this in their documentation. Alternatively you can just use DX11 which has way less of the problems that DX12 has
3:54 he said the thing!
Bespoke!
I hate this word lol sounds so bougie
I don't even know what that means.
@@dav1dparkerread a book once in a while.
@@Chimera_Photographybe nice!
Cyberpunk definitely needs this technology
I could see Cyberpunk 2 using this stuff as they move to UE5. It’s really intriguing stuff. With emerging technologies like this, I think anyone who claims we’re hitting diminishing returns on visuals in gaming are going to once again be mistaken (we hear this every gen, and we always surpass it). It really feels like we’re early stages into the next leap in graphics. And we just mignt get there.
I'd assume that Cyberpunk 2 will use Nvidia's UE5 branch, as delivering anything but path tracing (at least on the PC) would be a kinda ridiculous regression. Mega Lights, Virtual Shadow Maps, Lumen, and so on are impressive technologies, but at the end of the day they are just trying to recreate what path tracing is doing out of the box in higher quality.
There's no way Nvidia doesn't stay involved in Cyberpunk 2. Cyberpunk2077 was a massive advertisement for Nvidia.
Exactly. Nvidia is peak GPU company, Unreal is peak game engine, CDPR is peak game company. Oh and Digital Foundry is a peak UA-cam channel.
My dudes. Where is the SH2R PC vid? Y'all are killing me!
Im wondering the same thing. Im guessing it has allot of issues lol
@@Zetchzie they should call the game stutter hill 2
@@FunctionalBreadMachine somehow i finished it on my old 2016 pc with gtx 1070, it was actually playable on the lowest setting with crashes here and there.
@@Emulator_Crossing Oh I believe it. Seems to be decently scalable. It's just extremely taxing. And I'm curious to see what the digital foundry Bois are cooking up. I'm loving the game, even with the performance issues I'd recommend buying it.
@@Emulator_CrossingGod that must have been miserable
Where is DF dissection of the PC version of Silent Hill 2 Remake?
iirc Alex is working on it
At first glance the thumb nail reads like UE5 Malignant Tech Explained.
that's just the stroke you had
@@Dezzyyx yeah I'm stoked.
What’s the difference between this and ray tracing? Is this meant to be used together or separate with lumen? Or is it meant to be a way to increase the “rays” similar to path tracing?
"What’s the difference between this and ray tracing?"
There's no difference because this _is_ raytracing. The current definition of raytracing in the video game industry pretty much encompasses any and all lighting techniques that simulate light as physical rays traveling through the scene, that don't rely solely on screen-space information (since if it could, screen-space reflections could be considered raytracing as you trace a ray through the screen-space depth buffer). This technique fits that definition, as it tries to solve direct lighting by tracing rays through the scene towards nearby light sources, using clever tricks and optimisations to be able to do so more efficiently.
"Is this meant to be used together or separate with lumen?"
Together. Lumen tries to solve indirect lighting by building a structure of probes and sampling ambient light from those probes by tracing rays outward from the probes, so Lumen is tackling a different aspect of lighting than this technique is. This technique could probably replace Lumen's area lighting functionality as this technique can support area lights with much higher quality, but the rest of Lumen can stay.
"Or is it meant to be a way to increase the “rays” similar to path tracing?"
Path tracing would replace both this _and_ Lumen. Path tracing is a specific lighting technique that solves all aspects of lighting (direct lighting, indirect lighting, reflections, refraction, subsurface scattering, etc) within a single unified pass, so path tracing would replace both this and Lumen as path tracing can support efficient direct lighting through ReSTIR, and it naturally solves indirect lighting just due to how it works.
@@jcm2606 Thanks for the great explanation! This was easy to follow and helped me understand the concept well, it was alway pretty confusing before lol. Looking foward to see how developers implement these different techniques.
@@jcm2606 If path tracing solves everything, why don't they invest in making path tracing more optimised?
If it's not fast enough for games, maybe for low budget films and series? Compared to rendering stuff in blender with cycles, this seems much more efficient and the PC screen shot looks fantastic.
Downloaded 5.5 last night. Going to check it out today!
Honestly I’d lay the blame the lack of hardware lumen on console squarely at AMD’s feet. It’ll be interesting to see if one of the PS5 Pro enhancements is hardware lumen on future titles.
these kinds of technology's are cutting edge, & honestly its crazy that you think current gen consoles WOULD EVER truly be able to handle this tech with a $500 price tag.........
It has nothing to do with amd, consoles are designed to be cheap with decent gaming performance!!
It's not an AMD thing. Lumen needed a lot of additional optimization. In this Unreal Fest talk they say 60fps hardware lumen on consoles is doable in 5.5: ua-cam.com/users/lives1qdbJtjUI0?si=R_6eEqBqI5mgoSGM&t=5696
@@yegoat05 Yep, not sure even how the ps6 era will handle it in full aaa game with all the of npcs, physics, ray tracing and ai at 4k and 60fps at once.
@@Emulator_Crossingdo you even understand the entire point of these technologies? Can you not comprehend the fact that without these technologies, traditional method would be way to hardware intensive for console hardware to run anything near this. These technologies are made to make rendering easier both hardware and developers.
The Matrix demo had features like draw distance that didn't fade at a certain distance until it was too far away to see it, infinite resolution textures(? other demos I can't remember the names of, they promised a higher level of Nanite, procedural animations, water physics, etc. Many of those features were lost or came with too much downgrade.
Yeah I'm sure it's great running at 800p upscaled heavily. That's not the kind of image quality I bought into though on PS5 and a 4K OLED
that's exactly the kind of quality you bought into with a console thinking it'll do 4k somehow
@@roar104 These consoles are aimed at being 4K consoles. It's up to the developers to do better and target higher resolutions. The culprit here as they age is Unreal Engine 5, which massively overshot what these machines can do. Looking at first party Sony titles, you can get ~1440p and 40FPS even with a little ray tracing, and they look amazing and feel great to play. I would say 1440p is a decent target compromise resolution. Looking at badly built UE5 games you get 1080p on a good day with bad upscaling and shonky sub 30FPS framerate. That's not acceptable.
@@pgr3290 they're aimed at outputting 4k via a dynamic internal resolution on most games, not actually being 4k. Nowhere near good enough hardware for proper 4k. They know console users mostly don't know the difference so they can slap it on and people will buy it.
That's not to say UE5 isn't horribly optimized with bad features like nanite on top of that though.
@@roar104 The developer determines how the hardware is used. The hardware is good enough for native 4K and 60FPS. Gran Turismo 7 for example. It's probably not the ideal target for most games intended for those machines, however 1080p is poor for the hardware and upscaling does not cut it. If they had DLSS that's another story, but they missed out on machine learning which has always been a big win for PC the last five years
@@pgr3290 " The hardware is good enough for native 4K and 60FPS"
LMFAO even last gen games like Shdow of the Tomb Raider run 1920 * 2160 ( Checkerboard 4K ) and still drops into 50s.
The only other option is 1080p for stable 60FPS. How delulu do you have to be to think games can actually do native 4K on this machine lol. Even Sony First party titles run 4K only at 30FPS ( maybe 40 in case of Uncharted legacy of thieves collection )
Fix #StutterStruggle
@alger-y3q what kind of issues have you had since upgrading to 5.4?
@@LittleBlue42 still there is traversal stutter
traversal stutter has existed in every engine that uses level and asset streaming, it's really unavoidable but can be mitigated
Does anyone here experience compiler error on older iMac on UE 5.5 upgrade?
the game works but compiler doesn't work.
A game like the upcoming Routine would certainly benefit from Mega Lights given that it's a game that aiming for a photorealistic visual style with its horror sci-fi setting.
These kinds of videos serve only as marketing/publicity/awareness for whichever company's new product. Those CEO's couldn't be happier with the state of affairs where gaming/tech "news" sites/"creators" are just an extension of marketing announcements.
“Technology explained” means you know exactly how it works, and will explain it to us.. but apparently not 😂
Its all cool, but devs really need to go back to games like half life 2, fear, deus ex...to realize that you not need to invest soo much money, time and resource on cutcenes/voice actors/trailers. just need good level design, story and pacing without stopping the player every 10 min.
All visuals , no fun lmao
All three of those games are PC games. The AAA PC industry died with the Great Consolization of 2008.
I would love a AAA PC gaming renaissance, but as long as AAA devs conceive, design, and code only for console HW and the sensibilities and preferences of the console market, the best you and I can hope for is something like DOOM 2016.
Which as good as it is, is nothing like Deus Ex, Doom 3, Half-Life 2, F.E.A.R., Dark Messiah of Might and Magic, BioShock**, Crysis, or I'm sure lots of other PC games that I look forward to playing for the first time.
** BioShock is interestingly a console game, but it is so well designed, in all aspects, and plays so well on PC, that I didn't realize it was a console game until several years after it released.
@@bricaaron3978 my fav in the series is bioshock 3 + dlcs, just love the story with elizabeth, and the all floating city fever dream vibe, even tough the game feels short like it was cut in the middle. nothing like this since.
Developers think that a game needs to be cutting edge to be AAA. If it gets more than 60fps on a 4060 at 1080p or 60fps on 4090 at 4k then.. they didn't do a good job. We don't need incredible graphics. Art style can go so far in a game. I thought Ghost of Sushi was quite beautiful and the graphics were absolutely very convincing for a AAA games and yet it runs better than most games in the last couple of years. Lazy devs or is Nvidia pushing heavy engines more than they should?
@@Emulator_Crossing Yes, I love playing through all three BioShock games. Infinite is my second favorite --- BioShock will always be my favorite.
Believe it or not, I've played Infinite twice so far, but I haven't played the expansions yet, so I have those to look forward to.
Epic should release the demo to PS5 and XSX with all of these feature to set benchmark for all publishers. Not just showing stream.
Most of these techs you see with UE5 on PS5 have been present in the Decima Engine in some shape or form. I feel like Decima could become a general Game Engine for PS studios that don't use their own.
Thank you so much for this. I've been using lumens on a lot of big vfx projects. But I am definantly going to reload up some of those worlds with RTXDI and see how they fair in the render queue.
Going from a genuine high frame rate experience at 120-240fps down to 60 is disappointing, no matter the image quality bump. Hope they continue to push optimizations too.
That Matrix demo ran on PS5. So, what's the issue? One would guess there would be development happening during this time, and it would be viable to make a game with it. I'd accept 30 fps if my game looked like that on my console.
30fps + frame gen is actually good, or fake 60fps. Idk if it's possible on consoles.
@@kanta32100 LMFAO no. Frame gen absolutely doesn't recommend 30FPS as baseline. It'd be horrible in both latency and visuals. It is much more useful for 45 to 90FPS or 60 to 120
@@DragonOfTheMortalKombat Did you tried it, or you listen to reviewers? I tried it and it's decent, i'm talking about DLSS. It's similar to vsync lag.
Projector lights or whatever you whant to call them are a feature since ue4 called rect lights. Thats nothing new. The fact that they lifted the self imposed limit of 4 dynamic lights is amazing, really groundbreaking, almost worthy of DF to do a deep dive on (in?) but im guessing these mega lights are megaBS as usual, probably rendered in a separate pass and then composited together with the rest of the scene. As the guys noticed theres heavy time slicing and probably upsampling going on "behind the scene". I remember when deferred rendering started to pick up and how it promised huge numbers of dynamic lights but god forbid you enable shadows on them. This is probably similar, god forbid you have more than 30 FPS...
Games, no. But VFX industry is pretty dang excited about this.
I mean... The next cyberpunk is using UE... And that is a game that needs all the light tech it can get 😌☺️
I think that light should be aprouched with the same philosophy as nanite
This sounds neat and all. However, it means fuck all if developers don't put the time and effort into optimization.
There is ZERO excuse for doing the bare minimum. It seems nowadays AI upscaling tech is being used a crutch to make up for the poor performance at a baseline level.
i remember when the technology was announced everybody was excited thinking that it would help us achieve playable 4k at high frames and good graphics, instead we need it to berely play at 4k 30fps with stuttering
100%
DF used to vehemently deny this and tried to gaslight people but now even they can't hide from the truth.
If graphics card companies didn’t hamstring their cards with VRAM for so long, maybe they could progress games faster. Now developers are playing catch-up because cards are getting faster. Games take years development so they expecting cards to get faster, but they aren’t except for the fastest cards. Everything below the 4070Ti is the same as two years ago. By optimization, you mean “make it look worse”.
All I want is a modern FOX Engine like optimised game engine. Is it too much to ask? 😭
Well no, hardware simply cannot keep up with the progress and technology. I've been saying this for years since 1997, full RT, and bounce path tracing etc will never become viable in real time as its just too expensive. That's why the most powerful computers, even today take hours/days to render a CGI movie using these things.
i dig this tech for all archvis purposes
can't wait for all games to look the same
That'll only happen for lazy devs that aren't creative enough, you can make a game in any engine look unique if you have a good art direction
Some of the scenes looked like upcoming Witcher areas
What about PS5 Pro with better Raytracing hardware, 45% better GPU performance and PSSR, it would benefit from this UE5 technology? I believe what we have seen until now for the PS5 Pro is was just current games improvement no games built for it. It was the same when PS4 pro and PS5 was lunched.
Nah, The Pro is a mid range GPU, 7700/7800 XT raster with 4070 ray tracing capabilities. The things they’re demo’ing here are going to require far more powerful hardware to pull off without shortcuts , not to mention build a game around which will take years . As they pointed out, there were several shortcuts taken in the demo that were hidden in the video feed at 1080. This certainly has next gen potential though.
This tech is what will fuck the next gen console performance,
Base ps5 is an absolute beast!
Before the PS5 was released, it brought with it many expectations.
Although this generation of machines can now read roughly 100 times faster than in the HDD era (said Mark Cerny), the actual difference is not as great as expressed in the numbers.
I'm looking forward to seeing what Naughty Dog comes up with next, but they are still asleep at the wheel.
Halo Studios: "Good. GOOOD."
They got rid of all their actual creators and replaced them with feminist/gay activists. Not a recipe for success. They didn’t move to UE5 because it would be better, they did because I honestly think they kicked out all the dudes that actually know how to code an engine
Not sure if you guys read the comments here but we shipped Funko Fusion with all the bells and whistles and a 60fps mode that is fairly consistent on console.
Using UE5.5 preview?
@@Hybred obviously no. But they also talk about how unreal 5 games with the new features rarely ship at 60
@@AdamKiraly_3d I figured as much, I was just confused as to how the comment relased to the video. Now I see.
But their analysis is still correct, they said it was rare not impossible. I'm glad you targeted 60fps, and I'm curious what sacrifices you made to achieve it.
Internal resolution? High down sample factor on certain lumen effects? Regardless, good job. I'm currently working on a fork of UE5 to get better performance out of it.
I accept 30fps and 1080p for this quality. (This is why it's a bullshiʇ that there's no need for Performance/Quality Modes on PS5 Pro. No matter how powerful the hardware is, you can still sacrifice fps for quality.)
Looks a shit ton like LTCGI people use inside VRChat Worlds which is unity 2022 and has almost no performance impact at 90 FPS inside VR (Aka rendering each frame 3 times)
UE5 slapping the shit out of RTX
There you have it, Alex said it: "PS6 games".
So nothing to worry about in the hereand now.
Whatever these nerds say yall believe
@@Mitchell-p4h These nerds have experience and insights about what they are saying.
Which is kinda the OP thing about being a nerd in gaming and tech industry.
UE5 does not provide a good enough quality/performance level for current gen.
Look at the best looking games and they all have something in common.
Forbidden West
Red Read Redemption 2
Battlefield 1
Arkham Knight
Space Marine 2
Crysis 3
Last of Us 2
Demon Souls remake
RE4 remake
Doom Eternal
Baked lighting
Crysis isn't baked lighting it uses Cryengine.
Battlefield 1 isn't baked lighting it uses it's own engine as well
Doom Eternal, who knows it uses it's own engine as well
RDR2 isn't baked lighting it uses it's own engine as well
Forbidden West isnt' baked lighting it uses it's own engine as well
Last of Us 2 don't know, but not on UE5 either.
@karambiatos Yout comment is nonsense. All of these gsnes use a baked gi solution to save on performance cost.
@@devonmarr9872 Giant maps or giant maps and real time TOD.
Yeah one is impossible the other would make insanely heavy maps.
@@karambiatos in my list are games with a fake time of day that allows based lighting. Spiderman 1,Morales,2 could be added.
@@devonmarr9872 No that's not how it works there's no such thing as fake time of day in those games, what even is fake time of day.
RDR and cryengine have tod changes you cant use baked lighiting then, neither can you use it on large map environments like Battlefield 1, technically you can but the shadows look garbage at those sizes, battlefield 2 used it, but very low res
y am not sure why would they do it on a console and not a high end PC so they can show what it can do and at what cost
Useful for Cyberpunk 2078 now that it will use UE 5!!
This is cool technology thats going to push AAA forward but at the same time I really don't care because I mostly play retro, indies and street fighter 😅.
Hope this technology help devs actually finish games before releasing it lol.
no, they say every single light source. every light source
The main guy sounds like elmer fudd.
Hey, there are issues and imperfections, but dang am I still excited for what's to come in the next decade.
Sounds like MegaLight and Lumen are a stop-gap before we get to high performance path tracing.
When path tracing is more viable (5-10 years?), I imagine those two technologies will no longer be needed.
Mega Lights is awesome. And I'm very excited how games will look in the future.
The lighting and shadows look top tier but what of the in-game physics? Tired of pretty AAA games with lackluster gameplay
Donnelly Mill
is this like Ray’s tracing?
It traces rays to figure out if a light source is visible for a given pixel, so yep.
I feel like DF would do well to maybe talk to some actual tech artists. Because every tech artist I know and/or follow on social media collectively lost their minds about this tech because it makes the amount of teethpulling hacky workarounds to light scenes and maintain performance basically go away completely.
Another garbage technology that maybe deliver incremental visual upgrade while costing a fuck ton of performance
Even if there is stutter, or compromises in other ways, it's still just so nice to see this incredible new tech being released. Things always improve over time, I'm just thankful for all this tech coming out this generation sooner rather than later. Since the PS5/XSX and UE5 launched, I've always said this is a BIG transition period where this tech is new and emerging, but next-gen in a few years is where we really are gonna get to see these techs live in action and greatly improved upon. PS6 I'm sure is going to be very powerful based on leaks/rumors and just comparisons from tech jumps from previous generations, it's very exciting to think about :D
Welcome back, 30fps games on PlayStation 5 Pro!
Very little explanation of technology here for a video titled "Technology Explained". If you don't actually even know how it works, don't advertise your video like you do.
unreal is great but you can get into alot of trouble with all the gee wiz stuff. keeping things simple is usually a smarter way through it. alot of the perf issues are devs not making proper compromises with all this stuff and instead going full idiot on graphics because they think they can and maybe shouldnt. or should make more clever compromises and forgo some things. the pressure to look good kills rational thinking about whats holistically good for the whole game. i say this as an artist. often simpler lighting actually looks better btw. adding too many lights muddies the overall reads. again simplicity is often better. this is lost on alot of people.
Another proof of the Epic's UE strength and capability of their SW development magic. It's also another showcase what SW alone can do on most of the HW and that u don't have to pay extra for HW feature like with Nvidia's RTX asking ridiculous price for it also by not having any real competition. I give Epic BIG thumbs up for democratizing expensive feature for everybody even they might fully realize on the next gen consoles and more performant PCs👍👍
You realise that this can use hardware RT, which is what NVIDIA's RTX is, right? This, like Lumen, is just a more efficient way of using and managing rays. The underlying raytracer is a completely separate thing, so you could theoretically plug a voxel raytracer into this and have it work.
@@jcm2606 Yes, it can but it doesn't have to use Nvidia's or other proprietary GPU HW feature to run smoothly... of course dedicated/specific HW based solution will always deliver better results than just a SW one, but that's exactly why I like Epic's decision to not directly promote it and rather use PS5 even it's based on the AMD platform. AMD is clearly not as far as Nvidia with the GPU HW features so it's fair from Epic to show it will work there as well. Overall they sent clear message they've got SW solution which delivers results almost everywhere and that's right way to go for the company like Epic.
these technologies are all well and good and will simplify the development process for any applicable application, but they are not viable for games, at least not in the immediate future. the cost is simply to high. if it cannot run at 30fps at 1080p console hardware will always hold it back. maybe itll be a smooth experience on a pc with a 5090, but not sure who is going to target a game at 0.0001% of the market.
lets get back to real improvements. this generation of shoehorning in more and more tech that needs to be multi sampled and upscaled into a blurry laggy nightmare needs to end.
The worse you understand why and for what but them as always you remenber aahh i saw somenthing like this somewhere... in this case Arkhan Knight lmao...
If that's in a PS5 why are we getting PS4 games on PS5?
Stutter Megalights Demo
More specifically DX12 stutter, thanks Microsoft
In all fairness, even in preview Megalights substantially improves lighting performance (even using it a minimal amount of lights can result in potentially doubling your FPS, based on the tests I've seen, and 1000 lights can potentially go from 20-30 FPS or less to possibly 100+ FPS, which is wild), and it basically makes it so you can functionally use as many lights as you possibly want in a scene without having to weigh up performance considerations (with one exception, it doesn't work as well when a bunch of lights are heavily overlapping, but that's easily fixed by spacing out lights in your scene design). That's practically the holy grail of lighting. Sure, it has drawbacks visually, but the moment 5.5 comes out of preview I'm turning on Megalights just for the performance boost alone.
All this tech for an engine that is known for freezes and performance issues.
Yup
y they had to do it on a high end PC cause atleast we would know what it will cost us to have it at a good state,why do u think there is no Path Tracing on Cyberpunk console versions?cause they will tank if they use it the console will prop shut down
@@christonchev9762 even with no ray tracing unreal engine games perform very poorly. I have played dozens of ue games. And apart from very few titles , they all suffer from performance issues. And I've played them on a pc that's way above the recommended system requirements. I like how ue games look , but I hate how they perform. IMO they are good for photo mode. Not actual gameplay.
they pushing for realism and not so much stability, i bet they care more about the use in films and that hollywood money than gaming, sadly.
@christonchev9762 the demo was running on base PS5
Not defending UE5 and all of its issues, just pointing out this wasn't run on a high-end PC.
Kling Lodge
Why should we care about new UE5 features when 9/10 Games are a stuttery mess.
Pollich Views
we can always buy stronger gpus and new consoles but the reality is that i would be happy with the performance of my 4070 TiSuper for a decade or 2.. if games released that refined game engine tech ibstead of just meaninglessly increasing hardware demands for little benefit to image quality. The influence that nvidia has on graphics programming will continue to incease demanda of games. The 2024 games that have the best graphics look good enough for me... Forever if im being honest. Id be happy to see the creative art styles and clever techniques we would see from devs if a 32gb 4080 was the only gpu available. We need innovation. Not games using super demanding engines tham make developing easy and make games harder to run. Its about time for a new engine built for gpus instead of nvidia and developers.
It looks incredible in the demo, but from my experience playing other UE5 games that already end up looking pretty blurry and ghosty even on higher end PC hardware, the added distraction of noisy, temporally inconsistent lighting and shadows doesn't really seem like something I want in any sort of fast paced game. I hope these features are used responsibly by developers, and not to the detriment of the image quality of their games, but I don't have much faith after experiencing the current crop of ue5 games.
Game look like for ps6 pro
Can't we just go back to baked lights and SH probes for dynamic objects. Weary with all the noise & artifacts. It's refreshing to see clear and crisp native 4K rendering.
People were complaining about game worlds being too static for ages, and now that there is lighting technology that allows more dynamic worlds, people are complaining about that.
How about just playing idk... older games until you have a better PC?
No, because shadowing is still a problem in that case. It doesn't matter if you have a hundred baked lights if you need to then render a shadow map for each of those 100 lights, to be able to have them cast a shadow originating from dynamic objects in the scene. That is the current state of rasterisation, which is why this technique is a big deal since it offers a much cheaper way to have hundreds of lights that can cast dynamic shadows.
while I understand that there was specific work done to make it run on ps5 and we probably won't see it on ps5 since it's a nascent technology, it's freaking cool that it's able to run on modern consoles. I hope it's implemented for ps6 by devs who use ue5.
Insanely impressive.
Show this to someone 20 or even 10 years OR FIVE years ago and they call you crazy!
I have use 12 Months a 7900 XTX and since 2 Months a 4080 Super. In most Games Raytracing is not good enougth, Software Raytracing or good Cubemaps looks very similair, only some Games like Cyberpunk really makes a quality difference with Raytracing On.
I thought UE5's ray-tracing works backwards. Instead of ray-tracing from the light source the ray is traced from what your eye sees. That makes sense. Why shoot a ray to something you can't see. But I don't know how they do that backwards.
Most ray tracing actually uses that method. There’s a lot of complex math behind it, and still a lot of Misses, but the fundamental idea is that going from the lights means there’s no ground truth for where the rays need to end up. Starting from the viewport, we send rays to objects, and then send rays from those objects and hope those rays hit a light source
The vast majority of ray tracing is done from the camera perspective. It's when you get into path tracing when rays are traced from light sources.
That's literally how ray tracing always worked.
@@456MrPeople Path tracing does not mean that you trace rays from light sources. Path tracing is just when you treat light as a single, unbroken path from camera to light source, and track the energy transferred across that path as light bounces from surface to surface. This can be done in either direction, and is even sometimes done in both directions simultaneously (bidirectional path tracing). No clue where this "path tracing starts from light sources" thing started from, but it's wrong.
I'm kind of sick of RT already. The massive performance cost for anything approaching decent visual quality is too much and it very much appears to be a brute force approach.
Shown on a ps5.
There is no doubt UE5 is the most realistic engine. The games same as hellblade 2 or black myth wokun proved it. Yes it needs high ends GPU but it's worth it. Right now 4000 series easily can handle the games andvin next 2 years everybody use 4000 or 5000 GPU. AMDs next GEN also supports raytracing with high frame rates as they said.
There are other engines that come close to the level of UE5 and in some ways might surpass it but there is no way Unreal is the most versatile engine used by so many different developers.
Everything looks amazing. 😂 C'mon what are we even talking about anymore?