Unreal Engine 5.3 - Next Level Tech Is Coming!
- Published 10 Oct 2023
- ❤️ Check out Lambda here and sign up for their GPU Cloud: lambdalabs.com/papers
📝 The announcement is available here:
www.unrealengine.com/en-US/bl...
My latest paper on simulations that look almost like reality is available for free here:
rdcu.be/cWPfD
Or this is the orig. Nature Physics link with clickable citations:
www.nature.com/articles/s4156...
🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Benji Rabhan, Bret Brizzee, Bryan Learn, B Shang, Christian Ahlin, Gaston Ingaramo, Geronimo Moralez, Gordon Child, Jace O'Brien, Jack Lukic, John Le, Kenneth Davis, Klaus Busse, Kyle Davis, Lukas Biewald, Martin, Matthew Valle, Michael Albrecht, Michael Tedder, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Rajarshi Nigam, Ramsey Elbasheer, Richard Sundvall, Steef, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here: / twominutepapers
Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
Károly Zsolnai-Fehér's links:
Twitter: / twominutepapers
Web: cg.tuwien.ac.at/~zsolnai/ - Science & Technology
Unreal indeed. These are gorgeous simulations in real time, photo real, physically accurate, and highly compressed.
real deed
rails for cameras appeared a long time ago, back in version 4 of Unreal
@@MrFEARFLASH The 5.3 feature list on the roadmap shows a new CineCameraRigRail actor "tailored towards advanced cinematic and virtual camera workflows".
and then nothing works smoothly in games ;p
@@Urbananana i think this is more focused on cinematics, but it isn't too far off from making it into some games
I just want to know what deal with the devil, or black magic Epic engineers use to get all this working so well and so fast!
I see a lot of people say this tech is only good for demos, but not practical development, but it seems Epic learned this and have really been pushing to make all these features easy and practical for devs to use.
Those explosions and volumetrics look amazing!
Fortnite bucks are being used wisely
@@INameIsGood that counts
@@INameIsGood The engine has been making a lot of money too. For quite some time
hundreds of developers around the world collaborating with unreal engine on github too
@@INameIsGood yes, thanks to those teenagers for funding the state-of-the-art game engine for all of us.
Can't thank them enough for always nagging their parents to buy them V-Bucks.
this is the only type of microtransaction that I'm willing to let slide
Can't wait to see how all of this leads to unoptimized games
It's only partly the engine's fault though. Developers aren't taking the time to optimise the engine and the game; instead they rush out a game with all the shiny new tech. Obviously that leads to games like Immortals of Aveum, simply because they didn't spend enough time optimising. Can't blame it all on the engine
i agree with you, but unfortunately this won't make people want to spend the time optimizing their game. @@shreyass5756
@@shreyass5756 A thing about Unreal is that there's only so much optimization you can do, though. The systems tend to scale very well, but they have a high upfront cost you can't just wish away
I just watched a Digital Foundry video on how CD Projekt Red are going to use UE5 for their next project. They hope to make some CPU optimizations for the game on their branch and have them included in a future release; they say that because Cyberpunk 2077 is highly optimized for the CPU. Also, Nvidia has a UE5 branch where they can implement the full path tracing they have in Cyberpunk 2077.
We must wade through the swamp of unoptimized games if we are to make it to the promised land of self-optimizing games on the other side
If only the fine scholars that program the graphics could also program the developer facing parts of the engine :(
The only way to fix that is to deprecate all the APIs, replace all the inner infrastructure, and get rid of the old ways of managing memory in favor of RAII. Not going to happen, because it's a lot of work and people are too attached to the old ways.
The foveated rendering seems interesting. Could absolutely supercharge the potential for realism in games. You don't have to worry about performance anywhere near as much when all you need to render properly is a relatively small spot.
I think it could be especially useful for VR, where the resolution requirements are very high
It's mainly for VR where some headsets are starting to get foveated rendering. Basically means you can get the same fidelity for a fraction of the cost.
A little while back it was a main selling point for PlayStation's VR. I think it will finally make VR games more accessible, because suddenly almost everyone with a gaming PC can use VR.
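The thread above says foveated rendering gives the same fidelity for a fraction of the cost. A rough back-of-the-envelope sketch of where that fraction comes from (the numbers and the `foveated_cost` helper are illustrative assumptions, not Unreal's or any headset's actual parameters):

```python
# Rough estimate of pixel-shading savings from foveated rendering.
# All numbers are illustrative, not taken from a real implementation.

def foveated_cost(width, height, fovea_fraction, periphery_scale):
    """Relative shading cost vs. rendering the whole frame at full resolution.

    fovea_fraction: fraction of screen area rendered at full resolution.
    periphery_scale: per-axis resolution scale applied to the rest (e.g. 0.25).
    """
    total = width * height
    fovea = total * fovea_fraction                              # full-res pixels
    periphery = total * (1 - fovea_fraction) * periphery_scale ** 2
    return (fovea + periphery) / total

# 5% of the screen at full res, the rest at quarter resolution per axis:
cost = foveated_cost(1920, 1080, 0.05, 0.25)
print(f"relative cost: {cost:.3f}")   # roughly a 9x reduction in shaded pixels
```

Even with generous assumptions about the fovea size, most of the frame ends up cheap, which is why the savings scale so well with resolution (and why VR, with its huge per-eye resolutions, benefits most).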
Thanks so much for tracking this domain. My focus is in LLMs and I don't have the bandwidth to keep up with the periphery, so these updates are super-useful.
Feels like this video is an ad, not a Two Minute Papers video.
Yeah, I was going to say the same; it felt like he was reading the changelogs. I do appreciate the callouts of which tech corresponds to which papers and tech previously discussed, though
Planning on becoming a tech artist. This video shows so much stuff to play with :) Thank you.
As a Unity user it hurts so much how much stuff unreal is getting!
Truly Amazing
Why? Isn't Unity meant for indie games? This is tailored for AAA development
@@wallacesousuke1433 I mean, if you consider Genshin Impact an indie game.
Also, why don't indies deserve some love in the form of great features?
@@vakuzar definitely not AAA, it's an F2P game after all, with meh graphics, meant to run on mobile devices.
"also why don't Indies deserve some love in the form of great features."
That's what Unreal Engine is for?
It's free* to use..
And I know Unity can create visually impressive games too
@@wallacesousuke1433
No, Unity isn't meant just for indie games, and there are lots of tools being improved on Unity's side; Unreal gets the spotlight because their improvements are end-user-friendly concepts (graphics). AAA isn't about graphics or the platform, AAA is about budget; and Genshin Impact's budget was $100M, so it is AAA. Genshin Impact's graphics are stylistic; just because it doesn't look good to you doesn't mean it is lacking, bad, or not AAA.
Maybe learn the facts you are talking about before sharing conclusions that can misguide other people?
@@yokunjon LMAO Genshin, a triple A game... Sure thing, buddy
We're finally achieving that DirectX 9 tech demo level of visual fidelity :D
Thanks! 🙏🏼
I am thankful for captions.
I can't help but hear the "Book of Love" from the game "It Takes Two" when I hear your voice. Keep up the good work!
unreal lack of optimization, LETS GOOO!
They should make a tournament.
Oh, the UT99 old times...
The VR foveated stuff is an incredible addon to the api.
One of the most restrictive aspects of developing for VR is its limited processing power.
We usually have to cut corners just about everywhere we can.
As a guy who likes video games very much, it's very cool to see what future devs can do with Unreal Engine 5! Nice and short video
I'm surprised you didn't mention Nanite landscapes! That's my favorite part of the update so far.
oh yess!
In my opinion, Foveated Rendering is probably one of the most important performance improving pieces of tech. If we can get high speed/high quality eye tracking to be common in general, then it could help with even regular desktop monitor gameplay.
FYI that was VR specific
@@raphael9052 You mean the implementation or his comments?
@@raphael9052 he's saying it doesn't have to be. You could use the tech with VR or without it
So excited to be able to see asset streaming and shader compilation stuttering in even better quality soon!
Lol so true
I know this is slightly off topic, but I changed from Unity to Unreal after the whole situation with the runtime fee.
And honestly, I should have switched over a year ago.
I can't wait to see what else they'll have to offer.
The audio engine and Metasounds.
Godot is gaining traction fairly well for an indie dev engine.
The important part is not about what engine you use, but your skills as a developer in the first place. You need your assets, your design, your notes, references, concepts, and ideas first.
Then you prototype over and over again, refining it as you solidify what you want to develop in your software/game, you don't need to jump into the deep end with super complex or high fidelity AAA market engines.
They will act as if they are user friendly and such because they want you in their ecosystem; they couldn't care less whether you refine your development skills.
You realize that you jumped from one company-controlled walled garden to another, right? You might get bitten in a similar way again someday. Don't forget that Unreal got in hot water with Apple over the Fortnite in-app purchases and lost dev access, which could have permanently, seriously screwed over a ton of iOS devs using Unreal Engine. Just food for thought.
@@zeikjt "Be a good developer, not a free marketing tool to corporations."
they also just released their future development roadmap and it's absolutely breathtaking :D what a time to be alive indeed!
Where can I see it?
Edit: I've come across a video that talks about it.
@@simongravel7407 I'd love to tell you but it's getting blocked^^ you'll have to look it up yourself if you google the terms in my original comment along with ue5
I wonder if the friction between objects (like the tires of the Rivian and the ground) can also be simulated for individual objects in real time. Or is there just a "standard" friction per scene when it comes to things like simulating vehicle behavior?
Also, Nanite is coming to the terrain system! And should I say, NPCs and player characters aren't far away from getting their own Nanite support (vertex animation with micromesh is hard, but I've seen a paper implement that). So with neural compression for VDB and the upcoming path tracing reconstruction (yes, even for micropolygons like leaves and other small details), we can use neural decompression for both GI and virtualized geometry to run it on low-end devices!
I kind of want a realistic forest walking/exploring simulator. For example every now and then you will spot this mushroom and then the same kind later but it looks different. And then lots. And a berry bush which is rare. And there's some junk. And who spiked planks here?
I think you get me. I feel like there should be more games that focus on taking time to enjoy the new graphics while the gameplay is relaxing. And maybe get inspired by exploring real-life places, with details that we usually don't see in games.
I also think the title of that ideal game could be called "Chanterelles".
unreal engine is one of those pieces of software that are so advanced I don't even consider making a game from scratch with my own game engine.
Unreal!
Ha @1:05 I used that model for the Caustic Hardware Raytracing card demos as well years ago. It's gone full circle!
what a time to be aliiiiive!
...said AI voice calmly
finally, now we'll get games that don't absolutely destroy my CPU and actually use the $1000 GPU I bought. I hate how every big game is so CPU intensive these days that the GPU is barely the bottleneck anymore
Because the CPU is the central processing unit of the computer, it determines how everything is run; your GPU doesn't dictate anything except the graphical instructions the CPU gives it.
So if your CPU is too weak, it can't send instructions efficiently enough to handle bigger graphics rendering loads.
Without Nvidia owning PhysX, you couldn't offload physics simulations onto the GPU; they'd be done on the CPU.
Or rather, perhaps if PhysX were open sourced and widely available, efficient physics wouldn't be a black box behind a paywall.
They had to do something to justify intel's 3% generational uplifts lol.
Things that are compressed have to be uncompressed.
@@Jeremyzor Things that are intended to be used by the GPU are usually compressed with highly parallelizable decompression in mind, tailor-made to be decompressed by the GPU
@@dra6o0n Games don't actually require millions of operations per second; most games just don't optimize any more. Games like Cyberpunk don't need to render cars that are, for example, on the other side of a building, but they do. There are a lot of similar issues that are ignored in favor of releasing ASAP and fixing later.
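The compression point in this thread is easy to make concrete with classic GPU block compression. A sketch using the standard BC1/DXT1 figures (these are textbook numbers for that format, not the neural VDB compression from the video):

```python
# Block-compressed (BC1/DXT1) textures store each 4x4 pixel block in 8 bytes,
# versus 64 bytes for the same block in uncompressed 32-bit RGBA — a fixed 8:1
# ratio the GPU can decode per block, in parallel, at sample time.

def bc1_size(width, height):
    """Size in bytes of a BC1-compressed texture (no mipmaps, 4x4 blocks)."""
    blocks = (width // 4) * (height // 4)
    return blocks * 8

def rgba8_size(width, height):
    """Size in bytes of an uncompressed 32-bit RGBA texture."""
    return width * height * 4

w, h = 2048, 2048
print(rgba8_size(w, h) // bc1_size(w, h))  # → 8
```

The fixed per-block size is the key design choice: the GPU can jump straight to any block and decompress it independently, which is exactly the "highly parallelizable decompression" the comment above describes.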
Foveated rendering could be huge for VR.
It's available in the PSVR2
Unreal release notes 😮
100x is incredible
Is it Ren from Ren and Stimpy narrating?! Can I get a "Stiiiimpy, you iiidiot?!" 😂
the question is not "can you simulate everything?" but "can you implement it in real time in a game and maintain both performance and optimization with the current hardware at your disposal?"...
UE5 in VR within a few years is going to be insane
Super excited for foveated rendering. VR is amazing but its main downside at the moment is the rendering power needed and low resolution.
I think it goes past VR, you could use eyetracking on regular desktop monitors to increase performance too.
@@LanceThumping Can you imagine how horrid it would be to watch a streamer using eye-based foveated rendering while he plays? He would be fine, but his audience would be miserable haha
@@LanceThumping I imagine you don't even need eye tracking; you could get a decent performance uplift for "free" by combining it with existing depth-of-field effects
@@cheeeeezewizzz Sure but streamers are a tiny sliver of gaming compared to other players.
@@achillesa5894 That's a good point, it might not be something that will always be a big help but it'd be a cool way of getting free performance out of a game that puts depth of field or other blurring effects to good use.
I feel these videos have gotten a bit vague. I don't mean to be negative, but I do miss the more numbers-approach and apples to apples approach the old videos had
I think this was a more 'broadcasting new features', not evaluating them. It's more of a hype video. It wouldn't be a few minute paper if it had to discuss every one of those features in detail, it would've taken too long.
It's marketing hype. Listen to the way the person 'talks', and then listen to the way Jensen talks in his speech when selling a product to the audience.
If you 'feel' something is off, nothing is wrong with you; it means your sensitivity for detecting BS is working. Don't let marketing types brainwash or gaslight you into thinking differently.
The goal of talking points is to be vague and to let viewers/readers 'assume' things from their own point of view, driving up hype and letting a company exploit this perception (either in a workplace with interns, for example, or in a business front to get customers to buy into their programs).
I see Two Minute Papers still refuses to invest in a speech therapist.
VR games with AI NPCs in the near future are gonna be INSANE!!
How fat is your wallet for you to own hardware to run such simulations?
@@dra6o0n It's not even that outrageously expensive... Just have a decently paying job and don't live in a poor 3rd-world country. A roughly €3000 PC should be enough to run highly realistic VR games with AI NPCs at a high frame rate on the Varjo Aero (p.s. it's now a no-brainer headset ever since they halved the price; it's quite affordable for its quality. Why even get anything other than that?).
@@smokyz_ Inflation, the economy getting screwed into the ground, and rising joblessness mean not everyone can afford to splurge money every year. And it's 1st-world nations being impacted.
It was less than 10 years ago that you could build a PC for under $700 that could play the latest games.
@@dra6o0n You can still do that, but you'll just have to buy second-hand last-gen parts. The only thing I'd recommend getting new is storage, since SSDs/HDDs die fast by comparison.
I still rock a GTX 1080 n' i5-6600K and I can still play all the games I want. I don't play the latest AAA games, so it doesn't bother me if it can't run them at 60fps. Also, you don't need to buy a new PC every year, since that would just be burning money. If you're a lower-middle-income citizen, you're probably upgrading at 5-year intervals on average. By buying the best parts now, you can easily make it last 7 years or more.
Really hope to see adventures of one Draenei in U.E. 5.3
Master of mis-emphasis.
It's a pain to listen to but very interesting stuff on this channel.
Wait, I'm failing to understand how an orthographic camera is something new; I thought that was trivial lol
It is...
Orthographic camera was broken from UE4.0 to UE4.27. Most features need unique support explicitly added for orthographic rendering.
Yeah it was derelict. Seems like they revamped it. It was a much needed update.
@@ZacDonald cool
@@ZacDonald I doubt it was 'accidentally' broken; there is no such thing as an accident in software stacks.
I still find it odd that Unreal still doesn't make smoke particles stay around longer. A fire explosion's smoke shouldn't just disappear; fire smoke stays in the air for so much longer.
must be a quiet week
So the 'before' photo is just the saturation set to 30 out of 100?
00:06 Unreal Engine 5.3 is here with new improvements.
00:47 New ray tracing effects and better cloth simulations are now supported.
01:36 Simulations can be created or imported and rendered with a reduced memory footprint
02:15 Unreal Engine may soon have real-time ray tracing
02:56 Cloth simulations in Unreal Engine are now more efficient
03:34 Unreal Engine introduces orthographic rendering and camera rig rails
04:13 Higher resolution, frame rates, wide-angle lenses, and support for foveated rendering.
04:53 Improved rendering with volumetric fog and multi-layered materials
VDBs in a game engine at last!
I'm sure this guy narrated The Water Margin TV series (50 years ago ..)
In reference to the thumbnail... haven't we seen this exact thing for two years now, though?
When do we get The Matrix?
elevenlabs voiceover?
It seems like they're pouring most of their efforts into improving path tracing. While that's cool, I'm worried that Lumen might be pushed aside; their long-term roadmap shows barely anything Lumen-related. It doesn't perform well enough despite being an inferior solution quality-wise to RTX (screen traces are still a huge part of it, for instance), and it's still full of problems like ghosting, artifacts, and temporal issues.
The thought of unreal engine 10 both excites me and terrifies me
Not 100x less. 1/100x as much memory. 100x less is -99x as much. You could say 99% less memory, which is 1% as much memory or 1/100x as much memory.
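The correction above is easy to verify with arithmetic (the 400 MB starting figure is purely hypothetical, just to make the numbers concrete):

```python
# "100x less memory" vs. what is actually meant: the compressed footprint is
# 1/100th (1%) of the original, i.e. a 99% reduction.

original_mb = 400.0                        # hypothetical footprint
compressed_mb = original_mb / 100          # 1/100th as much memory

reduction = 1 - compressed_mb / original_mb
print(f"{compressed_mb} MB")               # 1% of the original
print(f"{reduction:.0%} less memory")      # 99% less, not "100x less"
```

"100x less" taken literally would mean subtracting 100 times the original amount, which goes negative; "1/100th as much" or "99% less" say the same thing unambiguously.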
Dear Károly!
Great and interesting content, as before. However, as others have also mentioned, the intonation is terrible. Please try to speak with normal emphasis. Thank you in advance!
Orthographic cameras... defo what all Unreal users were waiting for 😂
I wonder if someone has figured out tech that can convert 2D sprites (like the ones used for trees, grass, or fire in games like Minecraft) into 3D volumetric objects or voxels. Like trees and grass in older games, but made denser and volumetric with AI. So instead of having patches of 2D grass, it fakes having dense 3D grass.
thanks for sharing this
5.1: This update is insane!
5.2: This update is going to change everything!
5.3: Next Level!
5.4: Causes WWIII!
Where in the video is the strawberry SSS from the thumbnail?
Please post more Unreal Engine contents!
this guy is a certified internet explorer. These new features, btw, only work efficiently on RTX 40-series cards because they are still demanding; forget them coming to video games for another 10 years.
Where is my accurate realtime simulation of a translucent shiny balloon with powder inside popped by a candle in slomo illuminated with a soft spotlight from above???
>:(
Wishing I had a beefy computer 😢
5:33 A100*
Meanwhile in the "Unity" world: "yeah u gonna need to pay now and no we ain't adding those cool things"
didn't unreal just announce a price increase?
Unity is paid now? 😱😔
@@glenneric1 You're right. It's rumored they will be moving to a subscription service in 2024 😭
Edit: apparently only non-game developers will have to pay subscription fees.
Usage of Unreal is free if you develop a game with < $1 million in revenue
Otherwise it's a 5% rev share for gaming, and a subscription (much like Adobe) for the other industries
@@chrisfender8014 Yeah, but only in 2024. Tim Sweeney says they will be moving to paid usage if you are not a game developer.
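Under the terms as described in this thread (free below a $1M revenue threshold, then a 5% share), the royalty works out like this. A simplified sketch only: Epic's actual license has more conditions than this two-parameter model.

```python
# Royalty estimate under the terms described above: no fee below $1M gross
# revenue, then a 5% share of everything above the threshold.
# Simplified illustration, not the full license terms.

ROYALTY_FREE_CAP = 1_000_000
ROYALTY_RATE = 0.05

def unreal_royalty(gross_revenue):
    """5% of gross revenue above the $1M threshold, zero below it."""
    return max(0, gross_revenue - ROYALTY_FREE_CAP) * ROYALTY_RATE

print(unreal_royalty(800_000))    # under the cap: nothing owed
print(unreal_royalty(3_000_000))  # 5% of the $2M above the cap
```

Note the rate applies only to the revenue above the threshold, not to the whole amount once you cross it, which is why small indie titles effectively use the engine for free.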
The problem with all these new simulations, if they are to be used in real-time gaming, is that they need a 10x more powerful GPU. While I think UE5 and all of the new additions are great, the GPU industry is not yet ready, especially for gaming.
An upcoming version of Unreal Engine is in development using Intel's cross-platform ray tracing library Embree for all platforms, even ray-traced high-end mobile devices using the Vulkan API; Linux and Mac are also supported
cool
What a live to be a time!!
I mean...
😅
why does it say on YouTube Studio that you suggested me-
I want to say that he speaks in a fun-ny way
well now we all know what engine the real matrix will use
Right, AI engine.
There's still a lot of room for improvement. The skeleton editor inside Unreal is nice, but I can't see how I could possibly make fine face animations with it for a non-MetaHuman model. Animation content is for the Unreal skeleton only; transferring animations to even a slightly different skeleton is a pain and doesn't work correctly every time. I hope AI will help with that in the future. Tools for creating cool effects like fog are quite limited and not easy to use. It would be very cool if more stuff were added directly into Unreal in an easy-to-use format in the future.
well, skeleton, skinning and weight painting inside UE?
Why would people use Unreal for that kind of task in the first place? It's like using the wrong tool for the job.
Also, don't rely on AI for everything; AI is highly inconsistent, and it relies on stealing because it needs a gargantuan amount of data for training.
It doesn't make any sense that you got yourself a super state-of-the-art engine, yet you stained it with AI.
A good old function is way more reliable & predictable than using AI
@@jensenraylight8011 The problem I tried to point out is that Unreal makes a lot of nice solutions, but when you want to add something even a bit beyond what is already polished (and some things are not even polished lol), it becomes a nightmare very fast. For me the classic example is the capsule collision. It works well, but when I attempted to make it square I realised it cannot be done with a few clicks; I came up with a strange solution of attaching a box on top of the capsule and switching physics off and on at the start of a scene. (In Unity, for example, you can just attach a different collider and it's done.)
Now Epic has added a skeleton creator inside Unreal, and I can be sure that if I use that tool the skeleton will have zero problems with Unreal. But again, if you want something a bit more, you are forced to use external tools that may not have good compatibility. For me that is Blender, and to make a proper skeleton I faced a ton of problems: wrong root coordinates, animation issues, blend shape issues (I tried to add new emotions to a face and it caused Unreal to rebuild the skeleton, and all the work I had done on the skeleton was gone), and more I've already forgotten. To resolve them you need a proper setup, and that takes even more time and problem-solving. (Maybe I just needed to use something like Maya, but still, for a regular user like me it is very hard to set up a custom skeleton even with tons of material on the internet; most of it just says to copy-paste the UE Manny or make your model similar to it, which seems to be the only possible approach for me.)
AI in the shape it's in now can't solve these problems, but I can personally see it helping to automate processes such as animation handling in the future; there are already promising papers about it. (I bet even when it comes true, it will only work with a pre-made setup for years lol)
I think they can ditch the 'un' in Unreal Engine soon :O
The foveated update is going to be a massive update for VR
Indeed.
Can also be used for better DOF as you know where the focal point is.
To me the biggest failing of “3D” movies has been this. Either everything is in focus which looks unnatural, or DOF is baked in and static. The first approach works well for animated movies as they are already unnatural, the second approach can work well but requires directing the viewers attention to the correct focal point.
It would be cool to see media for VR headsets that includes a Z depth stream that could be used to properly blur the image based on what you are focusing on. Unfortunately since at the moment this is a niche market I don’t see this happening any time soon.
"One hundred ecks" - Do you mean 'times'?
simply amazing. What are the games of the near future gonna be like...?
What a time to be alive!!
Just introduced orthographic rendering? That doesn't sound right to me
What about Unity ?
This is also why Epic changed the Unreal license so that it's more expensive if you're using it for work outside game development
It's amazing what can be achieved with just code. Most of my poorly-written python scripts take several minutes to do basic math. That's how bad my code is. Yet, on the same hardware, Unreal Engine can simulate tyre pressures, dust storms and billions of light rays in just a few milliseconds 😳
so when will we see any of these new features in new games? 2027? 2030?
Take a look at the all new features of UE5.4!
after all these experiments, maybe someday people will re-learn things from the simulated world...
like how they lived in a home...
beamng still has better suspension ;P
I wonder if my 2070 can render any of these scenes :p
yes (be patient)...but try a GPU-waterblock. 2070 is a "beast" at its cost.
rails for cameras appeared a long time ago, back in version 4 of Unreal
the rail system has been revamped and is more feature complete now.
A little off topic, but it would be interesting to see a developer build AI into a game engine to map a more realistic face onto the characters in the game (see Corridor Crew’s recent video). Being built into the engine as part of the facial animation they could create a latent space of emotional expressions and include variables for the latent space coordinates into the animation key frames to augment the base animation.
Seems like with those tricks you should be able to generate much more convincing characters without having to brute force all of the details.
I don't know how they achieve that level of compression, but we need more things like that, before games get to 300 GB on average.
While Unity self destructs, unreal engine does this 😂
I see the negatives: if Unreal Engine picks up and no one can compete, it will lead to an era where developers are at the mercy of Epic's whims, and you never know if they too will add paywalls to their software stack.
Like a 'Microsoft'-style software monopoly, they will carefully curate what you can see or access, and play social-engineering games with online communities to slowly shift your psychological response to their news and media.
Maybe in x.1 you rely on a function to make your game work... oops, they updated to x.2 and that function is gone; now you've got to redo your game.
This happens a lot in a project's life cycle since a project can last for several years.
2030s gta will look like this 😂
I still can't tell if your voice is AI or carefully crafted for entertainment. Genius!
Ah yes, the next level tech of weight painting and orthographic rendering.
Margot Robbie: did you say OpenVDB?
Cool tech, but the widely available hardware will be ready in about 5 years. Till then it's classic tech hype 🤷
Just give me 6.0!
We need eye-tracked foveated rendering on monitors to increase performance. This would require some high-speed cameras: you would need at least a 240Hz camera if you are targeting 120 FPS in game.
Asynchronous reprojection on monitors would fix the problem of low frame rates too; people get tricked into thinking 30-60 FPS is more like 120-240 FPS. Combine it with AI to make the warped frame more accurate and it could be better than Frame Generation.
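The 240Hz-camera-for-120FPS intuition above can be sanity-checked with a toy latency model. Everything here is an illustrative assumption (in particular the fixed 2 ms processing delay); real eye trackers have more latency sources:

```python
# Worst-case eye-tracking staleness for a given camera rate, compared to the
# game's frame budget. Toy model: latency = one camera sampling interval
# plus a fixed processing delay (the 2 ms figure is an assumption).

def tracking_latency_ms(camera_hz, processing_ms=2.0):
    """Worst-case age of the gaze sample when a frame starts rendering."""
    return 1000.0 / camera_hz + processing_ms

def frame_budget_ms(target_fps):
    """Time available to render one frame."""
    return 1000.0 / target_fps

budget = frame_budget_ms(120)   # ~8.33 ms per frame at 120 FPS
for hz in (60, 120, 240):
    lat = tracking_latency_ms(hz)
    verdict = "fits in" if lat <= budget else "exceeds"
    print(f"{hz:>3} Hz camera: {lat:.2f} ms latency ({verdict} one 120fps frame)")
```

Under these assumptions, only the 240Hz camera keeps the gaze sample fresher than one 120 FPS frame, which matches the comment's rule of thumb that the tracker should run at roughly twice the target frame rate.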