i did change the thumbnail a little after the fact, sry :). & the Tomb Raider test was backwards. It's funny because they were so close that I messed it up in the edit. Ig that kinda says something
Raytracing is pretty new and not really worth it right now, but it can add a lot to a game just by adding real-time reflections and shadows. It could probably also be used for real-time sound tracing for more immersive audio in the future.
I think on the Tomb Raider one you can tell by the shadows on her back. In raytracing, the shadow cast by the hair is sharper while the shadow formed by the shape of the back itself is softer. In rasterised mode, it all looks the same...
Like 99% of the time it's not worth it. Been playing Spiderman on PC and there ray tracing makes a huge difference: when you end up close to buildings or climbing on them you get functional reflections. SSR, cubemaps, or other rasterized methods can't handle stuff like that.
I think people are being too hard on it cuz it makes them look cool.. not accusing you of that personally, but I get that vibe a lot from some people who are like 'it's a dumb meme' or whatever. My logic is comparing RT on/off to 2K vs 4K resolution. I personally don't notice 4K over 2K that much unless I'm sitting really close to a relatively large monitor and scrutinizing it. Furthermore, 4K hits your performance MORE than RT (for 30- and 40-series Nvidia cards at least, which you will most likely have if you want RT), and the monitor you have is GOING to have a worse refresh rate and such at 4K than at 2K, unless you fork over an extra GRAND (which would instead get you a graphics card that CAN handle RT at over 200fps at 2K or whatever) over a good 2K monitor. Nobody says 4K is a meme, but I think 4K is much more meme-status than RT imo.. when RT is actually implemented
raytracing is actually more useful for artists, because we won't have to fake stuff (which is hard) to make things look real. Raytracing will do that job for us and we will be able to focus on other areas to improve the art. So eventually, when it becomes fast enough, we will ditch rasterization.
@@PringleSn1ffer RT has existed in video games for at least 30 years, as pre-baked lighting. RTRT doesn't add anything compared to traditional methods because its performance costs are so tremendous it's never worth it.
@@SedatedSloth @4R8YnTH3CH33F @attractivegd9531 @nevermore6954 All of you are on the wrong track. As an environment artist, we have to fake lots of lighting to make things look real in a video game, even with baked RT lights. Baking alone does not give better results. GI, reflections, shadows, subsurface and so on only look great in baked scenarios where stuff in the game isn't moving. If you have ever played video games you might have noticed how many moving parts there are in a game; that is why we artists need real-time RT, so we don't have to painstakingly fake lights for hours to make a single area of the game look real. Current raster methods sure aren't realistic, but in the heat of the moment you don't notice a difference between real-time RT and raster. This is due to the limited use of ray tracing, which again is due to immature technology. In the future, when GPUs are capable enough, we will ditch raster rendering.
@@attractivegd9531 Prebaked lighting's limitations are numerous and quite obvious, which is why the industry has been moving away from it for years. VoxelGI solutions have pretty much been the standard for last-gen consoles. Realtime raytracing is the future, and although it is still out of reach for many gamers, we are moving in the right direction.
The problem with RT being adopted by game developers instead of rasterisation is not the artists, but the business model. Imagine you're the CEO and you're asking your project managers how the developers are doing with the latest game. They say they've ditched rasterisation and are solely using RT, which has cut development time/costs by about 20% because they don't have to waste so much time faking light sources with rasterisation. So far, so good. You ask if there are any downsides to this approach: well, only 20% of your potential market for that game can afford graphics cards powerful enough to run RT, so you're probably going to lose 80% of sales. See the problem? That's why RT being useful for artists is never going to gain traction in the gaming world while most of the market can't afford graphics cards that can run max settings with RT on.
For me the problem with raytracing is that after a while you stop really looking at the graphics and environments and just focus on the game. It doesn't matter as much as people think it does.
I agree. I tried to game in 4K or 2K with good graphics, and then I thought: why... do I need this? I was fine with 1080p or 720p with great colors and performance! I wasted my money :/
One of my biggest problems with ray tracing is how some scenarios make it terribly difficult to tell the difference while there is still a huge performance hit. Even with still images. Like walking somewhere outside with heavy clouds and no highly reflective surfaces nearby: there won't be many shadows or reflections to ray trace. That's why I feel some devs tend to overdo reflections and make everything look shiny, wet, or chrome-coated, which ends up not looking natural at all. One of the best examples is Hogwarts castle. Some floors look ridiculous with RT and I much prefer the rasterized reflections.
Perfect reflections are easier/computationally cheaper to ray trace compared to rough or varied roughness reflections, so until we get faster hardware across the board, developers are going to focus on the cheaper effects.
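The cost gap described above comes down to sampling: a perfect mirror needs one reflected ray per pixel, while a rough surface spreads reflection over a cone, so many jittered rays must be averaged before the noise disappears. A toy Monte Carlo sketch (the 0.8 radiance value and the noise range are made up; real renderers importance-sample a BRDF lobe rather than using uniform jitter):

```python
import random

def trace_one_reflected_ray():
    """Stand-in for tracing a single reflected ray and shading the hit.
    Returns a noisy radiance estimate (values are invented)."""
    return 0.8 + random.uniform(-0.2, 0.2)

def reflect(sample_count):
    """A perfect mirror needs sample_count == 1. A rough surface needs
    many samples averaged together, so cost grows linearly with
    sample_count while noise only shrinks with its square root."""
    return sum(trace_one_reflected_ray() for _ in range(sample_count)) / sample_count

random.seed(0)
mirror = reflect(1)    # one ray: cheap, but only correct for a perfect mirror
rough = reflect(4096)  # thousands of rays: converges near 0.8, at 4096x the cost
```

This is why early RT titles lean on mirror-like puddles and chrome: one ray per pixel is affordable, a converged glossy reflection is not.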
Part of the issue with that is most devs don't implement fully raytraced environments, because it's just so damn expensive. Most of the time it's just raytraced shadows, MAYBE reflections, and on the very rare occasion global illumination, and only then does it really start to make a difference, as rasterized shadows and reflections had already become quite good hacks (though the workflow for those hacks is much worse than doing raytracing). Games should have to state the level of raytracing, instead of just "this game is raytracing enabled".
For the last 20 years, developers and graphics designers have mastered the approximation of many rendering effects with good-enough quality for real-time graphics. Most reflections can be done with portal rendering, planar or spherical mapping, and even screen-space methods, where appropriate. Shadow maps have already been perfected in pretty much every engine out there. Dynamic GI is one area where real-time RT brings tangible benefits, and where game devs have chronically struggled to come up with universal solutions using various light-transfer hacks. For the time being, a cost-effective and carefully tuned hybrid implementation of RT is the way to go.
Planar reflections can only be used on flat surfaces and are more expensive than raytracing unless used very sparingly. Cubemaps are incredibly inaccurate and can only approximate the general lighting conditions. Screen-space is full of artifacts, breakup, and temporal instability, and can (by definition) only reflect objects on screen. Reflections are by far the worst part of rasterised visuals. They're messy, full of artifacts, don't make logical sense, and are just terrible. Even games like Forza still don't have cars reflecting themselves, which makes everything 'glow', because internal reflections are impossible without the mess of SSR.
When RT was new, it looked to me like the main benefit was that it was less work for game devs than having to use a bunch of those techniques, but until everyone can afford a decent raytracing GPU, it probably won't catch on
@@Daniel_WR_Hart it already is catching on tbh, it's been a slow start but with consoles being able to pull it off, everything is starting to use it. Unreal Engine's entire lighting system has been completely overhauled from the ground up to use raytracing! That's the most popular game engine out there. It might take time for UE5 games to release but trust me, it's already caught on.
@@Daniel_WR_Hart this is the real answer. Once GPUs that can render RT at minimal performance cost are something most people own (this is going to take a lot of time, but NVIDIA and AMD had to start integrating the hardware at some point for it to actually progress), games won't be developed WITHOUT ray tracing anymore. It will be the only way to make games, because it will simply be easier and better.
Bro, the fact that the most powerful gaming GPU on the market (4090) currently needs an upscaling method (DLSS) just to run Cyberpunk 2077 at 4K 60fps with RT shows what kind of hit to performance it actually is
You're making it sound like it's an issue that a 4090 struggles with RT at 4k 60 FPS but .. what proportion of people are actually trying to game at 4K? Lol
@@shamadooee bruh it doesn't matter if it's 1% or 10% or the god damn 50%, 4K displays have been around since 2001, and in 2015 you could build a 4K rig for $1000 (without any DLSS or that bullshit). The fact that billion-dollar corporations couldn't give us anything capable of running alongside these 4K monitors at a proper price/wattage is a god damn disgrace. And even what I said about 4K doesn't matter that much: ray tracing at 1080p is extremely taxing. Even the RTX 3070 and 3080 struggle with mainstream games running RT at 1080p. It's so taxing and unoptimized that it makes me think, wtf?
@@shamadooee Have been using a 4K screen for 8+ years now. It was 350€ back then. Later I upgraded to one with a better panel for a bit more. So you're telling me people who buy an $1800 graphics card don't have a half-decent monitor, and it's not a problem that they can't run it at playable fps?
It would run worse if those pathtraced results were done with raster. That doesn't scale well. The fact that it runs in frames per second and not seconds per frame is a miracle, and even industry experts didn't really think it was possible, even on a 4090.
One of the biggest advantages of RT doesn't even have anything to do with fidelity. It makes things significantly easier for artists when they don't have to hand-make all of the lighting and can use RT to let math decide how the scene should be lit, only having to worry about light sources and their properties.
Ray tracing can look incredible, depending on the implementation, but... my main issue is that developers seem to have forgotten about their art style and instead rely too heavily on RTX. For example, I've seen many recent games that use RTX reflections to make bodies of water look pretty, but when it's disabled there are no reflections whatsoever, not even a cubemap or anything; or games that use it for shadows, but when RT is disabled they have really bad-looking, jagged shadows that had no care or effort put into their implementation. It kinda shows how some devs rely too much on the technology and use it as a buzzword to sell more copies
4:04 - That "mushroom shaped tree" in the center frame really captures my feeling about RTX right now. The shading on the left is just soooo much better, very Pixar-esque. I think raytracing right now might be kind of like bloom back in ~2007, meaning it feels like games just "turn RTX on" blindly, and it ends up hurting the art direction more than anything else. Maybe after some time, more devs/artists will be able to utilize it more effectively and with more intention, but yeah, as it stands right now I think RTX tends to hurt games more than it helps. Even with The Witcher 3, there's something about how RTX interacts with the textures, or bump maps, or something like that, which just sticks out to me as "not right".
I agree. It's kinda just tacked on and not thought out in most games. However, I will say that Fortnite benefits from it. That tree is just a bad comparison because of the change in lighting direction. In the rasterized view, the lighting is coming from the right, highlighting the shading and depth. Whereas with RTX, it's coming from behind the player, hiding the shading behind the tree and making it look flatter. Honestly, this entire video was pretty bad at showing off what RTX is even for. Static, open environments are the worst place to showcase RTX. It is most beneficial in interiors and dynamic environments, hence why I say Fortnite is a good choice for RTX. Lots of dynamic interior spaces.
Lols, this comment aged extremely poorly. Cyberpunk just released demos of their new overdrive raytracing setting and it looks absolutely spectacular. Guaranteed the 5000 series of nvidia cards will be able to do actual pathtracing reliably with dlss 3.0+ frame generation. ua-cam.com/video/I-ORt8313Og/v-deo.html
@@nathandam6415 Even a 4090 right now is pretty reliable at 1440p. Can comfortably get into that 100-120 fps sweet spot without the game looking terrible with path tracing. With more optimization I'm sure path tracing can be implemented well in other games and be playable even on a 4080 or 4070 Ti.
I like RT. But I don't actively use it in every game that has it. It is nice to look at, and depending on the game it can make a huge difference in looks. But I feel like, especially in modern games that utilize modern rasterized graphics and shaders, you often run into an area where the visual improvement simply doesn't justify the immense performance cost. For some older titles such as Portal, Half-Life or Quake that got an RT-enabled port, RT can breathe completely new life into the game, together with an immensely more convincing atmosphere despite the otherwise aged looks, due to how simply lighting was constructed and calculated in older engines. They also have the available performance headroom to make RT playable and viable. IMO this is where RT really shines, and I hope that with the advent of NVidia Remix we'll see a lot more examples like these.
Games like Half-Life, Portal and Quake don't use the typical RT you see in modern games, but a whole new path-tracing-based renderer that is more advanced than the RT in modern games like Metro, CP77 Psycho, etc. The only modern game that supports a similarly advanced renderer is CP77 in RT Overdrive mode
Doing a side by side for Cyberpunk 2077 would have been nice because of how nice RTX is done in that game. There's quite a few games that have great implementations of RTX that have quite the difference in fidelity. I loved RTX the first day I bought a 2080ti after upgrading from a GTX 1650 Super. Now that I have a 4080, I want all the RTX.
Metro Exodus, Cyberpunk 2077, Witcher 3 Remastered, Control, Doom Eternal, Quake 2 RTX, Portal RTX, Resident Evil Village, Minecraft RTX etc. All these games are a night-and-day difference with ray tracing on. I really like RTX and DLSS. My 2070 Super is showing its age but I don't want to spend $1,200-1,500 on a GPU. I used to be able to build an entire mid-high-end PC for that hahaha. Eventually I will buy one though. Probably just wait for the next-gen refresh of the 4000 series like I did with the 2070 Super.
@@jonny-b4954 Prices do suck now, but hopefully the 4080/90 will be found cheaply on the used market in a year or two. I'm thinking about downgrading from 4k to 1440p and grabbing a better CPU just so I can run RTX games a bit smoother
@@Glubbdubdrib They're atrocious! That's how it goes though. Inflation and supply/demand, but once you add in covid, companies have great cover to raise prices even more. I've been considering upgrading from 1080p to a 1440p monitor for years. I think 1440p really is the sweet spot. You're getting diminishing returns after 1440p and it's a nice middle ground performance-wise. I'm often shocked how much more FPS, say, a 13900K gets vs a Ryzen 3700X, for instance. Just insane. I'm probably switching back to Intel next upgrade.
I had a tough time trying to find the difference in the Witcher 3 test. Now I understand the artistic side vs the raytraced one; sometimes the artistic one is better.
I have only used ray tracing in a few games, and it was on a 2070 super so not exactly the best for it. But I think ray traced reflections are great, normal screen space reflections look good but it's annoying when they disappear as soon as you can't see the object being reflected. [I would still go for the higher frame rate at the moment though] I couldn't give a shit about the shadows or lighting [ in most cases]. I usually prefer the art designed shadows to the ray traced ones. They might not be as realistic but they often look better to me. In that witcher 3 scene you showed I preferred the rasterized version but that could come down to time of day in that game.
@@AlexBarbu wrong, just look at masterpieces like Forza Horizon 5 first and then say something; the rasterization used in that game looks almost the same as with raytracing enabled
Open world single player games are perfect to play with RT on. Currently Metro, Cyberpunk, Far Cry 6, Witcher 3 are all games where RT is totally worth enabling.
I agree on most of these except for Far Cry 6. Their implementation is really quite awful imho. The RTGI shader from "Pascal Gilcher", which is nothing more than a screen-space solution, doesn't cost more than 3 fps on an RTX 3070, runs on any graphics card, and looks better than the implemented RT.
As someone who been gaming since the 90s and have gamed on every single platform, even when I was on tour in Iraq and Afghanistan 😂 I personally don’t care about Ray tracing, i feel like it would be a nice feature if it didn’t have such a performance hit and such a price hike. But as it stands to me it’s not all that. Games look more and more beautiful every year, and honestly when the game is good and already looks so good then honestly I ain’t finna check for shadows or anything like that.
It's not worth the performance hit in actual gameplay. But I like it for games where I can slow down and appreciate the work put into the visuals. Global illumination and RT lighting feel more like style choices in a lot of games, on vs off; it's not necessarily about looking better per se. CP2077 in specific spots definitely has a more "cyberpunk" vibe with all the lights when RT lighting is on the higher settings. But I prefer how screen-space reflections look in the game over RT in a lot of places too. The problem with screen space is how reflections disappear if you change the view angle. It's like pop-in for reflections; it can break immersion if you see it. Sometimes the stylistic choices made by the devs just work better. The shadows in The Witcher may not be realistic, but the look fits.
Path-traced Cyberpunk 2077, why not include it? It looks jaw-droppingly awesome, and yeah, the performance hit is massive, yet an RTX 4060 can run it (with DLSS frame generation) at 60fps. So we can already see how much better ray tracing is compared to rasterisation.
Rasterization often uses screen-space reflections, which means that only objects in sight get reflected, to gain more performance. Shadows are just textures (shadow maps) that get calculated in real time and then overlaid onto the scene to fake the impression of shadows. In raytracing, you shoot a ray from the camera position into the scene and look for an intersection; shadows get calculated by shooting a second ray from the hit point toward the light source and checking if it hits anything on the way.
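The shadow-ray test described above can be sketched in a few lines of Python. This is a toy with a single spherical occluder; all coordinates and the sphere are made up for illustration:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest sphere intersection, or None.
    Solves |o + t*d - c|^2 = r^2 for t, assuming d is normalized."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-6 else None  # small epsilon avoids self-intersection

def in_shadow(point, light_pos, occluder_center, occluder_radius):
    """Shoot a shadow ray from the shaded point TOWARD the light and
    check whether anything blocks it before reaching the light."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    d = [x / dist for x in to_light]
    t = ray_sphere_hit(point, d, occluder_center, occluder_radius)
    return t is not None and t < dist

# A sphere at (0, 2, 0) sits between the origin and a light at (0, 5, 0):
print(in_shadow((0, 0, 0), (0, 5, 0), (0, 2, 0), 0.5))  # True
print(in_shadow((3, 0, 0), (0, 5, 0), (0, 2, 0), 0.5))  # False
```

Unlike a shadow map, this query is exact per point, which is where the sharp-vs-soft contact shadows people mention in these comparisons come from.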
That's really a genius thing to do - showing a scene with almost no shadows on screen in the game where the RT effect is shadows (SOTR). P.S. RT is on the right, not the left, wtf is that?
I prefer the rasterized versions of almost every game cause I just love how deep the shadows are. Idc if ray tracing is more realistic, that almost black shadow is so beautiful in some games.
Honestly I couldn't care less about ray tracing, all I want is a s*** ton of shadow-casting lights, and realistically rasterization can't do that well, but ray tracing can
I'm not even kidding, there are some game trailers out there that have the "RTX off / RTX on" comparisons, and I literally cannot see _any_ difference between them. This even though these trailers are supposed to specifically demonstrate the difference! Of course then in games themselves it can be difficult to see where the "raytracing" is supposed to be happening. There are some games where it's extremely obvious and cool (such as Control), but then there are other games where it's a real game (hah!) of "spot the difference". For example Deathloop, at least on the PS5, is supposed to have raytracing support. However, the _only_ effect that turning it on has is to tank the framerate so low as to make the game literally unplayable (and no, I'm not one of those snobs who thinks that 30 FPS is "unplayable". I'm not exaggerating here. Try it if you don't believe me. The game literally cannot be played when "raytracing" mode is turned on. The input lag is astonishingly bad. It would be bad even for some slow-paced turn-based tactics game, it makes it literally impossible for a fast-paced first-person shooter.) And the kicker is: There literally is no visual difference between the normal and raytraced modes. Again, I'm not exaggerating. I have taken screenshots of the same locations in both modes, locations that ought to have some differences in them (significant reflections, shadows, lighting, etc) and switching back-and-forth between the screenshots shows literally no difference. Clearly the game is doing _something_ because the framerate is so horrendous in the "raytracing" mode, but visually there is literally no difference. Such a scam. (And the worst thing is that I bought the game precisely because of the alleged raytracing support, which is rare on the PS5.)
Main benefits of ray tracing are mirror effects like water reflecting, and actual mirrors, etc. That's where I can really notice it, especially once you know what rasterization limitations are.
Lmao no. Raytracing is a better rendering method which basically simulates light rays to create physically accurate results, while rasterization just draws shapes on screen and then fills in colour like a small child
I'm dying to see a TRON reboot multiplayer. Raytracing would seem the perfect choice, but it would be the worst game to have it. I want games to go back to oldschool roots, not film adaptations like The Mandalorian
I think RT is a miss. Watching the DF guys looking at proper path tracing in Cyberpunk, it just shows the current RT is a half step; it's too neutered down for the hardware. I think full proper path tracing is the future, but because it makes development faster, not really because it looks better. And at the end of the day, the big issue is it doesn't make any difference. Proper 3dfx cards enabled several genres of games; sure, you could do an FPS with sprites like Doom, but walking into the world of EverQuest, a fully realised 3D world... that was the dawn of a new epoch. Path tracing is just not going to do that.
I'm a genius according to you. I spotted every single example of raytracing. But I'm very detail oriented and very picky, so I would obviously be able to tell in most scenarios. Edit: I should note, I also think ray tracing is a gimmick rn. It's not usable until we have 4090 raytracing performance on a $300USD Card.
Raytracing is great for movies when realism is desired. In games, there is no need for it, you are trying to kill the enemy not admire how realistic your character's butt looks like.
@@SalimShahdiOff I know. What I meant is that they should spend more time making games that are fun to play instead of games that look super realistic. Animal Well is an excellent game, while Hellblade is basically a movie.
As an illustrator, I can easily tell which side is the raytraced one, and RT GI/AO usually looks way better than rasterized GI/AO, especially in open-world titles. If you are playing a single-player game with a 3080 Ti-or-better tier GPU then you should try ray tracing IMO. Still no reason to use it in competitive games though, until we can get a 3090-tier mid-range (50/60) GPU.
@@joshuagoldshteyn8651 I'm hoping the 5060 will at least equal 4080 or 4070 Ti performance, because the 3060 performed like a 2080, the 2060 performed like a 1080, the 1060 performed better than a 980, etc.
Well, today I learned that I don't give a shit about RT global illumination, as I picked the raster version as more visually pleasing in 2 out of 3, lol. I think ray tracing for reflections is really, really cool and way more obvious/impactful. Agree that it's not worth caring about for now, as the performance cost is way, way too high for the vast majority of users. Give me the frames instead.
I think once RT has completely taken over rasterized rendering (which might be about 10-15 years in the future), a lot of people will become nostalgic for the handcrafted, artistically deliberate nature of rasterized graphics... You can actually tell this is already happening when people play new games developed in old-school game engines, as developers are painstakingly deliberate with everything they do in level design. Many people feel, myself included, that a lot of the artistic talent is lost in games when we start to rely on large photogrammetry-scanned libraries like Quixel, where assets are copy-pasted from a library rather than being handcrafted. The same goes for lighting. Just have a look at Ion Fury and how deliberate the lighting is; designers even construct the map around the lighting, putting it into the actual vectors of the map. Or take Wrath: Aeon of Ruin, which uses the Quake 1 engine. It looks so moody and brilliant, yet it still uses GPU and game-engine tech from 1996. Today, RT is a nice curiosity, and great in some places, but I am afraid it will become as bland as the current AAA gaming market.
I kinda disagree. There is a lot of stylization that can be done with RT. Just look at the CG movie industry and the things being done there. I do bet there will be some nostalgia as there always is for stuff like this but I don't think it will be bland. There is a ton of stuff that can be done with Ray tracing but it will rely on more artist driven stuff like textures, materials, and models. Just because you go ray traced doesn't mean you have to strictly rely on photo realism. Ray tracing can be perfectly suited for more cartoony or stylized stuff as well
@@crestofhonor2349 Yeah, but you still won't have the flair of old. It's like CG vs non-CG. He is 100% right about the old-game thing; old game style is extremely popular at this point.
@@crestofhonor2349 I don't understand why anyone would use RT; it doesn't look realistic at all. You need path tracing to be realistic. Until path tracing is implemented, no-RT nearly always looks more realistic and performs better. Those shiny and reflective surfaces, you don't see them in real life. Search for a river/sea picture on Google and compare it to a ray-traced fake reflection; see how dumb it is.
@@christianwilliam1167 Are you an idiot? I'm talking about CG vs non-CG and old games vs new games in terms of aesthetics. A ray-traced really old game shouldn't look like a really old rasterized game, due to how basic rasterized effects are. So you lose something, you gain something. Same as with movies, CG vs practical effects. Also, by old I mean games like Quake, Doom, etc. However, even games going for a realistic art style have the same thing going on: something is lost aesthetically in The Last of Us 1 remaster vs the original PS3/PS4 version. Also look at Ultrakill and Dusk. And in this video and other rare instances, ray tracing makes modern games look worse, because art is as important for visuals as the tech. Also, read the OG commenter's first couple of sentences.
The main benefit of ray tracing is that it takes less work on the development side to make things look good. But since most people aren't using raytracing, it doesn't really matter, cause they'll still make it work with rasterized rendering anyway. With ray tracing I can just throw a few lights in the scene and bam, done; with rasterization it's a bit more work if you want it to look good (a lot of developers don't put the work in to make it look good). Rasterized renders are pretty decent out of the box at this point, so people don't put in the extra effort, and most consumers don't seem to notice. But I notice. I know you used the default settings in Unreal Engine 4.
From what I heard, if you go full ray tracing, it also makes it easier to develop the game, because making a rasterized game look good is a lot of work. So in the future we might start seeing titles that are ray-tracing only.
RT-only games will only be possible when the entire market has GPUs capable of doing RT optimally. Otherwise, even if the game is easier to develop thanks to RT, it won't sell well, because nobody will buy it since nobody can run it.
This isn't correct. Reflections are actually worse in rasterization. The way reflections are often done for puddles and water sources on the ground relies on flipping the image around 180 degrees and basically filling in the reflection that way. It's very flawed: it doesn't show objects that aren't on screen. If you looked down at a reflective surface and a view of trees disappeared from your line of sight, their reflection would disappear too. Secondly, reflections in transparent objects like glass also suffer greatly, as rasterization relies on inaccurate cubemaps as an approximation that doesn't appropriately represent what is being reflected.
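The "flipped image" limitation described above can be sketched in Python. A toy one-column "framebuffer" of row labels stands in for rendered pixels; the frame contents and function name are made up for illustration:

```python
def ssr_lookup(framebuffer, x, y, surface_row):
    """Screen-space reflection sketch: the reflection seen at pixel
    (x, y) is approximated by mirroring its row across the reflective
    surface's row and reusing the already-rendered framebuffer.
    Anything whose mirror falls outside the frame simply has no
    reflection - the exact pop-out artifact described above."""
    mirrored_row = 2 * surface_row - y
    if 0 <= mirrored_row < len(framebuffer):
        return framebuffer[mirrored_row][x]
    return None  # reflected object is off-screen: reflection vanishes

# 4-row "frame": sky, a tree, the water surface, then water below it.
frame = [["sky"], ["tree"], ["water_surface"], ["water"]]
print(ssr_lookup(frame, 0, 3, surface_row=2))  # tree: water reflects the tree above
print(ssr_lookup(frame, 0, 3, surface_row=1))  # None: mirror falls above the frame
```

A ray tracer would instead intersect the reflected ray with actual scene geometry, so off-screen objects still show up in the reflection.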
Honestly... I like the idea of raytracing, but yeah, it doesn't do too much for me. I remember switching it on and off in Cyberpunk and didn't notice anything on a major scale.
@@alals6794 RX 6650 XT. My graphics card isn't really built for ray tracing, but when I've seen it on a PS5 the difference is so little I can't notice it. Had problems before, but that was because my CPU was pretty old; got a newer one and loving Cyberpunk. Do have some frame dips in the DLC area, but other than that it still looks amazing to be honest. Been playing some God of War and it dips in the main area with the gates... kinda understandable though because it's rendering so much and it's a huge area. Considering I've seen the PS5 version, and the same with Cyberpunk, I can't notice ray tracing too well (turns out the PS5 has raytracing, although I don't know if performance mode affects it). Did tell my partner to put it on quality mode and she couldn't do it xD Once you go 60 frames you can't go back to 30
The thing is too... games are art. Art is about illusion. Some of the most dramatic and beautiful scenes and moments in gaming, are built around heavy use of illusion; sky boxes with pre-rendered backgrounds, levels intelligently culling detail so that they can optimize performance but imply far more than is really there, tricking the imagination. Just look at some of the crazy stuff they did to make Dead Space 2 work, for example, with entire level cityscapes appearing or disappearing in the background... This is the real artistry of gaming. To take what isn't even remotely real, and make the player think it's real, and to do so in an efficient way. Remember when Star Citizen was proud of the fact that it had re-invented the way FPS games handle the camera, and had stuck a camera in the character's actual head, and then discovered that the character's movements were chaotic and unpleasant to look at, so they created a new way to stabilize the camera in the character's head so that it looked... uh... like the old FPS cameras did? That's RTX. Flinging a lot of processing power at a problem that was already solved by the artist. I'm playing Metal Gear Solid 1 at the moment, the PS1 version emulated. I'm not doing anything fancy to the visuals; I have a CRT filter on, that's about it. You know what really strikes me about the game? The lighting is beautiful. Oh, I can see that it's not "real" lighting; if I were to guess, I'd say most of the lighting is baked into the textures themselves. But the choice of colours, the striking way the game uses neutral shades of grey and blue and green to somehow accentuate what matters, and make things stand out, even when the colour palette is so subtle? Beautiful work. Distinctive, and memorable. And very, very deliberate. What is it, exactly, that RTX offers... except the automation of that which artistry had already conquered? And through that automation, are we not in danger of losing that deliberate artistry? 
In the Witcher 3 example, the raytraced version demonstrated that, if we're going by "realistic" global illumination, the reflective qualities of something as simple as pale stone result in the lighting getting washed out; no more high-contrast shadows serving to draw attention to the player's movements through the world, just a whole lot of glare. And sure, that glare is realistic. But games aren't meant to just be "realistic". The lighting of a scene is supposed to be like the lighting of a movie, or a painting; it's supposed to serve specific artistic purposes, not just capture how things should realistically look.
2:33 To clarify, raytracing usually isn't rays from the light sources; it's actually done in reverse. For each pixel on screen, a ray is traced from the camera/eye out into the scene and bounced until it either hits a light source or doesn't, and that information is then propagated back down the path. This drastically cuts down on cost, since you only trace what would actually be visible. It isn't 100% physically accurate, though it's close enough. When you want high physical accuracy, that's where "path tracing" comes in: rays still start at the camera, but full multi-bounce paths are traced with many stochastic samples per pixel, so even light arriving from far-distant sources gets bounced and scattered over the objects. It's quite expensive, though. Cyberpunk just implemented path tracing in their update today, which is exclusive to 40-series cards and basically requires a 4090 running the game at a very low resolution with the newest DLSS upscaling to get acceptable framerates haha
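The camera-first loop the comment describes fits in a few lines of Python. This is a toy sketch, not how any real engine does it: one sphere, one point light, and a single "bounce" in the form of a shadow ray back toward the light.

```python
import math

def sub(a, b):
    return tuple(a[i] - b[i] for i in range(3))

def dot(a, b):
    return sum(a[i] * b[i] for i in range(3))

def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def ray_sphere(origin, direction, center, radius):
    """Nearest hit distance along a normalized direction, or None on a miss."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is normalized, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None  # ignore hits at or behind the origin

def shade(point, normal, light, center, radius):
    # The "bounce": a shadow ray from the hit point back toward the light.
    to_light = norm(sub(light, point))
    if ray_sphere(point, to_light, center, radius) is not None:
        return 0.0                  # something blocks the light: in shadow
    return max(0.0, dot(normal, to_light))  # simple diffuse falloff

def render(width, height):
    cam = (0.0, 0.0, 0.0)
    center, radius = (0.0, 0.0, -3.0), 1.0
    light = (2.0, 2.0, 0.0)
    img = []
    for y in range(height):
        row = []
        for x in range(width):
            # One primary ray per pixel, traced FROM the eye, not the light.
            px = (x + 0.5) / width * 2.0 - 1.0
            py = 1.0 - (y + 0.5) / height * 2.0
            d = norm((px, py, -1.0))
            t = ray_sphere(cam, d, center, radius)
            if t is None:
                row.append(0.1)     # background: ray never found the sphere
            else:
                p = tuple(cam[i] + t * d[i] for i in range(3))
                n = norm(sub(p, center))
                row.append(shade(p, n, light, center, radius))
        img.append(row)
    return img
```

Only rays that reach a pixel are ever traced, which is exactly the cost saving the comment describes; a forward tracer would have to fire rays from the light in every direction and hope a few reach the eye.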
RT only increases the visuals maybe 15% while cutting 40%+ of the performance on any card. It's really not worth it, for me at least; I'd still wait for better tech before considering it.
Hmm, on the difference one, I actually liked the trees in the Fortnite comparison better on the rasterized side, because they had more shadows in them, which to me made them look more realistic, so I picked that one as the RTX one and guessed wrong. I guessed wrong on the Witcher one as well; I just like dark shadows too much haha. On the Tomb Raider one I was right; per your comment, I did spot the less pixelated shadows.
Apart from Fortnite, I barely noticed a difference, and I liked The Witcher 3 rasterized image better for some reason. I don't know; ray tracing needs another 2 or 3 generations of GPUs to actually be worth investing in as a gamer. As a producer, it is a GODSEND, I have no doubt.
Guessed all 3 correctly, but I'm pretty sure the first 2 could at least fake the global illumination without raytracing, while on SotTR it was mainly the softness of the shadows, which can be done rasterized as well.
The only good AAA RT game is Metro Exodus Enhanced Edition, with semi-photoreal lighting while also performing better than, or on par with, the rasterized version. The problem is that developers just don't spend enough time optimizing their game/engine; recent examples being Hogwarts Legacy, Dead Space, Atomic Heart. Unreal Engine's Lumen is just mediocre reflections, while the goal should be RT GI.
@@MAzK001 I was more so talking about game optimization in general, I don't think the RT is out for that game, and hope it never will be, as I sadly made a clown of myself by preordering that trash heap of a game.
@@MAzK001 Depends on when you played it. I played it day 1 and it ran quite terribly on my 3060 Ti at 1440p; I even tried running on low settings, but it just kept stuttering like most Unreal games with shader-caching issues. And I'm on a 5600X + 16GB RAM with an NVMe SSD, so my system isn't the problem. Furthermore, my gripe with the game wasn't really the performance, as I completed it with these issues, but rather that the game is a 5/10 for me at best.
@@3rd.world.eliteAJ ah I see, I played it quite recently and it runs quite well on max settings. Perhaps it's been fixed then! But yeah I haven't finished the game but it does feel rather tedious so far.
I've been looking into this for like 5 minutes and I could guess all your tests easily. It's pretty obvious when it's on or not and what changes. Whether it's worth its hype or worth it I have no idea. But it clearly does things.
I do love ray tracing. It's a technology that's super interesting and will be the future of in-game lighting for 3D games going forward. It's just about how it's implemented. It can solve lots of issues that games have today with ambient occlusion, shadows, global illumination, and reflections under standard rasterization. Whether I turn all the features on is up to me on a game-by-game basis. Cyberpunk and Metro Exodus Enhanced Edition will always be great examples of ray tracing and its benefits. Once UE5 finally starts getting major releases, ray tracing will become even more prevalent, because Lumen can utilize ray tracing hardware to improve the quality of what it's doing, and apply that to its virtual shadow maps, global illumination, and reflections. Ray tracing is also well suited to both stylized graphics and photorealism, assuming the artist can do it. Yes, we are far away from fully ray- or path-traced games due to GPU ray tracing performance, but every new generation we inch closer and closer. I assume it will truly take over once the PS6 and Xbox whatever come out. It has a lot of potential, but many people just don't know what any graphical feature really is, just whether or not it has a large performance impact and whether it's worth it.
The thing about ray tracing is that it's one of those technologies that only really gets developers hot and bothered. Most consumers don't understand what it is, and they can't tell much difference like they could with, say, PhysX, so many think it will die out like PhysX did. Meanwhile, game artists often work between the game industry and the 3D movie industry, so they are fully aware of what ray tracing is and its benefits, both on the visual level and the workflow level. It's been around since the beginning of 3D technology, so they are very familiar with transitioning to it, and that is why it WON'T fade away like PhysX did. In 3D movies it has been the go-to lighting system from the start, from Terminator 2 to Toy Story. Ray tracing is the history of 3D rendering, which is why it will be the future of game rendering going forward, now that we actually have hardware that can process it at real-time frame rates.
I think you swapped the raytraced vs rasterized for Tomb Raider. Raytracing should make shadows softer, not sharper, and it shouldn't increase performance.
I said it once: RTX is still in its infancy, and it impacts game performance quite a bit. That said, I wouldn't use ray tracing unless I'm curious to see how it looks in a game that has it, and then I'd just turn it off after a while. RTX looks beautiful, but I won't use it until I see higher performance with it. NVidia believes in pushing RTX over DLSS, when DLSS is a godsend; they'd rather give you the shiny over the performance lol
Do they though? DLSS has made some pretty huge leaps, but the devs have to actually implement it. Also, while Nvidia is ahead in RT performance, RT can be used on any brand of card, while you'll need an Nvidia card to use DLSS.
@@wiremesh2 True. But as for RT, I honestly don't think we are quite there yet (for gaming anyway), even though RT has been around for quite a while; it was used in CGI films, like Pixar's, long before most of us had even heard of it. Give it a few more years in the oven and RTX will become the new standard NVidia is shooting for. I say they're choosing RTX over DLSS because that's exactly what they're trying to do; if you've listened to or watched most tech YouTubers, it sounds like they're pushing RTX more, yet they give us cards without enough VRAM to use it unless it's an 80- or 90-series card. They need to do what AMD does and include 10GB-12GB of VRAM if they truly want to push RTX onto us. After all, ray tracing is their primary marketing at the moment.
@@blckmlr7573 I guess you're right in terms of marketing. I should have noticed, considering how many people use the term "RTX" for ray tracing, when that's specifically an Nvidia term. It's become ubiquitous.
3:54 That's something I saw in a video: a guy was talking about how RTX "was bad" in RE4 because he couldn't see a reflection on the water compared to rasterization, and my point was that the "no reflection" was actually right, because there's no way light realistically reflects off the water at that angle. Sometimes we prefer those nonrealistic "game fakes" where light shines brighter than it should. 8:52 As you said before, in single-player games it's not so important to have 120 FPS, and Marvel's Spider-Man is a really good example of ray tracing: my FPS drops from 110 to 65 with RTX. At the beginning the reflections are great and something you notice, but after some hours I disabled RTX for lower consumption; it's not that huge a change, and I prefer performance.
I prefer the one that doesn't need DLSS or FSR to run properly, so yeah, raytracing still has a long way to go, at least before we can have raytracing without having to sell a kidney.
Dunno if I'm the crazy one, but the only comparison I got correct was the Fortnite one. I assumed RTX ON in Tomb Raider and The Witcher was the other way around. The interesting thing is that I also compared which image looked better to me, and in all 3 cases I preferred the side that turned out to be RTX OFF... Wild.
Try Amid Evil; you can really see the difference between ray tracing and rasterisation there. The game looks fantastic with ray tracing. I know I'm the odd one out in that I really like ray tracing, and having RT did affect my latest PC build, certainly to my detriment. I wanted to downsize from a full ATX build with a 3070 Ti and a 21:9 monitor to something way smaller, and decided to go full team blue with an Arc A750; ray tracing was a factor in that vs a better card from AMD for the same price or cheaper. I will say RT performance is all right on the A750 in less demanding games like Amid Evil.
I think the end point of RT is just as another notch on the quality settings. As an indie dev who works in Unreal, I don't see a lot of people I know using RT for their projects. Lumen can be used without RT and still produce great results. The RT just makes things like reflections more accurate, but Lumen does plenty of other things, like allowing emissive materials to cast light, and that doesn't require hardware RT, i.e. Nvidia or AMD path tracing solvers. Godot also has a novel, similar lighting system called SDFGI, or Signed Distance Field Global Illumination, which is also not hardware dependent. Similar to a point you make, rasterization is already so good that in a lot of cases RT is just serving as the equivalent of a new notch on the shadow and reflection quality sliders.

Right now RT is like Hairworks was back in the day: a marketing gimmick where Nvidia can pump a lot of resources into partnering with a game to show off the tech and push their cards. It's not a reasonable solution, since it's pure brute force with a technique that just isn't suited to real time on consumer hardware to begin with, and as a result AMD and Nvidia are developing stuff to sidestep its hit.

What I see as more likely than stuff like DLSS persisting as a necessary evil of RT is simply engines getting better at doing that work on their end (like with Lumen), so devs don't have to worry about it. Because at the end of the day, the best solution for devs in terms of accessibility is making sure people have access to the stuff they show off at keynotes and things like the VGAs. Making the game equally performant for everything is significantly more desirable as a dev than relying on someone else to make sure you don't exist in a weird edge case where their stuff doesn't play nice with yours, a thing we've already seen plenty of with DLSS and FSR.
Beyond that, at some point Nvidia and AMD will wear out the top end of the market, and there will simply not be more people willing to upgrade if they have a 4090 or 5090 that's still running games well because the underlying tech has matured, which is something I don't think they're really counting on with DLSS and FSR in the mix. If anyone cracks the code on a better method of efficiently faking GI even closer to RT, which I think is definitely within the realm of possibility, then there's a good chance that DLSS and FSR create some very nasty clashes between the companies and their consumers as support is dropped to try to maintain the high end. Either that or Nvidia and AMD eat a boatload of losses. And that's before you consider issues with how the AAA space operates as a business, in terms of its ability to sustain these massive live services at the top of the industry, and whether or not indies, who already rely more on stylization or designs that don't lean on fidelity, even care at all about RT as a feature. My money is on the top falling off, the indies not caring (because I'm surrounded by them all day), and the ability of all the code wizards out there to negate their reliance on hardware.
Software Lumen is still RT; hardware Lumen is just more accurate and more expensive. Your statement that RT isn't meant for real time is simply wrong. Rasterization has its limits, and we're right at them. There are already situations where RT performs better, and it scales way better. If you want to keep increasing graphics quality, rasterization can only go so far.
I just guessed based on which look I preferred, assuming that was ray tracing, and I was surprised to find that, in my opinion, rasterized looked better, especially in The Witcher 3.
The Witcher 3 one seems wrong? The shadows are way too sharp. I don't know, but other raytraced games have softer shadows, because that's how it is in real life.
@@dhgmrz17 the witcher uses RTGI & RT reflections, not shadows afaik (so both use oldschool shadowmaps). The difference is very obvious in places where skylighting doesn't physically reach. In the rasterised scene skylighting is applied to places where it shouldn't (like an overhanging pass or an archway). It'll have this typical blue/gray hue to it.
@@MLWJ1993 A lot of people are also very used to the way graphics look in rasterization, even though it actually looks weird once you become aware of the difference. I can no longer unsee that a lot of places in The Witcher 3 are lit like they're in direct sunlight even when they're not, but I find people often tend not to compare how games look to real life, but to other games. A lot of people complaining about surfaces being overly reflective with ray traced reflections, for example, would be surprised to see how reflective a lot of surfaces are when you actually pay attention to how light bounces in an environment. There are also some indirect benefits to ray tracing that are maybe not clear just from comparisons of games not made with it at their core: certain things like real-time global illumination do great things for letting artists iterate over scenes more quickly to make the lighting actually look good. I should also say that just because a graphical feature lets stuff more accurately approximate real life doesn't mean it can't be used in more stylized art. Stylization doesn't mean people don't care about things like anatomy, lighting, etc.; those are all very important even in stylized work, so having access to easier-to-iterate-on and more realistic lighting methods does a lot for everyone, not just people trying to make hyper realistic shit.
Raytracing is so useless. You get 10% better visuals but lose 50% of your performance. In some games, like Cyberpunk, RT literally kills your GPU. My 3070 runs 70 fps at ULTRA 1440p native, but RT gives me 18 fps. It's ridiculous.
Holy shit... that destroys your GPU, and the 3070 is beefy. I bought Cyberpunk 2 months ago but haven't played it yet. Anyway, I'll just skip the RT, cuz all I've got is an Intel Arc A770 16GB, lol
I don't get it; if I could choose to enable raytracing, I would. It looks so beautiful for single-player games; if you can have a taste of the future, just do it. I mean, that's one of the reasons people moved away from consoles back then.
Not worth it. Notice how even NVIDIA realized that ray tracing, even after 5 years of them pushing it, is still extremely demanding and pretty much useless. That forced their hand into borrowing frame interpolation, something that has existed FOR YEARS, calling it DLSS 3 Frame Generation, and magically claiming they can now double your FPS, which is a lie. And if I remember correctly, their Frame Generation depends on NVIDIA Reflex, which in turn is something AMD introduced to the market first with RDNA 1's Anti-Lag; NVIDIA Reflex only arrived with Ampere. So yeah, thanks AMD for your Frame Generation not multiplying your latency by 4x.
The simplest way to describe RT: an occlusion test between a ray and a primitive. Rasterization solves occlusion for primitives projected into the view frustum. RT's advantage comes from the ability to easily sample the world from any location in any direction (in rasterization this is just not a good idea). How you try to solve the rendering equation with these tools is where the differences become obvious.
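That "occlusion test between ray and primitive" is concrete enough to write down. Here's a plain-Python sketch of the standard Möller-Trumbore ray-triangle test, the basic building block that RT hardware accelerates (function names are mine, not from any particular engine):

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def sub(a, b):
    return tuple(a[i] - b[i] for i in range(3))

def dot(a, b):
    return sum(a[i] * b[i] for i in range(3))

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Möller-Trumbore: does this ray occlude against this one triangle?"""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:
        return False                      # ray parallel to the triangle plane
    inv = 1.0 / det
    u = dot(sub(origin, v0), p) * inv     # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return False
    q = cross(sub(origin, v0), e1)
    v = dot(direction, q) * inv           # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return False
    return dot(e2, q) * inv > eps         # hit must lie in front of the origin
```

A real tracer runs a test like this against millions of triangles per frame, which is why acceleration structures (so most triangles are never tested at all) and hardware RT cores matter so much.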
I think ray tracing is the future of illumination in games. Perhaps specific exceptions for artistic reasons. However, GPUs in their current state aren't good enough for it (real time gaming applications). Perhaps when the ##50 cards match the 3080 in fully path traced scenarios, but even then the 3080 can't lock 1080 60fps games like portal RTX without DLSS.
Ray Tracing is incredible. But it needs performance updates. Not sure what kind of magic they’re gonna do to make that happen, but things only ever get better. So we’ll see how this video holds up 10 years from now! Great video btw!
I would like to point out that most RT-enabled games are built to support both rasterization and ray tracing, which means a lot of software compromise. Look at Metro Exodus Enhanced Edition, which was built to purely support RT: you get both visuals and performance. It's a considerable software limitation.
I don't think it's a software limitation as much as it's an art direction and time limitation. The issue is that the art directors, QA, and others all have to review and design every part of the game's lighting in every scene TWICE. Once with traditional shadow maps, probe GI, reflections, etc. Then another time with each RT effect enabled.
@@bananaboy482 It is increased work, sure, I'll agree. But if you look at it, most games we talk about when RT comes into the picture implement RT over rasterized techniques so they're playable on all hardware. Even if you're using RT, you're wasting computation on preliminary calculations for rasterized lighting. Again, I'm going to use Metro Exodus as my example: it has some pretty heavy RT (RTGI with infinite bounces, while most other games just have reflections, shadows and lighting with 3 bounces). RT in the Gold Edition is about as performant as you'd expect, which is to say, limited light bounces and low fps. The Enhanced Edition has its entire raster lighting ripped out and a pure RT implementation (which means you can't run it on non-RT-capable hardware). The thing is, this RT-only optimized approach runs the aforementioned heavy-duty RT effects with performance that equals (if not exceeds by 1 or 2%) the raster version of the game. Considering game dev times, most of our so-called RT games are really raster games with RT slapped on top. That said, not everyone has the time or money to build an RT-only edition (there's a reason Metro Exodus Enhanced Edition is the only game I know of to have done this), so until game engines embrace optimized RT-first lighting (or at least swappable lighting systems instead of overlapping ones), I'm not holding my breath for performant RT.

P.S. I just wanted to put this out there because I'm tired of hearing people say there's no difference (because the RT implementation is skin deep) or ask if it's worth the performance hit (right now it depends on the game, but that's an optimization problem, not a raytracing one).

P.P.S. Typed this on my phone, too lazy to spellcheck
@@shadyti5 I don't think you understand what you're talking about. You're not wasting preliminary calculations on rasterized lighting as the game is still inherently rasterized. It's just using a lighting cache that's computed using triangle ray tracing. Also Metro Exodus enhanced edition runs near identical to the Metro Exodus original with RTX GI probes. The only difference is in the implementation of the lighting calculations using a surface cache instead of augmenting the original probe based lighting. In your words, Metro Exodus enhanced is exactly a traditional rasterized game with RT effects layered on top. It would be stupid to do it otherwise and have full ray traced direct lighting like something like Portal with RTX. It doesn't have to do with the developer time or optimization, building a BVH on the CPU and doing triangle ray tracing is always going to be very expensive without dedicated hardware acceleration, and even then we can see that that's very difficult. The improvements in ray tracing performance from the 20 series to the 40 series is almost none, they just added more cache to help speed it up.
4:48 - On the left screen everything is so bright; it has less shadow, less personality. Rasterized always looks better in dark scenes; however, in any bright, sunlit reflection scene, RTX is better.
How to tell? RT is usually brighter, glossier, and at times not very realistic looking. Is it needed? Probably not at this time, since they need to improve RT first. I feel like sometimes RT just overdoes the reflections. When it was first announced, to me it was just a sales pitch to get people to buy: they needed to bring something new to the table or their GPUs would look boring compared to their rival's. But RT has always existed in the past; it's just that no one really cared to make a big deal out of it till now.
5:45 I picked left being raytraced because of the lighting on the distal zone of the scene. It's more rewarding to my eyes, and so it would have been more than justifiable for me to turn it on if not for how taxing raytracing is on performance.
All the people who think the RT Witcher clip doesn't look right need to go look around during a low to no cloud day outside. The RT clip is believable, the other one looks fake because light doesn't just go poof and disappear like that. It bounces around and diffuses in the atmosphere. I've never been outside during a sunny day and been suddenly in the dark because of a shadow (though a couple full eclipses came close, but that's a pretty extreme shadow there).
The bulk of gamers are mostly maximizing a frames vs monetary cost function. RT increases monetary cost and decreases frames. It should be no surprise it isn't wildly popular. Sure there are gamers who have extra money to spend and play mostly single-player games with high-fidelity graphics. But that pool of customers isn't going to be the majority by a longshot. As you say, RT is only ever going to hit the mainstream when it can compete on price and performance with rasterization, which is going to be a while.
I'll care about ray tracing when it becomes a mainstream feature in games, which will start to happen when we get 3080-level ray tracing in a $300 entry-level card with at least 12 GB of VRAM. Originally, I figured that would be with the 5060 in 2025, but the way prices keep getting jacked up... Maybe by 2030.
Please add more raytracing tests. I got the Witcher one wrong. The takeaway seems to be that raytracing isn't bad; rather, rasterization is really advanced and has come a long way.
One HUGE advantage RT has over rasterization for reflections is obviously that RT does not depend on screen space, so there are no weird missing reflections when the camera is not at a straight angle to the reflective surface. Another advantage should be global illumination, but the Unreal developers also did an incredible job of emulating that with Lumen. Note: I actually hate that nobody knows what RT is; they think it's just fancy graphics with tons of filters on, like all those content creators making "RTX ON" game remakes, or literally just putting Nvidia filters on top of Genshin Impact and calling it RTX, or shit like that. Edit: I almost forgot about colored shadows and soft shadows, but nobody seems to utilize those with RTX
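The screen-space limitation this comment points at is easy to demo: SSR can only reuse pixels the camera already rendered, while a traced ray can query geometry anywhere in the scene. A toy 1D sketch, with all names and the scene layout made up for illustration:

```python
# Full world: objects at positions -10..10.
SCENE = {x: f"object_at_{x}" for x in range(-10, 11)}
# The "framebuffer": only positions 0..10 ended up on screen this frame.
SCREEN = {x: SCENE[x] for x in range(0, 11)}

def ssr_reflection(x):
    # Screen-space reflection: look the reflected point up in the
    # framebuffer only. Off-screen content simply isn't available.
    return SCREEN.get(x)

def traced_reflection(x):
    # Ray traced reflection: query the actual scene, on screen or not.
    return SCENE.get(x)

print(ssr_reflection(-5))     # None: the mirror shows a hole/smear
print(traced_reflection(-5))  # object_at_-5: the mirror shows the truth
```

When the reflected point is on screen the two agree, which is why SSR looks fine until you tilt the camera; the moment the reflected geometry leaves the frame, SSR has nothing to sample and engines fall back to cubemaps or fade the reflection out.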
Raytracing is a great tech for companies making games to save time. The reason it's not much better than rasterization is that game developers have gotten so good at using rasterization to fake realistic lighting that RT doesn't make that much of a difference. But in the future it will make creating games much faster, without having to worry about how the lighting is going to be done.
Fortnite's reflection raytracing is still very screen-spacey: you can still see the pickaxe reflected in the water despite it being closer to the camera. It's very strange that many "ray tracing" graphics modes only affect shadows and lighting but don't affect reflections at all, whereas some games only focus on reflections.
If you only play games, ray tracing is pretty cool but not necessary. If you work with graphics, such as making video games, offline/real-time rendering, or movies, ray tracing-capable cards are a MUST.
The only real RT difference I really enjoy is the indirect lighting. So in games that have RT, I usually have just the lighting on, or if it doesn't allow individual parts of RT to be toggled, I just turn it off.
Showing ray tracing examples outdoors isn't a very fair example, because a blue sky is dead easy to approximate in rasterization, and the bounced light isn't going to outshine it. Try some indoor areas lit by multiple light sources, throw in some reflective surfaces and the visual difference is massive. Also, in the last 20 years we've been trained to ignore what's in the reflections, because there's never anything useful to see in there. In ray traced reflections, you have an opportunity to see reflected things that you can't otherwise see on your screen, which can be an advantage.
For most games Rtx at this point isn’t much different from rasterized rendering. However for rasterization to look realistic they use lots of tricks like light/ reflection probes and bake almost all lights in scenes beforehand. What raytracing allows game devs to do is create dynamic scenes with dynamic objects that change all the time and still render them realistic without the need to pre bake those altered scenes after changing a thing. For most games this isn’t really necessary because most games consist of static scenes with a few dynamic objects. For visualizations or architectural applications raytracing is an amazing technology cause it allows users to interact with their “scenes” and see realistic results without having to pre bake that scene for hours or even days.
Current implementations of ray tracing do look good, especially when developers add the full suite rather than just doing shadows and ambient occlusion and slapping the ray tracing label on; most people have no clue what that even is. Real-time path tracing will be the showstopper in terms of visuals, but it will take another 5-10 years before the performance impact is negligible, so RTX 7090 Ti and beyond is when I'd expect path tracing to be fully viable (for reflections, refractions, shadows, ambient occlusion, global illumination). That being said, probe-based and cubemap rasterized lighting, reflections and shadows are getting very good, and in some cases look similar to today's ray traced counterparts.
Cyberpunk is running 4K60 with DLSS, without frame gen. A 5080 will run it easily. RT performance doubles per generation, and I think that will continue for some time.
Good rasterisation can visually look better than bad ray tracing, and vice versa, and in the end it's a relatively small feature for the huge price that comes with it. Lighting, just like any other part of a game's graphics, can be heavily stylised. I really disliked how RTX looked in The Witcher, because it felt like the contrast was taken away, and the stronger, more contrasting shadows in the raster version hid some of the imperfections in the textures and models.
I found myself wrong multiple times in your tests. The crazy part is, I also use raytracing in all my games. I notice ray tracing more when I'm playing the game than when watching a video, but very often I find myself wondering, "what's the difference?"
It's not just about FPS in MP; it's the trade-off against other settings in SP. For example:
- If you can run RT on at 1440p, but RT off at 4K, you're going to choose 4K over RT.
- If you can run most settings at "high" (e.g. post-processing, texture resolution, volumetric effects, water effects, etc.) with RT on, but "ultra" with RT off, you're going to choose maxing out everything else over RT.
In other words, it doesn't just need graphics cards capable of running RT in some form; it needs graphics cards capable of running RT at 4K with everything maxed out. And at the moment that means only the 4090, and there are very few people who want to drop over £2k on a graphics card (and fit the big bastard in their case and feed its power requirements).
Ray Tracing is especially awesome in older, highly stylized games. In these situations, I think the lighting just really pops. When hardware gets gud enough to run raytracing in stylized VR environments at a high frame rate and there is haptic feedback, game over man.
Here's my opinion: I think ray tracing just outright looks better, but at the moment it has a long way to go (mainly in performance) before it's worth enabling in most games. I have a 3060 and I mainly only enable it when I want to take pretty screenshots in games like Fortnite and Minecraft.
I do play with highest settings in fortnite with Lumen etc enabled but i leave hardware raytracing disabled. The difference for me between software and hardware raytracing is almost not noticeable anymore.
It's funny, in the tests I chose rasterized every time. I guess I'm used to it; in some cases I even found it more natural looking. That being said, when a game gives a little visual extra and it doesn't affect performance too much on my hardware, I leave it on, like Doom Eternal or Metro Exodus. In Cyberpunk I never used it, because the game already looks amazing rasterized on an HDR TV, and the performance hit with RT is huge.
So here's the thing. In games with static environments, they are technically already using raytracing for GI. It's just that the rays are precomputed, and the baked into lightmaps. So visually it isn't going to be much different. Almost any dynamic lighting model will generally look, at best, only as good as a static lighting model. Most look slightly worse. And all dynamic lighting models run worse than their static counterparts.....provided nothing is moving. The difference is when things move. Static lighting severely limits how dynamic the world can be. The ground can't move. People moving through environments won't see the environment reflected back at them. Forget about a dynamic day/night cycle or destructible environments. But if the map is completely static anyway, then visually there will be virtually no difference. Pretty much just reflection accuracy. And when rasterized games do have dynamic environments, they usually look rather bad, or they have to do some really rough approximations to get anywhere close. They can't pre-render their shadows, so the shadows are generally fixed-blur lightmaps, vertex projections (Doom3 style), or tricks to try to make variable penumbras from what are still essentially basic point lights. But if you want lighting from an animated LED billboard? Good luck. Rasterization doesn't actually have a reasonable means of handling area lights. The closest approximations were extremely intensive. This also matters a lot in game development. The reason many rasterized games look as natural as they do today comes down to a lot of tricks the level designers themselves do. They place fake lights freaking everywhere to tweak this place to be brighter or that place to look like it has reflected light coming in. With raytracing, that just isn't nearly as necessary. If you place a glowing sign, it'll just make light. 
You don't have to place baked lights with shadows disabled to hide the fact that you are using point lights to emulate an area light. You just place a thing with a glowing texture. This goes completely nuts if your world is procedurally generated, which is why Minecraft is such a good test bed for raytracing. Everything is procedurally generated, and everything can be broken. There is NO static lighting in Minecraft. So adding raytracing, or ray marching, or voxel illumination, or any other method makes a HUGE difference in how the game looks. But if you're playing CS:GO, where the maps are basically just static stone with zero interactivity, then yeah, it'll be really hard to make raytracing look any better than what the level lightmap calculator already does.
I think raytracing will start to look good in games when developers begin to make them primarily _for_ raytracing instead of rasterization. The problem as I see it is that many gamedevs still think of raytracing as an element of visual flair or eye candy, instead of as a central part of the rendered image. And this makes sense, because no matter what any marketing department tells consumers, devs are going to give the most attention to settings that a majority of consumers can run, which means: rasterization. In CG animated movies and TV shows, the tools created to control lighting, shadow, and visual contrast in a raytraced frame are simply far more advanced than what we have in the games industry today, and the people who do that sort of lighting on film and TV projects are far more experienced with it than gamedevs are. I think that once art and lighting departments can actually start designing with raytracing in mind _first,_ then we will begin to see scenes that look artistically nicer than their rasterized counterparts, and with even _less_ of a raytracing performance hit than we see now.
When ray tracing was released with the RTX 2000 series it was really bad. I had the RTX 2080 and then the RTX 2080 Super, and games were lagging so much with ray tracing enabled, while the visual difference was so small. I have had the RTX 4090 since launch and it is really worth it now. The difference is very noticeable, and games finally have a high framerate even with ray tracing at the max/ultra preset. I have tried Cyberpunk 2077 with path tracing too, and it ran fine, but the difference between psycho ray tracing and path tracing is not nearly as noticeable as going from rasterized to ray traced.
So my question is, how well does RT work on consoles? Because that was one of the main selling points of the PS5 and the new Xbox. If PCs can't handle RT, what can console users expect?
i did change the thumbnail a little after the fact, sry :). & the Tomb Raider test was backwards.
It’s funny because they were so close that I messed it up in the edit. I guess that kinda says something
Also I said “actually” so many times. I’ve been workin on that 😂
Backwards? Meaning?
Raytracing is pretty new and not really worth it right now, but it can add a lot to a game just by adding real-time reflections and shadows. It could also probably serve as real-time sound tracing for more immersive audio in the future.
@@aktheking9841 He had more fps with RTX than without; there was a mistake. It's the opposite.
I think on the Tomb Raider one you can tell by the shadows on her back. In raytracing, the shadow cast by the hair is sharper while the shadow formed by the shape of the back itself is softer. In rasterised mode, it all looks the same...
I tried RT. Depending on the game, it's sometimes quite easy to spot the difference but the performance loss is just not worth it.
Like 99% of the time it's not worth it. Been playing Spider-Man on PC, and there ray tracing makes a huge difference: when you end up close to buildings or climbing on them you get functional reflections. SSR, cubemaps, or other rasterized methods can't handle stuff like that.
I think people are being too hard on it cuz it makes them look cool... not accusing you of that personally, but I get that vibe a lot from some people who are like 'it's a dumb meme' or whatever. My logic is comparing RT on/off to 2K vs 4K pixels. I personally don't notice 4K over 2K that much unless I'm sitting really close to a relatively large monitor and scrutinize it. Furthermore, 4K hits your performance MORE than RT (for 30 and 40 series Nvidia cards at least, which you will most likely have if you want RT), and the monitor you have is GOING to have a worse refresh rate and such at 4K than 2K, unless you fork over an extra GRAND (which would instead get you a graphics card that CAN handle RT at over 200fps at 2K or whatever) over a good 2K monitor.
Nobody says 4K is a meme, but I think 4K is much more meme-status than RT imo... when RT is actually implemented.
On what graphics card did you try it?
fr
@@crescentmoon256 RTX 3080
Raytracing is actually more useful for artists because we won't have to fake stuff (which is hard) to make things look real. Raytracing will do that job for us, and we will be able to focus on other areas to improve the art. So eventually, when it becomes fast enough, we will ditch rasterization.
@@SedatedSloth OP is talking about realtime graphics, AKA video games.
@@PringleSn1ffer RT has existed in video games for at least 30 years: pre-baked lighting. RTRT doesn't add anything compared to traditional methods because its performance costs are so tremendous that it's never worth it.
@@SedatedSloth @4R8YnTH3CH33F @attractivegd9531
@nevermore6954 All of you are on the wrong track. As environment artists, we have to fake lots of lighting to make things look real in a video game, despite baked RT lights. Baking alone does not give better results. GI, reflections, shadows, subsurface and the like only look great in baked scenarios when nothing in the game is moving. If you have ever played a video game, you might have noticed how many moving parts there are; that is why we artists need realtime RT, so we don't have to painstakingly fake lights for hours to make a single area of the game look real. Current raster methods sure aren't realistic, but in the heat of the moment you don't notice a difference between realtime RT and raster. This is due to the limited use of ray tracing, which in turn is due to limited hardware. In the future, when GPUs are capable enough, we will ditch raster rendering.
@@attractivegd9531 Prebaked lighting's limitations are numerous and quite obvious, which is why the industry has been moving away from it for years. VoxelGI solutions have pretty much been the standard on last-gen consoles. Realtime raytracing is the future, and although it is still out of reach for many gamers, we are moving in the right direction.
The problem for RT being adopted by game developers instead of using rasterisation is not the artists, but the business model.
Imagine you're the CEO and you're asking your project managers how the developers are doing with the latest game.
They say they've ditched rasterisation and are solely using RT which has cut down development time/costs by about 20% because they don't have to waste so much time faking light sources with rasterisation. So far so good.
You ask if there are any downsides to this approach: Well only 20% of your potential market for that game can afford graphics cards powerful enough to run RT, so you're probably going to lose 80% of sales.
See the problem?
That's why RT being useful for artists is never going to gain traction in the gaming world while most of the market can't afford graphics cards that can run at max settings with RT on.
For me the problem with raytracing is that after a while you stop really looking at the graphics and environments and just focus on the game. It doesn't matter as much as people think it does.
Well I have a 4090 so might as well run RT.
@@Crecross same
The only setting I always hate is shadows; without RTX they are too glitchy when cast on a character's face.
I agree. I tried gaming in 4K or 2K with good graphics, and then I thought: why do I need this? I was fine with 1080p or 720p with great colors and performance! I wasted my money :/
@@unkown34x33 I don't agree with the 2K part, but 4k is definitely a waste of money imo
One of my biggest problems with ray tracing is that some scenarios make it terribly difficult to tell the difference, yet there is still a huge performance hit, even with still images. Like walking somewhere outside with heavy clouds and no highly reflective surfaces nearby; there won't be many shadows or reflections to ray trace.
That's why I feel some devs tend to overdo reflections and make everything look shiny, wet, or chrome-coated, which ends up not looking natural at all. One of the best examples is Hogwarts castle: some floors look ridiculous with RT, and I much prefer the rasterized reflections.
Rasterized have better reflections in that game than ray tracing.
It's kinda pathetic, honestly
Perfect reflections are easier/computationally cheaper to ray trace compared to rough or varied roughness reflections, so until we get faster hardware across the board, developers are going to focus on the cheaper effects.
Part of the issue is that most devs don't implement fully raytraced environments, because it's just so damn expensive. Most of the time it's just raytraced shadows, MAYBE reflections, and on the very rare occasion global illumination; only then does it really start to make a difference, as rasterized shadows and reflections had already become quite good hacks (though the workflow for those hacks is much worse than doing raytracing).
Games should have to state the level of raytracing, instead of just "this game is raytracing enabled".
And what about GI? Especially in a dynamic scene?
Now don't tell me you don't even know what GI means
Most people can tell which one is raytraced by the FPS!
For the last 20 years, developers and graphics programmers have mastered the approximation of many rendering effects with good-enough quality for real-time graphics. Most reflections can be done with portal rendering, planar or spherical mapping, and even screen-space methods, where appropriate. Shadow maps have already been perfected in pretty much every engine out there. Dynamic GI is one area where real-time RT is bringing tangible benefits; game devs have chronically struggled to come up with universal solutions using various light transport hacks. For the time being, a cost-effective and carefully tuned hybrid implementation of RT is the way to go.
This is the only correct answer
Planar can only be used on flat surfaces and is more expensive than raytracing unless it's used very sparingly. Cubemaps are incredibly inaccurate and can only approximate the general lighting conditions. Screenspace is full of artifacts, breakup, temporal instability, and can (by definition) only reflect objects on screen.
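The "can only reflect objects on screen" limitation is easy to see in a toy version of SSR. This is a hypothetical sketch (a 1D depth buffer, not any engine's actual implementation): the reflection ray is marched in screen space, so the moment it steps outside the buffer there is simply no data left to reflect.

```python
def ssr_march(depth, start_x, step_x, ray_depth, depth_step, max_steps=64):
    """March a reflection ray across a 1D depth buffer.

    Returns the column index where the ray passes behind the stored
    surface (a 'hit' we could sample a color from), or None when the
    ray leaves the screen - the core limitation of screen-space
    reflections: off-screen geometry can never appear in the result.
    """
    x, z = float(start_x), ray_depth
    for _ in range(max_steps):
        x += step_x
        z += depth_step
        col = int(x)
        if col < 0 or col >= len(depth):
            return None  # ray left the screen: nothing to reflect
        if z >= depth[col]:
            return col   # ray went behind the stored depth: reflect this pixel
    return None
```

A ray marching toward a nearby on-screen wall finds a hit, while the same ray aimed toward the screen edge returns nothing, which is exactly the artifact where reflections cut off at the viewport border.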
Reflections are by far the worst part of rasterised visuals. They're messy, full of artifacts, don't make logical sense, and are just terrible. Even games like Forza still don't have cars reflecting themselves, which makes everything 'glow', because internal reflections are impossible without the mess of SSR.
When RT was new, it looked to me like the main benefit was that it was less work for game devs than having to use a bunch of those techniques, but until everyone can afford a decent raytracing GPU, it probably won't catch on
@@Daniel_WR_Hart it already is catching on tbh, it's been a slow start but with consoles being able to pull it off, everything is starting to use it.
Unreal Engine's entire lighting system has been completely overhauled from the ground up to use raytracing! That's the most popular game engine out there. It might take time for UE5 games to release, but trust me, it's already caught on.
@@Daniel_WR_Hart This is the real answer. Once GPUs that can render RT at minimal performance cost are something most people own (this is going to take a lot of time, but Nvidia and AMD had to start integrating the hardware at some point for it to progress), games won't be developed WITHOUT ray tracing anymore. It will be the only way to make games, because it will simply be easier and better.
Bro, the fact that the most powerful gaming GPU on the market (the 4090) currently needs an upscaling method (DLSS) just to run Cyberpunk 2077 at 4K 60fps with RT shows what kind of performance hit it actually is.
It's going to need DLSS3 for 4K60 too in Cyberpunk with the extra RT.
You're making it sound like it's an issue that a 4090 struggles with RT at 4k 60 FPS but .. what proportion of people are actually trying to game at 4K? Lol
@@shamadooee Bruh, it doesn't matter if it's the 1% or the 10% or the goddamn 50%. 4K displays have been around for years, and in 2015 you could build a 4K rig for $1000 (without any DLSS or that bullshit). The fact that billion-dollar corporations couldn't give us anything capable of running alongside these 4K monitors at a proper price/wattage is a goddamn disgrace.
Like bro, even what I said about 4K doesn't matter that much; ray tracing at 1080p is extremely taxing. Even the RTX 3070 and 3080 struggle with mainstream games running RT at 1080p. It's so taxing and unoptimized that it makes me think, wtf?
@@shamadooee I have been using a 4K screen for like 8+ years now. It was 350€ back then. Later I upgraded to one with a better panel for a bit more. So you are telling me people who buy an $1800 graphics card don't have a half-decent monitor, and it's not a problem that they can't run it at playable fps?
It would run worse if those pathtraced results were done with raster. That doesn't scale well.
The fact that it runs in frames per second and not seconds per frame is a miracle, and even industry experts didn't really think it was possible, even on a 4090.
One of the biggest advantages of RT doesn't even have anything to do with fidelity. It makes things significantly easier for artists when they don't have to hand-make all of the lighting and can use RT to let math decide how the scene should be lit, only having to worry about light sources and their properties.
Ray tracing can look incredible, depending on the implementation, but... my main issue is that developers seem to have forgotten about their art style and instead rely too heavily on RTX. For example, I've seen many recent games that use RTX for reflections to make bodies of water look pretty, but when it's disabled there are no reflections whatsoever, not even a cubemap. Other games use it for shadows, but when RT is disabled they fall back to really bad-looking, jagged shadows that had no care or effort put into their implementation.
It kinda shows how some devs rely too much on the technology and use it as a buzzword to sell more copies.
4:04 - That "mushroom shaped tree" in the center frame really captures my feeling on RTX right now. The shading on the left is just soooo much better, looks very Pixar-esque.
I think raytracing right now might kind of be like bloom back in ~2007. Meaning that it feels like games just "turn RTX on" blindly, and it ends up hurting the art direction more than anything else. Maybe after some time, more devs/artists will be able to utilize it more effectively and with more intention, but yea, as it stands right now I think RTX tends to hurt games more than it helps. Like even with the Witcher 3, there's something about how RTX is interacting with the textures, or bump maps, or something like that just sticks out to me as "not right".
I agree. It's kinda just tacked on and not thought out in most games. However, I will say that Fortnite benefits well from it. That tree is just a bad comparison because of the change in lighting direction. In the rasterization view, the lighting is coming from the right, highlighting the shading and depth. Whereas in RTX, it's coming from behind the player, hiding the shading behind the tree and making it look more flat.
Honestly, this entire video was pretty bad at showing off what RTX is even for. Static, open environments are the worst place to showcase RTX. It is most beneficial in interiors and dynamic environments, hence why I say Fortnite is a good choice for RTX. Lots of dynamic interior spaces.
"RT is an overdone novelty" MFs when they step outside and see sunlight bouncing around (they have been inside a dark room for years) (/j)
Lols, this comment aged extremely poorly. Cyberpunk just released demos of their new overdrive raytracing setting and it looks absolutely spectacular. Guaranteed the 5000 series of nvidia cards will be able to do actual pathtracing reliably with dlss 3.0+ frame generation.
ua-cam.com/video/I-ORt8313Og/v-deo.html
@@nathandam6415 Even a 4090 right now is pretty reliable at 1440p. You can comfortably get into that 100-120 fps sweet spot without the game looking terrible with path tracing. With more optimization, I'm sure path tracing can be implemented well in other games and be playable even on a 4080 or 4070 Ti.
@@nathandam6415 Ehhh, it looks OK, but isn't that much better. Also, that's pathtracing, not raytracing.
It almost looks like the difference between them is the contrast settings of the monitor , lol
Fr
Keep a note of how Rockstar used raytracing as a marketing technique and didn't even fully implement it in their game until a few months later.
I like RT. But I don't actively use it in every game that has it. It is nice to look at, and depending on the game it can make a huge difference in looks. But I feel like especially on modern games that utilize modern rasterized graphics and shaders, you often run into an area where the visual improvement simply doesn't justify the immense performance cost.
Some older titles such as Portal, Half-Life, or Quake that got an RT-enabled port show how RT can breathe completely new life into a game, with an immensely more convincing atmosphere despite the otherwise aged looks, due to how simply lighting was constructed and calculated in older engines. They also have the performance headroom available to make RT playable and viable. IMO this is where RT really shines, and I hope that with the advent of Nvidia Remix we will see a lot more examples like this.
Shame RTX Remix still isn't out.
awww poor lil peasant has a weak gpu and not a 4070 ti or up... 😂
Games like Half-Life, Portal, and Quake don't use the typical RT you see in modern games, but a whole new path-tracing-based renderer that is more advanced than the RT in modern games like Metro, CP77 Psycho, etc.
Only modern game that supports similarly advanced renderer is CP77 on RT Overdrive mode
@@Navhkrin cp2077 overdrive mode is actual pathtracing lmao
@@Intelwinsbigly the remix runtime was just open sourced meaning it's coming out really really soon, so likely within the next month or so.
Doing a side by side for Cyberpunk 2077 would have been nice because of how nice RTX is done in that game. There's quite a few games that have great implementations of RTX that have quite the difference in fidelity. I loved RTX the first day I bought a 2080ti after upgrading from a GTX 1650 Super. Now that I have a 4080, I want all the RTX.
That performance hit still sucks though
Metro Exodus, Cyberpunk 2077, Witcher 3 Remastered, Control, Doom Eternal, Quake 2 RTX, Portal RTX, Resident Evil Village, Minecraft RTX, etc. All these games are a night-and-day difference with ray tracing on. I really like RTX and DLSS. My 2070 Super is showing its age, but I don't want to spend $1,200-1,500 on a GPU. I used to be able to build an entire mid-to-high-end PC for that, hahaha. Eventually I will buy one though. Probably just wait for the next-gen refresh of the 4000 series like I did with the 2070 Super.
@@jonny-b4954 Prices do suck now, but hopefully the 4080/90 will be found cheaply on the used market in a year or two. I'm thinking about downgrading from 4k to 1440p and grabbing a better CPU just so I can run RTX games a bit smoother
@@Glubbdubdrib They're atrocious! That's how it goes though. Inflation and supply/demand but once you add in covid, companies have great cover to raise prices even more. I've been considering upgrading from 1080p to a 1440p monitor for years. I think 1440p really is the sweet spot. You're getting diminishing returns after 1440p and its a nice middle ground performance wise. I'm often shocked how much more FPS say a 13900k gets vs a Ryzen 3700x, for instance. Just insane. I'm probably switching back to Intel next upgrade.
Also Spiderman, spider man is one of the best cases where there's simply no other viable solution for what it does.
After turning settings down to match the FPS, the image always looks worse when going raytracing than if I didn't raytrace at all.
I had a tough time trying to find the difference in the witcher 3 test, now I understand the artistic part vs the raytraced one, sometimes the artistic one is better.
Yeah, I thought the non-raytraced one for witcher 3 was a lot better too.
I have only used ray tracing in a few games, and it was on a 2070 super so not exactly the best for it. But I think ray traced reflections are great, normal screen space reflections look good but it's annoying when they disappear as soon as you can't see the object being reflected. [I would still go for the higher frame rate at the moment though]
I couldn't give a shit about the shadows or lighting [ in most cases]. I usually prefer the art designed shadows to the ray traced ones. They might not be as realistic but they often look better to me. In that witcher 3 scene you showed I preferred the rasterized version but that could come down to time of day in that game.
@@AlexBarbu Well this is just straight up wrong.
@@AlexBarbu Wrong. Just look at masterpieces like Forza Horizon 5 first, then say something; the rasterization used in that game looks almost the same as with raytracing enabled.
6:13 pls explain how u got more fps with raytracing here. Did you accidentally switch the videos?
Actually, RT cores do things that CUDA cores would otherwise have to do, thus improving fps when the RT cores have good headroom.
Open world single player games are perfect to play with RT on. Currently Metro, Cyberpunk, Far Cry 6, Witcher 3 are all games where RT is totally worth enabling.
Nah, I mean, not really impressive at all (though I didn't try Witcher 3 yet). Only felt a game-changing difference with Spider-Man.
Dying Light 2 is another great open world game to play with RT on. Especially the Global Illumination!
Don't forget about Control
I agree on most of these except Far Cry 6. Their implementation is really quite awful imho. The RTGI shader from Pascal Gilcher, which is nothing more than a screen-space solution, costs no more than 3 fps on an RTX 3070, runs on any graphics card, and looks better than the implemented RT.
As someone who's been gaming since the 90s and has gamed on every single platform, even when I was on tour in Iraq and Afghanistan 😂
I personally don’t care about Ray tracing, i feel like it would be a nice feature if it didn’t have such a performance hit and such a price hike. But as it stands to me it’s not all that.
Games look more and more beautiful every year, and honestly when the game is good and already looks so good then honestly I ain’t finna check for shadows or anything like that.
It's not worth the performance hit in actual gameplay, but I like it for games where I can slow down and appreciate the work put into the visuals. Global illumination and RT lighting feel more like style choices in a lot of games, on vs off; it's not necessarily about looking better per se. CP2077, in specific spots, definitely has a more "cyberpunk" vibe with all the lights when RT lighting is on the higher settings. But I prefer how screen-space reflections look in the game over RT in a lot of places too. The problem with screen space is how reflections disappear if you change the view angle. It's like pop-in for reflections, and it can break immersion if you see it. Sometimes it comes down to stylistic choices made by the devs: the shadows in the Witcher may not be realistic, but the look fits.
I much prefer the rt shadows myself. I think the raster ones look like crap and break immersion because it feels so off.
The rampart shadows' dithering and pop-in on Witcher 3 was what gave it away.
Path-traced Cyberpunk 2077, why not include it? It looks jaw-droppingly awesome, and yeah, the performance hit is massive, but an RTX 4060 can still run it (with DLSS frame generation) at 60fps. So we can already see how much better ray tracing is compared to rasterisation.
Rasterization often uses screen-space reflections, which means only objects in sight get reflected, to save performance. Shadows are just textures (shadow maps) calculated in real time and overlaid onto the scene to fake the impression of shadows. In raytracing, you shoot a ray from the camera position into the scene and look for an intersection; shadows are then calculated by shooting a second ray from the hit point toward the light and checking whether it hits anything on the way.
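The two-ray idea above can be sketched in a few lines. This is a hypothetical toy tracer (spheres and a single point light are my own example scene, not any engine's code): a primary ray finds the nearest surface, then a shadow ray toward the light decides lit vs shadowed.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Return the smallest ray parameter t > epsilon where the ray hits the sphere, else None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    for t in ((-b - math.sqrt(disc)) / (2 * a), (-b + math.sqrt(disc)) / (2 * a)):
        if t > 1e-4:
            return t
    return None

def shade(camera, direction, spheres, light_pos):
    """Primary ray from the camera; on a hit, cast a second (shadow) ray toward the light."""
    nearest = None
    for center, radius in spheres:
        t = hit_sphere(camera, direction, center, radius)
        if t is not None and (nearest is None or t < nearest[0]):
            nearest = (t, center, radius)
    if nearest is None:
        return "background"
    t, center, radius = nearest
    hit = tuple(o + t * d for o, d in zip(camera, direction))
    normal = tuple((h - c) / radius for h, c in zip(hit, center))
    origin = tuple(h + 1e-3 * n for h, n in zip(hit, normal))  # nudge off the surface
    to_light = tuple(l - o for l, o in zip(light_pos, origin))  # unnormalized: t in (0, 1] lies before the light
    for c2, r2 in spheres:
        t2 = hit_sphere(origin, to_light, c2, r2)
        if t2 is not None and t2 <= 1.0:
            return "shadow"  # something blocks the path to the light
    return "lit"
```

Soft shadows, the thing the Tomb Raider comment above describes, come from firing many such shadow rays toward an area light and averaging how many are blocked.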
That’s really a genius thing to do: showing a scene with almost no shadows on screen, in the game where the RT effect is shadows (SOTR).
P.S. RT is on the right, not the left, wtf is that?
I prefer the rasterized versions of almost every game cause I just love how deep the shadows are. Idc if ray tracing is more realistic, that almost black shadow is so beautiful in some games.
Honestly I couldn't care less about ray tracing, all I want is a s*** ton of shadow maps and realistically rasterization can't do that well, but ray tracing can
I'm not even kidding, there are some game trailers out there that have the "RTX off / RTX on" comparisons, and I literally cannot see _any_ difference between them. This even though these trailers are supposed to specifically demonstrate the difference!
Of course then in games themselves it can be difficult to see where the "raytracing" is supposed to be happening. There are some games where it's extremely obvious and cool (such as Control), but then there are other games where it's a real game (hah!) of "spot the difference".
For example Deathloop, at least on the PS5, is supposed to have raytracing support. However, the _only_ effect that turning it on has is to tank the framerate so low as to make the game literally unplayable (and no, I'm not one of those snobs who thinks that 30 FPS is "unplayable". I'm not exaggerating here. Try it if you don't believe me. The game literally cannot be played when "raytracing" mode is turned on. The input lag is astonishingly bad. It would be bad even for some slow-paced turn-based tactics game, it makes it literally impossible for a fast-paced first-person shooter.)
And the kicker is: There literally is no visual difference between the normal and raytraced modes. Again, I'm not exaggerating. I have taken screenshots of the same locations in both modes, locations that ought to have some differences in them (significant reflections, shadows, lighting, etc) and switching back-and-forth between the screenshots shows literally no difference. Clearly the game is doing _something_ because the framerate is so horrendous in the "raytracing" mode, but visually there is literally no difference.
Such a scam. (And the worst thing is that I bought the game precisely because of the alleged raytracing support, which is rare on the PS5.)
As a man of culture, I prefer FPS over quality.
Main benefits of ray tracing are mirror effects like water reflecting, and actual mirrors, etc. That's where I can really notice it, especially once you know what rasterization limitations are.
Lmao no
Raytracing is a better rendering method which basically simulates light rays to create physically accurate results
While rasterization just draws some shape on screen and then fills colour like a small child
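That "draws a shape and fills in colour" jab is roughly accurate: rasterization tests which pixels a projected triangle covers, with no light simulation at all. A minimal sketch of that coverage test, using the standard edge-function approach (the triangle and grid here are my own illustrative values):

```python
def edge(a, b, p):
    """Signed area of triangle (a, b, p); its sign says which side of edge a->b the point p is on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize(tri, width, height):
    """Return the (x, y) pixel coordinates whose centers the 2D triangle covers."""
    covered = []
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)  # sample at the pixel center
            w0 = edge(tri[1], tri[2], p)
            w1 = edge(tri[2], tri[0], p)
            w2 = edge(tri[0], tri[1], p)
            # inside if the point is on the same side of all three edges
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.append((x, y))
    return covered
```

Everything beyond coverage, shadows, reflections, GI, has to be layered on top with separate tricks, which is exactly the contrast with tracing rays through the scene.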
I'm dying to see a TRON reboot multiplayer. Raytracing would seem the perfect choice, but it would be the worst game to have it. I want games to go back to old-school roots, not film adaptations like The Mandalorian.
I think RT is a miss. Watching the DF guys look at proper path tracing in Cyberpunk just shows that current RT is a half step; it's too neutered down for the hardware. I think full, proper path tracing is the future, but because it makes development faster, not really because it looks better.
And at the end of the day, the big issue is that it doesn't make any real difference. Proper 3dfx cards enabled several genres of games; sure, you could do an FPS with sprites like Doom, but walking into the world of EverQuest, a fully realised 3D world... that was the dawn of a new epoch. Path tracing is just not going to do that.
I'm a genius according to you. I spotted every single example of raytracing. But I'm very detail oriented and very picky, so I would obviously be able to tell in most scenarios.
Edit: I should note, I also think ray tracing is a gimmick rn. It's not usable until we have 4090 raytracing performance on a $300USD Card.
Raytracing is great for movies, where realism is desired. In games there is no need for it; you are trying to kill the enemy, not admire how realistic your character's butt looks.
Not all games are about killing the enemy
@@SalimShahdiOff I know. What I meant is that they should spend more time making games that are fun to play instead of games that look super realistic. Animal Well is an excellent game, while Hellblade is basically a movie.
As an illustrator, I can easily tell which side is the raytraced one, and RT GI/AO usually looks way better than rasterized GI/AO, especially in open-world titles.
If you are playing a single-player game with a 3080 Ti or better GPU, then you should try ray tracing IMO.
Still no reason to use it in competitive games though, until we can get a 3090-tier midrange (50/60) GPU.
The 5000 series is rumored to get a significant performance boost, so I'm thinking 5060 ~= 4070 ~= 3090 will be in a couple years.
I'm hoping 5060 will at least equal 4080 or 4070ti performance because the 3060 performed like a 2080, the 2060 performed like a 1080, the 1060 performed better than a 980, etc.@@joshuagoldshteyn8651
Well, today I learned that I don't give a shit about RT global illumination, as I picked the raster version as more visually pleasing for 2 out of 3, lol. I think ray tracing for reflections is really, really cool and way more obvious/impactful. Agreed that it's not worth caring about for now, as the performance cost is way, way too high for the vast majority of users. Give me the frames instead.
I think once RT has completely taken over rasterized rendering (which might be 10-15 years into the future), a lot of people will become nostalgic for the hand-crafted, artistically deliberate nature of rasterized graphics. You can actually tell this is already happening when people play new games developed in old-school game engines, where developers are painstakingly deliberate with everything they do in level design. Many people feel, myself included, that a lot of artistic talent is lost in games when we start to rely on large photogrammetry-scanned libraries like Quixel, in which assets are copied and pasted from a library rather than hand crafted. The same goes for lighting. Just have a look at Ion Fury and how deliberate the lighting is; designers even construct the map around the lighting, putting it into the actual vectors of the map. Or take Wrath: Aeon of Ruin, which uses the Quake 1 engine. It looks so moody and brilliant, yet it still uses GPU and game engine tech from 1996. Today, RT is a nice curiosity, and great in some places, but I am afraid it will become as bland as the current AAA gaming market.
I kinda disagree. There is a lot of stylization that can be done with RT. Just look at the CG movie industry and the things being done there. I do bet there will be some nostalgia as there always is for stuff like this but I don't think it will be bland. There is a ton of stuff that can be done with Ray tracing but it will rely on more artist driven stuff like textures, materials, and models. Just because you go ray traced doesn't mean you have to strictly rely on photo realism. Ray tracing can be perfectly suited for more cartoony or stylized stuff as well
@@crestofhonor2349 Yeah, but you still won't have the flair of old. It's like CG vs non-CG. He's 100% right about the old-game thing; the old-game style is extremely popular at this point.
@@crestofhonor2349 I don't understand why anyone would use RT; it doesn't look realistic at all. You need path tracing to be realistic. Until path tracing is implemented, non-RT nearly always looks more realistic and performs better. All that shiny, reflective shit, you don't see it in real life. Search for a river/sea picture on Google and compare it to a ray-traced fake reflection, and see how dumb it is.
@@Ay-xq7mjdo you realize that ray tracing came before rasterization? Do your research
@@christianwilliam1167 Are you an idiot? I'm talking about CG vs non-CG and old games vs newer games in terms of aesthetics. A ray-traced really old game shouldn't look like a really old rasterized game, because of how basic rasterized effects are. So you lose something, you gain something. Same way with CG vs practical effects in movies. Also, by old I mean games like Quake, Doom, etc. However, now even games going for a realistic art style have the same thing going on, where something is lost aesthetically in The Last of Us 1 remaster vs the original PS3/PS4 version. Look at Ultrakill and Dusk. Also, in this video and other rare instances, ray tracing makes modern games look worse, because art is as important to visuals as the tech. And read the first couple of sentences of what the OG commenter said.
is the reveal at 5:48 wrong? it feels like rasterised is on the left and raytraced is on the right
Main benefit of ray tracing is it takes less work on the development side to make things look good.
But since most people aren't using raytracing it doesn't really matter cause they'll still make it work with rasterized rendering anyways.
With ray tracing I can just throw a few lights in the scene and bam, done; with rasterization it's a bit more work if you want it to look good. (A lot of developers don't put in the work to make it look good. Rasterized renders are pretty decent out of the box at this point, so people don't put in the extra effort, and most consumers don't seem to notice. But I notice; I know you used the default settings in Unreal Engine 4.)
From what I've heard, if you go full ray tracing, it also makes the game easier to develop, because making a rasterized game look good is a lot of work. So in the future we might start seeing titles that are ray-tracing only.
I hope that never happens 😅 because cards like the RTX 3050 and 3060 already struggle to get good fps with RT at 1080p; budget gamers will be screwed
RT-only games will only be possible when the entire market has GPUs capable of doing RT optimally. Otherwise, even if the game is easier to develop thanks to RT, it won't sell well, because nobody will buy it if nobody can run it.
Got a 4080, and I don't activate RTX; the frame-rate loss is not worth it. Reflections and such have been pushed to a very high level with rasterization.
This isn't correct. Reflections are actually worse under rasterization. Reflections for puddles and water sources on the ground are often done by flipping the image 180 degrees and basically filling in the reflection that way. It's very flawed: it doesn't show objects that aren't on the screen. If you looked down on a reflective surface and a view of trees left your line of sight, their reflection would disappear. Secondly, reflections in transparent objects like glass also suffer greatly, as rasterization relies on inaccurate cubemaps as an approximation that doesn't appropriately represent what is being reflected.
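The screen-space limitation described here is easy to demonstrate with a toy sketch in Python. This is not any engine's actual code; the function names and the 2D "projection" are invented for illustration. The point it shows: a screen-space reflection can only sample pixels that are actually on screen, so a reflected point that projects outside the frame has to fall back to something like a cubemap (or simply vanish).

```python
def project_to_uv(point, view_w=1920, view_h=1080):
    """Toy projection: map a 2D world point straight to UV screen coords."""
    x, y = point
    return x / view_w, y / view_h

def ssr_sample(reflected_point, screen_buffer, fallback="cubemap"):
    """Screen-space reflection lookup: only works for on-screen points.

    Anything whose projection lands outside [0, 1] UV space (off-screen,
    behind the camera, occluded) simply cannot be reflected; real SSR
    implementations fall back to a cubemap or fade the reflection out.
    """
    u, v = project_to_uv(reflected_point)
    if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
        px = int(u * (len(screen_buffer[0]) - 1))
        py = int(v * (len(screen_buffer) - 1))
        return screen_buffer[py][px]
    return fallback  # off-screen: the reflection just isn't there

# A 2x2 "screen": trees visible at top-left, sky elsewhere.
screen = [["trees", "sky"], ["sky", "sky"]]
print(ssr_sample((100, 100), screen))    # on-screen point: real colour
print(ssr_sample((3000, -500), screen))  # off-screen point: fallback
```

This is exactly the "trees disappearing from the reflection" effect the comment describes: once the trees' projection leaves the frame, the lookup has nothing to sample.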
Honestly... I like the idea of ray tracing
But yeah, it doesn't do too much for me
I remember switching it on and off in Cyberpunk and didn't notice a difference on a major scale
Oh yeah? I bought it 2 months ago but have not played it yet. Anyways, what GPU are you playing it with? Just curious.
@@alals6794 rx 6650 xt
My graphics card isn't really built for ray tracing but when I've seen it on a PS5 it's so little I can't notice it
Had problems before, but that was because my CPU was pretty old; got a newer one and I'm loving Cyberpunk
Do have some frame dips in the DLC area, but other than that it still looks amazing, to be honest
Been playing some god of war and it dips in the main area with the gates
.... kinda understandable though because its rendering so much and its a huge area
Considering I've seen the PS5 version, and the same with Cyberpunk, I can't notice ray tracing too well
(turns out the PS5 has raytracing although I don't know if performance mode affects it)
Did tell my partner to put it on quality mode and she couldn't do it xD
Once you go 60 frames you can't go back to 30
The thing is too... games are art. Art is about illusion. Some of the most dramatic and beautiful scenes and moments in gaming, are built around heavy use of illusion; sky boxes with pre-rendered backgrounds, levels intelligently culling detail so that they can optimize performance but imply far more than is really there, tricking the imagination. Just look at some of the crazy stuff they did to make Dead Space 2 work, for example, with entire level cityscapes appearing or disappearing in the background...
This is the real artistry of gaming. To take what isn't even remotely real, and make the player think it's real, and to do so in an efficient way.
Remember when Star Citizen was proud of the fact that it had re-invented the way FPS games handle the camera, and had stuck a camera in the character's actual head, and then discovered that the character's movements were chaotic and unpleasant to look at, so they created a new way to stabilize the camera in the character's head so that it looked... uh... like the old FPS cameras did?
That's RTX. Flinging a lot of processing power at a problem that was already solved by the artist.
I'm playing Metal Gear Solid 1 at the moment, the PS1 version emulated. I'm not doing anything fancy to the visuals; I have a CRT filter on, that's about it. You know what really strikes me about the game? The lighting is beautiful.
Oh, I can see that it's not "real" lighting; if I were to guess, I'd say most of the lighting is baked into the textures themselves. But the choice of colours, the striking way the game uses neutral shades of grey and blue and green to somehow accentuate what matters, and make things stand out, even when the colour palette is so subtle? Beautiful work. Distinctive, and memorable. And very, very deliberate.
What is it, exactly, that RTX offers... except the automation of that which artistry had already conquered?
And through that automation, are we not in danger of losing that deliberate artistry? In the Witcher 3 example - the raytraced version demonstrated that, if we're going by "realistic" global illumination, the reflective qualities of something simple as pale stone... results in the lighting getting washed out; no more high contrast shadows, serving to draw attention to the player's movements through the world... just a whole lot of glare. And sure, that glare is realistic.
But games aren't meant to just be "realistic". The lighting of a scene is supposed to be like the lighting of a movie, or a painting; it's supposed to serve specific artistic purposes, not just capture how things should realistically look.
2:33 To clarify, raytracing usually isn't rays from the light sources, but actually done in reverse, where for each pixel on screen, a ray is traced from the camera/eye out to those points and bounced until they either hit a light source or don't, and then propagate that information back down the path. This drastically cuts down on costs since you only trace what would actually be visible.
But it isn't 100% physically accurate, though close enough. When you want higher physical accuracy, that's where "path tracing" comes in: it still starts rays at the camera, but traces many randomly scattered bounce paths per pixel, so lighting even from far distant sources gets bounced and scattered all over the objects. (Tracing from the light sources themselves is usually called light tracing, or bidirectional path tracing when combined with camera rays.) Either way, it's quite expensive.
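The camera-first ("backward") tracing described in the comment above can be sketched in a few lines of Python. This is a toy model, not production renderer code: the `scene` callable, its dict fields, and the single-sample recursion are all invented for illustration of the idea that a ray starts at the eye, bounces until it finds (or fails to find) a light, and propagates that light back down the path.

```python
def trace(ray_origin, ray_dir, scene, depth=0, max_depth=3):
    """Toy backward trace: follow a camera ray until it finds a light.

    `scene` is a hypothetical callable returning the nearest hit as a
    dict with 'emission', 'albedo', 'point', and a 'bounce' function
    giving the next ray direction (e.g. a random hemisphere sample).
    """
    if depth >= max_depth:
        return 0.0                      # gave up: contributes no light
    hit = scene(ray_origin, ray_dir)
    if hit is None:
        return 0.0                      # ray escaped the scene
    if hit["emission"] > 0.0:
        return hit["emission"]          # reached a light: pass it back
    bounce_dir = hit["bounce"]()
    # Recurse, attenuating by the surface albedo on the way back down.
    return hit["albedo"] * trace(hit["point"], bounce_dir, scene,
                                 depth + 1, max_depth)

def scene(origin, direction):
    """Trivial made-up scene: the camera ray hits a grey wall, and the
    wall's bounce ray hits a light with emission 2.0."""
    if origin == "camera":
        return {"emission": 0.0, "albedo": 0.5, "point": "wall",
                "bounce": lambda: "up"}
    return {"emission": 2.0, "albedo": 1.0, "point": "light",
            "bounce": lambda: None}

print(trace("camera", "forward", scene))  # 0.5 * 2.0 = 1.0
```

The cost saving the comment mentions falls out naturally: only rays that start at visible pixels are ever traced, instead of countless light-source rays that never reach the camera.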
Cyberpunk just implemented path tracing in today's update, which is exclusive to 40-series cards and basically requires a 4090 running the game at a very low resolution with the newest DLSS upscaling to get acceptable framerates haha
RT only increases the visuals by maybe 15% while costing 40%+ performance on any card. It's really not worth it, for me at least; I'd still wait for better tech before considering it.
Hmm, on the Fortnite comparison I actually liked the trees better on the rasterized side, because they had more shadows in them, which to me made them look more realistic; so I picked that side as the RTX one and guessed wrong.
Hmm, I guessed wrong on the Witcher one as well, I just like dark shadows too much haha.
On the Tomb Raider one I was right; going by your comment, I did spot the less pixelated shadows.
Apart from Fortnite, I barely noticed a difference, and I liked the Witcher 3 rasterized image better for some reason. I don't know; ray tracing needs another 2 or 3 generations of GPUs to actually be worth investing in as a gamer. As a producer it is a GODSEND, I have no doubt.
Was the fps count on the Tomb Raider one reversed? The raytraced side is shown with a higher fps number.
Guessed all 3 correctly, but I'm pretty sure the first 2 could at least fake the global illumination without raytracing, while on SotTR it was mainly the softness of the shadows, which can be done rasterized as well.
The only game which shows off ray tracing nicely is Minecraft.
The only good AAA RT game is Metro Exodus Enhanced Edition: semi-photoreal lighting while performing better than, or on par with, the rasterized version. The problem is that developers just don't spend enough time optimizing their game/engine; recent examples being Hogwarts Legacy, Dead Space, and Atomic Heart. Unreal Engine's Lumen is just mediocre reflections, while the goal should be RT GI.
Eh RT on Atomic Heart is out? I do agree with the Metro statement, it runs beautifully for a fully ray-traced game.
@@MAzK001 I was more so talking about game optimization in general, I don't think the RT is out for that game, and hope it never will be, as I sadly made a clown of myself by preordering that trash heap of a game.
@@3rd.world.eliteAJ Ah i see. What GPU are you running? It runs quite well on my 7900XTX at native 3840 x 1600.
@@MAzK001 Depends on when you played it. I played it day 1 and it ran quite terribly on my 3060 Ti at 1440p. I even tried running on low settings, but it just kept stuttering like most Unreal games with shader-caching issues. And I'm on a 5600X + 16 GB RAM with an NVMe SSD, so my system isn't the problem.
Furthermore, my gripe with the game wasn't really the performance, as I completed it with these issues, but rather that the game is a 5/10 for me at best.
@@3rd.world.eliteAJ ah I see, I played it quite recently and it runs quite well on max settings. Perhaps it's been fixed then! But yeah I haven't finished the game but it does feel rather tedious so far.
I've been looking into this for like 5 minutes and I could guess all your tests easily. It's pretty obvious when it's on or not and what changes. Whether it's worth its hype or worth it I have no idea. But it clearly does things.
I do love ray tracing. It's a technology that's super interesting and will be the future of in-game lighting for 3D games going forward. It's just about how it's implemented. It can solve lots of issues that today's games have with ambient occlusion, shadows, global illumination, and reflections under standard rasterization. Whether I turn all the features on is up to me on a game-by-game basis. Cyberpunk and Metro Exodus Enhanced Edition will always be great examples of ray tracing and its benefits. Once UE5 finally starts getting major releases, ray tracing will become even more prevalent, because Lumen can utilize ray-tracing hardware to improve the quality of its virtual shadow maps, global illumination, and reflections. Ray tracing is also well suited to both stylized graphics and photorealism, assuming the artist can do it. Yes, we are far from fully ray- or path-traced games due to GPU ray-tracing performance, but every new generation we inch closer. I assume it will truly take over once the PS6 and whatever the next Xbox is come out. It has a lot of potential, but many people just don't know what any graphical feature really is, only whether it has a large performance impact and whether it's worth it.
The thing about ray tracing is that it's one of those technologies only developers get hot and bothered over. Most consumers don't understand what it is, and they can't tell much difference, like they could with, say, PhysX. So many think it will die out like PhysX did. Meanwhile, game artists often move between the game industry and the 3D movie industry, so they are fully aware of what ray tracing is and its benefits, both visually and at the workflow level. It's been around since the beginning of 3D technology, so they are very familiar with transitioning to it, and that's why it WON'T fade away like PhysX did. In 3D movies it has been the go-to lighting system from the get-go, from Terminator 2 to Toy Story. Ray tracing is the history of 3D rendering, which is why it will be the future of game rendering now that we actually have hardware that can process it at real-time frame rates.
I agree@@DenverStarkey
I think you swapped the raytraced vs rasterized for Tomb Raider. Raytracing should make shadows softer, not sharper, and it shouldn't increase performance.
I've said it once: RTX is still in its infancy, and it impacts game performance quite a bit. That said, I wouldn't use ray tracing unless I was curious to see how it looks in a game that has it, and then I'd just turn it off. RTX looks beautiful, but I wouldn't use it unless I saw higher performance with it.
Nvidia believes in pushing RTX over DLSS, when DLSS is a godsend; they'd rather give you the shiny over the performance lol
Do they though? DLSS has made some pretty huge leaps, but the devs have to actually implement it. Also, while Nvidia is ahead in RT performance, RT can be used on any brand of card, while you'll need an Nvidia card to use DLSS.
@@wiremesh2 True, but as for RT, I honestly don't think we're quite there yet (for gaming, anyway). RT has been around for quite a while; it was implemented in CGI films like Pixar's long before most of us had even heard of it. Give it a few more years in the oven and RTX will become the new standard Nvidia is shooting for. I say they're choosing RTX over DLSS because that's exactly what they're trying to do; if you've listened to most tech tubers, it sounds like they're pushing RTX more, yet they give us cards without enough VRAM to use it unless it's an 80- or 90-series card. They need to do what AMD does and include 10-12 GB of VRAM if they truly want to push RTX on us. After all, ray tracing is their primary marketing at the moment.
@@blckmlr7573 I guess you're right in terms of marketing. I should have noticed, considering how many people use the term "RTX" for ray tracing, when that's specifically an Nvidia term. It's become ubiquitous.
3:54 That's something I saw in a video: a guy was talking about how RTX "was bad" in RE4 because he couldn't see a reflection on the water compared to rasterization, and I made the point that the "no reflection" was actually correct, because there's no way light realistically reflects off the water at that angle. Sometimes we prefer those non-realistic "game fakes" where the light shines and makes things brighter. 8:52 As you said, in single-player games it's not so important to have 120 FPS, and Marvel's Spider-Man is a really good example for ray tracing: my FPS dropped from 110 to 65 with RTX. At the beginning the reflections are great and something you notice, but after some hours I disabled RTX for lower power consumption; it's not that huge a change, and I prefer performance.
I prefer the one that doesn't need DLSS or FSR to run properly, so yeah, ray tracing still has a long way to go, at least before we can have it without having to sell a kidney.
I dont know what ray tracing is, and I dont care either. What I've seen so far is that glass is WAYYYYY too shiny.
Dunno if I'm the crazy one, but the only comparison I got correct was the Fortnite one. I assumed RTX ON in Tomb Raider and The Witcher were the other way around. The interesting thing is that I also compared which image looked better to me, and in all 3 cases I preferred the side that turned out to be RTX OFF... Wild.
Try Amid Evil; you can really see the difference between ray tracing and rasterisation there. The game looks fantastic with ray tracing. I know I'm the odd one out in that I really like ray tracing, and wanting RT did affect my latest PC build, certainly to my detriment. I wanted to downsize from a full-ATX build with a 3070 Ti and 21:9 monitor to something way smaller, and decided to go full team blue with an Arc A750; ray tracing was a factor in that vs a better card from AMD for the same price or cheaper. I will say RT performance is all right on the A750 in less demanding games like Amid Evil.
I've tried Amid Evil, game looks fine either way, though for me the non-RT one looks slightly better.
RT actually shines in smaller, simpler games. For AAA titles it's still too expensive on current hardware.
I think the end point of RT is just as another notch on the quality settings. Like, as an indie dev who works in unreal, I don't see a lot of people I know using RT for their projects. Lumen can be used without RT, and still produce great results. The RT just makes things like reflections more accurate, but Lumen does plenty of other things like allowing for emissive materials to cast light, and that doesn't require hardware RT, i.e. Nvidia or AMD path tracing solvers. Godot also has a novel, similar lighting system called SDFGI, or Signed Distance Field Global Illumination, which is also not hardware dependent. Similar to a point you make, rastering is already so good that in a lot of cases RT is just serving as the equivalent of a new notch on the shadow and reflection quality sliders.
Right now RT is like Hairworks was back in the day. It's a marketing gimmick, where Nvidia can pump a lot of resources into partnering with a game to show off the tech to push their cards. It's not a reasonable solution, since it's pure brute force with a technique that just isn't suited for real time on consumer hardware to begin with, and as a result AMD and Nvidia are developing stuff to side step its hit. What I see as more likely than stuff like DLSS persisting as a necessary evil of RT, is simply engines getting better at just doing that work on their end (like with Lumen), so they don't have to worry about it. Because at the end of the day, the best solution for the devs in terms of accessibility is going to be making sure people have access to the stuff they show off at keynotes and stuff like the VGAs. Making the game equally performant for everything is significantly more desirable as a dev than relying on someone else to make sure you don't exist in a weird edge case where their stuff doesn't play nice with yours - a thing we've already seen a bunch of with DLSS and FSR.
Beyond that, at some point Nvidia and AMD will wear out the top end of the market, and there will simply not be more people willing to upgrade if they have a 4090 or 5090 that's still running games well because the underlying tech has matured. Which is something I don't think they're really counting on with DLSS and FSR in the mix. If anyone cracks the code on a better method of efficiently faking GI even closer to RT, which I think is definitely within the realm of possibility, then there's a good chance that DLSS and FSR create some very nasty clashes between the companies and their consumers as support is dropped to try to maintain the high end. Either that or Nvidia and AMD eat a boatload of losses. And that's before you consider issues with how the AAA space operates as businesses in terms of their ability to sustain these massive Live services at the top of the industry and whether or not indies that already rely more on stylization or just designs that don't lean on fidelity even care at all about RT as a feature. My money is on the top falling off, the indies not caring (because I'm surrounded by them all day), and the ability of all the code wizards out there to negate their reliance on hardware.
Software Lumen is still RT; hardware Lumen is just more accurate and more expensive. Your statement that RT isn't meant for real time is simply wrong. Rasterization has its limits, and we are right at them. There are already situations where RT performs better, and it scales way better. If you want to keep increasing graphics quality, rasterization can only go so far.
I just guessed based on which look I preferred, assuming that was ray tracing, and I was surprised to find that, in my opinion, rasterized looked better, especially in The Witcher 3
The Witcher 3 seems wrong? The shadows are way too sharp, I don't know but other raytraced games have softer shadows because that's how it is in real life.
@@dhgmrz17 the witcher uses RTGI & RT reflections, not shadows afaik (so both use oldschool shadowmaps). The difference is very obvious in places where skylighting doesn't physically reach. In the rasterised scene skylighting is applied to places where it shouldn't (like an overhanging pass or an archway). It'll have this typical blue/gray hue to it.
@@MLWJ1993 A lot of people are also very used to the way rasterized graphics look, even though they actually look weird once you become aware of the difference. I can no longer unsee that a lot of places in The Witcher 3 are lit as if in direct sunlight even when they're not, but I find people often compare how games look not to real life but to other games. A lot of people complaining about surfaces being overly reflective with ray-traced reflections would be surprised to see how reflective many surfaces are when you actually pay attention to how light bounces in an environment. There are also indirect benefits to ray tracing that aren't clear just from comparisons of games not built around it: real-time global illumination does great things for letting artists iterate over scenes more quickly to make the lighting actually look good. I should also say that just because a graphical feature lets you approximate real life more accurately doesn't mean it can't be used in stylized art. Stylization doesn't mean people stop caring about things like anatomy and lighting; those are very important even in stylized work, so having easier-to-iterate, more realistic lighting methods does a lot for everyone, not just people trying to make hyper-realistic shit.
Ray tracing is so useless. You get 10% better visuals but lose 50% of your performance. In some games like Cyberpunk, RT literally kills your GPU. My 3070 runs 70 fps on Ultra 1440p native, but RT gives me 18 fps. It's ridiculous.
Holy shit... that destroys your GPU, and the 3070 is beefy... I bought Cyberpunk 2 months ago but haven't played it yet. Anyway, I'll just skip the RT, because all I've got is an Intel Arc A770 16GB, lol
I don't get it; if I could choose to enable ray tracing, I would. It looks so beautiful for single-player games. If you can have a taste of the future, just do it. I mean, that's one of the reasons people moved away from consoles back then.
rasterized looks so much better what
Not worth it. Notice how even Nvidia realized that ray tracing, even after 5 years of them pushing it, is still extremely demanding and pretty much useless, which in turn forced their hand into borrowing frame interpolation, something that has existed FOR YEARS, and calling it DLSS 3 Frame Generation so they can magically claim to double your FPS, which is a lie. If I remember correctly, their Frame Generation depends on Nvidia Reflex, which in turn is something AMD introduced to the market first with RDNA 1's Anti-Lag; Nvidia Reflex only arrived with Ampere. So yeah, thanks AMD for Frame Generation not multiplying your latency by 4x.
So much AI tech just to give you enough frames. Yeah, it's goofy rn
The simplest way to describe RT:
An occlusion test between a ray and a primitive.
Rasterization solves occlusion for primitives projected into the view frustum.
RT's advantage comes from the ability to easily sample the world from any location in any direction. (In rasterization this is just not a good idea.)
How you try to solve the rendering equation with these tools is where the differences become obvious.
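The "occlusion test between a ray and a primitive" can be made concrete with the classic ray-sphere intersection test, sketched here in Python. This is a toy version (real tracers add epsilon handling and return the hit distance and normal), but the quadratic-discriminant form is the standard one.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Occlusion test: does a ray intersect a sphere in front of it?

    Solve |origin + t*direction - center|^2 = radius^2 for t; a real,
    non-negative root means the ray hits the primitive.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2.0 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return False                    # ray misses entirely
    t = (-b - math.sqrt(disc)) / (2*a)  # nearest root
    return t >= 0                       # hit must be in front of origin

# Ray from the origin along +z toward a sphere at z=5: hit.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))   # True
# Same sphere, ray pointing the other way: miss.
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, 5), 1.0))  # False
```

Because this test takes an arbitrary origin and direction, it can be fired from anywhere toward anything, which is exactly the "sample the world from any location in any direction" property the comment describes, and exactly what rasterization's fixed view-frustum projection cannot do.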
I think ray tracing is the future of illumination in games. Perhaps specific exceptions for artistic reasons.
However, GPUs in their current state aren't good enough for it (in real-time gaming applications). Perhaps when the ##50 cards match the 3080 in fully path-traced scenarios; but even then, the 3080 can't lock 1080p 60 fps in games like Portal RTX without DLSS.
Yeah, ray tracing just isn't really worth it right now, since 80% of graphics cards can't handle it
Ray Tracing is incredible. But it needs performance updates. Not sure what kind of magic they’re gonna do to make that happen, but things only ever get better. So we’ll see how this video holds up 10 years from now! Great video btw!
So in conclusion get an AMD card
Pretty much, yeah; ray tracing and CUDA acceleration are the only main downsides
I would like to point out that most RT-enabled games are built to support both rasterization and ray tracing, which means a lot of software compromise. Look at Metro Exodus Enhanced Edition, which was built to purely support RT: you get both visuals and performance. It's a considerable software limitation.
I don't think it's a software limitation as much as it's an art direction and time limitation. The issue is that the art directors, QA, and others all have to review and design every part of the game's lighting in every scene TWICE. Once with traditional shadow maps, probe GI, reflections, etc. Then another time with each RT effect enabled.
@@bananaboy482 It is increased work, sure, I'll agree. But if you look at it, most games we talk about when RT comes into the picture implement RT over rasterized techniques so they're playable on all hardware. Even if you're using RT, you're wasting computation on preliminary calculations for rasterized lighting.
Again, I'll use Metro Exodus as my example; it has some pretty heavy RT (RTGI with infinite bounces, while most other games just have reflections, shadows, and lighting with 3 bounces). RT on the Gold Edition is about as performant as you'd expect, which is to say limited light bounces and low fps. The Enhanced Edition has its entire raster lighting ripped out in favor of a pure RT implementation (which means you can't run it on non-RT-capable hardware). The thing is, this RT-only optimized approach runs the aforementioned heavy-duty RT effects with performance that equals (if not exceeds by 1 or 2%) the raster version of the game.
Considering game dev times, most of our so-called RT games are really raster games with RT slapped on top. That said, not everyone has the time or money to build an RT-only edition (there's a reason Metro Exodus Enhanced Edition is the only game I know of to have done this), so until game engines embrace optimized RT-first lighting (or at least swappable lighting systems instead of overlapping ones), I'm not holding my breath for performant RT.
P.S. I just wanted to put this out there because I'm tired of hearing people say there's no difference (because the RT implementation is skin deep) or ask if it's worth the performance hit (right now, it depends on the game, but that's an optimization problem, not a ray-tracing one).
P.P.S. Typed this on my phone, too lazy to spellcheck
@@shadyti5 I don't think you understand what you're talking about. You're not wasting preliminary calculations on rasterized lighting, as the game is still inherently rasterized; it's just using a lighting cache computed with triangle ray tracing. Also, Metro Exodus Enhanced Edition runs nearly identically to the original Metro Exodus with RTX GI probes. The only difference is the implementation of the lighting calculations, using a surface cache instead of augmenting the original probe-based lighting. In your words, Metro Exodus Enhanced is exactly a traditional rasterized game with RT effects layered on top. It would be stupid to do otherwise and have fully ray-traced direct lighting like Portal with RTX. It doesn't have to do with developer time or optimization: building a BVH on the CPU and doing triangle ray tracing is always going to be very expensive without dedicated hardware acceleration, and even then we can see that it's very difficult. The improvement in ray-tracing performance from the 20 series to the 40 series is almost nothing; they just added more cache to help speed it up.
4:48 - On the left screen everything is so bright; it has fewer shadows and less personality. Rasterized is always better in dark scenes; however, in any scene with bright sunlight reflections, RTX is better.
How to tell? RT is usually brighter, glossier, and at times not very realistic looking. Is it needed? Probably not at this time, since they need to improve RT first. I feel like sometimes RT just overdoes the reflections. When it was first announced, to me it was just a sales pitch to get people to buy; they needed to bring something new to the table or their GPUs would look boring next to their rival's. But ray tracing has always existed in the past; it's just that no one really cared to make a big deal out of it until now.
5:45 I picked left being raytraced because of the lighting on the distal zone of the scene.
It's more rewarding to my eyes, and so it would have been more than justifiable for me to turn it on if not for how taxing raytracing is on performance.
In the Tomb Raider scene, ray tracing has more frames than rasterisation; I think you made a mistake 😅
All the people who think the RT Witcher clip doesn't look right need to go look around during a low to no cloud day outside. The RT clip is believable, the other one looks fake because light doesn't just go poof and disappear like that. It bounces around and diffuses in the atmosphere. I've never been outside during a sunny day and been suddenly in the dark because of a shadow (though a couple full eclipses came close, but that's a pretty extreme shadow there).
The bulk of gamers are mostly maximizing a frames vs monetary cost function. RT increases monetary cost and decreases frames. It should be no surprise it isn't wildly popular.
Sure there are gamers who have extra money to spend and play mostly single-player games with high-fidelity graphics. But that pool of customers isn't going to be the majority by a longshot.
As you say, RT is only ever going to hit the mainstream when it can compete on price and performance with rasterization, which is going to be a while.
I'll care about ray tracing when it becomes a mainstream feature in games, which will start to happen when we get 3080-level ray tracing in a $300 entry-level card with at least 12 GB of VRAM. Originally, I figured that would be with the 5060 in 2025, but the way prices keep getting jacked up... Maybe by 2030.
Please add more ray-tracing tests. I got The Witcher one wrong. The takeaway seems to be that ray tracing isn't bad, but rather that rasterization is really advanced and has come a long way.
One HUGE advantage RT has over rasterization on reflections is obviously that RT does not depend on screen space so there's no weird missing reflections when the camera is not at a straight angle from the reflective surface, another advantage should be global illumination but Unreal developers also did an incredible job at emulating that with Lumen
Note: I actually hate that nobody knows what RT is; they think it's just fancy graphics with tons of filters on, like all those content creators making "RTX ON" game remakes or literally just putting Nvidia filters on top of Genshin Impact and calling it RTX, or shit like that.
Edit: I almost forgot about colored shadows and soft shadows, but nobody seems to utilize those with RTX.
Raytracing is a great tech for saving game companies time. The reason it isn't much better than rasterization right now is that game developers have gotten so good at faking realistic lighting with rasterization that RT doesn't make that much of a difference. In the future, though, it will make creating games much faster, since developers won't have to worry about how the lighting is going to turn out.
Fortnite reflection raytracing is still very screen-spacey. You can still see the pickaxe reflecting in the water despite being closer to the camera.
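The screen-space limitation mentioned in the comments above can be sketched as a toy example. This is a deliberately simplified, hypothetical 1-D "screen": the point is only that SSR can reuse pixels already in the frame buffer, while a ray tracer queries the full scene.

```python
# Toy sketch (hypothetical names, 1-D "screen") of why screen-space
# reflections (SSR) miss off-screen geometry: SSR can only reflect
# pixels that are already visible in the current frame buffer.

def ssr_reflection(screen, hit_x):
    """Return the reflected color if the reflected point is on screen,
    else None -- the classic SSR failure case."""
    if 0 <= hit_x < len(screen):
        return screen[hit_x]      # reuse the already-rasterized pixel
    return None                   # off-screen: no data to reflect

def raytraced_reflection(world, hit_x):
    """A ray tracer queries the full scene, not the frame buffer,
    so off-screen points still reflect correctly."""
    return world.get(hit_x, "sky")

screen = ["red", "green", "blue"]                        # visible pixels only
world = {0: "red", 1: "green", 2: "blue", 5: "yellow"}   # full scene

print(ssr_reflection(screen, 1))       # green  -- works while on screen
print(ssr_reflection(screen, 5))       # None   -- reflection pops out
print(raytraced_reflection(world, 5))  # yellow -- RT still resolves it
```

This is also why objects closer to the camera than the reflective surface (like the pickaxe in the Fortnite example) can incorrectly appear in screen-space reflections: SSR works purely from what the current frame shows.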
It's very strange that many "ray tracing" graphics modes only affect shadows and lighting but don't affect reflections at all, whereas some games only focus on reflections.
If you only play games, ray tracing is pretty cool, but not necessary.
If you work with graphics, such as making video games, offline/real-time rendering, or movies, a ray tracing-capable card is a MUST.
The only real difference I really enjoy with RT is the indirect lighting. So in games that have RT, I usually either enable just the lighting or, if individual parts of RT can't be toggled, I just turn it all off.
Showing ray tracing examples outdoors isn't a very fair example, because a blue sky is dead easy to approximate in rasterization, and the bounced light isn't going to outshine it.
Try some indoor areas lit by multiple light sources, throw in some reflective surfaces and the visual difference is massive.
Also, in the last 20 years we've been trained to ignore what's in the reflections, because there's never anything useful to see in there. In ray traced reflections, you have an opportunity to see reflected things that you can't otherwise see on your screen, which can be an advantage.
What's the background music? It's super familiar to me.
DDLC?
@@vextakes Thanks 👍🙏
For most games, RTX at this point isn't much different from rasterized rendering. However, for rasterization to look realistic, developers use lots of tricks like light/reflection probes and bake almost all lights in scenes beforehand. What raytracing allows game devs to do is create dynamic scenes with dynamic objects that change all the time and still render them realistically, without needing to re-bake those altered scenes after changing a thing. For most games this isn't really necessary, because most games consist of static scenes with a few dynamic objects. For visualization or architectural applications, raytracing is an amazing technology because it lets users interact with their "scenes" and see realistic results without having to pre-bake that scene for hours or even days.
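The bake-once vs. recompute-per-frame trade-off described above can be sketched in a few lines. Everything here is hypothetical (the "lighting" is a stand-in falloff function, not a real GI solver); the point is that a baked lightmap is a cheap lookup that goes stale the moment the light moves, while dynamic evaluation stays correct.

```python
# Hedged sketch of baked vs. dynamic lighting (all names hypothetical).
# Baking pays the cost once offline; runtime lookups are cheap, but the
# scene must stay static. Dynamic (ray-traced) lighting recomputes per
# frame, so a moved light "just works".

def compute_lighting(light_pos, point):
    # stand-in for an expensive GI computation: inverse-square falloff
    dx = light_pos - point
    return 1.0 / (1.0 + dx * dx)

# Offline bake for a static light at position 0.0
LIGHTMAP = {p: compute_lighting(0.0, p) for p in range(5)}

def shade_baked(point):
    return LIGHTMAP[point]            # cheap lookup; wrong if light moves

def shade_dynamic(light_pos, point):
    return compute_lighting(light_pos, point)  # correct for any light

assert shade_baked(2) == shade_dynamic(0.0, 2)   # matches while static
assert shade_baked(2) != shade_dynamic(3.0, 2)   # stale once light moves
```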
The current implementation of ray tracing does look good, especially when developers add the full suite instead of just doing shadows and ambient occlusion and slapping the ray tracing label on. Most people have no clue what it actually is.
Real-time path tracing will be the showstopper in terms of visuals, but it will take another 5-10 years before the performance impact is negligible, so the RTX 7090 Ti and beyond is when I would expect path tracing to be fully viable (reflections, refractions, shadows, ambient occlusion, global illumination).
That being said, probe-based and cube-mapped rasterized lighting, reflections, and shadows are getting very good and in some cases look similar to today's ray-traced counterparts.
Cyberpunk is running 4K60 with DLSS and without frame gen. A 5080 will run it easily. RT performance doubles per generation, and I think that will continue for some time.
You should have tried metro exodus enhanced edition.
Good rasterisation can visually look better than bad ray tracing and vice versa, and in the end it's a relatively small feature for the huge price that comes with it.
Lighting, just like any other part of a game's visuals, can be heavily stylised. I really disliked how RTX looked in The Witcher because it felt like the contrast was taken away, while the stronger, more contrasting shadows in the raster version hid some of the imperfections in the textures and models.
I found myself wrong multiple times with your tests.
The crazy part is , I also use raytracing on all my games.
I notice ray tracing more when I'm playing the game vs watching a video.
But very often i find myself wondering "what's the difference?"
It's not just about FPS in MP, it's the trade-off against other settings in SP.
For example
- If you can run RT on at 1440p, but RT off at 4K, you're going to choose 4K over RT.
- If you can run most settings at "high" (e.g. post-processing, texture resolution, volumetric effects, water effects, etc.) with RT on, but "ultra" with RT off, you're going to choose maxing out everything else over RT.
In other words, it doesn't just need graphics cards capable of running RT in some form; it needs graphics cards capable of running RT at 4K with everything maxed out.
And at the moment that means only the 4090, and there's very few people who want to drop over £2k on a graphics card (and fit the big bastard in their case and feed its power requirements).
Ray Tracing is especially awesome in older, highly stylized games. In these situations, I think the lighting just really pops. When hardware gets gud enough to run raytracing in stylized VR environments at a high frame rate and there is haptic feedback, game over man.
Unbelievable I got every one wrong 😂
Here's my opinion: I think ray tracing just outright looks better, but at the moment it has a long way to go (mainly in performance) before it's worth actually enabling in most games. I have a 3060 and mainly only enable it when I want to take pretty screenshots in games like Fortnite and Minecraft.
I do play with the highest settings in Fortnite with Lumen etc. enabled, but I leave hardware raytracing disabled. The difference between software and hardware raytracing is barely noticeable to me anymore.
@@OctoFloofy Lumen is still ray tracing. So you do agree with their point that ray tracing looks better
I can honestly say I prefer rasterization. Thank you for making this video; now I know I'm going to jump ship and join team RED.
It's funny, with the tests I chose rasterized every time. I guess I'm used to it; in some cases I even found it more natural looking. That being said, when a game gives a little visual extra and it doesn't affect performance too much on my hardware, I leave it on, like in Doom Eternal or Metro Exodus. In Cyberpunk I never used it because the game already looks amazing rasterized on an HDR TV, and the performance hit with RT is huge.
So here's the thing: in games with static environments, they are technically already using raytracing for GI. It's just that the rays are precomputed and then baked into lightmaps, so visually it isn't going to be much different.
Almost any dynamic lighting model will generally look, at best, only as good as a static lighting model. Most look slightly worse. And all dynamic lighting models run worse than their static counterparts.....provided nothing is moving.
The difference is when things move. Static lighting severely limits how dynamic the world can be. The ground can't move. People moving through environments won't see the environment reflected back at them. Forget about a dynamic day/night cycle or destructible environments. But if the map is completely static anyway, then visually there will be virtually no difference. Pretty much just reflection accuracy.
And when rasterized games do have dynamic environments, they usually look rather bad, or they have to do some really rough approximations to get anywhere close. They can't pre-render their shadows, so the shadows are generally fixed-blur shadow maps, vertex projections (Doom 3 style), or tricks to try to make variable penumbras from what are still essentially basic point lights.
But if you want lighting from an animated LED billboard? Good luck. Rasterization doesn't actually have a reasonable means of handling area lights. The closest approximations were extremely intensive.
This also matters a lot in game development. The reason many rasterized games look as natural as they do today comes down to a lot of tricks the level designers themselves do. They place fake lights freaking everywhere to tweak this place to be brighter or that place to look like it has reflected light coming in. With raytracing, that just isn't nearly as necessary. If you place a glowing sign, it'll just make light. You don't have to place baked lights with shadows disabled to hide the fact that you are using point lights to emulate an area light. You just place a thing with a glowing texture.
This goes completely nuts if your world is procedurally generated, which is why Minecraft is such a good test bed for raytracing. Everything is procedurally generated, and everything can be broken. There is NO static lighting in Minecraft. So adding raytracing, ray marching, voxel illumination, or any other method makes a HUGE difference in how the game looks.
But if you're playing CS:GO where the maps are basically just static stone with zero interactivity, then yeah it'll be really hard to make raytracing look any better than what the level lightmap calculator already does.
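The area-light point made a few comments up can be shown with a toy Monte Carlo sketch (all names and the 1-D geometry are hypothetical). A single point light gives a binary visible/blocked answer, so shadows are hard-edged; sampling many points across an area emitter, the way a ray tracer does, yields fractional visibility and therefore a real penumbra.

```python
import random

# Hedged toy sketch: a rasterizer's single point light yields a binary
# hard shadow, while ray tracing estimates area-light visibility by
# sampling the emitter, producing soft shadows. The emitter spans
# [0, 1]; the occluder blocks rays leaving positions in OCCLUDER.

OCCLUDER = (0.4, 0.6)  # span of emitter positions the occluder blocks

def visible(light_x):
    """Is a ray from emitter position light_x to the shaded point clear?"""
    return not (OCCLUDER[0] <= light_x <= OCCLUDER[1])

def hard_shadow():
    # point-light approximation: one sample at the emitter's center
    return 1.0 if visible(0.5) else 0.0

def soft_shadow(samples=2000, seed=0):
    # ray-traced area light: Monte Carlo average over the emitter [0, 1]
    rng = random.Random(seed)
    return sum(visible(rng.uniform(0.0, 1.0)) for _ in range(samples)) / samples

print(hard_shadow())            # 0.0  -- fully black, no penumbra
print(round(soft_shadow(), 1))  # ~0.8 -- 80% of the emitter is visible
```

The hard-shadow path is what the "basic point lights" tricks above are fighting against; the soft-shadow estimate converges to the true fraction of the emitter that is unoccluded.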
I think raytracing will start to look good in games when developers begin to make them primarily _for_ raytracing instead of rasterization. The problem as I see it is that many gamedevs still think of raytracing as an element of visual flair or eye candy, instead of as a central part of the rendered image. And this makes sense, because no matter what any marketing department tells consumers, devs are going to give the most attention to settings that a majority of consumers can run, which means: rasterization.
In CG animated movies and TV shows, the technology created to control lighting, shadow, and visual contrast in a frame with raytracing is simply far more advanced than what we have in the games industry today, and the people who touch that sort of lighting in movie and TV projects are far more experienced with it than gamedevs are. I think that once art and lighting departments can actually start designing with raytracing in mind _first,_ we will begin to see scenes that look artistically nicer than their rasterized counterparts, and with even _less_ of a raytracing performance hit than we see now.
When ray tracing was released with the RTX 2000 series it was really bad.
I had the RTX 2080 and then the RTX 2080 Super, and games were lagging so much with ray tracing enabled while the visual difference was so small.
I have had the RTX 4090 since launch and it is really worth it now. The difference is very noticeable and games finally have a high framerate even when ray tracing is at the max/ultra preset.
I have tried Cyberpunk 2077 with path tracing too and it ran fine but the difference between psycho ray tracing and path tracing is not nearly as noticeable as going from rasterized to ray traced.
I like the more contrasty shadows in the non-RTX versions. The RTX version looks too washed out to me.
That's the reason I like the RTX version more. Looks more natural and realistic
So my question is: how well does RT work on consoles? That was one of the main selling points of the PS5 and the new Xbox. If PCs can't handle RT, what can console users expect?
It's not that powerful on console, although we will see more games with better RT optimization for consoles.