Honestly, I prefer the look of non-raytraced visuals most of the time. Sometimes raytracing just kills the look of it. When raytracing is done right, I love it, but it usually feels like it's tacked on as a little extra setting for those who want it.
Cyberpunk was not designed with ray tracing in mind.
It wasn't designed with framerate in mind either.
@@sebastienmorin9020 It is, you just need some real hardware to run it at high settings and high resolutions. It's an insanely large world where you can see far, and a lot of high-def pixels need to be rendered.
Cyberpunk was damn near an RTX tech demo?
It was very much made with ray tracing in mind. Heck, it was one of the first games to get path tracing. The first game??
Certainly the first popular one
It was designed with regular hybrid RT in mind; Path Tracing was added later, but Psycho RT (lights, global illumination (Psycho only), shadows, reflections) was in the game at release. Without RT you get screen-space reflections, which work "okay" in Cyberpunk, but SSR has a lot of drawbacks, like disocclusion, and Cyberpunk has tons of reflective surfaces, which is where SSR's biggest problem shows up: reflections disappear when you move the camera, because the object that should be reflected falls outside "screen space", so there's no data, like the name says.
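For anyone curious what that "out of screen space, so no data" limitation looks like in practice, here is a minimal, heavily simplified sketch of a screen-space reflection trace (not any specific engine's code, just the general idea): the reflection ray is marched against the depth buffer in screen space, so as soon as it leaves the screen there is nothing left to sample and the reflection drops out.

```python
# Minimal sketch of why SSR loses reflections: it can only sample what is
# already on screen. Illustrative pseudocode, not engine code.
import numpy as np

def trace_ssr(start_uv, ray_dir_ss, depth_buffer, color_buffer,
              steps=64, step_size=0.01):
    """March a reflection ray in screen space; return a reflected color or None."""
    pos = np.array(start_uv, dtype=float)  # (u, v, depth) in screen space
    step = np.array(ray_dir_ss, dtype=float) * step_size
    for _ in range(steps):
        pos += step
        u, v, ray_depth = pos
        # Failure case: the ray left the screen, so there is no color/depth
        # data to reflect -> the reflection simply disappears (fade/fallback).
        if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
            return None
        px = int(u * (depth_buffer.shape[1] - 1))
        py = int(v * (depth_buffer.shape[0] - 1))
        # Hit: the ray passed behind the surface recorded in the depth buffer.
        if ray_depth >= depth_buffer[py, px]:
            return color_buffer[py, px]
    return None  # no hit found; a real engine falls back to cubemaps or fades out
```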
2:22 He almost said Nuh-Vidia. Whew. Good thing he said it the right way. I was about to get out my pitchfork and torch.
👀
585th sub
I bought an RTX 4080 back in March of 2023. I play at 1440p, and I tried Cyberpunk 2077 with Path Tracing, and I would never go back even to regular RT in that title. Path Tracing literally mimics the way light should behave in real life. If I use DLSS Quality with Frame Generation and Ray Reconstruction, I am always around 100+ fps, and without Frame Generation I can cap the frame rate to 60 fps in RivaTuner Statistics Server and get a locked 60 fps experience.
Path Tracing aside, the most transformative RT features are global illumination and ambient occlusion IMO, because they make the game feel more grounded and lifelike. Anyway, at the moment I can turn on RT in every game with the help of DLSS, and it's worth it.
AMD GPUs will age really badly now. More and more games will have at least a basic level of ray tracing that you can't turn off, and AMD is not good at RT.
Could see it… Lumen works well on AMD, so it could be OK in the long run
I'm exceedingly doubtful of that. Ray tracing is a new technology; if it ever becomes the absolute standard, that's decades down the line. Anti-aliasing has existed for longer than you can remember and is still toggleable. Most consumers do not and cannot run ray tracing. 90% of gamers have never played with it at all. Current AMD cards will absolutely complete their life cycle with no issues.
Ray tracing is cool, and looks great, but it's a bit early unfortunately.
@@AnonsTreasures Well, right now there are already three games I can think of where you can't turn off RT completely: Star Wars Outlaws, Avatar and Black Myth Wukong. There will only be more in the future.
The first Nvidia card with specialized hardware for ray tracing is already 6 years old.
@nossy232323 In Wukong it can be turned off; Outlaws I'm not sure; Avatar is interesting. You cannot turn ray tracing off fully, but however it's implemented, it runs pretty well on the 7900XTX. 4K ultra native gives you 35 to 40 fps, and FSR Quality at 4K ultra gives you a solid 60 minimum. For contrast, the 4090 gives you 65~70 fps at 4K ultra native.
Regardless, the average gamer doesn't have amazing hardware. Check the Steam hardware survey. RTX pushes the 4090 to its limits; even if RTX becomes mandatory, the current suite of cards still wouldn't keep up in due time. Though, again, I highly doubt that's the direction things will go any time soon.
No, you cannot turn off ray tracing in Wukong. If you do that in the options, it falls back to UE5 Lumen, which is a less accurate form of ray tracing.
It's true that the Steam stats show hardware that is, on average, pretty low end. But let's not forget a lot of those entries are from really casual players who play e.g. a card game on a laptop, or little kids who got an old computer from their parents, etc.
Most play older, easy-to-run games.
It does not show what enthusiasts buy on average. I'm pretty surprised myself at how hard some of the newer UE5 games are to run, especially since there are no real new-gen consoles yet. And I have asked myself: how many people CAN run these in a decent way? It's pretty strange.
I think ray tracing comes down to how it’s implemented. Some games I can’t even tell the difference but a handful actually look impressive
It’s all up to the devs and their artistic vision. The art style and atmosphere in the game can really tie ray tracing in or make it a eye sore
Ray tracing won't be worth it till mid-tier cards can perform like a 4090, and that won't happen for another 2 generations, but who knows.
edit: honestly, till a 60/70-class card can do exactly what a 90-class can for way cheaper, it's not gonna be worth it... and devs need to keep that in mind, since the average gamer is running a 60 or 70 card. They can have all the ray tracing they want, but if the cards can't do it, then forcing people to have it on like Avatar does is gonna be rough. So far, ray tracing is for the 1%.
I bet we will see a 70-class card beat the 4090 in RT by the 60 series. Not on RTX 5000 tho..
I may be wrong, but in some games (like The Witcher 3) the ray-traced reflections seem like ordinary planar reflections (Half-Life 2 water), but with reduced LOD and distance. Ray-traced shadows are awesome though; the shimmering always annoyed me
I am on a 4090; with a bit of optimisation you can pull off better numbers than on the bench, especially in a watercooled rig. Cyberpunk with realism mods is just insane, especially in VR.
People don't buy AMD for ray tracing
I would go as far as to say don’t buy them if you plan to do anything other than rasterized gaming
@@SiliconSteak Great value GPUs for rasterized gaming compared to Nvidia
True, and higher vram comparatively
Who will still play games at 1080p in 2024?
Once you've played in 4K, you'll never want to go back to 2K, right? ;)
@@Curianer Nah thanks, I'm good without 4K. Got my PC connected to my 4K TV also but rarely use it, only for some games that I play on controller when I feel like chilling out on my couch. 1080p and 1440p are more than good enough for me. Would be a waste of money to get myself a 4K monitor. That's just me though.
I think people buying a 7900XTX will at least check those kinds of benchmarks before buying such a card. Pretty much only enthusiasts buy $1000+ GPUs, don't they?
7900xtx is $900
07:36 DLSS would never be better than native; it's better than base TAA, which is shitty.
The point here is that they treat TAA as "native", which is false and misleading.
You created a problem with TAA and have been trying to fix it ever since, and game companies now depend on these upscalers to avoid optimizing 😓😓
In certain games, to my eyes, DLSS Quality gave better image quality than 4K native. Don't know why; I was surprised too (maybe I was comparing it to TAA, idk..)
Cyberpunk looks better in DLSS quality than native, u cant tell me otherwise
@@jaytay420 Cyberpunk looks better native than with DLSS. There, I told you otherwise. And it's the truth. hehehhe
Have you actually compared DLSS to an aliased, native picture? Higher settings remove aliasing and look better than native.
>Game companies now depend on these upscalers to avoid optimizing
Optimizing? You mean making the graphics engine produce visually comparable images with less processing? Like what DLSS does?
"These upscalers" are only a "different" form of optimization because they are easy to implement. Really not the fault of tech that devs don't optimize otherwise.
@@szynkers I also think the time frame leading up to release is a big factor in having to treat upscaling as a substitute for "optimization." The devs are artists and want to spend as much of that time as possible making the game that's in their head look the way they imagine it. Since there are so many factors that make development stressful, they don't really have time to think about optimization during the process and end up leaning on DLSS as the optimizer. There are too many factors to excuse this dilemma, but I believe the developers would rather ship the game they intended and then optimize through patch updates. I wish everyone in the gaming industry could be more transparent about the development process without repercussions or risk of losing their jobs. Devs get a lot of heat because people don't really understand the whole process and why things are the way they are.
Also, introducing ray tracing more frequently in released titles pushes other companies to start using these features and technologies, which in turn starts the competition to make more affordable GPUs capable of handling ray tracing and other future tech as its popularity spreads. Everything just takes time, and it will all become more affordable. DLSS has also pushed other, independent developers to build upscaling software that can perform better than the two top companies' offerings. There is always something positive inside something negative; you just have to take the time to think critically about where this can lead and what needs to be fixed.
This isn't directed at anybody; I'm just trying to get people to think more critically about the circumstances at hand. Just my opinion.
16:42 - I disagree pretty heavily, and I'll tell you why. The 4090, really, is only worth that 60% upcharge if you are an advanced 3D artist, AI developer, or specialized in some research field. Even then, barely on a consumer level. RTX is a great technology, and if it really captures you and you have the money to blow, I think that's fair. But does that make it a reasonable cost-to-value proposition for the average consumer? No, I would say not. More games will come out that support RTX for sure, but right now they're few and far between, and it's a heavy hit to performance. It can make games look amazing, for a heavy trade-off, in a few games. A handful, really. That's like buying one of those 3D TVs a while back because more 3D movies would come, when you only had 15 or so to watch. If you like that gimmick, that's great, but I wouldn't tell consumers "stay away from the non-3D TVs, they can't even do 3D!"
The 4090 is good for professional applications, better than the 7900XTX absolutely. However, you really have to be in a pretty specific and narrow niche to get your money's worth out of the 4090 over the 7900XTX. In some professional applications the 7900XTX beats out the 4080 Super. The VRAM is a pretty major consideration here.
17:00 - This I feel is all sorts of wrong. First of all, only esports gamers for the 7900XTX? No streaming? The 7900XTX can play essentially any major release at over 100 fps at max settings in 4K. Streaming is extremely light work for it; comparisons show the encoding for both recording and streaming is essentially the same. So what does this even mean?
The problem with this debate, really, is the money. If the 4090 were priced more reasonably, it would be an absolutely easy sweep. $1,200 and you've got me sold. $1,800? Absolutely not. So then we are left with the 4080 Super vs the 7900XTX. Directly comparable performance; sometimes one beats out the other. So which do you go with? If you care about RTX you go Nvidia; if you want the most bang for your buck, future-proofing-wise, you go 7900XTX. That's really it. DLSS and FSR are neck and neck and subject to change on any given update. DLSS is ahead by some margin, yet I wouldn't purchase a card based on software given how quickly we've seen that blow in either party's favor.
For me, 24 GB of VRAM plus FSR being good enough really explains why most people tend to recommend the 7900XTX over the 4080. If you need Nvidia, you'll know before buying that you need Nvidia. You are either in a specialized field, wanting the best of the best (for the 4090), working with AI, or you love RTX. Outside of those reasons, most people should stick with the 7900XTX. Now, in lower price brackets I could absolutely see the 4070 being a really attractive option, but that's a whole other discussion really.
got an extra 4080 super founders edition sitting in my closet. only opened it to look at it lol. i got a 4090 FE also. i use that every day lol. i somehow got lucky to get both off of nvidia's storefront. tried to cancel the 4080 super but i didn't in time. started the return process and just said "screw it, i'll just keep it." i didn't open it for 2 1/2 months and caved in just to see that beautiful 4080 super in black, hahaha. i will probably never use it lol
@Donahue250 I will give one point to Nvidia, their FE cards are real slick. Love the design.
Just look at the benchmarks. The XTX is getting destroyed; also, the dual media engines on the 4070 Ti and up make Nvidia the clear choice for content creation and streaming. The 4080 is a better card than the 7900XTX despite its more modest specs.
The 4080 wins more consistently across all workloads. If you want a GPU for literally just rasterized gaming, the 7900XTX is a slightly better choice that consumes 30% more power to do so
Well, some of your points are fair, but FSR vs DLSS is an easy contest.
DLSS basically always looks better, especially as you start using performance modes.
That's true in pretty much every game I can think of.
My problem with the RT naysayers has always been the fact that they act like even if the performance was perfect it still wouldn't be worth turning on. I call bullshit on 90% of people who say that.
It's literally just people that can't afford a computer to run it.
In my opinion it makes a big difference to the actual contrast of the picture; things look more 3D and lifelike with it turned on. The Witcher 3 is a good example. The outside scenes are much more visually striking with it on.
We have hit diminishing returns when it comes to textures and polygons.
Performance DLSS that outputs 1080p is rendering the game at 540p and upscaling to 1080p. 540p is 960 × 540 pixels. That's one quarter of native 1080p, which is 1920 × 1080.
So you're buying a $2K GPU to push 100 fps in a game with PS4-tier textures (because it's Nvidia-sponsored and they really wanted to stay under that 8 GB VRAM mark), rendering at 540p upscaled to 1080p on a 1080p monitor, just so you can use real-time ray tracing.
That's crazy.
When I was talking about my perf, it was at 4K, so it upscales from 1080p to 4K.
But yea, the reliance on upscaling is real…
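For reference, here is a small back-of-the-envelope sketch of the render-resolution math being discussed above. The per-mode scale factors are the commonly cited DLSS defaults and are an assumption here, not something stated in the video:

```python
# Back-of-the-envelope DLSS internal-resolution math.
# Scale factors below are the commonly cited per-mode defaults (assumption).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str):
    """Return (render_w, render_h, fraction of output pixels actually rendered)."""
    scale = DLSS_SCALE[mode]
    render_w, render_h = round(out_w * scale), round(out_h * scale)
    return render_w, render_h, (render_w * render_h) / (out_w * out_h)

for out_w, out_h in [(1920, 1080), (3840, 2160)]:
    for mode in ("Quality", "Performance"):
        w, h, frac = internal_resolution(out_w, out_h, mode)
        print(f"{out_w}x{out_h} {mode}: renders at {w}x{h} ({frac:.0%} of output pixels)")
# 1080p Performance -> 960x540 (25% of the pixels); 4K Performance -> 1920x1080.
```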
@@SiliconSteak Real-time ray tracing is entirely a graphical thing. There are 0 games where it is used in gameplay.
But to use it to improve graphics, you have to degrade graphics in other ways.
So maybe when the 6090 Ti is released it will do it at 4K native and 60 fps in a game that's not a tech demo or pozzed slop...
Games can, and some do, use ray tracing as a part of gameplay
@@SiliconSteak Post 1 (one) such game. I double dare you.
3:34 Exactly, that's why you should be running Linux, it's completely free.
I use Arch btw.
The biggest problem is the performance. It's still really bad unless you have a 4090 and you play on a 1440p monitor. Maybe in 5 years it will be important
4080 S is also great at 1440p with upscaling.
Well, I have no problem playing at 4K@144 Hz with my RTX 4090 PC on an LG G4 TV.
DLSS and GSYNC are pretty amazing as they work together to maintain a smooth frame rate with no significant loss in image quality.
Unless ray tracing is a game changer for SPECIFIC games people like to play, it's going to be a waste of money IMO.
While I can afford a 4090, I'm a long-term owner of things, usually. The price of the 4090 is so fucking dumb I just couldn't swallow it.
Not even for the boost in DX12 ray tracing.
And for the viewers, I'll reiterate:
For me, I'm finding I'm playing way more Unreal 5 games with Lumen ray tracing, so DX12 RT is sidelined for me.
VRAM.
Look for VRAM for the money, because that's where age will really show.
We will see. VRAM is 100% important. My point with this video is that some of the most recent blockbuster games have really good implementations of ray tracing (for visuals). So if people want to play those games, maybe they should consider turning ray tracing on, which they basically need Nvidia for in these games
Been using it on my 2080 Ti at 1440p since I got the GPU in 2019
So 4K with ray tracing helps a lot with image quality.
I believe DX12 RT is very nice to have, but not until a $200 card can do it at 4090 level will we see a big uprising in demand for it.
But it's getting there.
Actually, one of my Xbox friends was mentioning RT on the Xbox and how cool it was
@@sasquatchcrew Yea it definitely needs to be more available to the masses. Obviously a console player will think it looks great compared to what they are used to (which is typically not the best visuals)
@@SiliconSteak I feel like a few more years and people will catch on
It seems like games are switching to Nanite/Lumen right now with Unreal Engine 5.
I feel better about my 7900XTX purchase, but part of me still wants Nvidia.
I can't say buying high end isn't worth it, for higher resolutions or FPS.
But I feel a 6600 XT/6700 XT or 3060-3070 should last most people a decent amount of time.
4K is still a monster. I used the LG CX 48 as my daily for over 3 months straight and it actually fried my eyeballs even on the lowest light settings. Don't cheap out and use TVs as monitors.
Yes, the image quality is decent and the colors are decent, but knowing what I know now, until OLED is 300 bucks or less and does 200+ fps, it's not worth it.
AFMF 2.0 is a game changer, as is Lossless Scaling, so it really will increase GPU longevity.
The more tools the better, so we don't have to spend more on hardware.
The issue is, the game companies will eventually figure it out and just make their games worse performance-wise and use this crap to make up for it, I feel.
Yea, while I think some games are more “optimized” than ever, it seems that hardware advancements just make it more acceptable to have lower base performance in games
@@SiliconSteak It's a good thing console players are starting to notice and be more interested, because that means a new battle is about to begin: affordable GPUs capable of high performance with RT and upscaling. Sadly, the console market dictates what happens, gaming-wise, on PC nowadays.
Ray tracing takes up a lot of memory bandwidth and VRAM, and neither AMD nor NVIDIA has given us acceptable levels of those two things in the $300-and-under segment for a while now.
I am optimistic about the mid-range $500-$600 GPUs, but not so much about the sub-$300 ones for ray tracing
that first picture had me dying
I love my top tier Nvidia cards and RTX but unfortunately the cost is getting out of control :*(
Guess it's worth it if you don't mind playing at 1080p on a $1600 GPU, or upscaling from 540p to 1080p on a midrange GPU.
I do go straight to Microsoft through one of their top accountants, my family member. I get unlimited keys for anything Microsoft for free.
A feature that was never requested by customers and mainly benefits the games industry? How are we going to sell them on that?
We just make it expensive for the customer so they think it's exclusive. Pew pew pew lemons.
Dayum, that thumbnail fixed GPU sag. They just gotta make radiators big enough to support themselves at the bottom of the case
Ray Trace made my wife’s boyfriend’s PC blow up! I think I’m in big trouble! Thank goodness I can use your link to get a new windows key for his new PC I need to build soon!
Oh, your wife supports polygamy, way to go man 😂😂😂😂 You can play with her at night, no need for a PC
It's pretty amazing how every real tech guy is sponsored by some shill Chinese ad, huh? It's either potentially pirated Windows keys to some extent or nootropic brain enhancers
Ray tracing is meh. It's alright for some stuff, but considering that it has disagreements with lots of shader types and is generally very expensive, you won't see ray-traced social VR games for a long time. And since I don't give two s***s about competitive or solo gaming... kek
Funny video to make with next-gen GPUs about to drop from both AMD and NVIDIA
Then I can make a pt.2 :)
I am happy without it
In light of the latest news that AMD is likely giving up on the high end, and not just with RDNA4, I will unfortunately probably buy a 5090 to build a rig when the time comes. I don't want to support Nvidia with my wallet, but ray/path tracing is starting to look very appealing. All we need to go along with it is for the devs to learn how to optimize games. I will play Black Myth: Wukong on a 5090, and I believe it won't even hit 100 fps in 4K for me...
@@Обовсём-э9х If you won't use the 5090 for work but purely for gaming, you will be wasting a LOT of money. For gaming, get yourself a 5080 once it's out. Unless you really don't mind wasting money (it's yours to spend/waste) or use it for actual work, my advice would be: don't. And path/ray tracing will only be a thing (MAYBE, if at all) in around 5 to 10 years.
No, my GTX is still the GOAT. F greed, F RTX
WTF, I don't know what date your benchmarks are from, but Avatar in 4K on my Arc 770 works at 26 fps.
But maybe it's because of one of the latest Intel Arc patches that it works better now
Maybe. That bench is when it first came out
Yes, path tracing is worth it. Every single Ada GPU can run it easily; the improvements were massive this generation
corny😭
@@Rinnetix AMD users 🤡🤷♂️
Sure, it's resource-expensive.
And definitely not always worth it.
But you can't say it doesn't look good; at that point you are arguing against objective truth.
Even back when RT was introduced on the 20 series, you had to be an idiot not to understand this would be massively adopted in the future.
Some ppl still think RT will never amount to anything surprisingly