The Cyberpunk 2077 benchmark settings should be labelled as having 'AUTO DLSS/FSR', not 'QUALITY'. Apologies for the error.
I think both are great options at this resolution for RT. FSR 3.0 might change a few things in AMD's favour; we'll see how much NVIDIA can squeeze out of the 4070 Ti in future updates. It's only a matter of which one can be found cheaper.
Seems pretty even, or close. Wish these cards were more like $500-$600.
That would be... a renaissance of PC gaming.
Very few videos give such a detailed RT test. Thanks very much. RT is a game changer, especially in city-based games.
Not really, the only game it looks good in is Cyberpunk, and that's because the game has lots of puddles and neon lights.
Is it just me, or does FSR not look quite as good as DLSS?
Calm down everyone, he's just showing the RT difference. I don't play with RT, but the comments are a bit harsh.
True, I wish everyone would do both.
@@Jason_Bover9000 you can find non ray tracing comparisons everywhere. Why do you need another one here?
RT ON / DLSS ON = 4070ti ≥ 7900XT
RT OFF / DLSS OFF = 4070ti ≤ 7900XT
Is that roughly right?
Great comparison. The 7900XT did NOT get the curb stomping in heavy RT settings I thought it would from the 4070 Ti. A lot closer than I expected with these things enabled. I sprinkle in as much RT as I can in my games as long as I have high framerates. The Witcher for example... I just play Ultra+ 1440p without RT. The hit is just too much for too little visual improvement. Frame Generation can suck it. I am tired of half measures (upscaling and frame generation) to try and get higher fps. Both companies need to improve rendering power. Period.
I completely agree: I love the idea of RT, and also enable it when I can, but RT in its current implementation is very much a con, even now on the third-gen NVidia cards!
(copy edited from my own separate comment)
I accept that in some games that actually use RT for GI, AO and shadows it can be transformative, but at that level of RT implementation you need an absolute powerhouse of a graphics card to run it, so that really ONLY the 4090 is getting close to achieving acceptable framerates; the premium pinnacle of third-gen cards, with a price tag to match, and even then it still needs to cheat considerably with DLSS 3 to truly achieve all-out, full RT.
As an aside, last year or the year before, didn't NVidia buy the company that pioneered the new "light based processor" (as opposed to electrically based), which even in its first iteration is around 300x faster than electronic processors? I predict that they are developing RT and DLSS now in electrical form that enables a large degree of software iteration (for example DLSS -> DLSS 2 -> 2.1 -> DLSS 3 -> the new iteration of DLSS 3 soon to be released that improves frame generation artefacting), but once they're happy with it, they will transpose that to a light processor on 50-series cards (potentially, but probably more like 60 or 70 series)... A light-based processor can only be built to do one thing (it cannot be changed by software), but it is 300x faster, which means that the lag currently induced by RT and DLSS would be eliminated completely.
At that point it will actually be truly game-changing, where RT can be enabled as a hardware solution (as opposed to a hardware AND software solution) that costs next to nothing in frame rate.
RT is pretty much only properly developed for 3 games, the rest are meh, so people getting all bent over *nViDiA iS mOrE bEtTeRs RaY tRaCiNg* can just fuk off lolz, it's an invalid argument.
That's fake rt
@@lexavlogs7149 what is lol
Performance seems pretty close, and while the 7900XT generally does a bit worse (like -10 fps or so) it still seems above 60 fps most of the time (which is my arbitrary cut-off for smoothness). I'm considering both cards and have no allegiance to AMD or NVidia. What I'm more interested in is the VRAM and bus width. The 7900 XT has the 4070 Ti beat, hands down. However, does that matter now and into the future if I'm only gaming at 1440p? I don't do any productivity stuff with my GPU at all - do I need to care much about 12 vs 20 GB? Bandwidth? What about gaming in the future, especially if I'm not transitioning to 4K? If anyone has any answers that'd be great, thanks!
12GB is plenty of VRAM for now, and I suspect for the near future (2-3 years). I think in 4 years' time it is going to be the GPU itself that is the weak link, rather than the lack of VRAM, though I do suspect a lot more games are going to use 12+GB in 4+ years' time.
The 7900XT is actually better in rasterization performance, I think something like 8% faster, while being 12% more expensive on paper. Though realistically, since NO card sells at $800 and ALL current cards start at a lowest price of $850, the real price difference is like 5% or less, while the 4070 Ti is some 12% faster in RT while being around 5% cheaper.
I think if you are going to spend $900, you might as well save up another $100 and get the 7900XTX, much better value: I think it's 16% more performance over the 7900XT while being only 10% more expensive!
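For anyone who wants to sanity-check that value math, here's a quick back-of-envelope script using the street prices and performance deltas claimed in this thread (these are the commenter's figures, not measured data):

```python
# Perf-per-dollar check using the numbers quoted above (thread claims, not benchmarks):
# 7900 XT ~8% faster raster than the 4070 Ti, 4070 Ti ~12% faster in RT,
# 7900 XTX ~16% faster than the XT; street prices of $850 / $900 / $1000 assumed.
cards = {
    "4070 Ti":  {"price": 850,  "raster": 100, "rt": 112},
    "7900 XT":  {"price": 900,  "raster": 108, "rt": 100},
    "7900 XTX": {"price": 1000, "raster": 125, "rt": 118},  # ~5% over the Ti in RT, per the claim above
}

for name, c in cards.items():
    # Normalise to performance points per $100 so the premiums are comparable.
    print(f"{name}: {c['raster'] / c['price'] * 100:.1f} raster pts/$100, "
          f"{c['rt'] / c['price'] * 100:.1f} RT pts/$100")
```

By those numbers the XTX edges both on raster per dollar while the Ti keeps the RT-per-dollar lead, which is the trade-off being described.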
@@SlickR12345 Thanks for the reply and insights. I was considering the XTX as well except that it's much harder to find (although availability may increase due to the temperature problems) and it may use more power than my 750W PSU can handle. However, since I won't really be pushing the card to its limits, perhaps the power won't be too big an issue. Also, I worry that it's going to be overkill since I don't game in 4K. But I may consider it, especially if prices come down at all and availability improves.
@@mattwong7191 Considering it's only 10% more expensive than the 7900XT and is 16% faster on average, with some reviewers even having it at 20% faster in certain combinations of games, I'd say it's extremely worth it.
The AIB partner cards are generally a bit more expensive, depending on which model you go for, but I've seen custom models selling for $1000. Again, only some MBA cards have the overheating issue.
You are also going to beat the 4070 Ti in RT with the 7900XTX; in most titles it's actually some 5% faster in RT. I believe it only loses to the 4070 Ti in Cyberpunk 2077 at ultra high RT, though that RT implementation is ALL Nvidia, it was designed to perform best on Nvidia's hardware.
I think in your situation the 7900XT is better. The 20GB and bus bandwidth will pay off in the future, meaning it will last longer than the Nvidia card, at least in terms of memory size.
@SlickR I want to buy a card for 1440p for 6-8 years, and I won't buy NV because of the VRAM. I'm waiting for the 7800XT; 16GB will be enough for future games and the card will probably be cheaper than the 4070 Ti, with slightly worse raster I think.
If I played my games with RT then NVIDIA all the way. But honestly I find little use for ray tracing. Definitely depends on the games you play to determine which card to get.
Personally agree that it usually isn't the big deal some make it out to be. Still, the 7900 XT is pretty capable of it in most of the games shown.
For the games I play, the only one where I would use it is Metro; in the rest I don't really mind having no RT. I could see myself using it in Cyberpunk 2077 too, but I'm not hyped for that game.
Bro, Witcher RT is insane. Dying Light is also good. Bright Memory, Ghostwire Tokyo, etc. There are many games, bro. RT is the future. DLSS 3 helps out the 4070 Ti massively as well. AMD probably has to cancel the 7900XT or make it a lot cheaper or no one would buy that card.
The Nvidia 4070 Ti has a lower power draw as well.
And much, much less VRAM and memory bandwidth
Thank you for making this video. I appreciate Ray tracing doesn’t appeal to everyone but I personally use it where I can and really wanted to see the difference here
How will it appeal to everyone? How many ppl even own high-end GPUs? Ray tracing is the future.
@@anuzahyder7185 exactly, rt is gonna replace rasterization soon. People should just stop defending amd everywhere
Not defending AMD. Never owned an AMD GPU. I don't think the performance loss is worth the differences it displays. Fair enough, a 4090 can probably handle it, but my bank account cannot handle/justify a 4090…
I made a short video in Cyberpunk highlighting the differences between no ray tracing and Ultra ray tracing on a 3080 Ti, which cut the frame rate nearly in half…
RT is going to replace rasterization?? Do you even know what you are talking about?
Great video! Could you do another one just like this at 1440p, but without ray tracing?
Yeah I'm working on one that's mostly rasterised @ 1440p and 4k. It'll probably be a couple of days
4070ti is a clear choice if you plan on playing 1440p. But the 7900xt is objectively better for 4k.
How does that make sense? Whichever card performs better in 4K is clearly going to perform better at 1440P. 4K is entirely GPU bound.
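For context on the "GPU bound" point, here's the raw pixel math. It's a simplification that ignores CPU-side work, which doesn't scale with resolution - which is also why rankings can occasionally shift between resolutions:

```python
# Pixels drawn per frame at each resolution; shading cost scales roughly with this.
res_1440p = 2560 * 1440   # 3,686,400 pixels
res_4k    = 3840 * 2160   # 8,294,400 pixels
print(f"4K draws {res_4k / res_1440p:.2f}x the pixels of 1440p")  # 2.25x
```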
So at 1440p with RT on, more than 8GB of VRAM is being consumed, and they still want to put 8GB in the laptop 4070... Wow, my 6-year-old laptop with a 1070 has 8GB as well. I do understand it's a slower type of memory, but just the fact that it's the same number after so many years in the same class of laptop doesn't really look good...
I love to see red fanboys bitching about this and that everywhere while red is getting roasted to noob level by der8auer.
will the 12GB of the TI be a problem in the near future? (2-3 years)
not at 1440
@@ntme9 What about UWQHD? It's 1440p but 21:9, so a bit lower than 4K.
@@blinzi69 should be fine
Maybe 12GB seems to be the minimum for ray tracing at 1080p / 1440p. Check out Hardware Unboxed's video on the Hogwarts Legacy 50+ GPU benchmark: watch?v=qxpqJIO_9gQ
Great comparison. Just what I was looking for.
The 7900XT suddenly went from a card no one wanted to "hey, this card is pretty good" thanks to the suckiness that is the 4070 Ti.
U wish. The 4070 Ti is miles ahead of the XT.
I appreciate this comparison video. Thanks for making it!
The people who say, "no one uses RT" or "there are only a few RT games," aren't thinking about the future. If I purchase one of these cards, I'm expecting to use it for 5 years. Sure, there aren't many RT games _now,_ that is true, but I expect RT will become much more prevalent in coming years. Especially considering there are now RT capable consoles out there, we can expect there will be many more titles that implement RT moving forward. It's been a bit over two years since the new consoles and 30-series cards have been out, so developers who started developing games around that time should be releasing their games in the next 1-3 years. I bet many of them will have at least some RT implementation. You can be sure all of the *big* releases will have RT.
Yes and no. Future-proofing is good, but this technology is still not viable on anything but high-end cards. With a light implementation it is OK, and in this video we saw almost no difference between the two cards. But with heavy implementations even the RTX 4090 isn't powerful enough. I think the next generation, or even the one after that, will be powerful enough to not lose too much performance with RT enabled. In my opinion, only when we lose no more than 35% of performance on high/ultra RT settings (e.g. a game running 120 fps raster would need to hold at least 78 fps with RT on) can we think of RT as viable. I know someone can say that we have upscaling (DLSS, FSR, XeSS), but upscaling at settings lower than Quality is stupid, because you want better graphics with RT, yet you're introducing upscaling artifacts and thus lowering picture quality.
I want to say that I understand your point of view, and I can agree that newer titles will be released with some RT options and older titles will probably get RT updates. Some recent ones look amazing and there are visible differences, but a few are still barely different from ultra without RT.
To conclude, I think current and previous generations are good enough for 1080p RT for the most part. The newest high-end cards are good for 1440p (some maybe for ultrawide). The next generation should be able to fully handle 1440p, and probably the high end for some 4K. The one after that should run 4K RT. Those are my predictions based on the RT performance trends generation by generation from both the red and green teams. With the new blue team, which is promising on RT performance (one of the lowest FPS losses), we could see nice results. By 2024 I think we could see something like:
- if you want a professional workstation, go with NVidia (for video, picture editing, AI)
- if you want RT gaming on a midrange setup, go with Intel
- if you want pure raster or light RT gaming on a low-to-midrange setup, go with AMD
I don't know if I'm right, but this is what I concluded from watching multiple benchmark videos from multiple channels. Intel and AMD need driver improvements, NVidia needs to lose market share (at least in low-to-midrange setups), and all of them need a reality check on their pricing.
I bought a 2080 when they came out thinking it would be awesome. What a letdown. Only the most popular/newer games even support RT, so when I was hoping to get some RT action in the games I like, I got a rude awakening when NONE of the games I was playing ever got RT support. RT adds nothing meaningful and I couldn't care less about it. If you like shiny things, then turn RT on; otherwise don't give it another thought. It sucks.
Every new game I buy supports RTX and it looks incredible. Those people here probably only play CSGO or ARK or something.
The thing is, RT is only going to get more demanding. Heck, the upgraded Witcher 3 with RT destroys pretty much every single new GPU except the 4090 at 4K with RT enabled. The 4080 and 7900XTX both run below 60fps, and Cyberpunk 2077 at 4K with RT enabled gets the 7900XTX to like 27fps and the 4080 to like 45fps. At 1440p it's playable with the 4080 at around 60fps, though it does drop down to 35fps. So barely a playable experience for a card that costs $1300 or more!
So to me any thought of "future" RT titles doesn't matter, as you can only just barely play current titles with RT on; most future titles are going to be even more demanding, so you are not going to be getting good fps in future RT titles! You are buying current-gen cards to play current-gen games!
I think we are a ways off from where RT makes sense and where gamers would want it enabled all the time. Right now it's just way too demanding, and it's going to get even more demanding. I think maybe 5 years in the future, with GPUs two generations ahead, we might start catching up to RT demands.
Right now, there is no way a midrange GPU can run any high-quality RT game; it's reserved pretty much for the top tier of cards!
@@SlickR12345 I do agree with you mostly. I think that RT is mostly for high-end GPUs right now at 4K. As for Witcher 3, I think it is not the best implementation of RT. In addition, most reviewers test it with Ultra+ settings, which tax even more resources. The game looks pretty with those settings, but isn't playable at 30-45 fps with occasional stutters.
@Merlin I don't play games with RT right now, but I do see what those titles look like. I can agree with you that graphically they are stunning, but the reduction in performance is too high for the most part, and midrange cards choke in those titles (not every title, but the most popular ones). Not everyone can afford, or is willing to pay, premium money for a GPU which isn't guaranteed to run everything smoothly. Most people are willing to spend 1-2k for a whole setup, and these cards alone are getting to those prices. I'm one of the people who can afford them, but I won't pay too much for products with this performance. Before 2019 I could buy a whole station for one month's worth of my salary; now a GPU reaches or even exceeds that amount. I earn decent money and would use my setup for work too, but it is getting too hard to justify buying. On top of that I have to pay extra because of EU tax.
That is why RT isn't a selling point for me at this moment, and most of the people who talk it up sound like they either want to brag or are compensating. I don't mean to diminish anyone's happiness from using RT, but they are heavily brainwashed by NVidia to go around praising Jensen for his greed. "The more you buy, the more you save" - an actual quote from GTC 2018, which is funny and terrifying at the same time, because people are still buying their products. They should stop, just for his ego to shatter and things to go back to normality.
So when $1k+ GPUs are having problems with RT performance right now, I do tell people to snap back to reality. Did you see the requirements for Portal RTX? 1080p 60fps minimum: an RTX 3080 using DLSS 2. So you need upscaling from 720p or lower to play a game from 2007 with an RT update at normal fps. For 1440p and 4K you need the newest generation with DLSS 3 (so frame generation) for 60 fps. I completed this game without RT, and let me tell you, it wasn't any less enjoyable than the new one is marketed to be.
Great 1440p performance on both cards. But a 1440p card should not cost $800.
Yep. Reduce the price of these cards by $300 or more and they're great. Otherwise they kinda suck. It's a shame really.
They're not 1440p cards lol
@@gergelykiss1142 The 4070 Ti is marketed as a 1440p card lol.
@@macoi3008 Depends on which marketing we're talking about. The RTX 3090 used to be a 4K card; then they said the RTX 4070 Ti will outperform the 3090. Anyway, the price should be around £600, as £900 is way too much for it.
@@fpscomputers976 And wait for next year's recession.
Ty for the video.
Most vids on YouTube are in 4K.
But both the 4070 Ti and 7900XT are best played at 1440p in my opinion.
Thank you. Yeah I agree, particularly because of the 4070Ti's memory constraints and the 7900XT's weaker RT and upscaling.
You nailed it. I'm just here trying to convince myself xtx is going to be a killer 1440p card for the next few years.
Why does the AMD GPU use more VRAM on average?
It has more available. Games these days can use what you have, not just the minimum.
This is not true. I cross-referenced this with Digital Foundry. *How is the XT beating the Ti in Cyberpunk?* It doesn't make sense.
It is not? Watch it again.
9:36 - the floor looks blurry on AMD.
It's a great video, but I'd like to see a comparison without RT.
Can you show ray tracing vs no ray tracing?
The new 4070 really shouldn't be used at any resolution bigger than 1080p when RT is enabled. (I'm talking only about the Nvidia card.)
What is your problem?
Seems like the 7900 XT is better value currently.
This video just saved me a ton of money. I will be playing Cyberpunk 2077 at 1440p. Did I just witness an equivalent graphical image from the 4070 Ti?
Where is the GPU power usage?
Not available with MSI Afterburner 4.6.5 beta 2. The 7900XT easily pulls 300W, the 4070 Ti more like 250W.
@@fpscomputers976 Ok, there is already beta 4 :-)
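If anyone wants a stopgap until the overlay catches up, here's a minimal sketch for logging board power on the NVIDIA card via the nvidia-ml-py (pynvml) bindings. NVIDIA-only: NVML has no view of the AMD card, so the 7900XT would need a different tool:

```python
# Minimal board-power logger for an NVIDIA card via pynvml (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

for _ in range(10):
    mw = pynvml.nvmlDeviceGetPowerUsage(handle)  # NVML reports milliwatts
    print(f"board power: {mw / 1000:.1f} W")
    time.sleep(1.0)

pynvml.nvmlShutdown()
```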
Ty for testing with ray tracing at 1440p, that is how I would play with either of those cards.
Is it just me, or is the 7900XT SO inefficient with memory? It uses SO much more memory than the 4070 Ti at the same settings... like it actually needs those 20GB.
The 7900XT has 8GB more and only used about 1GB more on average, along with higher bandwidth. I think everyone will agree AMD wins there. Plus, Nvidia has always been slightly better at managing memory overall, although what you're seeing is really memory allocation: memory that's on standby. For actual used memory you need another program; MSI Afterburner doesn't show that. Here the AMD driver simply thinks it needs to allocate more VRAM for the game than Nvidia does, and it has all the VRAM in the world to do it. The less VRAM a card has, the less it will tend to allocate.
It's not inefficient. Take Windows for example: the more system RAM you have, the more Windows preloads into RAM for quicker access. Same thing here, the game is just loading more into VRAM because of the extra space available. It's a good thing; it should increase overall performance.
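To illustrate the allocation-vs-usage point on the NVIDIA side, here's a small sketch using pynvml. Note that NVML, like most overlays, reports memory the driver has handed out, not what the game actively touches:

```python
# Show total vs allocated VRAM, plus per-process allocations, on an NVIDIA card.
# These figures are allocations (standby memory included), not "actively used" memory.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total {mem.total / 2**30:.1f} GiB, allocated {mem.used / 2**30:.1f} GiB")

# Per-process view, roughly what overlays like Afterburner approximate.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    if p.usedGpuMemory is not None:  # can be unavailable without sufficient permissions
        print(f"pid {p.pid}: {p.usedGpuMemory / 2**30:.1f} GiB allocated")

pynvml.nvmlShutdown()
```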
Great video! The 7900XT surprised me and even beat the 4070 handily in Metro, one of my favorite titles.
The 4070 is a great card too. Try adding a few more games like Far Cry 6. Pretty great game, and ray tracing on that favors AMD.
FC6's RT implementation is garbage, so why even use RT?
Quake and Minecraft RT look great. In those games the 7900XT gets destroyed. A sign of bad things to come for AMD. Also, if you use DLSS 3 the 4070 Ti becomes twice as fast as the 7900XT for $100 cheaper. The XT is dead on arrival.
@Anuza Hyder okay. That's your opinion.
Including more ray tracing titles just makes sense. There are over 30 AAA games that include this feature, and about a hundred altogether, that I would like to see a video on as the decisive RT test video on YouTube!
What? The 7900XT beat the 4070 Ti? In Metro Exodus?? We are not seeing the same thing... in some places the 7900XT can hardly get to 60 fps.
Fake result, most of these channels don't have the GPUs, it's just non-OC vs OC, or a lower-tier card.
I got both to test.
Cyberpunk on the RX 7900XT at 4K gets more fps than the RTX 4070 Ti, confirmed.
This isn't 4k and it isn't rasterised. You might want to check your sources, because with ray tracing on the 7900XT doesn't beat the 4070Ti and anywhere that says it does is wrong.
Your 7900XT is running at a low clock, below normal boost.
According to what? AMD's product page says 2400 and this card is often hitting 2600
What I think people should focus on is Fortnite, since its engine is what most new games will use.
The majority of the public is arguing about RT, whether it's the future or not. The future is UE5, and it uses its own lighting technology. NV is pushing RT by adding the tech to old engines. Witcher 3 with RT on at 45 fps on a 950 EUR card, good luck with that.
How interesting that the RT of the 7900XT is surprisingly not that far behind the similarly priced NVidia card; AMD generally has a stigma that its RT is less powerful than NV's, but really this shows that the difference, like for like (ish), isn't really that big any more...
Also, when looking at RT in F1 2022 in your other video from a few days ago (the 7900XT with RT on/off in several games), when you compare RT to the rasterised "faked reflections", raster actually looks sooooo much better: smoother updates and better definition than either resolution with RT on, giving a greater sense of speed to the game... RT is very much a con, even now on the third-gen NVidia cards!
(I accept that in some games that actually use RT for GI, AO and shadows it can be transformative, but at that level of RT implementation you need an absolute powerhouse of a graphics card to run it, so that really ONLY the 4090 is getting close to achieving acceptable framerates; the premium pinnacle of third-gen cards, with a price tag to match, and even then it still needs to cheat a bit with DLSS 3 to truly achieve all-out, full RT.)
((As a total aside, last year or the year before, didn't NVidia buy the company that pioneered the new "light based processor" (as opposed to electrically based), which even in its first iteration is around 300x faster than electronic processors? I predict that they are developing RT and DLSS now in electrical form that enables a large degree of software iteration (for example DLSS -> DLSS 2 -> 2.1 -> DLSS 3 -> the new iteration of DLSS 3 soon to be released that improves frame generation artefacting), but once they're happy with it, they will transpose that to a light processor on 50-series cards (potentially, but probably more like 60 or 70 series)... A light-based processor can only be built to do one thing (it cannot be changed by software), but it is 300x faster, which means that the lag currently induced by RT and DLSS would be eliminated completely.))
But it's $100 more, so it should perform better or else it's a bad deal!!
@@khaled8671 AHH, but it's not, is it! You can't actually buy the 4070 Ti for its MSRP... It's more like 850 minimum, so the 7900XT is only 50 $/£ more... And although the 4070 Ti is slightly faster in ray tracing, isn't the XT significantly faster in raster?
@@aaronjones4529 Same with the RX 7900XT, you can't find it at MSRP prices.
@@khaled8671 AHH, really? Here in the UK you can, so I assumed it was similar globally.
Wow, Metro Exodus runs well on the 7900XT with ray tracing compared to the 4070 Ti.
Until you flip on DLSS 3.0 and your frames double.
@@JBlNN we're measuring performance here, not imaginary frames.
4070 Ti - avg 81fps, 7900XT - avg 68fps... so what are you talking about?
Bro, Nvidia easily performs better than the XT in Exodus. And it's not only the averages: in RT-heavy scenes, fps drops considerably on the XT.
Imaginary frames lol. Stop being an amdonkey. It increases the smoothness. DLSS 3 is super good and almost every 4000-series user uses it. With DLSS upscaling you can spot differences, but with DLSS 3 it's even hard to spot a difference, it is that good. In Metro the 4070 Ti would be twice as fast.
@@ChuckTheChosen Still double the fps no matter what you say lol
Finally AMD is catching up with Nvidia on RT and FSR.
With next gen GPUs Nvidia sure needs to find something special to keep justifying these insane prices
The RTX 4070 Ti is 990 USD and the RX 7900 XT is 1000 USD.
@@kionmahuermicio9860 7900XT reference models and XFX models are at $870-$899 on Newegg. The only ones that are $1000 are the Sapphire cards.
@@kionmahuermicio9860 Yeah idk where you're buying your parts but that's a straight up lie
The 12GB of VRAM kills the RTX 4070TI for future proofing.
I just wished for above-60fps 1440p not costing $1200 😭
RT off?
It's on.
AMD is still way behind nvidia in RT... damn...
Something must be going wrong with Metro Exodus. With my 2080S at 1440p, RT on and everything at the highest settings, I get the same FPS. Same with The Witcher; even there my 2080S performs better. Hmmm
The extreme preset and max RT with no DLSS in the PC enhanced edition? My 2080Ti and 3070 would also get outperformed by your 2080S. The Witcher's DX12 is broken so to a certain extent it's a broken benchmark, though performance does still more or less scale through different tiers of card.
Good video. Poor AMD. I love my 6900XT but they just can't compete.
That 4070 Ti has a 192-bit bus, for crying out loud.
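For reference, the bandwidth math that bus width implies, using the public spec-sheet numbers (memory speeds as listed by the vendors; treat as approximate):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate per pin.
bw_4070ti = (192 / 8) * 21   # GDDR6X at 21 Gbps -> ~504 GB/s
bw_7900xt = (320 / 8) * 20   # GDDR6 at 20 Gbps  -> ~800 GB/s
print(f"4070 Ti ~{bw_4070ti:.0f} GB/s, 7900 XT ~{bw_7900xt:.0f} GB/s")
```

The 4070 Ti leans on a large L2 cache to compensate, which tends to work better at 1440p than at 4K.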
Tbh the 4070 Ti starts to suffer at 2160p relative to the 7900 XT. I would've tested that resolution, but the courier messed up the delivery, which put me 24 hours behind the rush.
But the performance difference is very low compared to the 6900XT vs the 3080; there is much less of a difference in RT now.
@@fpscomputers976 I have subscribed so I don't miss out on your videos. Can't wait for the 4K comparison.
@@androgaming4476 That was my takeaway tbh. I'm genuinely shocked that people think this benchmark is intended to put AMD down because I think it shows real progress... Lol
This is AMD's 2nd generation RT against Nvidia's 3rd, and they are already almost neck and neck here in this performance class. With my 6900XT, I wouldn't even consider putting RT on. But with this card you would do it because the performance loss isn't that bad.
So with RT enabled the cards are about even, but with it off the AMD card wins hands down...
New games nowadays look good enough even with RT off anyways.
The only game I'd ever turn it on in would be Minecraft.
Is nobody gonna comment on The Witcher 3's abysmal frame pacing, or what?
One of the worst 'updates' in a while; they didn't charge money for it for a reason.
Not even a single stutter on my 4090
No difference for me with my 3080 12GB OC'd, 670 euro new. The 4070 Ti's price is not good.
Man 4070 ti is a much better value than 7900xt
What do you think is better value between the 4070 Ti and 4080?
@@francomora909 The 4070 Ti is better value, but not by that much... go by your budget, and if you can get and fit a 4080, go for it.
If you are limited on budget (and want RT), go by your resolution: 1440p -> 4070 Ti / 4K -> 4080.
If you don't care about RT, go 6800XT (or wait for the 7800XT) for 1440p and 7900XTX for 4K.
Bro, play in 4K and the 4070 Ti struggles 😂 I would rather buy a 3090 than this card; it's meant for 1440p, with low VRAM, and it costs more.
@@androgaming4476 correct
@@gametime4316 I'm playing in 2K currently.
Well, let's be honest: who is playing RT games? What percentage of gamers are doing that? This is a poor way to prove that the 4070 Ti (or was it the 4080 :) ) is worth something, while the truth is that it's not a good deal.
Bro, he's just benchmarking. He isn't saying one is better than the other. And how many ppl are using RT? Can you tell me what % of ppl actually own high-end GPUs? You should go to Steam and check. From the 3080 and above, including the 7900 series, it's not even 10%, so obviously these ppl are gonna come and say "I don't care about RT" because they never experienced it, or the fps was so low that it wasn't sustainable.
It is a shame. The money AMD and nVidia ask for their GPUs is ridiculous! Average FPS is between 60-70. NOBODY should buy them. If they were about 500€ for the RTX and 400€ for the AMD, that's the spot. NO GPU can handle native high-refresh RT gaming. If they were capable of handling high refresh and RT, let them have a 1000€ price tag. But this performance is laughable. My RX 5700XT still rocks, and it cost 450€ back in 2020. F...cash grab. SHAME
now non RT please
Yes I want to see AMD more dead😂😂
There's a mostly rasterised set of benchmarks coming. It'll be 1440 and 4k.
@@ogxracer2008 No no, without RT AMD wins.
Nvidia... what happened. The 7900XT is right on your heels. I know, the 4070 Ti just isn't that powerful. Now imagine what the 4070 and below will be. They are just going to suck big time.
disable that rayscam shit
...but my water reflections and 30% performance loss...
4070ti is better value. Nvidia wins in high tier on this generation
They are both garbage. $800 for a 1440p card that can't even run native 1440p with RT is NOT a value in any way, shape or form. Imagine spending $800 for a GPU that's worse than a PS5.
@@Dempig I see every game over 60fps, what are you talking about?
@@Dempig This graphics card performs the same as a 3090, which costs 1800 USD. It isn't necessary to buy; you can get a 2060 or RX 5700XT and play every game on the market.
@@marcomarroquin8403 It doesn't perform the same as a 3090 at 4K, it's worse.
@@Dempig worse than ps5 😂. U high?
This dude is an NV fangay.
Pure amdonkey.
In Norway the 7900 xt costs the same as the 4080. The xtx is almost the same cost as a 4090
1% Lows are a lot better on the 7900 xt
nobody uses Ray tracing
Yes. Most gamers don't need RT.
They need to bring prices down more.
Yeah, rt is useless, just because amd is bad in it, isn't it?
DOUBLE STANDARDS
If u don't have a 4000s card, true. Definitely don't use RT.
But if u do have a 4000s card, there's no reason not to use RT
Ray tracing isn't going away. The reason games look realistic is the lighting. You won't be able to stick your head in the sand like this for much longer. Some of us are enjoying amazing visuals, and have been. The people that can't run it on their pc always make these comments.
@@MultiNastyNate I can run RT on my pc and he is right.... Ray Tracing is useless!!!
Ray tracks🤡
Can't even spell "ray tracing" 💀
No RT pls or is this channel pro nvidia? Almost nobody cares about RT.
The title of the video literally says ray tracing games tested. What did you expect lmfao
Simply an amdonkey
I have a 75Hz monitor, so RT is important to me because I can't see more frames anyway...
@@anuzahyder7185 I'm not buying either lol.
RTX GPU is best.