It is funny how you people are comparing old and undesirable products in society with something newly researched and made which is also high in demand. You are the problem here. If you would look at anything cars related in aftermarket, they go in thousands of euros. A simple maintenance bill for a car is half a grand. So what you people are about?
They were testing the new cooler that can handle upto 600w. 4090 FE cooler already rated upto 600w if i remember correctly. During 40 series development nvidia actually testing cooler that can handle upto 900w.
I think focussing on their X600, X700 and X800 lines instead of trying to fight Nvidia with the X900 sounds like a winning strat. If your budget is the infinity sign you'll usually buy Nvidia anyway, but if they can beat every single card of Team Green in the 200-600 dollar range that would be neat.
@@detecta I fear even AMD has low key abandoned that class of cards, it usually makes more sense to wait for the x600 or even x650 XT to drop in price.
Most basic things are like 20% more expensive than 5 years ago. Even the things that weren't unobtainable for like 2 years. Expecting things to be the same price as they were a decade ago is delusional.
@@kaminekoch.7465Yeah, well salaries didn't go up at all or just a bit so companies can't expect people to be prepared to pay more. Its one of the reasons why AMDs sales have dropped badly.
@@reahs4815 4080S is about 300$ expensive(where I live) for the cheapest model compared to 7900XTX while the 4090 1000$ more. Imo, Either go 7900XTX or jump straight to 4090. 4080 Super does not make much sense even if you're doing productivity, you may as well push to a 4090.
I dont think fsr and other technologies are getting more focus than they already have. Once they killed highend RDNA4 they put people on RDNA5 we will be lucky if they announce it in 2025 but probadly in 2026
Honestly if they can keep upping the rasterizing performance and catch up in RT I'd much rather see that. DLSS is very impressive but it still has ghosting and blur issues, which is a pet-peeve of mine (and a lot of other gamers). I use my 7900XTX to run stuff natively, if that means no RT I'm more than fine with it because I really do not like upscaling artifacts. I tried DLSS3 on someone else's setup and still was annoyed by it.
Kinda smart ngl. Their highest end card is like $800 less and only 21% slower at 4k but can't reach the 4090, like ever. So if they make all their lower end cards just like that, same same performance as nvidia but like 20 - 50% cheaper? Could be better for us all.
They don't have the same margins on the low end cards, so they're going to need to do some innovation on that front first. Maybe if they can pull off an actual chiplet design that splits up the graphics die.
@@AAjax Pretty sure they're already on a design that is cheaper to manufacture than Nvidia's design, which means that if they keep the current offset to Nvidia they'll have better returns. Nvidia already had to backpedal 4070 and 4080 prices hard due to AMD. I think AMD abandoned their flagship because it would be very expensive to make but wouldn't sell well (like the 7900XTX). They'd rather take as much marketshare of the low/mid range, press their efficiency advantage and catch up in raytracing. If they succeed they can potentially match or even overshoot Nvidia's flagship in a generation or two and then slash Nvidia's prices. At which point Nvidia might choose to go all-in on AI instead, if that keeps being where the money is.
They did the same with RDNA1 (5700) and Polaris/GCN5 (480/580). It makes business sense when they can't come within shooting distance of the priciest NVIDIA chips.
@@Ranguvar13 The biggest problem with that is that even though there are a lot of tech UA-camrs and you can find statistics and information everywhere, people still base their mid-range purchase off of how well the enthousiast side of a company does. The amount of times I've seen people discussing AMD vs Nvidia using the flagship models when they all had midrange cards themselves... To then defend their purchase based on that discussion...
AMD ceased to "fight" Nvidia on the price at the low and mid range since the RTX 30X0 era, they increased their price to be lower than Nvidia but not completly undercuting them so they get a better margin. I wouldn't hope they will try to fight the price again they just seem focused on keeping their margin high and sell at the range they want.
I wouldn’t be surprised if AMD GPU users are less likely to report their hardware. I feel like AMD users skew more tech savvy and skeptical of tech companies using their data
Those same steam records also show a clear trend of AMD cards including the 7000 series increasing percentage wise month over month while there are many decreases in Nvidia's higher end, except for the top end cards. It shows a clear trend that people are not only buying AMD cards but also changing from Nvidia to AMD in some cases. There are many negative percentage valuses for the 2000 and 3000 series for Nvidia with the only clear win being the 4000 series, with AMD having increases across the board for 5000, 6000 and 7000 series.
But Nvidia is up 56% gaming revenue. Are you saying that that is only because of the 4000 series? If that's the case then the 5000 series will burry team red
Idk why Vex liked it, knowing the copium it is. The earnings calls don't lie, the Steam samples are random, and cards that are over 5 years old are still shown more. It's not "changing from Nvidia", it's "The 4060 is dogshit, so I'll take what I can get.", or "I want one of the best GPUs, but I don't have over $1k."
Thats what they want you to think. GPUs in general are horrible value today. Only true value cards are like $120 or less. You can find 1080s for 100 thats best value out there
@@knusperkeks2748 my 6700xt does just fine. So youre ok with the prices today then? Sounds like youre trying to cope with the fact youre getting fleeced by billion and trillion dollar companies.
@@knusperkeks2748 There was a time when high end never used to cost anything close to what it does today. Hope you enjoy paying 10k in the relatively near future. Also thanks for outing yourself as a boomer.
They are. That's why they abandoned their flagship card. Too expensive to make and if it doesn't beat or at least matches Nvidia's flagship it's just not worth it for them. They tried with the 7900XTX but it was screamingly expensive to make and didn't sell that well. Efficiency and decent performance at very good prices should be AMDs game, I agree.
I guess they are planning to go back to RDNA 1 era. The issue is that AMD still struggle with ray tracing RDNA4 is probably a test to see if their new ray tracing engine can rival NVIDIA. Plus, there's a strong bias against AMD regarding their driver. Those biases are very, very hard to get rid of.
That not the only thing not do they only suck at ray tracing but at everything else excluding raster performance that's the only thing their cards are good at anything else from TAA implementation to Features even intel is better than AMD not even better Intel smokes them also their cards have way lower resale value so anybody who build a new pc like every year or so stays away from them and there's nothing AMD can do about it they'd actually be better of if they drop manufacturing GPU at all and ust focus at CPU afterall that's where they get most of their revenue from
Tbh, GPUs are very hard to get right compared to CPUs which doesn't have as much exclusive features. In fact, AMD cpus have avx-512 support and intel doesn't anymore. Quicksync is almost negligible with how good gpus are nowadays. AMD needs a better driver team. Their publicly maintained linux drivers are always outperforming windows and it shows AMD GPUs have the hardware performance to match nvidia..but their software engineers are lagging behind nvidia.
6800XT here, first AMD card since...the HD 5850..no wait I had a 470 for a couple months in the middle somewhere that I was faffin with repairs on(used to repair GPU's for people) (I have some older AMD cards now, but I got them cheap for old PC's after I got the 6800XT for my main machine) after having Nvidia cards for ages(GTX 770, 1070, . when I was looking for a card it was during the late end of covid lockdowns etc, my choices were the 3000 series cards(with 4k on the horizon) or the 6000 series cards second hand. the 3000 series cards were going for the same price second hand as they were new, and the 3060 Ti was $1000NZD at the time, 3070 was 1400, 3080 was 2100 and the 3090 was 2400, and god knows what a 3090 TI was at the time, I saw one for sale for over 3k nzd. I ended up buying a 3060 TI and was pretty unhappy with it for the price, and that raytracing was annoying(it darkened corners and made stuff I want to see not visible, made reflections I don't look at prettier, but otherwise didn't do all that much to make it look better to me but cost a huge performance hit. a second hand red dragon 6800XT was going for $850 came up the only cards in my price range were the 3060 TI and the 6800XT, after looking at the stock speeds, and working out if I would use raytracing ever with a 3060 TI really, I checked benchmarks and framerates and realized the 6800XT smashes a 3060ti. I then checked overclocking related stuff and realized the 6800XT had massive potential to overclock like a beast, then be modded and overclock some more and the Red Dragon had so much high quality powerful fully populated VRM goodness. so I sold the 3060 TI for the same price I got it for and got the 6800XT. and a water block for it was $100, a second hand D5 pump and rad went for $40. so I had a water cooled 6800XT. then liquid metaled it. Then MPT to mod the power table to let it pull then used an EVC2SE voltage controller on it. now it gets 22k in time spy and found if I don't give a shit about raytracing, it floats between just above a 3090 to just above a 3090 Ti in performance depending on the game. I have no regrets getting the 6800XT. would I get another AMD card? yea, yea I would. still wouldn't fanboy for em tho, if Nvidia comes out with something that competes for the price I'd go them instead. but 8gb vram shitbuckets for stupid money that don't even perform that well? yea no thanks, then right after that cards that cost even more, still have only 8gb vram, and BARELY perform better than the last gen? yea no thanks.
One of the best GPUs ever made for low budget I'm still trying to tell Nvidia fan boys about it, but they get BUTT HURT when I show them it mops up a 3060ti for less money lol
Last time they refused to compete in high-end segment was actually a good move for them. RX400 series were just a great value mid-range cards. Because in mid-range greater amount of VRAM and highter raseter performance is still a great selling point. In high-end you would have more VRAM and raster performancethan you need anyways so only features like RT and upscaling become a decisive factor - and Nvidia wins effortlessly. So I would ultimately see it as a positive change. It's better to just refuse to compete when you can't win rather than loose money on a product that cannot compete and thus will fail to bring profit.
I absolutely love my 7900XT. Overclocks like a beast and can handle cyberpunk at 3840x1620 with 60+ graphical mods and still get 90+fps average. Best purchase ever
@@arcrides6841 cyberpunk is very well made in rasterization if you really want to see a difference in ray tracing then you will have to use path tracing and no card other than 4090 can do that at the highest settings above 1440p at any reasonable frame rate
@@arcrides6841 U love RayTacing?...Good!!...I don't....Love DLSS??..Yeah it is Fake frames..& not Fan of this 16pin fascination...7900XT gives me enough Real frames!! No CPU Overhead problem! Believe me I still have my GTX_1060 6GB & I Love that If your GPU can Run your game pretty good.. U happy at the end of the day...That's enough. Red, Green or Blue, doesn't matter.... though RGB matters.
The people who bought 1200w power supplies are no longer looking silly. 600w GPU is crazy, throw in a 300w CPU and that's 900w just from 2 components lol. With overclocking you'd probably crash the PC from running out of wattage, not to mention the components would be on fire regardless of what cooling you use.
What players want: low power consumption native 60fps@1080p $200 What brands and UA-camrs push: high power consumption dlss fake dps@1440p/4k $2000 This is why.
I have a 7900XTX for 1080p (because I could get it for really cheap). I can run everything natively without any upscaling, and even though DLSS runs circles around FSR for quality right now, nothing beats native. I hate ghosting and the blurry mess you often get with high speed movement when upscaling. Not to mention the card isn't even trying so it's actually running games at ultra settings using less power than my 1070 did (unfair comparison, I know) at medium settings.
Nvidia pushing RT is what's ridiculous. Even today, it drops frame rate by half to get only slightly better lighting than pure rasterization. I picked the 7900 GRE over the 4070 Super just for this reason. Better rasterization performance and more VRAM for less money. Screw RT.
nah, at this point it's time to let 1080p@60fps go (not for 200$ tho), I would rather have 1440p@144fps or higher any day of the week. (i am an outlier though, as higher fps is unnecessary unless you got a 360hz screen like my stupid ass but I stand by that 1440p@60-144fps is what you would want in the current day)
Videogame crash is happening, because of absolute idiocy of "Game Passes," and Monopolization; due to this, the quality of games is decreasing, so there's literally no need to keep making ever more powerful graphics cards, when you can start focusing on Efficiency, Memory, Power Consumption, etc.
Well, we're done this new Gen if it's true that the 5080 will be "just" at the same level of the 4090, but costing $1200-1300, that's just insane. Second best GPU from a new gen has always been much better than previous gen flagship. If AMD can hopefully make a mid/high end card for $649 that matches the 7900XTX raw performance with better ray tracing, that would be the way to go
Built my niece a PC for Christmas, Built 2 more, upgraded my desktop, about to do so again, upgraded my gaming rig. So far 3 Intel Arc GPU's, A380, A580, A750. 2 AMD, RX7600, RX7600XT. The XT is now in my Gaming Rig does RT just fine over the RX6600 then the RX7600, the 7600 is in my desktop. So just in the last year I have grabbed 2 RX580's, 1 RX5500XT, the 7600 and 7600XT and 3 Intel GPU's so a little over 1300 bucks total 8 GPU's. Still less than 1 4090. All my PC builds plus my old Board in a test rig frame are AMD CPU's.
If AMD wants to steal the market, they gotta start innovating and make something new that's headline and hypeworthy. Instead of following Nvidia they gotta make their own "Rtx" or "Dlss" moment
Or they could ditch the "High end" segment entirely and follow one plus/Chinese phones policy, that is to bring incredible value for money at low-mid level segments. I am talking 4060-4070ti level of performance at half or at least 2/3rd of the price. That would also make headlines.
@@naqibfarhan4356 They sont have to be at the high end and beat Nvidia's top cards, they just gotta do something to make better hype. Basically they gotta advertise better and use stuff like rtx for that advert hype like Nvidia did
Prices are ridiculously high, most people wont buy Graphics Cards at the current price. Which is a bit of a issue because they increase the prices because they don't get enough sales, but the higher the price increases the less people buy them. I for one only buy AMD, Nvidia in my country is about 30-40% more expensive.
Nvidia cards are only around 5-9 FPS faster in their top cards and does not justify a 2000-dollar price point when AMD is only asking 900 dollars for their top card which can play pretty much all games in 4k@60 fps. Nvidia uses major over hyped advertising because they target kids that have no experience to know any better that there paying an extreme premium for 8fps and they also use their latest Gimmick called Ray tracing like they did in the past with Hairworks so to be honest there Gimmicks work because Nvidia targets kids and persons that are not informed so therefore they get tricked into buying an overpriced GPU and with Nvidia paying youtubers to hype up there GPUs were likely to see Nvidia stocks continue to rise until people start to see through there Gimmicks
AMD cant compete and people act surprised? Took them until RDNA4 to implement ray tracing hardware that nvidia had back on the 20 series. Then intel came in and beat them in the low end on their first try... with much better upscaling.
@@baronvonslambert Its more about hype, having stuff like "giga omega better graphics" just makes headlines and gives fomo to the users. Atleast on the lower end cards
@@baronvonslambert it's marketing trick cause people are fukin' stupid. people thought that ps5 will do 8k gaming cause Sony said "8K" when releasing ps5 and 8K is on ps5 box. this is how stupid people are.
@@vigilantbruiser1119 I'm sure nobody is forcing you to use a better monitor. The vast majority of people will have a better experience and most likely no issues with eyesight when using a higher resolution and/or higher refresh rate monitor, so I don't really see why we should collectively stick to 1080p@60Hz. But we're probably gonna be stuck with that for quite some time anyway, since monitors with a high resolution and high refresh rate combined are unfortunately still incredibly expensive.
I don't know why so many people still run after Nvidia. The GPUs are far too expensive for what they do. I currently have an RX 7700XT and my god am I happy with it. I have no problems with either the drivers or the GPU The price of the GPU was bad before but now it's a blast. I don't think AMD will stop, they're also getting better and have better prices
brand recognition. like iphone. amd don't have such high brand recognition. even when an amd gpu is better than nvidia at cheaper price, most people will buy the nvidia cause it's nvidia. amd has to find something to innovate like nvidia did with RT and dlss to increase their brand recognition. even though RT is dogshit cause it eats even 50% of performance, people still fall for this shit. people are stupid, so if amd finds something than nvidia doesn't have, people will fall for it.
i have a 6900xt, paid about 680 new, with shipping. also bought the 12 core amd cpu,non 3d, and 64gb of ram. i wont need another computer for 8-10 years. companies expecting people to buy new computers every year or so is dumb. dedicated rt hardware is dumb, every gimmick nvidia does is dumb. the software based rt is already almost as good. per usual nvidia bribed game studios to include gimmicky crap that only they support because they came up with it for the sole reason of saying "look,they dont have it." amd has done this a few times in the past,last one i can think of is the hair graphics thing. i dont think amd is truly trying to compete at the highest end. the board partners have said their cards could handle more watts but amd locks them out of juicing them. yeah the 4090 is about 20% better, but it uses 150+ watts more to do that. that's a hell of a lot, really speaks to how efficient amd is. also amds focus has been on the cpus, mainly server for a while. not a great reason for either company to devote their best people to gpus when they just dont bring in the money like their other stuff. also, keep in mind most prebuilt computers dont even have amd gpus as options, only a couple have offered them the last couple years.
Lol.... Funny that the intel iris xe integrated GPU is one of the popular gpu lmao! Dunno but makes me feel quite proud cuz I own one haha! great video btw 👍👍!
Did you go to Business School or something? Your Titles and Thumbnails are on another level man..... I dont think I have ever seen you make a title or thumbnail that is nearly impossible to not click for someone interested in the PC space. I am actually being serious because you are very talented at what you do & would like to get better at it myself.
Got a great idea! AMD should just focus on cpus so Nvidia can concentrate on pricing GPU's however they like because fanbois will pay $3500 for the shiny new AITX5070 with 12gb vram and they will love it and rejoice that no inferior products are clutrering up their hallowed shelves. Good job Vex ypu won
AMD being AMD... You can't charge 90% of what Nvidia charges when your product is clearly inferior... Nvidia has better upscaling, better Ray Tracing, better frame generation and they have convinced a lot of people that that stuff matters... AMD could fight back with more V-Ram and better prices, but they refuse to... Their low-end cards have the same pathetic 8GB V-Ram that Nvidia's cards have, and they cost about the same... but Nvidia has a better up-scaler and better ray tracing, does it matter on this class of card, well, no not really, but if the cost is the same why wouldn't I. This has always been AMD's problem. They have never been the first to implement a new feature, they are always copying Nvidia... and the copy is never as good as Nvidia, and they don't offer a big enough discount for customers to accept an inferior copy of whatever Nvidia's doing. AMD isn't as good as Nvidia If they want to compete they need to: 1. Offer more V-Ram than Nvidia AT EVERY PRICE TIER. 2. Stop with the fucking clamshell design and actually offer the wider memory bus that Nvidia refuses to offer to up memory bandwidth. 3. TAKE THE FUCKING RAYTRACING SHIT OFF THE LOW-END CARDS THAT ARE NOT POWERFUL ENOUGH TO EVEN TURN IT ON. 4. Accept a smaller margin on their products so they can LOWER THE FUCKING PRICES, cuz no one is going to buy AMD for 90% of Nvidia's price, they have to make it more like 75%... See, the 4060 is a clamshell 8 Mb V-Ram card on a 128-bit bus and it costs about $300 USD. The 7600 is a clamshell 8 Mb V-Ram card on a 128-bit bus and it costs about $270 USD, that's a 10% discount. NO ONE IS GOING TO BUY AMD FOR A 10% DISCOUNT. Now imagine that the 7600 has 12GB of V-ram NOT ON A FUCKING CLAMSHELL DESIGN, with a real 196-bit memory bus, and the price is $225 now would you be interested. OK, so the up-scaler still isn't as good, and the Ray Tracing still sux, but the memory is much better and so is the memory bandwidth and the price... but AMD couldn't do that and make a profit on the card. YES THEY COULD YOU GODDAMN MORONS. They wouldn't make nearly as much profit as they would like to make, the margin would be much smaller, but they would absolutely still make a profit, and they would take over the market segment that the majority of people buy in. Most people cannot afford a 4090, If you look at the steam survey the market is dominated by the more affordable cards. The card Nvidia offers in this segment is a joke, the 4060 is crap, problem is the AMD offering, the 7600 is just as crap and not any cheaper so why wouldn't I buy a 4060 and 90% of people do... AMD is stupid, Nvidia isn't even trying in the budget GPU segment, the 4060 is a joke... All AMD had to do was make the 7600 not dogshit... and they couldn't do it... so Nvidia wins again without even trying, cuz AMD is Stupid.
AMD is always the worst option. Has been since they started as ATI and will always be. AMD has invented and put in features at times (Tensilica DSP, TrueAudio Next) but those are always useless and nobody wants it. AMD does that always same like with their Ryzen CPU. Absolute junk and Intel always has the far superior options.
Their problem is marketing and not keeping everything closed source and being a unethical company like Nvidia. Most of AMDs techniques are open source. Vulkan was started using Mantle which cost AMD millions in R&D yet they gave it to Khronos for free, who turned it into Vulkan which we now all enjoy. Also, when you turn off RT the value for money on AMDs side is insanely better than Nvidia right now. Which means you can buy and AMD card that, without RT, can actually run games at native resolutions without having to upscale for the same price.
AMD cant catch up to Nvidia. Its like stoping a big wave with only your hands. They need to focus on the things they have and Nvidia dont, Vulkan or Fluid motion Frames. They will never catch up to Nvidia otherwise.
@@stysner4580 The unethical Company lets me play Graphics far faaaaar superior to AMD. AMD does not the Open Source thing because its lead by Jesus, AMD needs to because otherwise none buys one. Their GPUs are nearly the same Price as Nvidia Cards and AMD is also selling them 1000€+ Btw, i got more Features or improvments on Nvidia side as on AMD side and thats why i pay more for Nvidia.
AMD isn't going anywhere. They just won't build stupidly big chips with 25 billion more transistors than anything else like nvidia did with the 4090. My 7900xt isn't on the steam hardware survey, nobody I know let's steam interrogate their system either.
Pretty soon at Nvida’s power consumption you’ll need a dedicated circuit just for your PC, most homes only have 15 to 20 amp circuits with multiple outlets on one circuit. The average person doesn’t understand wattage to amperage, 120 watts = 1 amp this is a subject you should make a video on. So a 4090 is pulling about 5 amps, a 4K monitor pulling up to 2 amps (some less) cpu maybe 1 amp so on a 15 amp breaker your already at 50% of what it can handle before tripping. Also sustained heat on a breaker weakens it, they are at the point say your microwave and pc are on the same circuit if your pc is on and someone uses the microwave the breaker will trip.
14:43 Nvidia has traditionally not been first with new technology. It has always been a back and forth between the two. ATI created Tessellation in 2001, Nvidia was behind for a good while before they marinated an even greater solution. In 2002 ATI created the first DX3D 9 compatible GPU. In 2015 AMD gave us Async compute, paving the way for DX12 and Vulkan. Then that same year gave us the Fury X, the first GPU featuring HBM memory. Most recently in 2021 AMD gave us the worlds first chiplet based GPU, the MI200. Nvidia gave us stream processing units and CUDA in 2007, which were both exceptionally important developments. Most recently Nvidia is pushing Ray accelerators and AI which are both hugely successful.
All of what AMD has pioneering wasn't unachievable by Nvidia, AMD never succeed to replicate the success of CUDA. Wish is why even if somehow Nvidia started to lose the gaming market; steam hardware survey still shows Nvidia with 76% of the market while AMD is not even at 20% but at 16% and intel at a bit less than 8%, it’s hold on the professional and now the new AI market will enable them to overcome AMD when needed. I don’t think AMD could even compete in the AI market with Nvidia given the cost in R&D and the massive gap in size between them. I also forget that DLSS and RT where and still are Nvidia domain where AMD still struggle to replicate what Nvidia achieved first.
@@itachiaurion3198 Something to think about here is Nvidia's current success is largely attributed to the current interest in AI. the 40 series have been selling terribly compared to previous generations and that has everything to do with the fact that Nvidia cards are absurdly overpriced even compared to the 30 series which were notorious for being a terrible value. The 10 series GPUs were a serious high mark for them, they were fast and reasonably priced while offering a substantial increase in performance over the previous generation. The current generation of cards just don't make sense for gamers, I would argue their pricing is more in line with enterprise applications which is coincidentally why Nvidia's stocks have been skyrocketing. This is a short term trend though, Nvidia has the AI market cornered right now, but it's only a matter of time before AMD catches up or, most likely, purpose built cards made for AI become the industry standard. If and when that happens Nvidia will be in serious trouble, what the 40 series has done more than anything is sully their name in the eyes of the public and make them seem downright greedy.
@@Todd_Coward If AMD can't even play catchup with the best Nvidia as to offer for gaming, how can they catch up on a more difficult market? Maybe Intel will try something for the AI but I don't see AMD finally succeeding at overcoming Nvidia. Unlike Intel, Nvidia will not wait 5 to 10 years that AMD will catchup before waking up. They price may be too high but they are still in the lead in regard of the technology and don't seem to let AMD catchup anytime soon. Their success come from the fact that they are the sole leader in the 3D rendering space since at least 2015 and all of their gpu sell like hotcakes, even in the gaming market with the 40X0 as the exception and even that I have doubt. The most recent hardware survey from steam show that the top 15 spots of GPU are all Nvidia with the RTX 3060 in the lead and 10X0, 20X0 and 40X0 and other 30X0 in this top. Then we have AMD and Intel integrated processor in 16th and 17th place. Heck even the 4090 is nearly at 1% of the survey and it's not the better placed 40X0. The first named AMD GPU is the AMD Radeon RX 580 with 0.83%; Nvidia is still at a very comfortable 76% of the market from those steam hardware review. Even if steam is somehow the HQ of all the Nvidia chill AMD can't have more than 30 or 40% of the PC gaming market from those result. Even if the price doesn’t make sense, it seems that players bite the bullet and buy the gpu’s anyway. I really don't see AMD pulling a hail mairy and finally overcoming AMD in the 3D market while they are still behing Intel in the CPU market and far behing for the gpus.
They could certainly get Ray Tracing improvements to the moon in one go if they wanted to, same as Intel did. They have the ability to kind reverse engineer what's already been done too, they probably know to a very high degree what they need to do to get up to par.
We had rumors of the 4090 going to 600w as well. Lots of them. But that's the power limit with an OC on some expensive models like the Strix. I'm sure they'll stick to under 500w.
I found a youtube channel of my country that brainwashes people saying amd cards have problems and if you don't want problems buy nvidia and all of people trust him and go yeah bro thanks for telling us and informing us i feel so bad for those people who trust him (Most people in my country are poor and want to buy a cheap card just to play old games like takken 6-7 and gta iv-v)
I love my 6800XT but the first one was a dud and had to be RMA'd. The newer one is doing great but a little louder than it should be. This is my first ALL AMD system I have build in over 13 years. Usually I build with Intel / Nvidia and in all honesty, I will be going back to that combination on my next build in a few years.
Yeah I just took my 3070 out and replaced it with 7800 XT. While the green machine has some good features, I am putting my money on AMD. NVIDIA needs a wake up call.
DLSS changed the game. I am playing AAA games on my 2022 midrange laptop with a 3050, on high graphics, at constant solid 60fps, only drawing 45watts of power. You people don't understand how game changing Nvidia's hardware accelerated ai super solution is. It basically extended the life of entry level Nvidia GPUs.
it feels like this happenned wayyyy wayyy back before on the vega vs 1000 series era when vega 64 and radeon 7 trying to close the gap to 1080 non ti and gave up since it can't close the gap to its superior 1080 ti and a few years later dated now 2024, they're like ... its time.
Then your world is stuck at AMD only. Go out from that AMD cacoon and see the reality of what GPU really is. Then you will understand why nvidia dominate GPU world
@@arenzricodexd4409 you are talking sh1t. I've owned AMD and Nvidia gpu's , for someone who don't wan't to spend a lot of money for midrange gpu AMD is the best in price/performance. I owned a 2070 S ( is in my daughter pc) and now in my pc i have a MSI 6800 non xt paired with and AMD 7700 non x cpu and they work beutiful togheter.
7900 GRE is worth the slightly higher price tag over the 7800 XT, especially after overclocking the gimped memory, although you can't go wrong with either choice.
Marketshare doesn't lie. Nvidia is vastly superior in every way. People pay premium for feature set. This is why Nvidia hold 80% market share. It's not always about how cheap a card is if feature set and software is terrible. FSR is a joke. Also the reality is AMD will never catch Nvidia in software or hardware. They are so far behind it's an impossibility. Even INTEL who is new at this has a superior upscaler than FSR. It's hilarious. AMD doesn't push any new tech ever. When is the last time AMD introduced something that changes the market? Set's a new standard? Never. AMD just reacts to every new tech Nvidia brings forward in the most gimped way possible. It's a joke. In b4 AMD fanboys not willing to accept reality. Cost doesn't mean anything. Market shows this.
They would,lose money by levelopingnhigh end gpus… that don`t sell enough! The point is… development eats money. The money has to get back by selling product… and AMD gpus don`t sell! NVIDIA market share is 78% AMD 21% so NVIDIA gets 4 times more money if they would use same money to development!
I'm still holding on to my RX 5700 XT and I'm not in a hurry to upgrade. I left nVIDIA behind 20 years ago when ATI released their Radeaon X800 card and I have not regretted that choice at all. I got a allround more stable card and I have never had any blue screens of death since with any Radeon card or any driver since. I don't use raytracing and so it's not a problem for me. I rather have a bit slower card and stable than an insane over-priced card and unstable. Not alot have changed in 20 years sure but apparently alot of the issues I had back then are still pressent in the current drivers.
@@cloud380 I’m not thinking what you’re thinking I’m thinking. I watch people skip it constantly by dismissing its little pop up it does when the survey rolls around. Ofc I always participate in it.
The 6900 XT was my first AMD GPU, I’d always gone team green before. I upgraded to the 7900 XTX, and had intended to upgrade to whatever the top of the line 8000 series card ended up being. I suppose now that isn’t going to happen… guess the 7900 will have to carry me through an extra generation or two 🤷🏻♂️
A friend of mine got a 7900xtx just to find out it was slower than his old rtx3070. DDU was used in safemode to remove drivers and everything, 7900xtx was still slower. Maybe it was a defective unit, but after all of the throubleshooting he just returned it and got a 4090 instead
This rumor happens every new generation. The only time it held true is when they stopped making cards like the Radeon 7. I think they will keep competing with the 80 cards, which is good
I paid $750 in December 2021 for my RTX 3060. I can't justify changing for anything else right now. I have to keep this card at least until the end of 2025. Only then will I consider AMD... If they are still in the GPU segment. On second thought, I still have haunting memories of the RX 570 I bought several years ago: onscreen stutters, continuous coil whine, intense heat and fans blowing like a Jumbo jet is about to take off. So maybe not AMD.
The 3090 was a more perfect RTX card than my 2080Ti. The 4090 is RTX perfected. The problem however is that horrible power connection. 5090 will maximize FPS when Ray Tracing and DLSS are enabled in 4K. It’s gonna be a monster. I’m sure the power connection will be better this time around.
With the power estimates, i'll probably only buy from an AIB that does triple or quad 8 pin honestly. I don't want 12HPWR on the 5090, shit is a time bomb.
Amd wants the same amount of Money for around the same performance with significant lesser features. You see what AMD cards sell well: 7800 XT, that's it.
If you want something for vr you have basically no choice. And my previous 1080 gtx made a lasting impression on me, my amd 4890 did not when basically it broke. That is why u bought a 1k bucks 4080
We will never get back to the days where x60 was ~ $200, x70 $350, and x80 $500. Prices decreased from 2007 to 2012 but then rose again dramatically around 2018. Even accounting for inflation, a x80 should be ~$800-$900.
Sorry to hear AMD is slumping with sales, I am happy with my 7800xt. I think it is good AMD is there, cause it will help make graphic card prices lower. If Nvidia is the only company, then they will raise prices for us gamers. There is also Intel, who knows maybe they will pickup in 2 years or so with their new silicon foundry. I like what that guy making his own gpu "Fury". But the bottom line FPGAs have to be more affordable and powerful so you could have decent OpenSource gpus that would drive prices further for gamers. Thanks for reviewing about AMD, Vex.
I like how we are returning to our roots of gaming with Upscaling implementations. Nothing beats a game at a low resolution with sharp User Interface for me.
Just goes to show how effective bs marketing is on the average person. DLSS 3.0 is a horrible technology that increases framerate while INCREASING latency. The whole point of high framerates (above something like 90fps) is to get lower latency. Meanwhiley raytracing looks worse than rasterisation in all except for maybe 3 Nvidia-sponsored titles, because devs just use shitty plugins they dont actually understand just like with TAA. But your average dude-bro with no understanding of tech sees the shiny new tech and wants it regardless. A friend of mine bought a 4070 a few weeks ago, and tried rtx once, then immediately turned it off because it halved his fps for a visual downgrade over raster. He at least had the excuse of owning a shield that he uses actively. Nvidia has perfected the Apple strategy of making things look new without doing any actual improvements (rounded vs sharp edges).
I never noticed the danger of AMD was INTEL's GPU starting to grow up quite fast on the market, more than going to join the NVIDIA's "high end" GPU's route. Somehow, it made me remember the old times AMD was fighting INTEL again from their processors unit progression in the market. Thanks for the video, you made a new subscriber. Have a nice day☺
To be fair to AMD, Intel and Nvidia have engaged in extremely anti-competitive business practices in the past to kick AMD out of the market. AdoredTV has some videos on their past. Nvidia for instance has been caught modifying drivers just to go faster on press benchmarks and have included technologies in their games like Hairworks that cripple performance on AMD cards with no path for optimization for them.
Naaaa, if you search a minimum you can found that amd call it for a 9900xt(x) / 9950xt(x) like as twice the power of actual top-end GPU. The thing is: they aren’t ready yet, so (in my opinion / i think) they are going to replicate what happened with Vega to current XT generation. « How ? »: by sending Nvidia in the corner by releasing it something around 6 months later after the 8th gen / during spring 2025 (maybe summer). I call it again : they gonna sacrifice the next gen who is gonna be really short in time, or an intermediate one, and release something that gonna put nvidia in a hell of a day.
A lot of us may be still rocking on older AMD cards. I have an RX 590 that plays Starfield and DD2. While I don't really need a new card, I've been considering either a 7000, or an 8000 when they come out.
Ey bro, I can't wait to be proven wrong
I hope so
I hope so too I have a 6950 and I love it
RDNA 4 will hopefully be the tipping point for at least 20% more of the buyers
what we can say is Nvidia win market
Really wanted you to be wrong but... All the statistics and info you have provide seems to be correct.
don't like fake tweet thumbnails at all tbh..
Tbh thumbnails were just a straight negative for UA-cam
Yeah he meeds to stop that its kinda gay
@@lemonke5341not kinda but super gay
How else are you going to click on the video? It's blatant clickbait.
You can tell they are fake though. Clickbait aside, it makes it really easy to tell what the topic of the video will be at a glance.
I can rebuild a used motorcycle for cheaper than I can buy a video card. That's a problem!
My brand new 150cc motorcycle is cheaper than my GPU 🤣🤣
Man I hope to one day do that too lol. Maybe a zx7r, it's one of those old school bikes I love the look of, especially the purple and green livery.
My 4080 cost more than my car. But my car is 23 years old....
My perfectly running '99 Honda Accord was 1400 bucks
It is funny how you people are comparing old and undesirable products in society with something newly researched and made which is also high in demand. You are the problem here.
If you would look at anything cars related in aftermarket, they go in thousands of euros. A simple maintenance bill for a car is half a grand. So what you people are about?
“It’s gonna go to 600w” LOL without connectors melting?
They gonna put 2 300w connectoes on it
@@Shahzad12357 good, twice the 12VHPWR, double the house fires
Right?
Right?
They were testing the new cooler that can handle upto 600w. 4090 FE cooler already rated upto 600w if i remember correctly. During 40 series development nvidia actually testing cooler that can handle upto 900w.
Hopefully they learn and use better connectors this time.
I think focussing on their X600, X700 and X800 lines instead of trying to fight Nvidia with the X900 sounds like a winning strat. If your budget is the infinity sign you'll usually buy Nvidia anyway, but if they can beat every single card of Team Green in the 200-600 dollar range that would be neat.
Bold of you to assume that Team Green has cards for 200$
@@khursaadtRTX 3050, but why would anybody buy that?
im praying to god they push out an actually decent x500 too
@@detecta
I fear even AMD has low key abandoned that class of cards, it usually makes more sense to wait for the x600 or even x650 XT to drop in price.
@@Alias_Anybodyyeah 6500xt was generally shit and 7500xt doesnt even exist
I just picked up a 7900XTX and I couldn’t be happier honestly. Insane improvement over my 1080Ti.
Same, got mine after my 3080 died and even that was a noticeable performance gap. Going from a 1080 Ti is probably nuts.
I got myself a 4080 a little while ago, going from a 1070 since 2018 now that's a crazy gap
why would you buy amd when nvidia is better value at the top end
Going from a GTX970 to an RTX3060 was wild for me lol
Such a shame more people don't buy Radeon imagine the prices NVIDIA can change now.😢
The 1060 costed less than 300. As long midrange cards are above 400 and low-end above 200. most people ain't buying
Most basic things are like 20% more expensive than 5 years ago. Even the things that weren't unobtainable for like 2 years. Expecting things to be the same price as they were a decade ago is delusional.
@@kaminekoch.7465Yeah, well salaries didn't go up at all or just a bit so companies can't expect people to be prepared to pay more. Its one of the reasons why AMDs sales have dropped badly.
@@kaminekoch.7465 Delusion is expecting people to pay more and get less.
My 2060 was $250. Still very happy.
@@LtCommanderTato Which GPU costs more than 300 and performs worse than GTX1060? Thanks in advance
With 80% of the 4090's speed at 4K for a grand less, I'm very happy with my 7900XTX.
So a 4080S?
4090 is just a dumb card for rich people with terrible price/performace
@@reahs4815 Ah yes i cant afford it thus it must be dumb
@@reahs4815 4080S is about 300$ expensive(where I live) for the cheapest model compared to 7900XTX while the 4090 1000$ more. Imo, Either go 7900XTX or jump straight to 4090. 4080 Super does not make much sense even if you're doing productivity, you may as well push to a 4090.
@@elderman64 what? I called the card dumb
7900xtx isn't even close to 80% the performance when you enable RT
I'm fine with AMD not releasing new GPU's this year if it means they focus on improving FSR and the existing feature set.
Facts
I dont think fsr and other technologies are getting more focus than they already have. Once they killed highend RDNA4 they put people on RDNA5 we will be lucky if they announce it in 2025 but probadly in 2026
Honestly if they can keep upping the rasterizing performance and catch up in RT I'd much rather see that. DLSS is very impressive but it still has ghosting and blur issues, which is a pet-peeve of mine (and a lot of other gamers). I use my 7900XTX to run stuff natively, if that means no RT I'm more than fine with it because I really do not like upscaling artifacts. I tried DLSS3 on someone else's setup and still was annoyed by it.
It'd also probably mean that Nvidia GPUs are going to go higher in prices
@@stysner4580what xtx do you have?
Kinda smart ngl. Their highest end card is like $800 less and only 21% slower at 4k but can't reach the 4090, like ever. So if they make all their lower end cards just like that, same same performance as nvidia but like 20 - 50% cheaper? Could be better for us all.
They don't have the same margins on the low end cards, so they're going to need to do some innovation on that front first. Maybe if they can pull off an actual chiplet design that splits up the graphics die.
@@AAjax Pretty sure they're already on a design that is cheaper to manufacture than Nvidia's design, which means that if they keep the current offset to Nvidia they'll have better returns. Nvidia already had to backpedal 4070 and 4080 prices hard due to AMD. I think AMD abandoned their flagship because it would be very expensive to make but wouldn't sell well (like the 7900XTX). They'd rather take as much marketshare of the low/mid range, press their efficiency advantage and catch up in raytracing. If they succeed they can potentially match or even overshoot Nvidia's flagship in a generation or two and then slash Nvidia's prices. At which point Nvidia might choose to go all-in on AI instead, if that keeps being where the money is.
They did the same with RDNA1 (5700) and Polaris/GCN5 (480/580). It makes business sense when they can't come within shooting distance of the priciest NVIDIA chips.
@@Ranguvar13 The biggest problem with that is that even though there are a lot of tech UA-camrs and you can find statistics and information everywhere, people still base their mid-range purchase off of how well the enthousiast side of a company does. The amount of times I've seen people discussing AMD vs Nvidia using the flagship models when they all had midrange cards themselves... To then defend their purchase based on that discussion...
AMD ceased to "fight" Nvidia on the price at the low and mid range since the RTX 30X0 era, they increased their price to be lower than Nvidia but not completly undercuting them so they get a better margin. I wouldn't hope they will try to fight the price again they just seem focused on keeping their margin high and sell at the range they want.
Stop looking at Steam surveys. I have SIX AMD PC's all with Radeons, and not ONCE the past few years have I been invited to submit any survey...
That's weird. I get asked about once a year
I'm surprised steam doesn't just allow the platform to be able to check the current running gpu and pc in general and use that data instead
@@KnightmareUSAsteam specifically asks your consent to poll your hardware data, and honestly I wouldn't have it any other way
Well obv. surveys don't include everyone. Surveys are done by sampling a reasonable ammount for people and avoiding sampling bias
I wouldn’t be surprised if AMD GPU users are less likely to report their hardware. I feel like AMD users skew more tech savvy and skeptical of tech companies using their data
Happy to be one of the 0.36%. Haven't regretted the purchase at all.
well as somebody who've been in the scene for over a decade, you will not now but in future and I can give you like a 1000 reasons why
@@smokychristian Why
@@smokychristian give me 3
@@smokychristian give us few
@@smokychristian gimme 2
Those same steam records also show a clear trend of AMD cards including the 7000 series increasing percentage wise month over month while there are many decreases in Nvidia's higher end, except for the top end cards. It shows a clear trend that people are not only buying AMD cards but also changing from Nvidia to AMD in some cases. There are many negative percentage valuses for the 2000 and 3000 series for Nvidia with the only clear win being the 4000 series, with AMD having increases across the board for 5000, 6000 and 7000 series.
Switched from the 1080 to the 7900xt.
But Nvidia is up 56% gaming revenue. Are you saying that that is only because of the 4000 series? If that's the case then the 5000 series will burry team red
@@mikelay5360it could be GeForce Now too
Idk why Vex liked it, knowing the copium it is. The earnings calls don't lie, the Steam samples are random, and cards that are over 5 years old are still shown more. It's not "changing from Nvidia", it's "The 4060 is dogshit, so I'll take what I can get.", or "I want one of the best GPUs, but I don't have over $1k."
@Dr_b_ Well, except for sponsored titles, making the rest perform like shit.
Happy with 7900 GRE. It's a heck of a value.
Thats what they want you to think. GPUs in general are horrible value today. Only true value cards are like $120 or less. You can find 1080s for 100 thats best value out there
@@craciunator99 bruh😑
@@craciunator99 Delusional zoomer on cope trying to make himself feel better for owning a trash PC.
@@knusperkeks2748 my 6700xt does just fine. So youre ok with the prices today then? Sounds like youre trying to cope with the fact youre getting fleeced by billion and trillion dollar companies.
@@knusperkeks2748 There was a time when high end never used to cost anything close to what it does today. Hope you enjoy paying 10k in the relatively near future. Also thanks for outing yourself as a boomer.
Misleading title. Implying that AMD is giving up because they are focusing on the most profitable segment is a bad joke.
That title is a little misleading.
@@slc9800gtx A little? very misleading. I reported it for clickbait
welcome to vex
they might as well, even the new drivers for GoT are trash.
Nvidia also outsells AMD in that bracket
I returned my 4070 to get a 7800 xt and have not regretted my decision. 5800x3d with a 7800 xt, I´m very happy with my setup.
That is a good combo for gaming X3D + 7800 XT!
AMD should start to focus on mid end GPUs again, like it was on RX 400 and RX 500 generations, for me these were the best GPUs from AMD.
Yes but these gpus now are almost low end not even mid end
580 is not even in low end category, it's definitely Potato tier.
I see my old 3070 as low end and 6950xt as mid range GPU.
They are. That's why they abandoned their flagship card. Too expensive to make and if it doesn't beat or at least matches Nvidia's flagship it's just not worth it for them. They tried with the 7900XTX but it was screamingly expensive to make and didn't sell that well. Efficiency and decent performance at very good prices should be AMDs game, I agree.
It's NOT potato. It is low end. It can play EVERY single game at 1080p.
@@Kage0No0Tenshi
@@stysner4580 And still Ada managed to be more effiicient across the board, RDNA needs a complete do-over.
I guess they are planning to go back to RDNA 1 era.
The issue is that AMD still struggle with ray tracing
RDNA4 is probably a test to see if their new ray tracing engine can rival NVIDIA.
Plus, there's a strong bias against AMD regarding their driver.
Those biases are very, very hard to get rid of.
That not the only thing not do they only suck at ray tracing but at everything else excluding raster performance that's the only thing their cards are good at anything else from TAA implementation to Features even intel is better than AMD not even better Intel smokes them also their cards have way lower resale value so anybody who build a new pc like every year or so stays away from them and there's nothing AMD can do about it they'd actually be better of if they drop manufacturing GPU at all and ust focus at CPU afterall that's where they get most of their revenue from
Tbh, GPUs are very hard to get right compared to CPUs which doesn't have as much exclusive features. In fact, AMD cpus have avx-512 support and intel doesn't anymore. Quicksync is almost negligible with how good gpus are nowadays. AMD needs a better driver team. Their publicly maintained linux drivers are always outperforming windows and it shows AMD GPUs have the hardware performance to match nvidia..but their software engineers are lagging behind nvidia.
@@smokychristian fanboy
@@smokychristian And let NVIDIA have the monopoly? Are u insane?
@@smokychristian Nvidia fanboy are ridiculous (so as AMD fanboy)
6800XT here, first AMD card since...the HD 5850..no wait I had a 470 for a couple months in the middle somewhere that I was faffin with repairs on(used to repair GPU's for people) (I have some older AMD cards now, but I got them cheap for old PC's after I got the 6800XT for my main machine) after having Nvidia cards for ages(GTX 770, 1070, . when I was looking for a card it was during the late end of covid lockdowns etc, my choices were the 3000 series cards(with 4k on the horizon) or the 6000 series cards second hand.
the 3000 series cards were going for the same price second hand as they were new, and the 3060 Ti was $1000NZD at the time, 3070 was 1400, 3080 was 2100 and the 3090 was 2400, and god knows what a 3090 TI was at the time, I saw one for sale for over 3k nzd.
I ended up buying a 3060 TI and was pretty unhappy with it for the price, and that raytracing was annoying(it darkened corners and made stuff I want to see not visible, made reflections I don't look at prettier, but otherwise didn't do all that much to make it look better to me but cost a huge performance hit.
a second hand red dragon 6800XT was going for $850 came up
the only cards in my price range were the 3060 TI and the 6800XT, after looking at the stock speeds, and working out if I would use raytracing ever with a 3060 TI really, I checked benchmarks and framerates and realized the 6800XT smashes a 3060ti. I then checked overclocking related stuff and realized the 6800XT had massive potential to overclock like a beast, then be modded and overclock some more and the Red Dragon had so much high quality powerful fully populated VRM goodness. so I sold the 3060 TI for the same price I got it for and got the 6800XT.
and a water block for it was $100, a second hand D5 pump and rad went for $40. so I had a water cooled 6800XT.
then liquid metaled it.
Then MPT to mod the power table to let it pull
then used an EVC2SE voltage controller on it.
now it gets 22k in time spy
and found if I don't give a shit about raytracing, it floats between just above a 3090 to just above a 3090 Ti in performance depending on the game.
I have no regrets getting the 6800XT.
would I get another AMD card? yea, yea I would.
still wouldn't fanboy for em tho, if Nvidia comes out with something that competes for the price I'd go them instead.
but 8gb vram shitbuckets for stupid money that don't even perform that well? yea no thanks, then right after that cards that cost even more, still have only 8gb vram, and BARELY perform better than the last gen? yea no thanks.
My 6850m XT = 6700XT is so amazing 🥺 Thank you AMD 😢
Yooooo mobile gang!
@@kickskii 😎
I have a Corsair laptop with the same chip paired with a Ryzen 7 6800HS
One of the best GPUs ever made for low budget I'm still trying to tell Nvidia fan boys about it, but they get BUTT HURT when I show them it mops up a 3060ti for less money lol
Last time they refused to compete in high-end segment was actually a good move for them.
RX400 series were just a great value mid-range cards.
Because in mid-range greater amount of VRAM and highter raseter performance is still a great selling point. In high-end you would have more VRAM and raster performancethan you need anyways so only features like RT and upscaling become a decisive factor - and Nvidia wins effortlessly.
So I would ultimately see it as a positive change. It's better to just refuse to compete when you can't win rather than loose money on a product that cannot compete and thus will fail to bring profit.
I absolutely love my 7900XT. Overclocks like a beast and can handle cyberpunk at 3840x1620 with 60+ graphical mods and still get 90+fps average. Best purchase ever
with ray tracing?
@@arcrides6841 no. But without upscaling. And arguably way better looking.
@@arcrides6841 cyberpunk is very well made in rasterization if you really want to see a difference in ray tracing then you will have to use path tracing and no card other than 4090 can do that at the highest settings above 1440p at any reasonable frame rate
@@НААТwhere do you find these mods. I want to try them out
@@arcrides6841 U love RayTacing?...Good!!...I don't....Love DLSS??..Yeah it is Fake frames..& not Fan of this 16pin fascination...7900XT gives me enough Real frames!! No CPU Overhead problem! Believe me I still have my GTX_1060 6GB & I Love that
If your GPU can Run your game pretty good.. U happy at the end of the day...That's enough. Red, Green or Blue, doesn't matter.... though RGB matters.
The people who bought 1200w power supplies are no longer looking silly. 600w GPU is crazy, throw in a 300w CPU and that's 900w just from 2 components lol. With overclocking you'd probably crash the PC from running out of wattage, not to mention the components would be on fire regardless of what cooling you use.
My 860W PSU (2011) destined for 3x sli and now run 6950xt/5800X3D below 550W is really good but anything over 850W pointless
@@Kage0No0TenshiI got 7700x and 6950xt with 600w kekw
@@TheZerosd well I overclock and undervolt them thats why I am on low Wattage
Limiting your fps might actually be needed. xD
@@Kage0No0Tenshi 600w power supply I mean mine is running around 350w it doesn't get higher than around 500 never saw it
Love seeing ur channel growing mate, closing in on 100k
I have an AMD GPU and I'm proud of it, the value for money is miles better than what I would get if I went to the green team.
What you got
"proud of it" Imagine being proud of buying something over the other probably better thing.
@@diamonshade7484 rx 6700
Bought a 7900xt for $80 less than the cheapest 4070ti at the time. Rt wasn’t the big seller for me. Love the card.
@@DragonOfTheMortalKombat you can be proud of making good decisions and not wasting money
What players want: low power consumption native 60fps@1080p $200
What brands and UA-camrs push: high power consumption dlss fake dps@1440p/4k $2000
This is why.
I have a 7900XTX for 1080p (because I could get it for really cheap). I can run everything natively without any upscaling, and even though DLSS runs circles around FSR for quality right now, nothing beats native. I hate ghosting and the blurry mess you often get with high speed movement when upscaling. Not to mention the card isn't even trying so it's actually running games at ultra settings using less power than my 1070 did (unfair comparison, I know) at medium settings.
Nvidia pushing RT is what's ridiculous. Even today, it drops frame rate by half to get only slightly better lighting than pure rasterization.
I picked the 7900 GRE over the 4070 Super just for this reason. Better rasterization performance and more VRAM for less money. Screw RT.
RX 6600 XT for 200 usd
nah, at this point it's time to let 1080p@60fps go (not for 200$ tho), I would rather have 1440p@144fps or higher any day of the week. (i am an outlier though, as higher fps is unnecessary unless you got a 360hz screen like my stupid ass but I stand by that 1440p@60-144fps is what you would want in the current day)
That's completely true.
Videogame crash is happening, because of absolute idiocy of "Game Passes," and Monopolization; due to this, the quality of games is decreasing, so there's literally no need to keep making ever more powerful graphics cards, when you can start focusing on Efficiency, Memory, Power Consumption, etc.
Gta 6
@@diamonshade7484 I hope you're right.
For me higher consume is not progress, it's just an old thing being forced to it's limit
Pretty much.
Well, we're done this new Gen if it's true that the 5080 will be "just" at the same level of the 4090, but costing $1200-1300, that's just insane. Second best GPU from a new gen has always been much better than previous gen flagship. If AMD can hopefully make a mid/high end card for $649 that matches the 7900XTX raw performance with better ray tracing, that would be the way to go
I was really shocked to see AMD's own GPU sales figures, down 48% on this time last year.
Built my niece a PC for Christmas, Built 2 more, upgraded my desktop, about to do so again, upgraded my gaming rig. So far 3 Intel Arc GPU's, A380, A580, A750. 2 AMD, RX7600, RX7600XT. The XT is now in my Gaming Rig does RT just fine over the RX6600 then the RX7600, the 7600 is in my desktop. So just in the last year I have grabbed 2 RX580's, 1 RX5500XT, the 7600 and 7600XT and 3 Intel GPU's so a little over 1300 bucks total 8 GPU's. Still less than 1 4090. All my PC builds plus my old Board in a test rig frame are AMD CPU's.
I will never pay over 500 bucks for any card.
If AMD wants to steal the market, they gotta start innovating and make something new that's headline and hypeworthy. Instead of following Nvidia they gotta make their own "Rtx" or "Dlss" moment
They can't they don't have enough money to
@@smokychristian Yeah probably, unlike Nvidia they dont have enough sourcing to spend on R&D and experiment
Nvidia will go full ai then and will just have Intel to compete with. Scary thought.
Or they could ditch the "High end" segment entirely and follow one plus/Chinese phones policy, that is to bring incredible value for money at low-mid level segments. I am talking 4060-4070ti level of performance at half or at least 2/3rd of the price. That would also make headlines.
@@naqibfarhan4356 They sont have to be at the high end and beat Nvidia's top cards, they just gotta do something to make better hype. Basically they gotta advertise better and use stuff like rtx for that advert hype like Nvidia did
THEY'RE TOO EXPENSIVE!!! Cried the populace. AMD turned and put their hands over their ears.
Prices are ridiculously high, most people wont buy Graphics Cards at the current price.
Which is a bit of a issue because they increase the prices because they don't get enough sales, but the higher the price increases the less people buy them.
I for one only buy AMD, Nvidia in my country is about 30-40% more expensive.
Increasing prices in response to lower sales is exactly the wrong thing to do. Ask any economist. Big companies understand economics.
@@joesterling4299 and yet they still do it.
and yet nvidia gpu's sell and nvidia made $2.9 billion profit in fourth quarter 2023.
They will always make money, but less people are buying gpus regardless of how much money they make.
Nvidia cards are only around 5-9 FPS faster in their top cards and does not justify a 2000-dollar price point when AMD is only asking 900 dollars for their top card which can play pretty much all games in 4k@60 fps. Nvidia uses major over hyped advertising because they target kids that have no experience to know any better that there paying an extreme premium for 8fps and they also use their latest Gimmick called Ray tracing like they did in the past with Hairworks so to be honest there Gimmicks work because Nvidia targets kids and persons that are not informed so therefore they get tricked into buying an overpriced GPU and with Nvidia paying youtubers to hype up there GPUs were likely to see Nvidia stocks continue to rise until people start to see through there Gimmicks
Sounds like a good time to raise the prices. Because in the end, YOU BUY IT ANYWAY! LOLO!LL!L!L!LL!
Thanks for the leather!
🤣🤣🤣🤣🤣🤣🤣🤣
The more you buy the more leather.
AMD cant compete and people act surprised? Took them until RDNA4 to implement ray tracing hardware that nvidia had back on the 20 series. Then intel came in and beat them in the low end on their first try... with much better upscaling.
@@baronvonslambert Its more about hype, having stuff like "giga omega better graphics" just makes headlines and gives fomo to the users. Atleast on the lower end cards
@@baronvonslambert it's marketing trick cause people are fukin' stupid. people thought that ps5 will do 8k gaming cause Sony said "8K" when releasing ps5 and 8K is on ps5 box. this is how stupid people are.
Don't forget that most PC gamers still play on 1080p.
It should probably stay that way tbh, for me and others either too much hz or resolution makes our eyes hurt.
Oh he left, but yeah I didn't get the best eyesight genetics myself I'm not even old. 🥲
@@vigilantbruiser1119i play on 1080p but what you are saying is just retarded nonsense
@@vigilantbruiser1119 I'm sure nobody is forcing you to use a better monitor. The vast majority of people will have a better experience and most likely no issues with eyesight when using a higher resolution and/or higher refresh rate monitor, so I don't really see why we should collectively stick to 1080p@60Hz. But we're probably gonna be stuck with that for quite some time anyway, since monitors with a high resolution and high refresh rate combined are unfortunately still incredibly expensive.
@@ytrism Whatever you say, eyecandy fanatic. lol
I don't know why so many people still run after Nvidia. The GPUs are far too expensive for what they do. I currently have an RX 7700XT and my god am I happy with it. I have no problems with either the drivers or the GPU The price of the GPU was bad before but now it's a blast. I don't think AMD will stop, they're also getting better and have better prices
Cuda accelerator and machine learning purposes I guess
brand recognition. like iphone. amd don't have such high brand recognition. even when an amd gpu is better than nvidia at cheaper price, most people will buy the nvidia cause it's nvidia. amd has to find something to innovate like nvidia did with RT and dlss to increase their brand recognition. even though RT is dogshit cause it eats even 50% of performance, people still fall for this shit. people are stupid, so if amd finds something than nvidia doesn't have, people will fall for it.
Control panel settings actually work outside of DX9. Adrenaline doesn't.
i have a 6900xt, paid about 680 new, with shipping. also bought the 12 core amd cpu,non 3d, and 64gb of ram. i wont need another computer for 8-10 years.
companies expecting people to buy new computers every year or so is dumb.
dedicated rt hardware is dumb, every gimmick nvidia does is dumb. the software based rt is already almost as good. per usual nvidia bribed game studios to include gimmicky crap that only they support because they came up with it for the sole reason of saying "look,they dont have it." amd has done this a few times in the past,last one i can think of is the hair graphics thing.
i dont think amd is truly trying to compete at the highest end. the board partners have said their cards could handle more watts but amd locks them out of juicing them. yeah the 4090 is about 20% better, but it uses 150+ watts more to do that. that's a hell of a lot, really speaks to how efficient amd is. also amds focus has been on the cpus, mainly server for a while. not a great reason for either company to devote their best people to gpus when they just dont bring in the money like their other stuff.
also, keep in mind most prebuilt computers dont even have amd gpus as options, only a couple have offered them the last couple years.
Lol.... Funny that the intel iris xe integrated GPU is one of the popular gpu lmao! Dunno but makes me feel quite proud cuz I own one haha! great video btw 👍👍!
AMD driver issues remain a problem, as always they can't get their %$%$ together. They seriously need to up their software/driver quality stability.
Did you go to Business School or something? Your Titles and Thumbnails are on another level man..... I dont think I have ever seen you make a title or thumbnail that is nearly impossible to not click for someone interested in the PC space.
I am actually being serious because you are very talented at what you do & would like to get better at it myself.
I have studied business and can tell you as a fact they don't teach you about content creation at all
Got a great idea! AMD should just focus on cpus so Nvidia can concentrate on pricing GPU's however they like because fanbois will pay $3500 for the shiny new AITX5070 with 12gb vram and they will love it and rejoice that no inferior products are clutrering up their hallowed shelves. Good job Vex ypu won
AMD being AMD... You can't charge 90% of what Nvidia charges when your product is clearly inferior... Nvidia has better upscaling, better Ray Tracing, better frame generation and they have convinced a lot of people that that stuff matters... AMD could fight back with more V-Ram and better prices, but they refuse to... Their low-end cards have the same pathetic 8GB V-Ram that Nvidia's cards have, and they cost about the same... but Nvidia has a better up-scaler and better ray tracing, does it matter on this class of card, well, no not really, but if the cost is the same why wouldn't I. This has always been AMD's problem. They have never been the first to implement a new feature, they are always copying Nvidia... and the copy is never as good as Nvidia, and they don't offer a big enough discount for customers to accept an inferior copy of whatever Nvidia's doing.
AMD isn't as good as Nvidia If they want to compete they need to:
1. Offer more V-Ram than Nvidia AT EVERY PRICE TIER.
2. Stop with the fucking clamshell design and actually offer the wider memory bus that Nvidia refuses to offer to up memory bandwidth.
3. TAKE THE FUCKING RAYTRACING SHIT OFF THE LOW-END CARDS THAT ARE NOT POWERFUL ENOUGH TO EVEN TURN IT ON.
4. Accept a smaller margin on their products so they can LOWER THE FUCKING PRICES, cuz no one is going to buy AMD for 90% of Nvidia's price, they have to make it more like 75%...
See, the 4060 is a clamshell 8 Mb V-Ram card on a 128-bit bus and it costs about $300 USD. The 7600 is a clamshell 8 Mb V-Ram card on a 128-bit bus and it costs about $270 USD, that's a 10% discount. NO ONE IS GOING TO BUY AMD FOR A 10% DISCOUNT. Now imagine that the 7600 has 12GB of V-ram NOT ON A FUCKING CLAMSHELL DESIGN, with a real 196-bit memory bus, and the price is $225 now would you be interested. OK, so the up-scaler still isn't as good, and the Ray Tracing still sux, but the memory is much better and so is the memory bandwidth and the price... but AMD couldn't do that and make a profit on the card. YES THEY COULD YOU GODDAMN MORONS. They wouldn't make nearly as much profit as they would like to make, the margin would be much smaller, but they would absolutely still make a profit, and they would take over the market segment that the majority of people buy in. Most people cannot afford a 4090, If you look at the steam survey the market is dominated by the more affordable cards. The card Nvidia offers in this segment is a joke, the 4060 is crap, problem is the AMD offering, the 7600 is just as crap and not any cheaper so why wouldn't I buy a 4060 and 90% of people do... AMD is stupid, Nvidia isn't even trying in the budget GPU segment, the 4060 is a joke... All AMD had to do was make the 7600 not dogshit... and they couldn't do it... so Nvidia wins again without even trying, cuz AMD is Stupid.
amd needs to hear this
AMD is always the worst option. Has been since they started as ATI and will always be.
AMD has invented and put in features at times (Tensilica DSP, TrueAudio Next) but those are always useless and nobody wants it. AMD does that always same like with their Ryzen CPU. Absolute junk and Intel always has the far superior options.
Their problem is marketing and not keeping everything closed source and being a unethical company like Nvidia. Most of AMDs techniques are open source. Vulkan was started using Mantle which cost AMD millions in R&D yet they gave it to Khronos for free, who turned it into Vulkan which we now all enjoy. Also, when you turn off RT the value for money on AMDs side is insanely better than Nvidia right now. Which means you can buy and AMD card that, without RT, can actually run games at native resolutions without having to upscale for the same price.
AMD cant catch up to Nvidia. Its like stoping a big wave with only your hands.
They need to focus on the things they have and Nvidia dont, Vulkan or Fluid motion Frames.
They will never catch up to Nvidia otherwise.
@@stysner4580 The unethical Company lets me play Graphics far faaaaar superior to AMD.
AMD does not the Open Source thing because its lead by Jesus, AMD needs to because otherwise none buys one.
Their GPUs are nearly the same Price as Nvidia Cards and AMD is also selling them 1000€+
Btw, i got more Features or improvments on Nvidia side as on AMD side and thats why i pay more for Nvidia.
AMD isn't going anywhere. They just won't build stupidly big chips with 25 billion more transistors than anything else like nvidia did with the 4090.
My 7900xt isn't on the steam hardware survey, nobody I know let's steam interrogate their system either.
7:05 there's also 7900 GRE now, it has similar performance/price ratio to 7800 XT, but a bit higher in hierarchy
The consumerists. XD
Pretty soon at Nvida’s power consumption you’ll need a dedicated circuit just for your PC, most homes only have 15 to 20 amp circuits with multiple outlets on one circuit. The average person doesn’t understand wattage to amperage, 120 watts = 1 amp this is a subject you should make a video on. So a 4090 is pulling about 5 amps, a 4K monitor pulling up to 2 amps (some less) cpu maybe 1 amp so on a 15 amp breaker your already at 50% of what it can handle before tripping. Also sustained heat on a breaker weakens it, they are at the point say your microwave and pc are on the same circuit if your pc is on and someone uses the microwave the breaker will trip.
Why would you spend hundreds of dollars more so you can see a nicer reflection in a puddle?
Only idiots pay more than 400 for a GPU, 500 absolute MAX.
@@Wobble2007 Yeah that is my limit also.
seeing at the steam reviews looks like people want nicer reflection.
DLSS and wayyyyyyyyyyyy better drivers
Raytracing is way more than nicer puddles, though. It means finally less bugged out shadows, which are a huge part of the image.
14:43 Nvidia has traditionally not been first with new technology. It has always been a back and forth between the two. ATI created Tessellation in 2001, Nvidia was behind for a good while before they marinated an even greater solution. In 2002 ATI created the first DX3D 9 compatible GPU. In 2015 AMD gave us Async compute, paving the way for DX12 and Vulkan. Then that same year gave us the Fury X, the first GPU featuring HBM memory. Most recently in 2021 AMD gave us the worlds first chiplet based GPU, the MI200. Nvidia gave us stream processing units and CUDA in 2007, which were both exceptionally important developments. Most recently Nvidia is pushing Ray accelerators and AI which are both hugely successful.
All of what AMD has pioneering wasn't unachievable by Nvidia, AMD never succeed to replicate the success of CUDA. Wish is why even if somehow Nvidia started to lose the gaming market; steam hardware survey still shows Nvidia with 76% of the market while AMD is not even at 20% but at 16% and intel at a bit less than 8%, it’s hold on the professional and now the new AI market will enable them to overcome AMD when needed. I don’t think AMD could even compete in the AI market with Nvidia given the cost in R&D and the massive gap in size between them.
I also forget that DLSS and RT where and still are Nvidia domain where AMD still struggle to replicate what Nvidia achieved first.
@@itachiaurion3198 Something to think about here is Nvidia's current success is largely attributed to the current interest in AI. the 40 series have been selling terribly compared to previous generations and that has everything to do with the fact that Nvidia cards are absurdly overpriced even compared to the 30 series which were notorious for being a terrible value. The 10 series GPUs were a serious high mark for them, they were fast and reasonably priced while offering a substantial increase in performance over the previous generation. The current generation of cards just don't make sense for gamers, I would argue their pricing is more in line with enterprise applications which is coincidentally why Nvidia's stocks have been skyrocketing. This is a short term trend though, Nvidia has the AI market cornered right now, but it's only a matter of time before AMD catches up or, most likely, purpose built cards made for AI become the industry standard.
If and when that happens Nvidia will be in serious trouble, what the 40 series has done more than anything is sully their name in the eyes of the public and make them seem downright greedy.
@@Todd_Coward If AMD can't even play catchup with the best Nvidia as to offer for gaming, how can they catch up on a more difficult market? Maybe Intel will try something for the AI but I don't see AMD finally succeeding at overcoming Nvidia.
Unlike Intel, Nvidia will not wait 5 to 10 years that AMD will catchup before waking up. They price may be too high but they are still in the lead in regard of the technology and don't seem to let AMD catchup anytime soon.
Their success come from the fact that they are the sole leader in the 3D rendering space since at least 2015 and all of their gpu sell like hotcakes, even in the gaming market with the 40X0 as the exception and even that I have doubt. The most recent hardware survey from steam show that the top 15 spots of GPU are all Nvidia with the RTX 3060 in the lead and 10X0, 20X0 and 40X0 and other 30X0 in this top. Then we have AMD and Intel integrated processor in 16th and 17th place. Heck even the 4090 is nearly at 1% of the survey and it's not the better placed 40X0.
The first named AMD GPU is the AMD Radeon RX 580 with 0.83%; Nvidia is still at a very comfortable 76% of the market from those steam hardware review.
Even if steam is somehow the HQ of all the Nvidia chill AMD can't have more than 30 or 40% of the PC gaming market from those result. Even if the price doesn’t make sense, it seems that players bite the bullet and buy the gpu’s anyway.
I really don't see AMD pulling a hail mairy and finally overcoming AMD in the 3D market while they are still behing Intel in the CPU market and far behing for the gpus.
They could certainly get Ray Tracing improvements to the moon in one go if they wanted to, same as Intel did. They have the ability to kind reverse engineer what's already been done too, they probably know to a very high degree what they need to do to get up to par.
Better upscaling maybe. Ray tracing is just a silly sales pitch.
We had rumors of the 4090 going to 600w as well. Lots of them. But that's the power limit with an OC on some expensive models like the Strix.
I'm sure they'll stick to under 500w.
I found a youtube channel of my country that brainwashes people saying amd cards have problems and if you don't want problems buy nvidia and all of people trust him and go yeah bro thanks for telling us and informing us i feel so bad for those people who trust him
(Most people in my country are poor and want to buy a cheap card just to play old games like takken 6-7 and gta iv-v)
He's right. Google AMD card issues, drivers have been a plague for decades.
he spittin factos tho 🗣️
If you've seen framechasers it's not just your country.
@@nontoxic9960 bruh lol
Nice of that guy to warn people.
I love my 6800XT but the first one was a dud and had to be RMA'd. The newer one is doing great but a little louder than it should be. This is my first ALL AMD system I have build in over 13 years. Usually I build with Intel / Nvidia and in all honesty, I will be going back to that combination on my next build in a few years.
NVIDIA smartness is like Aizen planned level 🤣🤣
Nvidia is madarame baku
so nvidia is the bad guy?
gj on your content bro, you just got another subscriber.
your doing great, all the best.
Why people didn't buy AMD GPU?
1. Drivers issue
2. Lag behind rattracing
3. Lag behind in encoder
4. They are barely used by media. Meaning that people only think of NVIDIA
braindead nvidia simp spotted
0:34 in other words they didn't improve the architecture to get more fps/watt just made it bigger ...
I'd only upgrade my 7900XTX if there's a GPU equal or better in performance with way less power draw at 4k.
Yeah I just took my 3070 out and replaced it with 7800 XT. While the green machine has some good features, I am putting my money on AMD. NVIDIA needs a wake up call.
DLSS changed the game. I am playing AAA games on my 2022 midrange laptop with a 3050, on high graphics, at constant solid 60fps, only drawing 45watts of power.
You people don't understand how game changing Nvidia's hardware accelerated ai super solution is. It basically extended the life of entry level Nvidia GPUs.
I jump from rtx 2060 to a rx 7700xt and can't be happier. Really nice price i got here in Portugal.
Does the Steam Hardware survey work?. I wonder
It works, it's probably the biggest sample size for such data you will ever get.
Your content is generally good and well researched but the level of clickbait in your titles and thumbnails is getting a little out of hand bro
it feels like this happenned wayyyy wayyy back before on the vega vs 1000 series era when vega 64 and radeon 7 trying to close the gap to 1080 non ti and gave up since it can't close the gap to its superior 1080 ti and a few years later dated now 2024, they're like ... its time.
As a 7900 xtx owner, I really don't get why everyone is buying Nvidia
Then your world is stuck at AMD only. Go out from that AMD cacoon and see the reality of what GPU really is. Then you will understand why nvidia dominate GPU world
4080 is faster than the 7900xtx at literally almost every single workload including 3d rendering apps used by professionals
Because ray tracing
@@arenzricodexd4409 you are talking sh1t. I've owned AMD and Nvidia gpu's , for someone who don't wan't to spend a lot of money for midrange gpu AMD is the best in price/performance. I owned a 2070 S ( is in my daughter pc) and now in my pc i have a MSI 6800 non xt paired with and AMD 7700 non x cpu and they work beutiful togheter.
@@mitsuhh 4080 was on pair with 7900xtx and the 4080s closed that small difference. Nvidia is still the best Value
I just can't with the prices for either of the brands. At the moment I have to buy second hand 2 generations back basically. It's just gone ridiculous
Happy with my 6700xt. Getting ready to install 7800xt. Coming from the 1070
I'd rather go for RX 7900gre...
at least upgrade to 7900xt not worth to upgrade to rx 7800xt from rx 6700xt
Just get the RX 6800 non XT. Go peep the specs, it's literally the same (annoyingly) as the 7800 XT except 100-150 cheaper.
7900 GRE is worth the slightly higher price tag over the 7800 XT, especially after overclocking the gimped memory, although you can't go wrong with either choice.
@@JonnyFlash80 400$ for a gigabyte couldn't beat it. I payed 600$ for the 6700xt red devil
10:00 I think ray tracing is gonna be big in 2025/2026.
Waiting for you to talk about ASUS's Scam
Marketshare doesn't lie. Nvidia is vastly superior in every way. People pay premium for feature set. This is why Nvidia hold 80% market share. It's not always about how cheap a card is if feature set and software is terrible. FSR is a joke.
Also the reality is AMD will never catch Nvidia in software or hardware. They are so far behind it's an impossibility. Even INTEL who is new at this has a superior upscaler than FSR. It's hilarious.
AMD doesn't push any new tech ever. When is the last time AMD introduced something that changes the market? Set's a new standard? Never. AMD just reacts to every new tech Nvidia brings forward in the most gimped way possible. It's a joke. In b4 AMD fanboys not willing to accept reality. Cost doesn't mean anything. Market shows this.
I hope they are trolling and release a beastly gpu 😭
In the past this was a thing, today its doubtful because even if they give us a beastly gpu nobody will buy it.
Hopefully
Get help
@@temperedglass1130 u need to be put in a mental asylum (you need to be helped yourself)
They would,lose money by levelopingnhigh end gpus… that don`t sell enough!
The point is… development eats money. The money has to get back by selling product… and AMD gpus don`t sell!
NVIDIA market share is 78% AMD 21% so NVIDIA gets 4 times more money if they would use same money to development!
I'm still holding on to my RX 5700 XT and I'm not in a hurry to upgrade. I left nVIDIA behind 20 years ago when ATI released their Radeaon X800 card and I have not regretted that choice at all. I got a allround more stable card and I have never had any blue screens of death since with any Radeon card or any driver since. I don't use raytracing and so it's not a problem for me. I rather have a bit slower card and stable than an insane over-priced card and unstable. Not alot have changed in 20 years sure but apparently alot of the issues I had back then are still pressent in the current drivers.
stupid ads, lemme watch Vex’s new vid
Get a VPN dude.
@@thepatriot6966Or something much better and for free uBlock origin.
@pffboahkeineahnung Surprised youtube didn’t delete your comment, I named one and the comment got instantly deleted.
I'm on a 1080.
My next upgrade will be a Radeon, but waiting for next series and price drops.
Bang for buck is most important.
The problem with the steam hardware survey is that *nobody does it*
I do it
its automatic lol with 1 button
bro probably thinking that you need to answer some questions etc instead of just pressing single button 🤣 so how nobody does it? show some proof
@@cloud380 I’m not thinking what you’re thinking I’m thinking. I watch people skip it constantly by dismissing its little pop up it does when the survey rolls around. Ofc I always participate in it.
@@cloud380 you can dismiss it.
The 6900 XT was my first AMD GPU, I’d always gone team green before. I upgraded to the 7900 XTX, and had intended to upgrade to whatever the top of the line 8000 series card ended up being. I suppose now that isn’t going to happen… guess the 7900 will have to carry me through an extra generation or two 🤷🏻♂️
A friend of mine got a 7900xtx just to find out it was slower than his old rtx3070. DDU was used in safemode to remove drivers and everything, 7900xtx was still slower. Maybe it was a defective unit, but after all of the throubleshooting he just returned it and got a 4090 instead
A clean install is the best practice.
7900xtx performs better then rtx 4090 on some games, definitely defective.
genuinely got a faulty card. like for real real. super unfortunate
This rumor happens every new generation. The only time it held true is when they stopped making cards like the Radeon 7. I think they will keep competing with the 80 cards, which is good
Because most people already have good hardwares from the previous drop, and being good with it
They haven't given up, they simply refuse to return to the successful business model from before the crypto boom.
I paid $750 in December 2021 for my RTX 3060. I can't justify changing for anything else right now. I have to keep this card at least until the end of 2025. Only then will I consider AMD... If they are still in the GPU segment.
On second thought, I still have haunting memories of the RX 570 I bought several years ago: onscreen stutters, continuous coil whine, intense heat and fans blowing like a Jumbo jet is about to take off. So maybe not AMD.
The 3090 was a more perfect RTX card than my 2080Ti.
The 4090 is RTX perfected. The problem however is that horrible power connection.
5090 will maximize FPS when Ray Tracing and DLSS are enabled in 4K. It’s gonna be a monster. I’m sure the power connection will be better this time around.
With the power estimates, i'll probably only buy from an AIB that does triple or quad 8 pin honestly. I don't want 12HPWR on the 5090, shit is a time bomb.
So far I've had a 5700xt, 6600xt for my son's build and upgraded to a 7800xt. I'll probably upgrade for the 9000 series.
Amd wants the same amount of Money for around the same performance with significant lesser features. You see what AMD cards sell well: 7800 XT, that's it.
If you want something for vr you have basically no choice.
And my previous 1080 gtx made a lasting impression on me, my amd 4890 did not when basically it broke.
That is why u bought a 1k bucks 4080
The risk of rain music for the outro was a pleasant surprise.
There's gotta be like a ceiling here....in gpus and game graphics
We will never get back to the days where x60 was ~ $200, x70 $350, and x80 $500. Prices decreased from 2007 to 2012 but then rose again dramatically around 2018. Even accounting for inflation, a x80 should be ~$800-$900.
Sorry to hear AMD is slumping with sales, I am happy with my 7800xt. I think it is good AMD is there, cause it will help make graphic card prices lower. If Nvidia is the only company, then they will raise prices for us gamers. There is also Intel, who knows maybe they will pickup in 2 years or so with their new silicon foundry. I like what that guy making his own gpu "Fury". But the bottom line FPGAs have to be more affordable and powerful so you could have decent OpenSource gpus that would drive prices further for gamers. Thanks for reviewing about AMD, Vex.
I like how we are returning to our roots of gaming with Upscaling implementations. Nothing beats a game at a low resolution with sharp User Interface for me.
Amd's mid range cards are too expensive that I'd rather spend a little bit more money and buy nvidia and get more features which I did.
Just goes to show how effective bs marketing is on the average person. DLSS 3.0 is a horrible technology that increases framerate while INCREASING latency. The whole point of high framerates (above something like 90fps) is to get lower latency. Meanwhiley raytracing looks worse than rasterisation in all except for maybe 3 Nvidia-sponsored titles, because devs just use shitty plugins they dont actually understand just like with TAA.
But your average dude-bro with no understanding of tech sees the shiny new tech and wants it regardless.
A friend of mine bought a 4070 a few weeks ago, and tried rtx once, then immediately turned it off because it halved his fps for a visual downgrade over raster. He at least had the excuse of owning a shield that he uses actively. Nvidia has perfected the Apple strategy of making things look new without doing any actual improvements (rounded vs sharp edges).
Dlss and frame gen were reasons I was leaning more towards nvidia for newer graphics cards.
I never noticed the danger of AMD was INTEL's GPU starting to grow up quite fast on the market, more than going to join the NVIDIA's "high end" GPU's route.
Somehow, it made me remember the old times AMD was fighting INTEL again from their processors unit progression in the market. Thanks for the video, you made a new subscriber.
Have a nice day☺
It makes alot of sense to focus where the big market share is, low-mid range graphics cards, where most of the people is.
To be fair to AMD, Intel and Nvidia have engaged in extremely anti-competitive business practices in the past to kick AMD out of the market. AdoredTV has some videos on their past. Nvidia for instance has been caught modifying drivers just to go faster on press benchmarks and have included technologies in their games like Hairworks that cripple performance on AMD cards with no path for optimization for them.
Naaaa, if you search a minimum you can found that amd call it for a 9900xt(x) / 9950xt(x) like as twice the power of actual top-end GPU. The thing is: they aren’t ready yet, so (in my opinion / i think) they are going to replicate what happened with Vega to current XT generation. « How ? »: by sending Nvidia in the corner by releasing it something around 6 months later after the 8th gen / during spring 2025 (maybe summer). I call it again : they gonna sacrifice the next gen who is gonna be really short in time, or an intermediate one, and release something that gonna put nvidia in a hell of a day.
I just built my first PC from the ground up and I went with a Gigabyte OC 7900 GRE and I'm happy with the card so far.
A lot of us may be still rocking on older AMD cards. I have an RX 590 that plays Starfield and DD2. While I don't really need a new card, I've been considering either a 7000, or an 8000 when they come out.