Can AMD Beat Nvidia?

  • Published 14 Oct 2024

COMMENTS • 123

  • @harveybirdmannequin
    @harveybirdmannequin 5 months ago +30

    No matter what happens, nvidia will still be overpriced, amd will still be polishing chiplets, and intel will still be catching up. Unless prices come back to sanity and reasonable levels it's an automatic skip generation.

    • @NatrajChaturvedi
      @NatrajChaturvedi 5 months ago

      Yeah, if the top end is going to get more expensive, then the mass-market SKUs (70 Ti and below) will be even worse this generation. The 5070 is probably going to be as good as a 3080 Ti at best, and the 5060 will be like a 4060 +10% at best. Gonna suck massively.

    • @kodacko
      @kodacko 5 months ago +1

      I knew when I picked up my 6700XT that I'd have this thing for MANY years, but I didn't expect that I'd be waiting until about the RTX 7xxx or RX 9xxx. Intel seems to be an... interesting? competitor, but I'm not holding my breath, what with the massive downsizing they've done to their GPU branch. I want to see something that makes me go "OMG, I NEED THIS," not "Ehh, that's only X% faster than my tweaked and driver-modded 6700XT..."
      Been very disappointed with all the companies not giving a damn about enthusiasts.

    • @qrogueuk
      @qrogueuk 5 months ago

      @@kodacko I'm still running a Jan 2020 5700XT and a Dec 2019 RX 580 in my household

    • @GamingPenguinEnthusiast
      @GamingPenguinEnthusiast 4 months ago

      Nope, Nvidia is a premium product worth every penny. AMD can't compete with the 4090 through 2 generations of their own products. You can't just stand there for 3 years and pretend the 4090 does not exist 😂 And the 5090 launches this year, is 2x faster than the 4090, and brings a new memory standard. Intel polished their drivers and will launch their cards soon. AMD is going to become the 3rd-choice product.

    • @yacquubfarah6038
      @yacquubfarah6038 2 months ago

      @@GamingPenguinEnthusiast lmfao your comment aged like milk

  • @laserak9887
    @laserak9887 5 months ago +5

    Much prefer the calmer, direct thoughts. Thanks, and I agree.

  • @lemlibloodaxe313
    @lemlibloodaxe313 5 months ago +3

    Another problem with RDNA 3 was that people were expecting it to beat the 4090 at half the price, which is just silly. I mean, they could have priced their cards somewhat lower and probably would have gotten a lot more sales; when you have a similar price and performance to Nvidia's offering, most people will choose Nvidia anyway. Next gen could see AMD back in the 5700XT era, where that was their top card. Having an 8800XT at $499 with lower power, higher clocks, and similar performance to a 7900XTX or a 4080 Super could make Nvidia sweat, since they probably want to push prices for their low-to-mid-level cards even higher than the current gen.

  • @KillerKollar
    @KillerKollar 5 months ago +10

    RDNA 4 is rumored to top out at the midrange, and that's fine. The best generation of graphics cards AMD had that I can remember was the HD 4000 series, and remember, they didn't have the performance crown back then either. They just absolutely crushed Nvidia when it came to value and made a huge comeback in terms of sales; they were a serious competitor for a short while. That's what they need. The only thing that's different now is that it isn't just about raster perf anymore: RT perf also needs to improve, and FSR and AMD's version of FG have to be on par with Nvidia's solutions and in as many games as possible.

    • @BlackJesus8463
      @BlackJesus8463 5 months ago +1

      You really only need upscaling for RT though and it's not like AMD is bad just because Nvidia is better.

    • @arenzricodexd4409
      @arenzricodexd4409 5 months ago

      But those HD 4870/4890s still performed side by side with Nvidia's GTX 280/285, meaning they still competed with Nvidia even on the high end.

    • @adamtajhassam9188
      @adamtajhassam9188 5 months ago +1

      I wouldn't mind seeing AMD take the top spot in the GPU market for a change. But they already said they won't.

  • @Dimythios
    @Dimythios 5 months ago +13

    The prices of the video cards are just too high.

  • @Dwayneff
    @Dwayneff 5 months ago +7

    Holy shit! I thought you left YouTube. Good to see you again, Apple Fan

  • @ivana6141
    @ivana6141 5 months ago +10

    Tort tie tark = thought I'd talk

  • @roythunderplump
    @roythunderplump 5 months ago +3

    AMD's £350 Radeon card needs to match or beat Nvidia's £500 one to get any sort of win. The majority of working professionals and gamers outside the YouTube bubble only know Nvidia as the go-to for their photo, video, 3D modelling, CAD software, and so on.

  • @callofdutyfreak10123
    @callofdutyfreak10123 5 months ago +5

    In the reasonable price range ($250-500) they can, but beyond that prolly not :(

  • @bobdylan2843
    @bobdylan2843 5 months ago +4

    One word. Duopoly

  • @26Guenter
    @26Guenter 5 months ago +3

    There is around 15% performance left. Almost all of RDNA 3 except for N33 can gain about 15% more performance from overclocking.

  • @brants7131
    @brants7131 5 months ago +4

    AMD crushes Nvidia in having more letter M's in the company name. Every year, I hear and hope that AMD is going to beat Nvidia, only to be let down time and time again, going back to the 2000s. Yet I still use AMD (CPU/GPU) and game just fine.

    • @yuan.pingchen3056
      @yuan.pingchen3056 5 months ago +1

      Over-competition hurts the profits of NV and AMD; fake competition is a win-win.

    • @davehenderson6896
      @davehenderson6896 5 months ago +1

      I loved AMD's first Radeon card; then it went downhill from there.

  • @theordinarydude2104
    @theordinarydude2104 5 months ago +11

    AMD doesn't require a crappy power connector that then melts into goo... just sayin'

    • @adamtajhassam9188
      @adamtajhassam9188 5 months ago +1

      AMD is too chickenshit to bring out a card that tops Nvidia. Prove me wrong, and close does NOT cut it either.

    • @mitsuhh
      @mitsuhh 5 months ago +1

      There's nothing wrong with 12vhpwr if you plug it in correctly

    • @davehenderson6896
      @davehenderson6896 5 months ago

      I've had a 4090 for 7 months now with 0 problems, and it never passes 40°C while playing games.

    • @jjlw2378
      @jjlw2378 5 months ago

      Idk man, I always hear about this issue, but there aren't any actual facts on what's happening. There is no list of how many connectors are melting when using a 3rd-party adapter, which brands of 4090 are melting, the voltages of the 16-pin 12VHPWR in HWINFO64, or even a tally of how many have melted in total. Tons of standard 8-pin cables melted during the mining boom thanks to adapters; I think that's most likely what's happening here too.

    • @m.l.9385
      @m.l.9385 5 months ago +2

      @@mitsuhh I beg to differ - the 12VHPWR is a fundamentally flawed design. I can recommend der8auer's videos on this topic.

  • @gregwakolbinger
    @gregwakolbinger 5 months ago +12

    If you look at amd’s miserable sales numbers for RDNA 3, no way will they put enough money into RDNA 4 to be competitive. And they will over price it as they do with everything, just like everyone else

    • @ItsJustVV
      @ItsJustVV 5 months ago +1

      But didn't MLiD keep lying to us... I mean telling us that failRDNA3 was selling so well all the time? And didn't AMD fanbois believe him, and still believe any BS he says... 😑

    • @AVerySillySausage
      @AVerySillySausage 5 months ago +2

      Problem is they are so close yet so far; the only way for them to "compete" is to significantly undercut, and the chips are too expensive for that, so nothing changes.

    • @onomatopoeia162003
      @onomatopoeia162003 5 months ago +1

      I think tech in general is down a little.

    • @gregwakolbinger
      @gregwakolbinger 5 months ago +2

      @@AVerySillySausage I think there is room to price to sell. They're just completely unwilling to accept lower margins. They will die on that hill

  • @christopherramsay5517
    @christopherramsay5517 5 months ago +8

    With the RTX 5090 coming, AMD needs to step up, especially in ray tracing. We will see once the next gens come out.

    • @BlackJesus8463
      @BlackJesus8463 5 months ago

      If it's the same or slightly less performance is it really next-gen?

    • @christophorus9235
      @christophorus9235 5 months ago

      Lol AMD does not need to give a shit about the 5090. Nvidia is shitting their pants due to the new instinct cards.

    • @mitsuhh
      @mitsuhh 5 months ago

      @@christophorus9235 Good joke

    • @christophorus9235
      @christophorus9235 5 months ago

      @@mitsuhh Lol, just facts snowflake.

    • @davehenderson6896
      @davehenderson6896 5 months ago +1

      I will NEVER use RT but will use DLSS3.1 so if AMD focuses on that they can get market share.

  • @Kapono5150
    @Kapono5150 5 months ago +2

    Hope AMD doesn’t let Paul down

  • @crzyces1693
    @crzyces1693 5 months ago

    I have a 3080FE and use RT when I can get a locked 45+/High settings at 1440P w/DLSS 2.x. The problem is only a handful of games can do that with an RT impact that is noticeably better than regular old shadow mapping and the double flip mirror _"trick."_ When I say locked 45 that means more like a 55 avg with lows around 41 btw, so not really locked but with a VRR display it's pretty unnoticeable while playing. If I had a 4080 or better I'm sure I would use it far more since I don't play competitive shooters, and with over 50% of actual new game sales coming from single player titles, I'm not in the minority. And yes, I would be seriously considering an AMD card but they aren't going to have one in the price range I'll be shopping in when the new cards release, and the 10GB of VRAM is essentially forcing an upgrade if I want to continue playing the newest AAA C/ARPG's at 1440P/High regardless.
    That said, the only way I'd consider it would be the 5090 coming out at over $2K *AND* the 5080 performing worse than a used 4090 (which is possible). It seems ridiculous to type that as the 70 class card had *ALWAYS* matched, at worst, the previous gens high-end offerings in everything but VRAM capacity, but with everything slid down to the next lowest (and sometimes to the smaller size on the smaller die to boot) I wouldn't be surprised to see the 5080 come out with less, well everything than the 4090 while only slightly boosting clocks. For anyone that thinks a $1200 5080 that just loses to a 4090 is a good thing, they are morons as that is a *HUGE* gen on gen on gen on gen on gen regression. Not a retardation, an actual regression. 970 basically matched a 780Ti for $300 bucks. 1070 matched the 980Ti, 2070 was just about on par with a 1080Ti. The 3070 was right on the heals of a 2080Ti. Then the 4070 came out and was all of a sudden performing around 3080 levels...for $100 less? Well that sucked, and the card sold like ass for a long time. Now when I see the 5080 specs all I can do is hang my head a little bit. Like I said, the games I play and the upcoming titles I am looking forward to like Exodus, Judas, DA: Dreadwolf, Avowed etc. etc. simply aren't going to run all that great when limited by VRAM, and wow can RT make a huge difference in the atmosphere in-game. Even in something like The Witcher 3...if you can use Global RT the game is all of a sudden the best looking medieval style RPG ever made...and it's 9 years old! No, the character models themselves aren't quite as good as BG3 for instance, but the monsters, buildings, trees...oh my gosh is it ridiculous with the sun casting shadows or the flicker of a fireplace bouncing off desks while the candlelight casts it's shadows on character model faces. Of course you can't use it with less than a 3090 at 1080P upscaled to 1440P because it just sucks down so much fricken video memory.
    Sure, I could just get a 7800XT for $500 bucks and deal with games using more and more rt features as more and more games are made using new engine add-ons, but if I wanted a console experience, I'd just buy a console since frames aren't my biggest concern. Not hiccuping with graphics cranked is though, and of course, this gen AMD looks like it is throwing a giant FU to the high end bc people didn't buy enough of their underperforming cards, and yes, the 6K and 7K series *ARE* under-performing cards in the high-end for single player gamers. The best selling game last year was single player. 2022? Single Player. 2021? Single player. There may be more people playing CoD or GTAV a few months after a huge single player game releases, but if you are one of the 25 million people who bought Cyberpunk, or BG3, or Hogwart's Legacy, or Elden Ring, you're probably glad you got an Nvidia card as even if you have to inject RT in via a mod, it still looks pretty fricken fantastic, adding even more depth to the fantastic stories and various gameplay types.
    To everyone playing the latest CoD, Battlefield, Overwatch 2 or CS:GO, I'm glad it looks like there will be even better midrange cards for a little less money, but will the next gen from either company really be much of an upgrade over what we already have for $300-$500 bucks? Do any of those games really need more than a 7700XT ($329.99 for most of today on NewEgg and Amazon) or a 6800XT ($389.99 earlier today on Amazon) to get decent frame rates? If you are trying to max out an ultra high 1440P refresh rate display without block settings you will still need a 7900XT or 4070Ti+ anyway. I doubt AMD is going to drop a 4080 level card at $399.99, so I just don't see the needle being moved all that much. Will it be _good_ for people who have held onto 2060's or 5700's? Well yeah, it should be awesome, but for a huge portion of the new GPU *discreet* GPU market it just leaves the high-end in poop land again. Nvidia being allowed to block companies from using AMD cards with programs programmed specifically for CUDA acceleration certainly doesn't help either, as that is literally billions more going to Nvidia which could be going to AMD for research which would inevitably trickle into GPU R&D. Maybe only a billion or two, but Nvidia would still be up huge year on year losing a billion in gross sales per quarter while it would help AMD immensely, in turn helping us _a little,_ if for nothing else than to give us alternatives. And yes, I would absolutely consider AMD. I bought a 5700 over a 2070. I bought a 580 over 1060. Heck, I even had an X700 126MB laptop and x700 Pro 256MB way back in...2003 I believe, so it's certainly not like I am adverse to buying the best card available for the money I am willing to spend at the time, but AMD has to be there competing, and they need to have the features I am looking for. I don't care about DLSS 3.whatever while barely being able to discern between FSR 2.x and DLSS 2.x while moving around, so it literally boils down to lighting effects/shadows (and reflections to a lesser extent). Until AMD improves *DRAMATICALLY* there, they just aren't in the picture for me.
    Even with the 4080 Super I feel like Nvidia is too far behind themselves. I expect more now. If the 4080 released for $1K I would have bought it. If the 4090 was $1400 I would have bought it. They weren't so I _toughed_ it out. $1000 2 years into a generation when i have already played all of the games I want which can take advantage of the 4080 is a silly investment, just like $1200 would be a stupid price for 4090 levels of performance with 16GB of VRAM in my opinion. That's $200 less than I felt like the 4090 was worth *for gaming,* which is why AMD not releasing a high end card is so infuriating. Even if the 5090 was $2200 with a 5080 dropping for $1200 again, if AMD had a 7900XTX level raster card with 4090 RT performance for $900 bucks, well now things look very different and I probably end up with an AMD card in my PC.
    Could AMD do it? Probably, but I just don't see it happening. Not with the focus in R&D being shoved squarely at AI along with all the silicon that will entail, Epyc CPU's and new consoles, leading me to believe that the rumors are pretty accurate...except that $500 4080. I just can't see them releasing a card that matches the 4080 in RT hitting the market, even if raster is somewhere between a GRE and XT.

  • @Acconda
    @Acconda 5 months ago

    Nice video and very much unbiased thoughts. I went AMD this gen as I'm fed up with Nvidia trying to push up the tiers/costs, and the deceit of selling cards with the same names that are actually cut down. I can easily afford the 4090-tier cards, I just can't justify that bracket for games. The £1k mark is my max, but the £750-£800 mark should be the sweet spot, like the 1080 Ti back in the day. Hope AMD keeps pushing through, as we need at least some competition, and it's also a good thing Intel is joining the mix.

  • @michahojwa8132
    @michahojwa8132 5 months ago

    No Paul, the MCD is to blame. If you lower the RAM clock you can OC the core higher, and you need to raise the voltage for a higher OC (!) instead of lowering it. That tells me the GCD-to-MCD connections eat power and overheat the GPU. The 7600 has a different problem - oreo packaging: a miniature chip with less cache, designed to be cheap to produce but turned out slower. The advertised +80% RT perf didn't come either.

  • @protato911
    @protato911 2 months ago

    This might sound a bit weird, but honestly, if AMD manages to compete with Nvidia next gen - even if it's 10 to 15% slower at the top end while managing to stay within 1k - I will give it a shot, simply because I want to get a Sapphire card. Nvidia these days with how they treat their board partners: EVGA is gone, and honestly brands like Asus, Gigabyte, MSI and so on don't seem to be as reliable yet demand absurd price increases. I've heard tons of good things about Sapphire, and their top-end card isn't that much of a price hike, so I would love to own one, but of course the base chips need to be good too.

  • @jimdob6528
    @jimdob6528 4 months ago

    In terms of sheer raw performance? Probably not. But if amd would actually price themselves to be competitive instead of complacent then we might actually see a change. Amd should not be trying to turn a profit for 1-2 generations and instead try to build themselves up and undercut nvidia. So that they can actually be put into a positive light for most gamers.

  • @kellymaxwell8468
    @kellymaxwell8468 2 months ago

    I'm not sure if someone can help me. I'm still new and have a quick question: is the final resolution end goal 8K or 16K? Thanks.

  • @jorgepesquero9967
    @jorgepesquero9967 5 months ago +2

    Great thought process, I enjoyed the video, Thanks Paul!
    I believe this time around with RDNA 3, like you said, the halo card that AMD would have to create just wouldn't be good enough all around. It would most likely be very hot and require watercooling, kind of like the Fury X. It would be expensive to make, big, and would "devour" more precious wafers, and AMD just wouldn't be able to price it to their liking. After all, people buying at this level - usually with more money than sense hehehe - mostly just go for Nvidia, so in order to compete AMD would have to go significantly cheaper than Nvidia, even if they outperformed Nvidia in raster. All that while spending on wafers, expensive cooler solutions, marketing, etc. Just low ROI really.
    TL;DR: a "7950XTX" just wouldn't be worth it to AMD.

    • @26Guenter
      @26Guenter 5 months ago

      They wouldn't have to be significantly cheaper. Say the 7950 XTX is 10% faster in raster than the 4090 but still loses in RT. It'll probably have 32GB of VRAM. So with all of that, AMD could probably get away with $1500, since having more VRAM would make it more attractive for AI, which would drive sales.

    • @jorgepesquero9967
      @jorgepesquero9967 5 months ago

      @@26Guenter A US$1500 "7950 XTX" would still lose badly to a US$2000-ish RTX 4090 in sales, proportionally, even if it were faster - all while being way too costly to make, especially at this level, a very small portion of the market. AMD would not get the crown properly, it would still lose in RT, and the margins on a very costly "7950 XTX" just wouldn't make up for it. That's precisely why they didn't make it, imho. They are just way better off focusing on relatively more lucrative segments; kind of like what Paul said, a killer US$500 card is where it's at, that and cheaper segments.
      Not to mention Nvidia could just counter with a 4090 Ti, and bang, more lost sales for AMD.

    • @26Guenter
      @26Guenter 5 months ago

      @@jorgepesquero9967 MSRP of the 4090 is $1600, so that would push prices down and cause Nvidia to move better 202 dies from the data center to the 4090 Ti. Also, the 4090 Ti wouldn't be more than 20% faster than the 4090. I'm not saying AMD would win, but just like with RDNA 2 they'd make Nvidia look stupid. If AIBs can get away with charging $2000 for 4090s, what do you think a 4090 Ti would cost? AMD would be in a prime position to scoop up sales.

    • @jorgepesquero9967
      @jorgepesquero9967 5 months ago +1

      @@26Guenter I believe if that were the case, AMD would have made the 7950 XTX. They have way more data than us and decided not to. Fighting Nvidia for the halo product, within the current context and architecture, just proved not to be worth it. And the 4090's US$1600 MSRP means nothing, since you can't buy one for that price most of the time, even years after launch. Any sales AMD were to get from said halo product just wouldn't be worth the investment; it would just be an expensive product to make where the ROI most likely wouldn't be there. They themselves likely realized that, hence it doesn't exist.
      As for AI and Nvidia looking stupid, hehe, Nvidia couldn't care less. Right now they're just too far ahead, and they don't care much about "looking stupid" in the gaming market - they've been doing that for years. Heck, they likely wouldn't even need a 4090 Ti vs a 7950 XTX; they could just ignore it and still win, or launch a 4090 Ti for 2.5k and still largely outsell AMD.
      Fighting for the crown requires muscle that AMD has realized/decided would be better put to use elsewhere right now, and I can't say I disagree.

  • @GreenAppelPie
    @GreenAppelPie 5 months ago +1

    Ray tracing is alright, but I’m all about the CUDA core count

  • @7rich79
    @7rich79 5 months ago

    Could it be that AMDs architecture at some fundamental level really depends on a memory technology like HBM?
    I'm just thinking that it may actually not be possible for AMD to use the best technical solution because of how the market operates.

  • @mraltoid19
    @mraltoid19 5 months ago

    What would be nice:
    RX-9700-Pro - 7900XTX Performance - $500
    RX-9600-Pro - 7700XT Performance - $250

  • @dirkjewitt5037
    @dirkjewitt5037 5 months ago

    No. But they can compete up to Nvidia's 80 series; maybe they should concentrate there.

  • @onomatopoeia162003
    @onomatopoeia162003 5 months ago +1

    It's going to take mastering the chiplet design for them to beat Nvidia.

    • @mitsuhh
      @mitsuhh 5 months ago

      Nvidia are already using chiplets

  • @johnathaan1
    @johnathaan1 5 months ago

    The issue AMD has run into this gen is the ~5 different clock rates for every single part of the GPU. If you look at the 6900XT vs 7900XTX vs 4090 clock speeds, the only one that has more than 3 listed is the 7900XTX. If they just didn't care about the shader clock and the game clock and all that crap, AMD's 7900XTX would be a monster of a card, and this is coming from a 4090 owner. I personally don't buy AMD and prefer Nvidia cards due to the more mature tech like DLSS and frame gen, which unfortunately are locked behind Nvidia cards (and, for FG specifically, the 40 series). But seeing how AMD can't compete in the enthusiast-level cards, Nvidia can really charge whatever they want. If AMD released a 7900XTX with 3 clock speeds (i.e. base, boost and memory) and just let it draw around the same power as Nvidia did, they'd have a killer product: a 1000 USD card that can trade blows with Nvidia's top dog in raster would be an amazing W for AMD and would draw more of the masses over to team red, which in turn would cause Nvidia to lower prices. Ever wonder why the 4090 is still a 2000 USD card? Outside of Nvidia's AI crap, it's because they are still the king of gaming GPUs at the moment, which honestly, even as a 4090 owner, suuuucks so bad. Nvidia has 0 incentive to innovate, so they'll keep shoveling crap out to the mass gaming market, and no one can do anything about it because they don't have anything to compete with it. The 6900XT was an amazing card for that gen, and I'd be willing to guess that card and how well it was received is why the 4090 was such a huge leap over the 3090 in raster and RT - because AMD was trading blows with Nvidia. Having competition drives innovation, and if you can't innovate fast or well enough, your product fails and eventually your company starts to die out. Look at Intel: AMD has spanked them so badly for the past ~2 gens and 1 refresh that they went into the GPU market, which is insane. Intel cards were the first to have AV1 encoding and decoding, and then RDNA 3 and the 40 series did months later because it was so well received in the market. Honestly, I am very excited to see what Intel has in store for their Battlemage GPU lineup, on top of AMD's Strix Point APUs, which benefit the handheld and mobile market greatly.

  • @zodwraith5745
    @zodwraith5745 5 months ago +1

    It's not just that prices have gone up that pisses you off, it's that the _performance_ just isn't there even at the ridiculous prices. I'm stuck where I want more performance, but if I want anything _notably_ faster I'm going to have to spend a stupid amount of money on a 4090. I'm currently using a 3080 Ti, which is no slouch mind you, and I obviously don't _need_ a new GPU right now, but it's obnoxious to have to tune settings and use upscaling in every new game I pick up just to get 90fps. Games just keep getting more demanding faster than GPUs, unless you want to spend a house payment on one. It used to be that when you spent $700 on a GPU you didn't even have to think twice about settings and just maxed them out. Now even the 4-figure GPUs still require you to tune your settings if you want more than 60FPS, and it's even worse if you want to use RT.
    Even on the CPU side I've been needing more multicore performance, and I was lucky enough to find a buyer for my 12700K so I can upgrade to a 14700K tomorrow while only putting up $100 for the upgrade from 12 cores to 20. That should give me a small gaming boost but should be drastically better for compiles. While I wanted to wait for Zen 5 or Arrow Lake, those might not be priced decently until well into 2025. At least with that 14700K maybe I'll be fine to focus on the GPU after the 50 series launches.

  • @anthonyc2159
    @anthonyc2159 5 months ago +2

    Raytracing should not be a benchmark for graphics card performance. Have an RTX 3080Ti and I saw RT as a gimmick that Nvidia was able to get away with as soon as I tried it. It takes a massive FPS hit on most games with little visual difference unless you stop and stare. I'd rather push everything to Ultra and textures to the max and enjoy a more fluid gaming experience.

  • @InternetSupervillain
    @InternetSupervillain 5 months ago +1

    Nope, they can't.

  • @sulphurous2656
    @sulphurous2656 5 months ago

    Hell no, Jim was right in his assessment of Radeon a year ago. AMD aren't playing ball with their graphics division and haven't in ages. They could have beaten NVIDIA this generation but chose not to, again. And even then they'd be outsold 1:10 as always, because their problems go back to the days of the 8800 GT.

  • @billlam7756
    @billlam7756 5 months ago +1

    Hopium 😂

  • @daviddesrosiers1946
    @daviddesrosiers1946 5 months ago

    People care a lot more about frame gen and upscalers than RT.

  • @NatrajChaturvedi
    @NatrajChaturvedi 5 months ago

    You don't actually play games much especially modern games so you wouldn't know. But ray tracing doesn't only work at 4k. It can also be used at 1080p and 1440p with reasonable cards like a 3070 and beyond.

  • @goblinphreak2132
    @goblinphreak2132 5 months ago

    In my mind, the early leaks for AMD's 7000 series stated a single chiplet with cores on it surrounded by EIGHT (8) cache chiplets. What did AMD actually give us? A 999.99 card with a single core chiplet and six (6) cache chiplets. Everyone was like "where's the faster card?" and the one dude who used to work for Alienware and is now AMD's guy said "we targeted the 999.99 price point, we could have gone more expensive, but felt this was better," which to me is just "um, what?"... Okay, so release a card that fits the 999.99 price point while also offering a faster card that's more expensive. Like, come on man, use some common sense as a business. I'm tired of whoever is running AMD's marketing and business choices. I mean, I get that Lisa Su runs things, but I'm sure she has people in her ear telling her what they think is best, and their best isn't good enough... If I were AMD, there would have been the 7900XTX "flagship" targeting the 999.99 price (their model, not AIB, obviously, because that's what they were saying), and then above that the "extreme enthusiast" card, however much more expensive, just the most insane edition. Correlating that to Nvidia: technically the 4080 is their flagship gaming card, and the 4090 is technically the Titan-class card, the "extreme enthusiast" card above the mainstream card. So AMD would in essence be doing exactly what Nvidia already does: offering the flagship card that MOST gamers buy due to pricing, and then that one "fuck you," balls-to-the-wall, insane "extreme enthusiast" card. Which is what we should be getting. I would buy their "extreme enthusiast" card... I bought the 6900XT because it was the highest. I bought the 7900XTX because it was the highest... They need to stop with the bullshit and give us higher-performing cards. This "but the budget" stuff is just asinine and I'm tired of it. We deserve more.
    The biggest thing about graphics cards is scaling with cores. They say "more cores won't net more performance" - bullshit. Every generation Nvidia adds more cores: the 3090 only had 82 SMs (cores) and the 4090 has 128 SMs. They literally increased cores; AMD just chooses not to. Graphics workloads are not like CPU workloads where it's hard to run multi-core. Look at games: an old single-core game can't magically use more cores, you have to specifically code the game to use those cores, and even then there are code-level tweaks to get it to work - a huge "pain in the ass" if you don't know how. But with graphics cards, more cores is more performance. The 5700XT only had 40 compute units (cores). EVERYONE SAID "you can't do more because you don't gain performance," and then AMD drops the 6900XT, which had 80 compute units. From those 80 compute units you gained 50% more performance - 100% more cores for 50% more performance - and then, due to architecture gains, another 50%, so overall it was literally 100% faster than the previous gen (6900XT vs 5700XT), and every reviewer proved it to be true. It blew people away. With that in mind, if the basic rule of thumb is that doubling cores means 50% more performance, then going from 80 to 160 would give us another 50% gain. But I think it would actually be more than that, maybe 60-80% more performance, because the jump from 40 to 80 is only 40 more cores while 80 to 160 is 80 more, so the absolute gap is larger even though it's still 100% more, and the gains should be bigger. Is it hard to make a large chip? Absolutely. Not my problem. People like me would pay the money for a faster, more expensive card. Period.
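    A rough, purely illustrative Python sketch of the core-scaling arithmetic in the comment above: the CU counts come from the comment itself, while the sub-linear scaling exponent is an assumption chosen so that doubling cores yields roughly +50% performance, not a published figure.

        # Hypothetical sub-linear core-scaling model -- illustration only, not AMD/Nvidia data.
        # Assumed: RX 5700 XT = 40 CUs, RX 6900 XT = 80 CUs (numbers quoted in the comment above).
        # exponent = 0.585 is picked so that 2 ** 0.585 ~= 1.5, i.e. doubling CUs gives ~+50% perf.
        def scaled_perf(base_perf: float, base_cores: int, new_cores: int, exponent: float = 0.585) -> float:
            return base_perf * (new_cores / base_cores) ** exponent

        base = 100.0                               # arbitrary performance index for a 40 CU part
        print(round(scaled_perf(base, 40, 80)))    # ~150 -> +50% from doubling CUs alone
        print(round(scaled_perf(base, 40, 160)))   # ~225 -> a further ~50% from doubling again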

  • @Lightsaglowllc
    @Lightsaglowllc 5 months ago

    No.

  • @ooringe378
    @ooringe378 5 months ago

    Let alone Nvidia, will AMD even beat Intel Celestial?

  • @shayeladshayelad2416
    @shayeladshayelad2416 5 months ago

    To tell you the truth, I don't care anymore, and didn't even in the past. I always wanted a 4K GPU that doesn't cost as much as a full PC rig: 60fps native, around $500. That is what I want; the rest is bonus.

  • @C0bblers
    @C0bblers 5 months ago +3

    Nvidia have Arc de Triomphe architecture.
    AMD have 2 chairs back to back in their mums kitchen with a sheet thrown over the top to make a tent.

  • @LambdaShitpostHeaven
    @LambdaShitpostHeaven 2 months ago

    The answer is: No, they can not. It will never happen again.

  • @rvs55
    @rvs55 3 months ago

    Back after months. So what happened to the hype about that vaporware 7950XTX? LOL

  • @BlackJesus8463
    @BlackJesus8463 5 months ago +5

    AMD is about to put out the third generation in a row of the same performance all the while raising prices whenever they can! I might just buy Nvidia out of spite! loljk 🤣🤣

  • @HastyRhombus760
    @HastyRhombus760 5 months ago

    Reportedly AMD is not putting out a top end card for RDNA4 but will with RDNA5 so they will not close the gap at all. This is a total skip gen.

  • @davehenderson6896
    @davehenderson6896 5 months ago

    They can but they are a year behind Nvidia all the time.

  • @อีลอนคาร์ก
    @อีลอนคาร์ก 5 months ago

    Thanks for the funny video

  • @Zorro33313
    @Zorro33313 5 months ago

    Any revolutionary shift comes with an actual penalty, but opens up plenty of possibilities to improve. First AMD revolutionized the arch itself - that was the 5000 series, and it was kind of distorted because it was the first step after the revolutionary move. Then they got it right, improved massively, and made another revolutionary shift - MCM. That was the 7000 series, and it was distorted again for the same reason. 8000 versus 7000 should be like 6000 was versus 5000: huge improvements, very competitive. What they're planning next, I dunno. Maybe 9000 will be another revolutionary move, if AMD has something up their sleeve like actual 3D stacking, or maybe it's going to be an evolutionary move. Can't predict - I'm pretty much out of the industry now, but I hope the general concept is clear. It applies to anything basically, not only GPUs or semiconductors or even tech in general.
    AMD's strategy is obviously to cut evolutionary improvements and transition to the next revolutionary stage ASAP. It's a clever long-term strategy. They will be first to MCM by far, so when other GPU manufacturers move to that revolutionary stage and suffer the transition penalty, AMD will be enjoying making evolutionary improvements.
    Lisa is setting the company up to be far ahead of Nvidia. It's a done deal already, just sit and watch. AMD will surpass Nvidia massively in 2-3 gens. The leather jacket maniac is betting everything on software for a reason - he sees the writing on the wall. It's just how it goes; you can't break these fundamental development principles. He already lost the hardware battle. Will he be able to win via software? He surely hopes so. I hope he gets wrecked. We'll see.

    • @BlackJesus8463
      @BlackJesus8463 5 months ago

      IDK, they don't seem to get the performance worked out before they guess at how much die to cut for each tier, which is why we have three 90s and a 50% drop in gaming.

    • @arenzricodexd4409
      @arenzricodexd4409 5 months ago +1

      "They will be first to MCM by far so when others GPU-manufacturers will move to that revolutionary stage and suffer from transition penalty AMD will be enjoying making evolytionary improvements."
      Not always the case. Looking at what AMD has done for the past 15 years, sometimes they push certain tech into their GPUs just for the sake of having it, so they can say they have bleeding-edge stuff inside. And just being ahead of other GPU makers on certain things doesn't mean they stay ahead: tessellation - Nvidia beat AMD at their first try; RT - Intel ended up having a much better hardware implementation than AMD despite Arc being their first serious attempt at a dGPU.
      "Lisa is setting the company to be far ahead of nvidia. it's a done deal already. just sit and watch. amd will surpass nvidia massively in 2-3 gens."
      Been hearing this stuff from the likes of Charlie D years ago, really.

    • @Zorro33313
      @Zorro33313 5 months ago

      @@arenzricodexd4409 You're good at predicting the past that's already happened. Just sit and watch.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 5 months ago +1

    4080 Super > 7900 XTX.
    Fact.
    "All of Nvidia's dials go to 11."

  • @Nianfur
    @Nianfur 5 months ago

    Jim. You're forgetting that AMD chose not to give the RX 7900 XTX the power it can max draw. If it was allowed to gobble the watts like a RTX 4090, it would be very fast.

    • @certain_media_outlets
      @certain_media_outlets 5 months ago

      How can we make it faster?

    • @Nianfur
      @Nianfur 5 months ago

      @@certain_media_outlets AMD decided against it, so not sure you can without a full WC loop and lots of power.

    • @arenzricodexd4409
      @arenzricodexd4409 5 months ago +1

      People OC those 7900XTXs and end up consuming almost 600W while still being slower than a 4090.

    • @Nianfur
      @Nianfur 5 months ago

      @@arenzricodexd4409 It would have required AMD to do it. Which they declined to do.

    • @davehenderson6896
      @davehenderson6896 5 months ago

      It's not so much the power as what you do with the power. My 4090 is a cool 40°C while gaming; you can't tell me it's consuming 600 watts at that temp.

  • @kirkblack2310
    @kirkblack2310 5 months ago

    They could have done a lot better if they weren't charging Nvidia prices for the 7900xtx. No one uses ray tracing except for like 10-15 mins to see what it looks like.

  • @tofu_golem
    @tofu_golem 5 months ago +2

    Who cares? The next generation of cards will be just as overpriced.

    • @davehenderson6896
      @davehenderson6896 5 months ago

      Yes if you think the 4090 is expensive, try a 5090.

  • @simplysimon966
    @simplysimon966 3 months ago

    No videos of late... channel dying?

  • @Zfast4y0u
    @Zfast4y0u 2 months ago

    What's up with this guy, did he have a stroke?

  • @magnusnilsson9792
    @magnusnilsson9792 5 months ago +2

    $499 7900XTX and
    $349 7800XT performance
    would be good.

    • @johncrosby2785
      @johncrosby2785 5 months ago

      7900xtx will likely drop to $600-700. Rtx 5070 will be similar performance for $700-800 is my guess

    • @blabla-fw9ix
      @blabla-fw9ix 5 months ago +3

      @@johncrosby2785 Maybe, but the 7900XTX will have 24 gigs of VRAM while the 5070 will most likely have 12. Because of the VRAM, the 5070 is a bad product at that price range regardless of performance. Nvidia purposely puts in low VRAM so people will buy a card every generation; heck, even the 5080 will likely have 16, which for a 1k+ card is laughable, but it is what it is - it will have 4090-like performance but 1/3 less VRAM.
      This guy seems like an Nvidia fanboy; the dude never mentioned VRAM, which is as important to AMD owners as the raw raster is. You can't ignore VRAM; anybody who thinks RT is more important than VRAM is a moron.

    • @davehenderson6896
      @davehenderson6896 5 months ago

      @@johncrosby2785 Yes the 7900xtx will be quickly replaced by a 5070.

    • @jjlw2378
      @jjlw2378 5 months ago

      I would not hold my breath. I expect very marginal gains with a focus on efficiency and slightly better RT. I expect RDNA4 top die to be a slightly faster 7900GRE with slightly better RT. It will be $600+.

  • @LaneBatman-c2v
    @LaneBatman-c2v 5 months ago

    It's hard to desire AMD at the high end... 4K etc... and RT is a top-tier feature too, and not being able to use it (at playable frame rates) on AMD is silly... Nvidia has the top on lockdown. Of course they still have to consider price-to-performance, but they will just increase performance a little more to always beat AMD. They have the R&D and experience to do this like clockwork. Both parties know this.
    Also, seeing Nvidia do an NVLink between two dies and essentially make it function as one will open the floodgates for them design-wise... as they can now manufacture 2 cheaper, smaller, better-yield dies and link them, which is insanely good. Surely better than chiplets for simplicity and functionality. I don't know details on this... but now they can essentially do a mega card for marginally more money, like a 700-800mm² die from fusing 2 smaller dies... I bet the margins only widen on the next-gen card...
    And cranking power is a last-ditch effort to compete. Both companies did this. The 4090 should have been a 350-400W TDP tops; the card is so close to peak performance at 300W... anyway.
    Good thoughts on the mid and lower tiers... with RT and 4K out of the equation for the most part, AMD could really dominate the raster/price/performance competition... but Nvidia dominates the top end foreverrrrr... and they could take notes on mid- and low-end manufacturing and use last-gen arch to keep costs way down.

  • @Thor_Asgard_
    @Thor_Asgard_ 5 months ago

    AMD won't compete.

  • @dante19890
    @dante19890 5 months ago

    AMD will never catch up. But they can still win over the lower-end market if the price is right

    • @davehenderson6896
      @davehenderson6896 5 months ago

      Yes they can NEVER beat Nvidia's flagship card BUT on the low end they CAN beat Nvidia.

  • @oneanother1
    @oneanother1 5 months ago

    AMD needs faster AI image generation. Their worse power efficiency isn't helping them either. AMD needs better content creation as well. I would not bother considering AMD even if they were priced better.
    What's crazy is I think we are reaching the limit of performance to power consumption. So that's it, the 5090 is it. I don't know how they are going to get more performance without more fake frames. Pretty soon dual GPU dies will be it. How much performance does a game need to run smoothly? What a time to be in the GPU space, when all the new games suck. And games aren't made for high-end GPUs.

  • @mperuski100
    @mperuski100 3 months ago

    I gave up gaming. I just watch videos now. The GPU game is Crazy town with prices. Game developers are lazy and the games are more of the same.

  • @GodKitty677
    @GodKitty677 5 months ago

    The dGPU market is just Nvidia now. There is no AMD vs Nvidia. DXR (ray tracing) was the killer feature, and so was AI upscaling. The Steam hardware survey shows AMD has been wiped out. This has been the trend since the Nvidia 20 series released. Telling people crap about AMD cards won't get them to buy them.

  • @gertjanvandermeij4265
    @gertjanvandermeij4265 5 months ago

    *Ray Tracing is just GAY Tracing ! I DON'T want, or need it !*
    I choose RASTER over RT, any day !

    • @davehenderson6896
      @davehenderson6896 5 months ago

      I don't use either, I'm more into DLSS, that's more important than how pretty the graphics are.

  • @mikebruzzone9570
    @mikebruzzone9570 5 months ago

    mb

  • @tigerbalm666
    @tigerbalm666 5 months ago +1

    AMD gpus are soooo gawd awful ugly lately...WTF????

  • @Boorock70
    @Boorock70 5 months ago

    - Horrible "video playback" 40+W power consumption (YouTube/Netflix/VLC etc.)
    - Useless "chiplet" architecture (unstable UV/OC)
    - OverPriced (need real discounts, URGENT!)
    So, that's why No One is buying the AMD RX 7000 series ! 🤔

  • @obeliskt1024
    @obeliskt1024 5 months ago

    I'm 69th like