
AMD's REVOLUTIONARY Gaming GPU Chiplet Will Change EVERYTHING

  • Published 17 Aug 2024
    A very interesting patent has surfaced, giving us some insight into AMD's plans for a REVOLUTIONARY gaming GPU chiplet that could change everything. We also touch on a very interesting claim from a startup that says it can boost CPU performance by 100X! Join us as we take a deep dive into the AMD patent and what this gaming graphics card chiplet design could mean, not only for the future of Radeon GPUs but for GPUs in general. Of course, we know that RDNA 4 will NOT have a high-end variant, and AMD are instead putting their resources into the future - could this chiplet design lead to some huge leaps in gaming and ray tracing performance? A shake-up of the Radeon architecture?
    Plus, we also have some very interesting claims from Flow Computing, who claim that their Parallel Processing Unit could boost processor performance by an impressive 100X. But just how realistic are these claims? Flow Computing claim that their PPU is compatible with ANY current CPU architecture. But will we ever see this interesting-sounding redesign in real-world processors, with real-world performance benchmarks?
    Subscribe - www.youtube.co...
    Join our Discord - / discord
    LINKS
    www.tomshardwa...
    x.com/Olrak29_...
    www.freepatent...
    Tip on PayPal - paypal.me/RedGamingTech
    Whokeys 25% off code: RGT
    Windows 11 Pro $23.20: biitt.ly/581eD
    Windows 10 Pro $17.60: biitt.ly/9f0ie
    Windows 10 Home $15: biitt.ly/XYm9o
    Office 2016 $28.80: biitt.ly/cy7Vb
    Office 2019 $48.60: biitt.ly/YInvw
    www.whokeys.com/
    AMAZON AFFILIATE LINKS
    US - amzn.to/2CMrqG6
    Canada - amzn.to/2roF4XO
    UK - amzn.to/2Qi6LMS
    GreenManGaming Affiliate link -
    www.greenmangam...
    SOCIAL MEDIA
    www.redgamingte... for more gaming news, reviews & tech
    / redgamingtech - if you want to help us out!
    / redgamingtech - Follow us on Facebook!
    / rgtcrimsonrayne - Paul's Twitter
    / redgamingtech - Official Twitter
    0:00 Start
    0:02 Today's topic
    1:10 RDNA 4 patent
    8:00 Parallel Processing Unit = 100X CPU performance?
    Royalty-Free Music - www.audiomicro.com/royalty-free-music

COMMENTS • 197

  • @ProtectusCZ
    @ProtectusCZ 2 months ago +154

    We hear this every new generation, that it'll completely change the game, and in the end nothing significant happens.

    • @ianwalton4408
      @ianwalton4408 2 months ago +16

      Clickbait, fella. You won't bother if they say this chip's slow and crap :(

    • @mrm7058
      @mrm7058 2 months ago +10

      @@ianwalton4408 Probably not slow and crap, but more like an incremental update.

    • @theanglerfish
      @theanglerfish 2 months ago

      jako obvykle... (as usual)

    • @BridgeTROLL777
      @BridgeTROLL777 2 months ago +8

      we have quite significant jumps every generation.

    • @ianwalton4408
      @ianwalton4408 2 months ago

      @@mrm7058 Quite possible :) I was simply posing an example: if the title said it was slow etc, you wouldn't bother, so everything's game-changing wooohhoo

  • @riven4121
    @riven4121 2 months ago +67

    At this point I'll believe it when I see it. It's the same way the 7900XTX was supposed to "change everything"

    • @beachboy_boobybuilder
      @beachboy_boobybuilder 2 months ago

      RDNA3 failed to meet expectations apparently due to issues with silicon.

    • @FMBriggs
      @FMBriggs 2 months ago +6

      It definitely made people change their PSU. That card pulled 450 watts out of the box. Glad I sold mine; I was getting heat stroke while gaming.

    • @ZackSNetwork
      @ZackSNetwork 2 months ago +4

      @@FMBriggs The 7900 XTX overclocked pulls more power than a damn 4090.

    • @unavailable291
      @unavailable291 2 months ago +8

      @@ZackSNetwork The 4090 spikes to 600 watts, that's why they're still burning

    • @darkfire3691
      @darkfire3691 2 months ago +6

      @@unavailable291 Such a lie it's cringe; the 4090 has lower power spikes than the 3090... check the Gamers Nexus video. Probably an AMD fanboy, no wonder.

  • @haroldadams3951
    @haroldadams3951 2 months ago +57

    Swore I heard this about RDNA 3… 😂

    • @NetworkNinja84
      @NetworkNinja84 2 months ago +2

      Idk. I think he used to get something from his sources, but it seems like they're all gone. Now it's just a "talking about stuff on the internet" channel. 😂 Still love him, but he's gotten me too many times with these titles.

    • @aberkae
      @aberkae 2 months ago

      "Yeah, but this is different," said the target demographic 😂.

    • @HM-rz8nv
      @HM-rz8nv 2 months ago +2

      @@aberkae Don't be so quick to generalize. Not even the "target demographic" you're thinking of believes these rumors. I use AMD GPUs; I have a 6800 and a 6900 XT in two separate desktop computers, and my only Nvidia GPU is in a 3060 laptop. My preference for AMD GPUs comes down to getting really great performance for the price; I couldn't care less who is "winning" at the cutting edge of performance. I don't believe for one moment RDNA 4 is going to beat the future 5090. If RDNA 4 made a leap similar to the 6900 XT to the 7900 XTX (36%, going by TechPowerUp), that would be an acceptable boost in performance, easily enough to edge out the 4090 by a decent margin. The point is, as someone who favors AMD GPUs I don't give a crap whether it beats Nvidia or not; I just expect reasonable gains in performance to keep it reasonably competitive. They just need to be priced fairly.

    • @aberkae
      @aberkae 2 months ago

      @HM-rz8nv Speak for yourself. I, on the other hand, would pay more for an AMD GPU if it performed better than the 5090 (efficiency, RT, feature set, etc.)

    • @tkermi
      @tkermi 2 months ago

      lol, yeah. Nothing against hearing this "news"; it's interesting stuff, but definitely nothing I put any trust in.

  • @Etheoma
    @Etheoma 2 months ago +21

    So RDNA 5 then?

    • @benjaminlynch9958
      @benjaminlynch9958 2 months ago +7

      Yeah, they couldn’t get it working for RDNA4.

    • @50H3i1
      @50H3i1 2 months ago

      If it's a bit delayed, it benefits from more than just extra time: they can also use a better node and reuse the architecture for the next console generation

    • @tomtomkowski7653
      @tomtomkowski7653 2 months ago

      "Just wait" xD

  • @drsm7947
    @drsm7947 2 months ago +6

    Maybe the next RTX 5070 series will have the same performance as the RTX 4090 while its price will be $600-$800.
    AMD needs to get competitive and make a mid-range GPU with better performance than the RTX 4090 for just $500.

    • @skywalker1991
      @skywalker1991 2 months ago

      The 70 series has always been faster than the previous flagship: the RTX 3070 was faster than the RTX 2080 Ti, the RTX 4070 was the same as the RTX 3090, so the RTX 5070 had better match the RTX 4090.

    • @drsm7947
      @drsm7947 2 months ago

      @@skywalker1991 True, the 4070 12GB is more or less better than the 3090 24GB.
      So AMD needs to make a GPU better than the 4090 for just $500

  • @kozad86
    @kozad86 2 months ago +7

    Riding the hype train will only take you to disappointment while you cling to your outdated parts.

  • @SmashTimeStudio
    @SmashTimeStudio 2 months ago +1

    Anyone who used an RX 480/580 saw the huge improvement with Vega 64 (+38%), or +24% over the previous gen's high-end R9 Fury X - I had the smaller R9 Fury and R9 Nano, and the jump from GCN to Vega was significant. The real problem at the time wasn't an insignificant jump in AMD's performance; it's that the GTX 1080 Ti was released, blowing Vega out of the water, and it remained relevant all the way through RDNA 1.0, with the RX 5700 XT being 12% slower, plus AMD not competing in the high end that gen - the Radeon VII was Vega 20 and more of a prosumer part. So finally we get to RDNA 2: the 6700 XT was 35% faster than the 5700 XT, and the 6800/6900 XT had around double the performance - also significant, but it didn't really have anything that competed with DLSS at first, which is important for RT. RTX + DLSS + RT vs RX + RT wasn't really a fair comparison. RDNA 2 had significant improvements that made it competitive with everything but the massively overpriced 3090, all in one generation.
    RDNA 3 may not have been as significant vs RDNA 2, but everything from GCN to Vega to RDNA to RDNA 2 definitely was, and RDNA 3 was the step into chiplets and away from monolithic dies - if/when AMD gets that to work with RDNA it'll be VERY significant, as AMD will finally be in a position to make competitive desktop APUs and to outmaneuver Nvidia's monolithic GPUs. I can see a jump comparable to RDNA to RDNA 2 for RDNA 3 to RDNA 4: an extra generation of RT and FSR built into the architecture, with the scaling of chiplets - we've already seen that it typically takes two generations to get this kind of new tech working.
    Just look at Zen 2 to Zen 3 and RTX 2000 to RTX 3000 as examples...

  • @maotyrias
    @maotyrias 2 months ago +2

    I think you already said that for RDNA3

  • @DantesDentist
    @DantesDentist 2 months ago +7

    Heard this all before. I'll believe it when I see it

  • @LastRightsTV
    @LastRightsTV 2 months ago +5

    MCM will never happen. Latency is why. Please, for the love of god, stop it with this BS MCM gaming GPU hype.

    • @thickdaddymukbang
      @thickdaddymukbang 1 month ago

      I wouldn't say never, probably not very soon, but "never" is a bit of a stretch...

  • @johndoh5182
    @johndoh5182 2 months ago +3

    So your opening statement, that WE thought that AMD would sport an MCM design on RDNA 4?
    Yeah that's not true because I've never felt you could get past the issue of latency created by having different chiplets solve a graphics problem.
    I mean, graphics cores are INCREDIBLY fast for a reason: to draw as many frames as possible. When you add latency into a system there is a HARD CAP that adding more cores can't fix. Simple way to view this: add 100 ms of latency into the GPU and you will find that the best you can do is 10 fps, unless you're using frame gen (see the sketch at the end of this thread). ANY latency in a GPU is BAD, and it's why Nvidia, over their 900, 1000 and 2000 series GPUs, spent a lot of effort getting rid of latency issues, to the point of giving long talks about latency and how it affects gaming - and they're correct.
    So, have you seen Nvidia move to MCM for gaming GPUs? Nope. Who's the world's elite GPU maker? Nvidia.
    I'll let YOU know when AMD is going to solve the issue that latency creates when trying to use MCM designs, because Nvidia will already be doing it. And it's not like Nvidia hasn't been looking at it. If you try to use chiplets to solve quadrants of the image you have the boundary problem, and it's not a trivial problem to solve. It's a HELL of a lot of data that would have to pass back and forth between two chiplets, each trying to solve the graphics problem in its own quadrant while needing data from the other quadrant. And if you try to do what AMD did with cache on chiplets, it's TOO SLOW. You're going to hit a cap once again because of added latency.

    • @maxbirdsey7808
      @maxbirdsey7808 2 months ago +1

      Certain implementations of MCM incur significant latency penalties, sure, but not all.
      Obviously if AMD (or NVIDIA) were to do an MCM gaming GPU, it would use an MCM interconnect technology much closer to monolithic than, say, the one used in Zen 2 and later CPUs.
      The other thing to bear in mind is that future nodes see a massive drop in the reticle limit (from ~850mm^2 to about 450mm^2), so to maintain performance increases MCM is an inevitability.
      Ultimately the only reason we haven't seen it thus far is that those low-latency interconnects are much more expensive, most likely prohibitively so right now for the cost of a gaming GPU - especially when you consider the die used in the 4090 is 609mm^2, still comfortably below the reticle limit on current nodes, limiting the justification to split into multiple dies.
      In summary, latency is a mostly solved issue with the right interconnect and packaging. Now we just need to see the interconnect/packaging costs come down, because eventually GPUs will go the same way as CPUs did with Zen 2. I only really see it working (financially) when a GPU maker (AMD or NVIDIA) can produce the bulk of the stack (xx50/xx60 tier upwards) with chiplets, similar to how AMD's CCDs are used in Ryzen, Threadripper and all the way up to EPYC - I think it will still be too cost-prohibitive to have just one card (or a few SKUs) with an MCM design.
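
To put numbers on the latency argument in this thread, here is a minimal Python sketch (figures purely illustrative; it assumes the added latency sits on every frame's critical path and cannot be hidden by pipelining or frame generation):

```python
# Frame-rate ceiling imposed by serial per-frame latency, as argued in
# the comment above. Illustrative only: assumes the latency cannot be
# hidden by pipelining or frame generation.

def fps_ceiling(serial_latency_ms: float) -> float:
    """Best achievable frame rate given a fixed serial latency per frame."""
    return 1000.0 / serial_latency_ms

for latency_ms in (100.0, 10.0, 1.0):
    print(f"{latency_ms:5.1f} ms serial latency -> at most "
          f"{fps_ceiling(latency_ms):6.1f} fps")
# 100 ms caps the GPU at 10 fps (the figure quoted above), which is why
# any chiplet interconnect latency has to be vanishingly small or hidden.
```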

  • @brucethen
    @brucethen 2 months ago

    The PPU advertising reminds me of Google Stadia's "it will have negative latency" claim

  • @Ninjutsu2K
    @Ninjutsu2K 2 months ago +1

    I've already set it in stone: my next full system will be just an APU, with RDNA 4 if possible, with a lot more features and tech on the motherboard side too, and this time I will go for a fully featured high-tech machine. I think RDNA 5 is too far away.

  • @FMBriggs
    @FMBriggs 2 months ago +4

    Sounds great for power saving, e.g. handheld gaming.

    • @FeyFong
      @FeyFong 2 months ago +1

      Sounds great for resource optimisation too; it will depend on the driver implementation.

  • @sangrevili3328
    @sangrevili3328 2 months ago +1

    Anyone remember AMD’s Super-SIMD patent? I wonder if this will be another situation like that (aka never gonna be seen).
    Also weren’t PPU’s already used in PS3’s Cell Processor? So is this just a smarter/newer Cell architecture?

  • @PaulGrayUK
    @PaulGrayUK 2 months ago +1

    Flow Computing - a parallel scheduler, the thing the Itanium was designed for. 😃

  • @NINJA_INVESTORS
    @NINJA_INVESTORS 2 months ago +3

    You do this every cycle bro

  • @andye7715
    @andye7715 2 months ago

    Why use PPUs if you already have parallel execution in GPUs and NPUs available? It didn't fly before with Cell, Larrabee, or that one Intel server architecture nobody wanted to code for.

  • @AngelOfDeath420
    @AngelOfDeath420 2 months ago +3

    Not on the high end. They won't even try!

    • @wilhelm391
      @wilhelm391 2 months ago

      rdna 5 bud.......

    • @joxplay2441
      @joxplay2441 2 months ago

      @@wilhelm391 Wait For VEGA 2.0...

    • @wilhelm391
      @wilhelm391 2 months ago

      @@joxplay2441 why?

  • @ziggybombers1563
    @ziggybombers1563 2 months ago

    We had PPU chips years ago. Back then it stood for physics processing unit (Ageia's PhysX cards), and then Nvidia bought them.

  • @blueeyednick
    @blueeyednick 2 months ago

    The PPU part had me laughing. 100x is wild.

  • @shermanfreeman1
    @shermanfreeman1 1 month ago

    When will we get a PCIe 5.0 graphics card? Is it happening with RDNA 4 or 5?

  • @oktc68
    @oktc68 1 month ago

    I think at this point AMD needs to give up on GPUs and concentrate on APUs/CPUs. If they could work on reliability and get all of their products to run at sensible temperatures, I think they might see an increase in sales.

  • @ygny1116
    @ygny1116 2 months ago

    There is no need for a chiplet GPU if you design your architecture so you can salvage almost every GPU die by fusing off defects. Consumer GPUs have a ~400W power limit, where monolithic will always be more efficient and therefore higher performance.

  • @Violet-ui
    @Violet-ui 2 months ago +3

    Let me guess, 40% IPC?

  • @andye7715
    @andye7715 2 months ago

    I don't get the causality by which an AMD patent should be part of the RDNA4 architecture. These are independent events.
    Weren't there rumours that RDNA4 is basically going back a bit toward a more monolithic design, while RDNA5 is the next attempt at a chiplet design? Hence RDNA4 also not being such a big die, and hence no 5080/90 competition

  • @xlr555usa
    @xlr555usa 2 months ago +1

    Flow Computing has parallelism on one CPU amongst the threads? I need to take a look at it. I do know that Intel has Deep Link tech, but it is difficult to find information on it. They are implementing parallelism across the iGPU and Arc, and it is likely going to expand. Is there a snoozetube out there that can cover Intel's Deep Link? I may be the only one qualified to do it, as most of you are asleep at the wheel. Where you at, MLID? There is lots of great tech out there that doesn't get marketed properly.

    • @minus3dbintheteens60
      @minus3dbintheteens60 2 months ago

      MLID couldn't use a DMM, what are you calling him out for?

  • @NetworkNinja84
    @NetworkNinja84 2 months ago +1

    Bro. Much love. But you've been getting me with these titles. Where have the sauces gone? Now it's all just internet and Twitter stuff.

    • @DeathByHentai
      @DeathByHentai 2 months ago

      This guy doesn't even read his comments let alone reply to them, what makes you think he'll actually answer you?

  • @mraltoid19
    @mraltoid19 2 months ago +1

    I was hoping AMD would try to bring back "Crossfire". I remember when the RX-480 came out they said:
    "With 2x RX-480, you can get better than GTX1080 performance for less money"
    Imagine if the Radeon RX-9700XT ($650) only performs as well as an RX-7900XT, BUT can be Crossfired, and theoretically offer RTX-5090 performance.

    • @kozad86
      @kozad86 2 months ago +2

      Games never properly supported multi-GPU, and motherboards usually didn't feed 16 PCIe lanes to both card slots because it was a niche feature used to upsell folks to Xeon and Opteron, both of which were terrible for gaming. Looping back to game support - when a game could only use about 45% of each GPU with latency and pacing issues, you were better off just buying 1 top end GPU.

    • @dex6316
      @dex6316 2 months ago

      Multi-GPU was only really viable when we didn't measure frame pacing and lows. When it worked it boosted averages, but at the cost of stuttering. Synchronizing multiple GPUs is hard, and doing so adds latency. There's a reason why AMD and Nvidia killed CrossFire and SLI/NVLink on consumer GPUs (see the toy model below).
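
A toy model of the scaling argument above, in Python (all numbers are made up for illustration; real alternate-frame rendering is far messier):

```python
# Toy model of alternate-frame rendering on multiple GPUs: ideal scaling
# divides per-frame work by GPU count, but a fixed synchronization cost
# is paid on every frame before it can be presented.

def effective_fps(single_gpu_fps: float, n_gpus: int, sync_ms: float) -> float:
    frame_ms = 1000.0 / single_gpu_fps        # per-frame work on one GPU
    return 1000.0 / (frame_ms / n_gpus + sync_ms)

print(effective_fps(60.0, 1, 0.0))   # 60.0  -> single-GPU baseline
print(effective_fps(60.0, 2, 0.0))   # 120.0 -> ideal two-way scaling
print(effective_fps(60.0, 2, 4.0))   # ~81.1 -> with 4 ms of sync per frame
# And this only models averages; uneven frame delivery (micro-stutter)
# is what the frame-pacing and 1%-low measurements exposed.
```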

  • @Neequse
    @Neequse 2 months ago

    Isn't this the thing they did with the Zen architecture? That actually could be a game changer, can't it?

  • @supremeboy
    @supremeboy 2 months ago +2

    Nice clickbait title. There is absolutely nothing leaked suggesting AMD can compete in the GPU segment

  • @randolphblack2554
    @randolphblack2554 2 months ago

    I am ready to move on from my RX 5700 XT because I need something more capable on the encoding front. This could be the way to go if it proves more capable than Battlemage there. Nvidia is out of consideration due to pricing and my own previous experience

  • @jensonee
    @jensonee 2 months ago +1

    "There's no current release date or window for when RDNA 5 will reach PC gamers, but based on the current timeline for RDNA 3 and the upcoming RDNA 4 mid-range refresh, late 2025 would probably be the soonest we can expect to see a new AMD flagship." (May 13, 2024)

    • @HM-rz8nv
      @HM-rz8nv 2 months ago

      Your date is wrong; it's June, not May

  • @rvs55
    @rvs55 2 months ago

    As long as the sum of the parts can't beat Nvidia's monolithic design in performance and features, it won't matter a whit, except to the guys who just want cheaper stuff

  • @dalenihiser7766
    @dalenihiser7766 2 months ago

    Tag should have read, "AMD's REVOLUTIONARY Gaming GPU Chiplet Will Change EVERYTHING (or not)." That would have at least been somewhat more accurate.

  • @patrikmiskovic791
    @patrikmiskovic791 2 months ago

    Hope this will be the Ryzen X3D of GPUs

  • @MrPtheMan
    @MrPtheMan 2 months ago

    I think it will be a software mess. What are they going to use to assign the GPU? Lol, Game Bar? Haha, no thanks. What will it do with locked, in-use resources when you want to multitask and reassign? Won't Windows freak out with kernel panics?
    Fact, and no one can deny this: for years you haven't been able to properly eject a USB or hard drive from Windows, because it's always "in use" even when it's not in use.
    On the software side it's hard to implement with stability; maybe with a reboot you could reassign the GPU, but in a live environment I don't think so.

  • @tangok5677
    @tangok5677 2 months ago

    I bought a Radeon HD 4870 X2 and they never enabled it; I don't think it will work

  • @thecatdaddy1981
    @thecatdaddy1981 2 months ago +1

    Funny how these clickbait rumor-mill leak-tubers get recommended to me even AFTER I click "do not recommend channel".
    The algorithm is a complete meme at this point.

    • @joeallan3706
      @joeallan3706 2 months ago

      Well, you clicked on it, so it will show it to you

  • @Integroabysal
    @Integroabysal 2 months ago

    I mean, it seems like AMD wants to cut costs with a chiplet design. I don't see how this design is going to improve performance, since it will rely on the same Infinity Fabric that a Ryzen R9 does. In a CPU the bottleneck this tech brings isn't that prominent, but in a GPU it will be massive, since a GPU does a hell of a lot more operations per second than a CPU. So unless they slap a gigaton of Infinity Cache on it, I don't see it working in AMD's favor any time soon.

    • @horuslupercal3872
      @horuslupercal3872 2 months ago

      That’s exactly what they will probably do, basically X3d a gpu

  • @alistermunro7090
    @alistermunro7090 2 months ago

    Isn't this the step that Nvidia skipped, since Nvidia physically glued two GPU chips together? What we see is AMD trailing again.

  • @nossy232323
    @nossy232323 2 months ago +1

    Will he be as wrong as he was on Zen 5?

  • @benjaminlynch9958
    @benjaminlynch9958 2 months ago +1

    To me this almost seems like a patent that would boost performance of datacenter GPUs, where the physical GPU is carved up into different virtual GPUs. It looks to me like the patent aims to solve some data privacy issues in virtual environments: you can better segment the physical hardware so that data leaks are much more difficult for a malicious actor to achieve.

    • @xlr555usa
      @xlr555usa 2 months ago

      Patent? Patents destroy innovation and it takes a long time to get one. What do you end up with? Corporations, lawyers and judges make money, that's all it achieves.
      Patent > /dev/null

  • @venturefanatic9262
    @venturefanatic9262 2 months ago

    Doesn't telling Windows the power setup for each program in the OS settings funnel GPU and CPU resources to them? That's what I've been doing each time I install a new game/program. Can't the motherboard have a mini GPU just for the OS and program UI? Then the GPU could be devoted to the fun stuff.

    • @MoonSon298
      @MoonSon298 2 months ago +2

      With how powerful GPUs & CPUs are today, the impact on gaming performance is zero. This will only marginally affect massively multithreaded tasks that use 100% of ALL CPU and GPU cores, and even then the impact would be less than 1% on more powerful systems

  • @kristopherleslie8343
    @kristopherleslie8343 2 months ago +1

    Until the GPU comes with 128 Gb onboard memory I'm not impressed

    • @HM-rz8nv
      @HM-rz8nv 2 months ago

      20 years back would you have said you wouldn't be impressed with GPU memory until it had 16GB?
      I know you're just messing around.

    • @kristopherleslie8343
      @kristopherleslie8343 2 months ago

      @@HM-rz8nv no because 20 years ago I was already looking 20-100 years ahead of what we used. I’m a futurist.

    • @cajampa
      @cajampa 2 months ago +1

      @@HM-rz8nv You know we can easily have uses for even way more than 128GB, right?
      The best LLM models are massive now and get bigger all the time.
      128GB is nothing.

    • @onomatopoeia162003
      @onomatopoeia162003 2 months ago

      The workstation PRO W7900 has 48GB

    • @zg8999
      @zg8999 2 months ago

      Gb or GB?

  • @pf100andahalf
    @pf100andahalf 2 months ago

    Yes, 100x cpu speed next gen would be great, thanks

  • @righteousone8454
    @righteousone8454 2 months ago

    I want AMD and Intel to succeed, and in the end we will get the best parts, I am not fanboying over one brand
    I am on Intel now, but I had an AMD FX-8320 that did way better in multi-processing, regardless of the thunderous hate from all the Intel fanboys
    Then Ryzen came out, and people were on the fence, then 5000 series AMD Ryzen came out, and people began bashing Intel
    It's comical how disloyal "fans" are
    I wouldn't want AMD to fail.
    I wouldn't want Intel to fail.

  • @adamadamx5464
    @adamadamx5464 2 months ago

    This patent is 100% for RDNA5, not RDNA4

  • @BastyTHz
    @BastyTHz 2 months ago

    We're still waiting for the Super-SIMD that was patented years ago.
    This still can't be considered revolutionary; if they used photonics, that would be truly revolutionary

  • @lebu6513
    @lebu6513 2 months ago

    10:08 You mean ScholgU lol

  • @shephusted2714
    @shephusted2714 2 months ago

    We need a PPU desperately; wait and see. As for AMD, they need to push the boundaries and grow chiplets on GPUs like they have with cores on CPUs, plus much, much more memory. It should happen eventually; they need a moonshot now to change the balance and the competition equation

  • @VatiWah
    @VatiWah 2 months ago

    Will they use chiplets to pass the price savings on to customers? Or will they keep it as profit? :P Or will they put the profit mostly into R&D to keep the momentum?

  • @ferdgerbeler8494
    @ferdgerbeler8494 2 months ago

    No, it won't... because they will nerf it into the ground on all but their most obscenely priced products, just like they did with the 680M and 780M

  • @korinogaro
    @korinogaro 2 months ago +3

    Let's not kid ourselves. It's AMD: even if they have a really incredible architecture, they will do whatever is in their power to look stupid as f*ck. A good architecture from them would mean Nvidia pricing, or maybe even worse if it could beat Nvidia.

  • @benjaminlynch9958
    @benjaminlynch9958 2 months ago +1

    That PPU segment sounds like a bunch of magic beans, and I'm not buying what they're selling.
    A lot of software is already single-thread compute bound - creating more threads does nothing to speed up performance. Think gaming here, or any task that is sequential in nature. The other thing is that for tasks that scale well with multithreading, programmers already have the tools to get those speedups. Writing the code isn't always easy, but it's not impossible if you know what you're doing (see the Amdahl's law sketch below).
    The only way I can see this being of any benefit is when there is poorly written or poorly designed legacy code that could be parallelized but isn't, because the original developer was lazy or didn't know what they were doing. But these days almost all software running at scale in the datacenter has been manually refactored to take advantage of multithreading. And that's to say nothing of efforts to shift massively parallel tasks to specialized hardware like GPUs.
    And one last point: unless they've gotten a license from Intel and AMD, their magic beans won't run x86 code. AMD and Intel jointly own the x86 patents, and they're the only ones who can legally make processors that execute x86 code. There used to be a third company with an x86 license, but it was acquired by Intel a few years ago, and none of its products had been competitive in decades.
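
As a sanity check on any blanket "100X" claim, Amdahl's law bounds the overall speedup by the fraction of the work that can actually be parallelized. A minimal sketch:

```python
# Amdahl's law: overall speedup = 1 / ((1 - p) + p / s), where p is the
# parallelizable fraction of the work and s is the speedup of that part.

def amdahl_speedup(p: float, s: float) -> float:
    return 1.0 / ((1.0 - p) + p / s)

for p in (0.50, 0.90, 0.99):
    # As s -> infinity, the serial fraction alone caps the speedup at 1/(1-p).
    print(f"parallel fraction {p:.0%}: cap {1.0 / (1.0 - p):6.1f}x, "
          f"with s = 64: {amdahl_speedup(p, 64):5.1f}x")
# 50% -> 2x cap, 90% -> 10x cap, 99% -> 100x cap. A blanket 100X on
# arbitrary CPU code would need nearly all of it to parallelize perfectly.
```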

  • @axl1002
    @axl1002 2 months ago

    So basically this PPU turns multithread performance into single-thread, right?

    • @akosv96
      @akosv96 2 months ago

      More like going as high as it can. Single thread means completely serialized computation: one goes in, one goes out.
      I'm super hyped for this. If this can 10x current computers, that would mean handheld computers with days of battery life.

    • @axl1002
      @axl1002 2 months ago

      @@akosv96 The single thread is the bottleneck here, otherwise we could use GPUs for general processing.

  • @vensroofcat6415
    @vensroofcat6415 2 months ago

    This is a good one 🤣 You made my day!
    Like, realistically, what are the chances of AMD spending more money on GPU development or achieving a wonder right now? You can't spend what you don't make.

    • @joeallan3706
      @joeallan3706 2 months ago

      AMD doesn't make money? They are in the best shape they have ever been in. Shill

  • @meekmeads
    @meekmeads 2 months ago

    Ah shit, here we go again!!!

  • @jasontodd5953
    @jasontodd5953 2 months ago +1

    Same old song and dance year after year. Next!

  • @Deeps__
    @Deeps__ 2 months ago

    Guarantee it doesn’t.

  • @TheCgOrion
    @TheCgOrion 2 months ago +1

    If this comes to fruition, it could be another Zen moment, in terms of scaling.

  • @gn2727
    @gn2727 1 month ago

    Why do we even need this information?

  • @newyorktechworld6492
    @newyorktechworld6492 2 months ago +1

    Coming from a 6800 XT, should I wait for RDNA 5 or RDNA 6?

    • @gametime4316
      @gametime4316 2 months ago

      5 if u want to buy high end
      6 if u want to buy good-value mid-high end
      *I'm taking into consideration that they'll deliver a good generation, not a flop RDNA 3 style

    • @DeathByHentai
      @DeathByHentai 2 months ago

      You should wait until a game comes out that you want to play where you can't reach settings you're comfortable with

    • @kozad86
      @kozad86 2 months ago

      Wait for a new card until you actually need it. Upgrading just for the hell of it is a huge waste of money.

    • @jawohlbxb3534
      @jawohlbxb3534 2 months ago

      The 6800 XT is still good for a couple of years. RDNA 5 should release in 1H 2026 is my guess, based on leaks

  • @richardsalazar4817
    @richardsalazar4817 2 months ago

    Doesn't Nvidia already have an MCM GPU? If so they beat AMD to the punch, which is/has been kinda rare in recent years.

    • @MattJDylan
      @MattJDylan 2 months ago

      That's a datacenter card (and one I don't think we've even seen in use as of yet).
      Which means:
      1 - it doesn't have the usual issues a card would have in a gaming scenario, since they do different things.
      2 - those things cost like $50-100k, while multi-die solutions like Ryzen's aim to be competitively priced. It's easy to find solutions for things that make you a lot of money, while it's hard to find a reliable way to do it for pennies (which is the bulk of the problem in that sense: they can't research an MCM design that makes an 8900 XTX cost $15k)

    • @dex6316
      @dex6316 2 months ago

      @@MattJDylan Nvidia at least is better positioned to make an MCM gaming GPU than AMD because of that, though. The biggest issue with MCM designs is having to synchronize multiple GPUs; Nvidia says their MCM design doesn't even know which GPU it is on and just treats the chiplets as one chip.
      The biggest issues an MCM Nvidia gaming GPU would face as a result are cost, from lots of silicon and expensive packaging, and scaling; we have seen that both GA102 and AD102 scale poorly with all the additional SMs they have. CoWoS from TSMC isn't just expensive but limited in capacity, so Nvidia would probably have to go to Intel's Foveros for packaging.
      An MCM focused on performance would have limited gains due to poor scaling. An MCM design focused on small chiplets for cost and yields isn't particularly viable right now because of limited production capacity and expensive packaging.

    • @MattJDylan
      @MattJDylan 2 months ago

      @@dex6316 "nvidia says" yeah no, if I'm watching a clear blue sky and jensen tells me that the sky is blue, I'd still triple check for caveats 😂
      Sure they have some advantages as far as data centers go, but to make an mcm gpu for gaming is totally another world as of yet.
      If their design would work in such scenario, you'd bet the 5090 would at least separate the tensor cores from the normal cores, but that's not the case as far as we know.

    • @richardsalazar4817
      @richardsalazar4817 2 months ago

      My only question was "does Nvidia already have an MCM GPU?" If so, they beat AMD to the punch on that innovation. I wasn't asking about gaming, because I know there's not one yet. I wasn't asking about price either. Just, pretty much, who developed one and put one on any market first? It's OK to say Nvidia, if it was/is. Coming from an ATI fan, and we know who owns ATI.

    • @MattJDylan
      @MattJDylan 2 months ago

      @@richardsalazar4817 Isn't the RX 7000 series MCM by definition, though?

  • @2284090
    @2284090 2 months ago

    Nvidia destroyer RDNA 5! Let's gather the Nvidia flies over here

  • @tomtomkowski7653
    @tomtomkowski7653 2 months ago

    Yeah yeah... just wait. I have heard the same story for the last 18 years, since 2006 when AMD bought ATI.
    Nvidia is also making chiplets with their interconnects.
    But I don't care any longer, as Nvidia sets more and more crazy prices and all AMD does is blindly follow them, just 10% cheaper.
    An xx80 (4080 S) for $1k and its AMD counterpart for $900 - LOL. Give me a break, as this market is as broken as possible because of this duopoly.

  • @gametime4316
    @gametime4316 2 months ago

    Wasn't it the same with RDNA 3 and 4?
    We ended up with RDNA 3 being beaten by NV's second-tier chip in rasterization and by the third-tier chip in RT.
    And with RDNA 4, AMD just gave up on the high end.

  • @Code-n-Flame
    @Code-n-Flame 2 months ago +1

    Suuuure it will 🥱

  • @tubalcain7
    @tubalcain7 2 months ago

    Until they add or improve hardware-accelerated ray tracing with AI, no one cares.

  • @samal90
    @samal90 2 months ago +1

    Navi 44 had better have 16GB of VRAM or it's a bust.

  • @braindead2813
    @braindead2813 2 months ago

    Awesome! So we can expect the 8800 XT to be 3% faster and use 1.4 times the power! 👍

  • @TechieViewVideo
    @TechieViewVideo 2 months ago

    Snake oil... 😂

  • @soplam9555
    @soplam9555 2 months ago

    no gddr7 no buy

  • @ZoeyR86
    @ZoeyR86 2 months ago

    Lol crossfire chiplets

  • @daveuerk4030
    @daveuerk4030 2 months ago

    AMD changed the die, the tech, the cooler, and something else; I think they redesigned the VRAM. So they made massive changes; everything was new all at once. Now they have had one gen to work out the kinks. In theory they should be able to make a jump, as they adopted the new tech before Nvidia.

  • @FunBunChuck
    @FunBunChuck 2 months ago

    It will not be released. There will be no such thing as "multi-core" GPUs from AMD. Apple? Yes. Nvidia? Yes. Intel? Yes. AMD? No.
    Don't like that fact?
    Tough. Facts don't care about feelings.
    Nvidia wins again!
    Nvidia always wins!

  • @skywalker1991
    @skywalker1991 2 months ago +1

    hahahahahahahahhahahaha hahahahahahahaa , AMD what ??????? hahahaha hahahahaha hahahaha ahhhhhhhh,
    AMD might change their mind, that's all that's gonna happen.

  • @Gyld3N
    @Gyld3N 2 months ago

    Here we go again 🤣

  • @Rathika5
    @Rathika5 2 months ago

    Same thing, different year. I'll take the usual wait-and-see on how much of a letdown it is.

  • @Etheoma
    @Etheoma 2 months ago +1

    The reason why AMD is so keen on a multi-chiplet GPU is that they can create an Übermensch GPU out of dies that can individually go into mid-range parts. That way they can have a high-end part that almost nobody buys, and it's fine, because they can build up mindshare without having to lose money doing it. So yes, you will see AMD go multi-GCD first; it's basically the same idea as the OG Zen: show people you can build high-performance, datacentre-class parts, even though almost nobody will buy them, so that you can get business down the line.

  • @EmperorOfMan
    @EmperorOfMan 2 months ago

    I have watched this video every year for 10 years

  • @anttikangasvieri1361
    @anttikangasvieri1361 2 months ago

    This feels as likely as Nvidia putting out a €300 value card.

  • @johnaitchie3803
    @johnaitchie3803 2 months ago

    Amd junk lol

  • @Dimmu668
    @Dimmu668 2 months ago

    Suomi mainittu, torille! (Finnish meme: "Finland mentioned, off to the town square!")

  • @infinity2z3r07
    @infinity2z3r07 2 months ago

    The last PC revolution I can think of is solid state drives becoming affordable, and that was 10+ years ago.
    AMD forcing core counts higher was also very good.
    Abolishing hyper-expensive GPUs should be next, but we all see where this is going...

  • @David_Banner
    @David_Banner 2 months ago

    What, again? Really? I didn't even watch the video; what's the point? More BS!

  • @mikebruzzone9570
    @mikebruzzone9570 2 months ago

    mb

  • @peshmerge44
    @peshmerge44 2 months ago

    Just clicked to downvote. You're welcome

  • @Bthaman69
    @Bthaman69 2 months ago +1

    Sweet

  • @AiVirtualBot
    @AiVirtualBot 2 months ago

    😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂 cap

  • @bestofmusicchannel
    @bestofmusicchannel 2 months ago

    The only revolutionary thing about AMD is how they end up chasing NVIDIA every gen!

  • @remsterx
    @remsterx 1 month ago

    Whomp whomp

  • @odaharry
    @odaharry 2 months ago +1

    Ya, I doubt it, lol. Could you be more exaggerated and dramatic in your title? 🤪

  • @xlr555usa
    @xlr555usa 2 months ago

    Change Everything? I'm not gonna have enough change in my pocket to afford the ridiculous prices that AMD charges for their garbage silicon. Lisa ping me please, so beautiful.

  • @UpgradingAsUsual
    @UpgradingAsUsual 2 months ago

    TLDR;
    It won't change anything. In fact, Nvidia wins again by a significant margin.

  • @chrisgrimes325
    @chrisgrimes325 2 months ago

    Free banwiff, somefing like that, basically speaking, and yeah you do talk a load of crap