The FUTURE of GPUs: PCM

  • Published 4 Jan 2025
  • Urcdkeys.Com 25% code: C25 【Black Friday Super Sale】
    Win11 pro key($21.8):biitt.ly/f3ojw
    Win10 pro key($15.4):biitt.ly/pP7RN
    Win10 home key($14.4):biitt.ly/nOmyP
    office2016 pro key($28):biitt.ly/aV1op
    office2019 pro key($60):biitt.ly/7lzGn
    office2021 pro key($136):biitt.ly/DToFr
    Support me on Patreon: / coreteks
    Buy a mug: teespring.com/...
    My channel on Odysee: odysee.com/@co...
    I now stream at:​​
    / coreteks_youtube
    Follow me on Twitter: / coreteks
    And Instagram: / hellocoreteks
    Relevant links:
    www.nature.com...
    www.deepdyve.c...
    www.freepatent...
    www.freepatent...
    Footage from various sources, including the official YouTube channels of AMD, Intel, NVidia, Samsung, etc., as well as from other creators, is used for educational purposes in a transformative manner. If you'd like to be credited, please contact me.
    #ai #nvidia #amd

COMMENTS •

  • @6Ligma
    @6Ligma 9 months ago +179

    Crank up the subwoofers, bois, Coreteks published a new video

    • @JoshuaFlower-bl3ey
      @JoshuaFlower-bl3ey 9 months ago +1

      I knowwwwww

    • @InquilineKea
      @InquilineKea 9 months ago +4

      IT'S BEEN QUITE A WHILE

    • @cedricdellafaille1361
      @cedricdellafaille1361 9 months ago

      Owww yeahhhhhhhhh

    • @hiXhaX-YT
      @hiXhaX-YT 9 months ago +6

      His voice is pretty off this time

    • @ADB-zf5zr
      @ADB-zf5zr 9 months ago +2

      I stopped watching them because I can't stand the bot voiceover, but this video caught my attention. Shame, really, that I downvoted it; if they used a human for the voiceover it would get an upvote and I would still be a subscriber. Their choice, their reward.

  • @Kratoseum
    @Kratoseum 9 months ago +80

    These dives into current limitations, ongoing research and future potential are when I think this channel is at its best! Thanks for the hard work.

    • @timginter146
      @timginter146 9 months ago +4

      I thought the same - I missed these ~20 minute Coreteks videos - focus on technology, history, data and facts. Great video, great to see those back!

  • @Sythemn
    @Sythemn 9 months ago +21

    I've been following alternative memory types since 2005 and continue to be frustrated that the closest any of them came to useful commercialization was Intel's proprietary, half-a**'d Optane (PCM) experiment. MRAM is finding industrial embedded uses at stupidly inflated prices but doesn't really seem to be in any hurry. I hadn't seen FeRAM mentioned in years.
    Glad to see the big guys are finally revisiting these.

    • @seylaw
      @seylaw 9 months ago

      I am also following news about new memory types. Nantero also had a cool Hot Chips presentation some years ago, but it was never heard of again.

    • @thecat6159
      @thecat6159 8 months ago

      That's because most of these claims are based on exaggerated hype and highly selective research that is not representative of real-world situations.
      Nothing exemplifies this more than the huge number of startups that have attempted to commercialise non-volatile memory over the previous two decades, with none of them ever producing a viable product that can compete with current legacy memory technologies.

  • @Vatharian
    @Vatharian 9 months ago +15

    1. Jim Keller knows what he is doing.
    2. Cerebras is a third way: its fabric is so wide that it can offload memory.
    3. PCM requires huge temperature swings. Optane had to heat single-bit cells to 400 °C; its limited lifespan came specifically from cracking issues.
    4. Stacking requires the ability to move all of the compute layer's heat load through all of the memory layers. 3D V-Cache showed this precisely: every layer literally robs the underlying die of TDP (see the rough sketch below). :C
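    A minimal back-of-envelope sketch of point 4, assuming a simple series thermal-resistance model; every constant below (T_JUNCTION_MAX, T_AMBIENT, R_BASE, R_PER_LAYER) is an illustrative assumption, not a measured value for any real product:

```python
# Each memory layer stacked between the compute die and the heatsink adds
# thermal resistance, shrinking the power the die can sustain at a fixed
# maximum junction temperature. Numbers are illustrative assumptions only.

T_JUNCTION_MAX = 95.0   # assumed max allowed die temperature, deg C
T_AMBIENT      = 35.0   # assumed heatsink/ambient temperature, deg C
R_BASE         = 0.15   # assumed junction-to-ambient resistance, deg C per W
R_PER_LAYER    = 0.05   # assumed extra resistance per stacked memory layer

def sustainable_power(num_memory_layers: int) -> float:
    """Max power the compute die can dissipate before hitting T_JUNCTION_MAX."""
    r_total = R_BASE + num_memory_layers * R_PER_LAYER
    return (T_JUNCTION_MAX - T_AMBIENT) / r_total

for layers in range(4):
    print(f"{layers} stacked layer(s): ~{sustainable_power(layers):.0f} W budget")
```

    With these made-up numbers the budget falls from roughly 400 W with no stacked layers to roughly 200 W with three, which is the "every layer robs the die of TDP" effect in miniature.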

  • @4.0.4
    @4.0.4 9 months ago +4

    There is another reason for CUDA's dominance: AMD's incompetence. Consider the case of the genius hacker George Hotz (iOS jailbreaks, reverse engineering the PlayStation 3) developing an AI box for the home (TinyBox) and giving up on AMD after offering the olive branch of fixing their bad drivers for them (if AMD open sourced them).

  • @reinerfranke5436
    @reinerfranke5436 9 months ago +8

    Stacking scales down the power density available for compute. If the sweet spot favors efficient data fetching over raw compute, then memory with local compute is the right way.

  • @abcqer555
    @abcqer555 9 months ago +14

    Awesome video!

  • @keyboard_toucher
    @keyboard_toucher 9 months ago +3

    12:01 latency increased--not decreased--by 3% on average.

  • @Buddy308
    @Buddy308 9 months ago +9

    At the very end of this video, Coreteks states that there is no other channel doing this level of research. That's exactly what I was thinking for the last ten minutes before that statement. He and his videos are unique in this area of technology reporting. Even though much of what he presents is over my head, I'm compelled to donate in order to keep the channel viable. I hope others feel the same.

    • @kelvinnkat
      @kelvinnkat 9 months ago +3

      Asianometry is comparable. Not quite as in-depth, but close enough to be comparable at least.

    • @Buddy308
      @Buddy308 9 months ago +1

      @@kelvinnkat
      I'm on it. Thanks

  • @covert0overt_810
    @covert0overt_810 9 months ago +6

    yes… but can it run crysis?

  • @mikebruzzone9570
    @mikebruzzone9570 9 months ago +7

    Ferromagnetic and other forms of resistive RAM (spin torque) are more likely than PCM, which has its own system heat issues that can flip crystalline bits. Intel has shown they can essentially stack mylar cells with Optane, but not profitably. Ultimately it all comes down to cell size, and SRAM on 'stacking' is very cost effective, albeit, as noted here, power consuming. The stability and endurance of solid-state magnetics, spin and tunneling will win, and are winning now in industrial embedded and aeronautics, where they are much more stable. The intermediate step is to place SRAM-laden FPGAs with various processing elements onto a GPU, but ultimately the trend is toward solid-state memories, a return to solid-state memories actually; do your research. mb

  • @AnkushNarula
    @AnkushNarula 9 months ago +3

    Great coverage - thanks! PCM reminds me of HP's "memristor" announcement from 2008. Do you have any knowledge of it, or maybe an inside scoop on the progress? Would make for an interesting technical video!

  • @Zujanbre
    @Zujanbre 9 months ago +4

    Germanium, Antimony and Tellurium. Was the AI that made the discovery called Shodan?

  • @--waffle-
    @--waffle- 9 months ago +2

    where are the sick forest graphics at the start from?

  • @WXSTANG
    @WXSTANG 1 month ago

    Material state changes are too slow for storage. Stacking memory on top of the CPU is an AMD thing with the X3D... BUT... you can't add large amounts of memory without putting undue added heat stress on the CPU cores themselves.

  • @_DarkEmperor
    @_DarkEmperor 9 months ago +4

    Stacking memory on top of a chip performing calculations is not so great an idea, unless it is a low-voltage, low-power chip.
    The proper way of doing things is to stack a compute chip on top of memory; that way you can put cooling directly on top of the compute chip.

    • @roqeyt3566
      @roqeyt3566 9 months ago

      That's how Adamantine does it, right?

  • @Austin1990
    @Austin1990 9 months ago +2

    Chiplet GPUs could only go so far without significant on-chiplet memory. So, we will see if they pull it off.

  • @seylaw
    @seylaw 9 months ago +2

    @coreteks What about Samsung's and SK Hynix's concepts of in-memory computing? Samsung already had a prototype with an AMD GPU and gave a presentation about it at Hot Chips 2023.

  • @Phil-D83
    @Phil-D83 9 months ago +4

    Intel Optane was basically phase-change memory

  • @CreepToeJoe
    @CreepToeJoe 9 months ago +4

    It's always exciting when humanity pushes the limits even further! Thank you for making this video and keeping us up to date! 🙂

  • @Ivan-pr7ku
    @Ivan-pr7ku 9 months ago +3

    Nvidia switched from graphics-first designs way back in 2006 when they released their first unified-shader GPU with CUDA support. Since then, GPU architectures have evolved into parallel compute machines with some attached graphics functionality. The same process for AMD began with GCN, and today the company has two distinct architectures -- CDNA for pure (enterprise) compute and RDNA with balanced graphics and compute features. Anyway, in the near future even graphics rendering will be mostly driven through compute (incl. RT) and AI inference, and less by classic raster shading... if we are to believe Nvidia, at least.

  • @florin604
    @florin604 9 months ago +1

    This guy describes Intel's old faithful Optane as a revolution... amazing

  • @IraQNid
    @IraQNid 9 months ago +1

    SSDs will lose data if they are left without power for too long.

  • @aalhard
    @aalhard 9 months ago

    The return of the Transputer😊😊🎉. Makes you wonder what might have been...

  • @platin2148
    @platin2148 9 months ago

    Doesn't that mean you have to clean that memory as well? At least in part.
    So when do we see Intel adding it stacked on top?

  • @gethinfiltrator6700
    @gethinfiltrator6700 9 months ago +4

    time: 18:46
    The word "AI": 25 times

  • @profounddamas
    @profounddamas 9 months ago

    Wasn't that a fail for those small companies Intel and Micron with 3D XPoint?

  • @pf100andahalf
    @pf100andahalf 9 months ago +4

    Excellent video.

  • @IraQNid
    @IraQNid 9 months ago +2

    Graphics processing units weren't meant for gaming. They were meant to provide better-looking visuals for whatever you were doing with your computer, and to take over workload from the general-purpose CPU/FPU combo chips so that those could operate more effectively.

  • @alpha007org
    @alpha007org 9 months ago +1

    Isn't packaging currently the bottleneck at TSMC?

  • @ninthburn1199
    @ninthburn1199 9 months ago

    Interesting deep dive! Thanks for sharing it with us

  • @vitormoreno1244
    @vitormoreno1244 9 months ago +13

    I always wondered why FeRAM didn't catch on; I guess it is a Cypress scale problem, but the tech is awesome: it is very low power and has no write delay, it writes at bus speed. I have used it in my projects since 2020.
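    To illustrate the "writes at bus speed, no write delay" point, here is a minimal, hypothetical sketch of talking to an SPI F-RAM from Python. It assumes a Cypress/Infineon FM25-series-style part on /dev/spidev0.0, 16-bit addressing, and the common SPI-EEPROM-style opcodes (WREN/WRITE/READ); the key contrast with flash or EEPROM is that no busy-flag polling is needed after a write.

```python
# Hypothetical sketch: writing to an SPI F-RAM with the python spidev module.
# Device, wiring and opcodes are assumptions (FM25-series-style part); adjust
# for the actual chip. F-RAM writes complete at bus speed, so there is no
# write-completion polling loop as there would be for flash/EEPROM.
import spidev

WREN, WRITE, READ = 0x06, 0x02, 0x03   # assumed SPI-EEPROM-style opcodes

spi = spidev.SpiDev()
spi.open(0, 0)                 # bus 0, chip select 0 (assumed wiring)
spi.max_speed_hz = 10_000_000

def fram_write(addr: int, data: bytes) -> None:
    spi.xfer2([WREN])                                   # enable writes
    spi.xfer2([WRITE, addr >> 8, addr & 0xFF, *data])   # done at bus speed

def fram_read(addr: int, length: int) -> bytes:
    resp = spi.xfer2([READ, addr >> 8, addr & 0xFF] + [0] * length)
    return bytes(resp[3:])

fram_write(0x0010, b"hello")
print(fram_read(0x0010, 5))
```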

  • @noanyobiseniss7462
    @noanyobiseniss7462 9 months ago +1

    That's a layout, not a block diagram.

  • @neilb1540
    @neilb1540 1 month ago

    Great video, this is interesting.

  • @soothsayer5743
    @soothsayer5743 7 months ago

    You have a certain workload to get through as cheaply as possible. Could you sell more AI/GPU units because they're cheaper, or because the units are so fast that people want more? Think better graphics? (I'm still using 1080p screens, it's good enough: cheap.) Ideally I want something quiet, fast enough for my games/web browsing, cheap, compact (netbook size) and not power hungry (long-lasting small battery). Imagine if I could have an Xbox/PS5 in my small netbook (sub 13 inch and less than a kg) and it's silent! No fan!!!! GPUs are fast enough and good enough but noisy! I would definitely go for a silent, very low power option, but cheap... that's the world I'm looking at this from. My question to the brains out there: which of these points is this PCM targeting?

  • @RubiconV
    @RubiconV 7 months ago +1

    How do you never take a breath and keep talking so fast for 15 minutes? Amazing.

    • @KanedaSyndrome
      @KanedaSyndrome 2 months ago

      Breathe in with the nose while speaking

  • @noobgamer4709
    @noobgamer4709 9 months ago

    Coreteks, can you put out a video on the GTC NVIDIA super GPU MCM? I want to know your thoughts on Nvidia's MCM.

  • @ntal5859
    @ntal5859 9 months ago

    Legend has it the leather jacket was stolen off Arnie, and he (Jensen) really is a T-800 sent to advance Skynet and prepare us meatbags for our AI overlords... All hail Skynet.

  • @mmmuck
    @mmmuck 9 months ago

    curious if it's still worth buying Nvidia stock to hold for a decade or more?

  •  9 months ago

    what about Groq's LPUs?

  • @ATestamentToTech
    @ATestamentToTech 9 months ago +2

    I really hope AMD wins this race. With the patents filed they obviously have a roadmap in place. The next decade is going to change the world as we know it. Great video

  • @korinogaro
    @korinogaro 9 months ago +25

    I don't think this dude has any valid "future of" video. Like, where is our RISC-V revolution?

    • @pf100andahalf
      @pf100andahalf 9 months ago +1

      "Better late than never" seems to be an appropriate thing for me to say.

    • @korinogaro
      @korinogaro 9 months ago +1

      @@pf100andahalf True, but by the same virtue I can predict whatever will revolutionize whatever, and as long as it doesn't vanish it stays in a state of "but it's getting there". Maybe I should make a video about the glass substrate for CPUs that Intel is working on, build hype around it, and then forget to say that Intel predicts they need at least 10 more years.

    • @korinogaro
      @korinogaro 9 months ago +1

      BTW, take his "2024 is Intel's ALL-IN year" and "AMD should be WORRIED". Dude made short-term predictions about the CURRENT year a MONTH AGO and so far is completely in the wrong. So far Intel is trying to sell the 14900KS for sick money, and all leaks show that their CPUs this year will be kinda shit. The most revolutionary thing they have done so far is announce a change in naming scheme.

    • @nossy232323
      @nossy232323 9 months ago

      @@BlackLixt That's what she said!

    • @mattBLACKpunk
      @mattBLACKpunk 9 months ago +2

      His arm video, arguably

  • @axl1002
    @axl1002 9 months ago +1

    I was going to say "...but Jim Keller said"🤣🤣🤣

  • @Lex90909
    @Lex90909 9 months ago

    awesome video! thanks

  • @selohcin
    @selohcin 9 months ago

    Those Iranian researchers are incredible. I really hope Nvidia (or AMD?) pays them a lot of money to join their staff and integrate this technology into their products.

  • @davidlazarus67
    @davidlazarus67 9 months ago +2

    China has a huge lead in phase-change memory. It won't be available to Western countries for some time. Nvidia's valuation is based on AI, in which it lags behind China. That's a big bubble just waiting to burst.

    • @jackinthebox301
      @jackinthebox301 9 months ago +1

      Go away Chinese bot.

    • @ofon2000
      @ofon2000 9 months ago +1

      I was wondering the same thing... a two-word name with two random numbers and a pro-China comment with awkward grammar @@jackinthebox301

    • @jackinthebox301
      @jackinthebox301 9 months ago +3

      @@ofon2000 The only phase change that China has better than the US is their concrete's natural ability to phase change to rubble.

    • @BoraHorzaGobuchul
      @BoraHorzaGobuchul 9 months ago

      Please tell us more entertaining stuff, I laughed so hard...

    • @ofon2000
      @ofon2000 9 months ago

      @@jackinthebox301 Yeah, there's tons of Chinese stuff that is good value, but a lot more that seems like a good deal until you realize it breaks so fast that it's garbage value.

  • @mcmalden
    @mcmalden 9 months ago +2

    What a bunch of incoherent rambling about compute architectures, with 3D integration and different memory technologies somehow mixed in.
    The majority of power is dissipated on the GPU die; therefore you cannot just stack insulating components on top of it. The refresh power for DRAM has little to do with this, and PCM won't fix it. The Iranian paper somehow appears to envision sandwiched cooling, which is a completely different point altogether.
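    A rough scale check of the DRAM-refresh point, with purely illustrative numbers (die power, per-GB refresh power and capacity are all assumptions): refresh is a small slice of the total, so swapping DRAM for a non-volatile memory such as PCM does not by itself remove the stacking heat problem.

```python
# Illustrative arithmetic only: compare an assumed DRAM refresh power with an
# assumed compute-die power to see how small the refresh share is.

GPU_DIE_POWER_W  = 300.0   # assumed power dissipated by the compute die
REFRESH_W_PER_GB = 0.05    # assumed DRAM self-refresh power per GB
DRAM_CAPACITY_GB = 24.0    # assumed on-package memory capacity

refresh_power_w = REFRESH_W_PER_GB * DRAM_CAPACITY_GB
share_pct = 100.0 * refresh_power_w / GPU_DIE_POWER_W

print(f"Estimated refresh power: {refresh_power_w:.1f} W "
      f"(~{share_pct:.1f}% of a {GPU_DIE_POWER_W:.0f} W die)")
```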

  • @Integr8d
    @Integr8d 9 months ago +1

    Bottom heat sink is the key

  • @D.u.d.e.r
    @D.u.d.e.r 7 months ago

    Without any doubt memory needs to catch up with chip logic, as it was left far, far behind for decades, and cache-like SRAM memory is neither sufficient nor economically viable to be that next step, even though it will most likely rely on 3D stacking similar to AMD's V-Cache.
    At one moment the memristor looked like the solution and holy grail of compute memory, but it might actually be PCM. Let's see, thx for the vid!

  • @cracklingice
    @cracklingice 9 months ago +1

    Intel no longer had a fab to build Optane because they gave up their half and the other company decided to sell it.

  • @davidswanson9269
    @davidswanson9269 3 months ago

    PCM has write-endurance issues as well and will eventually burn out and fail.

  • @mikemj8204
    @mikemj8204 9 months ago

    Great job thank you.

  • @XxXnonameAsDXxX
    @XxXnonameAsDXxX 9 months ago +1

    The only future I see is an $8000 projector which is mandatory for gaming.

  • @kaisersolo76
    @kaisersolo76 9 months ago

    great stuff.

  • @Boorock70
    @Boorock70 9 months ago +2

    So, will AMD be able to reduce the insane power consumption of its GPUs at video playback?
    Last gen's 6800 (and up) RDNA 2 GPUs and the new 7000-series (RDNA 3) have serious power issues at video playback (YouTube, Netflix, VLC etc.)
    Check out the doubled video-playback consumption of the 6700 XT (20 W) vs the 7700 XT (42 W) at TechPowerUp and ComputerBase.
    40+ W is stupidly HIGH and meaningless...
    AMD needs to solve that video-playback power issue at least in the next 8000 series. (Or do they even care?)
    * Emphasis on "video playback", as most people confuse it with idle, web or gaming consumption. They are very different things.
    AMD still didn't solve video-playback consumption; the previous fix was for idle consumption only.
    PS: The RX 7900 XTX is still the record holder at 67 W, but the 7900 GRE is getting close at 62 W when watching YouTube!

  • @dosmastrify
    @dosmastrify 9 months ago

    4:25 This guy definitely just said ass ram

  • @cdurkinz
    @cdurkinz 9 months ago

    15:53 wait is this the first iteration of AI improving itself? 😂

  • @GeorgeD1
    @GeorgeD1 9 months ago +3

    Celso's voice is extra husky today. :D

    • @ADB-zf5zr
      @ADB-zf5zr 9 months ago +1

      Husky Bot.!

  • @scottpar28
    @scottpar28 9 months ago

    How about Micron? They did this.

  • @nintendobrad3946
    @nintendobrad3946 9 months ago +1

    I'd like to see this make it into the PS6.

    • @HoneyTwee
      @HoneyTwee 9 months ago

      Would be far too expensive and far too soon.
      PS6 could be as close as 2 years away.
      If we're unlikely to see this in Nvidia enterprise cards by then, we're not going to see it in a $500 console by then.
      PS6 pro or PS7 sure. If this tech scales well and isn't a dud.

  • @newerstillimproved
    @newerstillimproved 9 months ago +5

    So Optane will come back

  • @hambotech9954
    @hambotech9954 9 months ago

    Bro's voice almost blew my speakers 💀

  • @phaedrussocrates7636
    @phaedrussocrates7636 9 months ago

    Thank you

  • @lasagnadipalude8939
    @lasagnadipalude8939 9 months ago +1

    Underappreciated fact: the material was discovered by AI. We are really at the start of a science-fiction novel

    • @christophermullins7163
      @christophermullins7163 9 months ago +1

      Yes we are. Anything you want... any medication, technology, etc. The AI will learn all of humanity's capabilities and production technologies and recommend the best way forward to continue advancing everything more efficiently and economically than we would be able to without AI. I feel that being born into the beginning of this information revolution is the most striking thing about our existence. I am so fascinated by the future of technology and AI; it seems like we are living through a sci-fi novel for sure. Breakthroughs in all fields will flood in over the coming decades and we get to watch it unfold like... a sci-fi novel, for lack of better words. What a strange and fascinating life humans are living through. The sad thing is most people are oblivious to it and are not interested in the technology that will change humanity forever. I am obsessed with it.

    • @lasagnadipalude8939
      @lasagnadipalude8939 9 months ago

      @@christophermullins7163 That's totally how I feel

  • @--waffle-
    @--waffle- 9 months ago

    When is Nvidia going to launch their CPU to the general public? Will it EVER come? All I want is a Grace Hopper-like CPU & GPU in one small(ish) convenient box, a 'console'-like PC: a 5090 that I just plug into my monitor. No wires everywhere, doesn't take up half my room. Or even AMD; they already make APUs.

  • @Dmwntkp99
    @Dmwntkp99 9 months ago

    Hopefully Jensen won't buy them out if they become a threat.

  • @mr.electronx9036
    @mr.electronx9036 9 months ago

    Chips on glass will change everything

  • @igormarcos687
    @igormarcos687 9 months ago +4

    The description filled with affiliate links shows the decadence of the channel

    • @XxXnonameAsDXxX
      @XxXnonameAsDXxX 9 months ago

      Projector gate never forget

    • @TheChkgrniv
      @TheChkgrniv 9 months ago

      I would gladly watch your Channel. I am sure that you can take your time and effort and totally give it away for free.

    • @HoneyTwee
      @HoneyTwee 9 months ago

      @@XxXnonameAsDXxX What exactly is projector gate?
      He reviewed an $8000 projector to look at what the future of display technology could look like.
      Just because you can't afford it now, and he is looking at it not from a value perspective, doesn't mean there isn't value in analysing what $8000 tech looks like today, because in 5 years that could be $2000 tech. Then in another 5, that's $700 tech.
      What's the problem?
      I genuinely could be missing something scummy he did, but being sponsored to talk about an extremely expensive product that almost nobody can afford anyway isn't really scummy on its own.

  • @kozmizm
    @kozmizm 9 months ago

    We're not a hardware company, we're a software company. BS! You are both!

  • @effexon
    @effexon 9 months ago

    TLDW after 3.5 minutes: Nvidia is going to put HBM and other high-end, very fast memory into professional GPGPUs. Gaming GPUs get GDDR7, so as not to hand regular people mining GPUs that are too cheap.

  • @johnbeer4963
    @johnbeer4963 9 months ago +1

    Phase change memory.... So Optane.

  • @PointingLasersAtAircraft
    @PointingLasersAtAircraft 9 months ago

    We need through-die heat pipes.

  • @esra_erimez
    @esra_erimez 9 months ago +4

    Jim Keller is a living legend

  • @l-cornelius-dol
    @l-cornelius-dol 9 months ago

    Just can't wait until I need a $2000 GPU _and_ a $2000 AIU to play the latest titles. 😑

  • @VincentPandian-z9b
    @VincentPandian-z9b 9 months ago

    Interesting stuff. Well researched and informative.

  • @shk0014
    @shk0014 9 months ago +3

    Hello reader, your mom.

  • @Panacea_archive
    @Panacea_archive 9 months ago +3

    Never underestimate the ability of AMD to disappoint

  • @jflgaray
    @jflgaray 9 months ago

    This PCM hype again???!!! This tech has a good track record of NOT successfully winning the market.

  • @ahmedp8009
    @ahmedp8009 9 months ago

    Well Done Iran!

  • @Peter-uf4yn
    @Peter-uf4yn 9 months ago

    haha.. he said "ass ram"

  • @SP95
    @SP95 9 months ago

    Good news

  • @ThaboW11
    @ThaboW11 9 months ago

    10:21 Iran... Western media is indeed biased when it comes to reporting the 'positive' aspects of Iranian tech, while Eastern media is also complicit, for there was no elaborate mention of what I'm about to say. The recent alleged attack on an American base in Jordan by an Iranian military drone masquerading as an American one is a case in point. Apparently one of their key scientists went as far as to claim back in 2011 that their nation had 'been offered' technologies (he claimed it was by IDs) that enabled them to create a tractor beam, much like in the Star Trek/Wars films, with which they captured an American drone. There's a video on this by the YouTube channel End Time Productions.
    There's more to this place than what the news implies.

  • @esra_erimez
    @esra_erimez 9 months ago +6

    I like nachos

  •  9 months ago

    lol boom:) Just staggering!!!

  • @mattbegley1345
    @mattbegley1345 9 months ago

    The future of GPUs is SPAM... There are so many new videos every day about Nvidia that the HYPE has turned into SPAM.
    Stop spamming the board!

    • @Coreteks
      @Coreteks 9 months ago +1

      @mattbegley1345 My apologies, I'm deleting my channel right now!

  • @Zorro33313
    @Zorro33313 9 months ago

    inb4 GPUs getting L1, L2, L3... oh wait, so it's basically a CPU with no SMT now. Kind of big.LITTLE or whatever. Intel is on its way to inventing the GPU. Or the APU. Same shit. CPUs are becoming more GPU-like, GPUs are becoming more CPU-like. LMAO.

  • @gsestream
    @gsestream 9 months ago

    Focus on fixing current stuff; don't even go to new stuff until even the current hardware and software is fixed. Fully. Well, if you are in a hurry, you produce nothing. Literally. Fix it.

  • @nsf001-3
    @nsf001-3 9 months ago

    No, PCM means Pulse Code Modulation. Quit mucking up my search engine results with BS tactics like this

  • @tristankordek
    @tristankordek 9 months ago

    👍

  • @nossy232323
    @nossy232323 9 months ago

    I wonder if Coreteks was drunk when he made this video?

    • @ofon2000
      @ofon2000 9 months ago

      why are you saying that?

    • @nossy232323
      @nossy232323 9 months ago

      @@ofon2000 His voice sounds strange in this video.

  • @gummywurms226
    @gummywurms226 9 months ago

    During AMD's AI presentation last year, one of AMD's partners said that they have exceeded the capability of CUDA with regard to AI. I'm surprised that nobody picked up on that little tidbit. Without CUDA, Nvidia is nothing.

  • @nsf001-3
    @nsf001-3 9 months ago

    Can't wait for the "AI" fad to be over with. Really makes you beg for the tessellation meme again at this point

  • @DudeGamer
    @DudeGamer 9 months ago +1

    First

  • @gregandark8571
    @gregandark8571 9 months ago +2

    Unsubscribed, a lot of nonsense and wrong information.

  • @DoesNotInhale
    @DoesNotInhale 9 months ago +2

    Coreteks is such a joke on YouTube. Whatever topic he discusses, I literally go to Moore's Law is Dead or any other channel and can be guaranteed I will not only be given actually accurate information, I won't have to listen to a limey smug mouth breather make horrendously bad hot takes and predictions that never come true while he fantasizes about AMD becoming "competitive" in our lifetime. Thanks, Coreteks, for keeping me up to date with topics I know you can't handle with your Celeron-tier dysgenic grey matter.

    • @BIG_HAMZ
      @BIG_HAMZ 9 months ago +5

      Coreteks isn't a leaker; he provides a unique, speculative look at what he thinks the future of technology might look like. I enjoy his videos and don't expect them to be like MLD. I understand that he has some hot takes sometimes, but that's the point, it's about starting a discussion for us enthusiasts

    • @adiffkindofswag1148
      @adiffkindofswag1148 9 months ago +3

      You lost all credibility the moment you mentioned Moore's Law is Dumb. 🤡

    • @waynnewilliams5588
      @waynnewilliams5588 9 months ago +1

      AMD becoming "competitive" in our lifetime. lol no chance

  • @_vofy
    @_vofy 9 months ago

    Awesome video!