NVIDIA Made a CPU.. I’m Holding It. - Grace CPU/Hopper SuperChip @ Computex 2023

  • Published 25 May 2024
  • Try Pulseway FREE today, and make IT monitoring simple at: lmg.gg/LTT23
    I'm at the Gigabyte booth at Computex 2023 where they're showing off bonkers new hardware from Nvidia!
    Discuss on the forum: linustechtips.com/topic/15099...
    Immersion tank 1 A1P0-EB0 (rev. 100) :www.gigabyte.com/Enterprise/A...
    Immersion tank 2 A1O3-CC0 (rev. 100): www.gigabyte.com/Enterprise/A...
    Big AI server (h100) - G593-SD0 (rev. AAX1): www.gigabyte.com/Enterprise/G...
    ► GET MERCH: lttstore.com
    ► LTX 2023 TICKETS AVAILABLE NOW: lmg.gg/ltx23
    ► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloatplane
    ► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
    ► EQUIPMENT WE USE TO FILM LTT: lmg.gg/LTTEquipment
    ► OUR WAN PODCAST GEAR: lmg.gg/wanset
    FOLLOW US
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    Twitch: / linustech
    MUSIC CREDIT
    ---------------------------------------------------
    Intro: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro: Approaching Nirvana - Sugar High
    Video Link: • Sugar High - Approachi...
    Listen on Spotify: spoti.fi/UxWkUw
    Artist Link: / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 Intro
    0:22 Meet the Grace Super Chip!
    1:22 We got permission for this...
    3:13 ..but not for this.
    4:40 Now for the GPU!
    6:13 That's where the Interconnect comes in
    7:32 There's "old-fashioned GPUs" too
    8:35 Crazy network card
    11:00 Outro
  • Science & Technology

COMMENTS • 5K

  • @RevJR
    @RevJR 1 year ago +4876

    They don't want you to know this, but the processors at the convention are free. You can just walk up and take one.

    • @maxmustermann2370
      @maxmustermann2370 1 year ago +102

      Just the reason Nvidia squeezes the money out of gamers: now they can give away their new CPUs, since the gamers already paid for the development and research.
      Btw, HP Moonshot was a fail. I see no difference here, just a bunch of desktop GPUs crunched into a 14" laptop blade. But it's AI man!!!11!1!!, the next hype hardware for failing startups.

    • @russellzauner
      @russellzauner 1 year ago +41

      *doorbell rings*
      Linus 2 weeks from publishing this clip: Why did [ship company] just pull up in a semi?

    • @HanSolo__
      @HanSolo__ 1 year ago +20

      Also, these are our available GPUs.

    • @user-go7mc4ez1d
      @user-go7mc4ez1d 1 year ago +94

      Can confirm, I went to one of these conventions and offered $1000 for one of their processors.
      Their answer? "It's not for sale".
      Snooze you lose, Nvidia, thanks for the freebie

    • @oscarsh2909
      @oscarsh2909 1 year ago +7

      I think we all know that you made a joke like this because you thought about stealing that poor CPU.

  • @Krolitian1
    @Krolitian1 1 year ago +20353

    I love the idea of Linus just going into conventions and just unscrewing random tech he finds all over the walls without permission.

    • @faceboy1392
      @faceboy1392 1 year ago +1116

      sinus lebastion is just too dangerous at conventions

    • @MrJosephrandall
      @MrJosephrandall 1 year ago +376

      I was about to comment this myself, goes to show how much the companies trust him now

    • @CrustySofa
      @CrustySofa 1 year ago +110

      seems like something he does everywhere he goes

    • @Vysair
      @Vysair 1 year ago +79

      That's probably what he did in the old days. On-the-spot permission with no prior planning

    • @jaxtinpetersen62
      @jaxtinpetersen62 1 year ago +69

      the way he chuckles as well, when he actually gets permission. lol

  • @arshkhanlm3086
    @arshkhanlm3086 11 months ago +550

    Intel made a GPU, now NVIDIA made a CPU, what a time we live in

    • @i_Kruti
      @i_Kruti 11 months ago +2

      😂🤣😂🤣😂🤣😂🤣😂🤣

    • @crashnreset6987
      @crashnreset6987 11 months ago

      Yea,, what's next?..... Men making babies and women getting drunk and having tattoos ? ;p

    • @KanadeYagami
      @KanadeYagami 11 months ago +35

      NVIDIA should start making motherboards again to go with that new CPU. That would be a real trip. 😆

    • @i_Kruti
      @i_Kruti 11 months ago +10

      @@KanadeYagami and we won't be able to buy that motherboard due to its price.....😂🤣😂🤣😂🤣😂🤣😂🤣

    • @adamhassam
      @adamhassam 11 months ago

      meanwhile we "gamers" are still fine..

  • @billygilbert7911
    @billygilbert7911 1 year ago +432

    I'm not surprised they let you take it apart. 1.5 million views in less than 24 hours is more coverage than this would get anywhere else. I love these types of videos.

    • @pr0f3ta_yt
      @pr0f3ta_yt 1 year ago +6

      Yeah, duh. This whole video is full of shilling. Do people actually think this isn't paid for by Gigabyte or Nvidia? It might as well be a marketing video for them

    • @billygilbert7911
      @billygilbert7911 1 year ago +36

      @@pr0f3ta_yt Who cares if it's marketing. Its still cool.

    • @draketurtle4169
      @draketurtle4169 1 year ago +2

      @@pr0f3ta_yt at least through Linus we get some transparency from these big tech companies.
      We actually get to see up-and-coming tech, and Linus explains its use cases etc. to us normies.

    • @nguyenhanh9479
      @nguyenhanh9479 11 months ago

      @@pr0f3ta_yt how do you expect them to make money? YouTube pay is sh*t, everyone knows that.

    • @hjf3022
      @hjf3022 11 months ago

      @@pr0f3ta_yt they would have to disclose that fact if it was.

  • @HStorm26
    @HStorm26 1 year ago +6232

    A green cpu with a blue gpu may soon be possible.
    Scary times.

    • @justinbiggs1005
      @justinbiggs1005 1 year ago +597

      Scary times with pricing and greed. But interesting times hardware/software technology wise

    • @stephenkennedy266
      @stephenkennedy266 1 year ago +117

      What the hell kind of bizarro world are we in?

    • @DuckAutomata
      @DuckAutomata 1 year ago +492

      Highly doubt the green goblin is interested in making a cpu for peasants like us.

    • @BudgetGamingEnthusiast
      @BudgetGamingEnthusiast 1 year ago +16

      It already is
      The world is ending

    • @SWOTHDRA
      @SWOTHDRA 1 year ago

      ​​@@justinbiggs1005 scary times indeed for the PC cucked race, hold up ya'll are now gonna get cucked the other way around Nvidia proc and intel gpu??? Damn the PC peasant race keep taking L's.

  • @macias22
    @macias22 1 year ago +2481

    I love how Linus just HAS to disassemble everything he gets his hands on

    • @superintendent1152
      @superintendent1152 1 year ago +16

      thats how he rolls

    • @grumpyratt2163
      @grumpyratt2163 1 year ago +48

      Someone somewhere was holding their breath saying don't f'ing drop it Linus don't you dare drop it 😂

    • @dilbertron2
      @dilbertron2 1 year ago +4

      thats how he rick rolls

    • @jordi95
      @jordi95 1 year ago +17

      He could not use the LTT screwdriver though! What a missed opportunity!

    • @Ander01SE
      @Ander01SE 1 year ago

      Imagine if it was GN Steve...

  • @landonvincent9586
    @landonvincent9586 1 year ago +71

    Linus is literally the legend of the tech industry. imagine not only being invited to a pre-show, but also being allowed to play with the displays.

  • @thefreebooter8816
    @thefreebooter8816 1 year ago +63

    Linus holding a ~$150,000 compute module like it's a boombox will never get old

  • @Filoz
    @Filoz 1 year ago +2675

    I've never been so nervous watching Linus holding new tech.

    • @mike-tq5es
      @mike-tq5es 1 year ago +63

      the Intel fab tour was more nerve-wracking lol~ even though he wasn't holding anything like here, his hand gestures and body movement so near all those precision machines, right after saying we shouldn't touch anything, were true anxiety. (oh yea, and he actually did pat the machines anyway) XD

    • @phoenux3986
      @phoenux3986 1 year ago +43

      Something tells me the display units are probably nonfunctional if they're willing to let Linus take one off the wall and open it up with little to no supervision.

    • @Xorthis
      @Xorthis 1 year ago +32

      @@phoenux3986 Nope. I'm sure they are fully functional hardware items. I'm kinda sad he didn't drop one!
      Next week: Repairing the $150,000 server we had to buy after breaking it!

    • @gloamglozer
      @gloamglozer 1 year ago +1

      @@Xorthis haha :D I need to see it! But I guess it costs much more.

    • @IngwiePhoenix
      @IngwiePhoenix 1 year ago +5

      If you've watched him for years, you get used to it.
      Gold controller, 10k Intel CPU (which he dropped) are just among the first things that come to my mind. xD

  • @shorty685
    @shorty685 1 year ago +1987

    The confidence that some manufacturers have in Linus despite his track record is impressive.

    • @Fishmanistan
      @Fishmanistan 1 year ago +242

      That's because if Linus drops their product it's free advertising through clips for years to come lol

    • @TAMAMO-VIRUS
      @TAMAMO-VIRUS 1 year ago +141

      Also, these are display units. Meaning they either don't work at full capacity, or might not even work at all.

    • @SonicBoone56
      @SonicBoone56 1 year ago +84

      @@TAMAMO-VIRUS big companies rarely put something valuable out there in public view, sometimes it's just a dummy unit.

    • @huskycruxes7232
      @huskycruxes7232 1 year ago +15

      @@Fishmanistan I did not think of that. I have found myself watching Linus drop compilations

    • @michaelshafer2996
      @michaelshafer2996 1 year ago +6

      Walk in there with a fat wallet and/or a million-dollar business insurance policy and they'd let you do it too 🤷🏻

  • @williambrunelle9050
    @williambrunelle9050 1 year ago +33

    The little giggle of holding a server... a very expensive server and not dropping it made everyone's day! Like a kid in the toy store... Would love to see how hard it was for Jake to pull him away kicking and screaming.

  • @Flots1111
    @Flots1111 11 months ago +8

    Fantastic overview of the new NVIDIA products and a stellar breakdown on ARM procs and where they work best. I'm working through some NVIDIA certification courses and the info is all there but they provide no context other than a dizzying array of multiplier comparisons against previous gen hardware and this video brought it all into focus. Thanks so much, really helpful!

  • @ANeMzero
    @ANeMzero 1 year ago +1269

    For reference on the name: Grace Hopper was the US Navy computer scientist who wrote some of the earliest theory on machine-independent programming languages and is credited for writing the first compiler, two incredibly important steps towards modern computing.

    • @idova
      @idova 1 year ago +63

      yes, hearing 'grandma COBOL' mentioned did bring a smile to my face

    • @crimson-foxtwitch2581
      @crimson-foxtwitch2581 1 year ago +48

      yeah, NVIDIA names a lot of their architectures after important people in science history.

    • @markus1351
      @markus1351 1 year ago +30

      also ranked Rear Admiral on top of that

    • @legerdemain
      @legerdemain 1 year ago +7

      Grace Hopper has a Posse.

    • @cruzer789
      @cruzer789 1 year ago +43

      She was also the first person to coin the term 'bug' in computer sciences because she found an actual bug in one of their systems and then taped it into the maintenance log book.

  • @jamr3y
    @jamr3y 1 year ago +925

    Linus: we haven’t been on good terms with nvidia for a long time
    Also Linus: proceeds to dismantle latest nvidia tech

    • @damptoget9000
      @damptoget9000 1 year ago +70

      It's gigabyte's booth

    • @brandonmoss7976
      @brandonmoss7976 1 year ago +7

      @@damptoget9000 they are a third-party seller; this is NVIDIA tech though

    • @Gabu_
      @Gabu_ 1 year ago +34

      @@brandonmoss7976 Realistically, Nvidia can't do shit if Gigabyte wants to show off their new stuff that's already available.

    • @prestonvarner611
      @prestonvarner611 1 year ago +10

      @@brandonmoss7976 Gigabyte can let linus do what he wants... Nvidia would not stop him lets be real here.

    • @Petrixxxxxxx
      @Petrixxxxxxx 1 year ago +7

      @@brandonmoss7976 Which does not matter?
      If you bought a car from Toyota and started dismantling it, do you think Toyota could tell you to stop?

  • @gaborkeresztes1739
    @gaborkeresztes1739 11 months ago +17

    Mad respect to Gigabyte for letting this chip get into Linus' hands.

  • @topsofwow
    @topsofwow 1 year ago +1

    Those network cards are also hypervisors allowing you to divide one system up and scale the compute needed per customer.

  • @mylestheman
    @mylestheman 1 year ago +1761

    I can’t believe they trusted Linus not to drop one of these 😂

    • @pitchradio9707
      @pitchradio9707 1 year ago +38

      I think they more trust that he can compensate fairly when he does, plus it would be good advertising.

    • @TravisFabel
      @TravisFabel 1 year ago +153

      I think these are non-operational demo examples. That's why they don't care.
      You don't hang a $100,000 machine on the wall of a convention. You put up dead CPUs and mockup PSUs that are basically worthless.

    • @mattsnyder4754
      @mattsnyder4754 1 year ago +20

      I can’t believe you think that they hung functional hardware on the wall of a convention center 😂

    • @oddball_the_blue
      @oddball_the_blue 1 year ago +3

      I came here just to say the same...

    • @hw2508
      @hw2508 1 year ago +3

      Mounted to the wall? Probably just models with damaged CPUs anyways.

  • @xtr0city
    @xtr0city 1 year ago +923

    Gigabyte allowing Linus to disassemble a product mounted vertically is a level of trust I didn't know was possible, glad it worked out for them cause Jensen made it very clear how much it costs lol.

    • @hariranormal5584
      @hariranormal5584 1 year ago +21

      They got a visit to ASML... that beats everything. A visit to one of the arguably most complicated machines on earth is not an easy task.

    • @karmatraining
      @karmatraining 1 year ago +65

      That module was probably a dud or scrap part that they just used to show how it looks. Ain't nobody leaving a $100K chip hanging on a wall

    • @miroslavmilan
      @miroslavmilan 1 year ago +4

      At first I thought he was going to DELID it.

    • @yourboi1842
      @yourboi1842 1 year ago +2

      I’d imagine a sponsor spot on a Linus Tech Tips video is a few grand. But Linus making an entire video directly about your product is somehow not worth him dropping it once in a blue moon?

    • @miroslavmilan
      @miroslavmilan 1 year ago +1

      The thing is, that probability is a lot higher than once in a blue moon 😄
      Anyhow, it’s mostly just banter from his loyal fans.

  • @MrLickwidfoxxx
    @MrLickwidfoxxx 1 year ago +5

    I love how Linus held the Hopper GPU on his shoulder like Cloud's Buster Sword 😂

  • @Lampe2020
    @Lampe2020 1 year ago +2

    5:33 WHAT?!? 4TB/s?!? That's all computer data I ever have produced - in a single second?!?
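For a sense of scale, here is a quick back-of-the-envelope sketch of what moving data at 4 TB/s means (the example data sizes are made up for illustration):

```python
# Rough feel for a 4 TB/s link (decimal units, as vendors quote them).
BANDWIDTH_BYTES_PER_S = 4e12  # 4 TB/s


def transfer_seconds(num_bytes: float) -> float:
    """Time to stream num_bytes at the full 4 TB/s rate."""
    return num_bytes / BANDWIDTH_BYTES_PER_S


# A 100 GB game install streams across in 25 milliseconds...
print(f"100 GB -> {transfer_seconds(100e9) * 1000:.0f} ms")
# ...and a 10 TB drive's worth of data in 2.5 seconds.
print(f"10 TB  -> {transfer_seconds(10e12):.1f} s")
```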

  • @fatrobin72
    @fatrobin72 1 year ago +632

    Fun fact... one of the first boards Acorn (the company who created ARM) made had a broken power connection to the CPU... but as ARM chips were so low powered, it was still fine

    • @someoneelse5005
      @someoneelse5005 1 year ago +96

      I watched that... the insanity was that residual power from capacitance all around the chassis managed to power the circuits!

    • @Dragoon710
      @Dragoon710 1 year ago +7

      @@someoneelse5005 that seems very interesting how can I find this video?

    • @brandonw1604
      @brandonw1604 1 year ago +11

      Then with RISC, which ARM is built on, the CPU kept running after power was disconnected.

    • @createusername6421
      @createusername6421 1 year ago

      😮

    • @tanmaypanadi1414
      @tanmaypanadi1414 1 year ago

      ​@@Dragoon710
      You are in for a treat
      Lowspec gaming YT channel has a couple of videos covering ARM .
      ua-cam.com/video/gKYOjDz_RT8/v-deo.html
      ua-cam.com/video/nIwdhPOVOUk/v-deo.html

  • @theindependentradio
    @theindependentradio 1 year ago +15

    4:14 who else was waiting for him to drop it

  • @AlwaresHUN
    @AlwaresHUN 11 months ago

    At work we're already on the ARM architecture. Switching to it was just changing the amd64 values to arm in our infrastructure config. That's like half an hour with hundreds of microservices (+ testing, validating).
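A minimal sketch of the kind of mechanical change the comment above describes — swapping the architecture value in infrastructure config files (the manifest fragment and key names here are hypothetical):

```python
import re


def retarget_arch(config_text: str, old: str = "amd64", new: str = "arm64") -> str:
    """Replace whole-word architecture values in a config snippet."""
    return re.sub(rf"\b{re.escape(old)}\b", new, config_text)


# Hypothetical deployment manifest fragment.
manifest = """\
platform: linux/amd64
nodeSelector:
  kubernetes.io/arch: amd64
"""
print(retarget_arch(manifest))
```

In practice the containers themselves must also be rebuilt for the new architecture, which is why multi-arch images make this kind of switch so cheap.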

  • @ianmchale770
    @ianmchale770 11 months ago +12

    Hearing Linus saying “I can’t believe they let me take this off the wall” and proceeding to laugh like a small child made my day. Linus is the geeky adult version of a kid in a candy store 😅

  • @idonotcomplyrevolution
    @idonotcomplyrevolution 1 year ago +493

    What I'm beginning to notice is, "compute modules" are essentially the PC and the motherboard isn't really a motherboard anymore, its just an I/O interface for the compute module. Which if you remember is how we used to make computers 40 years ago, just with wildly more advanced tech.

    • @krozareq
      @krozareq 1 year ago +37

      Yep. Everything got shoved into an ISA slot. Keyboard controller, mouse controller, VGA card, memory, etc.

    • @bassplayer3974
      @bassplayer3974 1 year ago +7

      We're reaching the limit of shrinking, and the motherboard is a bottleneck. So you buy it all in one fab package.

    • @brodriguez11000
      @brodriguez11000 1 year ago +7

      Still do with industrial computers. e.g. VME, etc.

    • @threalismaradona9899
      @threalismaradona9899 1 year ago +13

      Cloud is just the mainframe and time sharing albeit as you said with very much more advanced tech

    • @SonicBoone56
      @SonicBoone56 1 year ago +10

      Yep. It's going back to everything being on a card and having it all connect to a backplane.

  • @dariusz.9119
    @dariusz.9119 1 year ago +623

    Imagine if Nvidia's reps didn't know Linus has a screwdriver and he just looked around, saw the reps moved away and started dismantling the showcase board before anyone could take a notice 😅😅

    • @davidhanousek9874
      @davidhanousek9874 1 year ago

      he has mounted ltt screwdriver in his butt... like always 😁

    • @thewiirocks
      @thewiirocks 1 year ago +21

      If they're not used to Linus by now, that's on them!

    • @U1TR4F0RCE
      @U1TR4F0RCE 1 year ago +9

      To be fair, NVIDIA didn't let him do that; it's Gigabyte, who works with NVIDIA, who did the presentations.

    • @Landen79Foff-wc5ej
      @Landen79Foff-wc5ej 1 year ago +3

      and then when they noticed, they'd be like "HEY!" and then Linus drops it. 😏🤣

    • @8088I
      @8088I 1 year ago +1

      Get ready for Not Super Ai🤖,
      Bot for Super Duper AI👾👧!
      Westworld is only a generation
      away🤠👧(👾). ... :-))

  • @v3nomdahunta
    @v3nomdahunta 1 year ago

    I love that you don't appear to double-check that it's off or unplugged.
    I worry when seeing you with any screwdriver near hardware.

  • @ahmadpochinki1244
    @ahmadpochinki1244 1 year ago +2

    1:05 I love how Linus just unscrews everything and the employees just ignore him like "oh, that's that guy"

  • @DarkSwordsman
    @DarkSwordsman 1 year ago +506

    It's ironic that in an era where we went from needing dozens of dedicated cards to having most things handled in software, we are now going in reverse: Hardware processing things with dedicated chips or cards.

    • @Ferretsnarf
      @Ferretsnarf 1 year ago +106

      About 10 years ago when I was in college for Electrical and Computer Engineering this is actually one of the things we were talking about. We're more or less hitting a brick wall in miniaturization and increasing the raw speed of individual components. How do we improve performance when we can't miniaturize our chips any more than we already have (At this point we're talking about transistors that are so small that you can count their width in atoms)? Well you offload tasks into different chips (TCP/IP on the network adapter and like Linus showed putting the encryption workload on the adapter). If you find there's a specific workload that you're constantly asking your general-purpose CPU to do, it might start to make sense to put that task on a specialist chip rather than putting it on your CPU.
      ASICs are on the rise and expansion cards are coming back.

    • @autohmae
      @autohmae 1 year ago +17

      Do you remember some people were saying: the end of Moore's law ? That's what is going on here...

    • @Ferretsnarf
      @Ferretsnarf 1 year ago +28

      @@autohmae Yeah, we were talking about that at the time as well. I avoided saying it because I kind of hate talking about Moore's law online - you almost always get some kind of blowback when you talk about moore's law being dead. On the consumer side of things I could almost see why you might think moore's law isn't dead. We're not really seeing smaller+faster all that much anymore. We occasionally barely scrape by into a smaller node, but you're not really getting faster and more efficient transistors out of it anymore, instead you're mostly cramming more stuff onto the die and subsequently aiming a firehose at it to hope you cool it enough to not explode.

    • @autohmae
      @autohmae 1 year ago +5

      @@Ferretsnarf do you know why consumers with some technical knowledge don't know it's dead ? Because of the marketing with CPU node process size.

    • @JustSomeDinosaurPerson
      @JustSomeDinosaurPerson 1 year ago +8

      @@Ferretsnarf This has happened before, and ASICs have always had a need over general purpose processors. Our reasons for stagnation in tech is more of a complex problem as opposed to exclusively being down to physics. As it is, quite a few clever people in fields of research have proposed numerous workarounds that are plausible in theory, but simply not testable at the moment and not feasible on a wide scale, especially without aggressive grant funding like in the past.
      If anything, I would say that we're actually quite lucky that AI has brought about a bit of a resurgence in potential general optimization and advancement.
      Finally, Moore's law was always more of a "loose observation" and never intended to be indefinite, with Moore himself saying that he was certain the trend would not hold for long and become irrelevant to the next abstract steps in advanced design.

  • @FH1X_PROJECT
    @FH1X_PROJECT 1 year ago +157

    That totally natural scan around the room before he takes the thing apart is just brilliant.

  • @1over137
    @1over137 11 months ago +3

    I have used a system with 1440 cores and 64 TB of RAM, but it was a few hundred physical commodity boxes. The latest compute stuff that is replacing the likes of what I used is insane.

    • @1oglop1
      @1oglop1 11 months ago +1

      And can it run Crysis?

  • @Hisham_HMA
    @Hisham_HMA 1 year ago +8

    What's more amazing is them letting Linus handle those parts with his steady hands

  • @billyeveryteen7328
    @billyeveryteen7328 1 year ago +615

    Hopefully Linus is still making content fifteen or twenty years later, when you can pick these up for relatively cheap to see how they perform in games.

    • @Jaroartx
      @Jaroartx 1 year ago +22

      I could imagine "let's install Batocera for running PS6 games" 😂😂😂 with ease, and for you dirty otakus, create your own living AI waifu cat girl

    • @JavierMora1112
      @JavierMora1112 1 year ago +15

      On Ali express 10 years from now

    • @mika2666
      @mika2666 1 year ago +16

      Sadly the "100" series, previously known as Tesla, does not support any video outputs and does not support any graphics APIs, it's only for compute

    • @Xorthis
      @Xorthis 1 year ago +6

      @@mika2666 That hasn't stopped people running games on them. There's a few benchmarks out there.

    • @honam1021
      @honam1021 1 year ago +6

      @@Xorthis A100 and newer would perform very poorly in games as only a small subset of the chip supports graphics workload.
      From the H100 architecture whitesheet: "Only two TPCs in both the SXM5 and PCIe H100 GPUs are graphics-capable (that is, they can run vertex, geometry, and pixel shaders)." (a full H100 has 72 TPCs)
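Working out the fraction from the whitepaper figures quoted above makes the point concrete:

```python
# Per the quoted H100 whitepaper figures: a full H100 has 72 TPCs,
# but only 2 of them can run vertex, geometry, and pixel shaders.
graphics_tpcs, total_tpcs = 2, 72
fraction = graphics_tpcs / total_tpcs
print(f"{fraction:.1%} of the chip's TPCs are graphics-capable")
# -> 2.8% of the chip's TPCs are graphics-capable
```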

  • @stratonarrow
    @stratonarrow 1 year ago +325

    For Linus to not drop whatever he’s holding immediately after saying “I don’t even wanna know what this thing costs” is pretty astounding to me.

    • @tombrauey
      @tombrauey 1 year ago +13

      I personally doubt that those are working chips. It‘s more likely that they are defective and are used for exhibition purposes.

    • @jackturner269
      @jackturner269 11 months ago

      Linus has enough money to replace what ever gets broken guaranteed

    • @Demidar
      @Demidar 11 months ago

      you really think Nvidia is going to let him disassemble working systems? one of those racks is probably 500k

    • @stratonarrow
      @stratonarrow 11 months ago +1

      @@tombrauey oh it is known. Still funny though.

    • @stratonarrow
      @stratonarrow 11 months ago

      @@Demidar I didn't say that

  • @EstebanAbarca94
    @EstebanAbarca94 1 year ago +1

    This is your thing man, glad to see you hyped up in your videos again!!! welcome back to the old Linus :)

  • @TiagoRamosVideos
    @TiagoRamosVideos 11 months ago +4

    Incredible product! 👏 And it was great to see the confidence brands put on you Linus 👌

  • @jamesswaby5676
    @jamesswaby5676 1 year ago +402

    Linus: 'I didn't ask permission for this part but nobody seems to be stopping me.'
    Security: 'That's Linus... just let him do his thing. He'll put it back together... probably.' 😂

    • @RK-252
      @RK-252 1 year ago +23

      "trust me bro". 😉

    • @kameljoe21
      @kameljoe21 1 year ago +12

      he might even drop it!

    • @whitekong316
      @whitekong316 1 year ago +5

      Put it back with half the screws…

    • @KalebG
      @KalebG 1 year ago +1

      let him cook

    • @TheZanzou
      @TheZanzou 1 year ago +8

      I mean knowing how security goes for various events they probably weren't fully informed of what he could and couldn't do, just that he was allowed to mess with the display over there.

  • @Yoshi-ux9ch
    @Yoshi-ux9ch 1 year ago +174

    It's a very brave move to allow Linus to hold anything important

    • @trapical
      @trapical 1 year ago +7

      I mean, let's be honest. This video is going to get more views than anything else from the entire rest of the weekend of this convention... It's worth the risk of him dropping something when he is the headliner.

    • @Gabu_
      @Gabu_ 1 year ago +11

      They're almost certainly dummy chips that already don't work.

    • @Bimboms
      @Bimboms 1 year ago

      I kept waiting for him to drop it.

    • @aleks138
      @aleks138 1 year ago

      considering how much there is as stake if someone steals one, they're not real chips

    • @brodriguez11000
      @brodriguez11000 1 year ago

      @@trapical The people who can afford this most likely aren't in Linus's demographic.

  • @WaffleStaffel
    @WaffleStaffel 11 months ago

    Thanks for the infomercial, with advertising!

  • @viktoraggerholm5102
    @viktoraggerholm5102 1 year ago

    00:15 shoutout to that employee who saw you were filming and didn't want to be in the way

  • @ZyzzEnjoyer
    @ZyzzEnjoyer 1 year ago +340

    I love seeing Linus having fun while disassembling all those things

    • @Fogolol
      @Fogolol 1 year ago +3

      yeah and he looks like a kid in a candy shop lmao

    • @tyson31415
      @tyson31415 1 year ago +3

      Me too, but it also makes me nervous. He and gravity don't always get along so well.

    • @ddbrosnahan
      @ddbrosnahan 1 year ago

      anything on Cerebras Andromeda AI wafer-exa-scale technology?

  • @rightwingsafetysquad9872
    @rightwingsafetysquad9872 1 year ago +398

    Nvidia has been making CPUs for over a decade now. Tegra initially for high end tablets and now for high end (~$700-$2,500) embedded systems. And they've been making Grace for AI prototyping workstations for about 5 years (if you have a spare $25,000).
    If you only have $5,000, there are a few options with the Ampere Altra if you really must have ARM.
    The power savings are very suspect, Jeff Geerling tested and found it to not be much different than Threadripper.

    • @GauravSharma-dy8xv
      @GauravSharma-dy8xv 1 year ago +27

      I seriously forgot about tegra, it was sooo long ago

    • @FilippRoos
      @FilippRoos 1 year ago +26

      and the Switch

    • @rafradeki
      @rafradeki 1 year ago +50

      ARM is no magic bullet for energy efficiency; if using ARM alone were enough to make CPUs more efficient even at high power, we would only have ARM CPUs

    • @andywolan
      @andywolan 1 year ago +10

      Is it true that ARM instructions are more energy efficient, but require more instructions to get the same task done than x86 instructions?

    • @genderender
      @genderender 1 year ago +16

      Nvidia is betting that people will use Hopper and most definitely betting that people will buy their expensive ass interconnect modules. The actual performance of these chips is probably meaningless outside of the context of "shove a shit ton of DDR5 at it", much like Apple Silicon. And plus, AMD already beat Nvidia to the punch here. MI300 is CDNA3 + Zen 4 on a single package, using their Infinity Fabric (which is literally the same technology but packaged differently)
      Epyc still exists, and is impossible to actually beat because it's much more versatile than these bespoke solutions. Until ARM can compete outside of the niche, we will keep hearing these arguments for years to come. Zen 4 is extremely efficient, as good as many ARM chips, so x86 isn't out of the game yet

  • @carrino15
    @carrino15 1 year ago +1

    When he threw the little processor network card, my heart stopped for a second.

  • @Zensiji
    @Zensiji 1 year ago +8

    @Linus, I'm so glad you decided to step down as CEO so you could focus on the magic! Every day I tune into this channel to learn something new and you guys always manage to keep it fresh and engaging! Long Live LTT!

  • @EposVox
    @EposVox 1 year ago +497

    I feel like we're looking at the future of consumer platforms in 5-10 years, just in BIG form

    • @z0phi3l
      @z0phi3l 1 year ago +38

      Like mentioned Apple is there, Microsoft is close, question is, who will do mass ARM based consumer chips first, Intel or AMD?

    • @NostraDavid2
      @NostraDavid2 1 year ago +7

      Some powerusers, maybe. I don't see windows hardcore switching to ARM. Who knows... Maybe we'll be surprised.

    • @jonforhan9196
      @jonforhan9196 1 year ago +6

      @@z0phi3l Microsoft is far from there with their Qualcomm-chip Surface laptops; maybe for a student taking notes and using a web browser, but it's basically the compute power of a phone lol

    • @wright96d
      @wright96d 1 year ago +14

      @@NostraDavid2 I think you have that switched. I’ll be surprised if 90% of consumer PCs aren’t running ARM SoCs in 10 years. And I’m talking mostly pre-builts here.

    • @z0phi3l
      @z0phi3l 1 year ago +2

      @@NostraDavid2 if this goes like I think it will, we won't have a choice. Wild guess: Intel x86 will make it to 16th gen before they kill it; same with AMD, 2-3 more x86 generations before they also switch to all ARM

  • @Morecow
    @Morecow a year ago +501

    No no no this can’t be right

  • @BrunoTorrente
    @BrunoTorrente a year ago +1

    Grace Brewster Hopper (née Murray; December 9, 1906 - January 1, 1992) was an American computer scientist, mathematician, and United States Navy rear admiral. One of the first programmers of the Harvard Mark I computer, she was a pioneer of computer programming who invented one of the first linkers. Hopper was the first to devise the theory of machine-independent programming languages, and the FLOW-MATIC programming language she created using this theory was later extended to create COBOL, an early high-level programming language still in use today.

  • @greggv8
    @greggv8 11 months ago

    Apple did an early version of offloading network processing to the network card. They made one model of NuBus Ethernet card which had a Motorola 68000 CPU on it and it used the A/ROSE system extension. Apple Realtime Operating System Extension. To find out what performance difference it made you'd have to dig through old issues of Macintosh magazines.

  • @artamereenshort6610
    @artamereenshort6610 a year ago +277

    It's a real moment of pure pleasure, to see Linus with eyes that shine, like a kid in a toy store

  • @TheSickness
    @TheSickness a year ago +1131

    Nvidia: we can connect multiple GPUs in multiple racks into one room filling huge Gpu
    Also Nvidia: SLI...yeah that don't work

    • @nhiko999
      @nhiko999 a year ago +108

      Just in case: SLI works, but it's mainly dependent on the type of work asked of the GPU, and games don't benefit much from the multiple nodes. For scientific computation, however...

    • @EmilKlingberg
      @EmilKlingberg a year ago +102

      Well, SLI is actually a great technology, but it requires high competency from game developers, and let's just say that's not too common. Look at simulation programs or modeling and raytracing software and you realize how awesome SLI setups are when running proper software.

    • @coccoborg
      @coccoborg a year ago +24

      @@EmilKlingberg On point! If you want to see a game well optimized for SLI/CF, have a look back at Crysis 2! It may not have been the best in the series, but multi-GPU support in that title was wildly effective!

    • @TheSickness
      @TheSickness a year ago +12

      @@EmilKlingberg Yeah, feels like game devs these days need 'guardrails' enforced by Sony and a one-click 'enable feature' button.
      (Thinking about the interviews on the Moore's Law Is Dead channel)
      For the mentioned use cases you can forget running that on consumer cards, as none have the connectors anymore.

    • @krozareq
      @krozareq a year ago +18

      The difficulty with SLI is that it has to raster frames in real time for a 144+ Hz display. GPU-offloaded work, such as NN machine learning, is a much easier task to parallelize.
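The contrast this thread keeps circling — independent ML batch items parallelize trivially, while rendered frames depend on each other — can be sketched in a few lines of Python (a toy illustration, not GPU code; `score` and `render_next` are hypothetical stand-ins for the real workloads):

```python
# Toy contrast between data-parallel ML work and a sequential frame chain.
from concurrent.futures import ThreadPoolExecutor

def score(item):
    """Stand-in for per-sample inference work: independent of other samples."""
    return item * item

def render_next(prev_frame):
    """Stand-in for rendering: frame N needs frame N-1's result first."""
    return prev_frame + 1

batch = list(range(8))

# Embarrassingly parallel: samples can be farmed out to any number of workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(score, batch))

# Inherently sequential: each step blocks on the previous one, so extra
# workers (or extra GPUs, in the SLI case) can't shorten the chain.
frame = 0
for _ in range(8):
    frame = render_next(frame)

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
print(frame)    # 8
```

The second loop is why multi-GPU gaming needs developer effort (splitting work within or across frames), while batch workloads scale almost for free.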

  • @alek2341
    @alek2341 11 months ago

    I've just been crushed by the CC wall of text at the start

  • @Indrid__Cold
    @Indrid__Cold 10 months ago +2

    I love your contagious passion and enthusiasm for technology. I joined the PC industry as a hardware trainer/presenter in 1991. It took me months to accept the fact that I was actually getting paid to do something I was so passionate about. Best working years of my life!

  • @VertexTuner
    @VertexTuner a year ago +99

    What I find amazing about ARM architecture CPUs is that the very first one was simulated and developed by Acorn Computers on an Acorn BBC Microcomputer (which used a MOS 6502 CPU). The original name for Advanced RISC Machine was Acorn RISC Machine. I'm happy to say as a Brit I saw the beginning of this CPU legacy, and I still own both my BBC Microcomputer Model B and my Acorn Archimedes A3010 (which featured an ARM250, the 2nd-generation ARM CPU). There was an actual ARM upgrade system for the BBC Micro, but it was far out of anyone's league/access and was mainly used by Acorn to develop the Archimedes.

    • @robinpage2730
      @robinpage2730 a year ago +10

      Fun fact: when the first ARM CPU passed its bench test, the testers went to unplug it and realized it was already unplugged. It had passed its bench test entirely on residual stored energy. It was THAT power efficient.

    • @100daysofmeh
      @100daysofmeh a year ago

      The first computer I ever used was a BBC Micro. When I got to infant school, and even high school, we had Acorns.

    • @autohmae
      @autohmae a year ago

      This is why the Raspberry Pi exists, to re-ignite the BBC Micro experience, as a teaching tool. And I can report: the Raspberry Pi is the most sold computer from the UK ever.

    • @Xorthis
      @Xorthis a year ago +1

      @@autohmae I was just reading about the shortages! Insane that it's so popular. I also just bought an RP2040 to mess about with. Incredible little devices!

    • @Xorthis
      @Xorthis a year ago +1

      Granny's Garden and Suburban Fox were two of the biggest games in my primary school :D Damn, I miss the old BBC machines. Just before my computer lab went to x86 to keep up with the newest trends, we had one RISC machine with module cards. It could run as an Acorn, as a BBC, or even as a 486 (each module card had the CPU to run that standard). I have no idea why this kind of system never made it.

  • @CMoney1401
    @CMoney1401 a year ago +179

    I don’t know what’s more impressive, the technology or Linus not dropping it!

    • @bricefleckenstein9666
      @bricefleckenstein9666 a year ago +6

      I vote for Linus not dropping it.
      9-)

    • @davidgoodnow269
      @davidgoodnow269 11 months ago +3

      "You drop it, you pay for it.
      Don't worry, we have payment plan options available."

    • @emerje0
      @emerje0 11 months ago +1

      Linus goes into these tech booths with all the abandon of a kid with boundary issues walking into a toy store unattended.

  • @wanderingbufoon
    @wanderingbufoon 11 months ago +1

    4:31 why do I suddenly feel like I was watching a car break down?

  • @ApfelJohannisbeere
    @ApfelJohannisbeere a year ago

    That was an awesome info release!

  • @jackoboy1754
    @jackoboy1754 a year ago +70

    linus: *walks in*
    also linus: *randomly starts unscrewing things from the wall*

  • @Glitch5970
    @Glitch5970 a year ago +8

    4:22 dude you sounded like Ramon Salazar from og RE4 laughing like that LMAOOO

  • @therucha
    @therucha a year ago

    You holding the chip like that gave me flashbacks to when you accidentally dropped a card a few years ago x'D

  • @jpdj2715
    @jpdj2715 11 months ago +1

    ARM - "Acorn RISC Machine" - first used in 1983. The ARM company never made processors/chips themselves, but designed them in specialised CAD systems. The CAD logical design file was then converted into a physical design that could be "printed" (my term) by the "foundry" (industry jargon). Such a logical design actually facilitates simulation in software of how the processor will work. The first physical batch of ARM came back to the ARM company and they had their physical test motherboards. Set the mobo up, plug the CPU in, run tests. Overnight, one of the engineers wakes up and becomes aware there was a connection or configuration issue in the power lines and the test should have failed. It turned out the processor needs so little power that it had run off the power leaked into it from the I/O presented to the processor. That's why almost all CPUs in smartphones are derived from that first ARM, and why Apple derived their current generation of "proprietary" Apple chips from ARM too.

  • @raymondm.3748
    @raymondm.3748 a year ago +297

    This is nothing short of insane, the fact that there is so much processing power with less power means that we will have much higher speeds throughout our internet!

    • @OmniKoneko
      @OmniKoneko a year ago +19

      Not only that but it will decrease the heat generated from it so it provides more cushions for the coolers

    • @KalebG
      @KalebG a year ago +6

      also cheaper hosting!

    • @JelvinCS
      @JelvinCS a year ago +36

      This can't happen. What will I blame my awful Counter Strike performance on?

    • @SlyNine
      @SlyNine a year ago +9

      Just lower the clock speed on your cpu and you'll have much better performance per watt.
      Our home chips run way beyond the efficiency curve
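The "efficiency curve" point can be illustrated with the textbook dynamic-power relation P ≈ C·V²·f, assuming (hypothetically) that voltage must rise roughly linearly with clock near the top of the range. The constants below are made up for illustration, not measured from any chip:

```python
# Simplified model: power grows ~f^3 once voltage must rise with frequency,
# while performance grows only ~f, so perf-per-watt improves at lower clocks.

def perf_per_watt(freq_ghz, capacitance=1.0):
    voltage = 0.2 * freq_ghz + 0.4               # assumed linear V/f curve
    power = capacitance * voltage**2 * freq_ghz  # dynamic power ~ C * V^2 * f
    performance = freq_ghz                       # assume performance ~ clock
    return performance / power

stock = perf_per_watt(5.0)  # clocked "way beyond the efficiency curve"
eco = perf_per_watt(3.5)    # same chip, lower clock
print(eco > stock)  # True
```

Under this model, dropping from 5.0 GHz to 3.5 GHz loses 30% of the performance but far more of the power, which is the trade-off server parts like Grace are tuned around.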

    • @phantomedits3716
      @phantomedits3716 a year ago

      This won't really make your internet faster. But, there's a case to be made that it might, in a roundabout way, make your websites load faster because the website is running on this hardware.

  • @chriskaprys
    @chriskaprys a year ago +75

    Wild to see just how far-wide-deep the subscription model has reached. If the contemporary fiscal landscape were a chess board, the pawn could only move to a square that it's rented from the opposing king.

    • @autohmae
      @autohmae a year ago +2

      IBM has been doing this for decades, so no surprise

  • @Tyuwhebsg
    @Tyuwhebsg a year ago

    Great video, thanks for sharing

  • @FunkyPants3D
    @FunkyPants3D 11 months ago

    Just waiting to see one of the clips of the next installment of Linus Drop Tips in this video

  • @miyaayazaki4273
    @miyaayazaki4273 a year ago +44

    Can we all take a second and appreciate how casually Linus holds that GPU on his shoulder? ( 5:12 )

  • @MaxXxoUh
    @MaxXxoUh a year ago +155

    That Grace Hopper is a freaking piece of art. It makes you want to code an entire OS and Game just to test it. Just imagine what a crazy project that would be.

    • @TheCHEATER900
      @TheCHEATER900 a year ago +10

      I love the homage to Grace Hopper, btw. Excellent naming

    • @Allan_Stone
      @Allan_Stone a year ago +4

      How well can Rollercoaster Tycoon 2 run on this thing if natively translated, that's what I'm wondering

    • @chrishousby2685
      @chrishousby2685 a year ago +4

      ​@@TheCHEATER900 imagine going back and telling her how big a transistor will become. The ones she developed software languages with were vacuum tubes several inches long.

    • @brodriguez11000
      @brodriguez11000 a year ago

      We know what future Crysis developers will be using.

    • @puppergump4117
      @puppergump4117 a year ago +2

      @@chrishousby2685 I imagine that anyone working with computers understands how quickly they will improve. The stuff shown in this video may be made for personal computers in 20 years. Just as 20 years ago, personal computers had millions of times less space and compute power.

  • @UrMomExpressed
    @UrMomExpressed 10 months ago

    "aww shit, here comes linus" -7:14 the arm following Linus around outside the camera cleaning up after him

  • @KingGJT
    @KingGJT a year ago +1

    It is great to see you so happy Linus!

  • @trapical
    @trapical a year ago +242

    This hardware is utterly and completely insane. If you can comprehend the slightest bit of the numbers behind this, it's just madness.

    • @MarkOakleyComics
      @MarkOakleyComics a year ago +37

      Those AI art programs which can produce 30 photo-real variations of, "A mountain of cookies" in under a second strongly suggests that we're living the last generation before everybody is born in pods and never uses their eyes. I'm legit alarmed by these compulsive engineers who know deep down that they should put the brakes on, but just can't stop themselves.

    • @rudisimo
      @rudisimo a year ago +39

      @@MarkOakleyComics You should make your tinfoil hat tighter; perhaps that will help.

    • @Leanzazzy
      @Leanzazzy a year ago

      @@MarkOakleyComics Only idiots think AI will overthrow the world.
      If you actually understood what AI is and how it works, you wouldn't think that. It's not some magical sentient being. It's literally just mathematical models and equations used to predict future outcomes based on input datasets.
      Datasets which, need I remind you, have to come from living, active, intelligent humans. If there aren't humans producing new, creative, informative data, AI would be useless.
      AI is a good thing. It is simply a tool to help us simplify our work and reach our goals. It can, and hopefully will, be used to ease and remove the burden of existence from mankind, so we can truly be free to do what we want and not struggle just to survive.

    • @Leanzazzy
      @Leanzazzy a year ago

      It's insane only if you compare it to consumer-level hardware and software.
      Remember, governments all over the world have and maintain far higher tech than the public can even dream of. They secretly use this tech for military, scientific, and usually espionage purposes.
      We get only the bottom of the barrel. Most of the tech we use today was once a government secret.
      The Internet itself started as a US military defence and research project.

    • @MarkOakleyComics
      @MarkOakleyComics a year ago +15

      @@rudisimo Right. Because there aren't any examples of technology getting ahead of our ability to adapt without catastrophic results. I can think of a couple items of note just from the last few years.
      Meanwhile.., Neuralink is entering human trials.

  • @Thohean
    @Thohean a year ago +309

    I think the most surprising thing to me is that Gigabyte has enterprise class hardware.

    • @konnorj6442
      @konnorj6442 a year ago +29

      Ah but notice they did NOT show you the GB power supply?
      Lmfao

    • @ThyXsphyre
      @ThyXsphyre a year ago +1

      Please don't say that about Gigabyte

    • @georgevel
      @georgevel a year ago +1

      My whole system is aorus bro

    • @Tehkezah
      @Tehkezah a year ago +14

      @@georgevel and nothing of it is enterprise class hardware

    • @georgevel
      @georgevel a year ago +1

      @@Tehkezah I said that because people complain about their PSUs, and I want to note that mine have had no problems.

  • @bobd7384
    @bobd7384 11 months ago

    I worked with that equipment 10 years ago. Those are huge bricks.

  • @donner7708
    @donner7708 11 months ago

    I love the guy who just jumped out of the shot near the start of the video

  • @ardentdfender4116
    @ardentdfender4116 a year ago +19

    My heart definitely skipped a beat at 8:34 when someone threw the Network Card. That would have been a hell of a drop.

    • @tormodhag6824
      @tormodhag6824 a year ago +3

      Like that thing can cost as much as a car

    • @Vortex001_MLG
      @Vortex001_MLG 5 months ago

      @@tormodhag6824 It probably does 😮

  • @viken3368
    @viken3368 a year ago +5

    8:33 throwing prototype/showcase tech to Linus 'Droptips' Sebastian is a very bold move

  • @nwk2VGtxbs26_eiXlo2wnQ
    @nwk2VGtxbs26_eiXlo2wnQ a year ago

    The quality of these videos keeps getting better; it's unreal.

  • @elamonty
    @elamonty 11 months ago

    The guy in the hard hat jumping out of shot real quick cracked me up. Lol

  • @bricoschmoo1897
    @bricoschmoo1897 a year ago +15

    0:16 I love how the worker jumps away as soon as he realizes he's going to inadvertently get between Linus and the camera!

    • @notjux
      @notjux a year ago +1

      He tried so hard to bail but only drew more attention to himself. May he rest in peace. o7

  • @dannymitchell6131
    @dannymitchell6131 a year ago +54

    I just love how excited Linus always is for new tech. Never change bro.

  • @takomayowasabi6491
    @takomayowasabi6491 2 months ago

    I didn't know what they actually look like, but looking at the size, now I kind of understand the price.

  • @KX36
    @KX36 11 months ago +1

    Stock trading software runs on Mellanox network cards, as the latency from CPU to network card could cost a huge amount in missed trades.
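A back-of-envelope sketch of why NIC offload matters in that niche (all numbers below are illustrative assumptions, not figures from the comment or from any vendor):

```python
# If the kernel network stack adds ~15 us per order and an offloading /
# kernel-bypass NIC adds ~2 us (assumed figures), the bypass path reaches
# the exchange ~13 us earlier on every single order -- often the whole race.

KERNEL_STACK_US = 15.0  # assumed per-order latency via the OS network stack
BYPASS_NIC_US = 2.0     # assumed per-order latency via the smart NIC

advantage_us = KERNEL_STACK_US - BYPASS_NIC_US
orders_per_day = 1_000_000
saved_seconds = advantage_us * orders_per_day / 1e6  # cumulative head start

print(advantage_us)   # 13.0
print(saved_seconds)  # 13.0
```

The per-order microseconds are what win or lose trades; the cumulative figure just shows how much time the offload shaves over a busy day.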

  • @Akizurius
    @Akizurius a year ago +38

    I can imagine Computex chief security officer watching this and thinking to himself "Why didn't anyone stop him? Well... At least he didn't drop anything."

    • @andreastyrberg7556
      @andreastyrberg7556 a year ago +2

      Security officer, maybe. The PR lead says "lovely", and maybe "sad he didn't drop it for the meme videos". It is worth a lot in advertising.

  • @dragon2knight
    @dragon2knight a year ago +378

    This is NVidia's future, and they know it. Good, now we can let folks like Intel and AMD shine a bit more, especially when they get their drivers ironed out.

    • @maxjames00077
      @maxjames00077 a year ago +24

      You think Intel and AMD can get some GPU market share in HPC?

    • @FlaMan407
      @FlaMan407 a year ago +43

      AMD drivers are really nice. Meanwhile, Intel drivers struggle to play old games.

    • @Mr.Morden
      @Mr.Morden a year ago +8

      As long as Nvidia continues to fail at purchasing a CPU vendor.

    • @mcslender2965
      @mcslender2965 a year ago +11

      I'm sure Nvidia will be sad about that as they control whatever runs the AI customer support you talk to, the AI that powers your online services, the servers that render your movies...

    • @maxjames00077
      @maxjames00077 a year ago +13

      @@FlaMan407 My Arc A770's drivers are amazing after the updates.. AMD has gone nowhere in the last 10 years 😂

  • @davidkelly132
    @davidkelly132 a year ago +3

    When I saw Linus start to unscrew the CPU I thought "wait.. what are you doing", then I saw him hold it and I was full of fear

  • @Dick_Valparaiso
    @Dick_Valparaiso a year ago +4

    Nice to see Nvidia making such cpu strides. Hopefully, these improvements bode well for the next Nintendo Switch (should Nintendo actually opt for a semi-modern chipset).
    Btw, I know *this* cpu won't be in the next Switch. I'm talking about general improvements that could trickle down to gaming.

  • @Posh_Quack
    @Posh_Quack a year ago +9

    1:27 DON'T DROP IT

  • @BenjaminWagener
    @BenjaminWagener a year ago +35

    The way NVIDIA focuses on cloud and AI and so on, and makes local gaming more and more expensive, I fear local gaming will get rarer and rarer. They want to nudge us to use GeForce Now instead, because it's more efficient for them to share the performance of their servers than to sell each of us a GPU.

    • @SWOTHDRA
      @SWOTHDRA a year ago +9

      True, that's the future. That's also the reason gaming companies want to go always-online, games as a service. With those games you can easily do the transition from local to cloud without the consumer knowing, and once you're locked in you're going to pay for renting the software and hardware.

    • @brandonmoss7976
      @brandonmoss7976 a year ago

      Pretty soon your GPUs are going to come with their own custom CPUs 😄 along with a few petabytes of storage for the AI data, so every time you play a game, the AI will be smarter every single time, customized for every single game and every single play style for every single player 😳

    • @ryanw7196
      @ryanw7196 a year ago

      Honestly, with the percentage of their income that is now coming from AI, I don't really see them giving a shit about GeForce anything; they could be completely out of the commercial hardware space in 10 years. I imagine they may already have the RTX 5000 series in the wings, and possibly even the 6000 series... But after that? If everything else goes according to plan, then Nvidia won't care much about consumer cash anymore.

    • @Wobble2007
      @Wobble2007 a year ago +1

      It will never be viable until the majority of the internet infrastructure is pure fibre; copper is just way too high-latency (laggy) for gaming remotely. Even 1 Gb fibre is borderline; in reality, 10 Gb full-fat fibre is the minimum for a good gaming experience over remote connections. Even current HDMI standards struggle to carry enough bandwidth to keep up with modern video games, so even with 10 Gb fibre a heavy compression technique will need to be employed. I wouldn't ever want to use it for gaming personally.
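The bandwidth half of that claim is easy to sanity-check with arithmetic: an uncompressed video stream vastly exceeds a typical home link, which is why cloud gaming leans on heavy compression (24 bits/pixel below assumes plain 8-bit RGB with no chroma subsampling):

```python
# Raw bitrate of an uncompressed video stream, in gigabits per second.
def uncompressed_gbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e9

raw_4k60 = uncompressed_gbps(3840, 2160, 24, 60)
raw_1080p60 = uncompressed_gbps(1920, 1080, 24, 60)

print(round(raw_4k60, 1))     # 11.9 -- dwarfs a 1 Gb/s home connection
print(round(raw_1080p60, 1))  # 3.0
```

So even 1080p60 can't fit a 1 Gb/s link uncompressed; whether the compressed stream is acceptable is the latency-and-artifacts argument the next reply picks up.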

    • @Gabu_
      @Gabu_ a year ago

      @@Wobble2007 What are you smoking? You can ALREADY play remotely at fairly decent latency with a regular 100 Mb connection. Barely anybody except competitive e-sports professionals cares whether you have a latency of 50 ms or 10 ms.

  • @dannydonohue2577
    @dannydonohue2577 11 months ago

    Very cool!! Went there and saw it too... Does it come with the Gigabyte special backdoor chip to access all your data?

  • @AlexDesplanque
    @AlexDesplanque a month ago

    Playing with my first gh200 now :D

  • @sjgonline
    @sjgonline a year ago +117

    The smile on Linus' face is like an '80s kid going to a toy store... you know you won't leave the place with anything, but just being surrounded by the toys is a joy

  • @LPcrazy_88
    @LPcrazy_88 a year ago +106

    This might be the first video where Linus hasn't damaged or at least recklessly handled expensive electronics. So it IS possible for him to not break stuff!

    • @autumntechdruid
      @autumntechdruid a year ago +6

      Must be a robot Linus.....

    • @lonelyPorterCH
      @lonelyPorterCH a year ago +5

      What about the network card Jake threw?^^
      I guess that was not Linus

    • @LPcrazy_88
      @LPcrazy_88 a year ago +3

      @@lonelyPorterCH The more surprising part about that was Linus actually yelling NO! I would have expected him to just carry on like it's normal to throw around electronics like that.

    • @DaftFader
      @DaftFader a year ago +2

      TBF, we haven't seen it powered on since he touched it ... ! xD

    • @TheOnlyAndreySotnikov
      @TheOnlyAndreySotnikov a year ago

      He probably did damage something, it was just edited out to avoid liability.

  • @diskgrind3410
    @diskgrind3410 11 months ago

    Wow, things are getting faster and cooler!

  • @rayyu4511
    @rayyu4511 a year ago +1

    Linus is the only guest I saw at Computex who was allowed to mess with that amazing server like this.

  • @yoloswaggins2161
    @yoloswaggins2161 a year ago +3

    0:16 That poor guy in orange jumping out of the shot as fast as he could

  • @J0URDAIN
    @J0URDAIN a year ago +22

    Crazy to think that at some point future generations will see this as ancient technology, just as we see OUR ancestors' tech (tools).

  • @MorrisonManor
    @MorrisonManor a year ago

    Grace Hopper? Awesome nod!

  • @Randomlumberjack
    @Randomlumberjack a year ago

    4:40 "clear backblast!"

  • @Woodie-xq1ew
    @Woodie-xq1ew a year ago +90

    What you don't see, just below the camera shot, is an Nvidia employee with their hands out, ready to catch that module if Linus drops it 😂

  • @egillis214
    @egillis214 a year ago +63

    ARM is a RISC instruction set. The Hewlett-Packard PA-RISC was way ahead of its time. I worked on the first HP 3000 MPE and HP 9000 HP-UX systems. Some of the desktop workstations, like the tiny 715 systems, were incredible back then.

    • @konnorj6442
      @konnorj6442 a year ago +1

      Ah, the toys of my youth! I worked with some of that wayyyy back when, along with many other goodies that all of the winbloze babies wouldn't have any clue what it is now, never mind how to use it. And due to the millennials' and beyond's idiotic, overly entitled, arrogant BS, they don't even appreciate what was gained to make the current toys even possible via our hard work, long before they were a wet spot on cheap hotel sheets.

    • @danielwoods7325
      @danielwoods7325 a year ago +1

      Beat me to this lol good comment

    • @kellecetraro4807
      @kellecetraro4807 a year ago +2

      You beat me to this comment, but I made it anyway 😂
      I'm sort of scratching my head as to why he's (Linus??) acting like it's a new thing... Just for novelty I still have a SUN E450 still running and productive 😂

    • @archgirl
      @archgirl a year ago +1

      @@konnorj6442 Pardon?

    • @1Sonicsky1
      @1Sonicsky1 a year ago +6

      @@konnorj6442 First of all, it is this exact toxicity that completely stagnates any real intelligence... I would rather be stuck fixing Windows 3.1 and Vista installations for the rest of eternity than ever hold a mindset akin to yours. Every architecture, operating system, and programming language has its strengths and weaknesses, and it is our responsibility as technicians to learn and understand each one so that we can always provide the best for whatever our client is trying to achieve. I have met both old and young people who are kinder, more intelligent, and exhibit far more competence than you have shown here.

  • @pinchettaloon609
    @pinchettaloon609 a year ago

    4:20 - the camera man at that moment "am I a criminal now?"

  • @archgaden
    @archgaden 11 months ago

    Seeing such insane hardware posturing to take on AI really makes me appreciate the wetware we all own. ChatGPT's core is a roughly 800 GB blob of static digital neurons that can't change or reorganize after training. That's just for an LLM that is still missing a lot of the functionality we have when it comes to language processing... and on top of that, we're doing so many other things, all while consuming a small fraction of the power, re-training in real time, and being entirely portable. To fully map the input/output behaviour of a human neuron takes either very expensive calculus or a blob of transformers 5 to 7 layers deep. We've still got a couple of orders of magnitude to go on hardware to match what a brain can do, but it's not such an impossible goal now.
    As an aside, I find it funny that LLMs tend to be so much bigger than visual AIs like Stable Diffusion. That old 'a picture is worth a thousand words' seems to be panning out more like 'a word is worth a thousand pictures'. Of course, Stable Diffusion still has a lot of obvious gaps in training, while ChatGPT's flaws are more subtle, so it's not really a good comparison yet.
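The size of such a weight blob is just parameter count times bytes per parameter; a quick sanity check of the comment's "hundreds of gigabytes" ballpark (the 175-billion-parameter count and precisions below are illustrative assumptions, not disclosed figures for any model):

```python
# Static weight footprint: parameters * bytes per parameter.
def weights_gb(n_params, bytes_per_param):
    return n_params * bytes_per_param / 1e9

print(weights_gb(175e9, 4))  # 700.0 -> fp32 weights for a 175B-param model
print(weights_gb(175e9, 2))  # 350.0 -> fp16 halves the footprint
```

This is also why halved-precision formats (fp16, int8) matter so much for serving large models: the blob has to fit in the accelerators' combined memory.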
    As an aside, I find it funny that LLMs tend to be so much bigger than visual AIs like StableDiffusion. That old 'a picture is worth a thousand words' seems to be panning out more like 'a word is worth a thousand pictures'. Of course, StableDiffusion still have a lot of obvious gaps in training, while ChatGPT's flaws are more subtle, so it's not really a good comparison yet.