How Nvidia Won AI

  • Published Jun 6, 2024
  • When we last left Nvidia, the company had emerged victorious in the brutal graphics card Battle Royale throughout the 1990s.
    Very impressive. But as the company entered the 2000s, they embarked on a journey to do more. Moving towards an entirely new kind of microprocessor - and the multi-billion dollar market it would unlock.
    In this video, we are going to look at how Nvidia turned the humble graphics card into a platform that dominates one of tech’s most important fields: Artificial Intelligence.
    Links:
    - The Asianometry Newsletter: asianometry.com
    - Patreon: / asianometry
    - The Podcast: anchor.fm/asianometry
    - Twitter: / asianometry

COMMENTS • 600

  • @Asianometry
    @Asianometry  2 роки тому +110

    What would you like to see on the channel?

    • @bjliuyunli
      @bjliuyunli 2 роки тому +12

      Thanks a lot for the video! Would be great to see a video about power semis like IGBTs and Silicon carbide.

    • @2drealms196
      @2drealms196 2 роки тому +8

      You've covered Nvidia, could you cover Nuvia and Nivea?

    • @masternobody1896
      @masternobody1896 2 роки тому +1

      yes, more gaming videos are what I like

    • @fulcrumR6
      @fulcrumR6 2 роки тому +9

      Can't wait to see. You should do a video on the companies behind modern day tanks, no matter the country. The history on many companies (General Motors, Etc) and them taking the time to design the tanks and make them functional is very interesting. I'd love to see a video on that.

    • @screwcollege8474
      @screwcollege8474 2 роки тому +6

      Marvell technology pls

  • @okemeko
    @okemeko 2 роки тому +514

    From what a professor at my university told me, they didn't only "work closely" with researchers. They straight-up gifted cards to research centers in some cases. This way, not only did Nvidia provide a good platform, but all the software written was naturally written for CUDA

    • @eumim8020
      @eumim8020 2 роки тому +43

      My master's thesis supervisor has 5 professors each submitting a request for a GPU. NVIDIA covers its monopolistic, anticompetitive core with a whole system for helping public university systems. If I'm lucky, my final DL model will be trained on his little office server with those GPUs

    • @slopedarmor
      @slopedarmor 2 роки тому +16

      I think I remember that Nvidia gifted a GTX 980 Ti to the developers of Kingdom Come: Deliverance (a Kickstarter computer game), supposedly to help them with development? haha

    • @monad_tcp
      @monad_tcp 2 роки тому +24

      Ah that old trick from Microsoft of gifting goodies. Like giving away office licenses or the entire Internet Explorer for free if you bought Windows.

    • @steveunderwood3683
      @steveunderwood3683 2 роки тому +26

      If you don't provide some help to early adopters, how are you ever going to build a thriving environment? Providing cards, software, training and support to academics was a good thing. The sleazy stuff they did was to cook studies to make the benefits of a GPU look much greater than they really were, in applications where the benefits of GPU were marginal at best. GPGPU is great for some things, and weak for others. The early nVidia-sponsored papers were so heavily rigged, it took some serious analysis to figure out where GPGPU was a real boon, and how big that boon might be.

    • @monad_tcp
      @monad_tcp 2 роки тому +11

      @@steveunderwood3683 Yeah, it's the environment. The benefits stated in those papers were rigged in nVidia's favor, but they would have been feasible at a computational level with an open environment for study.
      But the industry is too locked into CUDA/Intel x86.
      At least now things are going to change a bit, as if we could say ARM is different...

  • @e2rqey
    @e2rqey 2 роки тому +181

    Nvidia does a really good job of identifying new, burgeoning industries where their products could be leveraged, then integrating themselves into the industry so early on that, as the industry matures, Nvidia's products become essential to the functioning of that industry. I remember visiting a certain self-driving car company in California about 4 years ago and seeing a literal wall of Nvidia 1080 Ti GPUs. They had at least a couple hundred of them. Apparently they had all been gifted to them by Nvidia.
    I've heard Nvidia will also send their engineers out to work with companies and help them optimize their software or whatever they are doing, to get the maximum performance out of the GPU for whatever purpose they are using them for.

    • @zerbah
      @zerbah 2 роки тому +21

      Nvidia has great support for AI and game development. When I was talking with a small indie game studio about their game, they confirmed that Nvidia sent them two top-of-the-line Founders cards for development free of charge and offered to optimize drivers for their game when the final build is ready. Meanwhile, the AMD cards were crashing and black-screening because of buggy drivers, making it a complete pain to test the development version of the game on them...

    • @aamirsiddiqui9957
      @aamirsiddiqui9957 2 роки тому

      @@zerbah How long will AMD take to be as good as Nvidia

    • @cyranova9627
      @cyranova9627 2 роки тому +5

      I remember one case where a game developer actually got invited to dinner with Nvidia people to talk about developing their game on Nvidia GPUs, not AMD ones.
      All they do is sweet-talk game developers

    • @tweedy4sg
      @tweedy4sg 2 роки тому +3

      True, they do... but it's not exactly successful every time. Remember how they joined the mobile AP (application processor) market with the Tegra series, which now seems to have fizzled out into oblivion.

    • @graphicsRat
      @graphicsRat Рік тому +4

      @@tweedy4sg Yes, not every bet will win. In fact most bets will fail. But the 1 out of 5 that succeeds will more than pay for the failures. That's how investments work. Venture capitalists, for example, know this too well. Not all their investments will pay off. But every now and then they invest in tomorrow's Google-scale company, and that's where they make their money.

  • @0MoTheG
    @0MoTheG 2 роки тому +30

    CUDA was originally not targeted at machine learning or deep neural networks, but at molecular dynamics, fluid dynamics, financial Monte Carlo, financial pattern search, MRI reconstruction, deconvolution, and very large systems of linear equations in general.
    A.I. is a recent addition.

    • @TheDarkToes
      @TheDarkToes Рік тому +3

      Back in the day, we would have 64 cuda cores and we thought we were hot shit hitting 800mhz. Look how far it's come.

    • @christopherpearson8637
      @christopherpearson8637 Рік тому +3

      You stumble into the right choices sometimes.
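
For anyone curious what those early GPGPU workloads actually looked like in code, here is a minimal CUDA sketch of a SAXPY kernel (y = a*x + y), the kind of data-parallel building block that large linear-algebra and simulation codes are made of. The kernel name, sizes, and use of unified memory are illustrative assumptions, not anything from the video or the comment above; real HPC code would add error checking or simply call cuBLAS. It compiles with nvcc.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each thread computes one element: the "same instruction, many data points"
// pattern that made GPUs attractive for large linear systems.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                     // ~1M elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the sketch short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);  // 4096 blocks of 256 threads
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);               // expect 5.0
    cudaFree(x); cudaFree(y);
    return 0;
}
```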

  • @scottfranco1962
    @scottfranco1962 2 роки тому +484

    Nvidia is a real success story. The only blemish (as illustrated by Linus Torvalds famously giving them the middle finger) is their completely proprietary stance on development. Imagine if Microsoft had arranged things so that only their C/C# compilers could be used to develop programs for Windows. CUDA is a closed shop, as are the graphics drivers for Nvidia's cards.

    • @janlanik2660
      @janlanik2660 2 роки тому +6

      But msvc can be used only on Windows.

    • @theairaccumulator7144
      @theairaccumulator7144 2 роки тому +23

      @@janlanik2660 imagine using windows, much less msvc

    • @scottfranco1962
      @scottfranco1962 2 роки тому +34

      @@janlanik2660 I think you misread what I said. Microsoft (or any OS maker, Apple included) could have easily made it so that only their compilers could be used on their systems, no GCC, no independent developers. That is what Nvidia has done.

    • @janlanik2660
      @janlanik2660 2 роки тому +6

      @@scottfranco1962 ok sorry for the misinterpretation. But even so, I have only used CUDA, which is indeed Nvidia only, but I believe that there are some cross platform solutions, e.g. OpenCL, so you don’t have to use proprietary tools to run something on Nvidia, or am I wrong?

    • @Ethan_Simon
      @Ethan_Simon 2 роки тому +81

      @New Moon You don't need something to be proprietary to pay your engineers to work on it.

  • @mimimimeow
    @mimimimeow 2 роки тому +69

    I think it's worth mentioning that a lot of recent advances in GPU computing (Turing, Ampere, RDNA, mesh shaders, DX12U) can be traced to the PlayStation 2's programmable VU0+VU1 architecture and the PlayStation 3's Cell SPUs. Researchers did crazy stuff with these, like real-time ray tracing, distributed supercomputing for disease mechanism research, and the USAF's space monitoring. The PS3 F@H program reached 8 petaflops at one point!
    Sony and Toshiba would've been like Nvidia today if they had provided proper dev support to make use of these chips' capabilities and continued developing them, rather than just throwing the chips at game devs and saying "deal with it". I feel like Sony concentrated too much on selling gaming systems and didn't realize what monsters they had actually created. Nvidia won by actually providing a good dev ecosystem with CUDA.

    • @dhargarten
      @dhargarten Рік тому +2

      Didn't Sony at one point encourage and support using PlayStations for science computing, only to later block it completely? With the PS4 if I recall correctly?

    • @FloStyle_
      @FloStyle_ Рік тому +15

      @@dhargarten It was the PS3 and running Linux natively on the console. Later that led to exploits and hacks of the hardware, and Sony closed the ecosystem really fast. That caused a lawsuit that took years, well into the PS4 lifespan, to conclude.

    • @Special1122
      @Special1122 11 місяців тому

      ​@@FloStyle_geohot?

  • @zombielinkinpark
    @zombielinkinpark 2 роки тому +53

    Even though both Google and Ali Cloud developed their own NPUs for AI acceleration, they are still buying large quantities of Nvidia Delta HGX GPUs as their AI development platform. Programming with CUDA is far easier than with their own proprietary hardware and SDKs. Nvidia really put a lot of effort into the CUDA SDK and made it the industry standard.

  • @yadavdhakal2044
    @yadavdhakal2044 Рік тому +30

    Nvidia didn't invent the graphics pipeline. It was invented by Silicon Graphics, or SGI. SGI developed the OpenGL language as far back as 1992. They mainly targeted the cinema and scientific visualization markets. They manufactured entire workstations with their own OS (IRIX) and other specialized servers.
    What Nvidia did was target the personal entertainment market. This made Nvidia competitive because of the decreased overall unit cost. Later, OSes such as Linux were able to run these GPUs in clusters, and here too SGI lost. SGI could easily have been like Nvidia if they had been on the right track.
    SGI is now reduced to a conference known as SIGGRAPH, and is mainly a research-based peer program. And it still contributes to computer graphics, especially through the OpenGL and Vulkan API specifications!

    • @lookoutforchris
      @lookoutforchris 10 місяців тому +2

      The original GeForce card was so groundbreaking they were sued by Silicon Graphics for copying their technology. SGI won and nVidia paid royalties to them. Everything nVidia had came from SGI 😂

  • @deusexaethera
    @deusexaethera 2 роки тому +295

    Ahh, the time-honored winning formula:
    1) Make a good product.
    2) Get it to market quickly.
    3) Don't crush people who tinker with it and find new uses for it.

    • @heyhoe168
      @heyhoe168 2 роки тому +44

      Nvidia doesn't really follow (3), but it has a very strong (2).

    • @cubertmiso
      @cubertmiso Рік тому

      @@heyhoe168 agree on that comment. 3) corner the market 4) raise prices

    • @peterweller8583
      @peterweller8583 Рік тому

      @@heyhoe168 3: That's too bad, because that is where the most honey comes from.

    • @shmehfleh3115
      @shmehfleh3115 Рік тому +4

      @@heyhoe168 Neither does Apple, unfortunately.

    • @locinolacolino1302
      @locinolacolino1302 Рік тому +12

      3* Create an accessible proprietary toolkit (CUDA) that's become mainstream in legacy content, and crush anyone who tries to leave the Nvidia ecosystem.

  • @PhilJohn1980
    @PhilJohn1980 Рік тому +20

    Ah, geometry stages with matrices - I remember my Comp Sci computer graphics class in the 90's where our final assignment was to, by hand, do all the maths and plot out a simple 3D model on paper. Each student had the same 3D model defined, but different viewport definitions. Fun times.
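
For readers who never had that assignment, the arithmetic the fixed-function geometry stage did per vertex looks roughly like the sketch below: multiply the vertex by a combined model-view-projection matrix, do the perspective divide, then map the result into the viewport. The identity matrix, the example vertex, and the 640x480 viewport are arbitrary placeholders, not values from the video; it is plain host-side code and compiles as an ordinary .cpp or .cu file.

```cuda
#include <cstdio>

struct Vec4 { float x, y, z, w; };

// Column-major 4x4 matrix times a homogeneous vertex, OpenGL-era convention.
Vec4 mul(const float m[16], Vec4 v) {
    return { m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12]*v.w,
             m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]*v.w,
             m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]*v.w,
             m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15]*v.w };
}

int main() {
    // Identity stands in for a real model-view-projection matrix.
    float mvp[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    Vec4 v = {0.5f, -0.25f, -2.0f, 1.0f};      // a vertex in homogeneous coordinates

    Vec4 clip = mul(mvp, v);                    // geometry stage: transform
    float ndcX = clip.x / clip.w;               // perspective divide
    float ndcY = clip.y / clip.w;

    int width = 640, height = 480;              // viewport definition (each student's differed)
    float sx = (ndcX * 0.5f + 0.5f) * width;    // map [-1, 1] to screen pixels
    float sy = (1.0f - (ndcY * 0.5f + 0.5f)) * height;
    printf("screen: (%.1f, %.1f)\n", sx, sy);
    return 0;
}
```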

  • @BaldyMacbeard
    @BaldyMacbeard Рік тому +18

    The secret of their success for many years was working closely with developers/customers to gain an advantage over their competitors. For instance, Nvidia would give free cards to game developers and send out evangelists to help optimize the game engines. This obviously resulted in a strong developer bias towards Nvidia cards, which is how and why they were outperforming AMD for many years. In the machine learning space, they are being extremely generous in their public relations with academia, once again giving away tons of free GPUs and helping developers out. It's a fairly common tactic to try to bring students on board so that once they graduate and go on to work in tech companies, they bring a strong bias towards software & hardware they're familiar with. In the server market, Nvidia has been collaborating closely with most manufacturers while offering their DGX systems in parallel. They also have a collaboration with IBM that solders Nvidia GPUs onto their Power8 machines, giving a ginormous boost to bandwidth between GPU and CPU and also PS5-like storage access. And don't forget about the Jetson boards. Those things are pretty amazing for edge computing use cases like object recognition in video and such. They dominate like they do by not trying to sell a single product, but offering tons of solutions for every single market out there.

    • @409raul
      @409raul Рік тому +3

      Genius move by Nvidia. Jensen Huang is the reason why Nvidia is where they are today. One of the best CEOs in the world (despite the greed LOL).

    • @TherconJair
      @TherconJair Рік тому +3

      It's quite easy when your main competitor was nearly extinguished by the anti-competitive measures of their much larger rival, Intel, and has to stay afloat somehow while bleeding money. Nvidia made so much money with gaming cards while AMD couldn't compete due to a lack of funds for R&D that they had an extremely calm "blue ocean" to work with and could comparatively cheaply build up their de facto monopoly in the space. AMD will need to invest a lot of money to somehow break into the now very "red ocean" of the Nvidia CUDA monopoly.
      I don't see them able to survive in the long term against two much larger rivals, and we'll all be losers for it.

    • @Magnulus76
      @Magnulus76 Рік тому

      Yeah, Nvidia offered a lot of support.
      I know there are a lot of fanboys who think NVidia must have some kind of secret sauce, but the truth is that CUDA's performance isn't necessarily any better than OpenCL's. And I say that as somebody who owns an NVidia card. NVidia just spent a lot on support and generated a lot of influence/hype.

  • @Quxxy
    @Quxxy 2 роки тому +61

    I don't think you're right about what "clipping" means at 2:56. Occlusion (hiding things behind other things) is done with a Z-buffer*. As far as I recall, clipping refers to clipping triangles to the edge of the screen to avoid rasterising triangles that fall outside of the visible area, either partially or fully. As far as I'm aware, no one ever did occlusion geometrically on a per-triangle basis. The closest would be in some engines that will rasterise a simplified version of a scene to generate an occlusion buffer**, but that's not handled by the geometry engine, it's just regular rasterisation.
    *Except on tile-based rasterisers like the PowerVR lineage used in the Dreamcast and some smartphones, notably the iPhone.
    (Not a graphics programmer or expert, just an interested gamer.)
    *Edit*: Also, for 7:46 about the fixed function pipeline being totally gone: from what I remember this is not entirely true. GPUs still contain dedicated units for some of the fixed functionality; from memory, that includes texture lookups and blending. Reminds me of an old story from someone who worked on the Larrabee project who mentioned that one of the reasons it failed to produce a usable GPU was that they tried to do all the texturing work in software, and it just couldn't compete with dedicated hardware.

    • @Asianometry
      @Asianometry  2 роки тому +16

      Thx. I'll look into this and see if a clarification is needed

    • @Quxxy
      @Quxxy 2 роки тому +26

      @@Asianometry I doubt it. It's an inconsequential detail that doesn't change anything about the substance of the video. I mean, I doubt anyone is watching a video about nVidia's AI dominance looking for an in-depth technical description of the now long-obsolete fixed function pipeline. :)

    • @musaran2
      @musaran2 2 роки тому +4

      Clipping is the general removal of what does not need rendering: view volume, backface, occlusion…

    • @tma2001
      @tma2001 2 роки тому +3

      yeah I was about to post the same nitpick - also the setup and window clipping part of the fixed-function pipeline is still there in hardware, it's just not programmable (nor should it be). The raster ops backend is not programmable either - just configurable.
      The Painter's algorithm is an object-based visibility test that clips overlapping triangles against each other, whereas the z-buffer is an image-based, per-pixel visibility test.

    • @vintyprod
      @vintyprod Рік тому

      @@Quxxy I am
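
To make the distinction in this thread concrete, here is a toy sketch of the two different tests: "clipping" as trivial rejection of geometry outside the visible volume, and the per-pixel z-buffer comparison that resolves occlusion during rasterisation. Everything here (the 4x4 "framebuffer", the sample fragments, the one-axis rejection test) is made up for illustration; real clippers also split partially visible triangles, and real GPUs do both steps in dedicated hardware.

```cuda
#include <cstdio>
#include <vector>

struct V { float x, y, z; };  // already in normalized device coordinates [-1, 1]

// "Clipping" in the sense discussed above: trivially reject a triangle
// that lies completely outside the visible volume (only x is tested here).
bool outsideClipVolume(const V t[3]) {
    bool allLeft = true, allRight = true;
    for (int i = 0; i < 3; ++i) { allLeft &= t[i].x < -1.f; allRight &= t[i].x > 1.f; }
    return allLeft || allRight;   // real clippers also test y, z and handle partial overlaps
}

int main() {
    const int W = 4, H = 4;
    std::vector<float> zbuf(W * H, 1.0f);     // depth buffer, far plane = 1.0
    std::vector<int>   color(W * H, 0);

    // Occlusion is resolved per pixel: keep a fragment only if it is nearer
    // than whatever was drawn there before.
    auto plot = [&](int x, int y, float z, int c) {
        if (z < zbuf[y * W + x]) { zbuf[y * W + x] = z; color[y * W + x] = c; }
    };

    plot(1, 1, 0.8f, 1);   // far fragment drawn first
    plot(1, 1, 0.3f, 2);   // nearer fragment from another triangle wins
    plot(1, 1, 0.9f, 3);   // farther fragment is discarded

    V offscreen[3] = {{-2.f, 0.f, 0.5f}, {-3.f, 1.f, 0.5f}, {-2.5f, -1.f, 0.5f}};
    printf("clipped away: %s\n", outsideClipVolume(offscreen) ? "yes" : "no");
    printf("pixel (1,1) shows triangle %d at depth %.1f\n", color[1 * W + 1], zbuf[1 * W + 1]);
    return 0;
}
```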

  • @CarthagoMike
    @CarthagoMike 2 роки тому +10

    Oh nice, a new Asianometry video!
    Time to get a cup of tea, sit back, and watch.

  • @TickerSymbolYOU
    @TickerSymbolYOU Рік тому +15

    This is literally the best breakdown on YouTube when it comes to Nvidia's dominance of the AI space. Love your work!

    • @409raul
      @409raul Рік тому

      Nice to see you here Alex! Nvidia for the win!

    • @prashantmishra9985
      @prashantmishra9985 Рік тому +4

      ​@@409raul Being a fanboy of a corporate won't benefit us.

    • @havkacik
      @havkacik Рік тому

      Totally agree 👍 :)

  • @ted_1638
    @ted_1638 2 роки тому +1

    fantastic video! thank you for the hard work.

  • @Doomlaser
    @Doomlaser 2 роки тому +8

    As a game developer, I've been waiting for a video like this. Good work

  • @DavidSoto90
    @DavidSoto90 2 роки тому +1

    such a valuable video, great work as usual!

  • @Sagittarius-A-Star
    @Sagittarius-A-Star 2 роки тому +94

    I don't want to know how much effort it was to put all this information together.
    Thanks and thumbs up.
    P.S.: At Nvidia they are insane. Just try to find out which GPU you have and how it compares to others or if they are CUDA capable .....
    You will end up digging through lists of hundreds or thousands of cards.

    • @killerhurtalot
      @killerhurtalot 2 роки тому +13

      That's the thing, though.
      Nvidia usually has only 6-7 actual chips that they manufacture. They don't manufacture tens or hundreds of GPUs each generation...
      The main difference is that, due to manufacturing defects, the GPUs are binned and have different sections enabled.
      The 3090 and 3080 are actually the same chip. The 3080 just has around 15% fewer pipelines/CUs and fewer tensor cores enabled...

    • @Baulder13
      @Baulder13 2 роки тому +10

      This man has no quit! The amount of research he puts in and how much content that has been coming out is ridiculous.

    • @Hobbes4ever
      @Hobbes4ever 2 роки тому +2

      @@killerhurtalot kind of like what Intel does with their Celeron

    • @marksminis
      @marksminis 2 роки тому +6

      @@killerhurtalot yes that is correct. A large silicon wafer is a huge investment. By testing each core, defective cores can be coded out, so you still have a working chip to sell. Throwing out a large expensive chip just for having a few bad cores would be insane. Only a small percentage of chips coming off the huge wafer are totally perfect, and those are mostly near the center of the wafer.

  • @ministryofyahushua3065
    @ministryofyahushua3065 2 роки тому +1

    Love your channel, very well presented.

  • @BenLJackson
    @BenLJackson 2 роки тому

    I felt some nostalgia, good vid 👍 deciphering all this back in the day was so much fun. Also I love your explanation of AI and what it really is.

  • @hgbugalou
    @hgbugalou 2 роки тому +8

    I would buy a shirt that says "but first, let me talk about the Asianometry newsletter".

  • @NeilStainton
    @NeilStainton 2 роки тому +1

    Thank you for your excellent work in condensing and analysing NVIDIA’s progress.

  • @Matlockization
    @Matlockization Рік тому

    Thank you for explaining some of the details in the beginning.

  • @RandomlyDrumming
    @RandomlyDrumming 2 роки тому +20

    A small mistake, right at the beginning - Geforce 256 had hit the market in 1999, not 1996. In the mid-90's, Nvidia was, more or less, just another contender, chipping away at the market dominance of the legendary 3dfx. :)

    • @shoam2103
      @shoam2103 2 роки тому

      So theirs wasn't the first GPU? I think the PlayStation had an albeit very basic one..

    • @shoam2103
      @shoam2103 2 роки тому

      Okay 5:55 clears it up a bit..

    • @RandomlyDrumming
      @RandomlyDrumming 2 роки тому

      @@shoam2103 Well, technically, it was, as it handled the entire pipeline. Interestingly, the first *programmable* graphics chip for PC was Rendition Verite v1000 (RISC-based), released back in 1995, if I'm not mistaken. :)

    • @DM0407
      @DM0407 Рік тому +1

      Yep, I had bought a RIVA TNT2 to play Asheron's Call in 1999. I guess the 256 was out at this time, but I couldn't afford it, and the TNT2 was still a massive jump in performance. Going from a choppy software renderer to "hardware accelerated" graphics was amazing at the time... The paths had textures! Who knew?
      I don't remember the original GeForce being that big of a deal, but I remember lusting after the GeForce 2 AGP.

  • @dipankarchatterjee8809
    @dipankarchatterjee8809 2 роки тому

    A very well researched presentation. Thank you Bro.

  • @Meta5917
    @Meta5917 2 роки тому

    Great video. Keep it up, proud of you

  • @drewwollin3462
    @drewwollin3462 2 роки тому +12

    Very good as always. A good explanation of how graphics cards work and how they have evolved.

  • @rzmonk76
    @rzmonk76 2 роки тому

    Subscribed, really nice presentation!

  • @jmk1727
    @jmk1727 2 роки тому

    man your videos are all always amazing......PERIOD.

  • @conradwiebe7919
    @conradwiebe7919 2 роки тому +10

    Long time viewer and newsletter reader, love your videos. I just wanted to mention that the truncated graph @ 15:28 is a mistake, especially when you then put it next to a non-truncated graph a little later. The difference between datacenter and gaming revenue is greatly exaggerated due to this choice of graph. I feel it actually diminished your point that datacenter is rapidly catching up to gaming.

  • @punditgi
    @punditgi 2 роки тому +3

    First rate information well presented! 👍

  • @helmutzollner5496
    @helmutzollner5496 Рік тому

    Excellent overview. Thank you.

  • @ttcc5273
    @ttcc5273 Рік тому

    Thank you for this video, it was informative, digestible, and I learned more than I expected to. 👍

  • @Socrates21stCentury
    @Socrates21stCentury Рік тому

    Nice job, very informative !!!

  • @Zloi_oi
    @Zloi_oi Рік тому

    This is really interesting!Thanks for your work, sir!

  • @skipsteel
    @skipsteel 2 роки тому

    Thanks really well done, you made the complex simple thanks.

  • @supabass4003
    @supabass4003 2 роки тому +43

    I have spent more money on nvidia GPUs in the last 20 years than I have on cars lol.

    • @mspy2989
      @mspy2989 2 роки тому +3

      Goals

    • @heyhoe168
      @heyhoe168 2 роки тому +2

      Same. Btw, I dont have a car.

    • @wazaagbreak-head6039
      @wazaagbreak-head6039 Рік тому

      I have no reason to update my ancient corolla it's a piece of crap but it gets me to work each day

    • @LimabeanStudios
      @LimabeanStudios Рік тому

      I have only purchased one of each and same lmao

    • @prateekpanwar646
      @prateekpanwar646 Рік тому

      @@wazaagbreak-head6039Is it 750 TI / 760?

  • @christakimoto8425
    @christakimoto8425 6 місяців тому

    This is an outstanding and informative video. Thank you so much!

  • @richardm9934
    @richardm9934 5 місяців тому

    Fantastic video!

  • @jdevoz
    @jdevoz 3 місяці тому

    Amazing video!

  • @MohammadSadiqurRahman
    @MohammadSadiqurRahman 2 роки тому

    insightful. loved the content

  • @harrykekgmail
    @harrykekgmail 2 роки тому +6

    a classic in your stream of videos!

    • @screwcollege8474
      @screwcollege8474 2 роки тому

      how you posted 2 months ago?

    • @2drealms196
      @2drealms196 2 роки тому +3

      @@screwcollege8474 Patreon members get access to his videos first. Later on he makes the videos public. Another way is through his college partnership program.

  • @ChristianKurzke
    @ChristianKurzke Рік тому

    I love this, very well researched, and the correct level of technology for the average executive,.. who isn't a math genius. ;)

  • @Magnulus76
    @Magnulus76 Рік тому +1

    They had neural networks being used in computer games even back in the early 90's, to a limited extent (mostly a few strategy games). The reason there's hype about neural nets now is that the raw computing power of a GPU allows companies to develop neural networks that can mimic human visual perception and pattern recognition.
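
The link between "raw computing power" and neural networks is mostly dense matrix math: a fully connected layer's forward pass is a matrix multiply, and every output activation can be computed independently, which is exactly the shape of work a GPU's thousands of threads are good at. Below is a deliberately naive CUDA sketch of one such layer; the layer sizes, names, and initialization values are made-up assumptions, and production code would call tuned libraries such as cuBLAS or cuDNN instead.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per output activation: out[b][o] = relu(sum_i in[b][i] * w[i][o]).
__global__ void denseLayer(const float *in, const float *w, float *out,
                           int batch, int inDim, int outDim) {
    int b = blockIdx.y * blockDim.y + threadIdx.y;   // which sample in the batch
    int o = blockIdx.x * blockDim.x + threadIdx.x;   // which output neuron
    if (b >= batch || o >= outDim) return;
    float acc = 0.0f;
    for (int i = 0; i < inDim; ++i)
        acc += in[b * inDim + i] * w[i * outDim + o];
    out[b * outDim + o] = acc > 0.0f ? acc : 0.0f;   // ReLU
}

int main() {
    const int batch = 32, inDim = 256, outDim = 128;
    float *in, *w, *out;
    cudaMallocManaged(&in,  batch * inDim  * sizeof(float));
    cudaMallocManaged(&w,   inDim * outDim * sizeof(float));
    cudaMallocManaged(&out, batch * outDim * sizeof(float));
    for (int i = 0; i < batch * inDim;  ++i) in[i] = 0.01f;
    for (int i = 0; i < inDim * outDim; ++i) w[i]  = 0.02f;

    dim3 block(16, 16);
    dim3 grid((outDim + 15) / 16, (batch + 15) / 16);
    denseLayer<<<grid, block>>>(in, w, out, batch, inDim, outDim);
    cudaDeviceSynchronize();
    printf("out[0] = %f\n", out[0]);   // 256 * 0.01 * 0.02 = 0.0512
    cudaFree(in); cudaFree(w); cudaFree(out);
    return 0;
}
```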

  • @GBlunted
    @GBlunted 2 роки тому +5

    This is cool content, I liked this video! I like the explanation of low-level processes as well as the history lesson of how it all evolved to where it is today...

  • @nailsonlandim
    @nailsonlandim 2 роки тому +2

    Excellent video. Funny fact: I spent the day dealing with CUDA and a CV application I'm working on

  • @LimabeanStudios
    @LimabeanStudios Рік тому +1

    Just found this channel the other day and it's amazing.
    One thing I don't see mentioned in the comments is that Nvidia is often rated one of the best companies in the world to work at. It's a lot easier to do big things with happy employees lol

  • @Bianchi77
    @Bianchi77 2 роки тому

    Nice video, thank you for sharing it :)

  • @FrancisdeBriey
    @FrancisdeBriey 2 роки тому

    Subscribed !

  • @AaronSchwarz42
    @AaronSchwarz42 2 роки тому

    Excellent analytics on market diffusion of COTS //

  • @hc3d
    @hc3d 2 роки тому

    wow, amazing analysis.

  • @BartKus
    @BartKus 2 роки тому +22

    You do really good work, sir. Much appreciated.

  • @jem4444
    @jem4444 Рік тому

    Extremely well done!

  • @valenganev5774
    @valenganev5774 2 роки тому

    what do think about the fujitsu Celcius PC? where do you place this among other PC's? What is the future of fujitsu?

  • @PlanetFrosty
    @PlanetFrosty 2 роки тому

    Good presentation

  • @green5270
    @green5270 2 роки тому

    excellent video

  • @kokop1107
    @kokop1107 9 місяців тому

    This is a very good and accurate explanation

  • @BillHawkins0318
    @BillHawkins0318 2 роки тому +31

    What I never understood is why NVIDIA attempted to cool the heat sink with a 3-cent fan, especially from 2001 to 2010. When said 3-cent fan goes out, the GPU can burn up. 🔥. Passive cooling has never been enough. We don't have to worry about AI. 3-cent fans will make short work of that.

    • @Tartar
      @Tartar 2 роки тому +3

      GPU fans are still very cheap these days.
      Hopefully the Asus Noctua collaboration is a sign of things to come: GPUs with premium and quiet cooling solutions.

    • @rayoflight62
      @rayoflight62 2 роки тому +1

      A good percentage of computer failures from that period were because of the failed fan. It was that transparent plastic, melted all over the GPU heatsink. The GPU subsequently failed, usually shorting the +5 V bus...

    • @musaran2
      @musaran2 2 роки тому +2

      By the time it happens, they consider it is obsolete and you are supposed to upgrade.
      I hate it.

    • @Bialy_1
      @Bialy_1 2 роки тому

      "As passive cooling Has never been enough." Nope some cards got good passive cooling, and you always can replace the fan on your own when its starting to make noise...

  • @Campaigner82
    @Campaigner82 2 роки тому

    You make such good videos! I'm intrigued by the pictures. You're doing a good job!

  • @19smkl91
    @19smkl91 Рік тому

    6:24 I've seen people rubber-banding when stepping on, and even bugging out halfway up, usually getting hurt.

  • @emulegs5
    @emulegs5 2 роки тому

    Please remember to leave a link to the previous video in a series, and to the first one as well. I would have clicked links to them based on your intro alone

  • @ADHD55
    @ADHD55 2 роки тому +15

    Nvidia is what happens when the CEO is an engineer, not a short-term-thinking MBA

    • @user-lx7kx1dd3q
      @user-lx7kx1dd3q 2 роки тому

      It's in Chinese blood. An engineer

    • @xraymind
      @xraymind 2 роки тому +5

      @@user-lx7kx1dd3q Correction, Taiwanese blood.

    • @user-lx7kx1dd3q
      @user-lx7kx1dd3q 2 роки тому +2

      @@xraymind there's no such thing as Taiwanese blood. Taiwan is a land not a race.

    • @ADHD55
      @ADHD55 2 роки тому +1

      @@user-lx7kx1dd3q huh? NVIDIA is a American company not Chinese

    • @user-lx7kx1dd3q
      @user-lx7kx1dd3q 2 роки тому +1

      @@ADHD55 Since when did I say Nvidia isn't an American company? You talked about its CEO. And who do you think its CEO is? It's Jensen Huang, a Taiwanese-born Chinese man who became a US citizen.
      Are you still a kid that I need to spell everything out for you???????

  • @gregsutton2400
    @gregsutton2400 Рік тому

    great info

  • @igorwilliams7469
    @igorwilliams7469 2 роки тому

    Thinking about that elevator analogy a bit too much... Are there ANY elevators with level tiers midway (like bottom and top) for riders to decamp? While obviously adding complexity to a system that is almost plug and play, it could certainly be interesting!

  • @Palmit_
    @Palmit_ 2 роки тому +1

    thank you John. :)

  • @GhostZodick
    @GhostZodick 2 роки тому

    Your videos always have a low-frequency pounding sound in the background. Would you mind looking into that and trying to fix it in future videos? At first I thought it was something pounding in my house, but later realized it was in your video, because I only hear it at certain parts of your videos.

  • @Jensth
    @Jensth 11 місяців тому

    You were spot on with this one. After this came out everyone bought NVIDIA stock up like crazy.

  • @shmehfleh3115
    @shmehfleh3115 Рік тому

    This video filled in a lot of gaps for me. I work with the things and I wasn't sure how GPUs evolved into general computing devices.

  • @zodiacfml
    @zodiacfml 2 роки тому +5

    You beat me to this critique, which is the most important part of Nvidia's luck/success. I recall it took years before Nvidia finally got to CUDA support/programming. Researchers using GPUs is also the reason why AMD bought ATI. There was a whitepaper from AMD saying that computing would move/focus to graphics from then on; they were just more than a decade too early with that prediction.
    Another thing to note: it is the gamers/consumers that made all this possible, paying for the R&D of graphics cards that would be used to sell products for the datacenter. Ray tracing hardware, for example, is a poor feature for gaming currently, but it is excellent for industrial use.

    • @markhahn0
      @markhahn0 2 роки тому +1

      in some ways, it's remarkable how poorly AMD has done. they've never delivered on anything like a sleek cpu-gpu-unified infrastructure, even though they have all the pieces in hand (and talked about things like HSA). it'll be ironic if Intel manages with oneAPI, since for so long, they were defending the CPU like a castle...

    • @zodiacfml
      @zodiacfml 2 роки тому

      agreed. though the hardware in the latest gaming consoles was impressive when it was announced, just ok when the consoles became available.
      AMD also doesn't have a foothold in Arm, where Nvidia has the Nintendo Switch and Apple the M1.
      My last two PCs are an Intel i3-8100 and recently an i3-12100, since I have some use for the iGPUs.

  • @swlak516
    @swlak516 2 роки тому +1

    These videos make me feel smarter than I really am. And I feel like you're one of the few YouTube content creators in the space who can do that. Thank you.

    • @Speed001
      @Speed001 2 роки тому

      This is definitely a bit above me with tech terms I don't care to learn.

  • @cfehunter
    @cfehunter Рік тому +1

    "Early graphics processing broke scenes up into triangles".... they still do.

  • @etherjoe505
    @etherjoe505 2 роки тому +5

    Single Instruction Multiple Data 👍👍👍

    • @doug184
      @doug184 9 місяців тому

      AMD?

  • @markhahn0
    @markhahn0 2 роки тому +4

    important to point out that no one really uses CUDA directly for AI - they use PyTorch or TensorFlow. that means that Nvidia doesn't have any real lock on the market - alternatives are highly competitive.

    • @Stef3m
      @Stef3m Рік тому

      That is an important point that is too rarely brought up

    • @kotokotfgcscrub
      @kotokotfgcscrub Рік тому

      ML frameworks came into existence later and were built on CUDA and cuDNN, and they are way more optimized for Nvidia even after starting to support other hardware.
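
The layering described in this thread looks roughly like the sketch below: a one-line matrix multiply in PyTorch or TensorFlow is, on Nvidia hardware, ultimately dispatched to pre-built CUDA libraries rather than to kernels the user writes. This is only an illustration of that division of labour, assuming a cuBLAS SGEMM call stands in for whatever tuned kernel a real framework would actually pick; compile with nvcc and link against -lcublas.

```cuda
#include <cstdio>
#include <cuda_runtime.h>
#include <cublas_v2.h>

// Roughly what a framework-level `C = A @ B` becomes on an Nvidia GPU:
// the framework never writes the kernel itself, it calls into cuBLAS.
int main() {
    const int n = 2;                          // tiny 2x2 matrices to keep it readable
    float hA[n * n] = {1, 2, 3, 4};           // column-major, as cuBLAS expects
    float hB[n * n] = {5, 6, 7, 8};
    float hC[n * n] = {0};

    float *dA, *dB, *dC;
    cudaMalloc(&dA, sizeof(hA)); cudaMalloc(&dB, sizeof(hB)); cudaMalloc(&dC, sizeof(hC));
    cudaMemcpy(dA, hA, sizeof(hA), cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB, sizeof(hB), cudaMemcpyHostToDevice);

    cublasHandle_t handle;
    cublasCreate(&handle);
    const float alpha = 1.0f, beta = 0.0f;
    // C = alpha * A * B + beta * C
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                &alpha, dA, n, dB, n, &beta, dC, n);

    cudaMemcpy(hC, dC, sizeof(hC), cudaMemcpyDeviceToHost);
    printf("C[0,0] = %f\n", hC[0]);           // 1*5 + 3*6 = 23 in column-major layout
    cublasDestroy(handle);
    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    return 0;
}
```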

  • @IntangirVoluntaryist
    @IntangirVoluntaryist 2 роки тому

    I still have several old gen cards
    TNT cards, banshee, voodoo, first gen geforce, some early gen ati cards too
    i also have some old soundblaster cards :)

  • @nabeelhasan6593
    @nabeelhasan6593 2 роки тому +4

    I always wish there were a unified framework like CUDA for all platforms; NVIDIA's absolute monopoly on deep learning really makes things hard

    • @Corei14
      @Corei14 2 роки тому +4

      OpenCL. Now, making it work as well as the others is a different question

    • @joshuagoldshteyn8651
      @joshuagoldshteyn8651 Рік тому

      How does it make things really hard? Simply use an Nvidia GPU with high batch sizes or any CPU with low batches sizes?

  • @estebancastellino3284
    @estebancastellino3284 Рік тому

    I remember when the NVidia software graphics accelerator card was the cheap option for those of us who couldn't afford a Voodoo card, the one that did come with a hardware accelerator. Voodoo was on about its fifth version by the time NVidia put out the GeForce chip.

  • @johnaugsburger6192
    @johnaugsburger6192 2 роки тому

    Thanks so much

  • @bhavintoliabg4946
    @bhavintoliabg4946 2 роки тому

    This one video made me respect NVIDIA's work more than any advert ever would.

  • @davidbooth8422
    @davidbooth8422 2 роки тому

    Hi John. I love your videos! Do you have any possible connections that might want to manufacture a much better and cheaper smoke detector than is available today? I would love to explain how easy that would be to any technical person who would listen. I am not trying to make money either, just save lives.

  • @michaelhulcy6680
    @michaelhulcy6680 2 роки тому

    "Triangles. Triangles all the way down baby." Dat was a good one.
    Making Duke Nukem in 96 jealous man.

  • @hunter8980
    @hunter8980 2 роки тому

    How many polygons does an RTX 3080 process per second?

  • @screwcollege8474
    @screwcollege8474 2 роки тому +10

    Great video, now I understand why Nvidia is worth 600 billion in market value

    • @johnl.7754
      @johnl.7754 2 роки тому +2

      Would have never thought that it would be worth over 3x Intel or any other cpu manufacturers.

  • @MikkoRantalainen
    @MikkoRantalainen Рік тому

    Great document as usual, but the bar graphs not starting from zero around 15:30 weren't very cool. The illustration made it appear as if Gaming were more than double the Data Center revenue, but when you compare the actual numbers, 3.22 vs 2.94, you'll quickly see that the difference is actually only about 9%!

  • @ravindertalwar553
    @ravindertalwar553 2 роки тому

    Congratulations 👏 and lots of love and blessings ❤️

  • @Aermydach
    @Aermydach 2 роки тому +13

    Another great presentation.
    Watching this got me thinking that I wasted my time studying agricultural and wine science at Uni. Instead, I should've studied computer science/engineering. . .

    • @dankuruseo9611
      @dankuruseo9611 2 роки тому +7

      Someone has to feed us 👍

    • @pinkipromise
      @pinkipromise 2 роки тому

      didnt know farmers have degrees

    • @Aermydach
      @Aermydach 2 роки тому +1

      @@pinkipromise They typically don't. The degrees are for researchers, technical support/agronomists (for fertilisers, pesticides, crop and livestock nutrition etc) and other specialist support roles.

  • @johndvoracek1000
    @johndvoracek1000 Рік тому

    I was wondering if you would mention Apple; then you did at the end, but not in the way I anticipated. Isn't Apple's M chip a move toward the same chip capabilities and architecture as Nvidia's, etc.?

  • @tonyduncan9852
    @tonyduncan9852 2 роки тому

    Thanks for that. :)

  • @birseyleryap
    @birseyleryap Рік тому

    that popping sound @8:53 from the lips

  • @rjl7655
    @rjl7655 5 місяців тому

    newsletter link doesn't work...

  • @JohnKobylarz
    @JohnKobylarz 2 роки тому +6

    Excellent video. As someone who remembers when the GeForce 256 was launched, it's amazing to reflect on how far they've come and how influential their tech has been on the world.
    Before GPUs, PC gaming was a much different affair. Even looking at JPEGs was a somewhat intense system task before GPUs became the norm.
    I learned a lot from this video, and enjoyed it. It helps me connect the dots regarding how AI learning works.

  • @Alphahydro
    @Alphahydro 2 роки тому

    That's pretty interesting, and only the tip of what we'll be able to accomplish with GPU horsepower.

  • @timswartz4520
    @timswartz4520 Рік тому

    That GeForce 256 made me very happy for a long time.

  • @AtaGunZ
    @AtaGunZ 2 роки тому +14

    I'm saying this as an AMD fanboy: ROCm sucks.
    It's not properly supported for use with their RDNA/2 cards. I wanted to try it out since CUDA was pushed down my throat during HPC courses, and wanted to see what my brand new RX 6900 XT could do with the HPC knowledge I acquired. Turns out it can't do jack, because the card did not support ROCm (or the other way around) on launch, only vague promises for future support with no announced dates (and apparently the earlier RX 5000 series cards were not supported either, even 2+ years after their launch). I tried to learn more about it, but all I could get from the outdated and poorly documented github page was that ROCm was intended for CDNA, so not all parts of ROCm were present for RDNA2, so converting my CUDA knowledge into HIP and running with it was out of the question.
    I looked it up again now to see if there are any improvements; the latest info I can find is on a random hackernews thread from 6 months ago, a user working for AMD on ROCm reporting that there is unofficial support for some ROCm 4.3 libraries... How are we supposed to track this info?
    I understand that there are architectural differences between RDNA and CDNA, but how does AMD expect sustained growth in this market when not one of my peers or professors writes computational software for AMD GPUs? (Meanwhile we are expected to be proficient with CUDA to get a passing grade in our graduate courses.) I am not even taking any ML/AI courses, and I know the situation is more dire there. I'm still just a student without much industry knowledge, so my perspective might not reflect how the industry really works, but I can't see a world where new graduates wouldn't stick to the platform they are familiar with when going into their career.
    That being said, I hope I can get that summer internship at AMD :P I am so hyped for them especially after the xilinx acquisition.

    • @justice929
      @justice929 2 роки тому

      Are you a Stanford or MIT student? Lisa Su is an MIT grad; the Nvidia CEO is a Stanford grad

    • @AtaGunZ
      @AtaGunZ 2 роки тому

      ​@@justice929 wish I was :P Doing my masters at Technical University of Munich (TUM) at the moment.

    • @RainKing048
      @RainKing048 2 роки тому

      Yeah, this is what I don't understand about AMD. They won't be able to expand their market share if they don't even 'support' their products. Even something like the lowly GT 710 from Nvidia had day-one CUDA support.
      Meanwhile, AMD only gives vague hints (and sometimes those have to come from the community) if you want to find out whether you can even enter the AI/ML scene using their products.

  • @minecraftdonebig
    @minecraftdonebig 2 роки тому +2

    If I was in charge, all chip process engineers and associated people would be required to wear wizard hats, because this shit is insane magic

  • @allcouto
    @allcouto Рік тому

    You guys completely forgot DOJO!

  • @royfpga
    @royfpga Рік тому

    Thanks!

  • @villageidiot8194
    @villageidiot8194 2 роки тому +1

    Go do an article on Innosilicon, a mainland Chinese GPU maker. How far behind are they? Is there any hope for a third or fourth GPU player in the market space. Will Intel Arc be the 3rd player?

    • @justice929
      @justice929 2 роки тому

      Lisa Su and Nvidia CEO are Taiwanese. same with TSMC

    • @villageidiot8194
      @villageidiot8194 2 роки тому

      @@justice929 Don't know how Nvidia & TSMC entered the chat. I was asking about Innosilicon, they have Fantasy One GPU graphics card. From what I can gather, their offices are in Zhuhai, Wuhan, Suzhou, Xi'an, Chengdu, Dalian, Beijing, Shanghai, Shenzhen, London, Silicon Valley, Toronto. Note that only 3 offices outside of China (UK, US, Canada) and 9 offices in China. Their headquarters are in Wuhan, Hubei province, China.

    • @perforongo9078
      @perforongo9078 Рік тому

      I think a company like Innosilicon would do well in China itself because China is so protective of homegrown companies. But if I were to bet on a third player in the GPU market I'd bet on Intel.

  • @isaacamante4633
    @isaacamante4633 2 роки тому +5

    At 15:31 the graphic on the left is not anchored at zero.

    • @Cythil
      @Cythil 2 роки тому +1

      And it's not that clear that it isn't. Generally it's good form to indicate this.

  • @final0915
    @final0915 Рік тому

    12:35 haha i wonder what images they collected for non-hotdogs

  • @Tigerbalm338
    @Tigerbalm338 Рік тому

    To paraphrase a popular SNL skit:
    "Triangles baby! MORE TRIANGLES!"