How Nvidia Won Graphics Cards

  • Published 18 Nov 2024
  • In 1995, there were over thirty different companies competing with one another to build the best graphics chips for the personal computer.
    Six years later, there would be only three, with one clearly in the lead: Nvidia.
    As of this writing, Nvidia Corporation is the 15th biggest company in the world, worth half a trillion dollars.
    Their graphics cards sell out like gangbusters the second they come onto the market.
    And the company is seeking to buy ARM for $40 billion.
    In this video, we are going to look back into the past and see how a little startup came up from behind everyone else to dominate the graphics card industry en route to being the world-leading tech juggernaut it is today.
    Links:
    The Asianometry Newsletter: asianometry.com
    Patreon: / asianometry

COMMENTS • 783

  • @Asianometry · 3 years ago · +102

    Hope you enjoyed the video. Like and subscribe, etc. etc.
    For other company profiles, check out the playlist: ua-cam.com/play/PLKtxx9TnH76Qod2z94xcDNV95_ItzIM-S.html

    • @Conservator. · 3 years ago · +2

      Excellent as always, thank you!

    • @vladdx · 3 years ago · +3

      Wait, why was "Going Fabless" written when switching from SGT to TSMC? The SGT deal was also fabless because Nvidia didn't own any fabs then either

    • @passiveincomestream8813 · 3 years ago · +2

      So good.

    • @Travlinmo · 3 years ago · +1

      Excellent review of the history. Seeing Doom and Quake reminded me of the simple days of getting Doom running and just wondering what that noise was behind me. :)

    • @solarwolf678 · 3 years ago

      How did you comment before the video started

  • @excitedbox5705 · 3 years ago · +1160

    My dad worked at Zeng Labs starting in 1996, before it was bought by ATI. He stayed on until ATI was bought by AMD, and stayed on at AMD during the GPU wars. He left AMD in 2005 or 2006. His team built the first Radeon chip.

    • @SianaGearz · 3 years ago · +30

      You mean Tseng Labs?

    • @VandalIO · 3 years ago · +15

      Lies!

    • @truthsocialmedia · 3 years ago · +6

      I had a Vérité 2200 powered by Tseng Labs. It was a disappointment

    • @SianaGearz · 3 years ago · +15

      @@truthsocialmedia I think you might be confusing something. V2200 is entirely Tseng-free. And Tseng never finished its 3d chip, at least not before going belly up.
      But it was indeed fairly useless.

    • @argh6666 · 3 years ago · +4

      So?

  • @tneper · 3 years ago · +763

    One correction:
    NVIDIA did coin the term GPU in 1999.
    GPU is short for Graphics Processing Unit (not General Processing Unit).
    GPGPU is the term for General Purpose computing on GPUs. (as far as I know it was not coined by NVIDIA though)
    General purpose computing on GPUs started to become more common after programmable shaders were introduced in 2001, with the NV20.
    Great video. Loved hearing again about the early days of NVIDIA. There's a lot more to the story for sure, but this hit all the right notes, thank you!
    I've worked for NVIDIA from 1999 through today. I lived through a good portion of this, it was (and is) exciting.

    • @mingdianli7802 · 3 years ago · +7

      What was your university degree in?

    • @tneper · 3 years ago · +78

      @@mingdianli7802 I dropped out of high school when I was 13

    • @VandalIO · 3 years ago · +3

      I thought it was 3dfx

    • @xchazz86 · 3 years ago · +4

      Can you get me a GPU?

    • @lembkamb · 3 years ago · +3

      Maybe you could collaborate with Asianometry on a new concept video, an interview video

  • @joelcorley3478 · 3 years ago · +275

    You mention the IBM PGA, but your presentation seems to ignore other 2D graphics adapters (with and without acceleration) that rose to prominence with Windows before 3D graphics. Also, there were actually a number of graphics chip design houses already creating chips with 3D acceleration (or deceleration, depending on who you ask) when nVidia and 3dfx arrived on the scene. Admittedly none were particularly good by comparison.
    Also every last IHV on the market was writing device drivers for their graphics accelerator chips. The system OEMs didn't write squat. OEMs contracted out all the productization and customization to board manufacturers which hired their own device driver development teams for that purpose. At least that was true in the '90s.
    Admittedly nVidia always had a substantial device driver development team practically from the get-go. But that was actually for self-serving reasons that probably weren't apparent to the public. The early nVidia designs had something unusual - their own integrated command execution pipeline tied to their own DMA channels. However, all of the less common commands in the pipeline were actually virtualized and farmed back out to the host CPU for simulation. To accomplish this virtualization, nVidia needed a much larger driver team, and that team needed to be more involved with the silicon design team. That's actually what initially drove their vertical integration - they just couldn't rely on board manufacturers to address issues in this chip virtualization system - though at first they tried. [A rough sketch of this scheme appears at the end of this thread.]
    Also, about the demise of 3dfx: Before 3dfx approached it, STB Systems was probably the largest or second-largest contract graphics board manufacturer in the world. All the major OEMs bought cards from STB Systems. But unlike companies like Diamond and Matrox, STB Systems did not sell self-branded retail products to the public pre-merger. Instead its business model was to take sample silicon from graphics chip makers, spin a board and customize the driver with its own library of graphics optimizations. (The optimizations were the secret sauce it used to sell to OEMs, because they impacted benchmark numbers.) It would then offer these sample boards up to the OEMs, and each OEM would order SKUs, usually from the two best-performing chipsets. This model kept STB Systems' Mexico PCB factory line near 100% capacity for several years.
    Before 3dfx made its offer, STB Systems had seen huge success with both the nVidia Riva 128 and the TNT. At the time of the merger announcement, about 90% of STB Systems' sales were nVidia TNT-based boards, and every major system OEM was buying them. Post-merger announcement, nVidia obviously refused to offer any new chip designs to STB Systems. What's worse, 3dfx had never been forced to meet an OEM development cycle, and even if it had, their new Banshee was at best competing with the nVidia TNT (and not even beating that), not the current-generation silicon.
    When 3dfx and STB Systems merged they were flush with something like $100M in cash. However, STB Systems had done a capital lease financing arrangement on its Mexico production line, and those multi-million-dollar lease payments had to be made each month whether the production lines were producing products 3dfx/STB could sell or not. It didn't take very long before the factories in Mexico were idle and the company was staring bankruptcy in the face, because the few wholly 3dfx-based boards they produced sold only a tiny fraction of what the Mexico lines could spit out. Also, STB Systems had just built a new headquarters that it had to pay for, and all those engineers STB and 3dfx had on staff didn't work for free.
    So it wasn't too long after the merger that they went looking for suitors. nVidia cut them a deal to buy out their intellectual property and hire their device driver and firmware developers. The remains of 3dfx entered bankruptcy and the support staff were shown the door.
    Regards,
    Joel Corley,
    Windows Device Driver Developer,
    Recently Retired from Microsoft,
    Formerly a Developer for STB Systems and 3dfx...

    • @dbcooper. · 3 years ago · +9

      RIP to whoever read all of this

    • @arenzricodexd4409 · 3 years ago · +38

      @@dbcooper. it is a nice read

    • @emmausgamer · 3 years ago · +22

      @@dbcooper. It's a very interesting read.

    • @stevencorley6691 · 3 years ago · +3

      😱 that was one hell of a long comment

    • @Kaboomnz · 3 years ago · +17

      Thank you for your post, it's a very interesting read.
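
    A rough sketch of the command-virtualization scheme Joel describes above, assuming a hypothetical command layout and capability check; the names here (Command, hardware_implements, emulate_on_cpu) are illustrative only, not NVIDIA's actual driver interface:

      // Hypothetical sketch: the driver walks a DMA-fed command stream and routes
      // each command to the silicon if supported, or to CPU-side emulation if not.
      #include <cstdint>
      #include <cstdio>

      // A command as it might sit in the card's ring buffer (illustrative layout).
      struct Command {
          uint32_t opcode;    // which operation to perform
          uint32_t args[4];   // operation-specific arguments
      };

      // Pretend capability check: does the silicon implement this opcode directly?
      static bool hardware_implements(uint32_t opcode) {
          return opcode < 0x80;  // assume low opcodes are wired into hardware
      }

      static void submit_to_hardware(const Command& c) {
          std::printf("DMA -> GPU: opcode 0x%02x\n", (unsigned)c.opcode);  // stand-in for a doorbell write
      }

      static void emulate_on_cpu(const Command& c) {
          std::printf("CPU emulation of opcode 0x%02x\n", (unsigned)c.opcode);  // the "farmed back out" path
      }

      // The driver routes each command; rare opcodes trap back to the host CPU.
      void dispatch(const Command* stream, int n) {
          for (int i = 0; i < n; ++i) {
              if (hardware_implements(stream[i].opcode))
                  submit_to_hardware(stream[i]);
              else
                  emulate_on_cpu(stream[i]);
          }
      }

      int main() {
          Command stream[] = { {0x01, {}}, {0x90, {}}, {0x02, {}} };
          dispatch(stream, 3);
          return 0;
      }

    Keeping this routing correct across chip revisions is exactly the kind of work that would tie a driver team tightly to the silicon team, as the comment notes.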

  • @curtswartz · 3 years ago · +71

    I used to work in the video card industry in the valley. I helped design video cards, and all the chip companies like Nvidia, Tseng Labs, Rendition and 3D Labs would come and show us their latest wares. We actually made a 3dfx Voodoo card that sold very well. At one point I put together a video card that was both PCI and AGP on a single card. You would just flip the card over and change the bracket to use the card with the other bus. Wow, such memories.

    • @alexlo7708 · 1 year ago

      The first idea leading to USB-C and the Apple charger.

  • @AirCrash1 · 2 years ago · +19

    From my experience in the industry, it was how Nvidia released driver updates for their cards - sometimes 5 years after they stopped selling them, which was practically forever compared to the other players, who released graphics cards with buggy drivers, never fixed them, and stopped releasing updates after 1 year. In the early days of Windows, a graphics driver problem would result in a BSOD, loss of work, and no real clue what the actual problem with the PC was. I saved the company I worked for thousands in warranty repairs by only using reliable and supported graphics cards and ditching all the other brands. Just about all the other PC components would clearly show themselves if there was a fault. Graphics card problems cost a fortune to sort out: customers bringing back machines that worked all day in the workshop but would BSOD when the client was doing some unusual operation. I would return the cards to the OEM; they would test them and send them back. A nightmare of epic proportions.

  • @RenzoTravelsTheEarth · 3 years ago · +31

    My uncle used to work for SGI back in the early 90s and he said that at some stage he remembers some people pitching the idea of a consumer graphics card and they basically got laughed out of the room.
    He was also telling me they basically invented video on demand but at the time the bandwidths available were too low to be practical and the project was eventually abandoned.

    • @georgesdev4577 · 2 years ago · +7

      I can confirm the first point. It happened during the worldwide sales conference around 1992 or 1993. Audience of about 500 sales and engineers.
      Top management dismissed the idea.
      Some engineers mumbled their disagreement in the back of the room.
      The rest is history...

    • @georgesdev4577 · 2 years ago · +2

      Regarding VoD, that is right too.
      But it was not abandoned, it was spun off into Kasenna and had many commercial successes, including at Tier 1 operators.

    • @TheUtuber999 · 1 year ago · +1

      Probably laughing out loud while filing for the patent.

    • @lemagreengreen · 1 year ago · +1

      @@TheUtuber999 Not quite, but a few SGI engineers were listening and agreed; those are the guys who left shortly after and founded 3dfx, using their experience to design that first Voodoo chip.

    • @Martinit0 · 8 months ago

      B2B and B2C distribution are very different beasts. SGI may well have been capable of solving the technical challenge, but then there is the go-to-market strategy that has to match the customer. Right in this video, Jon gave us the example of how you can fail: 3dfx trying to go from B2B (selling chips to gfx card makers) to B2C (selling finished cards to consumers). Fail you will if not prepared for the differences in sales and marketing approach.

  • @TheNefastor · 3 years ago · +64

    Such a trip down memory lane! I remember living through it all, but of course back then we didn't have all this information about how it all came to pass. Thanks, man, I enjoyed that.

    • @playnochat · 3 years ago · +5

      I had Nvidia's Riva 128 in my first computer, but immediately bought a Voodoo 2, because the Riva was such trash. It only sold as an OEM graphics card. I don't even know if any game actually worked well with it. Not that it was Nvidia's fault, because DirectX was such a bad joke at that time:
      "Plug and play" -> "Plug and pray"
      "3D accelerator" -> "3D decelerator"
      At least DirectX had built-in software rendering support, so you could play games with it as a last resort.

    • @jhutfre4855 · 1 year ago

      @@playnochat Indeed, I also remember the Riva as quite trashy. In fact, I even had to play FIFA 99 in software mode! Why did I pay for that card??

    • @JTWhite-zm4mx · 2 months ago

      What a ride it was, man!

  • @noutram1000 · 3 years ago · +59

    There's clearly a follow-up to this as Nvidia GPUs begin to get used in things like neural networks, crypto mining and cutting-edge supercomputers... You could argue that without video gaming we wouldn't have had 'the third wave' of AI we now see is so transformative.

    • @brodriguez11000 · 2 years ago · +6

      Agreed. The never ending quest by gamers for realism (physics, compute shaders, etc) helped pay for the R&D that later opened up all those other needs.

    • @johndoh5182 · 2 years ago

      There's also a follow-up to this, as Nvidia is now not so friendly with its board partners (AIBs) anymore. You just had one drop out who was a bigger name in N. America: EVGA.
      They're clearly trying to move to where they manufacture their own products and don't have any board partners in the future.
      The latest boondoggle shows the lack of concern between Nvidia and the AIBs: the move to a high-power 12V power connection from the PSU to the GPU. Nvidia, I'm sure, did a lot of research on this connection, and on their branded boards the connector is mounted differently than on probably all the AIB boards; now AIB GPUs are creating a fire hazard on the brand-new RTX 4090 while Nvidia's own GPUs don't have a problem. Nvidia is probably the company that wanted the AIBs to move to this high-power connector (AMD doesn't use it), because Nvidia developed the standard for it, but it most likely failed to give the AIBs ALL the research data that went along with the connector's development.

  • @sweealamak628 · 3 years ago · +229

    This reminds me of another game changer back in the day: Sound Blaster. Would be nice to have a recap of the rise and fall of Creative.

    • @TheNefastor · 3 years ago · +14

      AWE32... The first time I played with a true MIDI synthesizer. It blew our socks off back then. I remember playing too much Rise of the Triad just because the music was so good.

    • @daviidgray1681 · 3 years ago · +13

      Excellent suggestion given the legacy of Creative Technologies. A technology hardware company founded in Singapore and sufficiently 'global' to achieve a NASDAQ listing. This was truly rare 20 years ago.

    • @mceajc · 3 years ago · +9

      Agreed! I still have an Audigy2 card - which was (and probably still is) superb - but no good driver support for the latest OS's. It did things that no sound driver/card I've seen in ten years has been able to do. It seems Realtek do things /just/ well enough that it's not worth shelling out for anything better unless you are a real enthusiast.

    • @sweealamak628 · 3 years ago · +5

      @@TheNefastor I was a little kid back then and everyone would crowd in front of the family computer to marvel at the interactive game play with sound effects better than the arcade. Somehow I wish I could mod my PC and play those games again.

    • @sweealamak628 · 3 years ago · +8

      @@daviidgray1681 Yep. Very rare indeed. I had a glimpse of the founder himself when he was a humble computer repairman in a rickety old shopping mall. A hardworking and honest man who repaired my brother’s computer. Little did we know he would go on to much bigger things.

  • @pdsnpsnldlqnop3330 · 3 years ago · +40

    Also important was how Nvidia were able to recruit whole teams from SGI when Microsoft announced they were doing Fahrenheit with SGI - this was in the days when Microsoft also bought SoftImage and wanted to take over the workstation space with Win NT, causing SGI difficulties apparent to all their staff. It wasn't even seen as a big deal for SGI staff to defect. So they did, taking their knowledge of the fundamentals with them.

    • @LMB222 · 3 years ago · +11

      Now I understand why Windows NT was also built for the MIPS architecture, the one used by SGI.

    • @excitedbox5705 · 3 years ago · +5

      Nvidia has always played dirty.

  • @LikaLaruku · 2 years ago · +7

    I remember the graphics card wars.
    I remember when graphics cards cost less than a complete desktop PC set.

  • @crylittlebitch · 3 years ago · +10

    Great video. Personally, I would've liked a section that goes more in-depth into the recent years, as well as the ARM deal. Also, I'd love a video similar to this one for AMD, as well as one on the merger between AMD and Xilinx. Hopefully some of these topics will make it into future videos!

  • @toothofthewolf · 3 years ago · +58

    Odd that Matrox didn't get a mention. The Matrox Millennium was a top-selling card for a while. Intel's MMX extensions were also a resource Quake utilized that helped Nvidia piggyback. But otherwise this doco was very on point.

    • @pfefferle74 · 2 years ago · +12

      Matrox was only able to compete for a while because the 2D performance of their chips and VGA DACs was among the best out there, especially at higher resolutions. But when more and more games shifted to 3D and DVI became a thing, they were left in the dust.

    • @andycristea · 2 years ago · +3

      Quake does not use MMX.

    • @alexlo7708 · 1 year ago

      How about S3?

    • @nneeerrrd · 1 year ago · +2

      ​@@andycristea agree

    • @Martinit0 · 8 months ago

      I had a Matrox card specifically for their good 2D performance (at the time I didn't care about 3D games).

  • @fraktaalimuoto · 3 years ago · +103

    Nvidia going for general-purpose GPU computing with CUDA was a genius move. As a GPU computing expert doing physics simulations, I can say it now provides a significant performance and cost benefit in the high-performance computing sphere. [A minimal CUDA example follows at the end of this thread.]

    • @TheNefastor · 3 years ago · +9

      Didn't AMD try to do the same thing with OpenCL? I'm a total noob compared to you, but I remember OpenCL being praised for being open-source, unlike CUDA. Did something go wrong? Or is AMD competitive in that area?

    • @TheNefastor · 3 years ago · +7

      @@minespeed2009 I've recently started learning deep-learning and yeah, I'm a bit scared of being trapped in the Nvidia ecosystem the way I got trapped into the Microsoft ecosystem.

    • @franciscoanconia2334 · 3 years ago · +6

      @@TheNefastor Just wait for the Xilinx acquisition to be complete. OpenCL and FPGA processing will compete favorably against CUDA. Also, CUDA uses an API-oriented architecture, while OpenCL is written in machine code and executed directly.
      So far the NVIDIA ecosystem has gained traction for sure. But AMD is on a roll, and I wouldn't be surprised if they somehow developed new synergies with their x86+GPU+FPGA ecosystem all under a single roof.

    • @TheNefastor · 3 years ago · +1

      @@franciscoanconia2334 that would be great since I speak VHDL too 😄

    • @alexandresen247 · 3 years ago · +1

      How does it compare to OpenCL? Why did you choose to use CUDA instead of OpenCL?
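
    The kind of general-purpose GPU computing the top comment praises can be illustrated with the classic SAXPY (y = a*x + y) kernel; a minimal CUDA sketch, not code from the video:

      #include <cstdio>
      #include <cuda_runtime.h>

      // SAXPY: each GPU thread computes one element of y = a*x + y in parallel.
      __global__ void saxpy(int n, float a, const float* x, float* y) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i < n) y[i] = a * x[i] + y[i];
      }

      int main() {
          const int n = 1 << 20;
          float *x, *y;
          // Unified memory keeps the sketch short; cudaMalloc plus explicit copies also works.
          cudaMallocManaged(&x, n * sizeof(float));
          cudaMallocManaged(&y, n * sizeof(float));
          for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

          saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);  // enough 256-thread blocks to cover n
          cudaDeviceSynchronize();                         // wait for the GPU to finish

          std::printf("y[0] = %f (expect 5.0)\n", y[0]);
          cudaFree(x); cudaFree(y);
          return 0;
      }

    The same data-parallel pattern, scaled up, is what makes GPUs attractive for the physics simulations mentioned above.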

  • @snawsomes · 2 years ago · +6

    I hope this video makes you a lot of money over the years. Simply one of the best lectures about the history of graphics cards on the net.

  • @RalfStephan · 3 years ago · +15

    Actually, Wolfenstein 3D was the first first-person shooter. Its wide distribution in the BBS scene paved the way for Doom's success.

    • @EivindSkau · 10 months ago

      One could argue that Ultima Underworld was the first one as it was released two months before Wolfenstein.

  • @punditgi · 3 years ago · +7

    Superb video with great explanations! Keep these videos coming!

  • @THE16THPHANTOM · 3 years ago · +22

    Damn, I have never once made the connection between Nvidia and the Latin invidia. This is the second time this has happened to me, the first time being the connection between Asus and Pegasus. So much for the creativity part of my brain.

    • @Interestingworld4567 · 3 years ago · +5

      If you put it that way, I think Nvidia's name is trying to say that everyone else who is not them is jealous; "envidia" means envy in Spanish.

    • @SuperBlackmen10 · 1 year ago

      I have read another theory saying the name came from the similarity to the word "video"

    • @renecruz2964 · 10 months ago

      Lol I feel the same way

  • @tradito · 3 years ago · +10

    22:00 pretty sure they mean Graphics Processing Unit, not general processing unit.

  • @CodeCharmer · 2 years ago · +6

    Didn't some of the top engineers at Nvidia come from Silicon Graphics Inc? I think it's fascinating how people moved around from various companies. From the '70s to the '90s, I think fewer than 100 people really shaped the whole home computer market. Probably fewer than 50 in the chip and motherboard design space.

  • @la7era1u54 · 3 years ago · +4

    3Dfx, I haven't heard that company name in a long, long time. The first graphics card I ever bought was a 3Dfx. I remember buying it like it was yesterday. I hadn't played games on a computer since the Commodore days in the 80s, but I had very fond memories of playing games all weekend and summer with the other boys from the neighborhood, so when I was finally on my own I went and bought a PC and a game. I took it home and tried to play it. After getting practically zero framerate, I read the box and realized I needed a GPU. I knew what they were, but had no clue how to shop for one, so I started researching, and over 20 years later I still haven't stopped. That 3Dfx card was what got me interested in how PCs work and ultimately had me going back to school to take as many classes as I could on the subject

  • @EveGantian · 2 years ago · +2

    Insanely interesting, having grown up using computers in the whole period covered by this video, it's super cool to now understand all these brands, technologies, terminologies, strategies and decision making by the companies. Thank you so much for the information - I thoroughly enjoyed it. Please keep up the good work.

  • @scottfranco1962 · 3 years ago · +8

    Just one small addition: When Intel pushed onboard graphics, where the graphics memory was part of the main memory of the CPU, it was thought that this video solution would actually be faster, since the CPU would have direct access to the frame buffer, as well as having all of the resources there to access it (cache, DMA, memory management, etc). The reason they lost that advantage in the long run was the dual advantages of VRAM, or dual-ported video RAM - a RAM that could be read and written by the CPU at the same time as it was being serially read out to scan the video raster device - as well as the rise of the GPU, meaning that most of the low-level video memory access was handled by a GPU on the video card that did the grunt work of drawing bits to the video RAM. Thus Intel instead ran down the onboard video rabbit hole. Not only did they not win the speed race with external video cards, but people began to notice that the onboard video solutions were sucking considerable CPU resources away from compute tasks. Thus the writing was on the wall. Later, gamers only knew onboard video as that thing they had to flip a motherboard switch to disable when putting a graphics card in, and nowadays not even that. It's automatic.

    • @PaulSpades · 3 years ago

      Shared CPU-GPU memory is still a good idea; sadly, the graphics APIs didn't take advantage of it one bit, and I think Intel's graphics drivers were always a bit crap. As such, only game consoles and some mobile devices properly implemented a memory pool with both CPU and GPU access. The Vulkan API seems to treat GPU and CPU cores symmetrically, and shared memory on the Zen platform should finally fix this on mainstream PCs. [A short sketch of this shared-memory programming model follows at the end of this thread.]
      At this point, modern CPU cores support SIMD with AVX extensions; I don't know if there's any point having separate graphics-oriented APIs and architectures for GPUs... the architectures seem to have merged towards each other (GPUs getting more general instruction support and CPUs getting more parallel data instructions).
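
    What the shared CPU-GPU memory pool discussed in this thread looks like from the programmer's side can be sketched with CUDA unified memory as one concrete instance (consoles and Vulkan expose similar models); a minimal sketch under that assumption:

      #include <cstdio>
      #include <cuda_runtime.h>

      // The GPU updates a buffer in place; no explicit host/device copies anywhere.
      __global__ void increment(int* data, int n) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i < n) data[i] += 1;
      }

      int main() {
          const int n = 256;
          int* data;
          // One allocation visible to both processors; the driver migrates
          // or maps pages as needed, so there is no cudaMemcpy in sight.
          cudaMallocManaged(&data, n * sizeof(int));

          for (int i = 0; i < n; ++i) data[i] = i;  // CPU writes...
          increment<<<1, n>>>(data, n);             // ...GPU updates in place...
          cudaDeviceSynchronize();
          std::printf("data[10] = %d (expect 11)\n", data[10]);  // ...CPU reads back

          cudaFree(data);
          return 0;
      }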

  • @Ivan-pr7ku · 3 years ago · +7

    3Dfx was a typical cash-burning startup of the Dotcom era of the late '90s. They captured the PC gaming market at the right time with a simple, working solution, but they always struggled with evolving forward, with unusually long R&D cycles, while relying too much on recycling their old and proven tech. But alienating the OEM partners would be the biggest of a long string of missteps... and they had no chance to compete with Nvidia's 6-month release cadence.

  • @sarcasmo57 · 3 years ago · +6

    Oh dude! I remember those old Voodoos; they had a cable in the back of the PC that went from the video card into the 3D card. My first was the Banshee, 16MB of power! It was awesome at the time.

  • @Jimblefy · 2 years ago · +2

    Your videos are all very well researched and very interesting. Thanks to you and your team.

  • @RayMak · 3 years ago · +12

    I still remember their Voodoo Graphics Card

  • @hrishikeshb · 3 years ago · +4

    Great video, and I loved learning the history behind the giant. If I've read correctly, a GPU is a Graphics Processing Unit, not a General Processing Unit as you call it towards the end of the video - en.wikipedia.org/wiki/Graphics_processing_unit. The rise of the AI and ML sciences made these chips popular because they are faster by orders of magnitude than a regular CPU with more or less the same power consumption. Nvidia is leveraging its know-how in developing graphics chips to now capture this market.

  • @Edward135i · 2 years ago · +4

    3:31 SGI was so far ahead of everyone at this point that they could easily have been the size of Nvidia today. SGI created the graphics hardware for the N64, and the N64 dev kits were SGI workstations. What an incredible blunder they made not getting into the consumer space.

  • @robertmorrison107 · 3 years ago · +1

    Great video. I wish you'd do one on Matrox as well.

  • @wino0000006 · 3 years ago · +4

    Jeez - I remember all these graphics card revolutions in the '90s: ATI, Riva TNT, 3dfx Voodoo. It was the time when the AGP standard was introduced, before being replaced by PCI Express.

  • @jackeldogo3952 · 1 year ago

    I love this video; it's like what I lived through in the '90s trying to build and constantly improve my Frankenstein machines. All the churn with the video cards takes me down memory lane (3Dfx, Intel i740, D3D, OpenGL, AGP vs PCI, Real3D, Voodoo 1, 2, Banshee, etc). Thanks!

  • @aniksamiurrahman6365 · 3 years ago · +4

    A follow-up on the GPU wars would be awesome.

  • @crunchworks22 · 11 months ago

    I love your channel! I'm a mechanical engineer who switched from heavy machinery to semiconductor fab equipment engineering (CMP). Your videos have been so helpful to me learning how all this crazy wafer stuff works.

  • @djnavari · 3 years ago · +10

    This is an excellent channel. I'm definitely becoming a Patreon supporter; this content is so well curated. It's just technically excellent.

  • @esvegateban · 3 years ago · +19

    Don't forget that Nvidia never stopped selling their GPUs to third-party companies, so today there's a myriad of brands using Nvidia chips from which to choose.

  • @NatrajChaturvedi · 3 years ago · +2

    I was young during the days of games like Doom 2, Quake 1 and 2, Red Alert, etc., and I remember being able to run these games perfectly on our 486 machine without any discrete graphics installed. I know my dad, and he hated the idea of paying extra for discrete cards, so I'm sure there wasn't one in our system.
    We did have a Sound Blaster though, and the games looked and sounded awesome!!

  • @freddyng1843 · 3 years ago · +8

    Excellent video. One thing you missed was Matrox. I believe they were the 4th market player back in the early 2000s.
    Though Nvidia GPUs usually cost more than their rivals, stability is something I always look out for, and both Nvidia and Intel provide it. The red team has time and time again disappointed me on quality. The introduction of the Radeon Vega APU, though, changed the whole game for budget PCs again. I hope Intel's newest foray into the GPU space will bring a new level of benefits for consumers.
    Here are some of the Nvidia GPUs that I have owned and used:
    - Riva TNT
    - GeForce 2 MX
    - GeForce 6600 GT
    - GeForce GTX 660 Ti
    - GeForce GTX 770
    - GeForce GTX 1060
    Unfortunately, bloody crypto miners are ruining GPU prices for gamers now.

  • @BaghaShams · 3 years ago · +5

    GPU always stood for Graphics Processing Unit; GPGPU is the term for General Purpose Graphics Processing Unit

  • @GBlunted · 3 years ago · +3

    So nostalgic! All these brand names take me back to my childhood and asking Santa for Voodoo cards, MMX processors and AGP slots...

  • @alixxworkshop846 · 3 years ago · +2

    This was one of those videos that I really enjoyed watching and learned something new from... Good job SIR!

  • @2drealms196 · 3 years ago · +28

    Thank you for covering Nivea. Love their lavender infused moisturizing cream.

    • @Cypherdude1 · 3 years ago · +1

      LOL. Nivea is pissing off a lot of people. They are going to stop producing their 30 series video chips this month. At least Intel is going to produce new video cards. There seems to be a never-ending insatiable appetite for video cards. I'm wondering when this shortage will end, if ever. They're saying the shortage will remain for 18 more months. At this point, I don't think people will care who makes the card or what the performance is as long as they get SOMETHING which is stable.

    • @mattmexor2882 · 3 years ago

      @@Cypherdude1 Nvidia isn't going to stop producing 30 series cards this month

    • @michaelmassino6344 · 3 years ago · +2

      @@Cypherdude1 Look again at the spelling of the above comment. Nivea makes cosmetics.

    • @David-lr2vi · 3 years ago · +1

      😂

    • @sandyleask92 · 3 years ago

      @@mattmexor2882 This makes my purchase of an already overpriced 3080 Ti worth it. They are doing this to keep prices high, I hear.

  • @fisherfish9589 · 3 years ago · +4

    The greatest move NVIDIA did was acquiring the marketing genius Brandon “Atrioc” "glizzy hands" Ewing. He pitched and developed the iconic Greenout campaign. He also pitched "Frames Win Games" Esports campaign, which is currently the flagship Esports campaign for NVIDIA and its hardware partners. Truly the GOAT of marketing

  • @harrison3452 · 3 years ago · +5

    It’s a shame how hard it is presently to get hold of the new 3000-series GPUs from Nvidia. I really hope this changes. For a long time I built my PCs with AMD, but with the new GPUs they came out with, gaming is more affordable.

    • @Statek63 · 1 year ago

      Just wait for the crypto craze to die (in pain) and you'll be good 🙂

  • @justacasualgamer5774 · 3 years ago · +21

    Nvidia did a lot of dirty work to win market dominance ~ the CEO is cunning. That's what makes a successful company.

    • @ConorDoesItAll · 3 years ago · +11

      Yeah most companies have been stabbed by Nvidia but they have so much money that they don’t care.

  • @JohnnyWednesday · 1 year ago · +1

    The Cromemco "Dazzler" was the first graphics card in 1976. (and you're about 10 years too late for the graphics revolution - the Amiga was doing all of this in 1985 as well as the GUI. The screenshot of GLQuake you show is actually the software renderer - it doesn't use bilinear filtering. If you go to the place you sourced the image you'll see you took the wrong screenshot - this is a software shot next to the GL shot for comparison)

  • @pyrosphynx5449 · 3 years ago

    this took a lot of effort and it turned out great! well done

  • @williamogilvie6909 · 2 years ago

    Very informative. In 1986 I joined a company in British Columbia, Gemini Technology, that had great ambitions to dominate the graphics card market. We designed an EGA chipset using Daisy workstations. I taped out the design at LSI Logic in Sept. 1987, a few months after IBM introduced the VGA. Earlier that year we had spent a month in Japan, at Seiko Epson's design center. I don't know why that effort failed, since there was little transparency there. After completing the design at LSI, I moved to Video-7, an up-and-coming graphics card company that had just completed a VGA chipset and was selling graphics boards. My LSI Logic EGA chip, surprisingly, found some customers. At V-7 we examined the IBM PGA, trying to reverse-engineer the chipset, and I also helped integrate my previous design with a V-7 product. Gemini eventually failed and was absorbed by Seiko Epson. Video 7 merged with an LSI Logic division, but had some legal problems that required the CEO to wear an ankle bracelet. I continued designing custom chips and even ran into my former Gemini Tech. co-workers at a Silicon Valley design center. Most of all I enjoyed the time I spent in Japan.

  • @CandyGramForMongo_ · 3 years ago · +7

    This is an exceptionally objective and accurate description of the circumstances at the time. I was there, man! 😂
    Of course I loved 3dfx. Doom sold more hardware than anything!

  • @swamihuman9395 · 1 year ago

    - Excellent.
    - Thx for all your effort, and talent.
    - I got into computer graphics in the early '90s, so I experienced all the history you covered, but did not know any of the back story... until now. Thx, again.

  • @exinetv1894 · 3 years ago

    Man, I just love your voice and calm explanation style.

  • @trombonemain · 3 years ago · +3

    Wait, does “GPU” not stand for “graphics processing unit”?

  • @Mr30friends · 3 years ago · +15

    7:25
    A video on Malta and the fabs it has managed to lure in might be interesting.

    • @RandomByte89 · 3 years ago · +2

      Is that Malta the country, or Malta in NY?

    • @Mr30friends · 3 years ago

      @@RandomByte89 the country of course

    • @ntabile · 2 years ago

      @@RandomByte89 If Malta in NY, then GlobalFoundries

  • @lashlarue7924 · 3 years ago · +3

    This is such great stuff! I’m completely gobsmacked by the anecdote about how 3dFX burned its customers due to an ill-conceived business idea - and then lost the entire game! 🤯

  • @rollingrock3480 · 2 years ago

    Very informative! What a cool piece of history the GPU wars were! I remember playing Morrowind on the GeForce MX420 card back in 2002. I remember being dazzled by the water shaders. Give it 20 years from today and everything will be photorealistic. We're in for wild times coming into middle age.

  • @lexingtRick · 3 years ago · +5

    Québec's Matrox, the Parhelia, the origin of the ring bus. Canadian ATI picked it up. Matrox is still alive today.

    • @SianaGearz · 3 years ago · +2

      Parhelia, aka too little, too late, too expensive.

    • @aniksamiurrahman6365 · 3 years ago

      What's the situation of Matrox today?

    • @jcurran8860 · 3 years ago · +1

      We bought the latest Matrox hardware to get the upper hand in Descent II. Fun times in online gaming.

  • @RonJohn63 · 2 years ago

    18:37 I remember STB. They made a good Tseng Labs ET4000/W32 board that ran well with OS/2.

  • @petergambier · 1 year ago

    Great show, thanks Asianometry.
    My whole gaming world improved when I installed the Voodoo card; it was as different as night and day.

  • @user-7165jdhrnxymzn · 2 years ago · +1

    Great video! 👍🏻 Could you please make a video about Matrox, 3dfx, Number Nine? Thank you!

  • @Origina1saltine · 3 years ago · +1

    I wish this video had another 10 minutes explaining graphics card market shifts to date.

  • @ArnaudMEURET · 3 years ago · +2

    I owned every one of these early graphics cards and monitored Carmack’s .plan file eagerly in those years. The timeline is not entirely clear, and some events and declarations end up seeming intermingled, albeit not intentionally.

  • @fcfdroid · 2 years ago

    Now this is the video I was looking for! Good stuff dude 😎

  • @JamesLaserpimpWalsh · 3 years ago

    Cheers for the vid. Concise and accurate. Good work

  • @organichand-pickedfree-ran1463 · 3 years ago

    20:42 That Unreal Tournament screenshot is some good nostalgia :D

  • @epposh · 3 years ago · +18

    i've always thought that GPU stood for graphics processing unit... perhaps you mean to say GPGPU?

    • @William-Morey-Baker · 3 years ago · +7

      It does mean graphics processing unit... he was wrong... at least it was relatively small, but it makes me doubt a lot of the other stuff he says

    • @javaguru7141 · 3 years ago

      @@William-Morey-Baker All people make mistakes, even really braindead ones. It doesn't necessarily have any deeper implications.

    • @javaguru7141 · 3 years ago

      @@William-Morey-Baker I should also add that that kind of "gotcha" thinking can be very toxic and is often used to reinforce anti-intellectualism. Be careful to avoid bias.

    • @epposh · 3 years ago

      @@javaguru7141 I kinda had the same thoughts as William Morey-Baker. This wasn't just a slip of the tongue - this is basics, and if he can make a mistake in this, it makes me question his knowledge about computers. I wonder if he's just someone simply reading a script without understanding what he's reading, which I've seen happen too many times in other YouTube videos.
      P.S. This is a very simple point that William is making. Big phrases like "toxic thinking" or "reinforce anti-intellectualism" are completely out of place here. It sounds more like you saw this whole sentence somewhere else and decided to paste it here to appear smart.

    • @javaguru7141 · 3 years ago

      @@epposh *sigh*... I won't bother opening a conversation about the latter. But regarding the former, I watched the video again and I'm pretty convinced it really was just that - a slip of the tongue. NVIDIA really did start pushing GPGPU around that time. The acronym for that is General-Purpose GPU. You could easily backronym that as "General-Purpose Unit". I wouldn't be surprised if someone at NVIDIA actually uttered those words at some point. The important information he was trying to convey was accurate.

  • @williamlloyd3769 · 3 years ago · +2

    I recall getting these graphics boards as hand-me-downs from our stock trading floor. Traders were upgrading their desktops in the 1990s at will, so why let last year's model go to e-waste?
    PS - are you going to do a part 2 on the impact of digital currency on the graphics market?

  • @sepolopez6706 · 3 years ago

    Great video as always! Keep in mind that without ASML nothing would be possible. Thanks!

  • @Johnslist · 3 years ago

    Excellent overview and history! Thanks, well researched.

  • @zodiacfml · 3 years ago · +6

    0:15 Dang, Nvidia is now #11, next to TSMC, at $545B right now. I don't see any reason for it except the rumors that Nvidia is slowing down production to maintain high GPU prices

    • @elon6131 · 3 years ago · +2

      It's a BS rumour that makes no sense lol, as Nvidia sells the GPUs at the same price regardless of the final price to consumers. Nvidia's stock price has everything to do with their dominance in data center and AI, and very little to do with gaming sales, even though those still represent a good chunk of their revenue.

    • @zodiacfml · 3 years ago · +1

      @@elon6131 While I did not watch that rumor, Best Buy increased their prices on Nvidia cards.
      That is shocking new info. Nvidia received flak from some of their investors when crypto crashed in 2018, crashing the stock; they argued that Nvidia downplayed their revenue from graphics cards/mining in 2018.

    • @elon6131 · 3 years ago · +2

      @@zodiacfml And then Nvidia got mad at them and they had to roll it back. Though FE editions are the only ones Nvidia directly sells, the final retail price is not in their hands, nor do they benefit from any markup the retailer puts on the product. The Best Buy increase was a Best Buy move (consistent with their pricing on the other cards, until Nvidia had them roll it back). Nvidia's FE MSRP has not changed.
      I am aware they got flak after the crypto downturn, though it was most certainly not malicious, as Nvidia cannot know who buys their cards once they are in retailers' hands (and lying on reports would be very, very illegal. You do not mess with the SEC).

    • @zodiacfml · 3 years ago

      @@elon6131 good one.👍

    • @cedricchiu9763 · 3 years ago

      crypto mining

  • @hamsta11 · 2 years ago

    Great video! There's also the issue where 3dfx kept suing Nvidia, who just patiently bore the brunt of the lawsuits until they finally bought them. It's a good business lesson: don't worry about doing things your competitor might sue you for if you can plan to just eventually buy them out.

  • @angelg3986 · 3 years ago · +3

    Somehow Nvidia got the devs of machine learning frameworks to use their proprietary CUDA API instead of OpenCL or GLSL. This made a huge impact, making their GPUs the only ones capable of AI. It would be interesting to have a video on why/how all developers chose CUDA. It still seems a strange choice, locking them to a vendor.

    • @PaulSpades · 3 years ago · +1

      I'm not an AI guy, but... GLSL is OpenGL's hack for opening up the graphics pipeline to software - a C-like language to process vertex and pixel data. While it can be used to process data, it needs at least one more API layer above it, because it presents entirely the wrong abstractions for general computing. And the language compilers aren't anything to write home about either. OpenCL is that API layer above GLSL, and it's not a great API from what I've read. So CUDA being more popular is not at all surprising.
      Vulkan seems to be a lot better though - it presents GPU and CPU cores (or any other type of processors) the same way, with the same memory model. This was the whole AMD homogeneous compute idea; they've been executing it at the hardware level for more than a decade, and now the software is there to provide a decent programming model for it.
      TL;DR: there's no use making brilliant hardware if nobody can write software for it; the ABI, API, and software tools you present to a programmer matter a lot.

    • @angelg3986 · 3 years ago · +1

      @@PaulSpades CUDA wasn't so good at the beginning - just a proprietary way to write shaders. But I agree that AMD and the rest (Intel?) neglected what impact this would have a few years later.

  • @wonlop469 · 3 years ago · +9

    I can only speak to my experience as a PC builder since the late '90s.
    It was the drivers.
    At least in my experience, Nvidia drivers were generally more stable than their peers'.

    • @cpt_bill366 · 2 years ago · +1

      Uhh, NO. My Voodoo worked just fine out of the box, while my friends with Nvidia TNT cards were always complaining about constantly patching. To this day Nvidia's rapid patch schedule is annoying, because every time I patch I need to reset my resolution scaling. I just stopped patching as a result.

    • @wonlop469 · 2 years ago · +1

      @@cpt_bill366 You're going way back there, brother, to when we had to mess with IRQs and such. I think my first was a GeForce 6600 on AGP. I never messed with ISA cards so I can't speak to them. God, IRQs were a pain in the arse, along with 9600-baud modems.

  • @St0RM33 · 3 years ago · +3

    Hey, please volume-normalize your videos, because they are too quiet

  • @djmips · 11 months ago

    I worked at one of those 3rd-party board companies in the mid-nineties, and so far this is the only presentation that gets it right. Great job.
    I was even offered a job at Nvidia, who directly recruited me in their drive to vertically integrate graphics drivers. I am sad I said no.
    But I could tell that Nvidia was going to win it all: on my visits, they shared with me their secret inner sanctum where their computers sat running that vaunted design simulation. Their record of getting chips out on first turn was unmatched. It's almost like your observation about TSMC - they would win with cycle times.

  • @Drumaier · 2 years ago

    Nice video, and it was incredible to see that little graphics card Intel made in the '90s that went nowhere... same with Arc now. I don't understand why the #1 CPU maker in the world can't just make some competitive, good-performing GPUs. They have experience designing chips and the financial resources.

  • @HeLicks · 3 years ago · +1

    I was about to start production on a video essay about this exact topic...

  • @Bonifaquisha · 3 years ago · +6

    For those interested: The IBM Professional Graphics Controller is called 'PGA' because the A can stand for either Array or Adapter. In modern English the letters 'A' and 'C' are commonly used interchangeably, much like the letters 'ß' and '§'. This can often be confusing for those who speak British English because IBM chose to use the uncommon 'Controller', instead of the classic spelling 'Aontroller'.

  • @PlanetFrosty · 3 years ago

    I worked as Director of Development at a major SGI integrator; we later added Sun Microsystems, and I became CTO of a telecom manufacturer that partnered with one that had licensed equipment manufactured in Taiwan.

  • @TheRealSteviee · 3 years ago · +1

    The Voodoo was my very first GPU, paired with an Intel 333MHz single-core CPU and, if I remember correctly, 128MB of RAM. Ah, the memories :)

  • @MostlyPennyCat · 1 year ago

    I remember when I first had a go with Linux, it was Debian.
    I couldn't figure out how to install drivers.
    Right up until the point you realise they just get detected and loaded as kernel modules at boot or, at most, pulled in with an apt-get command.
    Certainly was an eye-opening moment in my journey as a software engineer and architect.

  • @refinedsolutions1513 · 3 years ago · +6

    Good to acknowledge the legend Mr. Carmack. Blessings from the Yoruba-Oduduwa.

  • @nufosmatic · 7 months ago

    1:15 - In 1991, triangle-database graphics engines were $M machines that needed a $M computer to run their databases, and they were used to provide displays for flight simulation trainers. I worked for a company that provided the simulation computer that requested a set of displays at 15/20/30/60Hz. Five years later those graphics machines were obsolete...

  • @picobyte · 3 years ago

    A double Voodoo2 with a Viper 550 was my cannon game system; later I got the 770 and so on. The early days of gaming were one of the best rides of my life.

  • @AediionTV · 3 years ago · +22

    Now this is a real Brandon "Atrioc" "Marketing Extrodidinare" "G.H." "Ewing" Ewing classic

  • @JohnGaltAustria · 3 years ago

    Very well done overview.

  • @bujin5455 · 3 years ago · +1

    A very interesting comparison there at the end between the different vertical integration paths. People have a hard time understanding which things you should bring in-house that will materially add to product quality, versus materially adding to complexity and being a burden on quality. Reminds me of the people who seem to think Apple buying TSMC is a good idea. Nothing could be further from the truth.

  • @lerntuspel6256 · 2 years ago · +1

    Hopefully this will also help with my interview with them soon...

  • @josecuervo3351 · 3 years ago

    Excellent historical summary!

  • @mkmishra.1997 · 3 years ago

    One piece of feedback I would add to otherwise great content: please increase the microphone volume output

  • @AzBachour · 3 years ago · +1

    I was very sad when Nvidia arrived and killed 3dfx; I used to love their cards... I had my beloved Voodoo 3 3000 😍

  • @FingersKungfu · 1 year ago · +1

    I still remember when the Voodoo cards ruled the market.

  • @MikkoRantalainen · 1 year ago

    Great documentary about GPU history! I'm old enough to have actually bought a Voodoo graphics card new with my own money.
    I feel that the drivers of NVIDIA cards are still the most important part of the technology. It appears that AMD (ex-ATI) hardware may actually have more computing power, but the drivers fail to surface it to applications.
    In addition, OpenCL took too many years to create, and NVIDIA's proprietary CUDA took the market; many apps (including the open-source Blender 3D) have proper hardware support for CUDA only, even today. AMD is supposed to have something cooking for Blender 3D version 3.5, but the actual performance remains to be seen.

  • @philflip1963 · 2 years ago

    More info on the detailed architecture of these systems would be interesting.

  • @tma2001 · 3 years ago · +1

    You missed a few crucial points - SGI made a lot of missteps of its own by the late '90s, and most of the talent (and IP) ended up at nVidia, which gave them a big advantage. Also, the NV1 was not polygon-based and thus incompatible with the graphics standards of the time (OK as an arcade game blackbox).
    Also wrong re. device drivers - OEMs got the source code to customise the branding, but that was it. There were plenty of PC workstation-based professional OpenGL cards around that undercut SGI (I should know, I worked for one) that could have been the next nVidia, but that heritage and mindset held them back, looking down on the consumer market until it was too late and the GeForce 256 arrived with integrated T&L (long available on workstations, but as separate chips and very expensive).

  • @CaptainDangeax · 10 months ago

    I was running a computer shop between 1997 and 2000, and I saw brands come and go. I sold a lot of 3dfx Voodoo1, Voodoo2 and Banshee cards. But in 2000, the Riva 128 and Riva TNT were arriving and changing the face of the market. Nvidia's king move was to handle the drivers themselves. Even with ATI there's always a wheelbarrow full of compatibility issues, even today

  • @KokoroKatsura · 3 years ago · +1

    Viewing on my Titan X (Maxwell) over VGA output on a CRT.
    Maxwell is the last GPU that actually supported analog output.

  • @Trick-Framed · 3 years ago

    What an awesome channel! Liked, commented, subbed! Thank you!

  • @MisakaMikotoDesu · 2 years ago

    Love these videos! Thanks!

  • @Kreln1221 · 3 years ago · +2

    *Excellent video..., though I feel that the recent rise of cryptocurrency mining and its effects on the GPU industry deserved a mention... Still..., very in-depth and informative... Thank you for posting it...*

    • @freeculture · 2 years ago · +1

      Nvidia made a mistake there and burned money for nothing. They tried to dictate what clients could use their products for. They made a similar mistake in their attempt to push Quadro, putting in artificial blocks. It turns out people quickly found ways to use their gaming boards both for science and mining, no matter what Nvidia did. Only the market regulated this on its own: the current bear trend, and the imminent ETH crash once it becomes un-mineable, lowered the demand.