This era was around the time I had to stop gaming for a few years. Well, I made some bad choices and wound up in juvie; I was sent in 2001 and got out in 2004 when I turned 18. Then I had to focus on getting my life started properly. It wasn't until 4 years ago that I got back into PCs. Seeing the MASSIVE changes that occurred while I was away, it's absolutely crazy. Back in the day, 1GHz was a HUGE deal. Now I'm sitting here with a Ryzen 5 3600 and it's crazy to have that sort of power and think "meh, mid range". If I had the system I have now back in, let's say, the early-to-mid 90s when I was really into PCs, it's so advanced that I would have thought it was "alien technology". Heck, I even remember standing with my uncle in front of a store called "The Good Guys" in Reno, NV, waiting for the store to open so we could snag a copy of Windows 95. I love seeing stuff from a time I wasn't able to participate in PC-related stuff, though.
Brilliant game - I went to Lionhead and helped beta test it for two weeks! I was about 14 or so at the time. They used to let people from their forums come and help beta test it. Shame Lionhead no longer exists.
I remember skipping a late afternoon class to go to a PC shop in the city to buy a Radeon 8500, which was a rival card to the GeForce 3. I was a boarding student and many of the kids had a GeForce 2, so it was pretty cool, for a bit, to be the kid who could run the Doom 3 demo, and any game (GTA 3) with sweet performance. Those were some good days of the PC: GPUs, sound cards, multimedia add-ons. A good Sound Blaster sound card could even noticeably boost your frame rate by taking the sound processing away from the CPU. So it was a bit of a double win, better sound and game performance.
@@Sirvanic GeForce 2 MX was a popular card. I remember most kids had that, and I probably would have had that card too, until the computer salesman told the parents it was a good card for games... and so a Radeon 7500 (not great at games) was put in. 🥲
At the Goodwill clearance outlet in 2001 we used to anticipate with glee finding a P3, K6-2, or Slot A tower for $8.99 in the electronics clearance room.
Lol, a whole 57 million transistors. The 3080 has nearly 30 billion; I love tech advancement. I remember the ATI Radeon 9700 Pro came out around this time or a little later. That thing was a beast for its time.
The 4090 has 76 billion. Even more ridiculous if you put that in perspective. They're already working on the 5000 series and it's rumored to release next year, which is not far away.
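That jump from 57 million to 76 billion actually works out to a surprisingly steady doubling rate. A quick back-of-the-envelope check (the transistor counts are approximate public figures, so treat the result as a rough estimate):

```python
import math

# Approximate published transistor counts (rough figures, not exact die data)
geforce3_2001 = 57e6      # GeForce 3 (2001)
rtx4090_2022 = 76.3e9     # RTX 4090 / AD102 (2022)

growth = rtx4090_2022 / geforce3_2001
years = 2022 - 2001
doubling_years = years / math.log2(growth)

print(f"~{growth:,.0f}x in {years} years -> doubling every {doubling_years:.1f} years")
# prints: ~1,339x in 21 years -> doubling every 2.0 years
```

Right in line with the classic two-year reading of Moore's Law, at least for GPUs.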
@@infinitecanadian It’s better to cover 5% less and not distract and annoy the overwhelming majority of viewers who have empathy, and who therefore cringe at his rude behavior.
It would take probably a decade after this episode was aired for your average game model to have near that level of facial detail, lol, that's the difference between a GPU demo and an actual game.
Half-Life 2 was probably the first title to equal that demo in facial detail. So only 3 years. And that's in a gameplay with revolutionary physics. Pretty damn impressive, IMO.
And like I said in another comment, the facial animation in the tech demo is not something actual games render in real time as a form of post-processing; it's scripted animation baked into the game engine. Otherwise you'd HAVE to have a GeForce 3 specifically or you'd get limited or no facial animation, which of course is not how games work. Ahem, apart from the abomination that was PhysX.
@uni blab But games that used it required you to have an expensive PhysX card with unnecessary overheads when other games had better physics integrated into their game engines and required no additional hardware.
16:59 How is it possible to be frenetic and sedate at the same time? I got the impression that Cheifet was thinking, "OMG, OMG, I gotta see what's next!", yet he looked outwardly mellow.
An Audigy card I bought with gaming in mind in my freshman year (2002) served me well through a couple of my PCs, up until 2020 when I bought my most recent one. I was quite disappointed that I wouldn't be able to use it, as the mobo I bought didn't have a single PCI slot. Thankfully the onboard sound adapter on the Crosshair VIII Formula turned out to be rather decent and gave me no problems whatsoever. But regardless of what they tell you, it's not all fun and games with every mobo made these days. Just the other day I put together a new PC based on an Asus Prime series mobo for my cousin, and I was appalled to hear hissing in the headphones when I moved the mouse. Brings me back to the Socket A years, when even better quality boards had the same issue with their crappy VIA AC'97-compliant codec. Bottom line: the general rule that built-in audio has two advantages - 1) it's there and 2) you can turn it off - is still valid.
I remember watching this episode when it first aired. Watching all these goodies, I was so jelly... I had an AMD Athlon Thunderbird 1GHz with 1 gig of RAM, with a GeForce 256 as the vid card. It wasn't great, but it wasn't mega trash either. The 4 most played games for me back then: Quake (all 3), Tribes, Diablo II and StarCraft. It was a golden age for sure.
Awesome times. I was playing the same games as you (except Tribes). Especially Diablo 2, man that sucked some hours up. My PC back then was a Pentium 3 800MHz, 384MB of RAM and a GeForce 2 GTS! It was quite a respectable PC, from memory.
I went to my computer store seeking a GeForce 3 and they just laughed at me. None in stock. I finally got my hands on one at the local Goodwill. I could play Sims, Quake II, even World of Warcraft was playable. Great product!
I remember playing Medal of Honor: Allied Assault online as a kid at like 20fps on my parents' old Pentium 3 PC. Then I saved money for a GeForce 2 Ti to upgrade the graphics card, and for an additional RAM stick to double it to 512MB, and suddenly I got 5x more fps in that game and my skill improved drastically. Back then the performance of hardware improved so fast. The GeForce 2 Ti still used an AGP 4x slot. (PCIe wasn't the standard yet)
@@JackoBanon1 Rest in peace, AGP! I remember having a Ti 4200 that my dad got in our PC that had a Pentium 4 and 256MB RAM. We partly got it for me (and him!) to play Command & Conquer: Renegade. A few years later we had to upgrade to a 6600 GT and add a 512MB RAM module when Oblivion came out, as the Ti 4200 lacked Shader Model 2.0.
I was 11 when this aired. 5:15 What he is talking about is games that don't have an invisible wall where you can't walk any further. This was the first graphics card that allowed games like MMORPGs, where the world feels limitless and is generated as you walk into it. That's amazing for 2001.
My dad bought me a Pentium 2 PC from Micro Center after I was discharged from an eight-month hospital stay for a traumatic brain injury (TBI) when I was nine. It couldn't play many games, including Final Fantasy 8 and Black & White. We knew nothing about computers. I had that PC until 2004, when we bought a Compaq which was tenfold better, but it also couldn't handle the hottest games. Replaced the integrated GPU with a GeForce FX 5200 and was riotously blown away! Six years later I had a new PC with a GTX 295 and GTX 750. Times have evolved--bittersweet.
@@jensfalkner7799 Hi Jens! I got hit by a car when I got off the school bus. I use a wheelchair to get around, but I'm doing well. Hope you are as well.
True. Moore's Law no longer applies like it used to. CPU clock frequencies do not double every year (or 18 months) anymore, nor have they for years. Analysts say that going forward from 2013, the number of transistors in CPUs will double every three years. It's no longer only about clock frequency, although that still plays a minor role, and clock frequency of course matters when you are comparing two chips of the same architecture. Cores and cache are king nowadays. The more the better.
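The gap between the old pace and the newer one compounds quickly. A small illustration, assuming idealized exponential growth (pure arithmetic, not real chip data): the classic 18-month doubling versus the three-year doubling mentioned above, over one decade:

```python
def growth_factor(years: float, doubling_period: float) -> float:
    """Idealized exponential growth: total multiplier after `years`."""
    return 2 ** (years / doubling_period)

fast = growth_factor(10, 1.5)  # 18-month doubling
slow = growth_factor(10, 3.0)  # 3-year doubling
print(round(fast), round(slow))  # prints: 102 10
```

Same mechanism, wildly different result: roughly a hundredfold gain per decade versus a tenfold one.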
Seeing this allows me to take a step back and appreciate the games that are possible today, even as someone who grew up on Doom in the 90's and was playing Quake II in 2001. Oblivion in 2006 was nothing short of a technical achievement, as was Crysis the following year, and though I mostly play JRPGs, it's because of this technology that they quickly caught up with the tech of the games I've mentioned above. Heck, the minimum system requirements for Trails of Cold Steel II far exceed the minimum for Oblivion, and it looks incredible, if that says anything! It's easy to take the technology of the past two decades for granted, especially with the regular emphasis on frames and graphics that exceed PS3/360 standards.
I remember this show and kind of miss it. 2001 was at the end of going to computer fairs and building your own franken rig from parts. Still playing Descent II with the Windows port. Looks much more dramatic on a modern video card. Lots of home brew missions available at the Descent Mission Database website. Material defender out.
Wow, GeForce 3! I had a GeForce 3 Ti 200 clocked above a Ti 500, and no card had as much longevity with me as that one. If memory serves, there were still a few games in 2004/2005 that could run on it! Today it rests in its original box, still in working condition.
I owned that card back in 2001. It was a great upgrade from my 3dfx Voodoo 5 5500 (which was a killer card from early 2000). The next card I got, the Nvidia FX 5700, wasn't as big of a leap. But the 6800 was, and the 8800 GTS was so amazing it maxed games for years.
Now let’s stop and think about that for a moment. The Voodoo 5 (I still have mine) was a killer card in 2000, but just a year later it no longer was, and there were great upgrade options for it (I went with a Radeon 8500 128MB). And now I use two Vega 64 cards which at this point are over 5 years old, and they’re still capable of 60 Hz gaming at anything between 1920x1200 and 6048x1200 depending on the title, with everything turned up to 11. Despite fond memories of the late 90s and early 00s, those were nasty times for gaming 🙄
My friend got back into PC gaming in 2001. He'd been away for quite a while, since around 1995, and he wanted me to build him "the best gaming PC - spare no expense". So I got him a T-bird 1.1GHz and a GeForce 2 Ultra. The card cost £360, which was INSANE back then, considering a GeForce 2 GTS was more than adequate for gaming and was less than half that price! But his PC sure did fly. At LANs, everyone would just gather round his machine and watch loops of 3DMark 2000. After maybe 6 months, he called me one day and asked me to go over to his place. He said thanks for the gaming sessions, the LANs, etc... but he'd had enough and was buying a GasGas trials bike, so he didn't need the card anymore, and he gave me the GeForce 2 Ultra for £100! That replaced my GeForce 2 MX 200, and even though I was only running a Duron 750 at the time, every game I played just flew! I still have that card, 22 years later, in a retro rig. Had to replace the fan some years back but it still works.
It's amazing watching how these graphics and computers came to be at that point in time, when just 16 years earlier the show had revolutionary bips and blops as groundbreaking graphical capabilities.
Still a bold statement, I would say. But I get what you're saying. On the other hand, though, I remember playing games from that era and thinking "wow, this texture looks realistic" and things like that. It's all about perception.
I thought you were making a bad joke about the actor Will Smith, then halfway through the segment I recognized his voice and realized he was Will from Tested.
The reason this was mind-blowing to us back then is because we'd made do with 2D sprites made to look 3D. Jumping from 2D to 3D, all while using consumer-grade components, was a big deal. It was at that point that certain standards were set for game development, making everyone leap into the golden age of graphical fidelity.
Love all these old cards! I was right into making music so Creative Sound Blaster cards were my thing and making beats in Fruityloops. I still have a few running in my old school beige towers I have in the back games room... 🤘🏼
What's funnier is that we haven't come as far in the last 12 years as in the 12 years prior to this show; instead, we had to resort to stuffing more CPUs into the system.
The problem is, we've about hit the limit of how many transistors we can etch on a silicon die with current process technology, so we're instead getting small efficiency gains each generation and cramming more cores in. Of course effectively using multiple cores is still a tough nut to crack, so most of those added cores aren't really being put to use in your average applications.
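The "tough nut" part is that only work which splits into independent chunks spreads across cores cleanly. A minimal sketch in Python (the function names and chunking scheme are my own, just for illustration):

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    # Each worker sums squares over its own independent sub-range.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_squares(n, workers=4):
    # Split [0, n) into one chunk per worker; this parallelizes well
    # only because the chunks share no state at all.
    step = n // workers
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_squares(10_000) == sum(i * i for i in range(10_000)))
    # prints: True
```

Anything built around shared mutable state (a game loop, a UI, most everyday applications) doesn't decompose this neatly, which is why those extra cores often sit idle.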
Back in 2001 I was into Macs. My coworker was a hard core pc guy. He was raving about gaming on pc and laughed at me because it was impossible on a Mac. Long story short 2 years later I got into PCs and never looked back.
@@fraizie6815 That is not how computer pricing works. Back in 1979, a high-end system cost $2000. Same in 2005. Same in 2023. The reason is that the cost of production decreases. The same is true for hard drives. In 2005 it was $109 for a high-capacity (250GB) drive. Today, a high-capacity drive is the same price but the size is now 6TB. The price point stays the same.
@fraizie6815 they were always proud of their product. I remember I had a sh*tty windows Vista with an Nvidia card that cost around $1,000. Could not even play minecraft barely lol
I'm seeing a lot of people saying technology progression has slowed down a lot. That's because we've hit a wall with electron-based technology. We have process nodes down at 7nm. To put that in perspective, visible light has wavelengths of around 380-700nm, and even the deep-UV light used in lithography is 193nm, so features are already printed far smaller than the light that draws them. We are at a wall, and we are going to have to get creative to get past it.
@@stevensavoie856 Tbh, my parents, who are in their 50s, can't even tell the difference, not to mention my grandma. So many of the subtle details we're familiar with when we look at CGI are invisible to older people.
It's amazing because those graphics became available to the home user/gamer in 2001, while some years prior, e.g. in 1993, you had to spend $500,000 on a specialized machine to generate graphics like those!
I remember 2003, when I saw the tech demo of STALKER: Oblivion Lost and the graphics blew me away: dynamic lighting and shadows, reflective surfaces, realistic bullet physics, physics in general, ragdolls, vehicle physics, complex foliage, high-resolution textures. The early days of modern 3D gaming really were a great experience when you had Far Cry, Doom 3 and FEAR, not to mention Half-Life 2. Really huge nostalgia for those days, especially with a CRT monitor and that sweet nonexistent motion blur that plagues LED screens even today.
@@mymusicplaylists5163 Resolution does not really do much; it makes objects a bit sharper in the background and that is about it, LOL. Ray tracing is cool though, but STALKER also had ray tracing in 2007.
One of my favorite games of all time was released around this time - the original Deus Ex! I don't play it anymore, because the multiplayer has understandably long since died out, but I go back to it every few years to re-play the single player campaign. I'll probably still be playing it 50 years from now. Unless I'm dead.
Hard to believe this was 19 years ago. The video looks like something almost out of the 80's or early 90's. If not for the tech shown (GeForce 3), it would be fair for people to think this video is MUCH older.
@@Neodestro Not surprising, since the show started in 1983. The format never really changed much, so by 2001 it was getting pretty dated, and in fact was on the way out - it ended in 2002.
American TV always looks a little strange, even today. You can always tell a show that's filmed in the U.S. I don't know what it is, it just never looks as 'real' as TV from other countries.
My one gripe with this generation of the show. Earlier in the show's history they would just cover 15% less and let stuff breathe and not rush every guest.
I remember that Nvidia demo :). I think I still have my Thermaltake Orbs somewhere too. Around this time I had an Antec SX830, an Asus board and, I think, a Duron overclocked with the pencil trick, hehe. I had a FOP38 in that system, which eventually snapped off, and that was the end of my Duron. So many good gaming memories back then. QII all day :).
RIP Gordon, you don't know how many people you have inspired and taught over the last 25+ years, right now any techtuber (and tech companies I imagine) is going to be mourning heavily. Thank you.
You can laugh, but in 2001 these 3D animations were actually mind-blowing, and even today they don't look so bad!
You really had to be alive then to understand. When Half-Life came out, I couldn't imagine anything with better graphics.
Plutonius X I remember that in '93 with Doom!
@@UrielX1212 Oh, be there when we had Pong, the Atari 2600, etc., then talk to me about retro stuff lol
@@shaolin95 Haha. It never does stop. Wait until you've lived with sticks and stones, then talk to me about retro stuff.
The Radeon 8500 was so much more powerful than the GeForce 3, and it also had TruForm.
I love how they have to just awkwardly sit there after they give their presentation and he moves on to the next guy.
You shall say nothing got it!!
haha its like interacting with game characters
when your PC can render all these people in real life... their lives are worthless
Wonder what would happen if the previous guest came over to have a look at the next presentation
It's funny because they're doing it live. TV used to be live, whether it was a show or the news; nowadays mostly news does it. And that chameleon demo from 2001 was as impressive back then as it looks good today. They're using the GPU's entire power to render a relatively small but incredibly detailed scene; stretching that performance across an entire game downgrades the graphics big time. It wasn't until 3 to 5 years later that we saw something like that in games, I think: Half-Life 2, Doom 3, etc.
From the advent of Windows 98 to where we got three years later in 2001, tech moved faster than we could think. I'm still using my 5-year-old i7-2600K. Five years back then could take you from a 486 to a Pentium 4. Those were quantum leaps in computing power every generation.
Lol the 486 was shit just a few years later.
If you go another five years back the other way it’s even more drastic!
@@bakednerd True, the software now has to catch up to the hardware. My 7-year-old i5 plays Crysis 3 without a stumble.
@@Crashed131963 Crysis 3 is not new
@@nadirjofas3140 True
but it is still on the 2019 top list of most hardware demanding games. geekculture.co/9-graphically-intensive-games-that-will-push-your-desktop-to-its-limits-in-2019/
I really miss the informative computer shows.
Yeah all we get today is these try-hard buff dudes that act and sound like they've had a line of coke before filming.
@@FlyboyHelosim that sounds nothing like Linus.
@@CACOE_ Who mentioned Linus?
That's because the evolution of technology has slowed down in the last 5 years.
informative my ass, these clips were a bunch of fucking ads by PR shills
You guys might not believe this, but I actually called the number at the end to purchase a videotape copy of the program, and I received my VHS copy in 2023! The guy on the other end of the line told me they've been renting a warehouse full of tapes since the late 80s and are bleeding out financially each month, but he's still really passionate about The Computer Chronicles and wouldn't change anything.
That's truly Amazing.
Do you even have a VCR 😄
Sure......"they are bleeding out financially each month but hes still really passionate about the computer chronicles and wouldn't change anything."
Yeah you're right I don't believe you
It's hard to believe. 2001 feels like yesterday. My son was born in that year. He is already 20 years old.
You need Jesus, you know that right?
yeah but it was decades ago now😭😭😭😭😭😭😭😭😭😭😭😭
@@AOKONE ua-cam.com/video/eatIzqwB2dA/v-deo.html
Christ died for your sins and rose on the third day, showing that anyone who trusts in him for salvation, will have everlasting life.
(John 11:25-26) "Jesus said unto her, I am the resurrection, and the life: he that believeth in me, though he were dead, yet shall he live:And whosoever liveth and believeth in me shall never die. Believest thou this?"
(John 3:16) For God so loved the world, that he gave his only begotten Son, that whosoever believeth in him should not perish, but have everlasting life.
@@AOKONE They don't know they are indeed the chosen generation, them and their descendants
@@AOKONE why would I need a man who died 2000 years ago?
PC master race from 2001! such memories
remember to get yourself a GeForce 3 man it's the bomb and all the rage right now
PC master race from 1989 checking in
The guests just kind of 'switch off' like robots when the host moves from one to the next. 19:16
LOL
LOL
Noticed that, laughed, and then heard "Will Smith" and laughed more.
Well that dude was from Intel so I wouldn't be surprised if he's some type of droid
I noticed that on other CC videos too. The guests never leave, they just sit there, or type on a keyboard, it's so strange
I didn’t realize this show ran this long. I’m used to watching the 80s ones. It really shows how quickly things moved at the time.
This really makes me feel uneasy about how old I am... 2001 still seems so close, yet it was 19 fuggen years ago. I was 16 back then and thought this is it, I am living in the future, playing Counter-Strike and Warcraft 3 and Diablo 2, being excited AF...
In 2023: this video hit UA-cam in 2013, so we are now almost as far away from its upload as the upload was from the original broadcast.
I love this show. I love retro PC gaming and these videos take me back to a better time. I also love looking at the games in the background on the shelf.
Pentium guy: "If you notice, all the trees are blowing in the wind."
Stewart: "Yeah, that's cool."
Classic Stewart.
I love how they're showing some of the best games from that era: Citizen Kabuto, Sacrifice, Black & White; it was really a golden age for gaming.
I loved Black & White, wish it would return
@@timg2973 Same. I'd love a new game or maybe even a remaster.
And those weren't even the best game of 2001, more like the crappiest and most overrated (especially that piece of unplayable shit that Black & White was, the only thing that software debacle did was further establish Molyneux as one of the biggest lying hacks in the game industry)
@@ShadowAngel18606 What would you point out as one of the best games in that era?
@@JH24821 I don't think too many can hold up to today's standards, not even with all the nostalgia in the world.
But Heroes of Might & Magic 3, Jagged Alliance 2, System Shock 2, Fallout 1 & 2, Baldur's Gate 1 & 2, Gothic 1 & 2 (if you can tackle the controls) and GTA: Vice City hold up, I would say.
Story-wise, Planescape: Torment still ranks as one of the best-written games and shows how bad writing in games actually was and, sadly, still is (Bethesda is the prime example of this; their writing is atrocious and always was).
I recently tried to play that Diablo 2 remaster with a friend and I was shocked at how horrendous and limited it was compared to more modern titles like Grim Dawn. It's so bare-bones and downright boring in every aspect, I couldn't believe it (I hadn't played it in over 15 years).
Same with C&C Generals, which I recently tried to play. Incredibly stupid AI, with shoddy pathfinding and completely unbalanced gameplay (Super Weapon General is basically invincible, since you can't do anything against that death ray from the sky).
The problem with games back then is that a lot of them aren't fun because you're so spoiled today that you quickly notice things like the braindead AI. Either the AI cheated like crazy (especially in strategy games like HoMM) or it was pre-scripted, and you could easily figure out its paths and trick it (Mafia is a great example of that: every AI enemy follows a scripted path and doesn't really react to what you do. There's a mission where you have to steal a truck. If you miss it, the truck speeds away, but instead of actually trying to escape from you, it follows a scripted path and drives in a circle; you can pretty much wait for it to show up again. It's laughable)
It is a whole lot better today, even if a lot of games suffer from being too easy in general (the best example is Rockstar and how their games became easier and easier starting with Bully, compared to the insane time limits, precise jumping ("Just follow that damn train!") and bullet-sponge enemies of the older games).
Most genres on PC aged horrendously, especially sports, racing and most shooters (again, horrible AI and too many limitations).
In that regard, console games aged better, since for example most sports games were pure arcade, and titles and franchises like Tony Hawk, Mario Kart, NHL Hitz, NBA Street, Burnout, the old AKI wrestling games or Blitz the League hold up because they are unrealistic to begin with.
Gordon the GOAT here with another great, Will Smith. I have followed Gordon's work since the Boot magazine days. Still hard to take in that we won't have your insight on future tech releases. You were one of the pillars of the tech reviewing industry. So much nostalgia in this video. I remember that AW case, spent countless hours playing Black & White (please make a new one), and reading Gordon's and Will's articles every month in Max PC.
It's really great to see with this show the progress from DOS to Windows 3(.1) to XP.
Really nostalgic.. or retro, can't really tell.
It's both
What's more impressive is the fact he thought a comb-over was a good idea throughout the years...
What's apparent is that progress has really slowed down in the last 20 years. A game from 2001 still looks ok today, whereas a game from 1980 looked like caveman graphics to someone in 2000.
What games are you playing? Call of duty?
lmao, a game from 2001 is not okay hahah. wtf
The 2000s, 2001 specifically, was the start of the PS2 era, which was still pretty bad in polygons, and just graphics in general.
That's subjective, but older games, at least to me, lack a lot of polish that we take for granted nowadays. Graphics-wise we have been spoiled by newer games, so we can see the smoke and mirrors and they feel less immersive.
A game from 2001 still looks great? Have you been living under a rock?
Those Nvidia graphics processors will never make it; my money is on those Philips sound cards...
Sorry to tell you buddy, Philips lost
Right next to my xerox mouse
It’s kind of odd to watch that music demonstration and come to the realization that all the great songs of the 2000’s had not happened yet.
@@arckanum332 r/wooosh
@@therainmakerinsider r/wooosh
Started with a SFF Dell "Clamshell" Optiplex GX280 with a 2.8 GHz Prescott Pentium 4 "521" HT, 1 GB blazing 800 MHz Dual Channel DDR2 and an ATi Radeon X300 SE with a whopping 128 MB of VRAM, all running on Windows XP Professional SP3.
This was in 2008 for me as my first PC, it was already useless and outdated in a world with Core i chips, however It had some of my best memories, even better than my built PC now, It's extremely cool and "rocket science" to you as a kid. It's crazy how far we've come in 20 years.. And at the time all this stuff was bleeding edge. I love this channel for preserving PC and tech history, Thank you.
16:45 "absolutely, absolutely" I hate that kind of salesman BS. It's obviously LESS demanding on the GPU because it doesn't have to render the car.
Yeah, he says it's because it's larger, but really it just moves the camera a little bit forward so the car is not rendered
absolutely
I think Stewart picked up on that too, considering he cut the guy off right after that
I don't think either one of them really understood rendering tech, lol.
the car has to go faster. new view = faster car. absolutely.
I remember building an Athlon XP with a GeForce 3 Ti 200 that year. Think it was on an Abit motherboard with 256MB RAM and a 100Mb NIC.
Haha I forgot about abit
Abit was the shit!
Damn, Abit used to make some legit stuff. Whatever happened to that brand.
Abit BH6 was an amazing motherboard!
@Qimodis network interface card
This era was around the time I had to stop gaming for a few years. Well, I made some bad choices and wound up in juvie for a few years. I was sent in 2001 and got out in 2004 when I turned 18. Then I had to focus on getting my life started properly. It wasn't until 4 years ago that I got back into PCs. Seeing the MASSIVE changes that occurred from when I was away to recent times is absolutely crazy. Back in the day, 1GHz was a HUGE deal. Now I'm sitting here with a Ryzen 5 3600 and it's crazy to have that sort of power and be thinking "meh, mid-range". If I had the system I have now back in, let's say, the early-mid 90s when I was real into PCs, it's so advanced that I would have thought it was "alien technology". Heck, I even remember standing with my uncle out in front of a store called "The Good Guys" in Reno, NV waiting for the store to open so we could snag a copy of Windows 95.
I love seeing stuff from a time I wasn't able to participate in PC-related stuff, though.
I wish I had a mid range graphics card
I’m glad to see you got your life back on track. Stories like these are genuinely nice to hear
You could have had an i9 9900K this Black Friday for USD $320
Can probably still play through the best games of 2001 to 2004. If Windows 10 doesn't support them, might have to install Win XP lol.
@@Andytlp lol. yeah, most likely.
My latest iMac would still be struggling on these games.
That's cause it's a Mac, jk
@@josephroblesjr.8944 I mean.. you aren't wrong...
Apple buyers don't care about performance or technical details, just the premium factor.
Holy ****!! Black & White!! I remember playing that game and loving it as a kid! Bro the amount of technological improvement in 19 years is incredible
Brilliant game - I went to Lionhead and helped beta test it for two weeks! I was about 14 or so at the time. They used to let people from their forums come and help beta test it. Shame Lionhead no longer exists.
man i love watching this stuff today. nostalgic for sure.
I remember skipping a late afternoon class to go into a PC shop in the city to buy a Radeon 8500, which was a rival card to the GeForce 3. I was a boarding student and many of the kids had a GeForce 2. So it was pretty cool, for a bit, to be the kid that could run the Doom 3 demo, and any game (GTA 3) with sweet performance. Those were some good days of the PC: GPUs, sound cards, multimedia add-ons. A good Sound Blaster sound card could even noticeably boost your frame rate by taking the sound processing away from the CPU. So it was a bit of a double win: better sound and game performance.
@@Sirvanic GeForce 2 MX was a popular card. I remember most kids had that, and I probably would have had that card until the computer salesman said to the parents it was a good card for games... and so a Radeon 7500 (not great at games) was put in. 🥲
I used my Radeon 8500 for a long time
At the Goodwill clearance outlet in 2001 we used to anticipate with glee finding a P3 or K6-2 or Slot A tower for $8.99 in the electronics clearance room.
Finally, a YouTube video that will teach me to upgrade my gaming personal computer!
They didn't seem that worried about copyright issues by playing that music back then.
Considering Philips is a record company as well, it was probably stuff they owned the rights to anyway
Man these games look amazing, and it's almost 2020.
True
And, they're just DX 8 games.
@@ebayerr A lot of those games were actually built on DirectX 7.
@@ShadowAngel-lt8nw Crazy
The chameleon and genie demos are something I remember very much. Feels like it wasn't that long ago.
I remember playing Sacrifice. Loved the game. Also Evolva was a classic.
almost 20 years, and it still looks stunning!!!!
Lol, a whole 57 million transistors. The 3080 has nearly 30 billion; I love tech advancement.
I remember the ATI Radeon 9700 Pro came out around this time or a little later. That thing was a beast for its time.
The 4090 has 76 billion. Even more ridiculous if you put that in perspective. They're already working on the 5000 series and it's rumored to release next year, which is not far away.
19:39 Chieftec Dragon is the name of the case for anyone wondering.
Those cases are wild. My main PC is in one of those
The whole time I was watching this, I was like.. jee.. let them finish talking.
Same lmao
It was a 30-minute program which barely fits all the information into that timeframe, so be grateful he knew what he was doing 😂
Mr. Cheifet has to keep the show moving. He has to fit all that information into 23 minutes or so.
@@Mattribute It's better to cover 5% less and not distract and annoy the overwhelming majority of viewers who have empathy, and who therefore cringe at his rude behavior.
@@Mattribute He is not being rude. He has a job to do.
It would take probably a decade after this episode was aired for your average game model to have near that level of facial detail, lol, that's the difference between a GPU demo and an actual game.
@navid ahmad I said "average game model", Crysis was well.....Crysis, and even then the facial detail in that was still not near this demo.
Half-Life 2 was probably the first title to equal that demo in facial detail. So only 3 years. And that's in a gameplay with revolutionary physics. Pretty damn impressive, IMO.
And like I said in another comment, the facial animation in the tech demo for example is not rendered in real-time in actual games as if it's some form of post-processing, it's scripted animation hard-baked through the game engine. Otherwise you'd HAVE to have a GeForce 3 specifically or you'd get limited or no facial animation. Which of course is not how games work. Ahem, apart from the abomination that was PhysX.
@uni blab But games that used it required you to have an expensive PhysX card with unnecessary overheads when other games had better physics integrated into their game engines and required no additional hardware.
HL2, LUL. I mean, it was the "GAME CHANGER" imo in '04.
16:59 How is it possible to be frenetic and sedate at the same time? I got the impression that Cheifet was thinking, "OMG, OMG, I gotta see what's next!", yet he looked outwardly mellow.
Good weed.
Rewatching the video, notice at 3:29 that monitor is a Radius Artica/Silicon Graphics 1600SW but with a different skin
I remember buying a "gaming sound card" back in the 2000s, and it used to make a huge difference
Nowadays even mid-range motherboards have decent sound chips.
You'll still hear a huge difference with an ASUS Essence STX II, though.
An Audigy card I bought with gaming in mind in my freshman year (2002) served me well throughout a couple of my PCs up until 2020, when I bought my most recent one. I was quite disappointed that I wouldn't be able to use it, as the mobo I bought didn't have a single PCI slot. Thankfully the onboard sound adapter on the Crosshair VIII Formula turned out to be rather decent and gave me no problems whatsoever. But regardless of what they tell you, it's not all fun and games with every mobo made these days. Just the other day I put together a new PC based on an Asus Prime series mobo for my cousin and I was appalled to hear hissing in the headphones when I moved the mouse. Brings me back to the Socket A years, when even better quality boards had the same issue with their crappy VIA AC'97-compliant codec. Bottom line: the general rule that built-in audio has two advantages, 1) it's there and 2) you can turn it off, is still valid.
hey Kyle, thank you for bringing me here!
I remember watching this episode when it first aired. Watching all these goodies, I was so jelly....
I had an AMD Athlon Thunderbird 1GHz with 1 gig of RAM, with a GeForce 256 as the vid card.
It wasn't great, but it wasn't mega trash either. The 4 most played games for me back then...
Quake (all 3), Tribes , Diablo II and StarCraft. It was a golden age for sure.
Awesome times. I was playing the same games as you (except Tribes). Especially Diablo 2, man, that sucked some hours up. My PC back then was a Pentium 3 800MHz, 384MB of RAM and a GeForce 2 GTS! It was quite a respectable PC from memory.
I remember when I got my first Voodoo2, still have it complete in box. PC gaming is something that will always be in my heart.
I went to my computer store seeking a GeForce 3 and they just laughed at me. None in stock. I finally got my hands on one at the local Goodwill. I could play Sims, Quake II, even World of Warcraft was playable. Great product!
When was this? Yesterday?
I remember playing Medal of Honor: Allied Assault online as a kid at like 20fps on my parents' old Pentium 3 PC.
Then I saved money for a GeForce 2 Ti to upgrade the graphics card and for an additional RAM stick to double it to 512MB and suddenly I got 5x more fps in this game and my skill improved drastically.
Back then the performance of hardware improved so fast. The GeForce 2 Ti still used an AGP 4x slot. (PCIe wasn't the standard yet.)
@@JackoBanon1 Rest in peace, AGP! I remember having a Ti 4200 that my dad got in our PC that had a Pentium 4 and 256MB RAM. We partly got it for me (and him!) to play Command & Conquer: Renegade. A few years later we had to upgrade to a 6600GT and add a 512MB RAM module when Oblivion came out, as the Ti 4200 lacked Shader Model 2.0.
I was 11 when this aired. 5:15 What he is talking about is games that don't have an unwalkable spot where you can't walk anymore. This was the first graphics card that allowed games like MMORPGs, where the world is limitless and generated as you walk into it. That's amazing for 2001.
nVidia's flagship graphics card only costs $399. What a paradise!
Gordon from MaximumPC still around. Just aged a bit.
My dad bought me a Pentium 2 PC from Micro Center after I was discharged from an eight-month hospital stay for a traumatic brain injury (TBI) when I was nine. It couldn't play many games, including Final Fantasy 8 and Black & White. We knew nothing about computers. I had that PC until 2004, when we bought a Compaq which was tenfold better, but it also couldn't handle the hottest games. Replaced the integrated GPU with a GeForce FX 5200 and was riotously blown away! Six years later I had a new PC with a GTX 295 and GTX 750.
Times have evolved--bittersweet.
What happened? Bike or skateboard accident? Hope all is fine now with your brain. 💪
@@jensfalkner7799 Hi Jens! I got hit by a car when I got off the school bus. I use a wheelchair to get around, but I'm doing well. Hope you are as well.
9:43 I see that Rogue Spear icon!!! Anyone else play on MSN Gaming Zone and do clan ladder??? Man, good times
I played on the Zone! My game was Combat Flight Simulator 1. Man, I miss the ridgerunners chat.
Man, I remember Rogue Spear. All the rooms on MSN gaming zone were called LAG = BOOT! Good times.
I did on Black Thorn; my clan was OoPA. I remember this dude named Benny Blanco that would crush me all the time! Can't remember his clan tho!
True. Moore's Law no longer applies like it used to. CPU clock frequencies do not double every year (or 18 months) anymore, nor have they for years. The analysts say that going forward as of 2013, the number of transistors in CPUs will double every three years; it is no longer only about clock frequency, although it does play a minor role. Also, clock frequency of course matters when you are comparing two chips of the same architecture. Cores and cache are king nowadays. The more the better.
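The "doubling every N years" framing above is easy to put in numbers. A minimal sketch, with illustrative figures only (not tied to any specific chip):

```python
# Sketch of the "transistor count doubles every N years" framing above.
# Illustrative only; real products don't follow the curve exactly.

def projected_transistors(start_count: float, years: float,
                          doubling_period: float) -> float:
    """Project a count forward, assuming one doubling per period."""
    return start_count * 2 ** (years / doubling_period)

# Classic ~2-year pace: a decade gives 2^(10/2) = 32x the transistors.
assert projected_transistors(1, 10, 2) == 32

# The slower 3-year pace mentioned above: only ~10x in the same decade.
print(projected_transistors(1, 10, 3))  # ~10.08
```

The gap between 32x and ~10x per decade is why the slowdown feels so dramatic over longer spans.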
IDK why this came up in my feed, but I can't stop watching it.
Seeing this allows me to take a step back, and appreciate the games that are possible today, even as someone that grew up on Doom in the 90's, and played Quake II in 2001. Oblivion in 2006 was nothing short of a technical achievement, especially Crysis the following year, and though I mostly play JRPGs, it's because of the technology that exists that they eventually caught up quickly, with the tech of the games I've mentioned above.
Heck, the minimum system requirements for Trails of Cold Steel II far exceed the minimum for Oblivion, and it looks incredible, if that says anything! Easy to take the technology that has come about in the past two decades for granted, especially with the regular emphasis on frames and graphics that exceed PS3/360 standards.
I'll never forget the moment I stepped out of the sewers in Oblivion and was greeted with that view...
I remember this show and kind of miss it. 2001 was at the end of going to computer fairs and building your own franken rig from parts. Still playing Descent II with the Windows port. Looks much more dramatic on a modern video card. Lots of home brew missions available at the Descent Mission Database website. Material defender out.
I like Homeworlds!
The background cosmos was crazy-beautiful in those games.
I love how seriously Stewart takes things. He totally gets into what his guests are saying. Total professional! Love it
Wow Geforce 3!
I had a GeForce 3 Ti 200 clocked above a Ti 500, and no card had so much longevity with me as that one. If memory serves, I think there were still a few games in 2004/2005 that could still run on it!
Today it rests in its original box, still in working condition.
this episode is a banger
I owned that card back in 2001. It was a great upgrade from my 3dfx Voodoo 5 5500 (which was a killer card from early 2000). The next card I got, the Nvidia FX 5700, wasn't as big of a leap. But the 6800 was, and the 8800 GTS was so amazing it maxed games for years.
and now it's the 4000 series cards with enhanced ray tracing. Where will it all end?
Now let’s stop and think about it for a moment there. Voodoo5 (I still have mine) was a killer card in 2000 but just a year later it no longer was and there were great upgrade options for it (I went with Radeon 8500 128MB). And now I use two Vega 64 cards which at this point are over 5 years old and they’re still capable of 60 Hz gaming in anything between 1920x1200 and 6048x1200 depending on the title, with everything turned up to 11. Despite fond memories of the late 90s and early 00s those were nasty times for gaming 🙄
My friend got back into PC gaming in 2001. He'd been away for quite a while, since around 1995, and he wanted me to build him "the best gaming PC - spare no expense". So I got him a T-bird 1.1GHz and a GeForce 2 Ultra. The card cost £360 which was INSANE back then, considering a GeForce 2 GTS was more than adequate for gaming and was less than half that price! But his PC sure did fly. At LANs, everyone would just gather round his machine and watch loops of 3DMark 2000.
After maybe 6 months, he called me one day asked me to go over to his place. He said thanks for the gaming sessions, the LANS, etc ... but he'd had enough and was buying a GasGas trials bike, so he didn't need the card anymore, and he gave me the Geforce 2 Ultra for £100! That replaced my Geforce 2 MX 200, and even though I was only running a Duron 750 at the time, every game I played just flew!
I still have that card, 22 years later, in a retro rig. Had to replace the fan some years back but it still works.
What about Mario's face in Super Mario 64? The title screen allowed you to mess with Mario's face, and that game came out in 1996
These game demos are still very impressive. I was like "damn I gotta get this game, looks realistic"
Well, I got some great news:
www.nvidia.com/en-gb/geforce/community/demos/
Amazing watching how these graphics and computers came to be at that point in time, when just 16 years earlier the show had revolutionary blips and blops as groundbreaking graphical capabilities.
IIRC, blips, blops, and hardware blitters (how the Amiga did "bit-block transfer")
Amazing how this GeForce 3 had 57 million transistors and now an RTX 4090, the latest card, has 76 billion transistors
"If it's not realistic, it's a waste of time" - Bold statement for 2001 lol.
Still is a bold statement, I would say. But I get what you are saying. On the other hand, though, I remember playing games from that era and thinking "wow, this texture looks realistic" and things like that. It's all about perception.
I still have my GeForce 3 Ti500, hooked up to the same Epox 8kta3 + pro 😊. Had to replace the capacitors on the Epox, they were puffy.
I love how he cuts everyone off before they finish every 2nd sentence
I hate the host
20 years later this tech is all still pretty amazing. Black and White gameplay was great to see too I loved that game.
I remember getting a voodoo, I was blown away by its capabilities.
I still have the original Quake 2 for the pc.
Will Smith has changed _quite_ a bit.
I thought you were making a bad joke about the actor Will Smith, then halfway through the segment I recognized his voice and realized he was Will from Tested.
Dylan Craig I was making a bad joke about the actor.
He went reverse Michael Jackson, basically.
The reason why this was mindblowing to us back then is because we made do with 2D sprites made to look 3D. Jumping from 2D to 3D, all while using consumer-grade components, was a big deal. It was at that point certain standards were made for game development, making everyone leap into the golden age of graphical fidelity.
Love all these old cards! I was right into making music so Creative Sound Blaster cards were my thing and making beats in Fruityloops. I still have a few running in my old school beige towers I have in the back games room... 🤘🏼
Fruityloops, my man what glorious days 😁
@@kristofferkling9567 It sure was mate! 👍
In those years I spent good money building my gamer pc. I still think it was a cool thing to do. Seeing your own creations.
PCMR has always been and will always be
Jesus, this guy clung on to his last wisps of hair for literal decades.
What's funnier is that we haven't come as far in the last 12 years as in the 12 years prior to this show, but instead had to resort to stuffing more CPUs into the system.
The problem is, we've about hit the limit of how many transistors we can etch on a silicon die with current process technology, so we're instead getting small efficiency gains each generation and cramming more cores in. Of course effectively using multiple cores is still a tough nut to crack, so most of those added cores aren't really being put to use in your average applications.
Back in 2001 I was into Macs. My coworker was a hard core pc guy. He was raving about gaming on pc and laughed at me because it was impossible on a Mac. Long story short 2 years later I got into PCs and never looked back.
Back in the day when people could actually buy the latest video card from NVIDIA.
The GeForce 3 was $499. Adjusted for inflation, that is $850 today. Not that much cheaper.
@@fraizie6815 That is not how computer pricing works. Back in 1979, a high-end system cost $2000. Same in 2005. Same in 2023. The reason is that the cost of production decreases. The same is true for hard drives. In 2005 it was $109 for a high-capacity (250GB) drive. Today, that high-capacity drive is the same price but the size is now 6TB. The price point stays the same.
@fraizie6815 They were always proud of their product. I remember I had a sh*tty Windows Vista machine with an Nvidia card that cost around $1,000. It could barely even play Minecraft lol
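The inflation figure quoted above is easy to sanity-check. A rough sketch, assuming a cumulative US CPI multiplier of about 1.7 between 2001 and the early 2020s (an approximation, not an official figure):

```python
# Rough inflation sanity check for the GeForce 3 launch price quoted above.
# The 1.7 cumulative CPI multiplier (2001 -> early 2020s) is an assumption.

GEFORCE3_LAUNCH_PRICE = 499  # USD, 2001
INFLATION_FACTOR = 1.7       # assumed cumulative CPI multiplier

adjusted = GEFORCE3_LAUNCH_PRICE * INFLATION_FACTOR
print(round(adjusted))  # 848 -- in line with the "$850 today" estimate
```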
I'm seeing a lot of people saying technology progression has slowed down a lot. That's because we are at a wall with electron-based technology. We have processor architecture that's down to 7nm. To put that in perspective, the shortest wavelengths of visible light are around 380-400nm... we are at a wall and we are going to have to get creative to get past it.
Wow, Gordon on Computer Chronicles :O
I'm surprised nobody else in this section is talking about that. This makes him even more legendary!
I lived for these types of shows back in the day.
"This looks like you're looking at a movie" The exact words uttered by many in 2020 after the Unreal Engine demo
Yup. We just keep making that same mistake over and over again.
🤣 it was so funny to hear that
@@stevensavoie856 tbh, my parents, who are in their 50s, can't even tell the differences, not to mention my grandma.
So many subtle details that we are familiar with when we look at CGI are invisible to older people.
The sound on some of these is really soft. It's surprising given the visual quality.
2:19 "57 million transistor processor" ...
22 years later the RTX 4090 has 76 BILLION transistors in it :D
Yeah, but the GeForce 3 did not demand a new powerful PSU, and cost like $250.
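Out of curiosity, the two transistor counts quoted in this thread imply a doubling time you can work out directly. A quick sketch using the figures as quoted (57 million in 2001, 76 billion roughly 22 years later), so treat the result as a ballpark:

```python
import math

# Implied doubling time from the two transistor counts quoted above.
# Figures as quoted in the thread, so treat the result as a ballpark.

gf3, rtx4090 = 57e6, 76e9   # GeForce 3 (2001) vs RTX 4090
years = 22                  # "22 years later", per the comment above

growth = rtx4090 / gf3              # ~1333x overall
doublings = math.log2(growth)       # ~10.4 doublings
doubling_time = years / doublings   # ~2.1 years per doubling

print(round(growth), round(doubling_time, 1))  # 1333 2.1
```

About 2.1 years per doubling, i.e. GPU transistor counts stayed close to the classic Moore's-law pace even as CPU clock speeds stalled.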
It's amazing because those graphics became available to the home user/gamer in 2001, while some years prior, e.g. in 1993, you had to spend $500,000 on a special computer to generate those graphics!
This guy is the Bob Vila of computers, Such a Pro
Grunt grunt grunt 😂
I remember 2003, when I saw the tech demo of STALKER: Oblivion Lost and the graphics blew me away: dynamic lighting and shadows, reflective surfaces, realistic bullet physics, physics in general, ragdolls, vehicle physics, complex foliage, high-resolution textures. The early days of 3D gaming really were a great experience when you had Far Cry, Doom 3, and FEAR, not to mention Half-Life 2.
Really huge nostalgia for these days, especially with a CRT monitor: that sweet nonexistent motion blur that plagues LED screens even today.
We have come really far in terms of graphics; now we've got ray tracing and 4K 60+fps in games, and this video was from 19 years ago!
@@mymusicplaylists5163
Resolution does not really do much, it makes objects a bit sharper in the background and that is about it LOL
Ray tracing is cool though but STALKER also had ray tracing in 2007
Look at those frames, not even hitting 30
One of my favorite games of all time was released around this time - the original Deus Ex!
I don't play it anymore, because the multiplayer has understandably long since died out, but I go back to it every few years to re-play the single player campaign. I'll probably still be playing it 50 years from now. Unless I'm dead.
Me too, I played the multiplayer as well. Underrated!
Hard to believe this was 19 years ago. The video looks like something almost out of the 80's or early 90's. If not for the tech shown(Geforce 3) it would be fair for people to think this video is MUCH older.
Yep, looks 90s but it's 2001 lol
@@Neodestro Not surprising, since the show started in 1983. The format never really changed much, so by 2001 it was getting pretty dated, and in fact was on the way out - it ended in 2002.
American TV always looks a little strange, even today. You can always tell a show that's filmed in the U.S. I don't know what it is, it just never looks as 'real' as TV from other countries.
@mike h Whether you think it's idiotic or not, it's true. American TV looks different.
@Patrick Stick That escalated quickly!
That comb over is mind blowing 😮
I remember when it was a 300MHz CPU headlining the PC magazine.
I remember seeing a PC magazine in a supermarket that had a full-page ad that said 12MHz.
I remember reading about the GeForce 3 in a PC Format as a child, and being utterly mind blown
Well, to be in this TV show... you better explain your stuff fast and clear :P
My one gripe with this generation of the show: earlier in the show's history they would just cover 15% less, let stuff breathe, and not rush every guest.
AND DAMN IT HURRY UP!! xD
@@thomasg86 TECHNOLOGY ADVANCING TOO FAST, NO TIME TO COVER EVERYTHING PROPERLY
@@thomasg86 No doubt! Susan Kare these guys are not. This is where asmr goes to die… Informative, but crazy frenetic.
The technology being presented was obsolete by the end of the presentation 😂
My all time favorite PC game was released that year. Black & White.
I remember that Nvidia demo :). I think I still have my Thermaltake Orbs somewhere too. Around this time I had an Antec SX830 case, an Asus board, and I think a Duron overclocked with the pencil trick, hehe. I had an FOP38 in that system, which eventually snapped off, and that was the end of my Duron. So many good gaming memories back then. QII all day :).
Absolutely beautiful comb-over
I still have my GeForce 2, 3, and 4.
Thanks to this video, I just downloaded and ran 3DMark 2001 on my kinda dated gaming PC. It's kinda fun.