I think the problem with the current GPU situation is that AMD and Nvidia themselves control the VRAM configuration of every AIB card, so compared to the 90s we're pretty much doomed: if we want more VRAM, we can't get the same GPU with more VRAM for a small price difference 😒😒
I wish they'd just add slots so you could add more VRAM yourself, given that this almost always seems to be the limitation. It's always been about profit. No VRAM upgrades because they wanted to sell you a 2nd GPU instead (which you could barely benefit from). Then SLI was abandoned because they realized they could charge you $1500 for a single card and everyone is a winner (except you). Greed was always the limitation, not technology.
I like these types of weird cards. Have you guys already covered, or are you considering for a future video, the dual-die GPUs like the GTX 560 Ti x2 (2 dies on the same card), 580 x2, and 760 x2?
That simulator looked a lot like what we used in the Bradley simulators at Ft. Benning. They had full hulls, and turrets built with computer screens in the scopes. It was fun to play since it simulated jammed weapons, misfires, and even wreckage if we flipped. Well, on screen it did lol. You did have simulator rounds you needed to load in order to reload the 25mm. I forget the name of the place though. I think it was CCT, or CCTT? We always had fun going there. They had simulators for everything. Humvees, Bradleys, Abrams, M-16, AT-4. Cool place.
Modern AC simulators use a seven channel/seven projector setup with a master server. Each channel processor (server) has one GPU, and normally two DisplayPort outs to one projector using dual 1080p60. The software running on the master server is the special sauce that combines all seven (portrait-mode 1080p) images into the image seen in the cockpit. Of course there's also special hardware like motorized edge blend plates, spectrometers to sample each projector's output so corrections can be made to make them all as accurate as possible, etc.
Same: it makes me chuckle every time a tech-tuber comments about how modern GPUs & CPUs are energy hungry. All that has really happened is the market went from high power draw for high performance, to maintaining performance while increasing power efficiency, and now they are back to pushing high power into those high-efficiency designs.
Near the 12-minute mark Linus explains how post-3dfx SLI/Crossfire renders only parts of the screen per GPU. This technique was adopted by Nvidia by splitting the screen vertically to create areas for the two individual GPUs to render. ATi, however, used Crossfire to assign areas from a checkered pattern so as to create a better weighted average of work per GPU. It's too bad that Crossfire was less well supported in games than SLI, usually giving SLI'd flagship cards from Nvidia a slight edge over Crossfire pairs from ATi. But those were the golden years of GPU consumption, with Nvidia and ATi constantly trading the performance crown back and forth (2004-2014). Times were amazing and I had the pleasure of enjoying a triple Crossfire setup using HD5870's and a BIOS-flashed HD5850. It's so unfortunate Nvidia has pulled away from serving its gamer clientele, in that we're now paying 2000 USD for single GPU cards which barely surpass their previous-gen counterparts from 3 years earlier.
I think Apple introduced multi-GPU in a chip quite some time ago, when the first Retina iPad was released. Always wondered how they did it so seamlessly. This allowed them to get 120fps too.
@@austinverlinden2236 While this card has 2 GPUs on it, these 2 GPUs show up as separate GPUs to the OS; there is some `magic` in place to attempt to use them as a single GPU, but for the most part they are 2 separate GPUs on one PCB with a fancy PCIe switch rather than giving each 8 lanes.
Just looked it up, seems like it's been a 'multi-core' GPU in the iPad Pro since 2015, so perhaps that's a little different, not sure - ofc not the same as desktop GPUs, but would be interesting to know how they split the work between the cores. A res of 2732x2048 at 120Hz on a mobile device just always impressed me & felt like it went underrated.
@@peterthefoxx So Apple uses 8-core clusters within the GPU (every 8 cores acts as a single GPU from a dispatch perspective). The TBDR pipeline they use makes dispatching work across these much simpler and easier, as it directs devs to group operations that are per-pixel within the pipeline, reducing memory contention between tiles a LOT.
You've reminded me of a video from 15 years ago of an Australian declaring Crysis 'Whooped' by running it on Quad 9800GX2 cards at an average of 50 FPS. Truly a golden era.
I mean, AMD's iGPUs from the new 7000 APUs could be used to just put several onto one PCB again. Cooling should also be quite possible. And chiplet design for graphics is a thing now, right? Just make something with more and more chiplets and distribute them across the PCB instead of clumping them together.
It's such a shame Nvidia decided to kill SLI right as VR was becoming a thing. VR SLI was absolutely amazing - since both views show almost exactly the same area, just from a slightly different point, you could send the exact same draw calls to both cards and then just change the camera position by ~8cm on one. Unlike traditional SLI, there was no microstuttering and almost *no overhead.* You added a second GPU and got like a 97% fps boost. VR and 2 way SLI were a match made in heaven and Nvidia killed it.
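(A rough illustration of why that pairing worked so well, as a hedged numpy sketch rather than Nvidia's actual VR SLI API: the two eyes differ only by a small lateral offset, so the same command stream can be replayed on each GPU with just the view matrix swapped. The look_at helper and the 6.4 cm IPD value are illustrative assumptions.)

```python
import numpy as np

def look_at(eye, target, up):
    """Build a right-handed view matrix from an eye position (illustrative helper)."""
    f = target - eye; f = f / np.linalg.norm(f)        # forward
    r = np.cross(f, up); r = r / np.linalg.norm(r)     # right
    u = np.cross(r, f)                                  # true up
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = r, u, -f
    m[:3, 3] = -m[:3, :3] @ eye
    return m

def stereo_views(center_eye, target, up, ipd=0.064):
    """Left/right view matrices: identical scene, camera shifted +/- half the IPD (~6.4 cm)."""
    f = target - center_eye; f = f / np.linalg.norm(f)
    r = np.cross(f, up); r = r / np.linalg.norm(r)
    offset = r * (ipd / 2.0)
    # In a VR-SLI-style setup, each GPU would receive the same draw calls,
    # just with its own view matrix substituted in.
    return (look_at(center_eye - offset, target - offset, up),
            look_at(center_eye + offset, target + offset, up))

left_view, right_view = stereo_views(np.array([0.0, 1.7, 0.0]),
                                     np.array([0.0, 1.7, -1.0]),
                                     np.array([0.0, 1.0, 0.0]))
```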
Hey Linus! Please read this! The gamma, brightness and contrast looks all messed up on your Sony GDM FW900 and there might be a fix for it.. It's a common thing with these old Sony trinitron tubes but luckily there is a setting in the menu called 'Image Restore'. This setting doesn't work until the monitor has been on & stabilized for a while. It fixed my old Sony trinitron monitor and it might do wonders for yours as well. Hope you will try this🤞
Peak PC gaming for me was late 2010-2012 ish when top tier GPUs were actually reasonably priced, and multi GPU setup was relatively well supported. Tri-Fire 7970s was my peak. Absolutely a blast to setup and play with. Sure, stuttering and scaling weren't great, but it was a ton of fun getting things to work. Oh, and CPUs actually had enough PCIe lanes for all of your PCIe devices.
Peak PC gaming for me was '92-'03. That's when innovative, amazing games were still coming out in droves and technology was improving as quickly as it ever has. Ultima Underworld was so ahead of its time it was like finding rabbit fossils in the Precambrian; there was nothing remotely like it before, you can't really trace its lineage, they just iterated behind closed doors and released this masterpiece. It was as unexpected and ahead of its time as the 1969 (before the moon landing!) XEROX presentation of the computer mouse, windowed personal computing and teleconferencing.

Then you had the tail end of point and click adventure games, which was as good as the genre ever got before dying. Then you had the explosion of the FPS, RTS and MMO genres. You had all manner of weird genre mixups like car combat and FPS/RTS online multiplayer games. You had the peak of management sim games like Dungeon Keeper 2, which has never been improved on since. Modding was as powerful and easy as it was ever going to be, and wonderful niche games were being invented by users left and right (S&I, Gloom, Natural Selection, Team Fortress, DOTA etc).

By 2005 or so consolitis really set in and games were really, really dumbed down and really, really homogenized; around the PS2 era. Every AAA game had to hold your clammy hand in theirs and never let you go, never let you get lost and never let you fail. It was the time of the gray shooter. Field of view shrank to accommodate console gamers. Framerates became locked to 30 or 60 FPS to accommodate consoles (60 being a multiple of the 30 FPS standard used on weak console hardware). Controls became bogged down and slow, gated behind extensive animations to hide the latency of 30 FPS; this lowered the skill ceiling so much and it still hasn't recovered. Then came the loss of the third dimension; it used to be the case with 4:3 monitors and a large field of view that you could see properly what was above and below you, and you could let players exist on many different elevations; maps could be built in levels and you could have fun mechanics like rocket jumping. With 16:9 and console FOV this was largely flattened into a big plane with no level-over-level areas. Then came microtransactions, fee-to-pay, pay-to-win, gacha and loot boxes and artificial time-wasting mechanics that you could pay to skip.

Only in the 2010's, with the wide-scale explosion of VR and indie developers, has PC gaming started to recover from the death of good AAA games. Still sour about the mid 00's, with Doom 3 being a terrible sloppy mess that had to allow only 3 monsters at a time to be able to run on consoles; Half-Life 2 being a poor shadow of the original, as if Valve didn't understand what made it good; Oblivion being incredibly dumbed down and blighted by extreme level scaling to the point where it was essentially unplayable without extensive modding (it was also a pioneer of the microtransaction mess). I mostly hibernated in mods and reveled in a few early indie games for the latter half of that decade. I must have played a couple of thousand hours of Natural Selection alone.
PowerVR worked better with multi-GPU configurations due to how it natively rendered the scene in 16x16 pixel tiles in on-die memory instead of by scanline. Plugging in multiple cards was supported even with the first-generation PowerVR card from NEC, and it didn't even require any sort of bridge between the cards for this, AND it scaled pretty linearly with the number of GPUs. You just had the GPUs render every other tile. Sega leveraged this with the Naomi 2 arcade board, which used two PowerVR 2 GPUs. It was basically a Dreamcast with a second GPU, a second SH4 CPU and a transformation and lighting coprocessor added, as well as more RAM.
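(To make the tile-splitting concrete, here is a small sketch of an assumed checkerboard assignment of 16x16 tiles to two GPUs; it is not PowerVR's or Sega's actual scheduling logic, just the general idea of "render every other tile".)

```python
import numpy as np

TILE = 16  # PowerVR-style tile size in pixels

def assign_tiles(width, height, num_gpus=2):
    """Return a (rows, cols) grid saying which GPU renders each 16x16 tile.
    Checkerboard-style assignment roughly balances the load with no inter-card bridge."""
    cols = (width + TILE - 1) // TILE
    rows = (height + TILE - 1) // TILE
    return np.fromfunction(lambda r, c: (r + c) % num_gpus, (rows, cols), dtype=int)

# 640x480 Dreamcast/Naomi-era framebuffer: GPU 0 and GPU 1 each get half the tiles.
grid = assign_tiles(640, 480)
print(np.bincount(grid.ravel()))  # e.g. [600 600]
```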
I love the guest. He's the mid-00s Scene Spirit Animal. Feel like we would have had a good time seeing Bring Me The Horizon, MCR, TBS, or whoever back in the day.
Dude, sick FW900! I've owned three in my life and I would sell a kidney to find one today. Alas, I sold my last one due to convergence issues and blurry text, and I'd imagine that any surviving units are suffering the same. Even barely-functional ones will pop up for over $1000 (usually double or more than that) which is mind-boggling considering I paid a whopping $150 for my first one back in 2010, which I thought was "too much" for a CRT at the time. I bought the second for $250, sold it for $350 about a year later which allowed me to buy a $350 unit in better condition. I kept that one for two "long" years before selling it for a whopping $550 in 2014, when I upgraded to 1440p which was massive at the time. They were absolute space heaters and sucked up an enormous amount of juice, both of which kept me from spending a crisp stack on a dying dinosaur. PLEASE make a video about your FW900 if you can, I would love to hear your thoughts and qualms on that beast!
It was extremely fun to bring my favorite hardware and showcase it with you guys! Even more fun to drop a load of boxes on Linus though :D
My rawr XD retro tech brother! Great job in the vid :D
@@Dredile
You gave me the urge to listen to MCR after this vid
that is how it gets started. first, you drop some boxes on Linus next you start to drop all kinds of things. I love old crazy hardware stuff.
Not allowing Linus to install it was probably for the best 😉
Getting rid of aliasing artifacts for simulators is actually a pretty big deal. I remember when I was going for an instrument rating in MS Flight Sim '04, one big tell that I was heading off-course was when the pixel edges of things started shimmering too quickly, which is something that someone piloting a real plane could absolutely not rely on at all.
It's crazy that in half a human lifetime, we went from nonexistent 3D rendering to real-time ray-traced rendering. Mind blowing
Don't worry, there are plenty of dumb/greedy aholes to pull us all back down. I am such a pessimist.
Yeah, what took me 4 days to render in LightWave for a 1-minute scene with the BttF DeLorean is now done in real time at 100 fps, and people still complain that it dropped to 80 fps for a few seconds. :)
All thanks to socialism and government regulation that's not private property rights-oriented, a.k.a. for the greater good. Democratic socialism for the win!
@@TheSiprianus ????? wtf do you mean, literally all of the big 3 GPU manufacturers are in a heavily capitalistic country
Half-truth, as games barely use RT at all. A fully ray-traced modern game isn't even close to possible.
When I was in the Air Force my job was basically working on/upgrading/fixing flight simulators and have worked with Quantum3D before. This video brought me back so thanks for that
Sometimes I think all great computer technology has been invented in the 1990s and just needed to be perfected in later decades.
Just a few computer innovations that first appeared in the 1990s and still exist in perfected versions that all build upon 1990s standards and technology:
- Wifi (802.11 in 1997, 802.11a and 802.11b in 1999)
- Digital Cellular Networks (2G in 1991, 3G in 1998)
- World Wide Web (implemented in 1990, published in 1991)
- Bluetooth (first introduced in 1998, not ratified until 2002)
- USB (USB1.0 in 1996, USB1.1 in 1998, USB2.0 in 1999)
- DVB (DVB-S and DVB-C in 1994, DVB-T in 1997)
- ISDB (1999)
- AES (1998)
- PGP (1991)
- MD5 (1991)
- SHA-1 (1995)
- OpenGL (1992)
- Direct3D (1996)
- JPEG (1991)
- PNG (1996)
- MPEG-1 Audio Layer II aka MP2 (1991)
- MPEG-1 Audio Layer III aka MP3 (1993, extended by MPEG-2 in 1995)
- MPEG-2 aka H.262 (1994)
- MPEG-4 (1998, often implemented in the early 2000s under brand names like "DivX", "XviD", "3ivx", etc)
- AVC aka H.264 (2003, but first implementations were already around in the late 1990s alongside AAC)
- AC-3 aka "Dolby Digital" (1991, "Star Trek VI" was the first movie with AC-3)
- DTS (1993, "Jurassic Park" was the first movie with DTS)
- AAC (1997)
- IPv6 (1995)
- UTF-8 (1992)
- UTF-16 (published in 1996, first used in the late 1990s, not ratified until 2000)
- x86 instruction set extensions (MMX in 1997, SSE in 1999)
- Linux (first Linux kernel was released by Linus Torvalds in 1991)
- P2P file sharing (1999, "Napster" was the first P2P file sharing application and network)
- MMORPGs (1991, "Neverwinter Nights" [not the 2002 game] was the first MMORPG)
- Video games console emulators (1993, "Pasofami" emulating the NES was the first video game console emulator)
- Consumer PC water cooling (second half of the 1990s, first commercial product in 1999)*
*first watercooled computer was the UNIVAC 1 in 1951
"You don't need to reinvent the wheel"
Makes me think of my grandfather's washing machine, a million years old but still running, when I've gone through 2 in 5 years, each sold to me as new and better than the last one. Planned obsolescence is a hell of a drug
A lot of tech was theorized in earlier periods but was not feasible due to hardware limitations (e.g. VR). We just live in a period now where many of those ideas can come to fruition, the latest trend being AI.
@@theguythatknows Everything used to be built better because it was built simpler, you can't have complex modern technology that the masses want without fragility, just the sad truth man.
Computer history in particular is filled with examples of this. So much theoretical work was already done through the 70s and 80s which didn't see any practical application because fabrication could not deliver the needed physical parts or some other part of the puzzle wasn't found yet.
That OSD at 4:25 was probably the biggest nostalgia flashback I've ever had! I had a 17" Sony Trinitron Monitor capable of 120Hz back then and I absolutely loved that thing! I was using it way into the TFT era, until it died one day. Being forced to play @60Hz on my first TFT was a huge setback. The input lag was unbearable and I couldn't hit anything in online FPS. Never thought I'd ever see that OSD again. Thank you so much for that!!!
I had a 19" Samsung monitor that weighted 23kg and I carried it to many LAN parties.
"The owner didn't trust me"
With Limus dropping expensive stuff everyday no one would trust him to handel their prized collection
did Destin let him handle the memory block and circuit board from the Saturn LVDC ?
What about his wife asking to hold the baby 😂
not the homie Limus, i don’t think he can handel this
Especially when the product is nigh impossible to find.
the owner knows
In the late 90s, I worked on an SGI Indigo that had a voodoo card with 4GPUs. It was the most impressive experience at the time. The lab upstairs set up simulations using these machines. Incredible technology for the time! It is what really pushed me to get into building gaming systems.
Love that story!!!
And later models were even more impressive, when coupled with realtime A/V hardware they allowed for interactive 3D graphics and compositing used in many live news and sports broadcasts.
Mate, in the 90's it felt like tomorrow could bring us affordable VR Headsets at any moment, but they also felt impossibly far away at the same time. We truly had no idea what was capable/possible back then, eh??
@@dylanherron3963 I used what was available, it did work (with caveats), was wildly expensive and bespoke, but very cool.
Since then we have been stuck in a quagmire of attempts by corporations to avoid standards just enough to prevent competition (from Nvidia to Facebook), which is the real thing killing practical VR/AR adoption. Consider that VRLink was essentially dropped before any devices had a chance to use it, with the first dedicated one being the PSVR2 just released.
@@orbatos You bring up fantastic points, and those which I'm concerned about as a consumer/adopter of open source tech. Tech development tied to licensing (and the blocking out of such, by corporate conglomerates) are the biggest blunder I can think of man
Ah! The days when video card manufacturers could just buy chips from suppliers and expect to be able to independently make a product with them, even if it was substantially better than the reference implementation.
I’ve never seen a man so trapped in time in my life. 2005 must have been an amazing year for this guy.
On so many levels.
Man climbed to the peak and decided to just stay there forever
Even the hair haha
I love it tbh. Bring it back.
@@njmcfreak No please. Don't. Those haircut were everywhere and were awful.
Man, 3DFX in arcade games was a wild era.
Hydro thunder!!!
NFL Blitz always had crowds around it!
Sega of Japan missed out on this--something Sega of America had seen as the way forward.
Yeah, the Voodoo series was the GPU to have back in the 2000s.
@@kenshinflyer In retrospect, it was a lot better for the Dreamcast to essentially be a cut down Naomi board because porting was a lot easier, and practically 1:1 to arcade.
@@kwizzeh: True, but, I mean, the Dreamcast ended up with a PowerVR GPU (that also means the NAOMI has a PowerVR GPU), which was behind Nvidia back then. Sega of America wanted the 3DFX Voodoo to be the Dreamcast's GPU, but Sega of Japan handed NEC, which was a PowerVR licensee, the contract to the Dreamcast's GPU. This whole Sega thing is also one of the reasons why 3DFX went bankrupt.
Phenomenal script and delivery. Props to Linus and the writing department for all bringing their A games and the history lesson. Thank you as well to the editing teams for the visual learners!
it reminded me a lot of the "classic" LTT videos from the days in between the shaky handheld videos and the high-end production quality we currently have
Yeah, this one is on a whole another level. I am amazed.
Multi-GPU is extremely powerful for computational physics. Workloads like computational fluid dynamics need more VRAM than one GPU can offer, and can pool the memory of multiple cards. It's super difficult to implement and maintain though - developers have to manually implement and optimize which data to transfer between GPUs at which point in time. Making it more difficult, InfinityFabric on AMD is broken (driver segfaults) and NVLink on Nvidia is proprietary to CUDA and not available to OpenCL. So your only option is to do communication over PCIe. Luckily, PCIe 4.0 is plenty fast and 5.0 is coming with double the bandwidth. This also enables cross-vendor multi-GPU, "SLI"-ing Radeon+Arc+GeForce.
14:49 Multi-die GPUs are super interesting. Apple succeeded with the M1 Ultra. AMD failed with the MI250(X), turning it into a dual-GPU-in-a-socket, but is trying again with the MI300. Intel's PVC has performance issues in single-GPU mode, so they also offer to configure it as a dual-GPU in software. Nvidia's A100 is essentially a multi-chip GPU but on a monolithic die, and its performance is outstanding.
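(For anyone curious what that manual cross-GPU communication looks like in practice, here is a minimal host-side mock - my own sketch, not the commenter's code. The field is split across two GPUs with one halo row each; in a real OpenCL/CUDA implementation the two assignments in exchange_halos would be explicit device-to-host-to-device copies over PCIe, scheduled by hand every timestep.)

```python
import numpy as np

# Mock of the multi-GPU domain decomposition described above: the field is split
# across two "GPUs" (here just two host arrays), each with one extra halo row.
ny, nx = 8, 8
top    = np.zeros((ny // 2 + 1, nx), dtype=np.float32)  # rows 0..3 interior, last row = halo
bottom = np.zeros((ny // 2 + 1, nx), dtype=np.float32)  # row 0 = halo, rows 1..4 interior

def step(field):
    """Stand-in for the per-GPU solver kernel (e.g. one Jacobi sweep)."""
    field[1:-1, 1:-1] = 0.25 * (field[:-2, 1:-1] + field[2:, 1:-1] +
                                field[1:-1, :-2] + field[1:-1, 2:])

def exchange_halos(top, bottom):
    """The PCIe traffic the developer has to schedule by hand every timestep."""
    bottom[0, :] = top[-2, :]    # last interior row of GPU 0 -> halo row of GPU 1
    top[-1, :]   = bottom[1, :]  # first interior row of GPU 1 -> halo row of GPU 0

top[0, :] = 1.0                  # boundary condition on the top edge
for _ in range(100):
    step(top); step(bottom)      # would run concurrently, one kernel per GPU
    exchange_halos(top, bottom)
```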
Cool comment! Thanks for sharing :)
I like the idea of a geforce card but then representing arc on the side. Almost a nod to the multiple cards of the past.
Man, OpenCL really needs some love, doesn't it?
I concur
I remember my first 3DFX card, the Pure3D with 6 MB of RAM, which was $700 back in 1998. A beast of a card, with 4 MB used for texture mapping AND 2 MB for frame buffering.
Those coloured lights in Quake 2 gave me an unfair advantage according to my friends.
The difference with 3DFX SLI and new "SLI" is that 3DFX SLI didn't have the overhead of predicting frame areas/timings. The driver worked in a set way due to the render methods used at the time, and all that had to be done was send the pre-determined render areas (scan lines) to the display output. This is also why the Quantum cards can have AA and non-AA output active at the same time but not pool their work into a single high-FPS image.
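(Scan-Line Interleave really was that literal. Below is a toy sketch of the even/odd split, with numpy arrays standing in for the two Voodoo2 framebuffers - an illustration of the idea, not the actual driver behaviour.)

```python
import numpy as np

def render_half(scene, height, width, parity):
    """Stand-in for one Voodoo2: it only ever rasterizes its own set of scanlines."""
    half = np.zeros((height // 2, width, 3), dtype=np.uint8)
    for i, y in enumerate(range(parity, height, 2)):
        half[i] = scene(y)           # rasterize scanline y
    return half

def combine_sli(even_half, odd_half):
    """The merge at scan-out: interleave the two cards' lines, no load balancing needed."""
    height = even_half.shape[0] * 2
    frame = np.empty((height, *even_half.shape[1:]), dtype=even_half.dtype)
    frame[0::2] = even_half
    frame[1::2] = odd_half
    return frame

scene = lambda y: np.full((640, 3), y % 256, dtype=np.uint8)   # dummy "renderer"
frame = combine_sli(render_half(scene, 480, 640, 0), render_half(scene, 480, 640, 1))
print(frame.shape)  # (480, 640, 3)
```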
This video was very technical. Whoever wrote it did an amazing job at making it digestible and easy to understand. I enjoy these deep dive videos!
Anthony is an excellent writer.
@@epoch888Anthony is an excellent everything!
@@LG1ikLx lol no
@@epoch888 hes offensive on the eyes though
@@najeebshah. rude
Would be great to have multi-GPU that doesn't require special support from games etc., but is instead presented to the software logically as one monolithic GPU
I was a flight simulator technician at FlightSafety International in the 90's. When the Riva 128 came out, I built a simple visual system based on the 'Boids' Direct3D SDK sample from Microsoft. FSI had recently purchased VITAL from McDonnell Douglas, which became FSI Visual Simulation Systems (VSS). When I demoed my efforts to the folks at VSS, I was told PCs will never be able to compete with dedicated hardware and minicomputers with custom 3D rackmount image processors would be the future. I mentioned that there was more money going into consumer 3D card development than VSS could ever hope to spend. It's all PCs and mainstream graphics now. No one can compete with millions of kids (of all ages) wanting more FPS.
Nice video.
It feels weird that SLI has mostly gone by the wayside. I probably could never afford it, but it was interesting to see those kind of set-ups made by others.
I had dual 980 Ti's because at the time, it was cheaper than buying a 1080 Ti. And, if I remember correctly, the 2x 980 Ti's outperformed the 1080 Ti by a hair (where supported). For that reason alone, I wish SLI / Crossfire was still supported.
Agreed. IIRC, my buddy had two 8800GT SLI'd together. Always thought it was pretty neat, despite not being perfectly effective/efficient. I actually always wanted an SLI setup, but never had the chance to do one before they disappeared.
SLI was proprietary, so Nvidia was both its beginning and its end
@@Corei14 what's sad is that with DX12/Vulkan, devs can implement multi-GPU without the bridges, and vendor-independent too (AMD/Intel/Nvidia), but where are the titles that take advantage of it?
@@spencerscott4878 Rocked a dual 980 setup till I switched to a 3060ti. The OG system was a 4790k @ 4.6 with 32GB 1600 RAM. What a darn beast that system was back then when SLI was still around and worked well enough.
I ran 2 Voodoo 2's on my first PC back in 1999 (I was 19). I got a job after finishing school and felt so rich back then. Good memories. Had an early AMD 800MHz CPU (Athlon was it called, or something?). And shocked my friends with my 512MB of RAM! (most people had like 128 back then) What a beast.
Anti-aliasing hardware is something I would love to see again. It's a very old problem that is still solved by "brute force" today, using the same compute resources on the very expensive chip that is supposed to render shaders and the rest of the game.
Offloading it would let the most recent chips be optimized to the maximum by delegating this task to another, cheaper chip (or another dedicated section of the GPU).
FXAA or MSAA isn't like the old 2x, 4x (or whatever) you could do on old cards on any game
anyway. It's not always easy or straightforward to get the old SS 2x, 4x etc. antialiasing on games that are old enough that a modern GPU would have the power to render them the old-fashioned way, which is really the best way as far as image quality is concerned. The newer methods are cheats, more or less, to get around having to render 4x-16x the pixels; you can use AI filter models etc. too, but none of them are as good as just actually doing it. So the modern methods aren't really brute force as such, but on the other hand you don't always get to use the brute force method now even if you wanted to.
Looking at what Apple did, and the way AMD went ahead and made chiplet-based GPUs a thing, I wouldn't be surprised if with RDNA 4 we'd see a proper, shaders-and-all, chiplet-based multi-GPU package.
So uh, that's... what "anti-aliasing hardware" is. It is brute force. In this case, the voodoo brick is running the game at 1024x768 at the frame rate of a single SLI Voodoo2 setup, and the additional cards are literally just brute-forcing additional scenes to combine for AA. That's what "anti-aliasing hardware" is, in this context. It's a hefty brute-force approach.
3dfx did have some shenanigans for AA back then though, like using a rotated grid instead of a normal grid for sampling, which almost always gets superior results.
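(That brute-force approach boils down to something like the sketch below: render at a multiple of the display resolution, then box-filter the samples back down. This uses an ordered grid for simplicity; as noted above, 3dfx used a rotated sample grid, which this doesn't reproduce.)

```python
import numpy as np

def supersample_aa(render, width, height, factor=2):
    """Brute-force SSAA: render the scene at factor x the resolution (the expensive part,
    done by the extra hardware), then box-filter the samples down to display resolution."""
    big = render(width * factor, height * factor).astype(np.float32)
    big = big.reshape(height, factor, width, factor, -1)
    return big.mean(axis=(1, 3)).astype(np.uint8)   # average each factor x factor block

# dummy renderer: a hard-edged diagonal that would shimmer without AA
def render(w, h):
    y, x = np.mgrid[0:h, 0:w]
    return np.where((x > y)[..., None], 255, 0).astype(np.uint8) * np.ones((1, 1, 3), np.uint8)

frame = supersample_aa(render, 640, 480, factor=2)
print(frame.shape)  # (480, 640, 3)
```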
@@lasskinn474 The term "brute force" is misused on my part (even in French...). I agree with what you say.
Indeed, the mathematically pure approach is prettier but becomes absurd beyond x8, because technically, calculating the "collision area" of a pixel theoretically becomes "faster". But no card does that, and the various engines aren't developed to do that. It's a change of approach that I think needs to happen, with chips that do just that. They could be used for all types of rasterization applications, even in medical fields, research, etc... wherever pixel color measurement is important.
There must also be a direct relationship between the center and area of a pixel, and therefore its alpha channel, and the color of the virtual "edge" of an object, but I'm not a mathematician.
I'm very interested in the topic, because I've already started developing a pixel-art 2D game, pixel perfect without any need for an AA buffer. But I don't have the money, and therefore the time, to continue.
@@xFluing The change of standard is very complex. Because it makes games incompatible and therefore more difficult to develop. Separating processes from GPUs is a good idea and I think it's required, especially so that heat dissipation doesn't become absurd to control. But this requires writing new engines that separate these processes. Engines that would become incompatible with current GPUs. This is why we are stuck in old calculation principles. AI will fill in the gaps to solve what we can't manage to mathematically simplify ourselves and maintain economic competitiveness. I'm afraid we'll lose a lot of efficiency.
Please keep this type of content coming, what an amazing presentation of bleeding edge tech from 2 decades ago!
These rare old tech videos are my fav. I love seeing where the computer world came from. I wasn't upgrading my prebuilt purple compaq when these were out.
when you were talking about CRTs having "round pixels", you're using a Sony Trinitron, which uses an aperture grille. They are more like vertical lines on a Trinitron
I love seeing us go backwards in time and reinvent old ideas in tech.
I love these videos on older high-end hardware, back in the day it was hard to get any coverage on these things, maybe in magazines, but that was it. Seeing it tested in video using modern reviewing standards is amazing, especially this kind of hardware which was not even accessible for a regular enthusiast when it was new.
Ross' hair perfectly matches the era of hardware on display
I know. I'm surprised they showed a link to his Instagram and not his MySpace page, lol.
Awesome seeing one in action. At the computer shop I worked at, we sold 6 of those to one client. It was a special order and cost an arm and a leg. They were not used for gaming though; they were used for video rendering for some space rovers. The main rendering was done on a large Linux cluster with 100+ servers (a lot of building on my side lol), but they discovered that for that time period the workstations had issues loading the renderings and rotating them, etc. I never saw the video cards in action (just the cluster servers, since someone popped incorrect RAM into one server out of the 100, so I was hunting that mess down).
I know Linus has said in the past that the music intro is bad for retention and they've been considering phasing it out, and I get it, no one wants a dip in watch time in the first minute of a video, but if Supernova is gone for good I'm going to be heartbroken
Flight Simulator Technician here. For our OTW (Out the Window) PCs we use GTX 1080s in SLI and RTX 3080s, depending on generation. But I've had a chance to work on the old stuff too... where one giant card the size of a large pizza box was in charge of each primary color (RGB). The sim space is pretty amazing.
The nice thing about SLI and Crossfire from back in the day, you could buy 2 cheaper Video cards on sale and hook them together and it could equal the performance of a high end GPU at a fraction of the cost.
And that's the real reason that it died
I don't think Crossfire ever worked as well as 3dfx's SLI implementation, because those cards were just designed to work that way: each chip could still run without the other (if that makes sense) and just spit out every other line. They were so modular! Maybe I'm misremembering it, but I swear I've seen an SLI setup running with a defective or disabled board, or maybe it was spoofed... could be crazy...
@8:14 you can see they list forward observer simulators. I trained on one in 2003, it was absolutely incredible and identical to real locations, so we could use the same measurements and compass readings in the classroom or out in the training site.
dude never left the mid 2000s. I respect that
Using a secondary GPU for AA sounds like a great idea and it was way back then. Remember PhysX? Users with older cards could re-use their previous gen cards, though unlikely since buying new cards makes more money. Great watch, thanks!
Remembering this just made me wish we'd have dedicated Raytracing cards today!
Since Raytracing capabilities make much bigger leaps from generation to generation than traditional (rasterization) rendering power
@@LRM12o8 Honestly yeah, bring back separate encoding, RT and AI accelerators
I think we'll see tiled GPUs soon-ish, because it makes producing various SKUs and allocating them way easier (e.g. if there are too many high-end GPUs sitting unsold, they could allocate the new chiplets to lower-end GPUs), and smaller chips have better yields, so especially going into sub-1nm nodes I can see that becoming the best option for high-performance GPUs.
Though they would probably behave more like GPCs (Graphics Processing Clusters - a group of ROPs, TMUs and SMs, which contain a bunch of CUDA cores) than individual GPUs.
The voodoo 2 was my first real GPU upgrade in my compaq PC when I was a kid - amazing how great that card was!
Linus never runs out of video ideas
he pays people to think
@@HDReMasterwrite*
@@HDReMaster interesting 🤔.
@@JupiterGuy1 And what do they need to do before they start writing...?
Cause technology is always improving
They (Quantum3D) were really big on simulators back in the 90s. My dad got to try out their simulator system that was over $1m at the time - he was working in the government agency in charge of urban planning and they brought the entire setup down to try to sell the simulator setup.
Full "walkthrough" setup with multiple large CRTs (Trinitron/ Diamontron era) and racks of the systems with a 'threadmill' (so you can simulate actually walking through) and some controls for movement (lateral and tilt/ pitch). The deal never went through but it was quite an impressive setup involving a lot of different cards and the 'SLI' setups with display cables loopthrough (for overlaying).
There were many different graphics connection standards at the time and they also tried to support most all of them - VGA, BNC, SUN 13W3, Apple etc. Pretty insane stuff.
I recall when Captain's Workspace managed to run Minecraft Beta 1.7.3 (it cannot run newer versions due to persistent texture errors) with:
-Tyan Thunder 2500
-8GB Reg ECC RAM
-2x 1GHz Pentium III
-Quantum3D AAlchemy Module (AA Module)
-4x Modded Voodoo5 5500 PCI
-4x Quantum Atlas 10K II 18.4GB SCSI drives (in RAID 10)
-Windows 2000 Advanced Server
Man, it is amazing that it managed to make Minecraft look much softer than it does by default!
Captain is my boy Kasper!!! Been to LAN parties with him :D
The editors were merciful and never edited the Linus behind that bush that kept creeping me out.
I cant tell if it was a missed opportunity or a blessing.
This retro stuff is great, please make more content like this. Would be fun to see how sound developed too in pc gaming and content creation
There's better channels aimed at that audience out there than LTT.
@@TUUK2006 I'm aware, but did you take into consideration Linus' face and how excited he was doing this stuff?
4:56 Nice job editors. This is the first example I've noticed where you can hear an echo of Linus's voice coming from Anthony's microphone. I'm assuming this is because they were talking over each other. This probably happens in every single video on this channel, but this is the first time I'm noticing it.
Ross looks exactly like the kind of dude who would own a simulator box like that, and I mean that as a compliment
He also didn't change his style since that GPU launch 🤓
Thanks haha :D
he looks like he could be the frontman of an early 2000s screamo band
Wow, I was happy with my 3DFX twin voodoo 2 setup that I still have in a box somewhere. This looks insane!
Sony Trinitron? CRT.. dang! Quakeworld ran over the internet, on my twin voodoo 2's, smoother than almost any game I have ever played up until today. The network coding was insane. Luved all the vids that explored this. John Carmack and his team were genius programmers and the workarounds they used are legend.
Yep, been there, done that. It also had to do with Carmack and id putting as much assembler code as they could into the game engine rather than relying on C. They did some tricks in the code that were firsts, or at least in the top tier of what developers could do at the time.
I also enjoyed my dual-GPU Nvidia 7950 GX2, which was effectively two cards with SLI built in. :}
Can't wait for DirectStorage to become more of a thing. I think that opens the door for using the PCIe slot for intercommunication.
Nice that you brought him onto your channel with his amazing nostalgic hardware - it's absolute gold!
Brings me back to the days of Amiga and 286. Beautiful setup. 👍
I had great hopes for Lucid technologies until Nvidia bought them out and then killed the technology. The idea of mixing graphics card technologies sounded great once they got it right. I have an RX 5700 XT and a GTX 1080 as backup in case my 5700 dies. I would love to see how they would work together with Lucid if they had kept perfecting it.
I did do SLI a few times though. I had two identical BFG GPUs that worked perfectly individually, but in SLI they were artifacting all over the place. I installed a different brand's BIOS for the same GPU with the same clock speeds on both cards, and SLI worked perfectly. When I contacted BFG they were in total denial that it was a BIOS problem and refused to look into it. I wasn't alone either - lots of people had the same problem. Maybe that's why BFG went out of business 😂
A Direct3D 12 solution might be a good one. Crossfire/SLI might not be used often, but one easy case could be if you have a GPU integrated into the CPU and then a discrete one. You might be able to offload parts of the work onto the integrated GPU.
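Not claiming this is how any shipping engine does it, but here is a minimal sketch of the adapter-discovery half of that idea using DXGI/D3D12 (assumes Windows with the D3D12 SDK; the hard parts - actually splitting work and sharing resources across adapters - are left out):
```cpp
// Minimal sketch: find every hardware adapter (iGPU and dGPU alike) and
// create a D3D12 device on each, so work could later be split between them.
// Assumes Windows + the D3D12/DXGI headers; link d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllGpus()
{
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // e.g. iGPU for post-processing, dGPU for the main frame
    }
    return devices;
}
```
Enumerating the devices is the easy step; the cross-adapter copy queues and shared heaps that would actually move work to the iGPU are a separate (and much bigger) job.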
I thought I'll never see a grown up scene kid. Props to you for keeping the style alive!
Thanks bro haha
I had an Obsidian X-24 card but sold it a few years ago. Interesting card with that Medusa cable. Image quality wasn't that great, but SLI with only one PCI slot! Wow!
The card’s owner is a time traveller, that’s how he gets mint hardware and a matching haircut !
Thanks LOL :D
What makes that look so smooth isn't just the graphics card but the Trinitron monitor he's running it on. CRTs didn't require a native resolution, so they could run at a lower resolution without jaggies. I had a 19-inch Sony Trinitron monitor in 2000, and in my opinion LCD/LED monitors have only caught up to them in overall quality in the last few years.
Also, the "original" GeForce DDR I had was a powerhouse at the time! Definitely a legendary card in its day!
I think monetarily it makes sense if they can perfect performance scaling. NVDA would love to sell four 8090s to a single consumer who wants to quadruple their performance. It would also solve the GPU manufacturers' problem of GPUs that don't sell well, since all they'd have to do is shrink-wrap 4 GPUs together, shove them into a box and mail them to the consumer.
Yeah but then you'll have to upgrade your psu with a power plant
@@InhalingWeasel "PSU manufacturers love this one trick."
I loved my 260 and 460 SLI configs, I could use them for quite some time before I needed to upgrade.
Let's see how future cards will be made.
I think the problem with the current GPU situation is that AMD and Nvidia themselves control the VRAM of every AIB, so compared to the 90s we're pretty much doomed: if we want more VRAM, we can't get the same GPU with more VRAM for a small price difference 😒😒
I wish they'd just add slots so you could install more VRAM yourself, given that it almost always seems to be the limitation.
It's always been about profit. No vram upgrades because they wanted to sell you a 2nd GPU instead (which you could barely benefit from).
Then SLI was abandoned because they realized they could charge you $1500 for a single card and everyone is a winner (except you).
Greed was always the limitation. Not technology.
I like these types of weird cards. Have you guys already covered, or are you considering covering, the GTX 560 Ti x2 (two dies on the same card), 580 x2, and 760 x2 GPUs in a future video?
Are those like the gtx 690?
@@jesselioce similar in spec. But I'm curious how all the different variants compare in games.
the guys who were killing me all the time in quake must've been running these
Still can't afford it
Same here
That simulator looked a lot like what we used in the Bradley simulators at Ft. Benning. They had full hulls, and turrets built with computer screens in the scopes. It was fun to play since it simulated jammed weapons, misfires, and even wreckage if we flipped. Well, on screen it did lol. You did have simulator rounds you needed to load in order to reload the 25mm. I forget the name of the place though. I think it was CCT, or CCTT? We always had fun going there. They had simulators for everything. Humvees, Bradleys, Abrams, M-16, AT-4. Cool place.
So excited for LTX!
That guy who owns that stuff has a 90s Tech and 2000s Emo Hair Style... Wow... Nostalgia Overload Episode...
This is definitely a video about GPUs
Modern aircraft simulators use a seven-channel/seven-projector setup with a master server. Each channel processor (server) has one GPU and normally two DisplayPort outs to one projector using dual 1080p60. The software running on the master server is the special sauce that combines all seven (portrait-mode 1080p) images into the image seen in the cockpit. Of course there's also special hardware like motorized edge-blend plates, spectrometers to sample each projector's output so corrections can be made to keep them all as accurate as possible, etc.
Oh dear, Linus is back
@BeamNG Legend go*
@@luigiistcrazy Where'd he been❓🤔
@@Keepskatin Pray tell, whence had he traversed prior to his arrival at this present juncture?
Nearly spat my drink out when dude stood up with the HAIR 😂😂
LTT, there are $300 240Hz 1440p monitors on Amazon - maybe do a Short Circuit on one? They're impressive for the price.
Which one?
Honestly within 5 minutes of such a video the price would skyrocket
Innocn I got it and I’m happy with it
I still have a 1,000w PSU in my rig because I used to exclusively run AMD Crossfire.
Same. It makes me chuckle every time a tech-tuber comments on how energy-hungry modern GPUs and CPUs are, because all that's really happened is the market went from high power draw for high performance, to maintaining performance while increasing power efficiency, and now they're back to pushing high power through those high-efficiency designs.
Keep the retro stuff coming 👍
One of the best LTT videos this year!!
Who dropped off the 2006 emo kid?
fr
his mother, ofc
7:46 Glad that Linus deciphered the AA, I would have been very confused about Alcoholics Anonymous in a fighter-simulation context.
it has long been established that gaming chairs are the best thing you can buy to boost fps wdym ?
Near the 12-minute mark Linus explains how post-3dfx SLI/Crossfire has each card calculate only part of the screen. Nvidia's approach split the screen into two areas, one for each GPU to render, while ATi's CrossFire could select areas from a checkerboard pattern so as to get a better-weighted average of work per GPU (a toy sketch of both splits is below). It's too bad that Crossfire was less well supported in games than SLI, usually giving SLI'd flagship cards from Nvidia a slight edge over Crossfire pairs from ATi.
But those were the golden years of consumer GPUs, with Nvidia and ATi constantly trading the performance crown back and forth (2004-2014). Times were amazing, and I had the pleasure of enjoying a triple-CrossFire setup using HD 5870s and a BIOS-flashed HD 5850. It's so unfortunate Nvidia has pulled away from serving its gamer clientele, to the point where we're now paying $2000 for single-GPU cards that barely surpass their previous-gen counterparts from 3 years earlier.
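A toy illustration of the difference between those two splits (my own sketch, not anything from a real driver): split-frame rendering gives each GPU one contiguous half of the screen, while a checkerboard split alternates small tiles so busy and quiet regions average out across both GPUs.
```cpp
// Toy sketch (not real driver code): two ways of dividing a frame between
// two GPUs, as described in the comment above.
#include <cstdio>

// Split-frame rendering: GPU 0 renders the top half, GPU 1 the bottom half.
// (Real drivers could move the boundary per frame to balance the load.)
int SplitFrameOwner(int y, int height) {
    return (y < height / 2) ? 0 : 1;
}

// Checkerboard split: alternate fixed-size tiles between the two GPUs.
int CheckerboardOwner(int x, int y, int tileSize = 32) {
    return ((x / tileSize) + (y / tileSize)) % 2;
}

int main() {
    const int width = 1024, height = 768;
    int pixels[2] = {0, 0};
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            ++pixels[CheckerboardOwner(x, y)];
    std::printf("checkerboard: GPU0=%d px, GPU1=%d px\n", pixels[0], pixels[1]);
    std::printf("split-frame: scanline 600 of %d -> GPU %d\n",
                height, SplitFrameOwner(600, height));
    return 0;
}
```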
I think Apple introduced multi-GPU in a chip quite some time ago, when the first Retina iPad was released. Always wondered how they did it so seamlessly. This allowed them to get 120fps too.
I am not sure about dual GPUs in the iPad; however, I know that for their current Mac Pro you can buy a custom-made AMD card that has 2 GPUs on a single card.
@@austinverlinden2236 While this card has 2 GPUs on it, they show up as separate GPUs to the OS. There is some `magic` in place to attempt to use them as a single GPU, but for the most part they are 2 separate GPUs on one PCB with a fancy PCIe switch, rather than each getting 8 lanes.
@@hishnash ahh that makes sense. Thanks for the information
Just looked it up, and it seems like it's been a 'multi-core' GPU in the iPad Pro since 2015, so perhaps that's a little different, not sure - of course not the same as desktop GPUs, but it would be interesting to know how they split the work between the cores. A resolution of 2732x2048 at 120Hz on a mobile device always impressed me and felt like it went underrated.
@@peterthefoxx So Apple uses 8-core clusters within the GPU (every 8 cores acts as a single GPU from a dispatch perspective). The TBDR pipeline they use makes dispatching work across these much simpler, as it directs devs to group per-pixel operations within the pipeline, which reduces memory contention between tiles a LOT.
You've reminded me of a video from 15 years ago of an Australian declaring Crysis 'Whooped' by running it on Quad 9800GX2 cards at an average of 50 FPS.
Truly a golden era.
that's a lot of computer thingies!
the guy talking during the sponsor would make the perfect vampire. He's got that look.
It's weird to see a 30-something emo in 2023
6:08 " Because tonight will be the night that I fall for you, oooverr aaagaaiiin!"
I mean, AMD's iGPUs from the new 7000-series APUs could be used to just put several onto one PCB again.
Cooling should also be quite possible.
And chiplet design for graphics is a thing now, right?
Just make something with more and more chiplets and distribute them across the PCB instead of clumping them all together.
5:28 Sounds like my eastern European teammates in CSGO.
It's such a shame Nvidia decided to kill SLI right as VR was becoming a thing.
VR SLI was absolutely amazing - since both views show almost exactly the same area, just from a slightly different point, you could send the exact same draw calls to both cards and then just shift the camera position by ~8cm on one (see the sketch below).
Unlike traditional SLI, there was no microstuttering and almost *no overhead.* You added a second GPU and got like a 97% fps boost.
VR and 2 way SLI were a match made in heaven and Nvidia killed it.
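Here is a rough sketch of the core trick (my own simplified numbers, not Nvidia's actual VR SLI API): the scene and draw calls stay identical, and each GPU just renders from a camera shifted half the interpupillary distance to its side.
```cpp
// Simplified sketch of the idea above (not Nvidia's VR SLI API):
// identical scene and draw calls, only the camera position differs per eye.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Shift the camera along its right vector by +/- half the interpupillary
// distance (IPD). In VR SLI each GPU would render one of the two eyes.
Vec3 EyePosition(Vec3 head, Vec3 right, float ipdMeters, int eye /*0 = left, 1 = right*/) {
    const float shift = (eye == 0 ? -0.5f : 0.5f) * ipdMeters;
    return { head.x + right.x * shift,
             head.y + right.y * shift,
             head.z + right.z * shift };
}

int main() {
    const Vec3 head  = {0.0f, 1.7f, 0.0f};  // head position in meters
    const Vec3 right = {1.0f, 0.0f, 0.0f};  // camera's right vector
    const float ipd  = 0.064f;              // ~64 mm, a common default

    for (int eye = 0; eye < 2; ++eye) {
        const Vec3 p = EyePosition(head, right, ipd, eye);
        std::printf("eye %d camera at (%.3f, %.3f, %.3f)\n", eye, p.x, p.y, p.z);
    }
    return 0;
}
```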
nah
Hey Linus! Please read this!
The gamma, brightness and contrast look all messed up on your Sony GDM-FW900, and there might be a fix for it. It's a common thing with these old Sony Trinitron tubes, but luckily there is a setting in the menu called 'Image Restore'. This setting doesn't work until the monitor has been on and stabilized for a while. It fixed my old Sony Trinitron monitor and it might do wonders for yours as well.
Hope you will try this🤞
Hi Linus I'm a big fan
Multi-GPU is still going strong in 3D and video workflows - the NVLink/SLI bridges just stopped being necessary.
Damn, I really wish they tried running Unreal with this setup.
I've run Unreal on this brick before, I'll have more content about it on my channel in the future! (:
Peak PC gaming for me was late 2010-2012 ish when top tier GPUs were actually reasonably priced, and multi GPU setup was relatively well supported. Tri-Fire 7970s was my peak. Absolutely a blast to setup and play with. Sure, stuttering and scaling weren't great, but it was a ton of fun getting things to work. Oh, and CPUs actually had enough PCIe lanes for all of your PCIe devices.
Peak PC gaming for me was '92-'03. That's when innovative, amazing games were still coming out in droves and technology was improving as quickly as it ever has. Ultima Underworld was so ahead of its time it was like finding rabbit fossils in the Precambrian; there was nothing remotely like it before, you can't really trace its lineage, they just iterated behind closed doors and released this masterpiece. It was as unexpected and ahead of its time as the 1968 (before the moon landing!) Engelbart demo of the computer mouse, windowed personal computing and teleconferencing. Then you had the tail end of point-and-click adventure games, which was as good as the genre ever got before dying. Then you had the explosion of the FPS, RTS and MMO genres. You had all manner of weird genre mixups like car combat and FPS/RTS online multiplayer games. You had the peak of management sim games like Dungeon Keeper 2, which has never been improved on since. Modding was as powerful and easy as it was ever going to be, and wonderful niche games were being invented by users left and right (S&I, Gloom, Natural Selection, Team Fortress, DOTA, etc).
By 2005 or so, around the PS2 era, consolitis really set in and games were really, really dumbed down and really, really homogenized. Every AAA game had to hold your clammy hand in theirs and never let you go, never let you get lost and never let you fail. It was the time of the gray shooter. Field of view shrank to accommodate console gamers. Framerates became locked to 30 or 60 FPS to accommodate consoles (60 being a multiple of the 30 FPS standard used on weak console hardware). Controls became bogged down and slow, gated behind extensive animations to hide the latency of 30 FPS; this lowered the skill ceiling so much and it still hasn't recovered. Then came the loss of the third dimension; it used to be the case with 4:3 monitors and a large field of view that you could see properly what was above and below you, so you could let players exist on many different elevations; maps could be built in levels and you could have fun mechanics like rocket jumping. With 16:9 and console FOV this was largely flattened into a big plane with no level-over-level areas. Then came microtransactions, fee-to-pay, pay-to-win, gacha and loot boxes, and artificial time-wasting mechanics that you could pay to skip.
Only in the 2010s, with the wide-scale explosion of VR and indie developers, has PC gaming started to recover from the death of good AAA games.
Still sour about the mid-00s, with Doom 3 being a terrible sloppy mess that only allowed 3 monsters at a time so it could run on consoles; Half-Life 2 being a poor shadow of the original, as if Valve didn't understand what made it good; Oblivion being incredibly dumbed down and blighted by extreme level scaling to the point where it was essentially unplayable without extensive modding - it was also a pioneer of the microtransaction mess.
I mostly hibernated in mods and reveled in a few early indie games for the later half of that decade. I must have played a couple of thousand hours of natural-selection alone.
So Apple is doing SLI, Intel is making bang for the buck GPUs. Certainly fun times to be in
PowerVR worked better with multi-GPU configurations because it natively rendered the scene in 16x16-pixel tiles in on-die memory instead of by scanline. Plugging in multiple cards was supported even with the first-generation PowerVR card from NEC, it didn't require any sort of bridge between the cards, AND it scaled pretty linearly with the number of GPUs - you just had the GPUs render every other tile. Sega leveraged this with the Naomi 2 arcade board, which used two PowerVR 2 GPUs. It was basically a Dreamcast with a second GPU, a second SH4 CPU, a transform-and-lighting coprocessor, and more RAM.
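A toy version of that "every other tile" split (my own illustration, not PowerVR's actual scheduler): number the 16x16 tiles in raster order and alternate them between the two GPUs, which keeps the work roughly balanced without any bridge.
```cpp
// Toy illustration (not PowerVR's real scheduler) of interleaving
// 16x16 screen tiles across two GPUs, as described above.
#include <cstdio>

constexpr int kTile = 16;

// Index the tile in raster order, then alternate tiles between GPU 0 and 1.
int TileOwner(int x, int y, int screenWidth) {
    const int tilesPerRow = (screenWidth + kTile - 1) / kTile;
    const int tileIndex   = (y / kTile) * tilesPerRow + (x / kTile);
    return tileIndex % 2;
}

int main() {
    const int width = 640, height = 480;  // Dreamcast/Naomi-era resolution
    int tiles[2] = {0, 0};
    for (int y = 0; y < height; y += kTile)
        for (int x = 0; x < width; x += kTile)
            ++tiles[TileOwner(x, y, width)];
    std::printf("GPU0 renders %d tiles, GPU1 renders %d tiles\n", tiles[0], tiles[1]);
    return 0;
}
```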
LAST
unlikely
I love the guest. He's the mid-00s Scene Spirit Animal. Feel like we would have had a good time seeing Bring Me The Horizon, MCR, TBS, or whoever back in the day.
first
W rizz
S*it - 1 minute late.
Fax
Man, I bet Ross listens to Taking Back Sunday like me. Once a scene kid… always a scene kid!
Maybe ;D
❤️
@@gtastuntcrew302 I'm partial to anything with Anthony Green myself. 💙
4:25 That OSD brings back memories of my Dell Trinitron monitors.
2:42 " because tonight will be the night it goes in, over agaiahhn"
The depth of field cinematography across the card was beeeeautiful.
Dude, sick FW900! I've owned three in my life and I would sell a kidney to find one today. Alas, I sold my last one due to convergence issues and blurry text, and I'd imagine that any surviving units are suffering the same. Even barely-functional ones will pop up for over $1000 (usually double or more than that) which is mind-boggling considering I paid a whopping $150 for my first one back in 2010, which I thought was "too much" for a CRT at the time. I bought the second for $250, sold it for $350 about a year later which allowed me to buy a $350 unit in better condition. I kept that one for two "long" years before selling it for a whopping $550 in 2014, when I upgraded to 1440p which was massive at the time.
They were absolute space heaters and sucked up an enormous amount of juice, both of which kept me from spending a crisp stack on a dying dinosaur. PLEASE make a video about your FW900 if you can, I would love to hear your thoughts and qualms on that beast!
I love that the opening shot is framed so you can see cardboard Linus longingly looking at Linus's skin in the background.
I had several of the Quantum 3d x24's, absolutely loved them!
X-24's are the shit!! My fav card in 98SE! (: