Modern aircraft simulators use a seven-channel/seven-projector setup with a master server. Each channel processor (server) has one GPU, and normally two DisplayPort outs to one projector using dual 1080p60. The software running on the master server is the special sauce that combines all seven (portrait-mode 1080p) images into the image seen in the cockpit. Of course there's also special hardware like motorized edge-blend plates, spectrometers to sample each projector's output so corrections can be made to make them all as accurate as possible, etc.
I had an Obsidian X-24 card but sold it a few years ago. Interesting card with that medusa cable. Image quality wasn't that great, but SLI with only one PCI slot! Wow!
I mean, AMD's iGPUs from the new 7000-series APUs could be used to just put several onto one PCB again; cooling should also be quite possible. And chiplet design for graphics is a thing now, right? Just make something with more and more chiplets and distribute them across the PCB instead of clumping them together.
I'll never forget being a kid saving up to buy FF8, which came on 4 CDs. Got home and that's when I learned about video cards. It took months, but I saved up and bought a Voodoo2. Most rewarding moment of my life lol.
It's such a shame Nvidia decided to kill SLI right as VR was becoming a thing. VR SLI was absolutely amazing - since both views show almost exactly the same area, just from a slightly different point, you could send the exact same draw calls to both cards and then just change the camera position by ~8cm on one. Unlike traditional SLI, there was no microstuttering and almost *no overhead.* You added a second GPU and got like a 97% fps boost. VR and 2 way SLI were a match made in heaven and Nvidia killed it.
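For anyone curious what that ~8cm trick looks like in code, here's a minimal C++ sketch of the idea (no real VR or graphics API involved; the 0.064 m IPD value, the struct names, and the toy vector math are all my own assumptions): one head pose in, two per-eye camera positions out, with everything else about the frame identical between the eyes.

```cpp
#include <array>
#include <cstdio>

// Toy 3-component vector; everything here is illustrative, not a real VR API.
struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

struct Camera {
    Vec3 position;   // head position
    Vec3 right;      // unit vector pointing to the viewer's right
};

// Derive the two per-eye camera positions from a single head pose. With the
// same scene and the same draw calls, only this offset differs between the
// eyes, which is why splitting the eyes across two GPUs scaled so well.
std::array<Vec3, 2> eyePositions(const Camera& head, float ipdMeters = 0.064f) {
    Vec3 half = scale(head.right, ipdMeters * 0.5f);
    return { add(head.position, scale(half, -1.0f)),   // left eye
             add(head.position, half) };               // right eye
}

int main() {
    Camera head{{0.0f, 1.7f, 0.0f}, {1.0f, 0.0f, 0.0f}};
    auto eyes = eyePositions(head);
    std::printf("left  eye: %.3f %.3f %.3f\n", eyes[0].x, eyes[0].y, eyes[0].z);
    std::printf("right eye: %.3f %.3f %.3f\n", eyes[1].x, eyes[1].y, eyes[1].z);
}
```

That near-total overlap between the two eyes' command streams is exactly why per-eye multi-GPU scaled so much better than alternate-frame SLI.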
A Direct3D 12 solution might be a good one. Crossfire/SLI might not be used often, but one easy case could be if you have a GPU integrated into the CPU and then a discrete one. You might be able to offload parts onto the integrated GPU.
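To make the iGPU-plus-dGPU idea above a bit more concrete, here's a rough, hedged sketch of where D3D12 explicit multi-adapter starts: enumerate every adapter (typically the discrete card plus the CPU's integrated GPU) and create a device on each. It's only the setup step, not a full renderer, and the scheduling policy (what to offload to the iGPU) is entirely up to the application; this assumes Windows with d3d12.lib and dxgi.lib linked.

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
#include <cwchar>

using Microsoft::WRL::ComPtr;

int main() {
    // Enumerate every adapter DXGI can see: usually the discrete GPU plus
    // the CPU's integrated GPU.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::wprintf(L"adapter %u: %s\n", i, desc.Description);
            devices.push_back(device);
        }
    }
    // From here the application owns the scheduling: e.g. record post-process
    // work on a command queue from the iGPU device and copy the result back.
    return 0;
}
```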
Man, I miss my Obsidian X24. Never before or since has my jaw dropped as much as firing that up with some Unreal, just the intro. 1024x768... so glorious lol. Seeing the inside of this PC reminded me of the golden age. Building watercooling loops from trips to the auto parts store... I think my first one used an '83 Civic heater core for a rad. Scanline interleaving was really cool when it worked. This is when 200 bucks got you top-end GPUs and 400 got you the Obsidian. We truly are in the dark days right now with software garbage everywhere propping up bloated feature sets.
It was extremely fun to bring my favorite hardware and showcase it with you guys! Even more fun to drop a load of boxes on Linus though :D
My rawr XD retro tech brother! Great job in the vid :D
@@Dredile
You gave me the urge to listen to MCR after this vid
That is how it gets started: first you drop some boxes on Linus, next you start to drop all kinds of things. I love old crazy hardware stuff.
Not allowing Linus to install it was probably for the best 😉
I’ve never seen a man so trapped in time in my life. 2005 must have been an amazing year for this guy.
On so many levels.
Man climbed to the peak and decided to just stay there forever
Even the hair haha
I love it tbh. Bring it back.
@@njmcfreak No please. Don't. Those haircuts were everywhere and they were awful.
It's crazy that in half a human lifetime we went from nonexistent 3D rendering to realtime raytraced rendering. Mind blowing.
Don't worry, there are plenty of dumb/greedy aholes to pull us all back down. I am such a pessimist.
Yeah, what took me 4 days to render in LightWave for a 1-minute scene with the BttF DeLorean is now done in realtime at 100 fps, and people still complain that it dropped to 80 fps for a few seconds. :)
All thanks to socialism and government regulation that's not private property rights-oriented, a.k.a. for the greater good. Democratic socialism for the win!
@@TheSiprianus ????? wtf do you mean, literally all of the big 3 GPU manufacturers are in a heavily capitalistic country
Half truth, as games barely use RT at all. A fully ray-traced modern game isn't even close to possible.
Getting rid of aliasing artifacts for simulators is actually a pretty big deal. I remember when I was going for an instrument rating in MS Flight Sim '04, one big tell that I was heading off-course was when the pixel edges of things started shimmering too quickly, which is something that someone piloting a real plane could absolutely not rely on at all.
"The owner didn't trust me"
With Limus dropping expensive stuff everyday no one would trust him to handel their prized collection
did Destin let him handle the memory block and circuit board from the Saturn LVDC ?
What about his wife asking to hold the baby 😂
not the homie Limus, i don’t think he can handel this
Especially when the product is nigh impossible to find.
the owner knows
That OSD at 4:25 was probably the biggest nostalgia flashback I've ever had! I had a 17" Sony Trinitron Monitor capable of 120Hz back then and I absolutely loved that thing! I was using it way into the TFT era, until it died one day. Being forced to play @60Hz on my first TFT was a huge setback. The input lag was unbearable and I couldn't hit anything in online FPS. Never thought I'd ever see that OSD again. Thank you so much for that!!!
I had a 19" Samsung monitor that weighed 23kg and I carried it to many LAN parties.
Sometimes I think all great computer technology has been invented in the 1990s and just needed to be perfected in later decades.
Just a few computer innovations that first appeared in the 1990s and still exist in perfected versions that all build upon 1990s standards and technology:
- Wifi (802.11 in 1997, 802.11a and 802.11b in 1999)
- Digital Cellular Networks (2G in 1991, 3G in 1998)
- World Wide Web (implemented in 1990, published in 1991)
- Bluetooth (first introduced in 1998, not ratified until 2002)
- USB (USB1.0 in 1996, USB1.1 in 1998, USB2.0 in 1999)
- DVB (DVB-S and DVB-C in 1994, DVB-T in 1997)
- ISDB (1999)
- AES (1998)
- PGP (1991)
- MD5 (1991)
- SHA-1 (1995)
- OpenGL (1992)
- Direct3D (1996)
- JPEG (1991)
- PNG (1996)
- MPEG-1 Audio Layer II aka MP2 (1991)
- MPEG-2 Audio Layer III aka MP3 (1995)
- MPEG-2 aka H.262 (1994)
- MPEG-4 (1998, often implemented in the early 2000s under brand names like "DivX", "XviD", "3ivx", etc)
- AVC aka H.264 (2003, but first implementations were already around in the late 1990s alongside AAC)
- AC-3 aka "Dolby Digital" (1991, "Star Trek VI" was the first movie with AC-3)
- DTS (1993, "Jurassic Park" was the first movie with DTS)
- AAC (1997)
- IPv6 (1995)
- UTF-8 (1992)
- UTF-16 (published in 1996, first used in the late 1990s, not ratified until 2000)
- x86 instruction set extensions (MMX in 1997, SSE in 1999)
- Linux (first Linux kernel was released by Linus Torvalds in 1991)
- P2P file sharing (1999, "Napster" was the first P2P file sharing application and network)
- MMORPGs (1991, "Neverwinter Nights" [not the 2002 game] was the first MMORPG)
- Video games console emulators (1993, "Pasofami" emulating the NES was the first video game console emulator)
- Consumer PC water cooling (second half of the 1990s, first commercial product in 1999)*
*first watercooled computer was the UNIVAC 1 in 1951
"You don't need to reinvent the wheel"
Makes me think of my grandfather's washing machine: a million years old but still running, while I've gone through 2 in 5 years, each sold to me as new and better than the last one. Planned obsolescence is a hell of a drug.
A lot of tech is theorized in early periods but was not feasible due to hardware limitations (e.g. VR). We just live in a period now where many of those ideas can come to fruition, the latest trend being AI.
@@theguythatknows Everything used to be built better because it was built simpler. You can't have the complex modern technology that the masses want without fragility; just the sad truth, man.
Computer history in particular is filled with examples of this. So much theoretical work was already done through the 70s and 80s which didn't see any practical application because fabrication could not deliver the needed physical parts or some other part of the puzzle wasn't found yet.
When I was in the Air Force my job was basically working on/upgrading/fixing flight simulators, and I've worked with Quantum3D before. This video brought me back, so thanks for that.
Man, 3DFX in arcade games was a wild era.
Hydro thunder!!!
NFL Blitz always had crowds around it!
Sega of Japan missed out on this, something Sega of America had seen as a vision.
Yeah, the Voodoo series was the GPU to have back in the 2000s.
@@kenshinflyer In retrospect, it was a lot better for the Dreamcast to essentially be a cut down Naomi board because porting was a lot easier, and practically 1:1 to arcade.
@@kwizzeh: True, but, I mean, the Dreamcast ended up with a PowerVR GPU (that also means the NAOMI has a PowerVR GPU), which was behind Nvidia back then. Sega of America wanted the 3DFX Voodoo to be the Dreamcast's GPU, but Sega of Japan handed NEC, which was a PowerVR licensee, the contract to the Dreamcast's GPU. This whole Sega thing is also one of the reasons why 3DFX went bankrupt.
In the late 90s, I worked on an SGI Indigo that had a voodoo card with 4GPUs. It was the most impressive experience at the time. The lab upstairs set up simulations using these machines. Incredible technology for the time! It is what really pushed me to get into building gaming systems.
Love that story!!!
And later models were even more impressive, when coupled with realtime A/V hardware they allowed for interactive 3D graphics and compositing used in many live news and sports broadcasts.
Mate, in the 90's it felt like tomorrow could bring us affordable VR Headsets at any moment, but they also felt impossibly far away at the same time. We truly had no idea what was capable/possible back then, eh??
@@dylanherron3963 I used what was available, it did work (with caveats), was wildly expensive and bespoke, but very cool.
Since then we have been stuck in a quagmire of attempts by corporations to avoid standards just enough to prevent competition (from Nvidia to Facebook), which is the real thing killing practical VR/AR adoption. Consider that VRLink was essentially dropped before any devices had a chance to use it, with the first dedicated one being the PSVR2 just released.
@@orbatos You bring up fantastic points, and ones which I'm concerned about as a consumer/adopter of open source tech. Tech development tied to licensing (and the blocking out of it by corporate conglomerates) is the biggest blunder I can think of, man.
Ah! The days when video card manufacturers could just buy chips from suppliers and expect to be able to independently make a product with them, even if it was substantially better than the reference implementation.
Phenomenal script and delivery. Props to Linus and the writing department for bringing their A game and for the history lesson. Thank you as well to the editing teams for the visual learners!
Yeah, this one is on a whole other level. I am amazed.
Multi-GPU is extremely powerful for computational physics. Workloads like computational fluid dynamics need more VRAM than one GPU can offer, and can pool the memory of multiple cards. It's super difficult to implement and maintain though - developers have to manually implement and optimize which data to transfer between GPUs at which point in time. Making it more difficult, InfinityFabric on AMD is broken (driver segfaults) and NVLink on Nvidia is proprietary to CUDA and not available to OpenCL. So your only option is to do the communication over PCIe. Luckily, PCIe 4.0 is plenty fast and 5.0 is coming with double the bandwidth. This also enables cross-vendor multi-GPU, "SLI"-ing Radeon+Arc+GeForce.
14:49 Multi-die GPUs are super interesting. Apple succeeded with the M1 Ultra. AMD failed with the MI250(X), turning it into a dual-GPU-in-a-socket, but is trying again with the MI300. Intel's PVC has performance issues in single-GPU mode, so they also offer to configure it as a dual GPU in software. Nvidia's A100 is essentially a multi-GPU design, just on a monolithic die, and performance is outstanding.
Cool comment! Thanks for sharing :)
I like the idea of a geforce card but then representing arc on the side. Almost a nod to the multiple cards of the past.
Man, OpenCL really needs some love, doesn't it?
I concur
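Since the comment above describes the manual multi-GPU pattern only in words, here's a purely conceptual C++ sketch of it: two CPU threads stand in for two GPUs, plain array writes stand in for the PCIe transfers, and the workload is a made-up 1D diffusion step. Every name and number here is illustrative, not taken from any real CFD code.

```cpp
#include <vector>
#include <thread>
#include <barrier>   // C++20
#include <cstdio>

// Each "device" owns half of a 1D diffusion grid plus one halo cell on each
// side; after every step the boundary values are exchanged into the
// neighbour's halo, which is the part a real multi-GPU code does over PCIe.
int main() {
    constexpr int N = 16, HALF = N / 2, STEPS = 100;
    std::vector<double> dev0(HALF + 2, 0.0), dev1(HALF + 2, 0.0);
    dev0[1] = 100.0;  // initial heat spike on device 0

    std::barrier sync(2);
    auto worker = [&](std::vector<double>& mine, std::vector<double>& other, bool left) {
        std::vector<double> next(mine.size());
        for (int s = 0; s < STEPS; ++s) {
            // "PCIe transfer": copy my boundary cell into the neighbour's halo.
            if (left)  other[0] = mine[HALF];        // dev0's right edge -> dev1's left halo
            else       other[HALF + 1] = mine[1];    // dev1's left edge -> dev0's right halo
            sync.arrive_and_wait();                  // all transfers done
            for (int i = 1; i <= HALF; ++i)          // local diffusion update
                next[i] = 0.25 * mine[i - 1] + 0.5 * mine[i] + 0.25 * mine[i + 1];
            std::swap(mine, next);
            sync.arrive_and_wait();                  // step complete on both devices
        }
    };
    std::thread t0(worker, std::ref(dev0), std::ref(dev1), true);
    std::thread t1(worker, std::ref(dev1), std::ref(dev0), false);
    t0.join(); t1.join();

    for (int i = 1; i <= HALF; ++i) std::printf("%.2f ", dev0[i]);
    for (int i = 1; i <= HALF; ++i) std::printf("%.2f ", dev1[i]);
    std::printf("\n");
}
```

The pain point the comment describes is exactly the two lines marked as transfers: deciding what to copy, when, and hiding that cost is the hard, hand-written part of real multi-GPU solvers.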
I was a flight simulator technician at FlightSafety International in the 90's. When the Riva 128 came out, I built a simple visual system based on the 'Boids' Direct3D SDK sample from Microsoft. FSI had recently purchased VITAL from McDonnell Douglas, which became FSI Visual Simulation Systems (VSS). When I demoed my efforts to the folks at VSS, I was told PCs will never be able to compete with dedicated hardware and minicomputers with custom 3D rackmount image processors would be the future. I mentioned that there was more money going into consumer 3D card development than VSS could ever hope to spend. It's all PCs and mainstream graphics now. No one can compete with millions of kids (of all ages) wanting more FPS.
Nice video.
It feels weird that SLI has mostly gone by the wayside. I probably could never have afforded it, but it was interesting to see those kinds of setups made by others.
I had dual 980 Tis because at the time it was cheaper than buying a 1080 Ti. And, if I remember correctly, the 2x 980 Tis outperformed the 1080 Ti by a hair (where supported). For that reason alone, I wish SLI / Crossfire was still supported.
Agreed. IIRC, my buddy had two 8800GT SLI'd together. Always thought it was pretty neat, despite not being perfectly effective/efficient. I actually always wanted an SLI setup, but never had the chance to do one before they disappeared.
SLI was proprietary, so Nvidia was its beginning and its end.
@@Corei14 What's sad is that with DX12/Vulkan, devs can implement multi-GPU without the bridges and vendor-independent (so AMD/Intel/Nvidia), but where are the titles that take advantage of it?
@@spencerscott4878 Rocked a dual 980 setup till I switched to a 3060ti. The OG system was a 4790k @ 4.6 with 32GB 1600 RAM. What a darn beast that system was back then when SLI was still around and worked well enough.
This video was very technical. Whoever wrote it did an amazing job at making it digestible and easy to understand. I enjoy these deep dive videos!
Anthony is an excellent writer.
@@epoch888Anthony is an excellent everything!
@@LG1ikLx lol no
@@epoch888 hes offensive on the eyes though
@@najeebshah. rude
The difference between 3DFX SLI and the newer "SLI" is that 3DFX SLI didn't have the overhead of predicting frame areas/timings. The driver worked in a set way due to the render methods used at the time, and all that had to be done was send the pre-determined render areas (scan lines) to the display output. This is also why the Quantum cards can have AA and non-AA output active at the same time but can't pool their work into a single high-FPS image.
nVidia bought all the 3dfx assets, yet couldn't build a similar, solid SLI setup
@@halofreak1990 Yes, if you read my comment I explained why.
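As a toy illustration of the scan-line interleaving described a couple of comments up, here's a short C++ sketch where two worker threads stand in for the two GPUs and each one renders only every other line of a shared framebuffer; the framebuffer size and the "render" function are made up for the example.

```cpp
#include <vector>
#include <thread>
#include <cstdint>
#include <cstdio>

// Each "GPU" renders only every other scanline, and the output stage simply
// reads the one framebuffer they share. No load balancing or frame pacing
// logic is needed, which is what made the 3dfx scheme so predictable.
constexpr int W = 8, H = 8;

// Stand-in for "render this scanline": here each GPU just stamps its own id.
void renderScanline(std::vector<uint8_t>& fb, int y, uint8_t gpuId) {
    for (int x = 0; x < W; ++x) fb[y * W + x] = gpuId;
}

int main() {
    std::vector<uint8_t> framebuffer(W * H, 0);

    auto gpu = [&](uint8_t id, int firstLine) {
        for (int y = firstLine; y < H; y += 2)   // every other scanline
            renderScanline(framebuffer, y, id);
    };
    std::thread gpu0(gpu, uint8_t{1}, 0);  // even lines
    std::thread gpu1(gpu, uint8_t{2}, 1);  // odd lines
    gpu0.join(); gpu1.join();

    for (int y = 0; y < H; ++y) {          // "scan out": 1s and 2s alternate per line
        for (int x = 0; x < W; ++x) std::printf("%d", framebuffer[y * W + x]);
        std::printf("\n");
    }
}
```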
I remember my first 3DFX card, the Pure3D with 6MB of RAM, which was $700 back in 1998. A beast of a card, with 4MB used for texture mapping AND 2MB for frame buffering.
Those coloured lights in Quake 2 gave me an unfair advantage, according to my friends.
These rare old tech videos are my fav. I love seeing where the computer world came from. I wasn't upgrading my prebuilt purple compaq when these were out.
Please keep this type of content coming, what an amazing presentation of bleeding edge tech from 2 decades ago!
Anti-aliasing hardware is something I would love to see again. It's a very old problem that is still solved by "brute force" today, using the same compute resources as the very expensive chip that is supposed to render the shaders and the rest of the game.
Delegating this task to another, cheaper chip (or another dedicated section of the GPU) would allow the most recent chips to be optimized to the maximum.
FXAA or MSAA isn't like the old 2x/4x supersampling you could do on old cards on any game, anyway.
It's not always easy or straightforward to get the old SS 2x/4x antialiasing on games that are old enough that a modern GPU would have the power to render them the old-fashioned way, which is really the best way as far as image quality is concerned. The newer methods are cheats, more or less, to get around having to render 4x-16x the pixels; you can use AI filter models etc. too, but none of them are as good as just actually doing it. So the modern methods aren't really brute force as such, but on the other hand you don't always get to use the brute-force method now even if you want to.
Looking at what Apple did, and the way AMD went ahead and made chiplet-based GPUs a thing, I wouldn't be surprised if on RDNA 4 we'd see a proper, shaders-and-all, chiplet-based multi-GPU package.
So uh, that's... what "anti-aliasing hardware" is. It is brute force. In this case, the Voodoo brick is running the game at 1024x768 at the frame rate of a single SLI Voodoo2 setup, and the additional cards are literally just brute-forcing additional scenes to combine for AA. That's what "anti-aliasing hardware" is, in this context. It's a hefty brute-force approach.
3dfx did have some shenanigans for AA back then though, like using a rotated grid instead of a normal grid for sampling, which almost always gets superior results.
@@lasskinn474 The term "brute force" was misused on my part (even in French...). I agree with what you say.
Indeed, the mathematically pure approach is prettier but becomes absurd beyond 8x, because at that point analytically calculating the "coverage area" of a pixel theoretically becomes "faster". But no card does that, and the various engines aren't developed to do that. It's a change of approach that I think needs to happen, with chips that do just that. They could be used for all types of rasterization applications, even in medical imaging, research, etc., where pixel color measurement is important.
There must also be a direct relationship between the center and area of a pixel (and therefore its alpha channel) and the color of the virtual "edge" of an object, but I'm not a mathematician.
I'm very interested in the topic, because I've already started developing a 2D pixel art game that is pixel perfect without any need for an AA buffer. But I don't have the money, and therefore the time, to continue.
@@xFluing Changing the standard is very complex, because it makes games incompatible and therefore more difficult to develop. Separating these processes from the GPU is a good idea and I think it's required, especially so that heat dissipation doesn't become absurd to control. But this requires writing new engines that separate these processes, engines that would become incompatible with current GPUs. This is why we are stuck with old calculation principles. AI will fill in the gaps to solve what we can't manage to simplify mathematically ourselves while maintaining economic competitiveness. I'm afraid we'll lose a lot of efficiency.
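For the rotated-grid sampling mentioned a few replies up, here's a small C++ sketch comparing an ordered 4x grid against a rotated one on a single pixel crossed by a vertical edge; the exact sample offsets are my own approximation, not 3dfx's actual pattern. Sweeping the edge across the pixel shows the ordered grid only produces coverage values of 0, 0.5 and 1, while the rotated grid hits 0, 0.25, 0.5, 0.75 and 1, which is why it handles near-vertical and near-horizontal edges so much better.

```cpp
#include <array>
#include <cstdio>

// Toy 4x supersampling of one pixel against the half-plane "x < edge".
// Sample offsets are in pixel-local coordinates, 0..1.
struct Sample { float x, y; };

constexpr std::array<Sample, 4> ordered{{{0.25f, 0.25f}, {0.75f, 0.25f},
                                         {0.25f, 0.75f}, {0.75f, 0.75f}}};
// Rotated grid: no two samples share an x or a y coordinate.
constexpr std::array<Sample, 4> rotated{{{0.375f, 0.125f}, {0.875f, 0.375f},
                                         {0.125f, 0.625f}, {0.625f, 0.875f}}};

float coverage(const std::array<Sample, 4>& pattern, float edgeX) {
    int inside = 0;
    for (const Sample& s : pattern)
        if (s.x < edgeX) ++inside;          // sample falls on the covered side of the edge
    return inside / 4.0f;
}

int main() {
    // Sweep a vertical edge across the pixel and compare the two patterns.
    for (float edge = 0.0f; edge <= 1.0f; edge += 0.125f)
        std::printf("edge at %.3f  ordered=%.2f  rotated=%.2f\n",
                    edge, coverage(ordered, edge), coverage(rotated, edge));
}
```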
I ran 2 Voodoo 2s in my first PC back in 1999 (I was 19). I got a job after finishing school and felt so rich back then. Good memories. Had an early AMD 800MHz CPU (Athlon, was it called, or something?). And I shocked my friends with my 512MB of RAM! (Most people had like 128 back then.) What a beast.
I love seeing us go backwards in time and reinvent old ideas in tech.
When you were talking about CRTs having "round pixels", you're using a Sony Trinitron, which uses an aperture grille. They are more like vertical lines on a Trinitron.
Would be great to have multi-GPU that doesn't require special support from games etc. but rather is seen logically as one monolithic GPU by the software.
That's the way SLI worked on 3dfx cards. The Glide API presented an SLI setup to games as if it were a single card.
I love these videos on older high-end hardware, back in the day it was hard to get any coverage on these things, maybe in magazines, but that was it. Seeing it tested in video using modern reviewing standards is amazing, especially this kind of hardware which was not even accessible for a regular enthusiast when it was new.
Ross' hair perfectly matches the era of hardware on display
I know. I'm surprised they showed a link to his Instagram and not his MySpace page, lol.
Awesome seeing one in action. At the computer shop I worked at, we sold 6 of those to one client. It was a special order and cost an arm and a leg. They were not used for gaming though; they were used for video rendering for some space rovers. The main rendering was done on a large Linux cluster with 100+ servers (a lot of building on my side lol), but they discovered that, for that time period, the workstations had issues loading the renderings and rotating them, etc. I never saw the video cards in action (just the cluster servers, since someone popped incorrect RAM into one server of the 100, so I was hunting that mess down).
dude never left the mid 2000s. I respect that
That simulator looked a lot like what we used in the Bradley simulators at Ft. Benning. They had full hulls, and turrets built with computer screens in the scopes. It was fun to play since it simulated jammed weapons, misfires, and even wreckage if we flipped. Well, on screen it did lol. You did have simulator rounds you needed to load in order to reload the 25mm. I forget the name of the place though. I think it was CCT, or CCTT? We always had fun going there. They had simulators for everything. Humvees, Bradleys, Abrams, M-16, AT-4. Cool place.
Linus never runs out of video ideas
he pays people to think
@@HDReMaster write*
@@HDReMaster interesting 🤔.
@@JupiterGuy1 And what do they need to do before they start writing...?
Cause technology is always improving
Man, I bet Ross listens to Taking Back Sunday like me. Once a scene kid... always a scene kid!
Maybe ;D
❤️
@@gtastuntcrew302 I’m partial to anything with Anthony Green myself. 💙 又
I know Linus has said in the past that the music intro is bad for retention and they've been considering phasing it out, and I get it, no one wants a dip in watch time in the first minute of a video, but if Supernova is gone for good I'm going to be heartbroken.
They (Quantum3D) were really big on simulators back in the 90s. My dad got to try out their simulator system that was over $1m at the time - he was working in the government agency in charge of urban planning and they brought the entire setup down to try to sell the simulator setup.
Full "walkthrough" setup with multiple large CRTs (Trinitron/Diamondtron era) and racks of the systems with a 'treadmill' (so you could simulate actually walking through) and some controls for movement (lateral and tilt/pitch). The deal never went through, but it was quite an impressive setup involving a lot of different cards and the 'SLI' setups with display cable loopthrough (for overlaying).
There were many different graphics connection standards at the time and they also tried to support most all of them - VGA, BNC, SUN 13W3, Apple etc. Pretty insane stuff.
I recall when Captain's Workspace managed to run Minecraft Beta 1.7.3 (it cannot run newer versions due to persistent texture errors) with:
-Tyan Thunder 2500
-8GB Reg ECC RAM
-2x 1GHz Pentium III
-Quantum3D AAlchemy Module (AA Module)
-4x Modded Voodoo5 5500 PCI
-4x Quantum Atlas 10K II 18.4GB SCSI drives (in RAID 10)
-Windows 2000 Advanced Server
Man, it is amazing that it managed to make Minecraft look much softer than it does by default!
Captain is my boy Kasper!!! Been to LAN parties with him :D
The editors were merciful and never edited the Linus behind that bush that kept creeping me out.
I can't tell if it was a missed opportunity or a blessing.
The nice thing about SLI and Crossfire back in the day was that you could buy 2 cheaper video cards on sale, hook them together, and it could equal the performance of a high-end GPU at a fraction of the cost.
And that's the real reason that it died
I don't think that Crossfire ever worked as well as 3dfx's SLI implementation, because those cards were just designed to work that way and could still run without the other chip (if that makes sense) and just spit out every other line. They were so modular! Maybe I'm misremembering it, but I swear I've seen an SLI setup with a defective or disabled board, or just spoofed... I could be crazy...
The voodoo 2 was my first real GPU upgrade in my compaq PC when I was a kid - amazing how great that card was!
Ross looks exactly like the kind of dude who would own a simulator box like that, and I mean that as a compliment
He also didnt change his style since that gpu launch 🤓
Thanks haha :D
he looks like he could be the frontman of a early 2000s screamo band
the guys who were killing me all the time in quake must've been running these
This retro stuff is great, please make more content like this. Would be fun to see how sound developed too in pc gaming and content creation
There's better channels aimed at that audience out there than LTT.
@@TUUK2006 I'm aware, but did you take into consideration Linus' face and how excited he was doing this stuff?
Nice to bring him on to your channel with his amazing nostalgic hardware that's absolute gold!
Brings me back to the days of Amiga and 286. Beautiful setup. 👍
Wow, I was happy with my 3DFX twin voodoo 2 setup that I still have in a box somewhere. This looks insane!
Sony Trinitron? CRT.. dang! Quakeworld ran over the internet, on my twin voodoo 2's, smoother than almost any game I have ever played up until today. The network coding was insane. Luved all the vids that explored this. John Carmack and his team were genius programmers and the workarounds they used are legend.
Yep, been there, done that. It also had to do with Carmack and id putting as much assembler code as possible in the game engine rather than C++. They did some tricks in the code that were in the "firsts" category, or in the top tier of what developers could do at the time.
I also enjoyed my dual Nvidia 7950 GX2 cards, each of which was two cards with SLI built in. :}
@8:14 you can see they list forward observer simulators. I trained on one in 2003, it was absolutely incredible and identical to real locations, so we could use the same measurements and compass readings in the classroom or out in the training site.
I think we'll see tiled GPUs soon-ish, because it makes making various SKUs and allocating them way easier (e.g. if there are too many high-end GPUs sitting around, they could allocate the new chiplets to lower-end GPUs), and smaller chips have better yields, so especially going into sub-1nm I can see that becoming the best option for high-performance GPUs.
Though they would probably behave more like GPCs (Graphics Processing Clusters - a group of ROPs, TMUs and SMs, which hold a bunch of CUDA cores) than individual GPUs.
3:43 Some may not like it but I love the sound of old computers starting up. All fans spinning up to speed.
Using a secondary GPU for AA sounds like a great idea and it was way back then. Remember PhysX? Users with older cards could re-use their previous gen cards, though unlikely since buying new cards makes more money. Great watch, thanks!
Remembering this just made me wish we'd have dedicated Raytracing cards today!
Since Raytracing capabilities make much bigger leaps from generation to generation than traditional (rasterization) rendering power
@@LRM12o8 Honestly yeah, bring back separate encoding, RT, and AI accelerators.
I still have a 1,000w PSU in my rig because I used to exclusively run AMD Crossfire.
Same: it makes me chuckle every time a tech-tuber comments about how modern GPUs & CPUs are energy hungry. All that is really happening is the market went from high power draw for high performance, to maintaining performance while increasing power efficiency, and now they are back to pushing high power into the high-efficiency designs.
The card’s owner is a time traveller, that’s how he gets mint hardware and a matching haircut !
Thanks LOL :D
Flight simulator technician here. For our OTW (Out The Window) PCs we use GTX 1080s in SLI or RTX 3080s, depending on generation. But I've had a chance to work on the old stuff too, where one giant card the size of a large pizza box was in charge of each primary color (RGB). The sim space is pretty amazing.
I thought I'll never see a grown up scene kid. Props to you for keeping the style alive!
Thanks bro haha
4:56 Nice job, editors. This is the first example I've noticed where you can hear an echo of Linus's voice coming from Anthony's microphone. I'm assuming this is because they were talking over each other. This probably happens in every single video on this channel, but this is the first time I'm noticing it.
Can't wait for DirectStorage to become more of a thing. I think that opens the door for using the PCIe slot for intercommunication.
Ross, setting up a 25-year-old PC with his 25-year-old hairstyle. OK, maybe not 25 years, but early 2000s emo rockers want their style back.
this is what i came here for
Oh dear, Linus is back
@BeamNG Legend go*
@@luigiistcrazy Where'd he been❓🤔
@@Keepskatin Pray tell, whence had he traversed prior to his arrival at this present juncture?
Nearly spat my drink out when dude stood up with the HAIR 😂😂
This is definitely a video about GPUs
The real reason they killed SLI is the VRAM tier pricing system. You will also notice they refuse to make any affordable card with 24GB of VRAM. Who needs to buy a $10k card when you can NVLink 2x 4090s? Although the pro card does get you some power efficiency, the 4090 slams through workloads when it has enough VRAM.
So excited for LTX!
I wish they'd just added slots to add more vram yourself given that this almost always seems to be the limitation.
It's always been about profit. No vram upgrades because they wanted to sell you a 2nd GPU instead (which you could barely benefit from).
Then SLI was abandoned because they realized they could charge you $1500 for a single card and everyone is a winner (except you).
Greed was always the limitation. Not technology.
I had great hopes for Lucid's technology until Nvidia bought them out and then killed it. The idea of mixing graphics card technologies sounded great once they got it right. I have an RX 5700 XT and a GTX 1080 as backup in case my 5700 dies. I would love to have seen how they would work together with Lucid if they had kept perfecting it.
I did do SLI a few times though. I had two identical BFG GPUs that worked perfectly individually, but in SLI they were artifacting all over the place. I installed a different BIOS, from a different brand of the same GPU with the same clock speeds, on both GPUs, and SLI then worked perfectly with the cards. When I contacted BFG they were in total denial that it was a BIOS problem and refused to look into it. I wasn't alone; lots of people had the same problem. Maybe that's why BFG went out of business 😂
That guy who owns that stuff has a 90s Tech and 2000s Emo Hair Style... Wow... Nostalgia Overload Episode...
I think monetarily it makes sense if they can perfect performance scaling. NVDA would love to sell four 8090s to a single consumer who wants to quadruple their performance. It would also solve the GPU manufacturers' problem of GPUs that don't sell well, as all they'd have to do is shrink-wrap 4 GPU packages together, shove them into a box, and mail them to the consumer.
Yeah, but then you'll have to upgrade your PSU to a power plant.
@@InhalingWeasel "PSU manufacturers love this one trick."
Dang, My Chemical Romance as this week's guest; didn't realize you were an emo fan, Linus LOL
I like these types of weird cards. Have you guys already covered, or are you considering covering, the GTX 560 Ti x2 (two dies on the same card), 580 x2, and 760 x2 GPUs in a future video?
Are those like the GTX 690?
@@jesselioce similar in spec. But I'm curious how all the different variants compare in games.
0:37 left side of the screen
Still can't afford it
Same here
What makes that look so smooth isn't just the graphics card but the Trinitron monitor he's running it on. CRTs didn't require a native resolution, so they could run at a lower resolution without jaggies. I had a 19-inch Sony Trinitron monitor in 2000, and in my opinion LCD/LED monitors have only caught up to them in overall quality in the last few years.
Also, the "original" GeForce DDR I had was a powerhouse at the time! Definitely a legendary card in its day!
Keep the retro stuff coming 👍
You've reminded me of a video from 15 years ago of an Australian declaring Crysis 'Whooped' by running it on Quad 9800GX2 cards at an average of 50 FPS.
Truly a golden era.
LTT, there are $300 240Hz 1440p monitors on Amazon. Maybe do a Short Circuit on them; they're impressive for the price.
Which one?
Innocn I got it and I’m happy with it
Arcades definitely benefitted from this technology. I remember how crazy good graphics were at the arcade compared to home computers in the 90's.
I'm from the same era, and yes, this now explains how some arcades had what was, at the time, crazy good graphics that blew away everything you could have at home.
I think the problem with the current GPU situation is that AMD and Nvidia themselves control the VRAM of every AIB card, so compared to the '90s we're pretty much doomed: if we want more VRAM, we can't get the same GPU with more VRAM for a small price difference 😒😒
5:28 Sounds like my Eastern European teammates in CS:GO.
I think Apple introduced multi-GPU on a chip quite some time ago too, when the first Retina iPad was released. Always wondered how they did it so seamlessly. This allowed them to get 120fps as well.
I'm not sure about dual GPUs in the iPad; however, I know that for their current Mac Pro you can buy a custom-made AMD card that has 2 GPUs on a single card.
@@austinverlinden2236 While this card has 2 GPUs on it, those 2 GPUs show up as separate GPUs to the OS. There is some `magic` in place to attempt to use them as a single GPU, but for the most part they are 2 separate GPUs on one PCB with a fancy PCIe switch, rather than giving each one 8 lanes.
@@hishnash ahh that makes sense. Thanks for the information
Just looked it up; it seems like it's been a 'multi-core' GPU in the iPad Pro since 2015, so perhaps that's a little different, not sure - of course it's not the same as desktop GPUs, but it would be interesting to know how they split the work between the cores. A resolution of 2732x2048 at 120Hz on a mobile device always impressed me & felt like it went underrated.
@@peterthefoxx So Apple uses 8-core clusters within the GPU (every 8 cores acts as a single GPU from a dispatch perspective). The TBDR pipeline they use makes dispatching work across these much simpler and easier, since it directs devs to group per-pixel operations within the pipeline, reducing memory contention between tiles a LOT.
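For anyone curious, here's a rough sketch of the tile-binning idea behind a TBDR pipeline (plain C++, nothing Apple-specific; `Tri`, `binTriangles`, and the 32-pixel tile size are purely illustrative): geometry gets bucketed into screen-space tiles up front, so all per-pixel work for one tile can stay in fast on-chip memory before anything touches the frame buffer.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Rough illustration only: bucket screen-space triangles into fixed-size tiles
// so each tile's per-pixel shading can run entirely out of on-chip memory --
// the core idea behind tile-based deferred rendering (TBDR).
struct Tri { float x[3], y[3]; };   // 2D screen-space vertex positions
constexpr int TILE = 32;            // tile edge length in pixels (illustrative)

std::vector<std::vector<int>> binTriangles(const std::vector<Tri>& tris,
                                           int width, int height) {
    const int tilesX = (width  + TILE - 1) / TILE;
    const int tilesY = (height + TILE - 1) / TILE;
    std::vector<std::vector<int>> bins(tilesX * tilesY);

    for (int i = 0; i < static_cast<int>(tris.size()); ++i) {
        const Tri& t = tris[i];
        // Conservative screen-space bounding box of the triangle.
        const float minX = std::min({t.x[0], t.x[1], t.x[2]});
        const float maxX = std::max({t.x[0], t.x[1], t.x[2]});
        const float minY = std::min({t.y[0], t.y[1], t.y[2]});
        const float maxY = std::max({t.y[0], t.y[1], t.y[2]});
        const int tx0 = std::max(0, static_cast<int>(minX) / TILE);
        const int tx1 = std::min(tilesX - 1, static_cast<int>(maxX) / TILE);
        const int ty0 = std::max(0, static_cast<int>(minY) / TILE);
        const int ty1 = std::min(tilesY - 1, static_cast<int>(maxY) / TILE);
        for (int ty = ty0; ty <= ty1; ++ty)
            for (int tx = tx0; tx <= tx1; ++tx)
                bins[ty * tilesX + tx].push_back(i);   // triangle touches this tile
    }
    return bins;   // each bin can now be shaded independently, tile by tile
}

int main() {
    std::vector<Tri> tris = { {{10, 40, 25}, {10, 10, 50}} };   // one small triangle
    auto bins = binTriangles(tris, 1920, 1080);
    std::printf("tile 0 holds %zu triangle(s)\n", bins[0].size());
}
```

Once the bins exist, each tile's depth testing and blending can happen in local tile memory, which is where the reduced memory contention comes from.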
I love that the opening shot is framed so you can see cardboard Linus longingly looking at Linus's skin in the background.
It's weird to see a 30-something emo in 2023
Props to Ross; when he said back in the mid-2000s that it wasn't a phase, he really meant it.
that's a lot of computer thingies!
Modern AC simulators use a seven-channel/seven-projector setup with a master server. Each channel processor (server) has one GPU and normally two DisplayPort outputs to one projector running dual 1080p60. The software running on the master server is the special sauce that combines all seven (portrait-mode 1080p) images into the image seen in the cockpit. Of course there's also special hardware like motorized edge-blend plates, spectrometers to sample each projector's output so corrections can be made to keep them all as accurate as possible, etc.
It has long been established that gaming chairs are the best thing you can buy to boost FPS, wdym?
I wish SLI would be resurrected one day; it's just so gorgeous to have multiple GPUs/cards in one setup.
Who dropped off the 2006 emo kid?
fr
his mother, ofc
I loved my 260 and 460 SLI configs; I could use them for quite some time before I needed to upgrade.
Let's see how future cards will be made.
Hi Linus I'm a big fan
I had an Obsidian X-24 card but sold it a few years ago. Interesting card with that medusa cable. Image quality wasn't that great, but SLI with only one PCI slot! Wow!
I mean, AMD's iGPUs from the new 7000-series APUs could be used to just put several onto one PCB again.
Cooling should also be quite possible.
And chiplet design for graphics is a thing now, right?
Just make something with more and more chiplets and distribute them across the PCB instead of clumping them together.
I'll never forget being a kid saving up to buy FF8, which came on 4 CDs. Got home and that's when I learned about video cards. It took months, but I saved up and bought a Voodoo2. Most rewarding moment of my life lol.
It's such a shame Nvidia decided to kill SLI right as VR was becoming a thing.
VR SLI was absolutely amazing - since both views show almost exactly the same area, just from a slightly different point, you could send the exact same draw calls to both cards and then just shift the camera position by ~8cm on one (see the sketch below).
Unlike traditional SLI, there was no microstuttering and almost *no overhead.* You added a second GPU and got like a 97% fps boost.
VR and 2-way SLI were a match made in heaven, and Nvidia killed it.
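For anyone wondering what that trick amounts to, here's a rough sketch in plain C++ (no vendor API; the `Camera`/`eyePositions` names are just illustrative): the draw list stays identical for both eyes, and the only per-GPU difference is a camera position nudged half the eye separation left or right.

```cpp
#include <array>
#include <cstdio>

// Illustrative sketch of per-eye views for "broadcast the same draw calls"
// style VR rendering: one camera, two tiny view offsets, identical draw list.
struct Vec3 { float x, y, z; };

Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

struct Camera {
    Vec3 position;   // head position
    Vec3 right;      // unit vector pointing to the camera's right
};

// Eye positions differ only by +/- half the inter-pupillary distance (IPD).
std::array<Vec3, 2> eyePositions(const Camera& cam, float ipdMeters) {
    Vec3 half = scale(cam.right, ipdMeters * 0.5f);
    return { add(cam.position, scale(half, -1.0f)),   // left eye
             add(cam.position, half) };               // right eye
}

int main() {
    Camera cam{{0.0f, 1.7f, 0.0f}, {1.0f, 0.0f, 0.0f}};
    auto eyes = eyePositions(cam, 0.064f);   // ~64 mm, a typical IPD

    // With per-GPU broadcast submission, the same draw calls would be sent once
    // and each GPU would build its view matrix from its own eye position --
    // which is why the second GPU added almost pure throughput.
    for (int i = 0; i < 2; ++i)
        std::printf("eye %d at (%.3f, %.2f, %.2f)\n",
                    i, eyes[i].x, eyes[i].y, eyes[i].z);
}
```

That tiny translation is the only per-eye difference, which is why splitting the eyes across two GPUs scaled so close to 2x.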
nah
7:46 Glad that Linus deciphered the AA; I would have been very confused about Alcoholics Anonymous in a fighter-simulation context.
Damn, I really wish they tried running Unreal with this setup.
I've run Unreal on this brick before; I'll have more content about it in the future on my channel! (:
6:08 " Because tonight will be the night that I fall for you, oooverr aaagaaiiin!"
So Apple is doing SLI, and Intel is making bang-for-the-buck GPUs. Certainly fun times to be in.
A Direct3D 12 solution might be a good one. CrossFire/SLI might not be used often, but one case easily could be when you have an integrated GPU in the CPU and then a discrete one. You might be able to offload parts of the work onto the integrated GPU.
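For anyone who wants to poke at that, D3D12's explicit multi-adapter model is the usual starting point: the iGPU and dGPU enumerate as separate adapters, and you create a device on each. A rough Windows-only sketch (error handling omitted, and the work split at the end is just a hypothetical example):

```cpp
// Minimal sketch: enumerate every hardware adapter (iGPU and dGPU alike) and
// create a D3D12 device on each, the starting point for explicit multi-adapter
// work such as offloading passes to the integrated GPU. Error handling omitted.
#include <dxgi.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;                       // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::wprintf(L"created device on: %ls\n", desc.Description);
            devices.push_back(device);      // e.g. devices[0] = dGPU, [1] = iGPU
        }
    }
    // From here an engine could record copy/compute work on the integrated
    // GPU's device and graphics work on the discrete one; cross-adapter
    // resources are how results move between the two.
}
```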
LAST
unlikely
Please make more retro videos with Ross! He seems super interesting. I'd love to see more.
Thanks :D
first
W rizz
S*it - 1 minute late.
Fax
Man, I miss my Obsidian X-24. Never before or since has my jaw dropped as much as firing that up with some Unreal - just the intro. 1024x768..... so glorious lol. Seeing the inside of this PC reminded me of the golden age. Building watercooling loops from trips to the auto-parts store.....
I think my first one used an '83 Civic heater core for a rad..... Scan-line interleaving was really cool when it worked. This was when 200 bucks got you top-end GPUs and 400 got you the Obsidian. We truly are in the dark days right now, with software garbage everywhere propping up bloated feature sets.
I had several of the Quantum3D X-24s, absolutely loved them!
X-24's are the shit!! My fav card in 98SE! (:
Dude! Bring back the intros. I need my dopamine hit! 😅