Based on the performance charts, I wonder how they even marketed Quad FX, since it still wasn't any good. Did any applications use more threads back then? Some type of engineering program, for example?
I didn't recognize Gary till you mentioned ASUS, met him at HardOCP's Gaming Convention thingy. (Met Kyle Bennett too) both awesome guys! Also I still have an AMD FX-8350 + GTX 980 as a spare gaming rig. Also have a C2Q 9650 that runs a GTX 960 (before that it was a 560 Ti)
Even though it sucks, it sure is awesome. This is one of those things that you build today just out of pure fun, because you can. Multi GPU and CPU setups were absolutely a blast to play with back in the day. It usually wasn't practical, but it was a blast to tweak.
Man, those were the days when overclocking WAS a necessity out of the box, where you'd instantly get a 25% increase... it would be interesting, but I don't know how safe that is to do now given its age.
New Chinese dual-Xeon ATX motherboards are 100 or so, and the 4-18 core CPUs pulled from big server farms are very cheap. X79 uses quad-channel DDR3, X99 uses dual-channel DDR4. You can buy 19" rack dual-CPU server slabs for 150 upwards from A******n; they generally have two PCIe slots at the rear, plus a power connector for a GPU lead. Fit a USB3 card in the second slot. The biggest caveat is they have lots of small fans and are jet-engine noisy at startup.
Linus, I had an FX-8350 on an ASRock motherboard with a gen 2 M.2 slot that I stumbled upon accidentally. I used a gen 3 NVMe drive and it performed amazingly well with 16GB of DDR3 2400MHz memory. I had this rig for 3 years, then upgraded to a Ryzen 7 3800X. I recently built another FX system, the 8320 with 16GB of DDR3 1866 and an M.2 drive in a riser card... I gave my old system to a friend. You gotta admire AMD for going the distance and scaring the heck out of Intel nowadays. So yeah, being on disability I can't afford Intel, but it worked for me. BigT, Alaska
My first computer was a Dell OptiPlex 780 SFF with a Core 2 Duo (unsure on model), then I upgraded it to a Core 2 Quad Q6600 and damn, it went well for what it was. Great little computer, and I got it for $30 AUD way back lol
@@lands1459 Interesting, the guy I bought it off must've got a different CPU at some point or as an upgrade, because after a double check it had an E8400 duo @ 3GHz
Don't think there's too much difference, as the Intel quad cores at that time were simply two dual core chips on one package with no direct connection. Much like this dual CPU setup from AMD.
I have a retrocomputer with one of those! I purchased it in 2017 for a celebratory bonus build in parallel to my TR 1950X build. I have a number of retrocomputers with... intentionally weird and questionable parts, including the infamous nvidia leafblower (FX 5800 ultra) and a 3dfx Voodoo 5500. The Voodoo 6000 and XGI Volari Duo both evade me sadly. ...I also have some systems with modern weird and questionable parts too, like a working system with Via/Centaur's last x86 cpu (the CHA/CNS chip) before Intel acquihired Centaur.
I remember the Mars 760's. I honestly almost bought one. Kinda wish I had now cause I collect dual GPU cards and finding them is incredibly difficult. I would have hated myself at the time, but I would have one now for my collection.
The MARS I sold; some old guy needed it, and I got way more for it than I ever paid for it. The GTX 690 I got from Nvidia, the refurbished EVGA model, as an RMA for a dead 590 card; still own that. GTX 660s running in SLI were mad, too much overhead; the GTX 690 did the job, just basic 1080p gaming was needed!
You guys should do a video about old-school modding. The AMD K6, for example, where with a silver pen you could re-connect the traces they cut on the die to create the cheaper CPU version. And get some GPUs as well, where "soft modding" was still a thing to unlock the underclocked GPUs that failed yield testing and were offered as cheaper versions. Those days are gone and sadly missed - fun times, but it would be good for us older techno bobs to enjoy and to enlighten the newer generations about the "good times"
I remember upgrading from an Athlon XP to an Athlon 64 FX-74 and it was such an incredible upgrade. I also remember going into Fry's and the sales rep trying to sell me on Quad FX (or more like trying to sell my dad on buying me Quad FX). Glad that didn't happen lol
How did you upgrade to an FX-74 if you didn't buy the Quad FX system? The FX-74 only works on Quad FX, it was made specifically for that board and was only sold in pairs (2 CPUs per box).
@@FixedFunction lmao you made me go dig out that old pc (in the garage) and to my surprise it's an FX-62, guess I got it mixed up (I was like 12 at the time /shrug)
Shout out to the optical audio users out there at the time! (I was only one due to a Best Buy warranty gift where I got "upgraded" after buying a floor unit for $100 to a surround sound $1100 unit, but still. It was amazing.)
I just swapped out an AMD FX 8350 8-core from my work PC for the latest AMD. I didn't realise just how bad it was performance-wise. The CPU was great back in 2013, but later I added a 1050 Ti GPU from 2017 and it could barely use 50% of the available power of the GPU. However, it did hold up well: with water cooling it's been faultless for the last 10 years reliability-wise, and I didn't notice much in the way of limitations for low-level gaming, but I couldn't get almost any good results rendering videos, which is why I upgraded. It's a huge difference; the latest AMD CPUs paired with a 3060 Ti make a beast of a PC for both gaming and video production.
As someone who ran a 9850BE Phenom and upgraded to a 1100T Phenom II then side-graded to an FX9590... I did want to try any of the dual-CPU options AMD offered. Couldn't afford it though as I worked in restaurant kitchens at the time so I was minimum wage w/o tips. It was morbid curiosity that had me wanting to try the platforms even though I knew from a plethora of benchmarks that they were easily at least double the cost for the performance compared to normal systems. Even my 1100T and FX9590 would both outperform most of those dual-CPU options from AMD. But wait! There's more! While the 1100T had a decent array of instruction sets for its generation, and the FX9590 had more, neither had some of the most recent instruction sets. This meant that some games would refuse to launch let alone run. You were lucky if you could find a community-made patch that would act as a translation layer for the missing instruction set to a compatible one that either Phenom II or FX could run.
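(The launch failures mentioned above usually come down to exactly that: the game binary requires an instruction-set extension the CPU doesn't report. Here's a minimal, Linux-only Python sketch of that kind of check; the requirement list is hypothetical, not taken from any real title.)

```python
# Hypothetical requirements for a newer game engine; real titles vary.
REQUIRED = {"sse4_2", "avx"}

flags = set()
with open("/proc/cpuinfo") as f:  # Linux-only; lists the extensions the CPU reports
    for line in f:
        if line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
            break

missing = REQUIRED - flags
print("Missing CPU extensions:", ", ".join(sorted(missing)) or "none")
```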
That's still far from my 10th gen i5 😀 Back then, AMD decided to put more, but not so powerful, cores into processors. I guess they were like, "ohhh mate, if we add more cores it will be more powerful"
@@doublevendetta Ohh, I don't really recognize the Quad FX thing, nor have I heard of it, but I didn't think it was that new; I think the multi-core thing is a 2011 thing. So sorry if I was wrong
About HTTPS on old hardware: old CPUs lack a feature that's found on all modern CPUs, and that's hardware AES encryption and decryption (AES-NI). If you go to YouTube on old hardware, the CPU must do all the work to decrypt and encrypt the network traffic. That causes the CPU to max out and the video to struggle at the beginning. Other single cores and even some Core 2 Duos would suffer from this problem but never fully recover, because they can't catch up with the encryption (they just stay laggy and drop frames). So it really comes down to that hardware encryption support that makes the modern web work! Imagine opening a website where the processor had to do all the work; it would be like starting up a program every time. Oh, that's kind of like what Linus experienced when doing so...
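(For a rough sense of what that crypto cost looks like, here is a minimal Python sketch using the third-party cryptography package; the 1 MiB chunk and 64 MiB total are arbitrary choices for illustration. On a CPU with AES-NI the library's OpenSSL backend uses the hardware instructions automatically, so running this on an old and a new machine shows the gap being described.)

```python
import os
import time

from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

key = AESGCM.generate_key(bit_length=128)
aesgcm = AESGCM(key)
nonce = os.urandom(12)
chunk = os.urandom(1024 * 1024)          # 1 MiB of dummy "stream" data

start = time.perf_counter()
total = 0
for _ in range(64):                      # encrypt 64 MiB total
    aesgcm.encrypt(nonce, chunk, None)   # nonce reuse is acceptable only because this is a throwaway benchmark
    total += len(chunk)
elapsed = time.perf_counter() - start

print(f"AES-128-GCM on this CPU: {total / elapsed / 1e6:.1f} MB/s")
```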
This is the era when I really got into PCs. A friend of mine got some cool PC game and we heard about LANs. I bought PC magazines like a true nerd and eventually saved up and built my first system around a Core 2 Quad Q9300 and a Radeon HD 3870. The magazines taught me enough fundamentals and my passion didn't stop at gaming. The interest stuck around and now I'm a professional software dev with an MSc in computer science. I worry that computers these days don't inspire the way they used to. The kids I do meet don't seem to care about it at all, just phones and social media. Sad...
Kids these days are inspired as well. Let's face it, you're just looking for a reason to shit on the younger generation, which is especially ironic given you rely on said social media to share shit like this.
I would actually be sad that no one would try to live completely offline, or better yet without a computer, like life used to be. We're too dependent on computers, myself included.
10:47 (Seizure warning) Most of my computers since 2010 have been 2-4 way SLI machines. In my experience, that exact flickering is caused by a bad connection of the SLI bridge, due to oxidization or whatever. Usually just wiggling the SLI bridge around should fix the issue
The thing with platforms like these is that they don't have a solid upgrade path, which brings things to today with the Threadripper platforms. Essentially every generation of Threadripper: sure, the first generation got an upgrade from a 16-core to a 32-core max, but they were the same Zen 1/Zen+ architecture. Then there was the sTRX4 socket for the Threadripper 3000 series, which never saw a new chip released for it, ditching the previous dead-end socket for another dead-end socket. Then there was Threadripper Pro, which added yet another socket to Threadripper history, but hooray, it did provide a generational upgrade with some gotchas. The Threadripper Pro 3000s were mostly OEM-only, with off-the-shelf parts for DIY builders only appearing shortly before OEMs had access to the Threadripper Pro 5000s, making Threadripper Pro 3000 in the DIY space kind of rare: an upgrade path did exist for the dozen or so people who had one. While the consumer and server segments of AMD's lineup got some 3D V-Cache parts, Threadripper was again neglected. The saving grace for Threadripper Pro 3000 and 5000 is that many of the DIY motherboards can also work with Epyc chips. One downside of being mostly an Epyc chip is that Threadripper Pro can be locked to an OEM platform, complicating the used market.
The Radeon VII was not vaporware, it was just completely grabbed by miners. I assembled rigs and personally mounted hundreds of these; at the end of ETH's life you could push 100 MH/s EACH at less than 200W of power draw. It wasn't until the 4090 arrived that it was beaten, with 110 MH/s but at OVER 300W, so in the end the Radeon VII remained king of GPU mining.
I had an old Crossfire setup with a pair of 290X 4GB cards on water that I used until 2018. Once I took it down, one of the GPUs simply would not come back to life on its own: it would be recognized if I had it in a rig with the other one, but it was totally DOA, no picture, nothing, when it was in a rig by itself. When I had it as the primary GPU it would do that same thing with the flickering textures and other nasty problems; clearly it had some bad VRAM, but I also think the output engine was broken, because one of the outputs didn't work either. I had no idea anything was broken until after I took it apart lol
I appreciate the fact you compared the Quadfather setup to a Metapod 😂 Incidentally, I've been playing Pokémon a lot lately, since my Z97-P board died and I had to downgrade to a shitty H61 chipset board with only 2 RAM slots and an i5-2500K (which clearly doesn't run my games). But at least you didn't make the comparison to Butterfree, which is one of my best Pokémon and actually lets you steamroll the whole game with the right item and moveset.
Still running my FX-8350 to this day. Built my system like 7 years ago. Bought it because it was dirt cheap, and it's been overclocked ever since to a stable 5GHz. Upgraded a couple of years ago from a 750 Ti to an RX 480, and upgraded to more RAM, multiple SSDs and even an M.2 SSD through a PCIe adapter. Sure, the system isn't the best, but I really never had to complain about system instability or anything. It still runs my games fine, but in the near future I want to upgrade to a sweet 5800X3D
I used to rock an Athlon II X4 - I was always scared of the 95W TDP and the fact that my only working cooler was just extruded aluminum with a tiny 80mm fan.
I built a Netflix 4K standards PC in 2017. I have since upgraded the RAM to 32GB from 16, replaced my 120GB boot M.2 drive with a Team Group 1TB, and replaced the 1050 Ti FTW GPU with a ROG STRIX 8GB 980 something or other. And thank goodness 4K Netflix is available to other chipsets, GPUs and browsers. Also, the tubing from my built-in water cooling was sitting on the GPU backplate and melted a bit, causing a bit of coolant to leak onto and dry on said backplate. I don't dare move the computer anymore for fear of shifting that tube and causing a full-on coolant leak inside my running PC. Also, part of the pump system only sometimes works and I have only heard the fans on the radiator spin up twice, but strangely it keeps really good temps in limping passive mode... ;) That's the water loop, not the PC.
I just gotta say, the Bueller-esque cut to under the table as you plug in the board and continue to trash talk about Gary was fantastic. Kudos to the people who put that together.
My 14 year old i7-975EE (2009) still works - the only issue was the mainboard failed, but in a new board it still runs and plays fairly modern games at 1080p quite happily. It just goes to show the leaps made in CPU speed and tech over those 2-3 years, and the stalemate in development between then and when Ryzen put in an appearance to wake Intel up again.
Been watching your videos for more than 5 years now; graduated college, now doing my masters, and yet every video is so exciting and full of knowledge that it has contributed a hell of a lot to developing my love for tech. Just keep making awesome videos. Love from India
I remember the day my father got a Quad FX machine; it was just a business computer with basic graphics, so it was very much overkill. It was late in the lifecycle, and we had it for several years, and I became an AMD fan because of it. About 5 years later I decided on my first build with an FX-8120 OC'd to 4GHz. After several upgrades in the past 10 years (20GB of mismatched RAM, an RX 580) that Bulldozer is impressively still running overclocked after all this time. For as much hate as AMD got, they made some decent and reliable chips. My poor computer maxed CPU usage in Horizon: Zero Dawn, but I got my one playthrough lol
I have to wonder about the GPU choice for the upgrade that was tried here. I think a more appropriate one would be a 1050 or RX 580. They both would beat the crap out of that 760 and have better support under W7. Consider giving it a whirl, even if just in the lab.
I owned one of these for a number of years and used it as a workstation for graphics. I actually offered it to LTT via Twitter, but I assume you are quite crushed by random messages. I am surprised how poorly it ran for you because it worked pretty well for me. I ran Windows 10 and Linux, dual quad-core Opterons just like here, 8GB of memory (I think), and dual 9800 GX2 cards. Blender rendering was slightly slower than my 5.2Ghz overclocked FX-8350. Gaming was terrible, so I never did that except for screwing around. My weird issue, which you may have had but simply did not run it long enough to find out, is the well-known BIOS battery drain bug. As long as the system is on, the BIOS battery is being actively drained. After maybe a week of up-time, the system will crash, requiring you to take the GPU out and replace the battery. It was not a good system, but I see it as the beginning of what is now referred to as HEDT, even though Intel used the term first like three years earlier for a standard Pentium. I wish both Intel and AMD had stuck with this.
As the viewer who lent you that board, it was an amazing video, but there are a couple of remarks I'd like to make about the setup:
1. Windows 10 works on the board; I was able to install it and use it with no problems and without changing anything in the BIOS.
2. The cards have dual GPU cores and an SLI bridge chip on the card itself, so each behaves like any other GPU: the two GPUs are joined on the card and they work on non-SLI setups. You only need SLI if you want to use 2 of those dual GTX 760 cards.
3. The green flickering happens because of the output GPU, as you said; just plug it into another output and it will work fine. Same with the SLI performance and stability: it is not easy to make it work, but it does. I actually included some information about that with the setup.
But who is gonna take time to read your instructions, am I right? /s
@@MuitoDaora
NOT Linus of course
Brilliant man... thanks for keeping, tinkering and sending this board in. Such interesting content.
Thanks for sharing this board with us!
I am glad you had one to loan them. I buried all my FX stuff in the back yard.
One of my old coworkers bought this platform (FX-72) in 2007 while I was piecing together my Q6600 system. I think he ended up spending almost double what I did, and with AMD being quite competitive in those times, we both had very high expectations and were looking forward to comparing performance. I ended up having good luck with silicon lottery and my Q6600 could run 3.9GHz stable, and I had it paired with some fast Micron D9 DDR2. I cannot overstate just how much faster my Q6600 build was versus his FX-72 build, I actually felt bad for him, because it was a slaughter.
My father was an electrical engineer for AMD at that time and he agrees 110% with you, and says it wasn't the product he's most proud of... yeah, he helped design and make that and thinks it's garbage...
Man, I'd love to pick his brain about so much stuff as an AMD fanboy
my auntie is Lisa Su and said you are saying bullsh*t to farm likes
I'm guessing he was actually an electronics engineer? Electrical is the wires in your house scale stuff...
@@TheBloodypete and magnetism or three phase induction
@@TheBloodypete Wrong. An electrical engineer is exactly the type of person who would work on processors and design them. You are thinking of electricians.
It would be interesting to see a comparison video showing both AMD's and Intel's failures over the years and how they were able to improve over time. Heck, go all the way back to the 90's when the AMD K6 was introduced as a starting point. Now that's nostalgia.
I would not call Super Socket 7 and the AMD K6/II/III lineup a failure, as it was cheaper for consumers, and the IPC was just as good if not better than Intel's. I had a K6-II 550MHz machine I built with the money from my first after-school job, and it played every game of the era I threw at it on Win 98 SE and DOS.
@@CommodoreFan64 I think you misunderstood what I meant. Back in the K6 days, AMD was an up and comer that really pushed Intel hard. However, over the years since then, both companies started making mistakes and had to recover from them. That's the kind of video I'm looking for. Go back to the golden era where they were both squaring off, show their mistakes, and bring everything back to the present where they're now back in contention with each other.
@@CommodoreFan64 I had ALL those boards, DAMN near bulletproof! Those were fun boards to play with, the OLD DOS DAYS TOO! DLL HELL!! The old times, 😎
@@honorablejay Not an up-and-comer; AMD has been around since the 286 clone, I think the early 80s
@@xtreamerpt7775 Correct. Even before the 286.
Speaking about that, if you look closely at an AMD 286, it's copyrighted by Intel at the bottom of the chip. Most of them came out in 1982. They also reverse engineered the Intel 8080 without a licence in 1975, from a simple picture of the core die, and codenamed it the Am9080. And they were around even before that time (they were founded in 1969, actually). After that, Intel gave AMD a licence to produce CPUs for them as a second source, which lasted until 1984-1985-ish, when AMD started making their own CPUs, like the Am386DX and DXL-40 models, for example.
But ALL Am286 CPUs were produced under licence and copyrighted by Intel.
For those who are confused, this is not the FX line of CPUs that launched in 2011 on the Bulldozer architecture. Those were completely different and shared a floating point unit between every two cores, which later led to lawsuits for false advertising because they really weren't true eight cores, and their performance reflected that
Yup, people often forget this one. In plain words, the issue was that you had essentially pairs of cores. Each pair shared some resources, and when you were utilizing the second core in a pair, single-core performance of that pair went down significantly. And it needs to be said that even the full single-core performance wasn't great compared to Intel, and at the time there were still a lot of games which actually only ran on a single core.
I also feel like people often overstate the "budgetness" of these CPUs. They were cheapER than Intel's but also generally had more expensive motherboards if you wanted feature parity. The i5 destroyed the 8-Core on single-core performance, while the i3, although only dual core, had hyperthreading, so it was often a better option than the quad cores from AMD which, thanks to their design, were acceptable dual cores or weak quad cores (depending on the load).
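(As a small illustration of the pairing effect described above, here is a minimal, Linux-only Python sketch that pins a process to one core per module. The 0/2/4/6 layout is an assumption about how the kernel numbers the paired cores on an FX-8xxx chip; verify the real topology with lscpu before relying on it.)

```python
import os

# Assumed FX-8xxx pairing: logical CPUs (0,1), (2,3), (4,5), (6,7) share a module,
# i.e. a front end and FPU. Using one core per module gives up half the integer
# cores in exchange for avoiding the shared-resource penalty described above.
ONE_CORE_PER_MODULE = {0, 2, 4, 6}

os.sched_setaffinity(0, ONE_CORE_PER_MODULE)            # 0 = this process
print("Now restricted to CPUs:", sorted(os.sched_getaffinity(0)))
```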
Yep this wasn't the 2010s FX
But Bulldozer was a true 8-core, just 8 weak cores. Look at the single-core and multi-core scores.
Also, not to be confused with AMD FX, and not to be confused with Intel Core 2 Quad
Wasn't Intel's first attempt at a quad-core CPU also similar, like 2 dual cores basically bolted together, until they released the Core 2 Duo?
16:05 The reason the already-installed Nvidia drivers for the GTX 760 didn't work with the RTX 3060 is because, at least for Win7, driver support is divided between the GTX 10 series (and older) and the RTX 20 series (and newer).
So it would be correct that the 3060 needs different drivers.
Thanks for explaining this! I had first thought Nvidia just let their drivers for Windows 7 take a shit and got super mad.
@@arnox4554 They actually did drop active support a while ago; the latest version is 474 for Win7 and they will only provide security updates going forward, no more regular driver updates.
They also do similar stuff to this for Quadro and GTX. I tried plugging a 1080 into an M5000 laptop and the whole install got borked because of driver conflicts and neither card worked.
Similar story with a Pro W5700 and a Vega M GL laptop, the Pro drivers installed on top of the normal ones, and so the Vega didn't work until I DDU'd.
@@aijena Oh, that's fine. I meant outright shipping broken drivers for Windows 7.
@@arnox4554 Ah got it, yeah that's not what is happening there :P
Tho some versions of drivers may still run worse or less stable with some specific games, just like it is with all drivers on all platforms.
The power consumption of Quad FX is brutal
we threw out about 7000 FX chips when they came out. that's how trash they were.
Imagine having Quad FX with a Radeon R9 295X2... Crossfire... back in the day?
@@kimnice I reacted to this bit myself - why did they just keep mentioning nVidia cards on an AMD system? The obvious thing - I would have thought - would be to test it with Crossfire.
Using a couple of 4870x2's would've made more sense.
This entire video was a waste on what is a piece of rare computing history.
@@Q-nt-Tf You threw away seven-thousand CPUs?
@@I_enjoy_some_things Yeah that don't sound sus 😂
I super appreciate those seizure warnings. Luckily I never got one before, but I do get some pretty bad headaches from rapid flashing lights etc. Thanks, editing team!
Agreed! More accessible videos are always better.
@@collinblack9605 that's probably the skype call sound you're thinking of
NVlink replaced SLI and is only in servers
I remember "upgrading" from a Core 2 Duo E6600 system to an FX 8350 system, and if it weren't for the actual upgrade from an 8800 GTS 320MB to a GTX 760, I honestly felt like my PC barely got any better, certainly not to an extent that I was expecting going from 2 cores to 8 cores. I recall CS:GO struggling hard, meanwhile watching older Intel i5s that ran it better than my system with the same GPU... Then I switched to an i7-4790k with the same GPU and RAM (I think? pretty sure they were both DDR3) and man, CS:GO FPS tripled and other games easily saw a 25 - 50% boost. That CPU was SO BAD. I gave it to my step dad, he had a "better" use-case for it, with video rendering/editing, rather than gaming, but he still complained of system instability and it feeling slow, even when we got him an SSD to install Windows and programs onto.
It is hard to find game benchmarks comparing an E6600 to an FX 8350, but one I found did Valorant on a 750 Ti for both... the 8350 got 35 FPS @ 1080p... the E6600 from six years prior, with 6 fewer cores? It got 23 FPS. Obviously both are brutal, and the 750 Ti isn't even all that good compared to the 760 I wound up with, but still... I expected so much more after waiting so long to upgrade, and its performance in Windows and programs was just soooo bad and sluggish, especially compared with the 4790k. Night and day differences.
I am back to AMD now though. Went from a 4790k to a R5 3600X, then I went to a 5600X and now a 5800X3D all on the same X570 motherboard. That i7 was great though. I had it paired with a 760, a 980, a 1080, and a 1080 Ti. Now running 3070 Ti with my 5800X3D, never been happier with my PC, I run 1080p high refresh rate, and it feels like this PC could last me another 3 - 4 years at least honestly. Not worried about the 8GB VRAM, it hasn't stopped me from running any games at 1080p with RT yet, and DLSS means no worries honestly as I've only had to use that in a couple games at Quality setting so far, so going to Balanced/Performance is still an option down the line if needed, or turn off RT in some titles, or turn down a couple settings I don't value much. The thing with my current set up that I like so much though is how flippin stable it is. I *never* blue screen, games never crash, everything just works perfectly. Haven't had this level of stability from a system probably ever. It has been 1.5 years since I reformatted/reinstalled windows even and it is still just smoooooth.
I did the same, had a FX-8120, 8350 and then was gifted a 9590. Got sick of trying to keep the 9590 cool and got a 4790k that lasted me until 2021. Still is a decent gaming cpu even by today's standards but I moved on to a 10900 and then 11900k that I got for a great deal. (11900k + good z590 mobo for barely over $300 NEW)
I went from an (iirc) fx4100 which was dogshit, not even actually 4core, and couldn't run jack shit to an fx 8350 that was still dogshit. It overheated constantly even after thermal paste replacement and a box fan blowing at it. Absolutely awful CPUs and put me off AMD CPUs for life
@@nater420 You tried two cpus from the same generation and had the same bad experience and now are put off from amd? Come on now. Ever since ryzen amd has been very good.
8GB of VRAM is probably still plenty for 1080p for several more years. Even at 2K res I don't have to turn textures down too often on my 3070.
@@Jtwizzle It's already not for quite a few titles unless you are ok with dropping texture quality down a bit (which is one of the biggest visual impacts by far).
As someone who owned one of these: the biggest issue with the benchmarks from the day is that they were done on Windows XP, since Windows Vista hadn't yet released by the time this came out. I ran Vista and 7 on the board and actually had much better numbers than reviewers did. This setup wasn't really consumer-ready, but it was neat.
2006 was a time when it was better to have a fast dual core for gaming than a quad core. The problem at that time was that many games could only use a single thread. A frequently used gaming CPU at this time was a core2duo E6700, which was much faster for gaming than, for example, a core2quad Q6600. The AMD fanbase at the time was using the Athlon X2 6000+, but this CPU was a bit slower than the E6700.
But there is a world beyond gaming
I've still got my C2D E8500 with its 9800 GTX+ and 8GB of RAM; surprisingly fast even today, you can still work on it. It was the second fastest gaming CPU of its time, only the E8600 was faster, and you had to pay 30% more for 2x160MHz more. 3.16GHz and 6MB of cache.
@@A-BYTE64 That does not change the situation that in 2006 there were only a few applications with multicore cpu support.
Ah the Q6600, I had mine for years. Overclocked like a beast! Remember selling it for £25 after uni... went to a 4770K, which again lasted me forever, but CPUs had stagnated for a long time around then. Just got a 5800X3D and hoping to have struck onto another long-laster!
@@joshyc2006 well i pray that your long lasting journey of CPUs continue , 5800X3D is also a gaming beast of this era/time
Anyone remember the ASUS MARS GTX 295? 2x GTX 285 spec chips with 4 GB VRAM versus 2 x GTX 280 spec chips with 1792 MB. That thing was a beast back then! A very expensive one I could never afford though.
As I recall, it may have been called Mars, but it ran as hot as Venus!
My first home built desktop was with the FX6300... While it wasn't great at productivity stuff, it held up to High/Ultra 1080p gaming until about 2015/2016. That little chip means something to me. Surely not because it was great, but it was definitely serviceable and was my first time seating a CPU.
I had that, too! And Crossfire 2 hd6970's lol
Same here, FX 6300 Black Edition. Started out with a GT 610 and later got a GTX 1050. It wasn’t able to run GTA 5 though, would always crash on that first mission when you play as Franklin with Lamar
I shot myself in the foot by getting a FX 4170. It worked fairly ok with the games I played but took forever to encode my videos. I also didn't need a heater in the winter lol
Those Bulldozer and Piledriver CPUs are a completely different lineup than the ones in the video though, by about five years. I still have the FX-8350 and I believe Steve from GamersNexus still daily drives his FX-8370
Same! I loved that CPU! It was 2013 and I was running it with a GT 630 and then an R9 270X; that system put me through a lot! If it weren't for Covid I would never have had to upgrade that poor computer. It was really keeping up with anything I wanted to play and it aged well... it's now in the family PC, mainly for printing stuff :/, but man, it still works great! I would love to see an FX-6300 video, something with overclocking it to 5GHz and playing modern games
The 760 was the card I bought because I ditched SLI, finally recognizing that the fps counter was meaningless if it played like crap.
Never thought someone would be making a dual GPU card in that era... let alone buying one, let alone with midrange cards :).
I went with a 2x HD 3870 (dual GPU on one card) at the start, then a single HD 4xxx series card, an HD 5850, dual HD 5850s, dual GTX 560 Tis. I finally learned my lesson by then to just get a single card, no matter what.
Ah yes, the Quad-Father. I would have killed for this back in the mid 2000s. It was pretty much a ham-fisted attempt at catching up to Intel's Core 2 Quads.
AMD developed the platform within the same time as Core 2 Quad's development, and they released 30 days apart from one another. AMD didn't have much chance to tweak anything to stay ahead, and K10 was still a few months away. They banked on the "FASN8" program where K10 would be a drop-in upgrade and allow for 8 cores, and once 65nm yields for K10 proved to be a nightmare and the whole architecture had to be cut down to fit (axing most of the L3 cache) they cancelled that plan.
I'd love to see a video where the challenge is to get 60fps in a handful of modern games using the oldest hardware possible, including OC and used parts! Good reference to see what still holds up
I own both setups, almost: I bought the fancy AMD FX 64x myself, and almost bought the dual-socket board with one CPU. Never used the system a lot, too many driver issues. AMD now...?
Happy I found a Q9650 CPU in 2006, able to build my first DDR3 system on nForce; that was the cheap Intel system, just before the Core launch.
The Q9650 system we still use, 16GB, and the NVMe drive makes it fast enough to run Win10 x64 quickly; the new RTX card in it was not a waste! 775 can still be modern if you do it right, early 2000s!
The ASUS MARS was years later, cheap GTX 660 SLI! Wow, how weird, I understand why Linus did this build!
The 1080ti would be the undefeated goat
Back In the late 90's, I had an Abit BP6 dual socket 370 Mainboard, with two Celeron 466 cpu's overclocked to 525 MHz. The only problem was that at the time the only SMP O/S available to me was Windows NT4 Workstation, which barely supported any games. But it was a very cool mobo at the time 🙂
I think you should have gone with the 7950GX2. I always dreamed of that card back in the day.
Yeah that or 9800GX2 would have made sense for the era. A single GTX 760 is overkill let alone quad GTX 760’s 🫠😂
@@rare6499 100% the reason they didn't is because both the 9800GX2 and 7950GX2 are "extremely" unreliable and hard to find working nowadays
The Vega VII wasn't a paper launch. It is still a beast for crypto mining, comparable to a more efficient 3080. They were mostly bought out before reaching consumers. I've seen my fair share of those fragile beasts
I'm amazed a 4-year-old GPU is still so competitive at mining
When my current PC was first built it came with an FX 8350. Compared to the previous PC's Phenom II 1055t, only BeamNG and Space Engineers saw any improvement at all and some applications even got worse outright with stability issues. I didn't hesitate to upgrade it to an R5 2600x the moment affordable Ryzen MBs started being sold in my area. The improvement was night and day. My GPU wasn't getting bottlenecked anymore to the point I was even able to upgrade my monitor setup. (There was a whole other can of worms after that, but it was more due to ASUS related shenanigans than a fault of the platform itself.)
That double-cut shot under the table about his masochism was great. Keep improving your skit-making, LTT crew, it shows in the product!
You could run those on non-SLI boards. That was the main gimmick of the dual-GPU cards... at least as far as I know? My newest one from nVIDIA is the GTX 690 so maybe it changed later.
Yeah,
running mid-level GK104 chips on one PCIe bus, on both the GTX 690 and the ASUS MARS, was the solution back then: slow chips, getting some FPS, able to sync?
The GTX 690 I still own, the EVGA factory blower model, no fancy sticker; it was needed for basic 1080p gaming and replaced as soon as I could. The GTX 1080 did that for me; I still own the 1080 Ti.
You're forgetting about the Bulldozers. I took the risk of buying the "FIRST EVER 8 CORE DESKTOP CPU!" instead of that lame quad-core 2600K from Intel, and boy did I pay for that. The FX-8120 I got overclocked all the way to 4.9GHz on air but was still 30-40% slower than the Phenom II X4 955BE I upgraded from
I have a still-functional system using the non-ECC version of the board with the FX-74s (the PCIe slots are blue vs red), and I actually liked it overall. Yes, it ran hot, but I had the Thermaltake vortex coolers to handle that (which look pretty killer too, I might add), and I later used liquid cooling specifically for the GPUs (Nvidia Quadro 4000s, later upgraded to K4000s). It can actually play Doom 2016 at 720p on moderate-ish settings (deffs not 60fps, but playable). You can install Windows 10, and even Windows 11 if you remove the TPM requirement. Windows 11 is pretty awful performance-wise though. I think the biggest killer was that the RAM slots and onboard devices were known for going bad or being bad from the factory.
I remember on cold winter nights I would literally drape a blanket over me and it (at the foot of my chair) and just use the machine to keep me warm. lol.
I had an Athlon X2 Black Edition in an AM2/AM2+ ASUS board. Later upgraded to a Phenom X4 955 Black Edition. Used that for many years with a GTS 250. My first GPU was an 8500 GT.
I remember that era well. I was an AMD fanboy up to that point. Went from Athlons to an Opteron 170 as my first dual-core CPU. But yeah, did AMD drop off a cliff fast. When it was time for my next upgrade, I really had no choice but to switch to team blue and the stupidly good C2Q Q6600; that thing was an overclocking beast!
I remember the Pentium 4 days. One of my friends had the 3.0 GHz and it was a pain to cool. I only had the 2.4 GHz.
I remember there being some really cool builds back in the early 2000s that used multi-GPU and multi-CPU setups. I would really love a video about what people were doing back then with those.
Hey Linus, I still use my ASUS KCMA-D8, which runs a dual Opteron 4130 setup with DDR3 RAM, as a home server. A setup from 2009. I'm so glad you are enlightening the world about the world's best platform.
11:25 Perhaps "because it is there" is not sufficient reason for -climbing a mountain- buying that product.
I think it would perform a lot better with the dual-core 3GHz chips (4 cores effectively), and definitely with dual-channel RAM and a 1060 at most.
I had a dual Opteron 270 system for my gaming rig around that time, and it did rip at copying data on SATA drives with its two nForce 4 chipsets. It did make a difference against DDR2 dual cores at the time for some reason, until quad cores from AMD came out.
That screen tearing edit was glorious! That editor needs a raise :D
The edits seem old; I had to check when this was released. Like, the music in the background is so loud.
You should do a video build with all the oldest computer components you have in stock and compare it to a new system.
Next up: one of those old (circa 2006) Tyan dual-core Opteron motherboards with 4 CPU sockets plus the Tyan M4881 daughterboard that adds an additional 4 sockets for a total of 8 CPU sockets and 16 CPU cores in one machine
My 15-year-old AMD PC, the HP dc5850, was great. It had an AM2+ socket with a Phenom X3 and later a Phenom II X4 B97 (4x 3.2GHz), max 8GB of DDR2 (800MHz), and 2nd gen PCIe with a 128GB SSD on SATA-2. I used it till April 2019, when I replaced it with the 2nd slowest Ryzen ever (Ryzen 3 2200G), but that was still a huge improvement, 2x the speed of the Phenom II X4. That 2008 HP dc5850 is still in use with a friend.
I love the AM2+ and AM4 motherboards, they both supported at least 2 generations of CPUs.
The editing in this video should be the gold standard for LTT; they even impressed me with the quality of the seizure warning. It is appreciated when they warn people about loud sounds or strobing lights.
Amazing to see Opterons on the channel. I remember I was this close to buying a G34 socket server motherboard to get some of those cheap CPUs... but you can imagine why I didn't go for it. It could've been incredible for server things, but Ryzen (and even Threadripper and Epyc) becoming more affordable and more reasonable basically negates the whole purpose, unless I go "just for fun".
My old PC had an AMD FX-8350 Eight-Core Processor from 2012, and it honestly wasn’t all that bad. Yes, it ran very hot and really raised the temperature of my room, but it was snappy, great for gaming, and handled anything I threw at it. It still performs well today.
If you have a GTX 1060 or lower then yeah, it's fine.
@@C0mmanderX The old PC had a GTX 960
That could be the perfect winter setup for those who live in a very cold climate.
I used to have an FX-6300 and the computer sat near my leg. I could be shirtless with the AC on and still be sweating.
Same, to this day I am still running a 3rd gen i7; works fine.
I just had this running in the background while assembling a bike. I wasn't paying attention, but loved the throwing of Gary under the bus over a past CPU choice!
I have an MSI K8N Master2-FAR with two Opteron 280s, and it is similar to the Quad FX platform. Really good on paper (SLI, 12GB of DDR400 ECC, 2x dual-core CPUs, dual gigabit LAN, on an E-ATX form factor, ...) but released in 2005... one year too late for it to be interesting.
9:52 bottom, right image 💀
I'm honestly surprised and impressed AMD managed to crawl its way back as a competitor.
I don’t think anyone would invest that much in old parts. That’s why LTT did it for us.
Based on the performance charts... I wonder how they even marketed this Quad FX, since it was still not good? Did any applications use more threads back then? Some type of engineering program, for example?
Small office file and service server market, not for home. These systems in tower cases would be sitting in closets in offices.
I didn't recognize Gary till you mentioned ASUS; I met him at HardOCP's gaming convention thingy (met Kyle Bennett too). Both awesome guys!
Also I still have an AMD FX-8350 + GTX 980 as a spare gaming rig. Also have a C2Q 9650 that runs a GTX 960 (before that it was a 560 Ti)
Even though it sucks, it sure is awesome. This is one of those things that you build today just out of pure fun, because you can. Multi GPU and CPU setups were absolutely a blast to play with back in the day. It usually wasn't practical, but it was a blast to tweak.
Man, these were the days when overclocking WAS a necessity out of the box, where you'd instantly get a 25% increase... it would be interesting, but I don't know how safe that is given its age now.
FX is a really cursed name for tech...
Retrospectives of components from that era always make me want to look at old archived Maximum PC magazines.
Linus needs to try using a single-core Celeron lol... then you'll feel the power of wasted silicon.
It could be 100x more painful for him than his recent "cheapest laptop ever" video
@@sihamhamda47 I never knew a browser window lagging out was possible on modern CPUs till I experienced the abominations they try to sell.
I'm old, so I really love seeing you guys look at older "high end" stuff I could only dream about back when I was a kid 😂
I love dual-socket MBs, they just look cooler.
Probably hotter actually
New Chinese dual-Xeon ATX motherboards are 100 or so, and the 4-18 core CPUs pulled from big server farms are very cheap. X79 uses quad-channel DDR3; X99 uses dual-channel DDR4.
You can buy 19" rack dual CPU server slabs for 150 upwards from A******n , they generally have two PCIe slots at the rear, also a power connector for GPU lead. fit a USB3 card in the second slot. the biggest caveat is they have lots of small fans and are jet engine noisy at start up..
Linus,
I had an FX-8350 on an ASRock motherboard with a Gen2 M.2 slot that I stumbled upon accidentally.
I used a Gen3 NVMe drive and it performed amazingly well with 16GB of DDR3-2400 memory.
I had this rig for 3 years, then upgraded to a Ryzen 7 3800X.
Personally, I recently built another FX system, the 8320, with 16GB of DDR3-1866 and an M.2 drive on a riser card... I gave my old system to a friend.
You gotta admire AMD for going the distance and scaring the heck out of Intel nowadays.
So yeah, being on disability I can't afford Intel, but it worked for me.
BigT,Alaska
My first computer was a Dell OptiPlex 780 SFF with a Core 2 Duo (unsure on the model), then I upgraded it to a Core 2 Quad Q6600 and damn, it went well for what it was. Great little computer, and I got it for $30 AUD way back lol.
I've got one of these lying around; it comes with an E7500 stock.
@@lands1459 Interesting, the guy I bought it off must've got a different CPU at some point or as an upgrade, because after a double check it had an E8400 duo @ 3GHz.
13:58 props to the editor who put in the effect, subtle but nice.
Please look into CPU latency! From both team red and team blue.
Don't think there's too much difference, as the Intel quad cores at that time were simply two dual core chips on one package with no direct connection. Much like this dual CPU setup from AMD.
Note to editor: Background music too loud. That's all, thanks.
I second that.
time for a best CPU EVER VIDEO
I have a retrocomputer with one of those! I purchased it in 2017 for a celebratory bonus build in parallel to my TR 1950X build. I have a number of retrocomputers with... intentionally weird and questionable parts, including the infamous Nvidia leafblower (FX 5800 Ultra) and a 3dfx Voodoo5 5500. The Voodoo5 6000 and XGI Volari Duo both evade me, sadly.
...I also have some systems with modern weird and questionable parts too, like a working system with Via/Centaur's last x86 cpu (the CHA/CNS chip) before Intel acquihired Centaur.
I remember the Mars 760's. I honestly almost bought one. Kinda wish I had now cause I collect dual GPU cards and finding them is incredibly difficult. I would have hated myself at the time, but I would have one now for my collection.
The MARS I sold; some old guy needed it, and I got way more for it than I ever paid.
The GTX 690 I got from Nvidia, the EVGA model, refurbished as an RMA replacement for a dead 590 card; I still own that.
GTX 660s running in SLI were mad, too much overhead; the GTX 690 did the job, and just basic 1080p gaming was all that was needed!
You guys should do a video about old-school modding. The AMD K6, for example, where with a silver pen you could reconnect the bridges they cut to create the cheaper CPU version. And get some GPUs as well, where "soft modding" was still a thing to unlock the cut-down GPUs, because those failed yield testing and were offered as cheaper versions.
Those days are gone and sadly missed. Fun times, but it would be good for us older techno bobs to enjoy and to enlighten the newer generations about the "good times".
The Quad FX looks like something out of a *fever nightmare*
Your profile picture looks like something out of a fever nightmare
9:57 this video was a month+ in the works... dam... [this was recorded when the "Custom 'YTX' gaming build" was first uploaded]
I remember upgrading from an Athlon XP to an Athlon 64 FX-74 and it was such an incredible upgrade. I also remember going into Fry's and the sales rep trying to sell me on Quad FX (or more like trying to sell my dad on buying me Quad FX). Glad that didn't happen lol.
How did you upgrade to an FX-74 if you didn't buy the Quad FX system? The FX-74 only works on Quad FX, it was made specifically for that board and was only sold in pairs (2 CPUs per box).
@@FixedFunction lmao you made me go dig out that old pc (in the garage) and to my surprise it's an FX-62, guess I got it mixed up (I was like 12 at the time /shrug)
Shout out to the optical audio users out there at the time! (I was only one due to a Best Buy warranty gift where I got "upgraded" after buying a floor unit for $100 to a surround sound $1100 unit, but still. It was amazing.)
I love these legacy hardware reviews!
I just swapped out an AMD FX-8350 8-core from my work PC for the latest AMD. I didn't realise just how bad it was performance-wise. The CPU was great back in 2013, but later I added a 1050 Ti GPU from 2017 and it could barely use 50% of the available power of the GPU. However, it did hold up well: with water cooling it's been faultless for the last 10 years reliability-wise, and I didn't notice much in the way of limitations for low-level gaming, but I could hardly get any good results rendering videos, which is why I upgraded. It's a huge difference; the latest AMD CPUs paired with a 3060 Ti make a beast of a PC for both gaming and video production.
Like for the warning!
As someone who ran a 9850BE Phenom and upgraded to a 1100T Phenom II then side-graded to an FX9590...
I did want to try one of the dual-CPU options AMD offered. Couldn't afford it though, as I worked in restaurant kitchens at the time, so I was on minimum wage without tips.
It was morbid curiosity that had me wanting to try the platforms even though I knew from a plethora of benchmarks that they were easily at least double the cost for the performance compared to normal systems. Even my 1100T and FX9590 would both outperform most of those dual-CPU options from AMD.
But wait! There's more! While the 1100T had a decent array of instruction sets for its generation, and the FX9590 had more, neither had some of the most recent instruction sets. This meant that some games would refuse to launch let alone run. You were lucky if you could find a community-made patch that would act as a translation layer for the missing instruction set to a compatible one that either Phenom II or FX could run.
That's still powerful compared to my 10th gen i5 😀 Back then, AMD decided to put more, but not so powerful, cores into processors. I guess they were like, ohhh mate, if we add more cores it will be more powerful.
how
Swing and a miss, my guy; those slightly-less-powerful single-core but better multi-core CPUs came out when 8th and 9th gen were a thing.
@@doublevendetta Ohh, I don't really recognize nor have I heard of this Quad FX thing, but I didn't think it was that new; I thought the multi-core thing was a 2011 thing. So sorry if I was wrong.
About HTTPS on old hardware: old CPUs lack a feature that's found on all modern CPUs, and that's hardware AES encryption and decryption. If you go to YouTube on old hardware, the CPU must do all the work to decrypt and encrypt the network traffic. That causes the CPU to max out and the video to struggle at the beginning. Other single cores and even some Core 2 Duos would suffer from this problem and never fully recover, because they can't catch up with the encryption (they just stay laggy and drop frames). So it really comes down to that hardware encryption and decryption that makes the modern web work! Imagine opening a website where the processor had to do all the work; it would be like starting up a program every time. Oh, that's kind of like what Linus experienced when doing so...
Will you be auctioning this off at some point!😂
9:20 Linus WTF has such a good voice. We need him to sing karaoke on @channelsuperfun!
This is the era when I really got into PCs. A friend of mine got some cool PC game and we heard about LANs. I bought PC magazines like a true nerd and eventually saved up and built my first system around a Core 2 Quad Q9300 and a Radeon HD 3870. The magazines taught me enough fundamentals and my passion didn't stop at gaming. The interest stuck around and now I'm a professional software dev with an MSc in computer science. I worry that computers these days don't inspire the way they used to. The kids I do meet don't seem to care about it at all, just phones and social media. Sad...
Kids these days are inspired as well. Let's face it, you're just looking for a reason to shit on the younger generation, which is especially ironic given you rely on said social media to share shit like this.
I would actually be sad if no one tried to go completely off the line, or better yet off computers, like life used to be. We're too dependent on computers, myself included.
If there is one thing that being a long-time AMD user has taught me, it is that if it has FX in its name, avoid it like the plague.
10:47 (Seizure warning)
Most of my computers since 2010 have been 2-4 way SLI machines. In my experience, that exact flickering is caused by a bad connection of the SLI bridge, due to oxidization or whatever. Usually just wiggling the SLI bridge around should fix the issue
The thing with platforms like these is that they don't have a solid upgrade path, which brings things to today with the Threadripper platforms. Essentially every generation of Threadripper. Sure, the first generation got an upgrade from a 16-core to a 32-core max, but those were the same Zen 1/Zen+ architecture. Then there was the sTRX4 socket for the Threadripper 3000 series, which never saw a new chip released for it. Ditching the previous dead-end socket for another dead-end socket.
Then there was Threadripper Pro, which added yet another socket to the Threadripper history, but hooray, it did provide a generational upgrade with some gotchas. The Threadripper Pro 3000s were mostly OEM-only, with off-the-shelf parts for DIY builders only appearing shortly before OEMs had access to the Threadripper Pro 5000s, making Threadripper Pro 3000 in the DIY space kind of rare: an upgrade path did exist, but only for the dozen or so people who had one. While the consumer and server segments of AMD's lineup got some 3D V-Cache parts, Threadripper was again neglected. The saving grace for Threadripper Pro 3000 and 5000 is that many of the DIY motherboards can also work with Epyc chips. One downside of Threadripper Pro being mostly an Epyc chip is that they can be locked to an OEM platform, complicating the used market.
The Radeon VII was not vaporware; it just got completely grabbed by miners. I assembled rigs and personally mounted hundreds of these. At the end of ETH's life you could push 100 MH/s EACH at less than 200W power draw. It was not until the 4090 arrived that it was beaten, with 110 MH/s but at OVER 300W, so in the end the Radeon VII remained king of GPU mining.
I remember upgrading from my old Turion to a bulldozer and being quite disappointed. Turion was such a beast and I got such good mileage out of it.
11:15 The way Gary said, "yes it is!"
My word, he should make audiobooks
I had an old CrossFire setup with a pair of 290X 4GB cards on water that I used until 2018. Once I took it down, one of the GPUs simply would not come back to life on its own; it would be recognized if I had it in a rig with the other one, but it was totally DOA, no picture, nothing, when it was in a rig by itself. When I had it as the primary GPU it would do that same thing with the flickering textures and other nasty problems. Clearly it had some bad VRAM, but I also think the output engine was broken, because one of the outputs didn't work either. I had no idea anything was broken until after I took it apart lol.
11:43 heck yeah, love a good Linus Drop Tips montage.
That CPU cooler... Reminds me of the time I used to cool an AMD 939 setup with a Thermaltake Big Typhoon.
11:51 Gary looks like Linus’s disappointed father right here 😂
I appreciate the fact you compared the Quadfather setup to a Metapod 😂 incidentally, I play Pokémon a lot lately, since my Z97-P board died, and I had to downgrade to a shitty H61 chipset board with only 2 RAM slots with an i5-2500K (which clearly doesn't run my games).
But at least you didn't make the comparison to Butterfree, which is one of my best Pokémon, which actually allows you to steamroll the whole game with the right item and moveset.
Thank you Gary, now I don't feel too bad about my 300€ arm chromebook I bought just to see how ChromeOS is evolving.
People spent nearly 3 grand on 390s; you can't believe people bought these things?
Still running my FX-8350 to this day. Built my system like 7 years ago. Bought it because it was dirt cheap, and it's been overclocked ever since to a stable 5GHz. A couple of years ago I upgraded from a 750 Ti to an RX 480, added more RAM, multiple SSDs, and even an M.2 SSD through a PCIe adapter. Sure, the system isn't the best, but I really never had to complain about system instability or anything. It still runs my games fine, but in the near future I want to upgrade the system to a sweet 5800X3D.
I used to rock an Athlon II X4. I was always scared of the 95W TDP and the fact that my only working cooler was just extruded aluminum with a tiny 80mm fan.
I built a Netflix-4K-standards PC in 2017. I have since upgraded the RAM to 32GB from 16, replaced my 120GB boot M.2 drive with a Team Group 1TB, and replaced the 1050 Ti FTW GPU with a ROG STRIX 8GB 980-something-or-other. And thank goodness 4K Netflix is available on other chipsets, GPUs, and browsers. Also, the tubing from my built-in water cooling was sitting on the GPU backplate and melted a bit, causing a bit of coolant to leak onto and dry on said backplate. I don't dare move the computer anymore for fear of shifting that tube and causing a full-on coolant leak inside my running PC. Also, part of the pump system only sometimes works and I have only heard the fans on the radiator spin up twice, but strangely it keeps really good temps in limping passive mode... ;) That's the water loop, not the PC.
I just gotta say, the Bueller-esque cut to under the table as you plug in the board and continue to trash talk about Gary was fantastic. Kudos to the people who put that together.
My 14-year-old i7-975EE still works (2009). The only issue was the mainboard failed, but in a new board it still runs and plays fairly modern games at 1080p quite happily. It just goes to show the leaps made in CPU speed and tech over those 2-3 years, and the stalemate in development between then and when Ryzen put in an appearance to wake Intel up again.
Watching your videos for like more than 5 yrs now, graduated college, now doing masters, and yet every video is so exciting and full of knowledge that contributed a hell lot in developing my love towards tech. Just keep making awesome videos. Love from India
I remember the day my father got a Quad FX machine; it was just a business computer with basic graphics, so it was very much overkill. It was late in the lifecycle, and we had it for several years, and I became an AMD fan because of it. About 5 years later I decided on my first build with an FX-8120 OC'd to 4GHz. After several upgrades in the past 10 years (20GB of mismatched RAM, an RX 580), that Bulldozer is impressively still running overclocked after all this time. For as much hate as AMD got, they made some decent and reliable chips.
My poor computer maxed cpu usage in Horizon: Zero Dawn, but i got my one play through lol
Not to be confused with AMD-FX and not to be confused with Intel Core 2 Quad
Imagine playing Project Offset on this puppy
If only that hadn't died with Larrabee.
I remember back in the day I had a dual-core E6300 at 1.8GHz overclocked to a rock-solid 3.4GHz. Good days.
I have to wonder about the GPU choice that was tried for the upgrade here. I think a more appropriate one would be a 1050 or RX 580; they both would beat the crap out of that 760 and have better support under W7. Consider giving it a whirl, even if just in the lab.
I owned one of these for a number of years and used it as a workstation for graphics. I actually offered it to LTT via Twitter, but I assume you are quite crushed by random messages. I am surprised how poorly it ran for you because it worked pretty well for me. I ran Windows 10 and Linux, dual quad-core Opterons just like here, 8GB of memory (I think), and dual 9800 GX2 cards. Blender rendering was slightly slower than my 5.2Ghz overclocked FX-8350. Gaming was terrible, so I never did that except for screwing around. My weird issue, which you may have had but simply did not run it long enough to find out, is the well-known BIOS battery drain bug. As long as the system is on, the BIOS battery is being actively drained. After maybe a week of up-time, the system will crash, requiring you to take the GPU out and replace the battery. It was not a good system, but I see it as the beginning of what is now referred to as HEDT, even though Intel used the term first like three years earlier for a standard Pentium. I wish both Intel and AMD had stuck with this.