So my cousin recently received a computer (a decade-old Alienware A51) from one of his mother's co-workers whose husband recently passed away. It's got an i7 960 and a GTX 1060. I built a budget computer for them a couple years ago with an i3 6100 and a GTX 1050 Ti SC. Obviously, the 1060 is a significant upgrade from the 1050, and the new computer has superior RAM. When it comes to the CPU, though, the differences are not as easy to see: some comparison websites say the i7 960 is significantly better than the i3 6100, but others say the i3 is superior. When trying to boot up and reset the Alienware computer, there was a pop and it shut off. It was stored outside (under a roofed patio, but still subject to dew/mist), so I suspect the power supply shorted due to moisture. Am I better off transferring the 1060 graphics card to the i3 6100 computer that's already built and working? Or is the i7 960 superior enough to the i3 6100 to warrant tracking down the issues with an older system? Ideally, I would gut the old Alienware case and rebuild the internals with new-gen parts, but my cousin is on a limited income. Is the i7 960 worth the effort, or is the i3 6100 an equal match? I should also note my cousin is technologically illiterate, so OC'ing does not apply. He couldn't even figure out how to plug in the power supply cord or the HDMI cord from his PC to his TV.
Performance-wise the i7 960 is better, especially when overclocked, but it looks like overclocking is out of the question. I guess it makes more sense to stick with the i3, since it will feel snappier than the stock i7 960 for simple day-to-day tasks, and upgrade it to an i5 or an i7 some time in the future.
Don't see a lot of videos on this CPU so this was great. I still have my old rig stored in a closet which has an i7 960 so it's great to see what it can still do.
The old-style BIOS, too. It looks like only some LGA 1366 motherboard BIOSes can work properly with GPUs newer than the Radeon R9 and GeForce GTX 980 series, I suspect. The BIOS issue I experienced was discovered by accident, after installing an RX 580 in 2019 on my Asus P6T Deluxe. After that, I noticed that when my PC was unplugged and then plugged back in, the BIOS config was lost. Then I got an Asus P6T6 WS Revolution from eBay and had the same symptoms with the RX 580! Only after getting another motherboard and going back to my GeForce GTX 970 did I know why the BIOS was acting extremely strange!
@@RJARRRPCGP That's how I found it while upgrading an x58 PC with an RX 570. It worked in another motherboard, and then failed again with an RX 550. I installed a GTX 1060 and a GTX 670 with no issues. I took the RX 570 and tested it on a loop in another PC, also with no issues. I updated the machine's BIOS, and no matter what, it would either fail to display or lose display. So that PC ended up with a GTX 1060 6GB, which was better anyway.
Just retired my i7 930 a couple weeks ago for a B550 + 5700X. Massive improvement in frametimes and 1% lows, but minimal increase in avg fps with the same GTX 780. I guess the GPU is the bottleneck now.
Awesome B-roll at the beginning! As for the first "i" gen series, it is indeed impressive what they are capable of in today's games, but honestly the x58 platform is obsolete in my opinion, because it lacks even first-gen AVX, and when a CPU starts completely failing to launch games, it's time for an upgrade. A budget A320 mobo + 1600AF would not only outperform an x5650 but also launch all games, plus it has an excellent upgrade path. Imo at least AVX1 capability is a MUST for any CPU these days.
Also, in addition, x58 usually has only SATA II, MBR boot limited to a 2TB boot drive, no UEFI, and no GPT boot; it can't boot from an NVMe SSD unless you do some custom HDD boot I/O address redirection, and it typically has only PCIe 2.0 and no USB 3.0.
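The 2TB boot limit mentioned above falls straight out of MBR's 32-bit sector addressing; a quick sanity check, assuming the classic 512-byte sector size:

```python
# MBR partition tables store start/length as 32-bit LBA sector counts.
# With classic 512-byte sectors, the largest addressable boot range is:
SECTOR_BYTES = 512
MAX_SECTORS = 2**32

max_bytes = MAX_SECTORS * SECTOR_BYTES
print(max_bytes)           # 2199023255552 bytes
print(max_bytes / 2**40)   # exactly 2.0 TiB
```

GPT lifts this by using 64-bit LBAs, which is why booting big drives needs UEFI.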
Haven't had issues starting a game, or running them at playable fps, with a stock i7 960, but it is struggling more than it should in CoD Cold War. I feel like it may be due to missing instruction sets.
X58 was such a powerhouse. I have that same board with an i7 980x; my youngest boy uses it nowadays, but it will still do basically anything he wants. It's paired with a 1070 Ti.
Even if the frame rates look good, you are getting stutter in games, and in competitive games animations might also trigger stutters; you are giving away the much-needed high refresh rates and 1% lows required in the most intense combat. I wouldn't mind the performance if it weren't for RTS games: this CPU will suffer a lot compared to a modern CPU. Even a Ryzen 5600X would triple the frames in specific titles and lift you out of the below-30fps area. I'm happy with my 4790K, to be honest, and it isn't far apart from that 960 performance-wise: a cheap 32GB system with a cheap Asus Hero VII for optimal OC and a good onboard sound solution (comes with surround sound).
The Xeon x5675 is 41% faster and 17% lower in cost than the i7-960 at stock settings. For me the Xeon is the clear winner of these two CPUs, and it also runs on the same x58 chipset. The Xeon also has 6c/12t vs 4c/8t. Passmark: x5675 = 7754, i7-960 = 5491. Also, I have got my x5675 to 4.8GHz, pretty badass for a $12 CPU; give it a try.
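The 41% figure checks out against the quoted Passmark scores; it's just relative-speedup arithmetic:

```python
# Relative speedup implied by the Passmark scores quoted above.
x5675_score = 7754
i7_960_score = 5491

speedup = (x5675_score - i7_960_score) / i7_960_score
print(f"{speedup:.0%}")   # 41%
```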
I lived on an i3 540 until 2020Q4, when I replaced it (with a new motherboard) with an Athlon 200GE, which was already an uplift, and since 2023Q1 I'm finally on an R5 3600. Although the CPU was weak, I could get by with it mostly okayish; I didn't notice any bottlenecks since my GPU was also on the lower-end side. But I always wanted to have this CPU instead of the i3 540, even if just for the higher core count.
My PC specs and clock speeds:
- GA-X58-USB3 rev. 1.1 motherboard
- CPU i7 960 @ 3.2GHz / OC 4.5GHz
- be quiet! air cooler
- Ballistix RAM DDR3 1600MHz
- GTX 1660 Super Gaming X
- 850W power supply
Note: with a water cooler, you can clock the CPU to 5.0; with a good air cooler, no more than 4.5 stable.
Cinebench R20 score: 1410
Temps: 40° to 45° idle, 70° to 80° max load in Cinebench, 50° to 65° in gaming.
I got an HP with the same CPU and paired it with a Zotac GTX 1050 Ti 4GB low-profile mini. I play Battlefield 2, Battlefield 3, Half-Life, Half-Life 2, Serious Sam: The First Encounter and The Second Encounter, and Aliens: Colonial Marines. I think this system I got for $40 is the best I have ever had fun with, and it plays every game fine. I have 2 SSDs running in RAID 0. So many past games, and they play smooth as silk at ultra settings or a high/ultra mix 😊
For almost a decade (the 2010s) Intel was progressing the single-core speed of its processors only slowly. After 10 years (by 2020) the overall increase in single-core speed was about 120%, which is crazy slow... And after that Apple Silicon came, and x86 is DEAD and they don't know it..
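For scale, taking the commenter's ~120% figure at face value (i.e. a 2.2x total uplift), the implied compound annual improvement is only around 8%:

```python
# Annualized rate implied by a 2.2x single-core uplift spread over 10 years.
total_uplift = 2.2    # a 120% increase means 2.2x overall
years = 10

cagr = total_uplift ** (1 / years) - 1
print(f"about {cagr:.1%} per year")
```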
Technology hasn't really changed that much in the past 16 years. We're still stuck on Blu-ray-quality movies; hard drives have gotten mildly bigger and now aren't being made. We've got cheaper 4K TVs and slightly better cell phones, but not much more, as smartphones were already available by 2008. Windows 7 came out in 2009 and is still a great operating system, hardly any improvements there.
The new generation is so vastly different from mine, being born in the early 80's. I used to see massive improvements from the 90's to the 2000's, but that's pretty much where improvements ended. In fact I'm hardly seeing any advancements in technology at all since 2010. The Xbox One came out in 2013 but really isn't that much of an upgrade over the Xbox 360 with Kinect; most of these games could have been made to play on a 360 for sure. A lot of technology has actually been downgrading.
We've had some improvements in internet speed, but my friend just bought a brand-new Dell laptop and the Wi-Fi is using a Wi-Fi 5 chip with only one antenna, so the speeds were only like 50 of the 500. How are you going to put a 10-year-old Wi-Fi chip in a brand-new computer and then not even have an ethernet plug? It's a 40-dollar fix to upgrade to Wi-Fi 7 and install antennas from Amazon, but you really have to complain to Dell; they really dropped the ball with that.
Seeing that a 2008 computer is still playing the latest games, only slightly less well, is quite aggravating, yet people still keep convincing themselves to spend all that money on computers, making Intel and AMD super rich.
The Radeon suffered using 8 PCIe lanes while Nvidia used 16, so when you pair it with a Gen 2 x16 slot it really suffers. What made this generation of CPU/motherboard combos was SLI and CrossFire GPUs, because in premise more lanes meant more headroom and less bottlenecking.
An i7 975X, or any 9xx CPU overclocked, with 16 gigs of DDR3 would've made a well-built system that would serve from 2009 to 2019 with great performance; you'd only need to upgrade the GPU every half a decade 😂
The 2070 being faster in some titles, I'm pretty sure, is because of PCIe lanes: the 2070 has x16, making it able to deliver higher bandwidth than the 6600 XT, which only has x8.
Hey, anyone have a working Z68 mobo? I have a 2700K + 7970 combo to resurrect. :) These old processors are awesome for sure. Keeping them ticking is an issue though.
Mostly because games are more GPU-bottlenecked than CPU-bottlenecked. Not that there isn't a need for CPU power (as he saw when he ran stock), but if you have to pick between CPU and GPU in your budget, and gaming is your goal, then putting more into the GPU isn't a bad idea. I'm still running a Phenom II 965 on my machine. That's a 2009 chip and it still runs things quite well.
How are the 970-990x 6c/12t these days? I have an old Dell workstation I pass around to friends when they need a rig to get gaming with and I'm wondering how it's doing with more recent releases.
more proof gpu is the big cheese around here. cpu matters of course. but damn this thing is still running games to this day lol. so technically if you wanted to, you could play everything but the most demanding titles at 60fps with anything from the past 6-8 years. especially at 4k.
When I used a Xeon on this platform I had nothing but problems: had to do a BIOS update, couldn't overclock, boot problems, etc. Going back to a 960 solved all my problems.
Know it's unlikely that you will see this, but the Gamers Nexus video on "GPU Busy" with that software would be really interesting for this kind of content, to see how the bottleneck is happening.
W or X or even the i7's, whatever is cheapest. My x5650 would do 4.5GHz, and desyncing the QPI from the RAM actually brought a performance improvement. I was running 1600 CL8 (8-8-8-22) triple channel with 6 ranks per channel, and a QPI that, if synced, would correspond to 2400+MHz RAM. Don't remember the exact number.
Man, what I really don't get is that you folks never talk about the power consumption that comes with these chips, especially nowadays. This CPU has a 130W TDP, that is crazy. There is a reason people don't build these anymore: they need too much power.
It should have the vulnerability mitigations enabled by default. The vulnerabilities were used by the CIA, and later leaked to the Russian/Chinese and Iranian/North Korean militaries.
Great video. My father and I would have been pumped to have one of these back in the day. It's good to see someone still running a few tests on them. Thank you.
The 6600 XT is a much faster GPU, but even so, with the 2070 both are not utilised to their full potential; there must be some kind of bottleneck on the 2070. Memory bandwidth? I have the luxury of having a few semi-modern and modern CPUs at my disposal, and guess what: my main rig is a stock i5-9400F. I play mainly older titles (i.e. older than 5 years), and my main bottleneck is my GPUs (GTX 1080 and RX 6600). I could easily swap up to a 9600K, 11400 or even 12700KF, but what is the point? I am at 85% CPU on the 9400F, so I would be at like 40% on a 12700KF.
I bought a new system just to play Starcraft II and Wolfenstein New Order - way back when. Didn't have a lot of money, was at University at the time. So after reading hundreds of reviews I got a Gigabyte UD motherboard with an i3-540. Overclocked it easily to 3.93Ghz on air, added 4GB DDR3 1600 and an HD6870 and everything was sweet. A few years later, my friend upgraded his machine and I inherited his Sabretooth board and his i7-960, with a huge Corsair HS+FAN. That CPU ran all day long at 4Ghz and was amazingly fast! I used it for half a decade, through multiple GPU generations, and I still have the board + CPU today in the storage.
My best GPUs in descending order are RX 6650 XT, RTX 2070, GTX 1660/GTX 1070, GTX 980, GTX 970, RX 480, R9 290, GTX 770. I figure I've got enough GPUs to run panel tests on my CPUs if I wanted to make videos for middle-aged dads who want to relate to their kids through computers.
I have the Intel DX58SO2 with the same CPU, paired with a high-performance Intel heatsink/fan. Really the only time I use this setup is for GPU testing; I've never tried any overclocking, but may try in the future. Great video, well produced!
We have the same motherboard. The Rampage III Formula is a tank. Native SATA III support really helps with SSD read/write speeds. I'm running a Strix R9 380 OC in it. Mine came bundled with a 990X CPU, tho. With liquid cooling I was pushing 5GHz and keeping up with Intel i7 CPUs up to 7th Gen and AMD Ryzen Gen 1, with the only limitation being instruction sets (or lack thereof). I've since switched it back to air cooling and lower clocks since it's now running my media server. 10 years (and thousands of run hours) in my ownership. I've since upgraded to a Z390/9700K system with an Arc A770 as my main gaming PC.
Nvidia has influenced gamedev for decades to bet more on GPU than CPU, so CPU-dependent part of gaming engines are stagnating for years now. No wonder older CPUs are still valid compared to same-year GPUs. Even my FX 8300 is still capable of running most modern games, which is ridiculous.
Ah yes, the first Core i CPUs. I was just starting middle school when these came out; I remember other kids talking about how insane the performance was on these machines. Those motherboards look very utilitarian and complex, and are my inspiration for building new systems to this day. It's not easy getting anything to look quite as brutal as those motherboards did, but I try my best. The case helps a lot.
@@jamezxh It really depends; the quad-core i7s from Sandy/Ivy Bridge are either slightly better than or perform similarly to the FX 8350 in CPU-demanding titles, that is if the CPUs are at stock. (i7's from the 1st till the 9th gen didn't have an aggressive turbo boost, so there was quite a considerable OC headroom.)
@@ismaelsoto9507 You mean i5's. My old 3770 destroys the 8350; the single-core performance is woefully slow on the 8350, and overclocking does little to improve it.
I still run an x5650 at 4.3ghz as my main pc. It does me fine for my gaming needs even today but the extra 2 cores and 4 threads over the i7s really helps.
I was running a Xeon X5680 in an X58 Sabertooth at 4.3GHz. It died after a few months, so I replaced it and OC'd the new one to 4.1. It was my daily driver for a few years.
Let's put you on a treadmill and make you sprint and see how long before you die. My money's on you lasting less than a few months. Now where'd I put my bullwhip?
LGA 1366 was one of my first major computer builds that I did all myself. I was dabbling in the world of Xeons. I had to part ways from that whole scene due to the lack of instruction sets for VR at the time. I've been consumer CPU since.
I was rocking a 920 engineering sample until 2016, then I jumped to a xeon w3680 on the same board since those came down in price, and that's 6 cores instead of 4, definitely worth it.
I have noticed one issue with it: it really struggles in some games where it shouldn't, and I think it may be lacking instruction sets or something. Otherwise I dunno what's up, really.
It's kinda weird to see people using these 1st gen chips again, since I didn't really know anyone who used them back in the day. It's neat to see one still running so many triple-A games, but I think the lack of AVX instructions has left it as a historical curiosity, now that Sandy Bridge chips are cheap as chips and those can run more games.
Thanks for watching! Music is Digital Sunset by Karl Casey - ua-cam.com/video/xbacHaqsRBI/v-deo.html
Thanks for watching! You can support my work here: www.buymeacoffee.com/ratechyt
discord.gg/PFb9cMstZH ▪ instagram.com/ratechyt ▪ twitter.com/ratechyt ▪ facebook.com/ratechyt
It's overclocked though. Do it again at stock settings and see what it can do; it probably won't run most of these games.
@@jponz85 I did add a few stock results. Like I said in the beginning, there was no point in going too in-depth.
Just a heads up, you can run 48gb of RAM on any x58 board.
@@n0nyabznss Untrue. Some boards are only capable of running 24GB. I got an x58 Pure Black Sapphire, for instance, and no matter what kind of RAM I introduced, I was unable to get it past 24GB, even at low speeds and with extra voltage. A lot of people around the web have reported success with 48GB, but that's not for EVERY board.
So have you tried the 6-core Xeon? I actually found it better to keep BCLK around 190 and then just raise the ratio and QPI. This way even the RAM can be pushed higher, close to 2000MHz, and you can easily achieve clocks past 4GHz too.
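The relationship the comment describes, everything derived from the base clock, can be sketched like this (the multipliers below are illustrative examples, not a tuning recipe for any particular board):

```python
# On X58, core, uncore and RAM clocks are all multiples of the base clock (BCLK).
def x58_clocks(bclk_mhz, cpu_mult, ram_mult, uncore_mult):
    return {
        "core_mhz": bclk_mhz * cpu_mult,       # CPU frequency
        "ram_mhz": bclk_mhz * ram_mult,        # DDR3 effective rate
        "uncore_mhz": bclk_mhz * uncore_mult,  # uncore/L3 frequency
    }

# BCLK 190 with a 22x multiplier lands past 4GHz, and a 10x RAM
# multiplier puts DDR3 close to 2000MHz, as described above.
print(x58_clocks(190, 22, 10, 18))
```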
I find it outrageous that, even back in 2009, the first-generation Core i7s supported triple-channel RAM and yet we're reduced to dual-channel in 2023.
It's because the 1st gen i7's were on the HEDT platform.
X58 was followed up by X79, X99 and X299, which all supported quad-channel.
The consumer i7 870 was only dual-channel.
The HEDT platform died, with AMD dominating with Threadripper but then never releasing a Zen 3 version. Intel seems to be releasing a new one with Sapphire Rapids.
Edit: corrected x59 to x58.
Because AMD have proved dual channel done properly is faster
@@Synthematix That is not true. Dual-channel will never be faster than quad-channel.
@@selohcin this isnt quad channel its 3 channel, and all it does is add cpu latency, the latest amd cpus use dual channel.
@@Synthematix I'd like to ask how exactly AMD proved that dual channel is better than quad. I don't disagree with the latency claim, but the word "better" is very ambiguous.
If anything, their move from Zen 1 Threadripper's quad-channel to Zen 2's octa-channel might indicate that having more channels is better. But I assume it really only benefits use cases that require high bandwidth.
That being said, I'm fully aware most people do not need more than dual-channel, and the increased cost and potential latency hit are downsides for the majority of users.
Went from an i7 950 @ 4GHz to an X5675 @ 4.5GHz, really nice CPUs; retired the setup in 2021, so I'm pretty experienced with X58. 200-216 BCLK is the sweet spot; some boards can run the max turbo multi, some can't. Memory and QPI are very important: 3.4-3.8GHz uncore has quite big performance gains. The 45nm chips can go to about 1866-2133MHz on high-density kits, whilst 32nm can easily do 2200-2400MHz+ with reasonable IMC voltage (1.35V). As for boards, Gigabyte, Asus or EVGA are the best to get; avoid the rest. As always, good video!
Thank you!
The RX 6600 XT is likely being bandwidth-starved in some games, since you're only using PCIe 2.0 x8; that's the same bandwidth as a 6500 XT in a PCIe 3.0 system, and even that card suffers greatly from it in some instances.
Even a 3090 in a PCIe 3.0 system is not bandwidth-starved at x16. Gamers Nexus did tests on this a few years ago. None of the RX 6xxx or RTX 3xxx cards are bottlenecked on PCIe 3.0 x16.
@@LastExile1989 The RX 6600 (XT) only has 8 PCI-E lanes; the performance drop isn't noticeable on PCI-E 3.0, but with PCI-E 2.0 it's significant.
@@LastExile1989 The RX 6600 XT has only 8 PCIe lanes, and a first-gen i7 can only use PCIe 2.0, so the RX 6600 XT runs at PCIe 2.0 x8, equal to PCIe 3.0 x4, which makes quite a big bottleneck, even for an RX 6500 XT. So an RTX 3090 may not be bottlenecked by PCIe 3.0 x16, but that is totally irrelevant to what Dr. Fez is trying to say, since this system is effectively running at PCIe 3.0 x4 bandwidth.
The RX 6700 and RX 6700 XT will give you great fps with older systems; the 96MB of Infinity Cache makes up for having a slower PCIe 2.0 or 3.0 slot.
The fact that you can still use a 2009 CPU and have a decent time even at high settings with a new GPU is insane.
It's not insane; the marketing has brainwashed people into thinking they can't game unless they have an i9 and a 3090 😂
I know someone who runs this cpu as their daily driver to this day. She just really needs to get a sata ssd to truly enjoy the speed her pc has to offer.
I did in the late 2010s and it was A-OK with a GTX 970. I regretted popping the RX 580 into it in 2019, because it caused me to waste time and a motherboard just to troubleshoot, when it was fine every time with the GTX 970. I got the GTX 970 used on May 29, 2019, IIRC, and bought the RX 580 weeks later.
I use it as a backup PC with an HD 7950; it still runs any game I want so far, but it does seem to struggle more in some games than it should. I feel like that may be due to missing instruction sets.
I am still using a Core i7 920 on an EVGA X58 @ 4GHz (built in 2009) as my main office PC (it has a 980 Ti handed down from my 4790K system). I only really use it for word processing (MS Word) and browsing/youtube. The only game I have played on it in the last 5 years is a few matches of World of Tanks, and it works fine on a 2560x1080 (ultrawide) 75Hz monitor.
I'm still using an overclocked i7 920 system that I put together in 2009. It's had a few upgrades (RAM, graphics and an SSD) but I'm saving up for a new system as a few crashes have started to happen. Not bad for a system that has barely been switched off in 14 years.
Insane! Thanks for sharing. I built a new PC in 2022 with great specs, a 9900K and a 3060 Ti, for $1k.
While the I7 960 is impressive for its age, the LGA 1366 platform really shines when you put a Westmere 6 core 12 thread chip in. It's a more refined architecture (960 is 45nm, Westmere is 32) and as you noted, there is a ton of overclocking headroom left on the table. Getting one of those chips (X5690 is the best, though may be too expensive for what it is) to around 4.5GHz can yield similar performance to a Ryzen 5 1600. Triple channel memory also goes a long way toward helping the longevity of these processors.
X5650 processors sell for less than $10 on eBay.
Sorry, this is a bit of a necro-post, but watching this video again got me thinking...
The i7 900 Series came out in 2009. And the Phenom II x4 series also came out in 2009, with the first stepping early in the year, and the later C3 stepping coming out a month after the i7 960. I wonder how these would compare... 🤔
Awesome video, and an awesome motherboard you got your hands on, man! It looks like we like to test the same kinds of scenarios. The highest I was able to clock my i7 870 to was 3.8GHz; raising the voltage beyond something like ~1.34V caused the temperature on one of the cores to hit 100C and completely shut down the PC.
That triple channel is a nice feature and should help even more with the CPU performance, especially after you were able to push the QPI Frequency. Strangely, I found the performance to be a hit and miss between my OC'ed i7 870 and stock FX 8350, where the FX 8350 would have a higher frame-rate in Dying Light 1 but with a periodic stutter and the i7 870 would perform much better in Resident Evil Operation Raccoon City even at stock.
Personally, I found my Nvidia GTX 1060 to perform better in most DirectX 11 titles than my RX 480 from the CPU side, but my RX 480 would perform better in most DirectX 12 titles CPU side.
I have i7 990x cpu. It does pretty well in gaming.
Very decent performance RA, The Lack of AVX puts a damper on future titles though. I personally went to Sandy bridge 2600k from the Core 2 Q9650 back in the day. (What an upgrade! Also still running great) Thanks for the detailed review!
I went from a super OC'ed i3-540 to a 3770K. Kept that 1st gen i3 for far too long, but it wasn't until a bunch of new games I actually wanted to play were released that I felt like I had to upgrade. It was, though, as with your experience, a huge upgrade, just in the overall responsiveness of the machine.
The 2600K is also quite a good processor; I see no reason to upgrade. Just pair it with a good graphics card and enjoy.
Still have an X58 system: W3670 @ 4.7GHz, uncore @ 4GHz, 24GB DDR3 @ ~2200MHz, GTX 1080 Ti. It's a power hog; the CPU alone chugs like 250-280W at full load. Definitely showing its age, but it still performs similar or close to my Ryzen 2600 setup, just with like 2.5x-3x the power.
Really a very good platform for its age
Only? That's a lot.
Great performance for its age :)
I think the Radeon GPU is running slower in some games because of the PCI Express x8 link at 2.0 speeds.
That would be like PCIe 3.0 at 4x which is much too slow.
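A rough back-of-the-envelope check supports that comparison. The per-lane figures below are approximate effective rates after line encoding (8b/10b for Gen 1/2, 128b/130b for Gen 3), so this is just a sketch, not exact spec numbers:

```python
# Approximate usable bandwidth per PCIe lane, in MB/s, after encoding overhead
PER_LANE_MBPS = {1: 250, 2: 500, 3: 985}

def link_bandwidth(gen, lanes):
    """Approximate usable bandwidth of a PCIe link in MB/s."""
    return PER_LANE_MBPS[gen] * lanes

print(link_bandwidth(2, 8))   # 4000 MB/s - Gen 2 x8, the Radeon's effective link here
print(link_bandwidth(3, 4))   # 3940 MB/s - Gen 3 x4, roughly the same bandwidth
print(link_bandwidth(2, 16))  # 8000 MB/s - Gen 2 x16, the full slot
```

So a card limited to x8 on a Gen 2 slot really does sit at roughly PCIe 3.0 x4 bandwidth, which is where modern GPUs can start losing performance.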
So my cousin recently received a computer (a decade-old Alienware A51) from one of his mother's co-workers whose husband recently passed away.
It's got an i7 960 and a GTX 1060. I built a budget computer for them a couple years ago with an i3 6100 and a GTX 1050 Ti SC.
Obviously, the 1060 is a significant upgrade from the 1050. However, the new computer has superior RAM.
However, when it comes to the CPU, the differences are not as easy to see. Some comparison websites say the i7 960 is significantly better than the i3 6100, but others say the i3 is superior.
When trying to boot up and reset the Alienware computer, there was a pop and it shut off. It was stored outside (under a roofed patio, but still subject to dew/mist). I suspect the power supply shorted due to moisture.
Am I better transferring the 1060 graphics card to the i3 6100 computer that's already built and working? Or is the i7 960 superior enough to the i3 6100 to warrant tracking down the issues with an older system?
Ideally, I would gut the old Alienware case, and rebuild the internals with new gen parts, but my cousin is on a limited income.
Is the i7 960 worth the effort, or is the i3 6100 an equal match?
I should also note my cousin is technologically illiterate. OC'ing does not apply. He couldn't even figure how to plug in the power supply cord or plug in the HDMI cord from his PC to his TV.
Performance wise the i7 960 is better, especially when overclocked, but it looks like overclocking is out of question. I guess it makes more sense to stick with the i3 since it will feel snappier than the stock i7 960 when it comes to simple day to day tasks, and upgrade it to an i5 or an i7 some time in the future.
Don't see a lot of videos on this CPU so this was great. I still have my old rig stored in a closet which has an i7 960 so it's great to see what it can still do.
My file server still runs an i3 530. It's fine.
The only thing holding back this socket is its lack of AVX and AVX2 support.
The old-style BIOS, too. It looks like only some motherboard BIOSes on 1366 work properly with GPUs newer than the Radeon R9 and GeForce GTX 980 series, I suspect. The BIOS issue I experienced was discovered by accident, after installing an RX 580 in 2019 on my Asus P6T Deluxe. After that, I noticed that when my PC was unplugged and then plugged back in, the BIOS config was lost. Then I got an Asus P6T6 WS Revolution from eBay and had the same symptoms with the RX 580! Only after getting another motherboard and going back to my GeForce GTX 970 did I understand why the BIOS was acting so strangely!
@@RJARRRPCGP That's how I found it while upgrading an X58 PC with an RX 570: it worked in another motherboard, then failed again with an RX 550. I installed a GTX 1060 and a GTX 670 with no issues. I took the RX 570 and tested it in a loop in another PC, also no issues. I updated the machine's BIOS, and no matter what, it would either fail to display or later lose display. So that PC ended up with a GTX 1060 6GB, which was better anyway.
Just retired my i7 930 a couple weeks ago for a b550 5700x.
Massive improvement in frametimes and 1% lows, but minimal increase in avg fps with the same GTX 780. I guess the GPU is the bottleneck now.
Awesome upgrade! The 5700x is a very solid chip. 8 cores, power efficient, and runs cool. So far I have been enjoying mine
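For anyone curious what "1% lows" actually measure versus average fps, here's a minimal sketch. The frametime values are made up for illustration, and exact definitions vary slightly between capture tools:

```python
# Hypothetical frametime log in milliseconds (tools like CapFrameX or
# PresentMon export similar data): mostly smooth frames plus a few stutters.
frametimes_ms = [16.7] * 95 + [50.0] * 5

def avg_fps(ft):
    # Average FPS over the whole run: frames divided by total time
    return 1000 * len(ft) / sum(ft)

def one_percent_low_fps(ft, pct=0.01):
    # "1% low": average FPS computed over only the slowest pct of frames
    n = max(1, int(len(ft) * pct))
    worst = sorted(ft, reverse=True)[:n]
    return 1000 * len(worst) / sum(worst)

print(round(avg_fps(frametimes_ms), 1))              # 54.5 fps average
print(round(one_percent_low_fps(frametimes_ms), 1))  # 20.0 fps 1% low
```

That's why a CPU upgrade can transform the 1% lows (the stutter frames) while barely moving the average when the GPU sets the pace for most frames.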
Awesome B-roll at the beginning!
As for the first "Core i" generation, it is indeed impressive what they are capable of in today's games, but honestly the X58 platform is obsolete in my opinion, because it lacks even first-gen AVX, and when a CPU starts to completely fail to launch games it's time for an upgrade. A budget A320 mobo + 1600AF would not only outperform the X5650 but also launch all games, plus it's an excellent upgrade path.
Imo at least AVX1 support is a MUST for any CPU these days.
Also, X58 usually only has SATA II, MBR boot limited to 2TB boot drives, no UEFI, no GPT boot, can't boot from an NVMe SSD unless you do some custom HDD boot I/O address redirection, and it typically has only PCIe 2.0 and no USB 3.0.
Haven't had issues starting games, or running them at playable fps, with a stock i7 960, but it struggles more than it should in COD Cold War. I feel like it may be due to missing instruction sets.
X58 was such a powerhouse. I have that same board with an i7 980X; my youngest boy uses it nowadays, and it will still do basically anything he wants. It's paired with a 1070 Ti.
This gets an upvote just for its sheer audacity! It has no right to still be that good.
Even if the frame rates look good, you are getting stutter in games.
And in competitive games, animations might also trigger stutters, and you are giving away the high refresh rates and solid 1% lows needed most in the most intense combat.
I wouldn't mind the performance if it weren't for how much more CPU you need to play RTS games; this CPU will suffer a lot compared to modern CPUs. Even a Ryzen 5600X would triple the frames in specific titles and lift you out of the below-30fps range.
I'm happy with my 4790K to be honest and ain't far apart from that 960 regarding performance, cheap 32GB system with a cheap Asus Hero 7 for optimal OC and good onboard sound solution (comes with surround sound).
The Xeon X5675 is 41% faster and 17% lower cost than the i7-960 at stock settings. For me the Xeon is the clear winner of these two CPUs, and it also runs on the same X58 chipset.
The xeon also has 6c/12t vs 4c/8t
Passmark:
x5675=7754
i7-960=5491
Also, I have got my X5675 to 4.8GHz, pretty badazz for a $12 CPU. Give it a try.
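As a quick sanity check on the "41% faster" figure, it follows directly from the Passmark scores quoted above:

```python
# Passmark scores quoted in the comment above
xeon_score, i7_score = 7754, 5491

# Relative speedup of the X5675 over the i7-960
faster = xeon_score / i7_score - 1
print(f"X5675 is {faster:.0%} faster than the i7-960")  # prints "41%"
```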
I lived on an i3 540 until Q4 2020, when I replaced it with a new motherboard and an Athlon 200GE, which was already an uplift, and since Q1 2023 I am finally on an R5 3600. Although the CPU was weak, I could get by with it mostly okay-ish; I didn't notice any bottlenecks since my GPU was also on the lower-end side, but I always wanted this CPU instead of the i3 540, even if just for the higher core count.
My PC specs and clock speeds:
- GA-X58-USB3 rev. 1.1 motherboard
- CPU: i7 960 @ 3.2GHz (OC to 4.5GHz)
- be quiet! air cooler
- Ballistix DDR3 1600MHz RAM
- GTX 1660 Super Gaming X
- 850W power supply
Note: With a water cooler, you can clock the CPU to 5.0GHz. With a good air cooler, no more than 4.5GHz stable.
Cinebench R20 score: 1410
Temps:
- 40 to 45°C idle
- 70 to 80°C at max load in Cinebench
- 50 to 65°C while gaming
The 960 was too bad for me in 2021… it was alright at games but for Davinci Resolve it shat itself 14 times over before I could even export
I got an HP with the same CPU and paired it with a Zotac GTX 1050 Ti 4GB Mini low-profile. I play Battlefield 2, Battlefield 3, Half-Life, Half-Life 2, Serious Sam: The First Encounter and The Second Encounter, and Aliens: Colonial Marines. I think this system I got for $40 is the best I have ever had fun with, and it plays every game fine. I have 2 SSDs running in RAID 0.
So many past games, and they play smooth as silk at ultra settings or a high/ultra mix. 😊❤
For almost a decade (the 2010s), Intel progressed only slowly on the single-core speed of its processors. After 10 years (by 2020), the overall increase in single-core speed was about 120%, which is crazy slow... And then Apple Silicon came, and x86 is DEAD, and they don't know it.
excellent review and fun demo ..
A bit overkill on the cards, but interesting. 😊
❤❤❤❤❤❤❤❤
On X58 motherboards, some games can't be played because the CPU has no AVX support. I have an X5680 and a W3690 😢
TLOU
Starfield
Technology hasn't really changed that much in the past 16 years. We're still stuck on Blu-ray quality movies; hard drives have gotten mildly bigger and now aren't being made. We've got cheaper 4K TVs and slightly better cell phones, but not much more, as smartphones were already available by 2008. Windows 7 came out in 2009 and is still a great operating system, so hardly any improvements there.
The new generation is so vastly different for me, being born in the early 80s. I used to see massive improvements from the 90s to the 2000s, but that's pretty much where improvements ended. In fact, I'm not really seeing many advancements in technology at all since 2010. The Xbox One came out in 2013 but really isn't that much of an upgrade over the Xbox 360 with Kinect; most of these games could have been made to play on a 360 for sure. A lot of technology has actually been downgrading.
We've had some improvements in internet speed, but my friend just bought a brand new Dell laptop and the Wi-Fi uses a Wi-Fi 5 chip with only one antenna, so the speeds were only like 50 out of 500. How are you going to put a 10-year-old Wi-Fi chip in a brand new computer and then not even include an Ethernet port? It's a 40 dollar fix to upgrade to Wi-Fi 7 and install antennas from Amazon, but you really have to complain to Dell; they dropped the ball with that.
Seeing that a 2008 computer is still playing the latest games only slightly less well is quite aggravating, yet people still keep convincing themselves to spend all that money on computers, making Intel and AMD super rich.
The Radeon suffered from using 8 PCIe lanes while the Nvidia card used 16, so when you pair it with a Gen 2 x16 slot the Radeon really suffers.
What made this generation of CPU/motherboard combos attractive was SLI and CrossFire GPU setups, because in premise more lanes means more bandwidth and less bottlenecking.
An i7 975X or any overclocked 9xx CPU with 16GB of DDR3 would've been a well-built system that could serve from 2009 to 2019 with great performance, only needing a GPU upgrade every half a decade 😂
nVidia CPU overhead problem exists in RTX 30 series (and possibly 40 series as well), not in RTX 20 series.
Edit: Great video though!
The 2070 being faster in some titles is, I'm pretty sure, because of PCIe lanes.
The 2070 runs at x16, letting it deliver higher bandwidth than the 6600 XT, which only has x8 lanes.
Hey, anyone has a working Z68 mobo? I have a 2700K + 7970 combo to resurrect. :)
These old processors are awesome for sure. Keeping them ticking is an issue though.
My papa passed and left me his 960 system. I was very surprised how well it gamed with a 5500 XT thrown in with it. The chip runs VERY hot, however.
Amazing. How is it possible that such an old CPU can handle such demanding games?
Almost 15 years after its release date, and it runs pretty well.
Mostly because games are more GPU-bottlenecked than CPU-bottlenecked. Not that there isn't a need for CPU power (as he saw when he ran stock), but if you have to pick between CPU and GPU in your budget and gaming is your goal, then putting more into the GPU isn't a bad idea.
I'm still running a Phenom II 965 on my machine. That's a 2009 chip and it still runs things quite well.
How are the 970-990x 6c/12t these days? I have an old Dell workstation I pass around to friends when they need a rig to get gaming with and I'm wondering how it's doing with more recent releases.
This is still my PC specs to this date. Running an Nvidia 660 GPU. No gaming at all on that PC, just bill paying and web surfing.
I use an X3450 in a secondary system, and got Windows 11 on it using Rufus; it still works great. I'm grateful I can still put it to use.
I ran a W3680 at 4.82GHz, with 2400MHz CL10 triple-channel RAM and a 4000MHz uncore. It is a beast.
So, i am still running my i7 960, just about 15 years and still cranking.
More proof the GPU is the big cheese around here. The CPU matters, of course, but damn, this thing is still running games to this day lol. So technically, if you wanted to, you could play everything but the most demanding titles at 60fps with anything from the past 6-8 years, especially at 4K.
When I used a Xeon on this platform I had nothing but problems: had to do a BIOS update, couldn't overclock, boot problems, etc. Going back to a 960 solved all my problems.
Well, when you have that bad of a CPU bottleneck. I guess you should just crank every game to max settings
any used gpu with pcie x16 wouldve done a lot better, since the rx6600 is only pcie x8
This was the one I chose out of the store to run my life and believe me it was super messy to get to the point where the system was at the cable modem because right when I was having my second backup system installed it immediately got downgraded to i5 with emergency services via voter ballot and of course I have tried the i3 the same way via homelessness but we found out a short while ago the system is an authentic i5 I mean the people insisted. This one is at a great level to be in the future because it's just a little processor so it's lightweight on people power. Allot of the times people know about getting bumped so well during transitions in the day from earlier models that they forget this is an adult thing so where people with children's systems prefer creating their own work schedule this thing has you up on top of the internet during the day then at night it shoots out what it saved and continues streaming so where they do it every time then yours is only during the day so it's kind of like a transition product!
Bet you GTA 5 will not drop frames or stutter with the 4-core chip.
How does the CPU not hit 100% usage with only 4 cores?
Results may be different with nvidia gpu
The old i7's never lie ❤
I know it's unlikely that you will see this, but the Gamers Nexus video on "GPU Busy" with that software would be really interesting for this kind of content, to see how the bottleneck is happening.
Assassin's Creed Odyssey will run, don't worry about it, but Legend of Zelda doesn't support this CPU at all.
Nice efforts! 1366 is still a beast even without modern instructions. Now you need to get a 6-core W3690 and compare.
W or X or even the i7s, whatever is cheapest. My X5650 would do 4.5GHz, and desyncing the QPI from the RAM actually brought a performance improvement.
I was running 1600 CL8-8-8-22 triple-channel with 6 ranks per channel, and a QPI that, if synced, would correspond to 2400+MHz RAM. Don't remember the exact number.
Man, what I really don't get is that you folks NEVER talk about the power consumption that comes with these chips, especially nowadays. This CPU has a 130W TDP... that is crazy. 130W TDP for a CPU! There is a reason people don't build these anymore: they need too much power.
I literally talk about power consumption at 10:25.
You can play Valhalla but not Odyssey; weird that they did that.
Does it have the AVX instruction set? Mine is a Xeon X3470 and it can't play newer titles. I'm thinking of upgrades, but nothing too expensive price-wise.
9:54
It's good that you at least know the fps, because with that coloring and font I can't read those numbers.
It should have the vulnerabilities mitigated by default. The vulnerabilities were used by the CIA, and later leaked to the Russian/Chinese and Iranian/North Korean militaries.
Oh no! How will you keep your launch codes safe now?
Great video. My father and I would have been pumped to have one of these back in the day. It's good to see someone still running a few tests on them. Thank you.
suggestion, upload in higher bitrates
DDR3 is what makes it slow; if it had DDR4 it'd be way faster.
Damn the editing for the start was legendary
Thank you!
I've just very recently upgraded from the i7 990x to the i9 13900k
Huge upgrade!
If you care about stutter you'll definitely want a full upgrade and not just an OC, but those are huge overclock gains, which we never see anymore.
Today with frequency scaling hardware overclocks itself. But there'll always be dipsticks that think they can push things further than they should.
This shows how well the GPU can handle a game even with limited CPU power.
The 6600 XT is the much faster GPU, but even the 2070 isn't being utilized to its full potential here; there must be some kind of bottleneck on the 2070... memory bandwidth? I have the luxury of having a few semi-modern and modern CPUs at my disposal, and guess what: my main rig is a stock i5-9400F. I mainly play older (5+ years old) titles, and my main bottleneck is my GPUs (GTX 1080 and RX 6600). I could easily swap up to a 9600K, 11400 or even 12700KF, but what's the point? I'm at 85% CPU on the 9400F, so I would be at like 40% on a 12700KF.
I bought a new system just to play Starcraft II and Wolfenstein New Order - way back when. Didn't have a lot of money, was at University at the time. So after reading hundreds of reviews I got a Gigabyte UD motherboard with an i3-540. Overclocked it easily to 3.93Ghz on air, added 4GB DDR3 1600 and an HD6870 and everything was sweet. A few years later, my friend upgraded his machine and I inherited his Sabretooth board and his i7-960, with a huge Corsair HS+FAN. That CPU ran all day long at 4Ghz and was amazingly fast! I used it for half a decade, through multiple GPU generations, and I still have the board + CPU today in the storage.
I'm still rocking a i5-760 @3.9GHz for my HTPC 😅 paired with a GTX 1050 2GB it feels just as fast as my laptop with a i5-1035G4 (full 35 watt limit)
My best gpu in descending order are rx 6650xt, rtx 2070, gtx 1660/gtx 1070, gtx 980, gtx 970, rx 480, r9 290, gtx 770.
I figure I've got enough gpu to run panel tests on my cpu if I wanted to make videos for Middle aged dads who want to relate to their kids through computers.
Please do include video editing as well in older cpu review.
I have the Intel DX58SO2 with the same cpu paired with a high performance Intel heatsink/fan. Really the only time I use this setup is for GPU testing, never tried any over clocking, but may try in the future. Great video! well produced.
i was just recently running an i7 870 (may-nov 18) and it worked way better than i expected
I'm using an AMD A10-4600M.
We have the same motherboard. The Rampage III Formula is a tank. Native SATA III support really helps with SSD read/write speeds. I'm running a Strix R9 380 OC in it.
Mine came bundled with a 990X CPU, tho. With liquid cooling I was pushing 5GHz and keeping up with Intel i7 CPUs up to 7th Gen and AMD Ryzen Gen 1, with the only limitation being instruction sets (or lack thereof). I've since switched it back to air cooling and lower clocks since it's now running my media server. 10 years (and thousands of run hours) in my ownership.
I've since upgraded to a Z390/9700K system with an Arc A770 as my main gaming PC.
Nvidia has influenced game dev for decades to bet more on the GPU than the CPU, so the CPU-dependent parts of game engines have been stagnating for years now. No wonder older CPUs are still valid compared to same-year GPUs. Even my FX 8300 is still capable of running most modern games, which is ridiculous.
The 6-core Xeons work quite well on these older setups and can deliver some decent gaming performance comparable to at least 2nd Gen Ryzen 🥰🤩🤯
Tell me, please, what did you do to make dying light 2 run on this CPU?
3:50
this is the cpu i am stuck with using rn
ah yes, the first core i cpus. i was just starting middle school when these came out, i remember other kids talking about how insane the performance was on these machines. those motherboards look very utilitarian and complex and are my inspiration for building new systems to this day. its not easy getting anything to look quite as brutal as those motherboards did but i try my best. the case helps a lot.
that intro was so sick. great video as always. do you think this CPU is better than fx 8350?
Only one way to find out!
Any i7 is better than a FX-8350.
@@jamezxh It really depends, the quad core i7s from the Sandy/Ivy Bridge are either slightly better or perform similarly to the FX 8350 in CPU demanding titles, that if the CPUs are at stock. (i7's from the 1st till 9th gen didn't had an agressive turbo boost so there was quite a considerable OC headroom)
@@ismaelsoto9507 you mean I-5’s . My old 3770 destroys the 8350, the single core performance is woefully slow on the 8350. And overclocking does little to improve it
I still run an x5650 at 4.3ghz as my main pc. It does me fine for my gaming needs even today but the extra 2 cores and 4 threads over the i7s really helps.
super strong processor indeed...
an i7 970 or better is something to look at. 6 cores and 12 threads for cheap!
I bought an X5670 that's collecting dust, haven't had the time to test it. Will soon though!
AMAZING intro, loved it! very nice video and impressive results.
Thank you!
It is still time for an upgrade
12:01 strange
Good video.
I was running a Xeon X5680 in a X58 Sabertooth at 4.3Ghz. It died after a few months and I replaced it and OC's it to 4.1. Was my daily driver for a few years.
Let's put you on a treadmill and make you sprint and see how long before you die. My money's on you lasting less than a few months. Now where'd I put my bullwhip?
LGA 1366 was one of my first major computer builds that I did all myself. I was dabbling in the world of Xeons. I had to part ways from that whole scene due to the lack of instruction sets for VR at the time. I've been consumer CPU since.
I was rocking a 920 engineering sample until 2016, then I jumped to a xeon w3680 on the same board since those came down in price, and that's 6 cores instead of 4, definitely worth it.
I have noticed one issue with it: it really struggles in some games where it shouldn't, and I think it may be lacking instruction sets or something. Otherwise I dunno what's up, really.
It's kinda weird to see people using these 1st-gen chips again, since I didn't really know anyone who used them back in the day. It's neat to see it still running so many triple-A games, but I think the lack of AVX instructions has left it as a historical curiosity, now that Sandy Bridge chips are cheap as chips and those can run more games.
I'll definitely be checking out Sandy Bridge in the near future as well!
Also PCIe 3.0, Sata III, 1Gb NIC, and often USB 3.0 were on Z68 as well I think.
a n i m e
n
i
m
e
Ok i use an intel i7 960😅
im still using i7 3770 in 2023
❤🎉
You can go even lower spec-wise. I've used an i7 860 before, and these little Lynnfield monsters are still pretty good.