Seems the 4C/8T CPUs are still relevant. You should add the i3-12100F to the lot as a comparison. It actually compares very favorably to the 6 core CPUs
Yeah, this entire testing is flawed, if it's just about the core count then he should've used CPUs from the same generation or at least set the same clocks. Of course newer CPUs are going to perform better. Even newer CPUs with the same core count is going to perform better.
@@flushfire Not really flawed though; he specifically mentions how those CPUs perform, and since it comes down to each game, the results will change a lot, especially since Coffee Lake is still LGA1151.
If these CPUs perform well, then it's a given that a current quad core like the 12100F will perform better, when it even beats the 8700K in MT.
I have that one!
@@flushfire I would rather say incomplete at most.
12100F would murder some 6c cpu
For CPU productivity benchmarks, I'd recommend compiling code (Chromium is a popular choice, but I wouldn't mind seeing something new), compressing/decompressing files, calculating pi (to a specific digit; see the sketch after this thread), etc. I mean, what you tested with also works; what you're benchmarking just needs to be consistent so results are comparable, and you've done that.
Isn't a high amount of RAM the most important thing for compiling? CPUs are quite powerful; in I/O operations, memory speed is the bottleneck (most of the time).
@@ristekostadinov2820 RAM is important for linkage. Compilation is mostly CPU-bound, although a faster SSD helps.
Using large equations in Word back to back is actually a pretty good single-core test. It tends to take about a minute on a 7th-gen quad core.
IIRC GamersNexus said that the Chromium compile ended up scaling 1:1 with cache size, making it a pretty pointless benchmark.
I second this
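If anyone wants to try the pi suggestion at home, here's a minimal single-core sketch in Python; the digit count (10,000) and Machin's formula are just my assumptions for a deterministic, repeatable workload:

import time

def pi_digits(n):
    # Return pi * 10**n as an integer, via Machin's formula:
    # pi = 16*arctan(1/5) - 4*arctan(1/239)
    scale = 10 ** (n + 10)  # 10 guard digits to absorb rounding error

    def arctan_inv(x):
        # arctan(1/x) = 1/x - 1/(3x^3) + 1/(5x^5) - ...
        total = term = scale // x
        x2, k, sign = x * x, 3, -1
        while term:
            term //= x2
            total += sign * (term // k)
            k += 2
            sign = -sign
        return total

    return (16 * arctan_inv(5) - 4 * arctan_inv(239)) // 10 ** 10

start = time.perf_counter()
pi_digits(10_000)
print(f"10,000 digits in {time.perf_counter() - start:.2f}s")

Since it's all big-integer arithmetic on one thread, the same run on every CPU gives a clean single-core comparison.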
I kinda added the notes about the RX 6600's bandwidth limitations at the last minute because it didn't occur to me earlier, but it's something I might have to dig further into in the future. It was being tested on the highest performing CPU of the four, and it lost performance in CS2 and Fortnite - two games not exactly known for being perfectly optimised, so it's possible that I just got unlucky with those benchmarks.
Oh, and FWIW:
Cinebench R23 (single-core / multi-core):
i7 6700K 1130 / 6042
R3 3100 1202 / 6792
i7 8700K 1318 / 9448
R5 5600X 1552 / 11139
Time Spy CPU score:
R3 3100 5366
i7 6700K 5765
R5 5600X 8482
i7 8700K 8587
More things to look forward to.
Great content overall
What I really need to know: is PCIe Gen 3 absolute dogwater now?
@@prixivus85 Gen 3? No, it's fine. I wouldn't buy an x4 card for a Gen 3 board, but x16 cards should be fine on Gen 3.
But you have a 6900XT.
Would be awesome to see if those old CPUs would be fine with that behemoth.
Btw, love the content. Also, what are you doing wrong? Everyone else is getting about 30% better performance out of both of those Ryzen chips. It's weirdly stuck at 30% more frame rate with both the 3100 and 6100 on almost identical hardware. Kinda makes me wonder if you accidentally hobbled the RAM in single-channel or something.
Right in the feels; my i7 3770 is still in my main gaming rig, 4 cores 8 threads.
The 8 threads are helping you the most.
Oh buddy, I'm ashamed to admit I still have an i5 2500. In 2018 I had the great idea that I would upgrade my PC when CP 2077 came out, as that was the year CDPR announced it, but a mix of COVID and crypto crap scrapped my plan. After a BIOS update my mobo can run an i7 3770K, and I have considered doing that, but the purchase price is just not worth it.
i7 3770K + 16GB 1600MHz RAM + paste + alcohol = $140, and that CPU gets 6,463 points... meanwhile
a Ryzen 5 4500 + 16GB DDR4 3200MHz + mobo = $200, and that CPU gets 16,221 points and has 6 cores and 12 threads, so that is a very cheap upgrade. The mobo also supports the Ryzen 7 7800X3D and has 4 RAM slots, so in the future I can make an easy upgrade. So, shame to say, the 1155 platform is Jurassic.
still using a i7 3770K with a GTX 1070 Ti and 32GB of RAM.
I'm not upgrading until Windows 10 loses all support lol
@@88meatwad
Only thing halfway modern in that system is the 32GB RAM lmao.
i7 3770k, 16gb, 1660 super at 1440p and still enjoying life.
It’s great to see that quad cores, while a little long in the tooth, may still be usable for a bit longer. I also liked the comparison with the 6 core chips. Great video good sir!
I ran a 6700k / 1070 for nearly 7 years, 2016-2023. I was running nearly everything at max settings, 100ish FPS for the first 3 or 4 years. You don't need to upgrade regularly if, when you do upgrade, you future-proof.
I finally went for it this May, got a 13700k and a used 6800xt. Very happy with what I got for my dollar
Brings back good memories. My first CPU was a 7700k paired with a 1070. Plenty of good years with that duo
I had a 6700K + GTX 1080. Finally upgraded this year to a 7950X3D + RTX 4090!!!
@@nossy232323 Very big upgrade indeed. Congrats. I went for a bit of a tamer path lol. got a 4080 and my buddy sold me his 12700k for 150. Very happy with my setup
I went from a Pentium 4 with a GeForce 6800 I built in 2004 to an i7-6700K with a GTX 970 in 2015. What an upgrade that was lol. Still have that P4 nearly twenty years later, and the 6700K continues to pull its weight at work.
@@_n8thagr8_63 I can imagine! That's still a pretty solid upgrade. I'm using my system now with that Asus 1440P 240 refresh OLED. That way I can keep my setup longer by not going 4K and in the meanwhile I have the pleasure of a high refresh rate low lag screen.
@_n8thagr8_63 awesome setup! I use the same GPU paired to a 7800x3d at 1440p.
Yeah, the 4080 is not an average card at all. If it wasn't for the 4090, I think we'd all be pretty impressed with 25%+ over 3090 Ti performance for 3080 Ti pricing.
Obviously the 12700k is more than enough to enjoy the 4080 too
*i3 13100 really should be reviewed to add to this quad core discussion as i know that cpu competes with the Ryzen 5 5600/5600x in many titles*
In games that don’t scale with core/thread counts, it does indeed compete quite well and serves as a solid budget gaming chip. Naturally it’s the fastest quad-core ever made (if you don’t count taking a 13900K/KS & disabling all cores except for 4 P cores 😅), but the age of the quad-core (even with HT/MT) has been coming to an end for a while now; many games are now beginning to be optimized primarily for 8c/16t due to the custom Zen 2 8-core CPUs found in current gen consoles. That being said, 6c/12t chips like the 5600/X or 12400/F (or even 7500F & 7600/X) can all still provide a strong balance between gaming performance & productivity capability. I personally own a 5600 running a 4.8Ghz OC, and it’s been a fantastic chip for just over $100. Pairs very well with my 3060ti (also OC’ed to around 2175Mhz on the core clock) & 32GB (16GB x 2) 3200Mhz CL14. However due to the ever-increasing demands for modern games (looking at you, Starfield), I’m looking into getting a 12700K/12900K & Z790 depending on local prices a few months from now
@@couriersix2443 I'm waiting till the next gen of chips before even moving to newer CPUs. Same with GPUs, because I'll get a boost in performance but not enough. Albeit the i3 13100 would be cool to see in comparison to older 6-cores like the i7 8700K and i7 5820K, as well as the Ryzen 5 3600/2600X, so I'm hoping we see that on Iceberg's channel.
@@EBMproductions1 yeah I get that, it seems both AMD and Intel have “somewhat” hit an IPC wall with current gen; for Intel the IPC gains from 12th to 14th gen are less than 20% unless you push very very high clocks (such as over 6Ghz) or pair them with very fast DDR5 (7000Mhz or higher, with low CAS latency). Same with Zen 4 vs Zen 3, but anything older than Zen 3 (such as Zen+ or Zen 2) sees considerable improvement by comparison. In my case I was running a 2600 at 4.1Ghz all-core and even when I first upgraded to the 5600 running it at stock speeds, it was a pretty big leap overall. Tbh I really only want a 12700K or better for the extra cores; I’ve also been interested in tinkering with E cores since 12th gen was released. I think it was a smart move on Intel’s part to introduce the E cores since they offload system processes from the P cores, so they can be completely dedicated to intensive workloads (just wish they’d finally make an i3 for desktop that has some, like maybe a 4P + 4E config or something). But having at least 8 P cores will yield better gaming performance in games like Cyberpunk where I’m currently seeing 100% utilization out of my OC’ed 5600 at 1080p/high native res (or 1440p/high using DLSS quality)
Too expensive if you compare it to something like a 5600. Plus you'd need a new mobo; AM4 is way cheaper.
In the same way a 5600x keeps up with a 5900x in most games
There are a metric TON of games that run just fine on a quad core, even newer games and a lot of esports titles. But yeah... expecting them to play most modern AAA titles smoothly is a bit much to expect in 2023, but the 6700K did a lot better than I expected it to. Crazy though, I was about to say something when you said "on the same socket." I had no idea there was a mod to run Coffee Lake on Z2xx chipsets. That's pretty awesome.
It's not just Z270, Coffee Time runs on Z170 as well. You can slap a 9900k in a Z170 with good VRMs.
Depends on the AAA title; many free ones I tried from Epic work ridiculously well on AM4 5000 series architecture anyway.
There are hardly any '''''AAA'''''' slop games that are worth upgrading an entire platform over. If you're fine on a 6700k, don't upgrade your PC over a shitty game like Cyberpunk 2077 with virtually no replay value. It's only these AAA games that are requiring this much, just like how back then when AM3+ was all AMD had, many developers had used intel compilers and many games need just a strong single core which intel always delivered on vs AMD's MOAR CORES.
Ran a 4790k until recently. Now it's in my brother's rig. Still powerful enough for any game he plays at 1080p
Running a 4790 non-K right now with an RX 5700 XT on a 1080p monitor and I don't really feel like I "need" an upgrade...
The CPU might be holding back the GPU a little bit, but the monitor can't do more than 60fps anyways and I don't play competitive online multiplayer games infested with tryhards and cheaters... so I'd rather bump up the graphics settings to keep the GPU from getting bored ;)
The 4-core, 8-thread CPUs seem to hold up OK. I think anything from a Haswell i7 or Zen+ up will get the job done (minus heavy ray tracing). Thanks for the excellent comparison, Iceberg!
Wait until you see how the traditional 4C/8T 13th Gen Core i3 can still push an RTX 4090 to 99% GPU usage in 1080p.
@@niezzayt3809 It's impossible for the majority of games; you would need an i9-13900K.
The 5700 XT is a gem that is severely overlooked on the used market. It's close to 1080 Ti performance and can be had for just over $100, and if AMD gets their driver-based fluid motion figured out, that'll inject a lot more life into a very budget system.
I've always felt that the 1080ti was worth the extra 10-15% in cost just to have an Nvidia card and the better performance, not to mention the vram ;)
Ehhh, it craps itself in the latest titles like Alan Wake 2, not even being able to maintain 30 fps at 1080p LOW (which is just embarrassing to be honest).
@@HenrySomeone Because of mesh shaders, the lowly RX 6500 can handle Alan Wake 2 better than the GTX 1080 Ti.
"Just to have an Nvidia card" Is literally one of the most stupid brainwashed statements one could ever make. What the fuck does having "Nvidia" in front of the name of your card do for you? The VRAM Is almost never going to be utilized anyway with the power those cards have anyway, but that's the only argument that exists for the 1080ti vs the 5700XT, and it's more like 50-100% increase in cost because you can get the 5700XT for around $100 and the 1080ti for around $200, and the cheapest single blower model for $175 and it was like this at the time of your comment too so you make no sense. @@silver1407
the card is like ten years old. Lucky it can even still keep up today.@@HenrySomeone
Just ordered a little mini PC with 16 GB of RAM, a 500 GB NVMe SSD, and Win11 Pro... for $159 total. It feels pretty snappy with Win11, despite its 4c/4t Celeron N5105 CPU topping out at 2.8 GHz or so. I have it connected to a large TV in the living room, so its duties are surfing, watching streaming movies, etc., and I am quite satisfied with it.
Amazon for $169 right? Tried to get my hands on one before they sold out. They looked like the perfect HTPC start.
@@Trick-Framed Got mine for $159 the last week or so in Sept; it arrived on my birthday a few days back. Completely satisfied with it!
Pair it with Nvidia... Nvidia driver overhead on an old processor.
My i5-6600 is still serving me well after upgrading from an i3-6100,
although both CPUs are already more than enough to drive my GT 1030 (GDDR5) hahaha.
But yeah, both CPUs are really showing their age. The i5 still has some horsepower left for maybe a friend-group game server setup and stuff. Who knows :))
Same same, dude. Upgrading, it seems, will be happening sooner rather than later.
Disabling the Spectre / Meltdown mitigations would give you a bit of extra performance on the old intel CPUs. As would replacing the toothpaste under the IHS with liquid metal.
Would love to see a video on the Coffee Lake BIOS mod and how the old Z170 boards handle a 9900K, for example.
Bingo! A delidded and fully OCed 6700K on one of the later Z270 boards that can handle some serious-speed DDR4 still competes neck and neck with the likes of the 5600X in all the titles that aren't very thread-heavy (in those it simply starts lacking in multi-thread performance, obviously).
How can you disable the mitigations?
Is it worth it when the 5600x can be found for under $150 and comes with a cooler? AM4 boards have never been cheaper. @@HenrySomeone
Great content, realistic benchmarks, very very good narrative and awesome backing music tracks!! Dude, I can honestly say that your content is light-years better than a lot of "big" YouTube names! Wish you good luck and, most importantly, have fun with what you're doing, cause we can clearly see that you love what you're doing 🙂👍
Fun, thorough video. Can't wait for more.
Still have an old quad-core Skylake here. I'm impressed, given the range of quad cores, at how well they ran.
Great job on the video. If possible, would you be interested in making a follow-up performance comparison of 4-core 8-thread CPUs vs 4-core 4-thread CPUs? I'd like to see how much the extra threads help the quad-core CPUs to extract all their juice.
Again, it depends on the program's load, as it can only take advantage of the extra threads if the tasks are light enough not to use the core time entirely anyway.
So it will be the usual suspects in gaming: lots of shorter processor commands get the highest advantage.
Huge amount of information packed into a, well.. relatively, very short video. Great stuff!
Hey, I was talking as fast as I could, gimme a break 😜
@@IcebergTech but that's what I'm here for!
Heck, I still game on a 4C/8T Xeon on a Z97 platform. For gaming, I rarely notice a difference between that and my main rig, an i9-9900K. Big advantage, as it is a low power Xeon that idles on nothing.
I like that you put the full stats for MSI Afterburner in the top left and show utilised RAM, not just allocated, plus the other information in detail. This helps a great deal, ty.
I've been doing that for a while, but I never labelled it before, so I think some people were confused as to why RAM allocation appeared twice!
As a productivity benchmark you could test file compression since it uses the CPU heavily
File compression benchmarks seem like a more useful and realistic productivity focused benchmark for CPUs.
time xz --threads=8 some-random-disk.img
another compression algorithm could also be used instead of xz ... gzip or whatever - I just use xz to shrink HDD images of my old computers before I put them on the NAS as backup
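If you'd rather script the same idea than shell out, here's a rough Python sketch using the standard-library lzma module (the filename is a placeholder; note lzma here runs single-threaded, unlike xz --threads, so treat it as a single-core benchmark):

import lzma
import time

SRC = "some-random-disk.img"  # placeholder: any large, fixed test file

data = open(SRC, "rb").read()  # read once up front so disk speed doesn't skew the timing
start = time.perf_counter()
compressed = lzma.compress(data, preset=6)  # preset 6 is xz's default level
elapsed = time.perf_counter() - start
print(f"{len(data)} -> {len(compressed)} bytes in {elapsed:.1f}s")

Keeping the same input file and preset across machines is what makes the numbers comparable.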
even though i already got some nice parts, i still love watching budget pc build channels such as yourself
I run an i7-6700 (non-K) with an RTX 2070 just fine. I play some CPU heavy games like Kerbal Space Program 1 with no problem. I tend to play older games like SWBF2 2017 (still rather demanding) and ultra modded Skyrim. For having a monitor capped at 60fps, everything plays just fine and buttery smooth. I have no reason to upgrade the CPU quite yet. I do intend to get into VR, so I'll probably upgrade to something like a i5-12600K first.
I have a 4 core i7 7820hq laptop, the thing runs beautifully with everything i put this laptop through.
Video production is damn good. I'm actually also building a budget PC with a Ryzen 3 3100, so this came at a perfect time, thank you!
One of the most useful videos out there for second hand PC parts that can be bought online
I replaced my 7700k last January or so. No regrets. It was a hot chip that I had to delid to even make work somehow, but it still lasted me over 5 years. Once I replaced my 3060ti with a 4080 I knew I had to upgrade the CPU too.
is it that hot though? it's rated at 91W which isn't a lot
@@hsko8007 But the thermal paste under the IHS was of poor quality.
My chip was pretty good; I could get 5.3GHz across all cores @1.35V on air (NH-D15).
You can find a lot of articles about hot Kaby Lake chips; they were notorious for high thermals even under the best of coolers because of the thermal compound under the IHS.
Before delidding it was 85-90C at 4.5GHz, which was the boost clock out of the box. I even had a mild undervolt of -15mV which was still stable. (This was with a 240mm AIO.)
@@hsko8007 If you overclock it, yes. A Delid and a 5.2ghz OC will raise the wattage used significantly.
7700k+3060ti must've been a cpu bottleneck, too.
@@senyaiv yes on some games I suppose.
I was running it at 1440p, so it was mostly at 99% GPU load; no big issues there.
My cousin went from an 1800X to a 5800X3D on the same motherboard. What an amazing upgrade on a single platform. Imagine if Intel could do the same thing. Thank God for AMD.
They did, long ago, with the 775 socket: one could go from a single-core Celeron at 1.6GHz to a 4-core Core 2 Extreme with 8GB of RAM in 2008.
AM4 is a bit of an outlier. Hopefully AM5 follows this but who knows if it will be quite as long-lived.
I have the 7800x3d now and gave my AM4 board and 5600x to my wife and built a rig for her.
I nearly went 5800x3d, but actually wanted to be able to upgrade later as well as build my Wife a rig.
But, back in the day I wasn't so lucky. I had the FM2 platform with a 5600k APU at the time, and was never really offered a good upgrade path. FM2 was dead in the water before it even made a splash.
And AM3 had barely anything worthy of mention outside of budget options
@@white_mage It's important to note in regards to LGA 775 that while it did have an unusually long life for an Intel socket, there's two definitive eras - the latter half of the Pentium 4s, and the Core 2s. Most often, a motherboard that can run one won't be able to run the other.
@@AlistairWolfe993 i see. well, i was an am2 user at the time ¯\_(ツ)_/¯
@@white_mage I know, I have the Q6600 Core 2 Quad; still not as big a performance jump as AMD's 1000 and 2000 series to the 5000 series.
The 12100F handles a 3060 Ti without a noticeable bottleneck. That's a glorious combo that for $600 will run every AAA game this generation, as it's better than the PS5 SoC.
It's better unless the optimization for the game is trash.
I find it admirable that you admit to watching HBU but still went with an AMD card for yourself considering how firmly on team Green those two are
I have 4 quad-core systems, from a Q6600 up to a 3200G. Yes, I bought a 6-core 12-thread 5000 series AM4 chip and slapped it in a B550 board with 32GB of 3600, but only as I was in California.
The generational jump was probably more effective than any increase in cores.
That 100/200 series mod is only gonna work well on high-end boards. You have to remember that those boards were only designed with 4C/8T chips in mind, so only overkill-VRM boards have the headroom to work with them.
Yep. I have an Asus Z170 Deluxe that has the VRMs to handle a 9900k. But few of the boards from these gens did.
Watching this on an 8 core 16 thread (45w) laptop, I see this as an absolute win.
That's not many Watts per thread..
What's that clock speed? 2GHz 😏
@@peterpan408 3.2 ghz base, boosts up to 4.4 ghz, zen 3 (and later) efficiency is quite remarkable
With a 4 core today you would still want an AMD card as they have less driver overhead than all nvidia cards. Sure you don't get DLSS but if you want to max out your CPU for a few more years AMD is the way to go.
You need to buy an extra core or two to run NVidia..
When will NVidia implement a hardware scheduler?!
fsr 3 > dlss
Went from a 1050ti to an RX 5700 (bought used for less than 200USD). The AMD card is surprisingly cool and silent. Even cooler than the 1050ti.
@@bansheezs Fake frame tech is useless no matter if it is fsr3 or dlss3.
I still rock a 4790K; while it's not the best, it does its job paired with a 1660.
Saying you need a powerful computer to run Starfield is like saying you need a powerful truck to drive around because your city made the roads out of mud
Remember that the Steam Deck is a quad core
I'd love to see a deeper dive on quad cores, I ought to check out your videos to see if you've followed up
Of course Intel would STILL be manufacturing and selling 4C8T CPUs in 2023 on their latest platform. Can't kick an old habit LOL. Great video sir.
Stunning job man! I’d love to see the 5300g
Still don't know why your channel doesn't blow up. Technically deep, excellent research, tinkering and presentation.
I'm still rocking an i7-4790K and a GTX 1080 Ti. The CPU may be old, but it's still rocking it and will play anything I throw at it.
I run an OptiPlex with an i5 6600, 8GB of RAM and 500GB of storage. It replaced my Nvidia Shield TV box and I've never been happier. The thing I didn't like with my Shield was having to side-load everything, Chrome for example. I felt a search engine better than the one they offered should have been standard out of the box. I loved my Shield, but the proprietary elements really turned me off over time. So I have an additional 16GB of RAM coming, $8 on eBay, and paired with my wireless controller the new OptiPlex will be ready for a little gaming. The i5 6600 is perfect in this specific use case. Final price: $35 for the OptiPlex, $8 for 16GB of RAM, $15 for the wireless controller. Sorry, Nvidia Shield.
What's the GPU?
@@TheBcoolGuy No GPU, just onboard graphics and an optimized CPU. Intel's optimization software kicks ass. This i5 6600 is doing the job just fine by itself; it runs at a steady 3.8GHz and doesn't blink an eye at 1440p. It just works. I totally got lucky and I really didn't expect it to work as well as it does, but it does. I'm watching a Danny Elfman concert right now on YouTube in 1440p and it looks and sounds amazing.
It would be interesting to also compare the i3-13100 and, soon, the i3-14100.
The 12100/F is the better option right now, since you can get a cheaper B760M DDR5 board with a BCLK generator now; the Alder Lake refresh patched out the trick used for locked-SKU overclocking. For the US it's a very compelling option right now. The B760M PG Riptide is $120-130 usually and has the BCLK generator, and you can get 32GB of JEDEC Hynix for about $73 that can be OCed to 6800-7200 CL34. Locked ADL SKUs seem to overclock to around 5-5.3GHz all-core usually.
Great vid. Becoming one of my fave channels.
Had the Ryzen 3 3100 paired with a GTX 1660 SUPER and it did pretty well (needing some changes in settings in certain games). It sure is a good staple CPU when you're in the process of saving up to upgrade, if you find it for cheap.
0:44 Mainstream six core? If anything, I think a quad is pretty much the bare minimum now. Hell, I own a 12-core and I think that's getting a little too little; I need to get a 16-core soon.
I've never even heard of a six-core CPU, I didn't know that was a thing; I thought they came only in multiples of four after dual core.
Still running a 7700k with 6700XT. Runs well enough for 1440p.
shoot I'm running an 8400 with a RX 6600 and I'm doing just fine with 3440 x 1440 at 60 HZ. I mostly play single player games so it isn't a problem.
loving these vids iceberg.
Great video and really relevant to me too! I got a decent deal on a laptop a year ago on Black Friday; it has a quad-core i5-10300H and an RTX 3060 mobile. The CPU is somewhat close, being locked at 4.3 GHz. Keep up the great content!
My favorite build right now to game on is an i5-6500 paired with an A2000 6gb in an SFF case. I have a 7800X3D and 4080, for reference.
Other fun builds: i5-8500 + 4060LP, i5-11400 + RX6400 (weird one, mostly for avx512 emulators + hard focus on linux gaming)
I was using a 3100 until the start of this year on a B350 board, and I only upgraded due to a GPU upgrade, so it made sense at the time to get the most from my GPU. But honestly, for most use cases it'll surprise people, and I think if you can get your hands on a used 3300X you're in even better shape.
The 3100 is my very first processor. Bought in 2020 and it's still going strong. Considering these were AMD's unicorn chips, I'll hold on to it until the day it dies.
For CPU benchmarks you can include Cinebench R23 and Cinebench 2024, HandBrake, and compression/decompression tests using WinRAR.
very glad you added productivity
13:32 Yeah, you're right. If you've got any supported, relatively modern GPU then you'd go for GPU rendering rather than CPU rendering. However, CPU rendering has one advantage that GPU doesn't (at least, not yet), and that is being able to use your system RAM to render. Meaning, if your GPU has 8GB of RAM, then a scene that is 10GB large can't be rendered on the GPU at all, since Blender doesn't yet support out-of-core rendering. I believe the CPU can do this, but even if I am mistaken, another argument is that you can always easily upgrade your system RAM.
Edit: To add also, I run an i7-6700 with an RTX 3080 and mostly do GPU rendering, but having a beefy CPU helps a lot with simulations and is nice to have as a fallback when CUDA/OptiX decides to play up.
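For anyone curious how the switch looks in practice, here's a minimal sketch of pointing Cycles at the GPU (or back at the CPU) from Blender's Python console; it assumes a CUDA-capable card, so swap "CUDA" for "OPTIX" or "HIP" as appropriate:

import bpy

# Pick the compute backend and refresh the detected device list
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"  # or "OPTIX" / "HIP"
prefs.get_devices()
for device in prefs.devices:
    device.use = True  # enable every detected device

# "GPU" for speed; set "CPU" when a scene won't fit in VRAM
bpy.context.scene.cycles.device = "GPU"

Falling back to CPU for an oversized scene is then a one-line change rather than a re-setup.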
I mean, I was using an i5-7500 up until 2023 when I upgraded; it lasted me well :)
Coffee Lake is not the same socket as Skylake; even though Intel uses the same confusing LGA1151 name, they are not compatible.
Do you shoot all the B-roll footage yourself? I love it.
13:31 Yes, CUDA/OptiX would be the absolute choice, so yes, the CPU rendering benchmarks aren't representative of the average Blender user.
My 7600K is struggling with an RX 5700 XT Red Devil. The CPU is always at max and the GPU bounces from 57% to 94% depending on the game and settings. Running 1080p.
I'd love to see how the Ryzen 3 3300x compares to those chips nowadays. Sadly I can't seem to find any source for one...
I had an i7 6700 (non k) until recently. It played apex legends absolutely fine, no stuttering or frame drops.
I just got a ryzen 7600 which I’m sure will be a nicer upgrade path
13:30 You are kinda right. Generally you are correct, but theoretically the quality of a CPU render was higher than a GPU one, though that was more the case in the past. So unless you need an extremely accurate render, the GPU will do fine. But those edge cases still exist.
I still run an HTPC on an i7-4770 with a GTX 1660 Super. It still holds its own in most games at 1080p with some reduced settings.
I like the benchies. I wish you did some max OCing on the old cpus and tried to squeeze as much as you could from them.
I used a Ryzen 3 1200 since launch day and it served me well, not gonna lie. Recently, in September, I made the switch to a Ryzen 5 7600, and two weeks ago to an RX 7900 XT from an RX 570. In the games that I play and the tasks that I do, I didn't see a future in 4C CPUs.
Still using an i7-6700 in one of my gaming towers. Although there is a noticeable bottleneck, it still hits the 1080p 60fps gaming experience. This isn't always the case for modern triple-A titles, but it handles the games I like just fine. Pro tip: if you have a beefier GPU you can sometimes bump the resolution up to 1440p for better performance. How, you might ask? It offsets more of the workload to the GPU, making your CPU not work as hard, resulting in slightly higher frames and obviously better visuals. It's a win-win scenario! This doesn't work with all games, but it's definitely worth a shot.
The Ryzen 3300X is, I think, a closer match to the 6700K or even the 7700K, because it has all its cores on one chiplet (like Intel's).
Mate I looked at the title and thought wtf is Jurassic lake I missed that one 🤣 been a long day...
Hmm. The 6700K was 2015/2016; you should have used a Ryzen 5 1500X for comparison here, MAYBE the 2500X, but that's stretching it. The 8700K was 2017's hot chip, king of single-core IPC; the closest AMD chip for comparison would be the 2600X. This review makes AMD look far better than they ever were at the time. I understand wanting to use Zen 2, as that is where they finally start to truly catch up to Intel, but it's not a fair comparison.
Just a note about Fortnite performance mode on AMD GPUs: they don't fully boost unless you set a minimum frequency, either in the driver or with an external tool such as MoreClockTool/MorePowerTool from Igor's Lab. If you look at your recording, it's only hitting 1300MHz.
Also looks like the same is happening in CS2.
Yes, I was aware that this behaviour was happening but my attempts to bypass it in the driver never worked out. I'll have to give MoreClockTool/MorePowerTool a try in my next CPU test. Thanks!
Not a gaming use, but I am rocking a Pentium N6005 (in an Odroid H3+) with 16 GB of RAM and a pair of 6 TB hard drives as a home server (NAS, Plex, HTTP, etc.) and it has run like a dream. So, while I can't comment on gaming uses, I can certainly attest to the fact that quad core CPUs have plenty of useful applications today!
Where I live, the i7 6700K costs about as much second-hand as an i3 12100F new. If you get a cheap DDR4 motherboard and RAM with it, I doubt that the cost is that much higher. Downside: no overclocking. Upside: 2 years warranty.
I'm still running my i7-6700k with a 1080Ti. I usually update to a new system around the 5 year mark or when a game requires it. This time it's Forza Motorsport that needs a 6 core minimum. I have two systems waiting for a build, an i7-12700k with a 3080Ti and an i9-13900k with a 4090. Hopefully, these new systems will last for a long time.
Fair winds and following seas to all.
Nice comparison! Could you also add your i7-5775c to the competition? This one should give the 6700K a hard time and I'm curious how it will compare to the 8700K ...
as someone who both likes to put together scrapheap PCs of random old parts *and* just built a full brand new rig with a 13600K "best performance-budget value", I felt that intro lol
Currently trying to put together a rig around a 4C/8T R3 5350 I pulled out of a broken mini PC, so I'm definitely curious as to the quad-core in 2023(2024) question.
This comparison really does show me that I made a great choice upgrading to an AMD Ryzen 5 5600X. I have it paired with a RTX 2080 Super, and all the games I play run well enough at 2560x1440 or 1920x1080 to make me extremely happy.
Recently upgraded my 6700K. It was fine for everyday use; only when I started playing flight sims like DCS and IL-2 did I want an upgrade.
The i3 14100 will be a 6-core 12-thread CPU. So yeah, 4 cores are done for, with games requiring more than that to run and scaling better beyond it.
Still running an i7-6700 (non K). I think the biggest issue I face is less the processor and more the 8 GB of 2133 MHz RAM and the SSHD that I run my games off of. Thinking of going for the R5-7600X, 32 GB of 6000 MHz CL30 RAM and a Crucial P5 Plus as replacements. Should be quite the upgrade for me. Still keeping my RX 5700 XT for a little longer though.
Or any X299 motherboard can take Skylake to Comet Lake with nothing but a CPU swap and you're good to go, gen 6 to 10 all on one mobo. Waiting for the next HEDT chipset; X99 and X299 have been awesome.
I wonder how my i5 3570K would perform in your benchmarks. This year I bought an XFX 6900 XT, so I want to know how much I'm going to gain from an i5 12400F/R5 5600.
Great vid as always.
I picked up a 3100 for super cheap and replaced my i7 4770, and the difference is pretty big in games like PUBG. A lot less stuttery. Also, has the 5600X got auto OC on? They boost single-core to 4.85 GHz.
5600X user here, yes, it's true. You can use PBO to get 4.85GHz. It's really good for a hexacore.
I never owned a quad core Intel CPU -- well, not counting the i7 860 in my 2009 iMac and maybe a laptop. The only quad core I intentionally bought as a standalone CPU was a Ryzen 5 1500X. It was quite strong in the Zen1 range. EDIT: I actually do have a quad-core Intel CPU: a 10100. I use it in a file server, and it has never broken a sweat.
Puget Systems benchmarks Premiere Pro, After Effects, and DaVinci Resolve. Maybe you should consider that for future videos. I have an i7 6850K and I use it for editing, and oh my god it's sooo slow. Need to upgrade soon.
I've got a Xeon E3-1245 v2, and it's a quad with hyperthreading. It sits in an old HP Z220 SFF workstation that my employer was getting rid of. It came with a third Gen i5, but I wanted to run ECC memory, and I noted that the 1245 supported it.
I've parked it in the corner, replaced the aging HDD with 4 SATA SSDs, threw three of them in a ZFS pool, and one to boot, and she's all flash based, with ECC memory, for under $100 (workstation was free mind you)
It sits happily in the corner at my ex wife's house. We're still friends and she has 1 gig municipal internet. It does the usual box in the corner things, Plex server, SMB shares, and a win 10 virtual machine because my ex doesn't want to learn Linux 😢
I was running dual cores when quad cores were the norm:
originally a Core 2 Duo E3400 (2c/2t), then went to mobile Intel Core i5-4210H (2c/4t)
Then somewhere in the middle of the pandemic, I ended up upgrading directly to a mobile AMD Ryzen 7 6800H (8c/16t)
GPU-wise, I originally played with a GeForce GTX 650 (desktop), then went to a mobile GeForce GTX 850M (4GB version), then to a GeForce RTX 3050 Mobile.
Though, my first ever Family PC was either a Pentium 3 or Pentium 4 with an integrated SiS 630 Chipset with its integrated graphics SiS 305 iGPU (running originally Windows ME then I broke it then it was upgraded to XP)
The Core 2 was running Windows Vista then 7
The first gaming laptop was running Windows 8 then 8.1 then 10
My new laptop runs on Windows 11
Phoronix has kernel compilation tests among other benchmarks in their PTS suite.
8-threaded quad-cores are cheatin' tho. I think they're still quite capable in most scenarios.
Using my old i5-4590 hurts sometimes in the newest stuff. Bless it, how it tries.
I have a backup setup with an i7 3770K, 32GB DDR3 @1600, and an Nvidia RTX 2060 Super 8GB (MSI Ventus OC 8 GB), and I'm playing The Last of Us on almost max graphics at native 1080p. Not bad for a 12-year-old CPU.
I basically went from an i7 4790K to a Ryzen 7 5800X3D for gaming, and my gosh, it was huge.
Clock speed and 3D cache can increase the longevity of 4C8T a while longer. Imagine a theoretical Ryzen 3 7300X3D at 5ghz. It would have excellent gaming performance.
Most things these days are written for 8 threads or so. The issue is, most people are running more than one thing at once. Multi-monitor is nearly default these days, and on top of the game and Windows' hefty performance overhead, most people will have a web browser, Discord, and about half the time an animated wallpaper all running elsewhere. Plenty of tasks to eat up a properly scheduled high-core-count CPU. That being said, just a year or two ago I was daily driving all of these tasks (minus the animated wallpaper) on a *non-hyperthreaded* quad core, and not even one of the more recent ones. People wildly overestimate how vital a top-of-the-line CPU is these days if you're satisfied with being average. Any new-bought top-of-the-line system built today WILL be CPU limited, really harshly.
Would love to see a 3300X.
My buddy is still rocking an i5 6500 (4 cores without hyperthreading) and a 1060 EEK!
I run an i3 10100F and it is fairly comparable to an i7 7700 non-K in raw performance. I think it would be interesting for you to test out one of the newer i3s!