@@chadfang2267 _"if you're on a budget and you cant even buy an i5 you might as well get ddr4 an dedicate that money you would've spent on ddr5 on a better gpu. ddr4 is dirt cheap these day."_ I would usually agree with you, but the difference in price between the RAM kits was $30. I'd much rather have the better RAM than a minimal GPU upgrade. There is money to be saved on the motherboard if it is DDR4, but to me it seems illogical to stick with DDR4 on a new system build that I might want to upgrade over the years. Again, my thoughts; feel free to disagree. As I said, I see your point, but in this instance the newer CPU can handle better RAM, so why not let it actually compete with the 6-year-old flagship instead of holding back any performance? Granted we are talking about an i3, but if there is more performance to be had on a new system build, why not go for it?
@@chadfang2267 I understand the point of tuning memory; I've seen amazing things out of DDR5 5600 with CL28, and it was only about $108 for 32GB. I was just saying that what was used in the testing was standard DDR5 with a CL of 38, not even CL30.
Now actually test the CPUs in CPU-bound scenarios instead of cranking everything to the highest settings. Cranking things to ultra is NOT how you test CPUs, you do the OPPOSITE, reduce everything to low. These channels never learn. It's baffling.
1080p is normal even today like EX said, but the most used GPUs are the 60-series like the 1660, 2060, and 3060. People like us who spend over a thousand euros on PC parts are the minority, the top 1% of gaming PC users. But I understand what you mean: who would use a 4090 for Full HD? I would like to see a comparison at 1080p, 1440p, and 4K to see how the CPU performs.
@@LilianaStar So Esports gamers that play 1080p competitively in world tournaments using 4090s are stupid? Even though 1080p offers them maximum FPS and low latency that effectively improves their gameplay and increases their chances of winning?
Games :
Forza Horizon 5 - 0:06 - gvo.deals/TestingGamesForza5
CYBERPUNK 2077 - 1:04 - gvo.deals/TestingGamesCP2077
Hogwarts Legacy - 2:09 - gvo.deals/TG3HogwartsLegacy
Ghost of Tsushima - 3:05
Red Dead Redemption 2 - 4:04 - gvo.deals/TestingGamesRDR2
PUBG - 5:01
Horizon Forbidden West - 5:55
The Witcher 3 - 6:44 - gvo.deals/TestingGamesWitcher
Starfield - 7:40
Microsoft Flight Simulator - 8:36 - gvo.deals/TestingGamesMFS20
System:
Windows 11
Core i3-14100F - bit.ly/420c9K8
ASUS ROG Strix Z790-E Gaming - bit.ly/3scEZpc
32GB RAM DDR5 6000MHz - bit.ly/3BOxlni
Intel i9 9900K - bit.ly/2BwZmGH
Asus ROG Strix Z390-F Gaming - bit.ly/3dsxzEE
32GB RAM DDR4 3600MHz - bit.ly/35vyWko
CPU Cooler - MSI MAG CORELIQUID C360 - bit.ly/3mOVgiy
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
Can you please test CS2?
10980XE vs 9900KS w/ 2x Titan RTX NVLink? "Best PC 5 years ago today, Consumer or HEDT?"
6 years... it seems like just 2 years ago
Look at the CPU power usage. You can expect the newer one to be, well... newer, but also potentially have a longer lifespan with the same cooling
The 14100F is a nonsense CPU: the 12400F has better performance and 2 more cores with the same power consumption, and the price is almost the same (depends where)
Hey bro,
Here in the Philippines the i3 14100F only costs 6,000 pesos ($102 USD), but it's 2nd hand. If I buy a brand-new i5 12400F it costs 7,000 pesos ($120 USD). Which is better to get?
And then what is the best budget mother board for those two?
Thanks a lot
@xflashu8075
I would say you should go with the 12400F, as it's a little more expensive but offers much better performance. As for the motherboard, you can go with an H610, DDR4 or DDR5 according to your budget. An H610 board should be fine if you are constrained on budget
@@techman4085 Hi bro, it's me again. Can I ask you: what GPU is best for my i5 12400F?
Gtx1650super
Gtx1060 6gb
Gtx980 4gb
RX580 8gb 2304sp
I have a limited budget 😅 but soon I will upgrade to a higher card.
I loved my 9900K rig that I ran from 2019 to 2023. Rock-solid 5GHz stable OC, never missed a beat, kicked the ass of everything I threw at it.
What GPU u have?
@dysphase6767 started with a EVGA 2080 , then upgraded to a 3080FE in 2021.
@@Dr.D00p I'm still rocking the i7 4790K since Dec 2014, paired with an RTX 3060 currently.
5GHz? Looks like you've completely lost the silicon lottery. Mine runs 24/7 @ 5.2GHz at 1.375V VCore, on a custom water loop, so it's never over 70C even under 100% load. From what I heard, 5GHz was possible for every single chip, 5.1 for at least half of them, and 5.2 for maybe 10%. Anything higher was possible but needed higher-than-safe voltages (mine can't even boot Windows @ 5.3GHz at 1.4V VCore; it requires at least 1.425 to be somewhat stable and I don't want to push it harder). And it's this 500MHz OC that really makes the difference. In some new games it's the line between under 60 and over 60 FPS; sometimes it pushes 1% lows from around 40 to 60 FPS. It's more than a simple 10% boost.

Sadly I had to update the BIOS to a newer version to fit a new set of RAM after one stick mysteriously died (I've never had a RAM stick die after 2 years of normal use, no OV/OC, just its rated XMP profile), and I moved from a 4x8 to a 4x32GB set; without the update Windows wouldn't see the whole memory and could only access half of it. The update introduced a bug which affects OC: you have different multipliers for regular loads and AVX2 loads, and it should set the AVX2 multiplier only when an AVX2 load is present (like video encoding etc). My chip works fine at AVX2 5.1GHz but crashes at 5.2. But this bug always applies only the AVX multiplier, so I had to set it to 52 and avoid that kind of load on my normal OC profile.
@@BravoSixGoingDark Dude, you can build a 5700X3D system with the 3060 Ti and get much more performance
wtffffffffff , 2:12 , i9+4090 on 1080p and below 50fpsssssssssss , wowww
40% GPU usage means massive CPU bottleneck
It's 4K, what do you expect?
@@Pawcio2115 The i9-9900K is also at 40%. It's just not a well optimized game
I get around 90-110 FPS in that same spot on a 4090. The 9900K at 40% is nearly using 100% of its physical cores: it's 8C/16T, so half of the theoretical 100% is hyperthreading, which is not equivalent to the full power of 8 physical cores. For example, on my 8700K hyperthreading was the equivalent of about 1.7 extra physical cores of performance, according to benchmarks that show how the CPU scales with HT on vs off (CPU-Z has one). On my 13700K hyperthreading scales better, so it's the equivalent of about 2.03 extra physical cores of performance.
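That "equivalent extra physical cores" figure can be back-derived from an HT-on vs HT-off benchmark ratio. A minimal sketch, with illustrative placeholder scores rather than real CPU-Z results:

```python
# Convert an HT-on vs HT-off benchmark ratio into "equivalent extra
# physical cores": the fractional speedup from HT scaled by core count.
def ht_extra_cores(physical_cores: int, score_ht_on: float, score_ht_off: float) -> float:
    return physical_cores * (score_ht_on / score_ht_off - 1)

# A 6-core CPU scoring ~28.3% higher with HT on behaves like ~1.7 extra cores;
# an 8-core CPU at ~25.4% higher is worth ~2.03 extra cores.
print(round(ht_extra_cores(6, 1.283, 1.0), 2))  # 1.7
print(round(ht_extra_cores(8, 1.254, 1.0), 2))  # 2.03
```

Either way, a thread is far from a full core, which is why an 8C/16T chip at "40% usage" can already be saturating its physical cores.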
You need an eye test. Please watch the video again, it's 1920x1080 @@RobloxianX
i3: The Future is now, old man!
The future is today, you hear, old man :V
Maybe the difference in performance is due to RAM speed: 3600MHz on the i9 vs 6000MHz on the i3
DDR4 3600MHz is pretty close to DDR5 6000 because the former has much lower latency. The differences here are the improved single-threaded performance of the new architecture and the higher multi-threaded performance of the i9 due to core/thread count
RAM frequency on its own is bullshit; you should always look at real RAM latency, which is basically the timings divided by the frequency. Higher-frequency RAM also has higher timings, so real latency doesn't drop that much, and if you have 2 kits with the same frequency but different timings, the lower-timings one will give better results. Sometimes a good, slightly tweaked DDR4 3600 kit can outrun cheap DDR5 6000 with sky-high timings.
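A minimal sketch of that latency math, assuming the standard first-word latency formula (CAS cycles divided by the memory clock, which is half the MT/s data rate):

```python
# First-word latency in nanoseconds: CL cycles over the memory clock.
# The memory clock in MHz is data_rate_mts / 2, hence the factor of 2000.
def first_word_latency_ns(cl: int, data_rate_mts: int) -> float:
    return cl * 2000 / data_rate_mts

print(first_word_latency_ns(16, 3600))  # tuned DDR4-3600 CL16 -> ~8.9 ns
print(first_word_latency_ns(38, 6000))  # loose DDR5-6000 CL38 -> ~12.7 ns
```

So the tuned DDR4 kit actually answers a first read sooner; the DDR5 kit still wins on raw bandwidth, which is why results vary by game.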
The i9 9900K makes a difference when you overclock it
+ OC RAM ~4266 with low timings and latency
i3 4400mhz vs i9 4700mhz
Imagine overclocking in 2024 when AMD CPUs have worked fine out of the box since 2017 lmao.
@@notenjoying666 this isn't an amd vs Intel comparison
@@rashamurshed3507 Haha, you're just mad because Intel is useless in 2024 until you OC it.
It's insane seeing a quad core i3 defeat an octa core i9 -- with a difference of only 5 years!!
Happy as can be with my i7 8700K running at 4.7GHz on all cores together with an RTX 3070 Ti
Best CPU of the 8th generation
Want to sell it? 😂
I'm in the same boat: i7 8700K OC'd to 4.7GHz, but with a 3080 Ti.
Nice rig
@@NickT9330 Very nice
Great comparison. A flagship $450 CPU from several years ago vs a budget one today is crazy.
No. Even today's i3 is still slower
@@KelvinKMS no
@@KelvinKMS The new-generation i3 has faster cache memory compared to the older chip, so that makes a difference.
Has anyone ever confirmed this person is legit with the amount of hardware configurations he has? Just wondering before I use his channel results for my decisions
Something is very off about this: I get similar or even better results (in some games) with my i9 9900K @ 4.8GHz, 32GB of 3200 RAM, and an RX 6900 XT at 1440p (native res). In some cases (old single-core titles) where there is no CPU bottleneck, an i3 1200f can go hand in hand with my 9900K... but once a game is using more than 4 cores and the CPU is bottlenecking, the 9900K will smoke that peewee out of its existence. A 4-core CPU today is a bottleneck on wheels. Adding to the list the current debacle of 13th and 14th gen, at this point the only reasonable upgrade from a 9900K might be AMD's 7800X3D. Other than that, not worth it.
Well, he is playing in 1080p which is definitely bottlenecking the system.
Y'all, this is why you don't need to let mainstream media make you think you need the new platforms. You can get a budget CPU right now if you so choose and use a 4080 Super with it, no problem. Mainstream media has people thinking they need the greatest or the bottleneck will affect them. A 14400F or 13400F plays on a 4080 Super beautifully, just like a 10900K or 11900K would. But then again, the new modern budget CPUs are 10 cores like the i5 13400F, so it does make a difference. Just don't fall for the whole over-spec CPU crap. When you have high-end GPUs you will be gaming at high resolutions, and in real-world gameplay you will notice a very small difference between flagship and budget at those resolutions. People kind of have it backwards, since budget CPUs are so powerful now.
For a new-generation CPU you will also need a new motherboard and a new set of RAM. I have a good motherboard which lets me OC everything I need, provides stable voltages, etc. And I have 128GB of DDR4 3200 (yes, I know 3200 sounds bad, but I had to lower the timings; it just doesn't want to work @ 3600 even with higher timings), so moving to a new platform is out of the question right now.
@ThereWasNoFreeName Actually I agree. Just like the 13400F does better on DDR4 3200 than DDR5. People just don't get it. It all depends on the CPU. You don't need a 7800X3D and DDR5 to enjoy performance, even on a 4080 Super.
@@reviewforthetube6485 I don't have enough budget to upgrade anything but the graphics card this year. Gotta move from a 3080 Ti to a 5090 when it hits the stores. And I have a feeling the OCed 9900K will be sufficient for a few more years. At 4K (I do have a 4K display) I'm already maxing out GPU load while the CPU chills, and in some games I use DL DSR for even better quality. All I need is a stable 60FPS and the 9900K delivers, but the 3080 Ti struggles in some games even after a mild overclock. Overclocking and tweaking your system provides quite a significant boost. Sometimes simply bumping the GPU clock by 100MHz, VRAM by 500MHz, CPU by 300-400MHz and lowering timings by 1-2 gets you from a stuttery 50ish FPS to a stable 60FPS.
The reason: the 9900K is on a DDR4 platform, the 14100 on DDR5. In some games best optimized for multiple threads and cores, such as Red Dead Redemption 2, the 14100 on the DDR5 platform is much worse than the 9900K. In some new games the performance of the two is almost the same.
If we talk about a resolution of 2k and higher, then 9900 will be enough for a long time
Bf2042 will prove u wrong 😉
The i3-14100 has higher IPC and uses higher-speed memory, but the i9-9900K still has twice the CPU cores. So in heavily threaded games the i9-9900K still outperforms it, and it will look even better in the future vs the new i3-14100. The i9-9900K is IMO a better buy at the same price.
Yah, but at that point just get a 7600x and call it a day, and later can wait to get a cheap ryzen 9700X3D
Looking at the 14100F now, you'd think most owners will be pairing it with 6000MHz RAM and an RTX 4090 just to keep this FPS. But in reality the 14100F will mostly be paired with mid-range or budget parts, since that is the use it was actually made for.
in 2024 4cores = garbage
First 🔥🇲🇦
9900k is my daily driver since 2019, and will be until my mb breaks. Which GPU?, 3070.
9900k eats twice the power constantly but it holds up like a champ
My previous CPU was a 3930K OCed to 4.7GHz, and that thing was a power-hungry beast; for some reason it never showed over 255W, under load it always just capped @ 255W. Now that's what I call a CPU that EATS power... The 9900K is on a diet even when OCed.
You are great! I was eagerly waiting for this video
so you say my i9 has shrunk into i3??
are you joking?
Yeah, IPC gets more efficient each gen, games only use 4-6 cores at most, and in games that was enough to beat the 9900
it’s been 6 years. It’s normal
The i3 is more heavily loaded than the i9, and the newer generation uses DDR5 instead of DDR4; that's why the FPS is better
The i9 works fast where AI NPCs are badly optimized
*I always knew, CORE I9 SUX!*
The i3 is PCIe 4.0 and the i9 is PCIe 3.0. So the Core i9 forces everything to run at PCIe 3.0 speeds: the RAM, drives, and GPU are all throttled simply because the Core i9 isn't on the same plane of divine existence.
9900K at 38%, 14100 at 61% at 02:27. Intel needs some update; then the 9900K gets 30 more FPS
The i9 is still a good CPU for the right price.
I'd love to see 7800X3D vs. 14600K OC
Overall, you can say this: overclocked to 5.0-5.2GHz, the i9 9900K/KS is still an excellent option, and you can use it without problems for another couple of years for gaming at max settings in Full HD; you just need good cooling.
The main thing is the memory. In the video it's 3600MHz on Hynix (16-19-19-39), completely untuned. The ring is probably 4300. I have an i7-8700K @ 4.9GHz, ring 4.6, and tuned 3600MHz memory at 14-14-14-30 CR1 (it won't go higher because of a junk motherboard), and even that outperforms what's in the video.
An i9-9900K at 5GHz on the cores, 4.7-4.8GHz on the ring, with tuned 4266+ MHz CL16 memory, will be significantly better than this i3. It would even beat an i5-13400.
But don't forget: yes, the 9900 is a good chip, but against it are heavy heat output when overclocked and high power draw, and you can't put anything newer on that platform, while on the 14th-gen platform you can at least drop in an i5, which definitely beats the 9900. The 9900 is a good chip, but on the used market it's still unjustifiably expensive
It's a mixed situation. First, nobody pairs a 14100 with a 4090; it will more likely see PCIe-slot placeholders like a 3050/3060/4060, a 4070 at most and only by accident. And at 1080p there will be a constant GPU bottleneck with either the 9900 or the 14100. Second, hardly anyone uses a 1080p monitor with a 4090, 2K minimum. And as resolution rises, dependence on the CPU drops sharply; if here we see the 9900K falling behind, I'm sure at higher resolution it will be roughly FPS parity. So in most real scenarios, unless you're chasing constantly top-tier performance, you can sit on a 9900K for probably another 5 years, right up to the release of the new lousy consoles)
@@K961Kv2 Oh, you're here too, good sir) At least someone leaves sensible comments about overclocking and tuning.
@@НикитаИпатов-м5н Yeah, and people still ask a fortune for a 9900 on the used market
That feeling when my RX 6900 XT puts out 150 FPS in Hogwarts on full ultra settings while the RTX 4090 puts out 49/54 FPS 😂 and it cost 240k ₽
that i9 is still doing pretty good for today.
In PUBG the i3 is completely dead, as in all games of that level. Only good for single-player games.
Wow new i3 💪
?? it's 6!!! years difference
Bro, we need an i3 14100F + RTX 4060 Ti games benchmark ❤..
4060 lmao ..
Fun fact: the i9 9900K is comparable to a modern phone CPU, if you give the flagship phone 10W
8 cores and 16 threads are still more useful than 4 cores and 8 threads. If it's just for gaming and slightly better FPS, then the latest generation of Intel is better. But for work and more, the i9 9900K is the best. I was using an i5 9400, and because I am reluctant to upgrade all the components, I only upgraded the CPU to an i9 9900K
Still have a 9900KS in one of my tinker builds; I'm running it at 5.3GHz.
The i9 9900K was the last proper Intel CPU; everything after that is a mess. So if you have the i9, just upgrade to a Ryzen CPU.
Ironically, Cyberpunk has the best optimization in this benchmark, while being the most broken game at release 😂😂😂
From what I can see, there are games that make better use of a higher number of cores and those that don't. BTW this i3 has DDR5 memory, which is a big improvement...
Running my 9900K with an RTX 4070 at 3440x1440. In all games the CPU load is ~30% and it draws 45-50W; it has been doing a great job since 2018 and still does! Very good CPU!!
On the other hand, thanks to its core count the i9 is less loaded
No need for the 14100F, 13100F or even the 12100F will be at most 10% slower than the 14100F lol
Probably because of the RAM, the FPS can't go higher
Thank you for the wonderful video 👍🏻.. I want to buy a powerful laptop for gaming, watching movies, and other work, and its price is reasonable.. What 2024 models do you recommend for me?
8 old architecture physical cores vs 8 newest arch threads at lower clock speed
further proof that it's not about how many cores a cpu has
It's not fair to compare with 6000 vs 3600 memory; the bigger gain here is from DDR5, not from the processor's performance! Although in Cyberpunk and RDR the 9900 shows itself better where the overall CPU load is higher, which on the whole proves it's the more performant one!)
DDR4 3600MHz is very close to DDR5 6000 because the former has much lower latency. The differences here come from the improved single-threaded performance of the new architecture and the higher multi-threaded performance of the i9 due to its core/thread count
If you are comparing a CPU from six years ago, include some benchmark games from six years ago as well.
The 14100 still wins
DDR5 is also a good advantage for the 14100F. Interesting to see such good performance from a 140/150€ budget CPU
Always enjoy your videos👍
Aaaand the ryzen 7500F wins. 🙂
I play 2077; I can still use it until the day the 16900K comes out
Still a beast; no need to upgrade for about 3 years
Andrei - I'm positive you've gotten this question before... but your onscreen stats.... are they set through afterburner? I like the way they look, but I'm finding it difficult getting anything to look like that.... :( What program are you using if it's not Afterburner?
I9 usage is almost half most of the time
You mean 5 years?
as an owner of i9 9900KF, I am not happy with these results 😭
Sorry, but the i9 for gaming was a waste, like the 2080 Ti
Yeah, I'm gonna keep my i9 for 2 more years
DDR4 vs DDR5 LoL, go 14100F DDR4 too ?
turn off 4 cores on 9900 too
Buy i9-9900k same price too!
I can't find i9 9900K vs i7 14700K
You're not using the same RAM frequency
I have a question: how much RAM did you use on the i3?
2x16GB, same as the i9
do please 7500F vs 9900K
Bruh the 7500F stomps the 9900K
are you asking this as for interested or as a joke?
i3 14100F is the best
this is hilarious
Do the same test but in 1440p.
Less CPU usage, and the 14100 still wins
To make matters worse, this actually implicates the 10700K and 11700K as well, all thanks to Intel's amazing generational leap from 9th gen to 11th gen.
Well yeah, Intel usually sucks at generational leaps when staying on the same platform; that's why they change platforms so often. Platform leaps tend to be huge, on average 30-35% better than the previous platform's latest generation.
I suppose. The 11700k isn't that much faster than a 10700k. The Rocket Lake 14nm backport blew chunks. When Steve from GN called the 11900k a waste of sand, he was being nice.
For those that don't know, the 9900k and 10700k are pretty much the same thing.... except the 10700k has a higher TDP and it generally overclocks a bit better than the 9900k.
I have the 10700K. It used to be great a few years back, but I can't stand how it bottlenecks high-end 40-series GPUs in CPU-intensive games nowadays, especially if you insist on a stable 144 or 165 fps.
Why wouldn't you use the same RAM for a CPU comparison if you could?
the 14100f supports ddr4
I do understand your point, but who would buy the 14100F and stick to DDR4? It would really hurt the upgrade path if you eventually get a flagship part and are still on DDR4.
This test shows what you can get with an entry level CPU, and decent, not great DDR5. Granted the board used was not entry level.
I think this video shows how well the 9900K is holding up nearly 6 years after release.
@@davidandrew6855 Brother, it's an entry-level CPU. If you're on a budget and can't even buy an i5, you might as well get DDR4 and put the money you would've spent on DDR5 toward a better GPU. DDR4 is dirt cheap these days, especially on the used market. But that's beside the point: if you want to compare the performance of two CPUs, you should use the same DRAM if you can, to level the playing field and remove as many variables as possible.
@@davidandrew6855 And that's not just a "decent, not great" DDR5 kit. With a bit of tuning you can get some amazing performance out of it. I got 6200 MHz CL30 out of a much worse kit.
@@chadfang2267 _"if you're on a budget and you cant even buy an i5 you might as well get ddr4 an dedicate that money you would've spent on ddr5 on a better gpu. ddr4 is dirt cheap these day. "_
I would usually agree with you, but the price difference between the RAM kits was $30. I'd much rather have the better RAM than a minimal GPU upgrade. There's money to be saved on the motherboard if it's DDR4, but to me it seems illogical to stick with DDR4 on a new build that I might want to upgrade over the years. Again, just my thoughts; feel free to disagree.
As I said, I see your point, but in this instance the newer CPU can handle better RAM, so why not let it actually compete with the 6-year-old flagship instead of holding back performance? Granted, we're talking about an i3, but if there's more performance to be had on a new build, why not go for it?
@@chadfang2267 I understand the point of tuning memory; I've seen amazing things out of DDR5-5600 with CL28, and it was only about $108 for 32 GB. I was just saying that what was used in the testing was standard DDR5 with a CL of 38, not even CL30.
6 years apart, plus a completely different category. I'd love to see the 9100F and the 14900K in the comparison too! Nice vid!
Too early to replace my 9900 yet
In the video the memory is 3600 MHz Hynix at 16-19-19-39 and it's not tuned at all; the ring is probably 4300. My i7-8700K @ 4.9 GHz, with the ring at 4.6 and tuned 3600 MHz memory at 14-14-14-30 CR1 (it won't go higher because of a junk motherboard), is still faster than what's shown in the video.
An i9-9900K at 5 GHz on the cores, 4.7-4.8 GHz on the ring, with tuned 4266+ MHz CL16 memory, will be significantly better than this i3, and even better than an i5-13400.
You can still sit comfortably on an 8700K/8086K, as long as your GPU isn't too powerful.
Actually I am more impressed seeing the 9900K performing so well still!
These new i3s are real bargain gaming chips
lmao..
Now actually test the CPUs in CPU-bound scenarios instead of cranking everything to the highest settings. Cranking things to ultra is NOT how you test CPUs; you do the OPPOSITE and reduce everything to low. These channels never learn. It's baffling.
Bro, suggest me a laptop under 70-75k that can easily play Cyberpunk at 70-80 fps
Need 4K tests. I know it's a CPU comparison, but who plays at 1080p these days? The last time I played at 1080p was 10 years ago.
The world doesn't revolve around you; according to the Steam survey, the majority of gamers still play at 1080p.
@@EX0007 Imagine spending all that money to game at 1080p 🤣 The stupidity is mind-boggling
1080p is normal even today, like EX said, and the most-used GPUs are the 60-series cards like the 1660, 2060, and 3060. People like us who spend over a thousand euros on PC parts are the minority, the top 1% of gaming PC users. But I understand what you mean: who would use a 4090 for Full HD?
I would like to see a comparison at 1080p, 1440p, and 4K to see how the CPUs perform.
@@LilianaStar So esports gamers who play 1080p competitively in world tournaments on 4090s are stupid? Even though 1080p gives them maximum FPS and low latency, which effectively improves their gameplay and increases their chances of winning?
@@LilianaStar The majority of people don't have a 4090, or even 40- or 30-series GPUs. Get over yourself.
GPU is doing the heavy lifting. CPU plays little to no role here.
In not a single test did either processor run at 💯%. Were these tests about game optimization? 😅