Is Fast RAM A Waste? Unleashing the Core i9-9900K with DDR4-4000
- Published 31 Dec 2018
- Check Prices Now:
DDR4 Memory: amzn.to/2nM6BDy
Asrock: www.asrock.com...
Support us on Patreon / hardwareunboxed
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Music By: / lakeyinspired
Tim's video from yesterday, "The Worst PC Games of 2018", will be released tomorrow. YT messed that one up on us; Gamers Nexus had the same issue with their video: twitter.com/HardwareUnboxed/status/1079699865028837376
Thanks for answer. ;)
Thanks for this video; it's a question from enthusiasts that no one in the entire YouTube community has addressed or wanted to revisit.
#Bitwit had this problem too...
Where is the CL?? It's important info!!
Could you test at these speeds but with different CL? I mean, 3000 @ CL14 and CL16 for all the speeds. Or how about how to overclock RAM?
human eye can only see 2400mhz ram speed
They changed it in 2019 now the human eye can see 4000mhz
we already have evolved to 7nm so the human eye already can see 4000Mhz in quad channel
It can only see 2400mhz if you downloaded it.
Jokes like this will never expire. Meh. :)
Just take a few shots of some hard drink, and your brain and eyes will be overclocked enough to notice well beyond 2400MHz RAM.
I assume the "Lower is better" for FPS test is incorrect. Anyway, great test :)
Not if you are Ubisoft.
it IS better......................
for your wallet. (Kappa)
Remember that time when I made an obvious typo and no one pointed it out? 😂
Its for the cinematic feel. And 640kB should be enough for everyone anyway.
"Lower is better" if you are above 30 fps. If you are below, then higher is better, but no more than 30. That test measures how close you are to the ideal console experience according to Sony and Micro$oft (as related by Ubi$oft).
www.kitguru.net/components/graphic-cards/anton-shilov/ubisoft-microsoft-and-sony-demand-pc-games-to-be-locked-at-30fps/
Try this video with Ryzen plz
Ryzen likes high bandwidth because of the Infinity Fabric, but latency-sensitive applications like games prefer low latency. 3200MT/s CL14 is the sweet spot in terms of compatibility, price, and performance, but in the real world it's more of a nice-to-have than a necessity, so it isn't worth worrying over compared to, say, a cheaper 3200MT/s CL16 kit. You can also get better results with 3466MT/s, and 3600MT/s with tight timings, although 3600MT/s is gambling territory with Zen's IMC, so it's not guaranteed to work.
Core i9-9900K! Legacy hardware still relevant in January 2019!
We have done quite a few Ryzen memory tests in the past.
@@Hardwareunboxed Please do one more Ryzen test with this 4000MHz kit; we all want it, and do include 1st-gen Ryzen and the 2nd-gen Ryzen 5 series.
@@Hardwareunboxed Please consider one where you compare 3000+ RAM with XMP profiles vs manually tuned timings. It doesn't need to be a giant comparison; some of the more questionable sites seem to suggest that tighter timings (at the same MHz) yield amazing results on Ryzen.
RGB will give 50% latency reduction.
Actually, when you buy G.Skill RGB RAM, you're buying their best cherry-picked ones.
I know it's an RGB joke @ABCDE0103 made.
Just letting y'all know it's actually a bit true.
Just rent some cloud ram and you will do fine with rgb
@@N0N0111 lol no it isn't. Any old idiot will buy RGB memory. Their very best dies go in the non-RGB Trident Z, which nearly always overclocks better. Not saying the RGB only gets the crap dies, but the 3200 C14 dies in the RGB kit are double the price, and Joe Bloggs is not buying that one (why pay double for the same thing? *their thoughts*).
This comment made my day LOL
why am i watching this at 5 in the morning...
@Atlantean You are ALSO high on meth ??? Does your mom know about this ???
because you are watching on your phone like me and don't want to get up and go to work
I sure wish you would have kept the CAS latency time fixed, rather than leaving it at CL17 for every clock speed. That puts the CAS latency at 8.50 ns for the 4000MHz kit and 14.2 ns for the 2400MHz. How are you supposed to draw any meaningful conclusions about different clock speeds when it's impossible to decouple the effects from different CAS latencies? Between the 4000MHz and 2400MHz kits, both the clock speed and CAS latency saw gains of 67%! Testing 2400MHz RAM at CL17 is very unrealistic, compared to kits that are on the market, and biases the results in favor of the higher speed kits. This test was poorly done and should be reworked using timings which provide the same latencies for every clock speed.
Oh wow, good catch. I didn't realize he did that; I skipped to the benchmarks :( That's going to make a huge difference to the results, so this isn't an accurate benchmark at all. The differences will still be enough to warrant getting higher speeds, but not by this much. By how much, though?
I was thinking the same thing. It seems the test would favor the high-speed RAM due to the CL staying the same.
Yep, my 3000 kit runs 15-15-15-35, and sometimes it's faster than, say, 3400 at CL16-18-18-36.
They need the best available kit for each clock rate and should use each kit's own best timings.
@@wh173 I have the g skill ripjaws 3200mhz with 14-14-14-(I think 34) although it was expensive
@@user-kk4bq7mb8u Yes but the point is there would be huge gains to be made on the low end whilst the high end memory is already capped out and accurately represented. It's like underclocking the low speed RAM which of course is going to perform worse to begin with. Almost all low/medium speed RAM is CL16 or better.
Here you go guys, choc chip cookie recipe:
Ingredients
3/4 cup granulated sugar
3/4 cup packed brown sugar
1 cup butter, softened
1 teaspoon vanilla
1 egg
2 1/4 cups Gold Medal™ all-purpose flour
1 teaspoon baking soda
1/2 teaspoon salt
1 package (12 ounces) semisweet chocolate chips (2 cups)
1 cup coarsely chopped nuts, if desired
Steps
1 Heat oven to 375ºF.
2 In large bowl, beat granulated sugar, brown sugar, butter, vanilla and egg with electric mixer on medium speed or mix with spoon until well blended. Stir in flour, baking soda and salt (dough will be stiff). Stir in chocolate chips and nuts.
3 Drop dough by rounded tablespoonfuls about 2 inches apart onto ungreased cookie sheet.
4 Bake 8 to 10 minutes or until light brown (centers will be soft). Cool 1 to 2 minutes; remove from cookie sheet. Cool on wire rack.
Enjoy 😎
Thanks my dude!
This is the most useful Post here
Brownies!
8 oz granulated sugar
1 1/2 oz cocoa
3oz SR flour
1/2 tsp salt
2 eggs
2 tbsps creamy milk
4 oz melted butter
3 oz chopped walnuts or raisins.
Stir together sugar, cocoa, flour and salt.
Beat eggs and milk and add to dry mixture, with melted butter.
Add walnuts or raisins.
Pour into tin and bake on GM 4 for 30 mins. Cool in tin.
cream cheese icing:
1lb cream cheese
1/4c milk
1tsp orange zest
put cream cheese in microwave for 30 secs at a time till able to mix with fork. add milk and zest. mix. enjoy
cinnamon rolls:
2 packages active dry yeast (4 1/2 teaspoons)
1 cup lukewarm water
1 teaspoon granulated sugar, to proof the yeast
6 tablespoons shortening (Crisco)
1 cup granulated sugar
8 cups unbleached all-purpose flour (you might need more if the dough is sticky, up to 9-10 cups)
2 cups hot water
2 eggs, beaten
1 tablespoon salt
Softened butter (about 1/2 cup total)
Brown sugar (about 1 1/2 cups total)
Cinnamon (about 2 tablespoons total)
Add yeast to 1 cup of lukewarm water and add the sugar. Set aside for about five minutes. In the bowl of a stand mixer, add shortening, sugar, and salt to the 2 cups of hot water and beat for 30 seconds, using the beater blade. Let cool to lukewarm temperature. Stir in 2 cups of flour and mix until smooth. Add yeast mixture and mix until well combined. Mix in the beaten eggs. Gradually stir in the remaining flour and mix with the dough hook for about 2 minutes. Remove dough from the bowl and place on a lightly floured counter. Knead by hand, add a little flour if the dough is still sticky. Knead until dough feels satiny and smooth. Cover and let rise for 30 minutes. After the dough has doubled in size, remove it from the bowl and divide it in half. With a rolling pin, roll one half of the dough into a rectangular shape, about 22 X 13 inches. Spread dough evenly with half of the softened butter, about 1/4 cup. Sprinkle dough with half of the brown sugar, raisins, and cinnamon. You can omit the raisins, but my family loves them. Roll up dough into one long roll. Cut rolls, using a piece of dental floss or thread, about two inches thick. Place rolls in greased 9X13 baking pans. Now follow the exact same steps with the other half of the dough. Let the rolls rise until double in bulk. Preheat oven to 350 degrees F. Bake for 20-30 minutes or until cinnamon rolls are golden brown on top and cooked in the middle. Every oven is different so check at 20 minutes to be safe, but it might take longer. You don’t want them to be doughy in the middle. Time will also vary based on how big they get during the second rise. Ours take closer to 28-30 minutes. Remove from the oven and let the rolls cool to room temperature. While the rolls are cooling, make the frosting. In a medium bowl, whisk together butter, sugar, milk, and vanilla. Frost the cooled cinnamon rolls generously!
If you downclock RAM without tightening the timings, of course you'll get a difference in frame rate. It only makes sense to compare different RAM frequencies if you're also changing timings to match. The memory bandwidth would still scale, but frame rates will be more affected by memory latency and CPU cache than memory bandwidth.
Can you please tell me which one of these gives me the best FPS and lowest frame times for an i9-10900K? I'm stuck between 3200 CL14 and 3800 CL18.
@@mehrdadbakhtiary6096 3200cl14 almost surely. The lower latency is much more important for good FPS than bandwidth, which both have more than enough.
RIP birds
raaaaaaaaaaa
Aay
I think 3200 mhz is best sweet spot
I was surprised because of slow internet but spoilers 1:30
I find tightening those timings really splits the bird in half every time
10:30 back in January, but it is January... *KOWALSKI ANALYSIS*
At 10:11 he says at the start of January 201**8** not 201**9**
@@dmynerd78 r/whooosh
Happy New Year to Y'All Guys
you too bud
May Jesus save you in 2019 because you watch too much porn! Happy New Year!
+Därk Vador Lol I don't watch p0rn
You don't? It is so good mate ! Especially with a OC kit of b-die for 4k :D
"building a gaming pc right now is a bad idea 2018" how the tables have turned lol
11:50 - real reason why RAM prices are so high
Yeah, no.. he built up that collection with time, and had little impact on the market. You are just jealous.
@@MarcABrown-tt1fp Woosh
@@FroggyCrimes Uhh... what?
@@MarcABrown-tt1fp Woosh
+@@TooManyTrees1 +PraiseTheSun20 No that is not a "Woosh" or whatever the hell that means; I assume that came from reddit.
My comment has yet to see a reasonable disclaimer from either of you two, so I'm just going to leave my comment unchallenged.
Unless I'm missing something?
(EDIT*) Nevermind, I seem terrible at catching jokes, wow...
RAM speed matters so much I can't believe it gets ignored a lot during benchmarks. This is especially true with the two most recent Assassin's Creed games. I gained easily 15-20 fps going from 2400MHz to 3200MHz with an i7-7700k OC'd to 4.7GHz. I noticed my game being significantly less stuttery too. It feels like when the bottleneck is on the CPU, having a faster RAM makes a huge difference.
Yeah that's the key point, whether you are CPU bottlenecked. If you are it can make a huge difference, but if not it doesn't matter much.
@@shredderorokusaki3013 24 fps is not "very fast" lol
@@Malus1531 For Ryzens everything above 1 fps is very fast.
@@shredderorokusaki3013 GTX 970 isnt made for 1440p. But if your ok with 30 fps ^^
@@BITCOIlN you mean Pentium 4's :p Actually the GTX 970 is the total bottleneck there.
"2018; Why building a gaming PC right now is a bad idea."
2021 - Hold my beer
What you want is a 3200 low-latency (CL14) kit; then overclock it and tighten the timings.
The problem with most 4000MHz kits is they're almost always CL18/19 with terribly loose timings.
@Jason Pancakes
You can do the same with the 4000CL19 kit.
@@kindis4282 Except a 4000cl19 kit costs a hell of a lot more than a 3200CL14 kit.
Imagine if this 4000mhz kit can get to c16 or c15
@@jason.Pancakes Depends on the brand, kit and where you live. Prices from newegg:
G.Skill Trident Z 3200MHz CL 14-14-14-34: 200 usd
G.Skill Trident Z 4000MHz CL 19-21-21-41: 220 usd
G.Skill Trident Z 4000MHz CL 18-19-19-39 : 250 usd
@Jaz
I'm sure it can with the right voltage.
CL14 means the RAM takes 14 cycles to respond to a request from the CPU.
Which translates to:
3200 CL14 => 14 / 1,600,000,000 Hz = 0.00000000875 s, or 8.75ns
4000 CL18 => 18 / 2,000,000,000 Hz = 0.00000000900 s, or 9.00ns
So response times are roughly the same.
2:55 No, it isn't margin of error. It's the tighter memory timings.
Contrary to what you claimed, you did not leave memory timings the same from one test to the next. You left the cycle count the same, but reduced the cycle length with each clock increase. The result is that you have reduced the timings at each step.
More concretely, a latency cycle count of 17 at 1200MHz (i.e. "2400MHz" in DDR speak) is an absolute latency of about 14.17ns. The same figure of 17 at 1333MHz is an absolute latency of about 12.75ns. And jumping all the way to 2000MHz, it's at 8.5ns.
If you wanted to keep the timings the same between tests, you'd have to increase the cycle count by the same percentage as the memory clock. So DDR4 2400 at CL 17 has the same timings as DDR4 2666 at about CL 19. At the top end, the cycle count for DDR4 4000 would need to be about 28.
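The arithmetic in the comment above is easy to sketch in a few lines of Python (a minimal illustration; the helper names are made up, not from the video):

```python
# Sketch of the comment's math: true CAS latency in nanoseconds, and the CL
# needed to hold absolute latency constant at a higher data rate.
# Function names are illustrative assumptions.

def true_latency_ns(data_rate_mt_s: float, cl: int) -> float:
    """CL cycles divided by the memory clock (half the DDR data rate)."""
    memory_clock_hz = data_rate_mt_s * 1e6 / 2   # e.g. DDR4-2400 -> 1200MHz
    return cl / memory_clock_hz * 1e9            # seconds -> nanoseconds

def equivalent_cl(base_rate: float, base_cl: int, target_rate: float) -> float:
    """CL that keeps absolute latency the same at a different data rate."""
    return base_cl * target_rate / base_rate

print(true_latency_ns(2400, 17))      # ~14.17 ns
print(true_latency_ns(4000, 17))      # 8.5 ns
print(equivalent_cl(2400, 17, 4000))  # ~28.3 -> DDR4-4000 needs roughly CL28
```

This reproduces the comment's figures: CL17 at 2400 is ~14.17ns, CL17 at 4000 is 8.5ns, and holding timings constant would require about CL28 at 4000.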
@MyAssDefeatsYou It is relevant because it's severely gimping the lower end memories and they are not correctly represented.
thats what i thought
*Short answer*: between 3000MHz and 3200MHz is the sweet spot for performance. If you go over 3200MHz you'll get diminishing returns or nothing at all. (I'm talking about gaming performance at 1440p.)
Yeah, plus who plays at 1080p with a Vega 64/1080 or above level GPU anyway?
@@FF-jf8yg I do :D But I want 144Hz with all eyecandy on, so thats why.
@@FF-jf8yg Exactly. Of course some people do, but it's just silly. I use my 1080 Ti PC to game at 4K high-ultra settings and it does fine. For a few recent releases and some upcoming games I should ideally turn the resolution down to 1800p or 1440p, but all of my current library (Skyrim SE, the Dark Souls games, DOOM, Divinity 2, Witcher 3, Nier, etc.) runs well at 4K60.
@@AndrewTSq Hmm, I have a Vega 64 and 1440p 75+ FPS is enough for me.
@@arrax8241 I never buy Nvidia cards, which is why I'm stuck with a Vega 64. I hope Vega 2 or highest end Navi can at least match 1080 Ti, and that's a minimum requirement among AMD fans right now.
3200mhz at cas14 with Samsung B die is what I would recommend. Costs more than the non B die kits that are typically cas16 but you get much more overclock potential. It's also the most optimal option for Ryzen if your motherboard supports it.
This 4000C17 kit is really using the same Samsung b-dies and pcb (or slightly better pcb) that you find on the 3200MHZ C14 kits. The only difference is that the 4000MHz c17 kit is using better binned dies that can clock to 4000MHz while retaining a smaller true latency and also doing that at a voltage of 1.35V. If you want you can downclock this kit to 3200MHz and likely be able to do so with a CAS latency of just 12 or 13 (and all that at 1.35V). It is possible to also do something like 3600MHz C15 or C14.
It's also incredibly expensive.
Grossly overpriced though. For most people not honestly worth the $$$. But of course if people want to spend that much more, more power to them, it's their money.
Unfortunately for mini itx users that want 32gb ram, 3200 cl14 is probably the best
Bullshit, it's a waste of money; get the cheapest 3200 CL16 you can find and you're good. I overpaid for B-die Ripjaws CL15 3200 and it can't go any higher, and the timings can't be touched; to tighten them I had to downclock to 3000MHz for an optimized CL14.
Omg you guys did my request, tysm! Thanks for years of content and hopefully years more to come!
Buildzoid taught us a great shortcut to comparing memory frequency and timing combinations.
Take the freq and divide by the timing. The bigger the number the better e.g.
4000÷17= 235
Vs
3000÷14= 214
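That shortcut from the comment above can be sketched as a short script (an illustration only; the kit list is hypothetical, not from the video):

```python
# Illustrative version of the "frequency divided by CAS latency" shortcut:
# higher ratio means lower true latency for the clock, as a rough first pass.
# Ignores secondary/tertiary timings, as the replies below note.

def speed_to_cl_ratio(data_rate_mt_s: int, cl: int) -> float:
    return data_rate_mt_s / cl

kits = {
    "DDR4-4000 CL17": (4000, 17),
    "DDR4-3200 CL14": (3200, 14),
    "DDR4-3000 CL14": (3000, 14),
    "DDR4-2400 CL17": (2400, 17),
}
# Print kits from best to worst ratio.
for name, (rate, cl) in sorted(kits.items(),
                               key=lambda kv: -speed_to_cl_ratio(*kv[1])):
    print(f"{name}: {speed_to_cl_ratio(rate, cl):.0f}")
```

With these hypothetical kits, 4000 CL17 (~235) edges out 3200 CL14 (~229) and 3000 CL14 (~214); as noted below, sub-timings can easily swing the real-world result.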
That's a good rough guide but there are loads of primary, secondary and tertiary timings.
@@Hardwareunboxed Yeah, definitely. It's common to see C16 kits where the other primary timings are 18-18-40 or worse. May as well not be C16.
@@Hardwareunboxed Nvm. you said it at the very end of the video :D
What were the timings in this video for each frequency? CL17 for all of them? Just thinking because 3200MHz I'd put it as CL14 or CL15.
@@AllThatJazOfficial If the sub timings are worse, I just average them so something like CL14-16-16 I look it as CL15-15-15
I got my 4000mhz c17 kit on newegg global for cheaper than most 3200 kits with good timings here in Australia. If you're going to go high clocks gotta keep those timings tight
Australia sucks, except for the animals and the people and the landscape.
@@mver191 he says as he watches an Australian YouTube channel lol
Professionally presented content which is like a breath of fresh air. No BS and straight to the point. Thanks Steve.
Just an FYI, on a couple of the game benchmarks, namely Shadow of the Tomb Raider, you left "Lower is Better" on it instead of higher. Not a big deal, but figured you would like to know. Either way, great video and super interesting!
pls test rgb ram vs non rgb ram performance
This is like the most important comparison that no one has done yet and it is just irresponsible at this point.
"Ready to Go Boom" RAM?
I got a 300% FPS boost by upgrading to RGB ram. Can confirm.
That would be a complete waste of everyone's time. This man makes videos for a living.
GUYS,
Fixing the latency may skew the results (where differences in performance exist), however it does not make them "worthless" as some have said. If you see NO differences in performance between the EIGHT different frequencies then reducing the CL value isn't going to help as there's no CPU bottleneck. Conversely, if you see an obvious increase in performance with higher frequencies then you expect the same thing with the only difference being how your CL value compares to the one used. For example, if you have 2400MHz CL15 memory then you'll get higher performance than 2400MHz CL17 for this scenario where performance is scaling. So you can extrapolate very useful information. For example, if you saw minimal, but some, scaling above 3200MHz CL17 then perhaps get a 3200MHz CL16 kit or whatever.
Curious why you didn't use improved CL as the clock speed was lowered? That is often seen to give better results, making the difference between high CL/high MHz vs low CL/low MHz minimal.
I have the same specs running on a Gigabyte Z390 Master, and let me say something: in intense situations 4000MHz memory helps the minimum FPS a lot and saves my ass plenty of times, though I am a hardcore FPS gamer.
Sad but it could not save your extra 200$.
Hey bro, I'm buying a Z390 Aorus Master with an i9-9900K. What RAM should I buy, RGB? Please help me, I'm stuck on this choice.
hold on..... isn't the 9900K limited to 2666MHz RAM?
Xmp
Can't believe there's corona beer, corona software and corona virus.
3:21 the highlight bar needs overclocking
basically buy DDR4-3200 C14 RAM and call it a day.
Some of those sticks can be OC to cl14 3466 or cl15 3600
Expensiiiiiive.
@@EnergeticMooMoo Yep. It's expensive. CL16 3000/3200MHz is quite cheap though (in comparison).
Buy 3200 CL16 and call it a day; CL14 is overpriced, literally twice the price in my country between cheap CL16 Corsair and the cheapest CL14 Flare X. If you're wasting that much money you might as well go with Intel and the cheapest junk RAM and have peace of mind.
@@BITCOIlN i agree i got my corsair 3200 cl16 last year and i am happy with it
Oof, I've been using 16GB of DDR4-2400 for a year now.. Didn't even know that it matters.
Most kits will overclock to 2666 or 2800 pretty easily at 1.35v, give it a shot.
boingkster got my 2666ram to 3000mhz with 1.287v
i sold my 4x4gb 3000 for £180 and bought 2x8gb 2400 for £80, it doesnt.
Toms Tech why would you downgrade
Ayy, Dark Souls comrade. I'th am too.
Great video as always. Thank you for putting appropriate emphasis on timings, too many people just crank the speed and leave timings to auto, not realizing they are kneecapping their performance due to looser timings. I'd argue that in some cases having tighter timings would be better than having the next speed up.
I still think there's missing information, such as the CL at each respective bus speed. It might not make any quantitative difference, but it would better reflect the value of each bus speed and CL timing combination.
What about DDR3 memory, is it still usable for gaming in 2019? Please compare DDR3-2400 and DDR4.
Mohamed Guendouzi None of the modern desktop CPUs support DDR3 anymore. You can’t compare apples to apples because you’ll be using different CPUs, which adds more variables.
@@jamescaldwell6205 You can use LPDDR3 with some Z170 boards and DDR4 on other Z170 boards. The comparison would be more legit in that case.
Mikael Kab That would be true, using a. 6700k. I’m still not sure it would be fair though, because LPDDR3 may be slower than regular since its low power. I do want to see that test though
Here I'm sitting, rocking an i5-4670K @ 4.4GHz with just 4x4GB of 1600MHz DDR3.
Maybe this is a dumb question, but DDR3 had way fewer MHz. So does that mean every DDR3 CPU is bottlenecked by its RAM? Or did they just not need that many MHz back then?
@@giuseppearpino8657 K Thanks
Also when it comes to Bottlenecking it 100% depends on your workload. Gaming wise you will not get bottlenecked by Ram, it can affect performance by a small % but it won’t bottleneck you in almost all cases.
I've got CL14 3,000mhz. However I was able to bump it to 3200 without touching timings. Otherwise, surprised it scales like this at 1080p. But like you said, fast timing 3,000-3,400 is just fine.
Great video Steve. I've pestered you for this test for a while now so I thank you.
I'm very happy with my 4000MHz C17 kit for the low price I got it for. I'm going to push closer to 1.5V and hopefully drop to C16 or lower.
Can't wait for 6 GHz CPUs paired with 6 GHz DDR5 RAM 🤘🏻😁
Well, 5ghz CPUs and 5ghz RAM is already possible
I believe 6ghz memory was already reached and cpus go to like 7-8ghz on ln2
I love your videos so much .
Best tech channel for me ♡
I have an i9-9900K and DDR4-3200MHz RAM, but the i9 only supports 2666 at stock. How can I activate the full speed of my RAM?
Maenjuu_DBL find XMP profile and make sure the ram voltage is set correctly to what the box or ram was designed for.
Faster RAM always gives a performance improvement; that's not the point.
Testing RAM without considering latencies says basically NOTHING AT ALL. Today a dual-channel 3200 CAS14 kit usually performs better than or equal to a 3600 CAS16 one.
Now even if we get 4000MHz RAM with a good CAS latency, we've probably wasted $200 for a 2% performance improvement.
Great video. As far as I know, you're among one of the few tech press that actually do great tests on RAM usage, and frame rates. I always thought the upper RAM speeds were neglected in in-depth reviews. Thank-you.
I kinda think that between you and Gamers Nexus, both of the channels put out the overall better content.
Informative, and a lot of work goes into it. Nice to find both channels; I have learned a bit by watching, thanks guys.
As for most of the rest I have ran across? Meah.
So many bullshit babbling memes, brought to YouTube by Linus and crowd.
Whoops, did I say that on the internets?
Sorry..................
Fast [insert any computer part here] will never be a "waste". Not to mention, there are already comparisons in games like Fallout 4 where memory speed can create changes of up to 15fps.
Memory speed is related to the bandwidth/instructions a CPU can read at once when performing cycles, which is very helpful on many occasions. Battle royale games and FPS games with intense/diverse animations occurring at the same time can bottleneck your CPU and increase input lag even at higher frame rates, depending on how the engine prioritizes its resources. Higher bandwidth minimizes these effects and can give you a better experience.
What happened to the previous video, you know... the "2018 worst games" one?
twitter.com/HardwareUnboxed/status/1079699865028837376
This "bug" also happened to 2klicksphilip; he released a 4K premiere video, "Man VS Skyrim - Legendary Difficulty", on 31.12.2018.
It was stuck at 360p for a long time, until the next day. Maybe YouTube traffic during the New Year event? @@Hardwareunboxed
But what about the rgb... ?
Will this RAM work with 9700k? (With turbo mode and XMP profile on).
I understand you were trying to demonstrate the difference in memory speed and thus used CL17 for everything, but I'd like to see what this looks like with different CL ratings. When buying DDR4, CL17 isn't my only choice, and fast memory often comes with tighter timings.
I currently have some 3200 CL14 I bought before prices went crazy. I recently picked up some 3733 CL17 memory as it was priced nicely, about $30 less than the 3200 CL14 kit I was going to buy. Probably won't be much of a difference, and for the price I'm content. Still, it would be nice to see how they actually compare. Update: turns out it doesn't compare, as it won't run at its rated XMP settings; 3400 was the best I could do at stock voltages.
Hi. Great content as always. I have a 16GB kit running at 3200MHz CL14. I can run it at 3600MHz CL16. Would that give me better frame times in FPS games? BTW I have an 8600K running 5.2GHz @ 1.35V.
Should be the 3600, but you can try it out in a few game benchmarks.
flashkillpro I can but the games I’m interested in are BF V and COD ww2. I have no idea how to benchmark those games consistently to establish comparative results.
@@HB-ws6ge 3600C16 is better out of those two. Almost identical latency but 3600 has higher bandwidth. I bet you could also run 3900C16 @ 1.5V DDR, 1.3V VCCSA and 1.2 VCCIO
VCCSA shouldn't be over 1.25; there's usually no point, and it just hurts the IMC. Either way though, go for as much frequency as you can. Unless it's between, say, 3200 C16 and 3333 C19... that's a bad swap 😂
@@AntzuPC I may just try with my kit, I run my 3600 kit at 3733 and 16-17-17-34, 1.4v. gotta try VCCSA and VCCIO
For the first time on this channel I would say: useless test. It needs different timings (the same timing numbers mean a linear decrease in delay and an absolutely linear increase in memory bandwidth). The result is absolutely predictable and does not describe the real situation.
Also it would be very interesting if you tested the 9900K with 4000 CL17 memory at stock and overclocked ring clocks (would there be a difference or not, and how big would it be? A short but very interesting test).
@GaHHuKoB
Clocking the cores/NB will surely help. I think 55GB/s read is a bit low, partially because of the stock NB/cores and the default timings. With tight sub-timings and an overclocked CPU, 60GB/s read should be more than doable, and even more with tighter main timings or 4133/4200, but I assume that will be a challenge for the Gigabyte board.
@GaHHuKoB and yet the take away using fixed timings is memory speed doesn't really matter. I fail to see how the test is useless. Don't fixate on frequency if that's the issue for you, focus on bandwidth, call DDR4-4000, 55 GB/s instead, DDR4-3800 53 GB/s and so on.
If you have CL14 DDR4-3200 memory hitting 50 GB/s~ then that will be comparable to DDR4-3600 CL17 @ 50~51 GB/s as I said in the video!
@@Hardwareunboxed the point is that the takeaway was obvious and expected. No one would argue that faster memory at the cost of latency is worth it. What we do often argue is that the real sweet spot is where real latency is lowest. Which is obtained roughly by taking the primary timing and dividing by frequency, result in nanoseconds. Obviously if subtimings suck that complicates the result, so we also tend to recommend tighter subtimings to the primary timing. Going for something like 3466 at the right timings is absolutely worth trying. Even 3600 can be worth it for two particular g.skill kits (one single rank and one double rank) I am aware of. I've done the math, that's the absolute sweet spot on price to performance.
@@Hardwareunboxed how are you calculating bandwidth there? DDR4 was designed for high cas latencies and heavily pipelines to avoid bandwidth reducing stalls.
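To put the GB/s figures in this thread next to the theoretical DDR4 ceiling, here is a quick sketch (an illustration of the standard peak-bandwidth formula, not how the video measured anything; the function name and defaults are assumptions):

```python
# Theoretical peak DDR4 bandwidth: data rate (MT/s) times a 64-bit (8-byte)
# bus times the channel count. Measured AIDA64-style reads land below this.

def peak_bandwidth_gb_s(data_rate_mt_s: int, channels: int = 2,
                        bus_bytes: int = 8) -> float:
    """MT/s * bus width in bytes * channels, in decimal GB/s."""
    return data_rate_mt_s * 1e6 * bus_bytes * channels / 1e9

for rate in (2400, 3200, 3600, 4000):
    print(f"DDR4-{rate} dual channel: {peak_bandwidth_gb_s(rate):.1f} GB/s peak")
# DDR4-4000 dual channel -> 64.0 GB/s theoretical peak, so the ~55 GB/s
# read quoted above is on the order of 85% of the ceiling.
```

By the same formula, the ~50 GB/s quoted for DDR4-3200 sits similarly below its 51.2 GB/s peak.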
Holy shit, finally someone showing actually legit numbers. Like in Hitman, where the jump from 2400 to 3200 is 20 FPS; depending on your setup, that could be HUGE. And yet everyone acts like mem speed doesn't matter.
I would like to see a benchmark of 2x8GB @ 3200MHz kits comparing different latencies (CL16, CL15, and CL14) to see if it makes any difference (especially in games). It's another variable we struggle with when purchasing a new computer.
16GB DDR4 3000MHz is the minimum for high-end gaming. End of.
true
No it's 2666mhz
With 16 gb ddr4 3000 being around 110 dollars it is definitely the best value
@@samleuning Is this US pricing again? Lucky bastards XD
@@HandleIsNewAndBad Yes
cl 15 3000 is almost as good as cl 17 4000 when you overclock the 3000 to 3200 and way cheaper. change my mind Steve.
Did you watch the video? I think we are on a similar page 😂
@@Hardwareunboxed yeah i baited you with my shit troll lol have a great new year man love your content.
I did exactly this: 3000 CL15 OC'd to 3200 with the same timings, Corsair Vengeance RGB.
Happy New Year Hardware Unboxed. God bless and thank you for the amazing content. Cheers from Bogota -Colombia
I noticed the WD-40 on the shelf. Is that to deal with every gamer's bane - squeaky gaming chair syndrome? lol
Happy new year from Argentina!
Steve, great content as usual, but I'm still missing a more in-depth video, not about frequency and timings (we have enough of those), but about the actual impact of running single channel vs dual channel on mainstream platforms. Please check it out!
Didn't get all the testing specs. Why not compare with quad-channel and different brands of memory?
On average, the biggest difference was actually in Hitman and Tomb Raider. Battlefield V actually had the smallest impact from clock speed so it would have been nice to see the most CPU demanding title which is Hitman be tested at 1440p instead of Battlefield but I think it pretty much gets the point across.
Guys, remember the most important thing: your hardware must be balanced (for gaming).
If you have a powerful GPU with a slow CPU, then it's not. A fast CPU with a slow GPU is cool, but it's unbalanced too.
For example, you don't really need anything faster than an R5 2600X for up to a 1070 Ti; after that, a 9600K for up to a 2080 Ti.
As for memory, you don't need anything faster than 3200MHz for up to a GTX 1080, for example; memory is not a bottleneck there.
But if you have a top-tier GPU and CPU, then yes, the faster the memory the better, up to 3600, as tested by the much respected Hardware Unboxed.
I want to buy G.Skill Trident Z RGB, but I hear a lot about G.Skill incompatibility with Ryzen 1000, which is why they released the GTZRX for AMD. In my country I can buy GTZR CL16 or GTZRX CL16, and the price is a lot higher for the X version for AMD. Did Ryzen 2000 fix the issue with G.Skill RAM? I have a Ryzen 5 2400G and an MSI B450M Mortar. My question is, would I be able to run GTZR CL16 at 3200MHz stable? I know most 16GB kits (2x8GB) are single-rank, but not all of them are Samsung B-die.
The RAM kits are: F4-3200C16D-16GTZRX PC4-25600 for the AMD version and F4-3000C16D-16GTZR PC4-24000 for the Intel version.
Happy New Year, Steve and Tim ... love you guys; Happiness and health for 2019
Here's to 500k subs.
Would the Ryzen APUs benefit from such speeds and latencies?
Hardware Unboxed , Steve and Tim , Happy New year!🙋📅
Great video! Can you do this with the next generation of AMD Ryzen CPUs?
I really wish someone had the budget to test all the high-end RAM and see what the best kit on the market is - the best MHz and CL combination they can find, etc.
It's funny, the only reason I went with a 3600 kit in the end was a crazy sale during Newegg's holiday sales.
32GB DDR4 3600. The timings were decent, too.
I'm tempted to spend time tightening those though, whew.
While single-PC streaming, would faster RAM give me more headroom for the game's minimum FPS?
I've seen a few boards now that support in-between steps, but routers and ISPs jump from 1 to 10 gigabit with nothing in between.
Recently upgraded to a 9900K but just remembered that my RAM is 3200 CL16. For a second there I thought my RAM was no longer in the sweet spot, but it seems like I'm still in the green moving up to 1440p :D
Thanks Steve! + thanks HUB!
Upgraded my memory from 3200C16 to 4000C16 B-die and I gained about 30 fps. Delidded my 9900K and ran it at 5GHz, gained about 10 fps more. It runs really well with a 3080
Anyone else like how UA-cam subtitles always say harbor unbox? Dat Aussie accent 😂😂
My old Phenom II 965BE @4GHz with 8GB DDR3 1600MHz CL9 and an R9 290, all on a Foxconn mobo, runs Battlefield V at around 35FPS on average @2560x1440.
On some smaller maps I get up to a consistent 50FPS with a moderate to high amount going on in the game. On big maps it does suffer some stuttering, but it's definitely playable.
I did have memory issues when I first tried running the game, but after loosening a few timings it ran fine; every other game I play runs well with tighter timings.
I have an 8700k at 5GHz and an RTX 2080, and I was getting huge lag spikes running BFV at 1440p Ultra with DXR at medium. Otherwise, frame rates were good at 65-80fps, but these huge freeze spikes were ruining my game. After some troubleshooting and nothing working I settled on upgrading my 2400MHz RAM to Trident Z 3200MHz. Problem fixed. So it's not just average frame rates you stand to lose when going for slow RAM - it was really fucking up BFV on my rig.
I got 4000MHz RAM recently and it won't even boot unless I set it to 3600MHz. Anything above even 3700 will cause the PC to not boot into Windows, no matter how much I've messed with voltages
Heavily depends on the game code. Test QC, which is very sensitive to memory in benchmarking; that's how game code should be written.
I think this just shows, even more than a year later, that a "moderate" clock speed of ~3200 to ~3400 combined with a low latency is the best choice. So, if you have the choice between higher-latency DDR4-4000 and lower-latency DDR4-3400 or 3200, go with the lower latency.
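The frequency-versus-CL tradeoff this comment describes can be sanity-checked with simple arithmetic: first-word latency in nanoseconds is the CL cycle count divided by the memory clock, and the clock is half the DDR transfer rate. A quick sketch (the example kits below are illustrative, not kits tested in the video):

```python
def true_latency_ns(cl, mts):
    """First-word latency in ns: CL cycles at half the DDR transfer rate."""
    return 2000 * cl / mts

# Illustrative kits: lower ns = faster first access
for label, cl, mts in [("DDR4-3200 CL14", 14, 3200),
                       ("DDR4-3400 CL16", 16, 3400),
                       ("DDR4-4000 CL19", 19, 4000)]:
    print(f"{label}: {true_latency_ns(cl, mts):.2f} ns")
```

By this measure DDR4-3200 CL14 (~8.75 ns) actually edges out DDR4-4000 CL19 (~9.5 ns) on access latency, though the faster kit still wins on raw bandwidth.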
Please do a memory comparison video testing performance differences between single-rank and dual-rank memory. A clear explanation of the difference would be nice as well. This is one of the murkier topics I've found when it comes to memory. Would be nice to see results on both Intel and AMD platforms. Keep up the excellent work! You and GN Steve are my favorite tech UA-camrs.
I have been watching your videos for about 1-2 months. Interesting benchmark. Keep up the good work. Subscribed.
Got much better results with the 9600K and 3600MHz 16-18-18-36 RAM: 40+ FPS in Far Cry 5 & GTA V. The average increase was in the 20-30 FPS range. This is at 1080p, where the CPU is most critical. My 2070 is now 10-20 FPS faster than a 2080 in the same system using 3000MHz Corsair Vengeance LPX RAM.
I'd still be interested in a short video comparing timings at a given speed, just to have the numbers. Maybe just test 3200 CL14-18. Or perhaps a tech video on the relationship between timings and clocks would make more sense.
Keep in mind that with fewer cores to feed, less bandwidth should be needed (if the application is well threaded). I'd be curious to see the scaling of an 8700K in BF V, for example, in comparison to the 9900K; I'd expect less bandwidth is needed before diminishing returns set in.
Might be a stupid question, but if you compare clock speed against extra RAM, say 16GB 3400MHz (or higher) vs 32GB 2666MHz, what would be recommended there?
@Hardware Unboxed
I have a couple of questions. Did you guys run the ''gaming performance'' tests with in-game benchmarks or with real-time gameplay, i.e. realistic gaming performance? I assume those little gameplay scenes are used for that? I'd be happy if you shared your thoughts on in-game ''benchmarks vs gameplay'', not only for CPU/subsystem performance but also GPU performance. Testing games these days is ''trivial'', but IMO only a few channels/review outlets actually do proper and useful testing.
AIDA64 results - isn't 55GB/s a bit too low, or is it fine since the cores/NB are untouched, and the timings too? Can you share the results from the test? They're not really visible in the little off-screen shot on the left.
The feedback will be greatly appreciated!
Ekind
Shadow of the Tomb Raider and Battlefield V were tested in-game (BFV obviously was). 55GB is what you will get with this kit out of the box with a stock 9900K.
@@Hardwareunboxed Thanks for the feedback.
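For anyone wondering whether that 55GB/s AIDA64 read figure is plausible: theoretical peak bandwidth is the transfer rate times 8 bytes per 64-bit channel times the channel count. A rough check, assuming dual-channel DDR4-4000 (the helper function is mine, not from the video):

```python
def peak_bandwidth_gbs(mts, channels=2, bus_bytes=8):
    """Theoretical peak in GB/s: transfers/s x bytes per transfer x channels."""
    return mts * 1e6 * bus_bytes * channels / 1e9

peak = peak_bandwidth_gbs(4000)  # dual-channel DDR4-4000
print(f"peak {peak:.0f} GB/s; 55 GB/s measured -> {55 / peak:.0%} efficiency")
```

That gives a 64GB/s ceiling, so 55GB/s is roughly 86% efficiency - a normal real-world result, not a sign of anything being misconfigured.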
lol at bird dying effects
You guys should see how the scaling works with iGPUs. Dual-channel 4000MHz memory would probably scale way better than 3200 for an iGPU being used as a placeholder by those saving up for a dedicated graphics card. How far could overclocking a 2400G with dual-channel 4000MHz DDR4 get you in gaming?
It won't run it, so black screen? 100000 fps black screen, or 0 fps black screen, just depends on your perspective.
Sooo, how should I set my Dominator 2x8 2400 CL10? Right now I've set it at CAS13 and 3200MHz, 1.41V (cooling is no problem, -10°C outside atm where I live 🇸🇪🇸🇪🇸🇪❄️❄️☃️). Anyone know how far the Dominators can be pushed?
6:50 wouldn’t CL14 3200 match a 3600 high CL timing ram?
He said exactly that in the video.
So that means CL14 3200 is better because of the timings?
The hard work you put into anything you do for the channel really amazes me. God bless you and Tim! Happy New Year from the PHL!
Happy New year Steve and Tim. Keep up the good work
With the increasing number of cores and the likelihood of running a separate monitor for your Discord, email, and YouTube apps, would faster RAM in a system have a greater impact on game performance? That said, you showed the CL14 approximation of bandwidth compared to CL17 RAM in your video. What about tRAS latency? I've seen hints from other, less thorough testers than yourself that running tRAS down to 28 from 35 has quite an impact on frame performance.
Is it possible to run a 3200MHz experiment and test the reduction in timings instead of just speed? AMD Ryzen, as previously mentioned, is affected by speed and timings much more than Intel CPUs; perhaps with the purported upcoming releases we could see if they've "fixed" this latency?
You do great work!
@Giovanni P.
Intel is also greatly affected by sub-timing optimization, just like Ryzen. By what amount depends on the memory kit, motherboard, etc. I don't know why this myth about Intel systems not benefiting from RAM tuning is so widespread. Doesn't make much sense to me.
Guess I should be very happy with the 32GB of 3200 CL14 DDR4 I got 2.5 years ago for $185.
Just wish I would have gotten 64GB back then, as it's much more expensive now (and even more so just a few months back)
I have 4266 trident RAM and cannot get it stable with a i9-9900k lol
You know... I have one 8GB stick because I planned to upgrade to 2x8GB.
My questions:
1. Does using a different memory brand affect compatibility and performance? I'm prepared to run at the speed of the lowest stick, but will it drop below even that?
2. Will single-rank and dual-rank sticks affect compatibility?
3. I want to ask about 4+8GB, but you answered it in your previous video, nice one :D
It's always recommended to buy a kit of RAM, like 2x8, rather than buying a single stick and adding one later, because that can cause compatibility problems even if it's the same brand and product number. I definitely recommend going 2x8 as it can yield a lot of performance ua-cam.com/video/9w1U-kForAU/v-deo.html
Umm, don't you need a specific motherboard for this? Would be nice to know what gear you're working with
Is there a big performance difference between DDR4-3600 CL16 16-16-36 and CL16 19-19-39? 🤔 Or is there even any difference at all regarding gaming in 1440p?
The problem with fast ram is you're always better off spending the extra money on a better gpu or cpu, so unless you already have the best of those, it's a waste of money.