when DDR4 was relatively new, you could get great gains even going from 2133 to 2666 MHz in certain games. But the small L3 cache of CPUs at the time was probably one factor
The cache is a MAJOR factor. This can be seen in the 5800X-3D CPU, where it simply doesn't care what quality DDR4 its paired with. The data it requires is almost always in internal large 3D cache. This will be replicated in the future 7000X-3D CPUs as well. Memory Speed and CAS latency are definitely going to become less important as CPU cache get larger and larger.
Brought my 7600k back to life for a while by upgrading from 16gb of 2400 to 32gb of 3200. But in the end it was just part of the upgrade path and not an upgrade I can recommend.
They appear to have made a real effort to bring memory guys on board, EXPO might be better than XMP which was limited timing adjustment. In late July/early August DDR4/5 prices were closer, it surprised me. I wouldn't want to buy my DDR4 again as it's risen considerably. OEMs pay bulk contract prices so I expect pre-builds with 7x00x will be more competitive than early adopter DIY, with the new release price premium.
@@Neggy-Z Micron more like Poopron when it comes to DDR5 And according to Buildzoid, pretty much anything at 5600MT/s and above is pretty much guaranteed to have Samsung dies (reliably OC to 6000+) or SK Hynix (very easy 6000+ at good timings). And looking at some 5600 kits being pretty inexpensive already, it may not be the worst of ideas to pay a bit more for much better stuff
I think it would be interesting to revisit this when the new Ryzen CPUs come out not so much as a comparison against DDR4 but as a comparison between Intel and AMD with the advantage that you already have the Intel data as a baseline so if you use the same memory modules and test suite it would save you a lot of time.
I really like your idea, only problem is that if the modules that he has don't have AMD's new EXPO, then it might not work. More than likely he would have to buy some new DIMMs that have both Intel XMP and AMDs EXPO capability onboard before he tests.
This will be a lot more interesting with the upcoming 40series GPU esp if it turns out to be more than 50% faster. The same thing has happened in the past where in the earlier days of DDR4 speed didn't make that much of a difference, then a couple gpu generations later jdec tier ram became a bottleneck
@@NESW2000 plenty have been holding back from upgrades since around 2019 and the pricings of tech since then ddr5 is gunna be worth it for plenty of people for now
@@NESW2000 most of the leaked data is suggesting that the high end is going to be double the performance of the current gen. It's a node jump for both companies, both making architectural improvements, both increasing the snot out of the power draw.
@@brendago4505 'leaked' data is infamously unreliable. I'll believe it when I see it but if it is no one will be able to get one again as they will all sell out
@@tonypeperoni5818 i was holding back from upgrades since 2012......, If Really Nvidia and Intel consume a lot of Power (ill see reviews first) then ill go AMD and full 4K (im not only play, but do Photo/video editing and digital paint), 2k or 1080p with high Hz are so fuking trash even in 2022...
Makes sense, so if I were to build a new system right now I'd definitely go DDR5. But here's my dilemma, I'm currently on an i7-12700K with 32GB (16GB x 2) DDR4 3600Mhz CL16 and a 4080, is it worth it to swap out my motherboard and RAM just to upgrade to DDR5? I game @ 1440P. Side note, I would've loved to see the Spider-Man remastered results @ 1440P as well.
15:25 I actually upgraded to Alder Lake earlier this year and went DDR4 because I already had a perfectly good 64GB of 3600mhz 18CL RAM, and didn't make sense to pay the premium on new RAM for marginal gains when I already have a perfectly good kit of RAM.
@@provisionalhypothesis I need a system that can render my videos at a reasonable speed so I can get to work on the next one. It's not "obsessing", this is how I make my income.
I've recently watched BZ's RAM timing series where he explains some of them, and which timings are relevant to which operations. Would you be interested in testing which timing effect which games in your testing suite? RAM is usually presented in reviews with only with transfer rate and tCL or primary timings but we never see things like tRRDS/L, tFAW, tRDRD_sg/dd and command rate, which are far more relevant than the primaries for data that is scattered between different banks/bank groups. Considering the advantages of dual-rank memory over single rank in gaming I reckon those timings could prove more important than the primaries. This is something talked about by XOCers, tuners, and random people in forums, but I couldn't find any properly standardized testing as you or GN do.
@@NVMDSTEvil There's a few out there but they they are in Russian and Chinese. If you are familiar with overclocking memory for tests like Y-Cruncher, PYPrime, Geekbench, Super Pi and Time Spy CPU they are worth a watch but of course there's still points that I have no idea what point they are trying to get across
Exactly. 4 x 8gb D4 at 3200c14 is a poor way to test against D5 imo. I can easily run daily stable DR 2 x 16gb Samsung B-Die at 4133c14 (12600K) with 2nds and terts maxxed ( with good high bin G.Skills) but testing at say 4133c15 1.5v should works fine even for a 3200c14 bin kit. 3200 is just too slow to be put up against D5 especially since 12xxx has a much better IMC(s) than the previous gen Rocket Lake that fully maxxed at 4000 for the best and 3866 for most cpu's.
Gamers Nexus did a couple videos comparing XMP vs tuned memory performance around the time that the i5-10600 and Ryzen 5 3600 launched. The problem is that it takes a lot of time to do (as anyone who does mem OCing would know, with all the tweaking, stability testing, rebooting, etc. involved), and on top of that it is a niche that isn't going to get a lot of clicks/views/engagement, except with the hyper-nerds. So for the reviewers, it's a big time and effort investment for low returns. Hopefully they do another one for DDR5 after Zen4/Raptor Lake launch, but I wouldn't hold my breath.
It's tempting. But, I think I'll give some extra life to my current rig. New CPU and maybe a GPU. Thing is far from dead. No point ditching it for a new platform.
I agree. If your a heavy gamer or your earnings are derived from gaming....YES upgrade. Otherwise stay with DDR4 and Zen 2 or 3 or currently intel. Myself, I'm not a heavy gamer, I have a ryzen 3700x and GTX 1660 super and I don't plan to upgrade. Let prices stabilize, DDR4 market diminishes and then build a new computer. I'm looking at 2-3 years.
With the upcoming price of AM5 motherboards looking to be double to triple the price of equivalent B550 motherboards, I can't see any reason to upgrade platform unless you hate money or are fine with no cpu upgrades after 13th gen on the board. Still feels like it's more cost effective now to just buy the fastest gpu you can, increase resolution and keep your cpu as irrelevant to performance as possible.
@@mick7727 Yeah I have a AX370 motherboard. I just updated the BIOS and bought a 5800x3d for $225, ran a -28 all core CO offset and it runs cool no problems with a single tower air cooler.
Thanks for the timely video, Steve. That's something that's good to understand when leading up to the launch of Ryzen 7000. I already saw 2x16GB 5600CL36 available for around $150, and although price is now back up a little I suppose it will be drop back. The price also doesn't seem too bad considering that's about what I paid for 2x4GB DDR4 in 2018. Hopefully when Ryzen 7000 launches you'll do a video comparing some memory speeds. I rather expect you to. You're good like that. :)
I think we got two scenarios for ddr5. Either the new gpus will enable higher performance, or new cpus carrying way more cache will reduce how effective ddr5 is
I'm looking forward to seeing the next generation of "high end" APUs from AMD combined with DDR5. It's obviously not going to compete with discrete cards (unless they're extremely bad), but the ones in the 5000 series seem to be both memory and compute limited.
@@MrMartinSchou Yeah, unfortunately I don't think we are gonna see anything different in the apu space until at least Zen 5 or Zen 6. Just the same 8cu garbage or whatever they do
It would have been nice, if you’ve added one or two benchmarks for non-gaming scenario compiling, 7zip, cinebench for example. In another video I’ve seen, that compression benchmark in 7zip had a 30% boost with ddr5.
This is really awesome content considering the buyer's advice really was to stay away from DDR5 because of the price across the board. Now when I'm considering buying a new computer from scratch without having any DDR4 laying around, going for DDR5 is actually something to seriously consider because it'll still be viable a couple of generations down the road.
i think amd doing the opposite thing of the am4 here, seems like they released their most powerful cpus first then release the efficient ones later, just a guess
Well with how fast GPU tech is advancing you'll pretty much be replacing your motherboard every 3-4 years anyway, if you don't want to be severely bottlenecked.
@@rokaspleckaitis566 They're a half step above the Linus bots. You can get quality b-die for the same price as this garbage tier DDR5, and every K sku AlderLake chip can easily do 3866-4200MHz DDR4 with good timings. Even if you don't want to tune timings, 3200MHz is straight clowning in 2022.
With the speculated 4x nvidia benchs, upcoming cpus and ddr5, we might witness one of the biggest performance leaps we've seen over the years hopefully
This is exactly the kind of testing I was looking for. Well done Steve. Can we have a test at normal use cases? Eg. High quality settings on a mid level GPU
Its starting to make sense to look at DDR5 for new builds. I have a ton of fast DDR4 on hand and carried it to a D4 12700k build when ADL released. No regrets. Anyone who is looking to build from scratch and use all new parts is better holding off and seeing how 13th gen and AM5 shake out in the next 6 months IMO.
Glad to see that even entry level DDR5 is pretty good since I'm planning on upgrading to a 7000 series CPU next year. I imagine with the release getting fairly close, the ramp up in DDR5 production will improve both performance and price towards the end of this year.
Nice day for fishing :) . Really liking them figurines. As for GPU limited scenarios: dropping the quality settings alone is only going to get rid of the shaders related bottlenecks. You may still end up limited by the ROPs of the GPU. To get rid of that you need to drop the resolution as well - 1080p is still too high.
DDR4 platform with Samsung B-Die, dual rank and tuned (i.e. 4000mhz cl15 + adjusted timmings) is still faster for gaming than any DDR5 platform at the same price. Period.
I understand the point of using such a high end test bench is to eliminate any bottlenecks other than the specific hardware being tested. That being said, I would still love to see a test like this run with more entry level hardware, so I could see what improvements I could expect with my system.
Thanks for a great comparison! Just as a suggestion. It would be really cool to see performance difference on the titles which are old, but are highly CPU and RAM dependent, such as ARMA3 and DCS: World...
Get back to Ryzen 7000 testing Steve since you aren't blasting noobs on Fortnite. 😛 Honestly can't wait to see the results of it and testing RAM like this is gonna be essential for building your new AM5 PC, so thanks for this. 😎👍
Yeah the wait is dragging, Steve should be Masterizing noobs on Fortnite WITH a 7600x! 😁😆 There's tease videos out that just repeat launch uncertainties, those sampled will have a pretty good idea already on those RAM speed questions.
@@chitorunya Yeah, but sometimes there's clues, I saw a video with a Ryzen 7000 in mobo, discussing memory and coolers. My issue is the guy is repeating speculation when he must be able to know the answers .. it's a click baity tactic. He used to work for a channel that earned a reputation for unreliable FUD about AMD. He read out Robert Hallock's discord statement but then ignored it, talking about settings for Zen2/Zen3 memory overclocking although the info was to use AUTO:1:1. He has found a way to make video and hide behind the embargo, as he is simply repeating public questions asked at the launch and the days after.
"blasting noobs on Fortnite". Fortnite is a lame game for lame wannabe-gamers, wannabe-people, mental noobs who can't even spell their own name without searching it on Google with their smartphones.
it seems to me that ddr5 can give great benefits in performance in games when playing in 1080p, because you are not gpu limited. But when you play at 1440p or higher, you become more gpu limited, so does ddr4 or ddr5 speeds matter in those cases?
Pretty clearly it doesn't. It's the same as spending a bunch of cash on anything higher than a 5600 CPU at 1440p or higher...it's a total waste for gaming.
I think it is unbelievably stupid for people with 1.5k+ systems playing on 1080p in the first place to be honest, your system at that price point and above is so much more capable and you are spending thousands for numbers that really are unnecessary lol I’m sure 100fps at 1440p is still more then smooth to enjoy a game, rather than double that at 1080p, with a system that costs thousands being wasted on a low resolution
@@connorosullivan3500 It's primarily the esports players that play at 1080P and lowest visual settings to eke out every small advantage they can get from lower latency.
I'm still holding off, CPU / RAM isn't what's holding me back currently, next up will be a midrange next gen GPU. 2024 is when I think i'll do a platform jump, would like to go from 64GB to 128GB when I do it. For my day to day, memory capacity and thread count are where i'm hitting limits.
@@spacechannelfiver smart move. Way to early to jump. Price needs to stabilize and supple needs to increase. If you just built a computer within the last two years, certainly not a time to upgrade. You typically want to hold on a rig for 4 years until you upgrade all components.
@@silas232003 this is the thing people often miss about PC, you get them originally at a certain upfront cost; but can just keep patching them for years and years. I originally built my desktop in 2008 and it's nominally the same computer now, although not sure any original components remain at this point. maybe a hard disk or fan or something. Edit: if GPU or Platform price spikes then you can put your annual budget into a new monitor, or PSU, or improve peripherals.
This, definitely. I thought I would se a 4000Mhz CL16 kit at least in there, of course with hand tuned timings, XMP is worthless. It would probably kick ass.
@@rokaspleckaitis566 No doubt but how many people have the knowledge or interest in tuning memory, most people just want to install and enable xmp and start playing.
A few months ago in the no sleep zone I said I was dissapointed that your first DDR5 test was with 4800 instead of 6400. I see now that it doesn't matter that much so I stand corrected.
Worth noting that if you get 2 sticks of DDR5, do not expect to easily upgrade to 4 sticks later at some point. You'll have to drop bus speed by a fair amount.
I would like to see the differences when the APU's come out, I am not interested in anything but APU's going forward so I hope DD5 ram makes a difference.
@@JohnDoe-nh7vx bad info, on apu almost no difference between 16 and 32gb. On normal high end CPU + high end GPU it can be 5-6% difference in gaming AT MOST. Sounds like money well spent yea? +5% performance for 200% cost?
Misleading title. 'Cheap' DDR5 is more expensive than quality DDR4. And the performance gains in most games are next to zero. Most consumers already own DDR4 kits, so it isnt $50 DDR4 vs $90 DDR5, its $0 vs $90, for like 1% more performance on average. Absolutely not worth it.
Cheers mate for the vid. This has helped me make a decision- update to the very latest specs or save cash in the short term. I guess I am better off spending the pounds in the interest of longevity.
The ddr5 mobo's are another part of the catch 22 and not just when it comes to the pricing, there has been a bunch of horror stories on the topics of compatibility and stability in overclocking across all boared from all the big name manufacturers, I probably went through probably 20 pages on a single Reddit thread trying to do research before I chose my new upgrade platform after my 1st gen Tydens system called it quits, and it was filled with complaints and frustrations of people who were asking for help troubleshooting and giving up stating that they'll RMA they're ram modules or boards, and this was a 3 month old thread over multiple bios updates
Any new tech have horror stories.. like when RTX 2000 released & people was spreading stories about there was a bad batch ..etc But for now, DDR5 been like 1 year out, so it should be stable overall (Intel already was the beta taster Lol) Plus, people levels are not the same, God who knows what they were doing wrong , wither it's hardware error or software compatibility/programs error.. so don't judge from a reddit pages, judge on the trusted UA-camrs results (like this channel)
Hey steve great testing as always, but would have been nice to throw in some cpu testing in there as well. Plus maybe some "realistic" tests, maybe with a 3070/80 in 1440p
ACC is quite CPU bound when playing with the ai. The ai are subject to the exact same physics as the user, so you can get huge frame rate gains just reducing a couple ai cars. I went from a 1080ti to a 3080 on a 8700k system and saw almost no frame rate improvement on max visual settings. Took 4 cars out the pack and had a boost from 70fps to 90fps. It's a bit of a mad one tbh
ACC is also peculiar in that it absolutely loves L3 cache (likely due to running all those physics calculations for all the cars) at present the AMD 5800X3D is massively better for it than any other CPU.
I'd love to see more of these tests done with 1440p and 4k resolutions. Sure it makes the graph differences smaller in regards to avg. or max fps, but you can compare minimum fps performance and frametimes' stability.
Since you are already GPU bound in 1080P on some games, what point makes 1440p or 4k? There will be no difference between the worst DDR5 and the best/most expensive one unless you will have a 4090ti o.O
@@tjintell Validation. To demonstrate that what you think is true is in fact true. That's why the Nvidia driver overhead benchmarks were so shocking - pretty much nobody tests that way and the scenarios where the CPU was bottlenecking the GPU were with combination you would normally assume are fine.
I was planning to ask it in Q&A but might as well try it here ;) Is there a way to determine if RAM is single or dual rank before buying it? Second question, are You plannig to make a video about DDR5 "sweetspot" frequency/timing for Zen4?
@@HazzyDevil I know how to check it after instaling in the system ;) My question was about checking it before buying it ;) But thanks for the answer anyway and I guess online I can always google it to check.
@DaKrawnik420 I was going to reply with this earlier, went around looking at the 4 kits of ddr4 i own and oddly enough not a single one of them mentions this, remember seeing the rank number all the time on ddr3.
Same here. Probably gonna be better to grab a 5950X and DDR4 on fire sale from a value-for-money perspective, especially if you've already bought DDR4 in anticipation, like I have.
A 3200CL14 kit is slower than a cheaper 3600CL16 kit. I understand that 3200mhz is more convenient for you since you test older CPUs that have trash IMCs like 1st and 2nd ryzen, but most new builders today are buying 3600MHz kits.
I upvoted because I certainly agree that B-die at 3200 MT/s is extravagant, but it's worth noting that locked Alder Lake CPUs may not be capable of going past 3200 MT/s, at least not without a manual (non-XMP) overclock, and at least not in Gear 1 mode. 3600 appears questionable even with a manual overclock. (The system agent voltage on non-K Alder Lake is locked at ~0.9 V.) FWIW, Buildzoid says that 3466 MT/s is a "safe" assumption on locked Alder Lake--and I managed to achieve that myself on my i7-12700 (non-K). But I couldn't get 3600 to work. Memory overclocking is not for the faint of heart, so it's probably best to stick with 3200 MT/s on these CPUs, if we're assuming the average user.
@@RedundancyDept I'm aware of that VCCSA lock. I suggest you try Comand Rate 2T if you haven't tried it. That might allow you to run 3600MHz in gear 1.
Would have been nice to see that 3200Cl14 kit overclocked to 3800 Cl14 or 15 because basically no one on the planet will buy samsung B-die to run them at stock speeds especially at 3200mhz since both Intel's and AMD's sweetspot is around 3600-4000Mhz.
DDR5 is still double the price where I live, so, no - it's *definitely* not time to leave DDR4 for everyone. Also, new AM5 motherboards START at about $150 USD (as quoted by AMD), so the price to get a decent AM5 setup is still quite expensive (including Alder- or Raptor Lake for - at least - the memory cost, too). Nice information otherwise.
Indeed all the tech channels never take into account that prices vary A LOT around the world. Still the graphs are enough for people to make an educated decision.
Protip: AVOID ALL 8gb DDR5 sticks!! They are x16 memory chips and are far worse than an equivalent 16gb kit (You can see this in the video!). The sweet spot is Corsair DDR5 5600 CL 36. You can get a 32gb kit for $170. I own this specific kit and it easily overclocked to 32-34-34-38 at 6000 MT/s (On Alderlake). It's a really great value because I get double the bandwidth (~96gb/s in AIDA) of DDR4 setups with a "similar" latency of around ~60ns. This is similar pricing to a good B-DIE bin of 3600 cl14, and is way better.
My main purpose is to play games and I game at 1080P. I don't really need more than 16 GB RAM. I recently got a Ryzen 5 7600X as a gift and I am making a build around it. What would you propose for me? It really bothers me to get more than 16 GB RAM knowing full well I'd never need it.
This makes my decision to wait for Zen 4 3DV look really good. It'll be even cheaper by then, and the 3D V-cache will make up for a lot of the latency issues.
If you are building a new DDR5 system you really should not be putting only 16GB of ram it it. Especially since DDR5 as a platform really struggles to run 4 sticks due to the high frequencies, so adding 2 more sticks later is not a good option. You should start with 2X16GB or you will regret it later.
Maybe I missed it but I'm curious about why you used 2 stick DDR5 kits vs 4 stick DDR4 kits. Also, when I went from 3200 CL18 to 3600 CL16 (and OC'd), the difference in games was minimal. But the difference in Davinci Resolve and After Effects was massive. Hoping for more of the same when I pick up a 7950x to replace my 3950x.
Most ACC sim-racing setups needs a lot of graphical horsepower. Firstly, because most sim-racers race on triple screen setups, VR or a large 4K TV, meaning there is a lot of pixels to "drive". Secondly, most enthusiasts use higher Hz monitors as they are not okay with just playable levels of FPS (e.g. 60 fps) since more FPS will mean you have an easier time hitting your braking point and will get timelier information of the cars trajectory in corners, which is really important because sim-racers can't rely on the "pants in the seat"-feeling of racing on track and therefore rely even more on visual ques than drivers in racecars. And VR is often 90 Hz displays so for VR sim-racers it is a must to hit that frame rate. A good amount of enthusiast youtube-sim-racers setups are three 144 Hz 1440p monitors and really high-end PC's (RTX 3080/3090 and 12900K/5950X) to get >100 fps on monitors with a combined 11 million pixels. Where 4K is about 8 million pixels for context. My own setup is a more modest triple 144 Hz 1080p setup, where I target >90 fps in-game because my old PC is having trouble keeping up with games like ACC at that resolution. Triple screen setups are normal for sim-racers, who are not running VR. The added screen real-estate means you can see more of your car and can spot other racers in the sideview or mirrors, at a realistic field of view from the cockpit-view, when racing side by side. Higher Hz displays are because when racing GT3 cars in ACC you are often traveling at average speed of about 180 kph (110 mph), and often doing close to 280 kph (170 mph) at the end of straights. Meaning you travel 75 m/s coming into the braking zone. At 60 Hz you are traveling 1,25 meters per frame. And at 144 Hz you are traveling 0,5 meters per frame. Therefore, higher fps can make it easier to hit you braking point, which sets up the corner, and better show the cars trajectory traveling towards the curbs. Hope this gives you a bit of context as you mentioned, Steve. :)
Good explanation, pro sim racers go further though and basically treat the game as a CS:GO player would but with triple screens (I use triple 240hz 1080p monitors)
Something else to consider would be the 7800X3D that will (presumably) function similar to its zen 3 counterpart. Maybe high quality memory won't be that important
As much as I want to upgrade to AM5, I think I am going to wait until next year when the 3D cache CPUs come out. I recently got a Samsung Neo G8 4K 240Hz HDR monitor, and I think my money would be better spent on a RTX 4090 than a new CPU platform.
I probably will buy both next year. Currently sitting here with the 5800x3D and a good 6800XT. This test for me was interesting because I have games where the CPU and the GPU are fully loaded + MP title &4k So average DDR 5 RAM will boost it even further
@@Vanadium And then Black Friday rolls around and I am able to get a Ryzen 7950x for $180, a AROCK RIPTIDE AM5 Motherboard for $149, and some Team T-Force Vulcan 32GB (2 x 16GB) 288-Pin PC RAM DDR5 5600 for $129, so it looks like I upgraded sooner than I thought I would.
It would not be quad channel, it's still only dual channel. But using 4 sticks means it's dual-rank per channel. Which in terms of number of banks to swap to, it's similar to the 2X16GB ddr5 kits. So yeah you are getting a performance boost there certainly.
ddr4 prices at least in the us are pretty darn cheap with some 2x8 4600mhz kits being like 116 usd. But ddr5 is defintely getting better and intresting to see how much some games seem to get huge boosts from it
@@aos32 Why is there always someone saying that? Wasn't long ago they'd say more than 4gb is useless. Things change, and not everyone with a pc uses it just to game. 64gb (or more) is totally useful for many people.
Not sure 4xDDR5 modules would make upgrade sense, 2x configs only are rated for high speed. Thanks for testing the cheap memory. I wondered about a cheap DDR5 memory strategy, planning to replace it later as game performance demands rise. If my X570 mobo or CPU dies a death out of warranty, I can at least consider moving to AM5.
@@kingalex2nd991 AMD released a slide suggesting a large reduction of bandwidth in quad dimm configs. Intel had similar DDR5 performance drop. Therefore until the embargo is lifted or a new breakthrough alters the situation, most will avoid the quad configuration
@@RobBCactive DDR5 has more bank groups them DDR4. A 2 DIMM DDR5 setup (That is actually proper DDR5, 16gb per DIMM) will have the same amount of bank groups as a quad-dimm DDR4 setup. So you get almost no benefit going from 2 DIMM to 4 DIMM (You actually get slower because it's more taxing on the memory controller and you get electrical interference!). Stick with 2 DIMMS and get the fattest sticks you can get (They make 32gb DIMMS at 6000 now!)
People, this video is just here showing you when you are building a new pc, that DDR-5 can be a solid choice to go with. This video is not saying change now. If you are making a new pc, the DDR-5 choice is something you can keep in mind. Also, there is no rush, you literally can wait when DDR-5 get more refined like literally every invention in history.
I drew that conclusion too, the DDR5 price fluctuates alot in DIY market though. OEMs should be ramping demand now, they won't be paying the high premiums on bulk contracts.
My take is there's no downside to DD5 now prices are coming down, it's the motherboard price that makes the difference. When the next 3D vcache CPUs and Raptor Lake arrive, hopefully boards are cheaper then as well. No idea how long AMD will support CPUs in the new boards, but I still believe it is longer than Intel.
In regards to ACC. I have seen improvements in performance by upgrading memory in the past. I only drive in VR. For flat screen gaming, as long as you have a half decent rig, you should have a good time, but for VR, you need all the performance gains you can find.
I'm very curious to see how this kit performs relative to faster speeds/timings with new Ryzen CPUs, since Ryzen has always been quite sensitive to MT/s
@@silasmayes7954 Why and to what extent? The population weirdness is something modern platforms are equipped to deal with. You populate one dual-channel pair with the same sticks, and another dual-channel pair with another two sticks, then you have each channel housing 12gb in a somewhat weird address and rank interleave, but if it works, it works, and the two channels are equal to each other, so it's not even hybrid dual channel, it's true dual channel. The overall timings are that of the worst of the 4 modules, plus some penalty for all 4 slots being populated, if the particular board thinks it needs that.
In driving simulators, people often run multiple monitors in order to be able to see cars that are passing. But at the same time , higher frame rates are generally considered to improve the players ability to react at high speeds. Competitive players will likely need higher framerates across triple monitors, but more casual players are more likely to care about visual effects.
Salty Intel fanboy detected.... Too bad your _team_ launechd Alder Lake when the memory was more expensive than the CPU. For me it is simple, if I was building today, I would not put Alder Lake on DDR4, because then you are locked to that unless you change out your motherboard. Sure DDR5 is not as cheap as we want, but it is so much less than it was when Alder Lake released that it makes sense now to put Alder/Raptor on DDR5 now. At least to me anyway.
DDR5 will most definetly be an improvement from DDR4 with CPUs that have lots of L3 cache and a good size of L2 cache. Once the prices drop and the AM and LGA platforms have dedicated spaces without any uncertainties in the way, DDR5 will be solid for everyone, no questions asked.
Honestly performs way better then expected especially due to having x16 chips This 2x8gb kit is availible in germany for under 60€ Budget ddr5 builds incoming with A620
Omg thanks for this, I been debunking the narrative that ddr4 is always the way to go cause ddr5 cost too much. If you what 32gb the cheap ddr5 sticks is the way to go.
I hope you can get your hands on the new Hynix A-die. And see how it performs. Stock and overclocked. And to compare it with multiple memory kits, even more than this video with more games in 1080 1440 and 4k. despite it being a memory test i feel like every test you do GPU/CPU/RAM/STORAGE.. any test should be done at 1080, 1440, 4k. Why test a CPU/Ram at 4k where its going to be GPU bound? Because you never know. No one knows. Rather than assuming what would happen. Showing data will remove any guessing. Also you will never know what you'll find out. The more tests you do and the more data you provide can show you some stuff you didnt even know existed. Hiw many times when you tested somethuling then you noticed something, then you did a whole video and test for that one thing. Quite a lot isnt it. Thats why the more data the better. You will never know whats going to happen with electronics and how they react. Also thanks for your efforts. ❤
"Why test a CPU/Ram at 4k where its going to be GPU bound?" Apart from the point you make, in terms of shopping guidance it's still very useful seeing a demonstration of where and how does the limit shift from CPU to GPU. That way you can get a more realistic idea about the scenarios you're going to be getting into. So for instance while testing low end CPUs with high end GPUs does reveal their limits, there should also be testing with hardware that's appropriate for the price range to see how or when you're going to be limited in real world scenarios.
You've stirred up a hornet's nest with your DDR5 videos. The youtube algorithm is suggesting reaction videos from other youtubers to this and the previous DDR5 video. Just want to say thanks for all the work you put into these videos.
Yeah it's a shame YT promotes lazy and inaccurate content built off the back of our hard work, but realistically those videos are getting very few views anyway. Plus people with a brain can watch both and draw their own conclusion ;)
I think a DDR4 B-die kit of 4000 / 4133 with low timings for low latency would of been a great test here. Going from 3200 to 4000 with crap timings on DDR4 would of boosted frames for some of these games to be closer if not match the 6000 ddr5 kits. I think this might be to overkill for most users since tuning is required and why hardware unboxed didn’t go down this path but for some re-using ddr4 kits in a new intel machine build, could be a deciding factor.
don't want to edit upper comment, so will add in self-reply. 3200/14 is only worth it if you're going to manually overclock it further. It's mostly good bins of b-die that don't cost a fortune as even better bins of b-die. If someone is going to use just XMP (so majority of people) - 3600/16 is cheaper and will perform better.
I know what would focus on cpu and ram scenarios - fake/placebo compilation tests like simulating the compilation of a system kernel. This would even benefit game developers concerning compilation times. DDR5 systems are extremely new thus don't really see anything really pushing the performance of the newer architectures just yet, but with growing memory allocation with graphics cards and a serious move of game assets and code to GPU memory systems are becoming so much more modular that GPU benching will almost exclusively apply to the GPU. CPU benching tools will rise to prominence once again for CPU and RAM. Good vid btw.
Nice job comparing ddr5 with some overpriced ddr4 sticks. Compare with affordable cl16 sticks, which can be had for almost half the price of these cheap ddr5 sticks and the value isn't even close.
Performance difference between their dual rank 3200 CL14 kit and a cheap CL16 kit is pretty significant. Larger than even the difference between their kit and the ddr5 5200 kit usually.
DDR5 is just too expensive and almost no speed improvement (if any at all). What is better is data transfer speed although this is normally not needed.
my brother still uses ddr3 and an old AMD fx 4350 chip and gets 70+fps. all he plays is GTA5. if it's not broke don't fix it is my advice. just because a video is showing you all this flashy new stuff doesn't mean you need it.
@@laszlozsurka8991 well i have to disagree, me myself i play all types of games from old school to new and i noticed a decent difference in changing my ram speed from motherboard default "2333" to custom 3000mhz, but that is probably because i have the CPU and GPU to compliment the RAM speeds.
I am weirded out by the low amount of memory chips on the stick. I forgot if it's called the density of the DIMM or what but I'm almost certain it could impact performance by having less chips.
when DDR4 was relatively new, you could get great gains even going from 2133 to 2666 MHz in certain games. But the small L3 cache of CPUs at the time was probably one factor
The cache is a MAJOR factor. This can be seen in the 5800X-3D CPU, where it simply doesn't care what quality DDR4 its paired with. The data it requires is almost always in internal large 3D cache.
This will be replicated in the future 7000X-3D CPUs as well. Memory Speed and CAS latency are definitely going to become less important as CPU cache get larger and larger.
@@fredsas12 you can still see small gains like around 5% or so if you go from 3200MHz to 3800Mhz on the 5800X3D, but yes not very big
I want to see frametime/time charts and fps/time charts for all of these benches.
Brought my 7600k back to life for a while by upgrading from 16gb of 2400 to 32gb of 3200. But in the end it was just part of the upgrade path and not an upgrade I can recommend.
@@ndgoliberty interfering, what difference did you see? I didn’t think Intel 7 could take advantage of faster memory
In the UK DDR5-5200 8 GB dimms are like £2 more expensive (per dimmi), so definitely worth getting it over this 4800 kit
all 4800-5200 kits should quite easily hit 5400, so save those 2£ for a burger or something lol
Makes basically 0 difference in performance.
@@Dyils reference burgers that's exactly what I thought lmao
@@Dyils You can easily get a burger for £2, it just won't have the meat of any animal you've ever heard of
@@THuk44444 you'll get just the bag
I seems that AMD have timed their DDR5-only platform very well
cheapest =! average
They appear to have made a real effort to bring memory guys on board, EXPO might be better than XMP which was limited timing adjustment.
In late July/early August DDR4/5 prices were closer, it surprised me. I wouldn't want to buy my DDR4 again as it's risen considerably.
OEMs pay bulk contract prices so I expect pre-builds with 7x00x will be more competitive than early adopter DIY, with the new release price premium.
It's a bit of a self-fulfilling prophecy. But it's cool that AMD is big enough to trigger that self-fulfillment.
These guys are shills.
@@jayr8282 too... whom?
Buildzoid pointed that if your DDR5 module is SK Hynix it can be easily overclocked to 6000MT/S even in the cheapest modules.
Keep in mind that the memory tested in this vid is from crucial, therefore the memory chips used will always be from micron.
@@Neggy-Z Micron more like Poopron when it comes to DDR5
And according to Buildzoid, pretty much anything at 5600MT/s and above is pretty much guaranteed to have Samsung dies (reliably OC to 6000+) or SK Hynix (very easy 6000+ at good timings).
And looking at some 5600 kits being pretty inexpensive already, it may not be the worst of ideas to pay a bit more for much better stuff
@@Neggy-Z Just something to keep in mind for testing and budget builds
I got mine to 6600 cl32
This was my question. We know they are micron but can they get to 5600MT/s as a stable OC?
I think it would be interesting to revisit this when the new Ryzen CPUs come out not so much as a comparison against DDR4 but as a comparison between Intel and AMD with the advantage that you already have the Intel data as a baseline so if you use the same memory modules and test suite it would save you a lot of time.
yes
I really like your idea, only problem is that if the modules that he has don't have AMD's new EXPO, then it might not work. More than likely he would have to buy some new DIMMs that have both Intel XMP and AMDs EXPO capability onboard before he tests.
Oh hey, the VLDL figures were such a nice addition to the set!
Haha I'm like VLDL surely that's some RAM designation and not Viva La Dirt League. Happy to be surprised. 😆👌
Agreed!
Mornin'! Nice day for benchmarks, ain't it? :D
@@JayMaverick I immediately thought "Very Low Density Lipoproteins". I'm a nerd.
Hello Adventurer!!
This will be a lot more interesting with the upcoming 40series GPU esp if it turns out to be more than 50% faster. The same thing has happened in the past where in the earlier days of DDR4 speed didn't make that much of a difference, then a couple gpu generations later jdec tier ram became a bottleneck
50% faster seems incredibly unlikely. Maybe 10-20% and that's of you can even find one
@@NESW2000 plenty have been holding back from upgrades since around 2019 and the pricings of tech since then ddr5 is gunna be worth it for plenty of people for now
@@NESW2000 most of the leaked data is suggesting that the high end is going to be double the performance of the current gen. It's a node jump for both companies, both making architectural improvements, both increasing the snot out of the power draw.
@@brendago4505 'leaked' data is infamously unreliable. I'll believe it when I see it but if it is no one will be able to get one again as they will all sell out
@@tonypeperoni5818 i was holding back from upgrades since 2012......, If Really Nvidia and Intel consume a lot of Power (ill see reviews first) then ill go AMD and full 4K (im not only play, but do Photo/video editing and digital paint), 2k or 1080p with high Hz are so fuking trash even in 2022...
Makes sense, so if I were to build a new system right now I'd definitely go DDR5. But here's my dilemma, I'm currently on an i7-12700K with 32GB (16GB x 2) DDR4 3600Mhz CL16 and a 4080, is it worth it to swap out my motherboard and RAM just to upgrade to DDR5? I game @ 1440P.
Side note, I would've loved to see the Spider-Man remastered results @ 1440P as well.
Holy smokes the B rolls are awesome. So much effort for a single video.
15:25 I actually upgraded to Alder Lake earlier this year and went DDR4 because I already had a perfectly good 64GB of 3600mhz 18CL RAM, and didn't make sense to pay the premium on new RAM for marginal gains when I already have a perfectly good kit of RAM.
@@provisionalhypothesis I need a system that can render my videos at a reasonable speed so I can get to work on the next one. It's not "obsessing", this is how I make my income.
I've recently watched BZ's RAM timing series where he explains some of them, and which timings are relevant to which operations. Would you be interested in testing which timing effect which games in your testing suite? RAM is usually presented in reviews with only with transfer rate and tCL or primary timings but we never see things like tRRDS/L, tFAW, tRDRD_sg/dd and command rate, which are far more relevant than the primaries for data that is scattered between different banks/bank groups. Considering the advantages of dual-rank memory over single rank in gaming I reckon those timings could prove more important than the primaries.
This is something talked about by XOCers, tuners, and random people in forums, but I couldn't find any properly standardized testing as you or GN do.
To date i've not seen a single memory test video done on well tuned vs XMP secondary/tertiary timings. :(
This. tRRDS and tRRDL with tFAW tuned properly alone can do wonders. Especially when the XMP profile comes with awful values for these.
@@NVMDSTEvil There's a few out there but they they are in Russian and Chinese. If you are familiar with overclocking memory for tests like Y-Cruncher, PYPrime, Geekbench, Super Pi and Time Spy CPU they are worth a watch but of course there's still points that I have no idea what point they are trying to get across
Exactly. 4 x 8gb D4 at 3200c14 is a poor way to test against D5 imo. I can easily run daily stable DR 2 x 16gb Samsung B-Die at 4133c14 (12600K) with 2nds and terts maxxed ( with good high bin G.Skills) but testing at say 4133c15 1.5v should works fine even for a 3200c14 bin kit. 3200 is just too slow to be put up against D5 especially since 12xxx has a much better IMC(s) than the previous gen Rocket Lake that fully maxxed at 4000 for the best and 3866 for most cpu's.
Gamers Nexus did a couple videos comparing XMP vs tuned memory performance around the time that the i5-10600 and Ryzen 5 3600 launched.
The problem is that it takes a lot of time to do (as anyone who does mem OCing would know, with all the tweaking, stability testing, rebooting, etc. involved),
and on top of that it is a niche that isn't going to get a lot of clicks/views/engagement, except with the hyper-nerds.
So for the reviewers, it's a big time and effort investment for low returns.
Hopefully they do another one for DDR5 after Zen4/Raptor Lake launch, but I wouldn't hold my breath.
It's tempting. But, I think I'll give some extra life to my current rig. New CPU and maybe a GPU. Thing is far from dead. No point ditching it for a new platform.
I agree. If your a heavy gamer or your earnings are derived from gaming....YES upgrade. Otherwise stay with DDR4 and Zen 2 or 3 or currently intel. Myself, I'm not a heavy gamer, I have a ryzen 3700x and GTX 1660 super and I don't plan to upgrade. Let prices stabilize, DDR4 market diminishes and then build a new computer. I'm looking at 2-3 years.
If they happen to come out with a 5600x3D I'd swoop that but otherwise perfectly content with the 5600x with 3600cl14 ram
Yea specially with 5800x3D or 13th gen with DDR4 if you have b660 or z690
With the upcoming price of AM5 motherboards looking to be double to triple the price of equivalent B550 motherboards, I can't see any reason to upgrade platform unless you hate money or are fine with no cpu upgrades after 13th gen on the board. Still feels like it's more cost effective now to just buy the fastest gpu you can, increase resolution and keep your cpu as irrelevant to performance as possible.
Living on the trailing edge is more economical, and good enough is good enough.
This for me will come down to 5800x 3D vs 7600x / Board /Memory, seems the 5800x can keep up on DDR 4 with a more stable platform.
Yeah but it's a weird min max. Fastest AM4 cpu vs not fastest AM5. I'd get a 5800x3d if it dropped to $300 after the launch
ooooor maybe wait for 7800x3d
@@Chrontard sooooo what if they already own an AM4 mobo? That's even more money on top if they need the 7thousand series cpu.
@@mick7727 Yeah I have a AX370 motherboard. I just updated the BIOS and bought a 5800x3d for $225, ran a -28 all core CO offset and it runs cool no problems with a single tower air cooler.
@@weasle2904 5800x3d for 225 ??? Maybe 5800x. 3d is at almost twice as expensive
Thanks for the timely video, Steve. That's something that's good to understand when leading up to the launch of Ryzen 7000. I already saw 2x16GB 5600CL36 available for around $150, and although price is now back up a little I suppose it will be drop back. The price also doesn't seem too bad considering that's about what I paid for 2x4GB DDR4 in 2018. Hopefully when Ryzen 7000 launches you'll do a video comparing some memory speeds. I rather expect you to. You're good like that. :)
I think we got two scenarios for ddr5. Either the new gpus will enable higher performance, or new cpus carrying way more cache will reduce how effective ddr5 is
I'm looking forward to seeing the next generation of "high end" APUs from AMD combined with DDR5.
It's obviously not going to compete with discrete cards (unless they're extremely bad), but the ones in the 5000 series seem to be both memory and compute limited.
@@MrMartinSchou Yeah, unfortunately I don't think we are gonna see anything different in the apu space until at least Zen 5 or Zen 6. Just the same 8cu garbage or whatever they do
I got a 32GB set of DDR5-5600 CL32 for $170 and I've seen slower kits under $150 regularly. It's absolutely time.
What set do you have if you mind sharing? Are there good alternatives?
It would have been nice, if you’ve added one or two benchmarks for non-gaming scenario compiling, 7zip, cinebench for example.
In another video I’ve seen, that compression benchmark in 7zip had a 30% boost with ddr5.
Thank you for the effort it takes you to bring us this high quality, detailed videos! Much appreciated!
This is really awesome content considering the buyer's advice really was to stay away from DDR5 because of the price across the board. Now when I'm considering buying a new computer from scratch without having any DDR4 laying around, going for DDR5 is actually something to seriously consider because it'll still be viable a couple of generations down the road.
i think amd doing the opposite thing of the am4 here, seems like they released their most powerful cpus first then release the efficient ones later, just a guess
Well with how fast GPU tech is advancing you'll pretty much be replacing your motherboard every 3-4 years anyway, if you don't want to be severely bottlenecked.
@@bobbob9821 @Bob Bob Good point. In 3-4 years DDR5 will be way ahead of DDR4, and the processors will be of course much faster too.
"Welcome back to hardware unboxed, today we're yet again going to test DDR4 at dog shit slow 3200MHz with absolutely no sub timing tuning"
And his audience is completely ignoring that fact
@@rokaspleckaitis566 They're a half step above the Linus bots. You can get quality b-die for the same price as this garbage tier DDR5, and every K sku AlderLake chip can easily do 3866-4200MHz DDR4 with good timings. Even if you don't want to tune timings, 3200MHz is straight clowning in 2022.
@@BrianCroweAcolyte 3200 is fine lol. One step under the optimal speed, so painful oof.
@@joeykeilholz925 laptop user spotted 😆
if you relied on these videos for information you wouldn't even know what sub timings are
With the speculated 4x nvidia benchs, upcoming cpus and ddr5, we might witness one of the biggest performance leaps we've seen over the years hopefully
This is exactly the kind of testing I was looking for. Well done Steve.
Can we have a test at normal use cases? Eg. High quality settings on a mid level GPU
Yes, and at a realistic resolution of 1440p.
Its starting to make sense to look at DDR5 for new builds. I have a ton of fast DDR4 on hand and carried it to a D4 12700k build when ADL released. No regrets. Anyone who is looking to build from scratch and use all new parts is better holding off and seeing how 13th gen and AM5 shake out in the next 6 months IMO.
What i am doing exactly with my I7 4790k/ RTX 3060/ DDR3 1866mhz 2x8gb built...
Glad to see that even entry level DDR5 is pretty good since I'm planning on upgrading to a 7000 series CPU next year. I imagine with the release getting fairly close, the ramp up in DDR5 production will improve both performance and price towards the end of this year.
Nice day for fishing :) . Really liking them figurines.
As for GPU limited scenarios: dropping the quality settings alone is only going to get rid of the shaders related bottlenecks. You may still end up limited by the ROPs of the GPU. To get rid of that you need to drop the resolution as well - 1080p is still too high.
Thank you, this is definitely alleviating fears of buying new gen cpus coming up and having to buy the most expensive ddr5!
DDR4 platform with Samsung B-Die, dual rank and tuned (i.e. 4000mhz cl15 + adjusted timmings) is still faster for gaming than any DDR5 platform at the same price. Period.
I understand the point of using such a high end test bench is to eliminate any bottlenecks other than the specific hardware being tested. That being said, I would still love to see a test like this run with more entry level hardware, so I could see what improvements I could expect with my system.
Thanks for a great comparison! Just as a suggestion. It would be really cool to see performance difference on the titles which are old, but are highly CPU and RAM dependent, such as ARMA3 and DCS: World...
Get back to Ryzen 7000 testing Steve since you aren't blasting noobs on Fortnite. 😛
Honestly can't wait to see the results of it and testing RAM like this is gonna be essential for building your new AM5 PC, so thanks for this. 😎👍
Yeah the wait is dragging, Steve should be Masterizing noobs on Fortnite WITH a 7600x! 😁😆
There's tease videos out that just repeat launch uncertainties, those sampled will have a pretty good idea already on those RAM speed questions.
@@RobBCactive one anagram: NDA
@@chitorunya Yeah, but sometimes there's clues, I saw a video with a Ryzen 7000 in mobo, discussing memory and coolers.
My issue is the guy is repeating speculation when he must be able to know the answers .. it's a click baity tactic.
He used to work for a channel that earned a reputation for unreliable FUD about AMD.
He read out Robert Hallock's discord statement but then ignored it, talking about settings for Zen2/Zen3 memory overclocking although the info was to use AUTO:1:1.
He has found a way to make video and hide behind the embargo, as he is simply repeating public questions asked at the launch and the days after.
"blasting noobs on Fortnite". Fortnite is a lame game for lame wannabe-gamers, wannabe-people, mental noobs who can't even spell their own name without searching it on Google with their smartphones.
@@chitorunya *anagram
it seems to me that ddr5 can give great benefits in performance in games when playing in 1080p, because you are not gpu limited. But when you play at 1440p or higher, you become more gpu limited, so does ddr4 or ddr5 speeds matter in those cases?
Pretty clearly it doesn't. It's the same as spending a bunch of cash on anything higher than a 5600 CPU at 1440p or higher...it's a total waste for gaming.
I think it is unbelievably stupid for people with 1.5k+ systems playing on 1080p in the first place to be honest, your system at that price point and above is so much more capable and you are spending thousands for numbers that really are unnecessary lol
I’m sure 100fps at 1440p is still more then smooth to enjoy a game, rather than double that at 1080p, with a system that costs thousands being wasted on a low resolution
@@connorosullivan3500 It's primarily the esports players that play at 1080P and lowest visual settings to eke out every small advantage they can get from lower latency.
@@connorosullivan3500 going down to 100fps is noticeable
@@tired9494 not so much if it is consistent lol
I think many people spend quite unnecessarily but not my money so I don’t really care that much haha
Well shit you convinced me, I was teetering on if I wanted to hold out on the platform jump or not, but that minimum fps gain is tempting indeed!
I'd advise you to wait for a massive DDR5 motherboard price drop.
@@Behdad47 yeah, I'm not in a huge rush to upgrade, I'm sure the ddr5 kits themselves will also get a bit cheaper maybe even faster (latency wise)
I'm still holding off, CPU / RAM isn't what's holding me back currently, next up will be a midrange next gen GPU. 2024 is when I think i'll do a platform jump, would like to go from 64GB to 128GB when I do it. For my day to day, memory capacity and thread count are where i'm hitting limits.
@@spacechannelfiver smart move. Way to early to jump. Price needs to stabilize and supple needs to increase. If you just built a computer within the last two years, certainly not a time to upgrade. You typically want to hold on a rig for 4 years until you upgrade all components.
@@silas232003 this is the thing people often miss about PC, you get them originally at a certain upfront cost; but can just keep patching them for years and years. I originally built my desktop in 2008 and it's nominally the same computer now, although not sure any original components remain at this point. maybe a hard disk or fan or something.
Edit: if GPU or Platform price spikes then you can put your annual budget into a new monitor, or PSU, or improve peripherals.
Would've liked to see 16GB of faster DDR4 memory thrown in there for comparison
oh yes
without that really can't decide to change boards
This, definitely. I thought I would se a 4000Mhz CL16 kit at least in there, of course with hand tuned timings, XMP is worthless.
It would probably kick ass.
Not probably, definitely. Makes you think, hmmmmm?
@@rokaspleckaitis566 No doubt but how many people have the knowledge or interest in tuning memory, most people just want to install and enable xmp and start playing.
A few months ago in the no sleep zone I said I was dissapointed that your first DDR5 test was with 4800 instead of 6400. I see now that it doesn't matter that much so I stand corrected.
You forgot that he compared it to 3200 instead of 4000 ddr4. Makes you think
Worth noting that if you get 2 sticks of DDR5, do not expect to easily upgrade to 4 sticks later at some point. You'll have to drop bus speed by a fair amount.
How so? If they are the same speed, shouldn't there not be a problem?
You won't with these crappy JEDEC sticks. That's one upside of slow RAM: at least it's stable.
@@nathangamble125 I'll keep that in mind.
Thanks.
@@nathangamble125 Happens with certified/QVL XMP as well.
@@ShadowMKII The memory controller is the limitation, not the memory itself
I love that we are finally seeing some linear boosts from memory kits after a decade of far less differences. thanks very much for this detail!
I would like to see the differences when the APU's come out, I am not interested in anything but APU's going forward so I hope DD5 ram makes a difference.
Then you want 2x16GB due to the higher total bandwidth.
@@Psi-Storm Thanks for the info, that was the plan anyway but didn't know it would be an extra benefit other than it just being more ram.
@@JohnDoe-nh7vx bad info, on apu almost no difference between 16 and 32gb. On normal high end CPU + high end GPU it can be 5-6% difference in gaming AT MOST. Sounds like money well spent yea? +5% performance for 200% cost?
@@BeHappyTo We are talking APU's though, not graphics cards. When DDR4 or DDR5 is used for Video ram does it make a difference.
Misleading title. 'Cheap' DDR5 is more expensive than quality DDR4. And the performance gains in most games are next to zero. Most consumers already own DDR4 kits, so it isnt $50 DDR4 vs $90 DDR5, its $0 vs $90, for like 1% more performance on average. Absolutely not worth it.
Cheers mate for the vid. This has helped me make a decision- update to the very latest specs or save cash in the short term. I guess I am better off spending the pounds in the interest of longevity.
The ddr5 mobo's are another part of the catch 22 and not just when it comes to the pricing, there has been a bunch of horror stories on the topics of compatibility and stability in overclocking across all boared from all the big name manufacturers, I probably went through probably 20 pages on a single Reddit thread trying to do research before I chose my new upgrade platform after my 1st gen Tydens system called it quits, and it was filled with complaints and frustrations of people who were asking for help troubleshooting and giving up stating that they'll RMA they're ram modules or boards, and this was a 3 month old thread over multiple bios updates
Any new tech have horror stories.. like when RTX 2000 released & people was spreading stories about there was a bad batch ..etc
But for now, DDR5 been like 1 year out, so it should be stable overall (Intel already was the beta taster Lol)
Plus, people levels are not the same, God who knows what they were doing wrong , wither it's hardware error or software compatibility/programs error.. so don't judge from a reddit pages, judge on the trusted UA-camrs results (like this channel)
Early adopters tax.
God bless em', they do all the beta testing for the rest of the market
@@popcorny007 amen
Don't see how it's different than ddr3, ddr4
@@joeykeilholz925 yeah, also PS3 , PS4 & PS5 are the same.. also the planet is also flat.. does that suits U well? xD
Hey steve great testing as always, but would have been nice to throw in some cpu testing in there as well. Plus maybe some "realistic" tests, maybe with a 3070/80 in 1440p
ACC is quite CPU bound when playing with the ai. The ai are subject to the exact same physics as the user, so you can get huge frame rate gains just reducing a couple ai cars. I went from a 1080ti to a 3080 on a 8700k system and saw almost no frame rate improvement on max visual settings. Took 4 cars out the pack and had a boost from 70fps to 90fps. It's a bit of a mad one tbh
ACC is also peculiar in that it absolutely loves L3 cache (likely due to running all those physics calculations for all the cars) at present the AMD 5800X3D is massively better for it than any other CPU.
I'd love to see more of these tests done with 1440p and 4k resolutions.
Sure it makes the graph differences smaller in regards to avg. or max fps, but you can compare minimum fps performance and frametimes' stability.
Since you are already GPU bound in 1080P on some games, what point makes 1440p or 4k? There will be no difference between the worst DDR5 and the best/most expensive one unless you will have a 4090ti o.O
@@tjintell Validation. To demonstrate that what you think is true is in fact true. That's why the Nvidia driver overhead benchmarks were so shocking - pretty much nobody tests that way and the scenarios where the CPU was bottlenecking the GPU were with combination you would normally assume are fine.
It would be really nice to have this comparison with Ryzen 7000 too, due to the storical more sensibility to RAM speed and latency from AMD's CPUs.
Both Intel and AMD are memory sensitive.
Love the VLDL figurines, best part of the review by far :P
I was planning to ask it in Q&A but might as well try it here ;) Is there a way to determine if RAM is single or dual rank before buying it?
Second question, are You plannig to make a video about DDR5 "sweetspot" frequency/timing for Zen4?
It’ll either say so on the dimm or the packaging, or you could run a program like cpu-z which will tell you as well.
@@HazzyDevil I know how to check it after instaling in the system ;) My question was about checking it before buying it ;) But thanks for the answer anyway and I guess online I can always google it to check.
@@Sharleee I’m derp. In which case yeah online. It’ll say in the description, or in the product name/ID.
@DaKrawnik420 I was going to reply with this earlier, went around looking at the 4 kits of ddr4 i own and oddly enough not a single one of them mentions this, remember seeing the rank number all the time on ddr3.
@@Neggy-Z exacly I was looking at many different online stores and none of them mention it in specification/description of the product.
Some 1440p or 4k benchmarks would have been really cool. Please add that in the near future.
All I got out of this video is that upgrading to a whole new platform to use DD5 is a waste of money. Thank you for doing the testing.
Same here. Probably gonna be better to grab a 5950X and DDR4 on fire sale from a value-for-money perspective, especially if you've already bought DDR4 in anticipation, like I have.
@@dimples282 5800x3d > 5950x for gaming.
A 3200CL14 kit is slower than a cheaper 3600CL16 kit. I understand that 3200mhz is more convenient for you since you test older CPUs that have trash IMCs like 1st and 2nd ryzen, but most new builders today are buying 3600MHz kits.
I upvoted because I certainly agree that B-die at 3200 MT/s is extravagant, but it's worth noting that locked Alder Lake CPUs may not be capable of going past 3200 MT/s, at least not without a manual (non-XMP) overclock, and at least not in Gear 1 mode. 3600 appears questionable even with a manual overclock. (The system agent voltage on non-K Alder Lake is locked at ~0.9 V.)
FWIW, Buildzoid says that 3466 MT/s is a "safe" assumption on locked Alder Lake--and I managed to achieve that myself on my i7-12700 (non-K). But I couldn't get 3600 to work. Memory overclocking is not for the faint of heart, so it's probably best to stick with 3200 MT/s on these CPUs, if we're assuming the average user.
@@RedundancyDept I'm aware of that VCCSA lock. I suggest you try Comand Rate 2T if you haven't tried it. That might allow you to run 3600MHz in gear 1.
@@WrexBF Already at 2T, sadly. Appreciate the thought, though.
3200CL14 Bdie kit when tuned with sub timings is not slower than a cheap 3600CL16 Kit.
12:50 looks like the main difference in CS:Go is the capacity. Going to 32GB seemed to help boost the 1% lows a little bit.
Would have been nice to see that 3200Cl14 kit overclocked to 3800 Cl14 or 15 because basically no one on the planet will buy samsung B-die to run them at stock speeds especially at 3200mhz since both Intel's and AMD's sweetspot is around 3600-4000Mhz.
Yet I have mines downclocked to 3400 since my R5 2600 blue screens after that.
Still a great boost since 2933 is technically the max of the CPU.
@@PatalJunior sure but the 2600 is outdated by now we have to look at current gen
@@PatalJunior yeah but the thing is, the infinity fabric speed is tied to the ram speed, so it could have something to do with that
Sure but that would mean that ddr 5 could also be overclocked
@@ifrit35 yeah but it's not as a necessary of an upgrade
Nice work Stevo, always making useful and interesting videos when it's actually relevant. Looking forward to the ryzen 7k reviews.....
DDR5 is still double the price where I live, so, no - it's *definitely* not time to leave DDR4 for everyone.
Also, new AM5 motherboards START at about $150 USD (as quoted by AMD), so the price to get a decent AM5 setup is still quite expensive (including Alder- or Raptor Lake for - at least - the memory cost, too).
Nice information otherwise.
Indeed all the tech channels never take into account that prices vary A LOT around the world. Still the graphs are enough for people to make an educated decision.
@@xingbairong Yep. Strange - especially for a channel based in Australia (which they have accounted for in other videos).
Protip: AVOID ALL 8gb DDR5 sticks!! They are x16 memory chips and are far worse than an equivalent 16gb kit (You can see this in the video!). The sweet spot is Corsair DDR5 5600 CL 36. You can get a 32gb kit for $170. I own this specific kit and it easily overclocked to 32-34-34-38 at 6000 MT/s (On Alderlake). It's a really great value because I get double the bandwidth (~96gb/s in AIDA) of DDR4 setups with a "similar" latency of around ~60ns. This is similar pricing to a good B-DIE bin of 3600 cl14, and is way better.
My main purpose is to play games and I game at 1080P. I don't really need more than 16 GB RAM. I recently got a Ryzen 5 7600X as a gift and I am making a build around it. What would you propose for me? It really bothers me to get more than 16 GB RAM knowing full well I'd never need it.
This makes my decision to wait for Zen 4 3DV look really good. It'll be even cheaper by then, and the 3D V-cache will make up for a lot of the latency issues.
They will be very expensive CPUs.
@@chovekb Thank you, Captain Obvious.
The fact that majority of good DDR5 kits are 2x16GB hinders quick mass adoption.
If you are building a new DDR5 system you really should not be putting only 16GB of ram it it.
Especially since DDR5 as a platform really struggles to run 4 sticks due to the high frequencies, so adding 2 more sticks later is not a good option. You should start with 2X16GB or you will regret it later.
Maybe I missed it but I'm curious about why you used 2 stick DDR5 kits vs 4 stick DDR4 kits.
Also, when I went from 3200 CL18 to 3600 CL16 (and OC'd), the difference in games was minimal. But the difference in Davinci Resolve and After Effects was massive. Hoping for more of the same when I pick up a 7950x to replace my 3950x.
Most ACC sim-racing setups needs a lot of graphical horsepower. Firstly, because most sim-racers race on triple screen setups, VR or a large 4K TV, meaning there is a lot of pixels to "drive". Secondly, most enthusiasts use higher Hz monitors as they are not okay with just playable levels of FPS (e.g. 60 fps) since more FPS will mean you have an easier time hitting your braking point and will get timelier information of the cars trajectory in corners, which is really important because sim-racers can't rely on the "pants in the seat"-feeling of racing on track and therefore rely even more on visual ques than drivers in racecars. And VR is often 90 Hz displays so for VR sim-racers it is a must to hit that frame rate.
A good amount of enthusiast youtube-sim-racers setups are three 144 Hz 1440p monitors and really high-end PC's (RTX 3080/3090 and 12900K/5950X) to get >100 fps on monitors with a combined 11 million pixels. Where 4K is about 8 million pixels for context.
My own setup is a more modest triple 144 Hz 1080p setup, where I target >90 fps in-game because my old PC is having trouble keeping up with games like ACC at that resolution.
Triple screen setups are normal for sim-racers, who are not running VR. The added screen real-estate means you can see more of your car and can spot other racers in the sideview or mirrors, at a realistic field of view from the cockpit-view, when racing side by side.
Higher Hz displays are because when racing GT3 cars in ACC you are often traveling at average speed of about 180 kph (110 mph), and often doing close to 280 kph (170 mph) at the end of straights. Meaning you travel 75 m/s coming into the braking zone.
At 60 Hz you are traveling 1,25 meters per frame.
And at 144 Hz you are traveling 0,5 meters per frame.
Therefore, higher fps can make it easier to hit you braking point, which sets up the corner, and better show the cars trajectory traveling towards the curbs.
Hope this gives you a bit of context as you mentioned, Steve. :)
Good explanation, pro sim racers go further though and basically treat the game as a CS:GO player would but with triple screens (I use triple 240hz 1080p monitors)
So DDR5 is actually faster and cheaper than DDR4 if you chose non expensive DDR5 version!
Not bad at all!
Buy cheapest DDR5 now, and get very good kit for cheap in a year or two
Yes, but then I also need to replace my motherboard.
@@peeonthe3rdrail414 i mean you also replacing new cpu...
@@peeonthe3rdrail414 i think the point is that theres no downsids to ddr5 now so its one less price gate to a new platform
Only if you have a RTX 3090 Ti
Something else to consider would be the 7800X3D that will (presumably) function similar to its zen 3 counterpart. Maybe high quality memory won't be that important
As much as I want to upgrade to AM5, I think I am going to wait until next year when the 3D cache CPUs come out. I recently got a Samsung Neo G8 4K 240Hz HDR monitor, and I think my money would be better spent on a RTX 4090 than a new CPU platform.
I probably will buy both next year. Currently sitting here with the 5800x3D and a good 6800XT.
This test for me was interesting because I have games where the CPU and the GPU are fully loaded + MP title &4k
So average DDR 5 RAM will boost it even further
@@Vanadium And then Black Friday rolls around and I am able to get a Ryzen 7950x for $180, a AROCK RIPTIDE AM5 Motherboard for $149, and some Team T-Force Vulcan 32GB (2 x 16GB) 288-Pin PC RAM DDR5 5600 for $129, so it looks like I upgraded sooner than I thought I would.
@@DizConnected that is an insane price
Not an apples to apples comparison - should have used DDR5 2x8GB across the board.
Should the DDR4 tests have been performed with 2x8GB, similar to he ddr5 4800 instead of 4x8? There's a performance bump in doing quad channel, right?
It would not be quad channel, it's still only dual channel. But using 4 sticks means it's dual-rank per channel. Which in terms of number of banks to swap to, it's similar to the 2X16GB ddr5 kits. So yeah you are getting a performance boost there certainly.
Thank you for using ACC to inspection. As a ACC player, data is very beneficial!
Impressive how NEWEGG is going full throttle with advertising with well known reviewers. Curious how this will effect Amazon 1-2 years from now.
ddr4 prices at least in the us are pretty darn cheap with some 2x8 4600mhz kits being like 116 usd. But ddr5 is defintely getting better and intresting to see how much some games seem to get huge boosts from it
I recently got myself 64GB's of DDR4 for less $ than 32GB's of DDR5. I'll check back on DDR5 a few years from now.
Lol
when socket AM6 comes around I wonder if we'll be on DDR6 yet
64Gb of RAM is usually totally useless for most people.
@@aos32 Why is there always someone saying that? Wasn't long ago they'd say more than 4gb is useless. Things change, and not everyone with a pc uses it just to game. 64gb (or more) is totally useful for many people.
@@truedoh2831 it's literally useless unless you actually need that RAM. I have 32gb and have never had to go to swap.
Very interesting and informative. Thank you!
Not sure 4xDDR5 modules would make upgrade sense, 2x configs only are rated for high speed.
Thanks for testing the cheap memory.
I wondered about a cheap DDR5 memory strategy, planning to replace it later as game performance demands rise.
If my X570 mobo or CPU dies a death out of warranty, I can at least consider moving to AM5.
Thank you for your answer but the mobo comes with 4 slots I would have like to see what would have happened with 32GB
@@kingalex2nd991 AMD released a slide suggesting a large reduction of bandwidth in quad dimm configs.
Intel had similar DDR5 performance drop.
Therefore until the embargo is lifted or a new breakthrough alters the situation, most will avoid the quad configuration
@@RobBCactive DDR5 has more bank groups them DDR4. A 2 DIMM DDR5 setup (That is actually proper DDR5, 16gb per DIMM) will have the same amount of bank groups as a quad-dimm DDR4 setup. So you get almost no benefit going from 2 DIMM to 4 DIMM (You actually get slower because it's more taxing on the memory controller and you get electrical interference!). Stick with 2 DIMMS and get the fattest sticks you can get (They make 32gb DIMMS at 6000 now!)
I'm planning on getting a whole new system in the next month and this helps a lot, thanks.
People, this video is just here showing you when you are building a new pc, that DDR-5 can be a solid choice to go with.
This video is not saying change now.
If you are making a new pc, the DDR-5 choice is something you can keep in mind.
Also, there is no rush, you literally can wait when DDR-5 get more refined like literally every invention in history.
I drew that conclusion too, the DDR5 price fluctuates alot in DIY market though. OEMs should be ramping demand now, they won't be paying the high premiums on bulk contracts.
Why not get old gen cpu mobo ddr4 and spend your savings for better gpu?
But the title clearly says 'It's Time To Leave DDR4'
Yeah commenters aren't very bright here missing the entire point in different threads
My take is there's no downside to DD5 now prices are coming down, it's the motherboard price that makes the difference. When the next 3D vcache CPUs and Raptor Lake arrive, hopefully boards are cheaper then as well. No idea how long AMD will support CPUs in the new boards, but I still believe it is longer than Intel.
In regards to ACC. I have seen improvements in performance by upgrading memory in the past. I only drive in VR. For flat screen gaming, as long as you have a half decent rig, you should have a good time, but for VR, you need all the performance gains you can find.
I'm very curious to see how this kit performs relative to faster speeds/timings with new Ryzen CPUs, since Ryzen has always been quite sensitive to MT/s
Good stuff Steve! Very enlightening.
Given I've kept my current desktop for 6 years I'd probably go for 32 GB of ram. We are just starting to get to the point of 32 gb being needed.
yeah 32gb is the new norm
I was thinking of building 24GB. 2x8+2x4. But due to a small accident ended up building 4x8. Only 16GB was no longer becoming a good option.
@@SianaGearz 24 gb is generally ill advised. It can work for a some time before causing issues but eventually will.
@@silasmayes7954 Why and to what extent? The population weirdness is something modern platforms are equipped to deal with. You populate one dual-channel pair with the same sticks, and another dual-channel pair with another two sticks, then you have each channel housing 12gb in a somewhat weird address and rank interleave, but if it works, it works, and the two channels are equal to each other, so it's not even hybrid dual channel, it's true dual channel. The overall timings are that of the worst of the 4 modules, plus some penalty for all 4 slots being populated, if the particular board thinks it needs that.
In driving simulators, people often run multiple monitors in order to be able to see cars that are passing. But at the same time , higher frame rates are generally considered to improve the players ability to react at high speeds. Competitive players will likely need higher framerates across triple monitors, but more casual players are more likely to care about visual effects.
AMD shills HUB strike again. DDR5 suddenly great as soon as AM5 about to release. This guy most likely owns AMD stocks.
Salty Intel fanboy detected.... Too bad your _team_ launechd Alder Lake when the memory was more expensive than the CPU. For me it is simple, if I was building today, I would not put Alder Lake on DDR4, because then you are locked to that unless you change out your motherboard. Sure DDR5 is not as cheap as we want, but it is so much less than it was when Alder Lake released that it makes sense now to put Alder/Raptor on DDR5 now. At least to me anyway.
Why no synthetic benchmarks? I'd like to know how much the speed differences impact productivity/rendering.
That is perhaps a different video, segmented by audience.
Can you guys simulate an AMD 5600X3D, Quad and Dual Core With SMT VCache ? To see how the benefits scale across the architecture core count
As V-cache did well in lightly threaded games they'll perform well.
What is the test intended to prove, ONLY 5800x3D is available on AM4?
Thanks for including Total War games in your testing, I can never find testing using Total War and its nice since its one of the few games I play.
Curious how big the margins are with the latest flagship gpus at 4k giving they're said to be monster gpus
Steve: welcome to hardware unboxed
UA-cam captions: welcome to hard where on a box
DDR5 will most definetly be an improvement from DDR4 with CPUs that have lots of L3 cache and a good size of L2 cache. Once the prices drop and the AM and LGA platforms have dedicated spaces without any uncertainties in the way, DDR5 will be solid for everyone, no questions asked.
Wrong.. the more cache the cpu has the less important ram is... its seen with the 5800x3d
Wrong, why is it then a 5800X3D with DDR4 can beat a 12900K with DDR5? The higher the CPU cache is the less important RAM speed becomes.
Honestly performs way better then expected especially due to having x16 chips
This 2x8gb kit is availible in germany for under 60€
Budget ddr5 builds incoming with A620
Omg thanks for this, I been debunking the narrative that ddr4 is always the way to go cause ddr5 cost too much. If you what 32gb the cheap ddr5 sticks is the way to go.
This winter is looking better and better for a build. If I can I might even wait till spring.
I hope you can get your hands on the new Hynix A-die. And see how it performs. Stock and overclocked. And to compare it with multiple memory kits, even more than this video with more games in 1080 1440 and 4k. despite it being a memory test i feel like every test you do GPU/CPU/RAM/STORAGE.. any test should be done at 1080, 1440, 4k.
Why test a CPU/Ram at 4k where its going to be GPU bound?
Because you never know. No one knows. Rather than assuming what would happen. Showing data will remove any guessing. Also you will never know what you'll find out. The more tests you do and the more data you provide can show you some stuff you didnt even know existed.
Hiw many times when you tested somethuling then you noticed something, then you did a whole video and test for that one thing. Quite a lot isnt it. Thats why the more data the better. You will never know whats going to happen with electronics and how they react.
Also thanks for your efforts. ❤
"Why test a CPU/Ram at 4k where its going to be GPU bound?"
Apart from the point you make, in terms of shopping guidance it's still very useful seeing a demonstration of where and how does the limit shift from CPU to GPU. That way you can get a more realistic idea about the scenarios you're going to be getting into.
So for instance while testing low end CPUs with high end GPUs does reveal their limits, there should also be testing with hardware that's appropriate for the price range to see how or when you're going to be limited in real world scenarios.
Awesome amount of testing again! 👌🏻
You've stirred up a hornet's nest with your DDR5 videos. The youtube algorithm is suggesting reaction videos from other youtubers to this and the previous DDR5 video.
Just want to say thanks for all the work you put into these videos.
Yeah it's a shame YT promotes lazy and inaccurate content built off the back of our hard work, but realistically those videos are getting very few views anyway. Plus people with a brain can watch both and draw their own conclusion ;)
I think a DDR4 B-die kit of 4000 / 4133 with low timings for low latency would of been a great test here.
Going from 3200 to 4000 with crap timings on DDR4 would of boosted frames for some of these games to be closer if not match the 6000 ddr5 kits.
I think this might be to overkill for most users since tuning is required and why hardware unboxed didn’t go down this path but for some re-using ddr4 kits in a new intel machine build, could be a deciding factor.
3200 CL14 is generally worse than 3600 CL16 - which is also cheaper on average - so terrible comparison to begin with.
don't want to edit upper comment, so will add in self-reply. 3200/14 is only worth it if you're going to manually overclock it further. It's mostly good bins of b-die that don't cost a fortune as even better bins of b-die. If someone is going to use just XMP (so majority of people) - 3600/16 is cheaper and will perform better.
I know what would focus on cpu and ram scenarios - fake/placebo compilation tests like simulating the compilation of a system kernel. This would even benefit game developers concerning compilation times. DDR5 systems are extremely new thus don't really see anything really pushing the performance of the newer architectures just yet, but with growing memory allocation with graphics cards and a serious move of game assets and code to GPU memory systems are becoming so much more modular that GPU benching will almost exclusively apply to the GPU. CPU benching tools will rise to prominence once again for CPU and RAM. Good vid btw.
Nice job comparing ddr5 with some overpriced ddr4 sticks. Compare with affordable cl16 sticks, which can be had for almost half the price of these cheap ddr5 sticks and the value isn't even close.
Performance difference between their dual rank 3200 CL14 kit and a cheap CL16 kit is pretty significant. Larger than even the difference between their kit and the ddr5 5200 kit usually.
DDR5 is just too expensive and almost no speed improvement (if any at all). What is better is data transfer speed although this is normally not needed.
I'm still on ddr3, I think its finally time to start considering a complete upgrade :)
my brother still uses ddr3 and an old AMD fx 4350 chip and gets 70+fps. all he plays is GTA5.
if it's not broke don't fix it is my advice. just because a video is showing you all this flashy new stuff doesn't mean you need it.
@@mentalasylumescapee6389 RAM speed doesn't even matter much at games. It probably only gives you like a 1 or 2% boost.
@@laszlozsurka8991 well i have to disagree, me myself i play all types of games from old school to new and i noticed a decent difference in changing my ram speed from motherboard default "2333" to custom 3000mhz, but that is probably because i have the CPU and GPU to compliment the RAM speeds.
Any plan on doing a short follow up using a ryzen system with 5600x? There could be quite the difference there.
I am weirded out by the low amount of memory chips on the stick. I forgot if it's called the density of the DIMM or what but I'm almost certain it could impact performance by having less chips.
I'm almost certain you're wrong.
Great video thanks, makes waiting to upgrade my Intel 3rd gen 2013 ddr3 pc alot less painfull.
ddr4 3200 cl14 is over priced and has not been the sweet spot for several years.
Premium ram, for the most part, has always been a troll