10:37: NOT EVERYBODY WILL NEED A NEW COOLER. Noctua has said they will send you an upgrade kit *FOR FREE* provided you have a receipt. If you can't provide one, the kit is like $7. It may be a larger die, but higher-end coolers should be appropriately sized.
Noctua have done this for a long time. I had to replace the motherboard in my Media PC a few years back and got a replacement mounting bracket. Good PR and customer service.
@@barkerit I feel like spending the extra coin on a cooler that won't go to waste is a selling point, since the company has a great track record of supporting their products where other manufacturers would rather you buy a new cooler.
Nice! This means ddr4 and zen 3 prices will drop soon. I'll upgrade my 3600 and b450 to a 5800 and x570 for dirt cheap, and grab another 16gb of ddr4 while I'm at it. Loving the competition!
I'm happy that Intel is finally stepping up their game. AMD was a wake up call for them to get serious again. With that said, I'm still happy using my 10900K so I won't be upgrading for 2-3 years. But it's nice to see Intel finally start pushing the boundaries with performance. Real competition is back
@@brando3342 It all comes down to your use case for that CPU. But I'm a heavy user and need more cores, and eventually the 10900K will get outdated. There'll be better CPUs out by then from both competitors.
I will definitely be keeping my eye on Alder Lake and see what AMD responds with. I was originally going to build a whole new computer and upgrade to a Ryzen CPU, but of course the GPU economy has prevented me from doing that. My i5-3570k paired with a GTX680 needs to be retired already, but both the computer and myself can hold out a while longer.
AMD will allegedly release Zen3D (or V-cache) either at the end of this year (fingers crossed) or Q1 2022. The price of Zen3 might drop a little (if it isn't dropped for Alder Lake), and Zen3D has an approx. 15-20% IPC increase over Zen3 (due to the larger cache - something that is throttling Intel's CPUs and will be a problem even for Alder Lake). If I were you I would either wait for Zen3D to be released - and then choose AMD - or, if you really NEED Intel, wait for Raptor Lake (so 13th Gen). It's supposed to arrive in Q4 2022, and by that time prices of DDR5 will be lower and all the teething problems with the scheduler should at least be partially resolved. Though by then Zen4 will be very close by as well ;)
GPU prices should be a lot better 6 months from now, because Intel is entering the market in Q1, they'll be pricing GPUs cheap to try and get as much market share as possible, and AMD and NVidia may be forced to compete.
@@greggmacdonald9644 except for the fact that the major chip suppliers are warning of shortages not clearing up till the end of 2022/beginning of 23. Doesn't matter how low they price GPU's if there are not parts to make them. Scalpers will still buy up the very minimal stock, then resale for 2 to 3 times the retail.
@@CrowsRevenge1 Yeah, it's hard to say how all that will play out over the next little while, for a bunch of interconnected reasons. Still, I hear that Intel's main delay here isn't making the cards themselves, it's getting the drivers ready, so I would not at all be surprised to see Intel swamping the GPU market in a way that AMD and NVidia won't easily be able to respond to. It might very well be that those two will keep on pricing as usual while unproven Intel comes out with a 3070 equivalent at $300 or so. If that's the case, at least gamers will have that choice, and an Intel 3070-ish is a lot more appealing than a NVidia GTX 1650, I bet.
MOOOOREEEE info on DDR5 please. I know for sure it's going to be very expensive for early adopters and frequency at first won't be as high as it potentially could be
Somehow I’m thinking Alder Lake could hit a snag because of that. If it can’t hit high memory controller clocks and has to lean on some of these higher gear modes, then it could really suffer.
@@SKHYJINX Well, there is always more to know: it has ECC, uses less power, is faster, and more. But it interests me because it's been promised to have more features and a bigger performance uplift than any other RAM generation, if I'm not mistaken.
But even if it is double the price of DDR4, that doesn't necessarily mean much when DDR4 is so cheap. I mean, I'd happily pay an extra $100 or whatever for 32 GB if it gets me an extra 10% and I could move it to a future system.
@@icaruswindrune7116 Any issues? My original 3900x died and Microcenter warrantied it for me... sold it to a friend and now I've got it back in my backup system, been working just fine. Weird coincidence?? Lol
I find it really hard to justify a cpu upgrade since it also almost always requires a mobo and ram upgrade along with it. But we have a really cool alignment of ddr5 ram, pcie gen 5 and huge cpu jumps.
Don't forget the "we finally overcame the shortages.. or did we?" video he'll make in 2028, explaining that GPUs, PSUs, CPUs and even SSDs are all available (even at normal prices) but no cases or coolers. God, this is just ridiculous 😆
@@josephgray3035 Dark hero is working great, love that it has a passively cooled chipset heatsink, instead of the small fan many of the other x570s have.
Obviously everyone is hyped about the "12900K", and rightly so, but I'll be more interested in a 12400 through 12600, and if/how Intel handles the performance/value equation. 4/6 performance cores and 4/6 efficient cores maybe? For $300 or less? That'd be a compelling midrange platform for your every day user.
12900k will cost a grand? Absolutely fuck all about that. I won't pay more than ~400 for a high-end CPU. Fuck AMD for increasing their prices on 5000 series.
How is this any better to just offering more cores for less money on desktop? If I'm able to get 2-4 more cores on AMD for the same price then how is this not a joke? This has no appreciable application outside of notebooks. It's not like I wouldn't appreciate going back to Intel for the extra features they usually have but there is no way I'm paying the same or more for less actual performance cores. I'm running a 3950X with a 3090 and I'm going for fidelity and resolution over framerate so there is no good reason to upgrade yet. Paid MSRP for all of my stuff and I'm not willing to pay scalpers so this PC is going to stay with me for a long time even though I usually buy a lot of premium stuff when it becomes available.
So you're OK with having something like Windows updates start robbing your CPU of performance while you're playing a game? This hybrid design could be one of the biggest innovations we have seen in a long time when it comes to doing literally anything on your computer. To possibly be able to fine-tune your Windows install so that certain tasks only ever run on the E cores or P cores is a mind-blowing level of tuning. Not to mention, not having your cores go balls to the wall whenever any random task happens is always a plus for your electric bill lol
@@thisismelsemail1217 That's no argument; wouldn't you rather have 4 extra P cores instead of E cores? I never have issues running updates in the background on my 3950X, and neither does my co-op buddy with a 5900X. To me, Intel is trying to cheap out by forcing a solution primarily meant for power-saving mobile use on desktop users instead of doing the decent thing: offering more cores in their lineup across the board, from i3 to i9. It isn't even innovative; smartphones have been doing it for over a decade now, and there is nothing that would make an E core better at background tasks than having another P core in its place. If Intel is really entering the chiplet world with this design, one might even argue they are reusing crappy chiplets they have lying around instead of filling that die space with proper cores. Wouldn't be surprised if it turns out those E cores are 14nm++++++++++++++++ leftovers grafted on there.
@@whydoihavetodothisannoying not everything is about cores dude. Those games you're playing, they use one core. Maybe two at best. There's a reason the 8700k plays games just as well as your 3950X. And at half the price to boot. Especially if you play games for fidelity over fps. Before arguing with someone on a subject you may want to at least read up on said subject just a bit. Oh and btw, I am running a 9700K paired with a 1080ti and have yet to come across a game I can't run at 4K 50-60 fps
@@thisismelsemail1217 Yes, not everything is about cores, but there isn't a reason why low-power E cores should be in any way better than just regular ones at doing whatever is running in the background. That's why I mentioned my processor: when your goal is to play or work without being disturbed by background tasks, it doesn't matter if you have P or E cores doing that background work. Having 2 extra regular cores over, say, 4 E cores might also be the better option, since those small cores will take a lot more time to finish a potentially long process like updating Windows. And believe me, I'm well informed; I work in software development and I use my computer for more than playing games. Sure, those 16-core CPUs might not have as strong single-core performance, but when playing on a 5120x2160 screen my aim is to get 100+ FPS, and I'm still usually limited by the 3090 in modern games. And it's still good enough to run my Index at 144 Hz. This is where I reiterate the core of my statement: why would you want low-power cores INSTEAD of additional regular cores on a desktop, where you aren't limited by battery life?
@@thisismelsemail1217 That "innovation" has existed for some time, and it didn't come from Intel. It is very hard to respect someone when all he shows is infinite repetition of Intel's Apple-wannabe marketing.
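The per-core tuning this thread argues about - steering background work onto E cores, or keeping a game on P cores - can be sketched with the OS's affinity API. A minimal Linux sketch, assuming a hypothetical 8P+8E layout where logical CPUs 0-7 are P cores and 8-15 are E cores (real enumeration varies by CPU, SMT, and kernel):

```python
import os

# Hypothetical core layout for an 8P+8E hybrid part: the indices below
# are an assumption for illustration; real enumeration varies by system.
P_CORES = set(range(0, 8))
E_CORES = set(range(8, 16))

def pin_to(cores):
    """Restrict the current process to the given CPU set.

    Clamped to the CPUs the OS actually offers, so the sketch still
    runs on machines with fewer (or differently numbered) cores.
    """
    available = os.sched_getaffinity(0)        # Linux-only API
    target = (cores & available) or available  # fall back if no overlap
    os.sched_setaffinity(0, target)
    return os.sched_getaffinity(0)

# e.g. keep a chatty background task off the P cores:
print("now pinned to:", sorted(pin_to(E_CORES)))
```

On Windows the equivalent knob is `SetProcessAffinityMask` (or Task Manager's affinity dialog); the hybrid scheduler is supposed to make this manual pinning unnecessary, which is exactly what the thread is debating.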
I really don't care about a short, jargon loaded video, over hyping a new technology. I much prefer a more simply worded, longer video that explains things in a way that is easy to understand but doesn't treat you like a 5 year old, and for that I am glad Jay exists in this space. As far as community and channel goes, this may not be the biggest nor the most technical but gosh darn it if it isn't the most level headed one and that is why I love it
The whole hybrid design is great for mobile devices and laptops where battery life is an issue, but to me it just seems like a waste of time and resources on a desktop chip. I'd rather trade those 8 "E cores" for 2 more "P cores".
E cores aren't ARM architecture. And why limit the OS to the E cores and applications to the P cores? Seems like a way to introduce inefficiencies into a system designed to overcome them.
Yes. I don't want either one of them to get too far ahead. The second any one gets too far ahead, the risk of higher prices and product stagnation just increases. I WANT it to be hard to choose what components to buy for my next PC because there is a wealth of excellent choices.
As much as I enjoy AMD taking Intel to task all across the board, competition is good for the consumer (and really everyone). When you stop trying, you wind up building 14nm ovens for a decade. When 11th gen launched, it started pushing AMD prices down too.
Then you need to wait until the end of 2022 for an AMD counter. 3D cache is overhyped and makes no sense. During the presentation they just used cherry-picked games so they could claim a 15% IPC uplift, but that's not true! Only Zen 4 will be good, at the end of 2022. But at that time Intel launches Raptor Lake. So it will be a neck-and-neck race from now on.
I'll wait and watch for you and the other YTers out there to get your hands on it and run some tests. Plus I'm waiting on AMD to launch their next-gen CPUs. More info on DDR5 would be great!!!
Is there any direct Linux development happening with the scheduling feature? Really not looking forward to windows 11...10 is driving me crazy already.
I believe the Linux kernel was updated last year to handle hybrid CPUs and improve performance, and the 5.5 kernel update includes these and newer updates to the scheduler. I know my systems run multitasking much more efficiently on Linux than Windows, but Linux is a true multiuser/multitasking system where Windows is not, so it's a bit of a chalk-and-cheese situation in that regard. By the way, I'm in the midst of switching out 12 Windows 10 machines in an office to Fedora 34, thanks to the Windows 11 tests.
I'm mostly curious how LINUX will play into Intel's CPU plans. Will Linux do as well as Win11 at juggling the cores? (In theory it should do better, but if Microsoft keeps secrets...) Will Win11 even do a good job? It'd be funny if Win10, without the logic to handle it properly, actually does better. Etc. AND... VMs. How well can your VM hypervisor handle splitting up cores between VMs? The proof will be in the pudding, and I trust Microsoft not at all. ROFL. But it will be very important to know which OS is best to pair with Intel's new toy.
@@slightlyopinionated8107 I'd like to think so too. A year late, maybe. But e v e n t u a l l y. The Linux kernel is far better designed as far as task assignment, so it should not only be possible, but be better. And yet... The problem with Linux is that as FOSS, if no one feels like working on it, it never gets done. You would think Intel would have a kernel patch already in hand for the community. That they'd find it FUN to develop. But I haven't heard Linux mentioned AT ALL. Just Windows. Windows with a severe AMD oversight bad enough to spark conspiracy theorists. It's ... curious ... how this is going.
@@justsomeperson5110 to that I agree. Windows 11 is catered so much towards Intel that it is suspicious. I’m not sure if intel’s older gens took the same performance hit but if not there is definitely something funny
Exciting to see a new angle from Intel, curious if this was also designed with cpu encoded live streaming in mind. Hopefully they'll be competitive on price for once.
My bet is that Alder Lake improves power efficiency substantially, and that Intel's actual bragging point will be a 19% performance-per-watt improvement. Outside the productivity sphere, Intel doesn't care about selling chips. Then AMD will come with Zen5, and DDR5 will be feeding Navi iGPUs, obviating the need for a low- or even mid-tier discrete GPU; during the chip shortage, AMD would make a killing.
@@emmata98 Well Zen4 is another story, he said Zen3. Also I think that intel's big cores should have at least a 25% IPC improvement considering they added 50% more ALUs, 50% more AGUs, much more bandwidth and registers and cache. And if the clock speeds remain the same, that'd be quite a nice improvement.
@@rattlehead999 Jay said only an 18% improvement, and I think some of that bandwidth is needed for the better cores and then for the additional E cores, since they also use the L3 cache and the same memory.
@@emmata98 Maybe it lacks bandwidth for the first generation of this architecture so they can milk us for longer, like they've been doing along with AMD and Nvidia for the past decade. But at the same time, the 12900K was shown beating the W2990 and R9 5950X in Cinebench, and those small cores seem to have the performance of 1 big core for every 4 small cores, while also not scaling linearly. IDK the specs on the small cores.
@ 6:10 - Yeah, my PC does that all the time when I am playing a game at an intense moment (i.e. final four in Fortnite, or valheim boss) and it is always windows checking for updates. ALWAYS
Hey, I know you asked if anyone was interested in a video about DDR5. If you do this video, could you potentially talk about latency and the potential advantages (or disadvantages) DDR5 would have for gaming?
I'm here on an 8700. I'm going up to 9900 , this new tech can wait. Not necessarily because prices will probably be insane, but mostly because it's new stuff bound to have new bugs.
Eeeyup. I'm leaving Alder Lake to rich kids who have to have new Intel toy every time Intel farts. That being said if Alder Lake succeeds, Raptor Lake will be superb. It will be much more mature and the prices (of DDR5 for example) should be more manageable. As for prices - this will depend on how long the drought will hold.
But what counts as maturing? I mean, even if Windows 10 gets confused by the efficiency cores, so you disable them for now, a 12900K is still superior to a 11900K, so maybe it doesn't matter for gaming or anything that doesn't need more than 8 cores? idk, it's going to be very interesting to see what testing reveals.
@@greggmacdonald9644 For me, maturing in an OS is best exemplified by Win95 B vs Win95 Vanilla. As for the hardware, things like Ver 3.1 like I had with the Sandy Bridge chipsets, for example.
Let me tell you how it is going to play out: •Intel will release Alder Lake •Intel will try to capture the market •AMD comes up with DDR5 compatibility, and cheaper •AMD will try to capture the market •Healthy competition will occur (hopefully)
I also wonder if Intel's "Ultra Mobile" version is going to FINALLY revolutionize phones. Atom phones ... didn't catch on. (I was never sure why, to be honest.) But for a VERY long time I have been waiting for a phone that can actually run a full-blown x86 Windows OS. (Or Linux. I'd take Linux.) I wonder if this is the CPU that will finally make that leap, if I could FINALLY just carry an office in my pocket, making travel infinitely better. I don't expect it to be fast. Heck, I used to use a Viliv S5. I know how slow these things can be. LOL But it'd be great if I could finally live the dream that phones promised decades ago when they killed PDAs ... but then NEVER delivered.
For me personally, FPS is king. I need to see how Alder Lake will stack up against Zen 3/Zen 4 and what the 1% lows will be. It took AMD 4 years to make Zen run really well in games; I'm willing to bet Intel will also have growing pains.
I'll wait for their 2nd implementation of it; it'll give me a chance to see how good it is and what AMD will dish out. My i7 9700k did me good for a couple years, and now with a 5900x I'm glad I upgraded even though I really didn't need to. So it'll be nice to see what AMD will try to counter with and how much better these things get. Hopefully by the time I feel the urge to upgrade, the chip shortage won't be as bad and the prices will have come back down a bit.
Anyone who upgrades their home non-business setup more often than once every 2 years is a sucker. Most upgrades are either incremental or take years to work the bugs out of. People don't seem to realize that if they can wait for the bugs to be worked out, they can wait for their existing hardware to actually be old before replacing it. I bet many of the people who think they really need that 240th frame per second don't even have a 240Hz monitor that can display it.
Sounds more to me like they want it, on paper, to be a 16-core processor. However, you're not getting 16 real cores; 8 of them are very cut-down cores. Normal users aren't going to know this. They are just going to see the "16 core" marketing and fall for it.
I just built an AMD system with ddr 4 and pcie gen 4.... I'm good for now. All that new tech is going to put both you and your wallet on the bleeding edge. 12VO motherboards, ddr5 ram, new socket, new intel chips/ chipset, etc, whew the cost is going to be eye watering.
Motherboard and PSU manufacturers are really pushing back on the 12VO standard that Intel is putting out; I highly doubt they would force this without the support of the mobo OEMs.
@@mcp866 I'm assuming this is coming from a good place, so here is the definition of the word pun. Pun: a joke exploiting the different possible meanings of a word or the fact that there are words which sound alike but have different meanings. "the pigs were a squeal (if you'll forgive the pun)"
YAY AMD Intel going at it once again! This is gonna be great, I just hope both just keep making great products. I feel like its been forever since I had to really research a platform to build the best I can with the pennies I have lol
@@Spectru91 That's usually the right formula lol. Seeing these tech giants battle it out for decades, with Intel usually holding it down, has been fun; and to think that both companies came from the same background before branching off to create great products.
Lurker here... just drinking my morning coffee, looking at the background. Ahhh... I get it... Tires = CPU Threads (treads) Ladder = CPU clock steps Or maybe I just need to drink more coffee.
> Just like the way Windows 10 was required to take advantage of ... 9900K features That's the first I heard about that... Could anyone tell me something more specific so I can find the details?
I told my buddies to wait for this release before upgrading their rig. They’re not in a rush and if nothing else the old stuff will drop in price. DDR5 breakdown por favor!
Recently I had to buy a new CPU due to my 6700K dying, I went with a 3700X and it has been amazing! Before Ryzen came out I thought AMD would never be a brand that I would consider.
Intel launching DDR5 support on a mainstream platform is quite the change from how it usually does things. Normally there would be a HEDT release with the newest memory standard (like X58 > DDR3, X99 > DDR4).
This p core and e core scheduling sounds exactly like what we wanted dual cpu systems to be back in the day. 1 cpu to handle games and 1 to handle all the background stuff. I also wonder what their Xeons will be like.
Can't wait for Alder Lake 12th gen. Leaked benchmarks look way better than Zen 3, even the 5950x. And prices are rumored to be cheaper. Intel is now our savior???
We need that ddr5 breakdown vid ASAP
Won't make much of a difference right away most likely
Facts
Bandwidth based games maybe.. latency bound games will suffer a bit.. SOTTR loves bandwidth, but Say Warzone, Loves both latency and bandwidth, scales with both...
Its been done by other channels
I got DDR5 and it is AMAZING. I have Corsair Dominator Platinum RGB 32GB.
This is why I like Jay. Gives you the basic low down in an easy to understand way without over hyping the crap out of it.
Agreed
So true.. most other YTers are just full of shit. So tired of these losers!
Jay is the best 👍🏻
I don’t prefer AMD or Intel, I just want good competition and a wide selection, so this is good
@The Minister of Memes amd literally has those exact same issues and I have a 3900x
@The Minister of Memes
I'm not biased, but if you want some information from the AMD side through user experience:
I've worked with many clients. With AMD and INTEL...
So far I've noticed MORE instability with AMD processors.
Ryzen 7 2700x - Blue screening issues.
Ryzen 3600 - Crashes while running moderate tasks.
Ryzen 5600g - Green screening issues while idle.
But with Intel I've had no issues at all.
I've worked with systems with the 8400, 8700, 9600k, 9900k, 10850k, and 10400.
I've only had issues with the 9600k with a moderate OC.
@The Minister of Memes AMD is changing platforms next generation so everyone is starting from scratch. It'll be interesting especially if there's going to be a fight for the bottom half of the market in both GPU and CPU this round.
Good competition = better prices
@@justdashie1132 Well I've had the exact opposite problem - all my recent Intel builds had stability issues, but my two Ryzen rigs have zero issues at all.
The only issue I did have with AMD is because Asus can't figure out how to unfuck their USB 2.0 header issue.
I'm sticking to AM4 for the next 3 - 4 years at least. I get to avoid being a beta tester for both AMD and Intel, and by then prices for RAM, motherboards, etc should become more reasonable. I am currently on a Zen+ CPU. Going from that to a 5800x a couple years from now will be a solid upgrade. I can then hold on to that CPU for at least 3 years, then move on to next-gen platforms.
This is the smart thing to do. And I'm saying this as someone that will be getting Alder Lake because of 240Hz and BF 2042. DDR5 will be expensive asf and trash at the start. Not worth it unless you're an idiot like me, chasing frames.
@@mcgmgc 240 Hz is literally a waste of time. Stop buying into the stupid e-sports bullshit marketing.
Very smart. I've gotten to the point where my equipment hadn't been performing very well for most things, so I upgraded during these past launches. I won't be upgrading for another 5 years or so unless I never have to worry about money ever again.
Buy a PC for what you need it for. Don't overspend.
Same here. No point till we get good GPUs.
Frankly, I’m just glad to see real competition.
Amen to that.
not just competition but ACTUAL leaps and bounds ahead of what we were getting for years... finally... we get evolutions of upgrades...
Yeah, same.
It appears that AMD's quality has made Intel step up. I love to see it
The consumer is always the winner when companies compete!
Yes, do a DDR5, pci-e gen5 round-up. I’m planning on doing a new build as soon as it’s available.
Wait a year into DDR5. You’ll save lots of money and the compatibility will be more mainstream.
@@I_SuperHiro_I yea, I’m waiting till the supply shortage has calmed down. Should be plenty of time for things to mature
I usually wait two years. It will be very expensive, plus it takes time to work out the bugs. I built a system when DDR4 first came out; it had so many issues till it got updated. Sometimes patience is worth it.
And just like the DDR3 to DDR4 transition, it's likely that the first gen may be 'faster' but the latency will make it an even trade-off. Unless you have a small Epeen... wait a year or 2 like Super Hiro said.
It all comes down to the sum you are prepared to hand out for tech.
You can get 16 GB of DDR5 now, but in 2 years you'll spend even more for the good stuff.
Or you stay with a binned/overclocked DDR4 kit and in 2.5 years get what you actually wanted from the hype.
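The latency trade-off mentioned in this thread checks out numerically: absolute first-word latency is CAS cycles divided by the memory clock (half the transfer rate, since DDR moves two transfers per clock), and early kits of a new generation tend to ship with much looser timings. A quick sketch; the DDR5 timing below is an assumed typical launch spec, not a measured kit:

```python
def first_word_latency_ns(data_rate_mts, cas_latency):
    """Absolute CAS latency in nanoseconds.

    CL is counted in memory-clock cycles, and DDR makes two transfers
    per clock, so one cycle lasts 2000 / data_rate nanoseconds.
    """
    return cas_latency * 2000 / data_rate_mts

# Representative kits; DDR5-4800 CL40 is an assumed early-adopter spec.
for name, rate, cl in [
    ("DDR3-1600 CL9", 1600, 9),
    ("DDR4-3200 CL16", 3200, 16),
    ("DDR5-4800 CL40", 4800, 40),
]:
    print(f"{name}: {first_word_latency_ns(rate, cl):.2f} ns")
```

Despite the much higher transfer rate, the assumed DDR5 kit comes out around 16.7 ns versus 10 ns for mature DDR4-3200 CL16, which is the "faster on paper, even trade-off in practice" pattern the comment describes.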
I would love to watch a DDR5 video from you, Sir Jay. Have a good day!
In regard to the cooler discussion, we have seen Noctua and MSI announce they have free upgrade kits for their current cooler customers to use with LGA 1700. You are absolutely right with your anticipation!
Awesome
Intel finally does something new, after so many years. I wonder what AMD is up to; I don't think they will give up the market lead that easily.
You don't need to wonder. They have announced Zen 3D at Computex.
@@nipa5961 I know, but we do not know much about it. That is why I wonder if they can surprise everyone one more time.
@@zagloba1987 Thats strange, I think we know much less about Alder Lake than about Zen 3D yet. It's simply Zen 3 with more cache stacked on top of it.
@@nipa5961 And more programs will be optimized for them.
@@omegaman7377 ?
I'm also curious about "idle" state. Your PC is just sitting there with a screen saver. Or just watching one video. Easy tasks. How low of a power draw can we manage with this? Paired with ATX12VO?
Intel raised ATX efficiency requirements for 10W loads last year so probably sub 20W or sub 15W with ATX12VO? I'm also curious about UEFI - there could be 6 or more settings just for cpu voltage and dozens of settings for power management...
With a 9900K @ 5GHz all-core, I can wait a few more generations. I don't need to (and can't afford to) have the very best-of-the-best.
I wonder when people are going to realize these "leaks" are done by the company itself to keep people on their platform. I have known this for many years, yet people still seem to think it's someone with great insider knowledge, not the company dribbling out this information.
Well that would be someone with great insider knowledge wouldn't it?
Everyone knows this lol
We should all know this by now.
I’m sitting out till 2023 or 2024. This chip shortage/inflated prices discouraged the heck out of me!!!
Seriously, I've never been this disinterested in tech before. You could tell me the RTX 4080 gets 40% more FPS than the 3090 while using only 25% of the power, and I'd be like "Yeah, but for $2,500 MSRP with scalpers wanting $4k."
Seriously, fuck scalpers, and fuck AMD/NVIDIA for taking advantage of their customers by literally raising prices 2x once they realized scalpers/crypto miners would pay the higher price if gamers wouldn't.
@@stormchasingk9 this. I was looking at building a system with a 3050 or a hypothetical 6500xt but now i've straight up given up. this 3500u is gonna have to last for a few more years lol
@@stormchasingk9 Problem is, as long as it's not masks or TP or hand sanitizer, police and government turn a blind eye to other kinds of scalpers, because non-essential goods disincentivize enforcement.
Marketplace platforms also don't give a damn about non-essential stuff, unlike masks.
I do consider GPUs semi-essential, especially for 3D modelers, CAD users (though they mainly use Quadro or Radeon Pro), and artists who deal with CG.
@@stormchasingk9 yes, cause scalpers are the reason you can't get a GPU. It couldn't be the chip shortage or anything. Which is showing signs that it is NOT going to get better in the next couple years as originally thought and is in fact going to get worse.
Spoiler alert: it's never gonna end. Thanks to Chinavirus.
That Intel comeback happened faster than expected
10:37: NOT EVERYBODY WILL NEED A NEW COOLER. Noctua has said they will send you an upgrade kit *FOR FREE* provided you have a receipt. If you can't provide one, the kit is like $7.
It may be a larger die but higher end coolers should be appropriately sized.
Yeah. Any coolers that support 2066 or TR4 should work just requiring a mounting adapter
Noctua have done this for a long time. I had to replace the motherboard in my Media PC a few years back and got a replacement mounting bracket. Good PR and customer service.
@@barkerit I feel like spending the extra coin on a cooler that won't go to waste because the company has a great track record of supporting their products when other manufactures would rather you buy a new cooler is a selling point.
Where in the hell did you get $999 for a 12900K?? Literally everyone is reporting $605.
Nice! This means ddr4 and zen 3 prices will drop soon. I'll upgrade my 3600 and b450 to a 5800 and x570 for dirt cheap, and grab another 16gb of ddr4 while I'm at it. Loving the competition!
But don't wait too long; once a new generation of RAM has been established, older ones only go up.
@@facestabber I wish they were like that here. 16GB modules are around 100 USD in my country.
@@facestabber I just spent $80 on 16GB of CL16 3600. The 3200 was around $70-75 depending on brand.
Budget gamers rise!
I'm on an 11th gen i9 and a Ryzen 5950X, gaming and working happily.
I'm happy that Intel is finally stepping up their game. AMD was a wake up call for them to get serious again. With that said, I'm still happy using my 10900K so I won't be upgrading for 2-3 years. But it's nice to see Intel finally start pushing the boundaries with performance. Real competition is back
I just built a new machine with a 10600k, for the money it was the best option by far. I won’t be upgrading for at least 5 years.
I'm going to laugh my ass off if they fall flat on their face, since all Intel makes is more overpriced crap.
@@brando3342 same
You'll get 5 years out of that chip.
@@brando3342 It all comes down to your use case for that CPU. But I'm a heavy user and need more cores, and eventually the 10900K will get outdated. There'll be better CPUs out by then from both competitors.
You’re my fav tech YT channel. Why? Because you don’t have an ego problem. You’re grounded, you can laugh at yourself. Hats off to you, Jay.
I will definitely be keeping my eye on Alder Lake and see what AMD responds with. I was originally going to build a whole new computer and upgrade to a Ryzen CPU, but of course the GPU economy has prevented me from doing that. My i5-3570k paired with a GTX680 needs to be retired already, but both the computer and myself can hold out a while longer.
AMD will allegedly release Zen3D (or V-Cache) either at the end of this year (fingers crossed) or Q1 of 2022. The price of Zen3 might drop a little (if it isn't dropped for Alder Lake), and Zen3D has an approx. 15-20% IPC increase over Zen3 (due to larger cache, something that is throttling Intel's CPUs and will be a problem even for Alder Lake).
If I were you I would either wait for Zen3D to be released - and then choose AMD, or if you really NEED Intel - wait for Raptor Lake (so 13th Gen). It's supposed to arrive at Q4 2022 and by that time prices of DDR5 will get lower and all the teething problems with scheduler should at least be partially resolved. Though by then Zen4 will be very close by as well ;)
GPU prices should be a lot better 6 months from now, because Intel is entering the market in Q1, they'll be pricing GPUs cheap to try and get as much market share as possible, and AMD and NVidia may be forced to compete.
@@greggmacdonald9644 Except for the fact that the major chip suppliers are warning of shortages not clearing up till the end of 2022/beginning of '23. It doesn't matter how low they price GPUs if there are no parts to make them. Scalpers will still buy up the very minimal stock, then resell for 2 to 3 times retail.
@@CrowsRevenge1 Yeah, it's hard to say how all that will play out over the next little while, for a bunch of interconnected reasons. Still, I hear that Intel's main delay here isn't making the cards themselves, it's getting the drivers ready, so I would not at all be surprised to see Intel swamping the GPU market in a way that AMD and NVidia won't easily be able to respond to. It might very well be that those two will keep on pricing as usual while unproven Intel comes out with a 3070 equivalent at $300 or so. If that's the case, at least gamers will have that choice, and an Intel 3070-ish is a lot more appealing than a NVidia GTX 1650, I bet.
@@CrowsRevenge1 Intel make their own chips - so there's that.
I'd still like to see a third Cpu manufacturer, that would mean real competition
MOOOOREEEE info on DDR5 please. I know for sure it's going to be very expensive for early adopters and frequency at first won't be as high as it potentially could be
What more do you need to know?
It's fast, but the latencies are higher.
Somehow I’m thinking Alder Lake could hit a snag because of that. If it can’t hit high memory controller clocks and have to lean on some of these higher gear modes, then it could really suffer.
@@reinhardtwilhelm5415 Thinking the same... 11th gen and 12th gen on Gear 2, and the fact that normie users don't tweak this stuff much.
@@SKHYJINX Well, there is always more to know: it has ECC, uses less power, is faster, and more. It interests me because it's been promised to bring more features and a bigger performance uplift than any other RAM generation, if I'm not mistaken.
But even if it is double the price of DDR4, that doesn't necessarily mean much when DDR4 is so cheap. I mean, I'd happily pay an extra $100 or whatever for 32 GB if it gets me an extra 10% and I could move it to a future system.
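To put rough numbers on the latency trade-off being discussed here: first-word latency is just CAS cycles divided by the memory clock, and early DDR5 kits tend to lose that comparison even while winning on bandwidth. The kit specs below (DDR4-3600 CL16, DDR5-4800 CL40) are illustrative assumptions, not measurements:

```python
# First-word latency = CAS cycles / memory clock (memory clock = MT/s / 2,
# since DDR transfers twice per clock). Kits below are assumed spec points.
def first_word_latency_ns(transfer_rate_mts, cas_cycles):
    clock_mhz = transfer_rate_mts / 2        # effective clock in MHz
    return cas_cycles / clock_mhz * 1000     # cycles / MHz -> nanoseconds

ddr4 = first_word_latency_ns(3600, 16)   # DDR4-3600 CL16 -> ~8.9 ns
ddr5 = first_word_latency_ns(4800, 40)   # DDR5-4800 CL40 -> ~16.7 ns
print(f"DDR4: {ddr4:.1f} ns, DDR5: {ddr5:.1f} ns")
```

By this rough math the early DDR5 kit has nearly double the first-word latency, which is why several comments here expect an even trade-off until timings tighten.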
10900k you will not be replaced! I love you.
I love my 10850k so much. Will need to be something substantial to justify the upgrade.
Jay will give you something substantial. His two cents.
Same for me, albeit with my 3900 x. All I know is that I will be waiting until zen 4 comes out before I upgrade my CPU.
@@icaruswindrune7116 Any issues? My original 3900X died and Microcenter warrantied it for me... sold it to a friend and now I got it back in my backup system; it's been working just fine.
Weird coincidence?? Lol
@@Skippernomnomnom hey now. Don’t get my hopes up. I want the whole 9 cents.
@@KevinLikesRTS 😭
I find it really hard to justify a cpu upgrade since it also almost always requires a mobo and ram upgrade along with it.
But we have a really cool alignment of ddr5 ram, pcie gen 5 and huge cpu jumps.
I upgraded from a 4th gen i5 to a 7th gen i7, so yeah, I needed a new mobo, CPU, and RAM, but it was definitely worth it.
You have to factor in the cost of Windows 11 too if they don't have the free upgrade route. Retail Windows 10 is what $150-200 bucks?
@@Craig_N or grey market keys for 10 bucks and hope it will work fine
@@aberinox I'm going with the assumption that early adopters will have to buy a full-on retail box just for the key.
Going to wait and see how they actually perform before I upgrade.
Way to sound like my ex-wife....
@@gridsquare I lold
like everyone should do
I was lucky enough to recently upgrade to a shiny, new system so I'll be excited to watch Jay's DDR6 news and reviews in 2029.
Don't forget the "we finally overcame the shortages.. or did we?" Video he'll make in 2028, explaining GPUs, PSUs, CPUs and even SSDs are all available (even to normal prices) but no cases or coolers.
God, this is just ridiculous 😆
There will also be a plutonium shortage for 1.21 gigawatt power supplies.
Might finally upgrade my 6700k. Been doing great but it's about that time and this, if it stays along this line, is definitely interesting.
I'm in the same boat. 6700k gang
Im super happy with my 5950X, but sounds like its gonna be a good competition. GL to all players.
Would love more follow ups & that DDR5 video lol.
What mobo did you get? I just got the cpu trying to find a good one.. I was looking at the dark hero
@@josephgray3035 I got the ASUS ROG Strix X570-F gaming.
No complaints so far
@@josephgray3035 Dark hero is working great, love that it has a passively cooled chipset heatsink, instead of the small fan many of the other x570s have.
Obviously everyone is hyped about the "12900K", and rightly so, but I'll be more interested in a 12400 through 12600, and if/how Intel handles the performance/value equation. 4/6 performance cores and 4/6 efficient cores maybe? For $300 or less? That'd be a compelling midrange platform for your every day user.
It's great to see Intel get their act together. More competition for the consumer!
Intel never lost their crown they just waited for AMD to get close in gaming CPUs...they could crush AMD if they went all out
@@moby1kanob How old are you? I assume you're not a kid, as much as you appear to be one in that comment.
Will be interesting to see how AMD tries to one up them if these are any good. I can see the price being a sore spot but competition is always good
The 12900K will cost a grand? Absolutely fuck all of that. I won't pay more than ~$400 for a high-end CPU. Fuck AMD for increasing their prices on the 5000 series.
How is this any better to just offering more cores for less money on desktop? If I'm able to get 2-4 more cores on AMD for the same price then how is this not a joke? This has no appreciable application outside of notebooks. It's not like I wouldn't appreciate going back to Intel for the extra features they usually have but there is no way I'm paying the same or more for less actual performance cores.
I'm running a 3950X with a 3090 and I'm going for fidelity and resolution over framerate so there is no good reason to upgrade yet. Paid MSRP for all of my stuff and I'm not willing to pay scalpers so this PC is going to stay with me for a long time even though I usually buy a lot of premium stuff when it becomes available.
So you're OK with having something like Windows updates rob your CPU of performance while you're playing a game? This hybrid design could be one of the biggest innovations we've seen in a long time for doing literally anything on your computer. Being able to fine-tune your Windows install so certain tasks only ever run on the E cores or P cores is a mind-blowing level of tuning. Not to mention, not having your cores go balls to the wall whenever any random task happens is always a plus for your electric bill lol
@@thisismelsemail1217 That's no argument. Wouldn't you rather have 4 extra P cores instead of E cores? I never have issues running updates in the background on my 3950X, and neither does my co-op buddy with a 5900X. To me, Intel is trying to cheap out by forcing a solution primarily meant for power saving on mobile onto desktop users, instead of doing the decent thing: offering more cores across the lineup, from i3 to i9.
It isn't even innovative; smartphones have been doing it for over a decade now, and there is nothing that would make an E core better at background tasks than having another P core in its place.
If Intel is really entering the chiplet world with this design, one might even argue they are reusing crappy chiplets they have lying around instead of filling that die space with proper cores.
Wouldn't be surprised if it turns out those E cores are 14nm++++++++++++++++ leftovers grafted on there.
@@whydoihavetodothisannoying not everything is about cores dude. Those games you're playing, they use one core. Maybe two at best. There's a reason the 8700k plays games just as well as your 3950X. And at half the price to boot. Especially if you play games for fidelity over fps. Before arguing with someone on a subject you may want to at least read up on said subject just a bit. Oh and btw, I am running a 9700K paired with a 1080ti and have yet to come across a game I can't run at 4K 50-60 fps
@@thisismelsemail1217 Yes, not everything is about cores, but there isn't a reason why low-power E cores should be any better than regular ones at whatever is running in the background. That's why I mentioned my processor: when your goal is to play or work without being disturbed by background tasks, it doesn't matter whether P or E cores do that background work. Having 2 extra regular cores instead of, say, 4 E cores might also be the better option, since those small cores will take a lot more time to finish a potentially long process like updating Windows.
And believe me, I'm well informed; I work in software development, and I use my computer for more than just playing games.
Sure, those 16-core CPUs might not have as strong single-core performance, but when playing on a 5120x2160 screen my aim is 100+ FPS, and I'm still usually limited by the 3090 in modern games. And it's still good enough to run my Index at 144 Hz.
This is where I reiterate the core of my statement:
Why would you want low-power cores INSTEAD of additional regular cores on a desktop, where you aren't limited by battery life?
@@thisismelsemail1217 That "innovation" has existed for some time, and it didn't come from Intel. It is very hard to respect someone when all he shows is infinite repetition of Intel's Apple-wannabe marketing.
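On the task-steering idea being debated here: you don't have to wait for the Windows scheduler; on Linux you can pin processes to specific logical CPUs yourself today. A minimal sketch, with an invented core numbering for illustration (the 0-15 P-thread / 16-23 E-core split below is an assumption, not documented fact; real enumeration depends on the CPU and kernel):

```python
import os

# Hypothetical ID split for an 8P+8E chip, purely for illustration:
P_CORE_THREADS = set(range(0, 16))   # assumed: P cores with hyperthreading
E_CORES = set(range(16, 24))         # assumed: single-thread E cores

def pin_to_cores(cpu_ids):
    """Restrict the current process to the given logical CPUs (Linux only)
    and return the affinity mask the kernel actually accepted."""
    os.sched_setaffinity(0, cpu_ids)   # pid 0 = the calling process
    return os.sched_getaffinity(0)

# e.g. keep a background job off the (assumed) P cores:
#   pin_to_cores(E_CORES)
```

This is the crude manual version of what a hybrid-aware scheduler is supposed to do automatically, and it's how big.LITTLE tuning has been done on phones for years.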
I really don't care about a short, jargon loaded video, over hyping a new technology.
I much prefer a more simply worded, longer video that explains things in a way that is easy to understand but doesn't treat you like a 5 year old, and for that I am glad Jay exists in this space.
As far as community and channel goes, this may not be the biggest nor the most technical but gosh darn it if it isn't the most level headed one and that is why I love it
The whole hybrid design is great for mobile devices and laptops where battery life is an issue, but to me it just seems like a waste of time and resources on a desktop chip. I'd rather trade those 8 "E cores" for 2 more "P cores".
Steve is good when you want to geek out and get into the weeds. Jay is good when you want to get the picture. I appreciate you man
Imagine a lightweight ARM OS that runs entirely on the E cores, leaving the P cores to handle x86 applications
E cores aren't arm architecture. And why limit the OS to the E cores and applications to P cores?
Seems like a way to introduce inefficiencies into a system designed to overcome them.
@@robertodesimone2823 I think if you put a really lightweight Linux distro on ARM E cores the battery would last for months lol
Yes. I don't want either one of them to get too far ahead. The second any one gets too far ahead the risk for higher prices and product stagnation just increase. I WANT it to be hard to choose what components I want to buy for my next PC because there are a wealth of excellent choices.
Love this channel
Yeah boii
The new scheduler in Windows for these new hybrid chips is the real reason we have W11.
As much as I enjoy AMD taking Intel to task all across the board, competition is good for the consumer (and really everyone). When you stop trying, you wind up building 14nm refreshes for a decade. When 11th gen launched, it started pushing AMD prices down too.
I really hope this P/E design performs well - I wanna see these two aggressively leapfrog each other.
I'm more concerned about the software part. I hope Microsoft won't take the easy way out by putting a Windows 11-only label on it.
Can't wait to see the benchmarks and what AMD will counter with... Interesting times
Smart money says Amd will keep Windows 10 support
indeed.
The benchmarks are already out. Have been for days. Intel is topping the charts, even over Threadrippers.
@@williameldridge9382 You mean over a 3-year-old Threadripper? God, Intel marketing BS really hit you hard.
Then you need to wait until the end of 2022 for an AMD counter. 3D cache is overhyped and makes no sense; during the presentation they cherry-picked games so they could claim a 15% IPC uplift, but that's not true. Only Zen 4 will be good, at the end of 2022. But by then Intel launches Raptor Lake, so it will be a neck-and-neck race from now on.
I like when they put little windows in the processor
I'll wait and watch for you and the other YTer's out there, get your hands on it and run some tests. Plus I'm waiting on AMD to launch their next gen CPU's. More info on DDR5 would be great!!!
I liked the nod to King of the Hill in the sign off Jay
Awesome! *Engine block PC case update when?*
DANCE DANCE REVOLUTION 5 IS COMING?!?!?!?!?!?
Is there any direct Linux development happening with the scheduling feature? Really not looking forward to windows 11...10 is driving me crazy already.
I believe the Linux kernel was updated last year to handle hybrid CPU's and improve performance. Also the kernel 5.5 update includes these and newer updates to their scheduler.
I know my systems run multitasking much more efficiently on Linux than Windows, but Linux is a true multiuser/multitasking system where Windows is not, so it's a bit of a chalk-and-cheese comparison in that regard. By the way, I'm in the midst of switching 12 Windows 10 machines in an office over to Fedora 34 thanks to Windows 11 tests.
Linux (in a form of Android) has been running on ARM chips with big.LITTLE architecture for years by now. What Intel is doing is not different.
1:02 Look at all those power supplies!
I'm mostly curious how LINUX will play into Intel's CPU plans. Will Linux do as good a job as Win11 at juggling the cores? (In theory it should do better, but if Microsoft keeps secrets...) Will Win11 even do a good job? It'd be funny if Win10, without the logic to handle it properly, actually does better. And VMs: how well can your hypervisor split these cores up between VMs? The proof will be in the pudding, and I trust Microsoft not at all. ROFL. But it will be very important to know which OS is best to pair with Intel's new toy.
I’m pretty sure Red Hat and VMware will do something about it
@@slightlyopinionated8107 I'd like to think so too. A year late, maybe. But e v e n t u a l l y. The Linux kernel is far better designed as far as task assignment, so it should not only be possible, but be better. And yet... The problem with Linux is that as FOSS, if no one feels like working on it, it never gets done. You would think Intel would have a kernel patch already in hand for the community. That they'd find it FUN to develop. But I haven't heard Linux mentioned AT ALL. Just Windows. Windows with a severe AMD oversight bad enough to spark conspiracy theorists. It's ... curious ... how this is going.
@@justsomeperson5110 to that I agree. Windows 11 is catered so much towards Intel that it is suspicious. I’m not sure if intel’s older gens took the same performance hit but if not there is definitely something funny
Exciting to see a new angle from Intel, curious if this was also designed with cpu encoded live streaming in mind. Hopefully they'll be competitive on price for once.
My bet is that Alder Lake improves power efficiency substantially, and that Intel's actual bragging point will be a 19% improvement in performance per watt. Outside the productivity sphere, Intel doesn't care about selling chips. Then AMD will come with Zen 5, and DDR5 will be feeding Navi iGPUs, obviating the need for a low- or even mid-tier discrete GPU; during the chip shortage AMD would make a killing.
I'm waiting to see where things go with a new HEDT platform.
I just want to see some real benchmarks and see where it falls against Zen 3. :D
It's faster, because the big cores alone have quite higher IPC than the current generation of Intel CPUs.
@@rattlehead999 but it has to counter 16 big cores and with improvements to Zen 4, I don't think they will get the lead back
@@emmata98 Well Zen4 is another story, he said Zen3.
Also I think that Intel's big cores should have at least a 25% IPC improvement, considering they added 50% more ALUs, 50% more AGUs, and much more bandwidth, registers, and cache.
And if the clock speeds remain the same, that'd be quite a nice improvement.
@@rattlehead999 Jay said only an 18 % improvement and I thing that some of the bandwidth is needed for the better cores and then for the additional e-cores, since they also use L3 Cache and the same memory.
@@emmata98 Maybe it lacks bandwidth for the first generation of this architecture, so that they can milk us for longer, like they've been doing along with AMD and Nvidia for the past decade.
But at the same time, the 12900K was shown beating the W2990 and R9 5950X in Cinebench, and those small cores seem to have the performance of 1 big core for every 4 small cores, while also not scaling linearly. IDK the specs on the small cores.
Jay really looks old and young at the same time.
16:10 "sometimes I feel like these are controled leaks by the brands themselves" Oh really Jay ??? You think so ??? nO WaY !!!!
The weeks leading up to every Apple event 🤣. So many leaks and “what to expect” articles and videos. Every. Single. Time.
@ 6:10 - Yeah, my PC does that all the time when I am playing a game at an intense moment (i.e. final four in Fortnite, or valheim boss) and it is always windows checking for updates. ALWAYS
Hey, I know you asked if anyone was interested in a video about DDR5. If you do this video, could you potentially talk about latency and the potential advantages (or disadvantages) DDR5 would have for gaming?
I'm here on an 8700. I'm going up to a 9900; this new tech can wait. Not necessarily because prices will probably be insane, but mostly because it's new stuff bound to have new bugs.
If Win 11 is so embedded in the technology offered by Intel, should we then wait for Raptor Lake so that both the software and hardware matures?
As long as you have the patience
I'm thinking the same thing. Plus, maybe by the time Raptor Lake comes out, the chip shortage will be over
Eeeyup. I'm leaving Alder Lake to rich kids who have to have new Intel toy every time Intel farts. That being said if Alder Lake succeeds, Raptor Lake will be superb. It will be much more mature and the prices (of DDR5 for example) should be more manageable.
As for prices - this will depend on how long the drought will hold.
But what counts as maturing? I mean, even if Windows 10 gets confused by the efficiency cores, so you disable them for now, a 12900K is still superior to a 11900K, so maybe it doesn't matter for gaming or anything that doesn't need more than 8 cores? idk, it's going to be very interesting to see what testing reveals.
@@greggmacdonald9644 For me, maturing in an OS is best exemplified by Win95 B vs Win95 Vanilla. As for the hardware, things like Ver 3.1 like I had with the Sandy Bridge chipsets, for example.
Let me tell you how it is going to play out:
•Intel will release Alder Lake
•Intel will try to capture the market
•AMD comes out with DDR5 compatibility, cheaper
•AMD will try to capture the market
•Healthy competition will occur (hopefully)
I also wonder if Intel's "Ultra Mobile" version is going to FINALLY revolutionize phones. Atom phones ... didn't catch on. (I was never sure why, to be honest.) But for a VERY long time I have been waiting for a phone that can actually run a full-blown x86 Windows OS. (Or Linux. I'd take Linux.) I wonder if this is the CPU that will finally make that leap, if I could FINALLY just carry an office in my pocket, making travel infinitely better. I don't expect it to be fast. Heck, I used to use a Viliv S5. I know how slow these things can be. LOL But it'd be great if I could finally live the dream that phones promised decades ago when they killed PDAs ... but then NEVER delivered.
I missed the 2020 upgrade train and I'm still using an i5 3570K setup. Surprisingly, I have no issues at all, but yeah, I no longer do heavy gaming anymore.
For me personally, FPS is king. I need to see how Alder Lake will stack up against Zen 3/Zen 4 and what the 1% lows will be. It took AMD 4 years to make Zen run really well in games; I'm willing to bet Intel will also have growing pains.
Agreed! I don't care how fast it can run through Cinebench, I only care for the gaming performance, which makes FPS king
@@Darkhalo314 100% this. Who exactly are Cinebench scores for anyway? Competitive overclockers maybe, but that's about it.
DDR5 breakdown would be great, big fan of the Jayz talking head vids like these
I'll wait for their 2nd implementation of it; that'll give me a chance to see how good it is and what AMD will dish out. My i7 9700K did me good for a couple of years, and now with a 5900X I'm glad I upgraded even though I really didn't need to. So it'll be nice to see what AMD counters with and how much better these things get. Hopefully by the time I feel the urge to upgrade, the chip shortage won't be as bad and prices will have come back down a bit.
Anyone who upgrades their home non-business setup more often than once every 2 years is a sucker. Most upgrades are either incremental or take years to work the bugs out of. People don't seem to realize that if they can wait for the bugs to be worked out, they can wait for their existing hardware to actually be old before replacing it. I bet many of the people who think they really need that 240th frame per second don't even have a 240Hz monitor that can display it.
Sounds to me more like they want it to be a 16-core processor on paper. However, you're not getting 16 real cores; you're getting 8 very cut-down cores. Normal users aren't going to know this; they'll just see the "16 core" marketing and fall for it.
I just built an AMD system with ddr 4 and pcie gen 4.... I'm good for now. All that new tech is going to put both you and your wallet on the bleeding edge. 12VO motherboards, ddr5 ram, new socket, new intel chips/ chipset, etc, whew the cost is going to be eye watering.
Motherboard and PSU manufacturers are really pushing back on the 12VO standard that Intel is putting out; I highly doubt Intel can force this through without the support of the mobo OEMs.
Jay: The PCIe 5.0 isn't a move that matters for games.
Also Jay: It's for massive raids.
Me: *WoW puns intensify*
I'm hoping you realize he was referring to RAID configurations for drives...
@@mcp866
I'm assuming this is coming from a good place, so here is the definition of the word pun.
Pun:
a joke exploiting the different possible meanings of a word or the fact that there are words which sound alike but have different meanings.
"the pigs were a squeal (if you'll forgive the pun)"
Interesting to see how it's going to stack up against AMD's 5nm next year
Well it won’t be competing with 12th gen it’ll be competing with 13th gen since it drops next year too
This is a great, solid explanation of the features of Alder Lake!
YAY, AMD and Intel going at it once again! This is gonna be great; I just hope both keep making great products. I feel like it's been forever since I had to really research a platform to build the best I can with the pennies I have lol
intel for gaming
amd for productivity
simple as that
@@Spectru91 but I have a 5900x. so why not both?
@@Spectru91 I would take Intel for productivity. Much more reliable and compatible.
@@Xirpzy more compatible? Intel? 😂😂😂
@@Spectru91 That's usually the right formula lol. Seeing these tech giants battle it out for decades, with Intel usually holding it down, has been fun; and to think that both companies came from the same background before branching off to create great products.
I'm confused, Jay said "triperthreading" for 16 cores and 24 threads at 1:49 . Wouldn't that be 48 threads instead?
1.5erthreading
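For what it's worth, the 16-core/24-thread math works out if only the eight P cores are hyperthreaded while the eight E cores run one thread each (assuming the rumored 12900K layout, not a confirmed spec):

```python
# Rumored 12900K layout: 8 hyperthreaded P cores + 8 single-thread E cores.
p_cores, e_cores = 8, 8
threads = p_cores * 2 + e_cores * 1   # HT only on the P cores
print(p_cores + e_cores, "cores,", threads, "threads")  # 16 cores, 24 threads
```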
Can't wait to see the JayzTwoCents vs Gamers Nexus overclocking series!
Lurker here... just drinking my morning coffee, looking at the background.
Ahhh... I get it...
Tires = CPU Threads (treads)
Ladder = CPU clock steps
Or maybe I just need to drink more coffee.
I'm very interested in ddr5, and curious to see what it will do for integrated graphics.
Im interested in the igpu also.
> Just like the way Windows 10 was required to take advantage of ... 9900K features
That's the first I heard about that... Could anyone tell me something more specific so I can find the details?
Competition is King. We've all seen how things stagnate when there is none.
Thanks for the vid.
LGA1700 mounts are already available on Amazon for certain Noctua coolers. (NM-i17xx-MP83 Mounting Kit)
I told my buddies to wait for this release before upgrading their rig. They’re not in a rush and if nothing else the old stuff will drop in price. DDR5 breakdown por favor!
Arctic has already stated that they will ship free brackets to customers who provide purchase info, so that's a bonus: no need for a new cooler.
Recently I had to buy a new CPU due to my 6700K dying, I went with a 3700X and it has been amazing! Before Ryzen came out I thought AMD would never be a brand that I would consider.
You wasted your money and joined the AMD hype train. An 11400F build would have saved you money and given you better performance. But it was your choice.
Requires Windows 11? That lost me right there. Keeping my AMD.
Please! Yes! DDR5 roundup! That would be great! Thanks!
Intel using a hybrid design to try and improve CPU performance! Very interesting strategy! Can't wait to see how it differs in testing!
Yes, we do need the ddr5 review. Thanks
Excited for alder lake. That’s my next build!
Yes please to that ddr5 roundup. Thank you.
Can't wait to see the benchmarks!!
Competition always benefits the customers.
Finally I see people asking about DDR5. Tests are IRRELEVANT until everyone is on equal RAM.
Intel launching DDR5 support on a mainstream platform is quite the change from how it usually does things. Normally there would be a HEDT release with the newest memory standard (like X58 > DDR3, X99 > DDR4).
This p core and e core scheduling sounds exactly like what we wanted dual cpu systems to be back in the day. 1 cpu to handle games and 1 to handle all the background stuff.
I also wonder what their Xeons will be like.
Can't wait for Alder Lake 12th gen. Leaked benchmarks look way better than Zen 3, even the 5950x. And prices are rumored to be cheaper. Intel is now our savior???
"There's something not right about that boy." Gotcha Hank. You look more like Boomhauer.