We need more interviews with Mark. He's a little unsettled to start, but then his passion kicks in and it's all gravy. Nerdy, informative, and entertaining.
Idk about unsettled; he seems more just very careful not to say too much, calculating where and how to steer the conversation given the limits he (or Intel) has imposed on him.
Not only does it prove he loves his work, it's the kind of statement only a person with super high intelligence would make about solving a CPU operations-level problem.
Bravo to Intel for allowing this type of interview and, in fact, all of the behind-the-scenes content they've shared. It personally makes me appreciate them that much more... Thank you.
I guess this interview makes up for the years of anti-consumer practices? Oh, you appreciate them "that much more"?? Well, glad they've been looking out for at least you personally all this time. Was an interesting interview tho.
@@XX-121 .. I think you're looking through your polarized glasses and not at the reality. Every company, especially a multi-billion-dollar one, has a mandatory agenda of profitability if it's publicly traded. Listen, I'm the first to be thrilled that AMD has finally come out of the closet booming with a competitive CPU that has pushed the market to benefit consumers. So is it Intel's fault, or the lack of competition, that you're upset with? _Just food for thought, and a little off topic: this is what happens when you have mega-mergers and/or very little competition. Keep that in mind._
As somewhat of a corporate gun-for-hire nerd myself: a multi-billion-dollar company does not let a geek out into the wild if they won't say the right things. Not that he isn't stating facts or sharing cool info, but he is being paid to be there to talk tech. He is a great presenter.
This is so awesome, the community has been STARVING for media like this for so long, this is a great trend, but I can't believe it took this long. Thanks so much to Mark for doing this, we appreciate the time and your enthusiasm, Intel needs to share more of you with the world.
I think so many people want to see interviews like this, but big companies won't allow it. The reason this one happened might be the government's CHIPS Act.
Thank you Mark & der8auer! A *very* interesting and informative interview. I was completely engrossed by the discussion. I would love to see more of Mark with these types of behind the scenes technical discussions.
This is the kind of transparency that consumers need and some have been asking for, for a long time! Please keep up the amazing and informative videos!
I work at Intel and have been following this channel since the beginning. I wish Intel would link or post your videos on our employee website. It's very uplifting information during a tough time.
Intel management really chapped my ass over the Quad-core-for-a-decade thing enough to force me out of the PC hobby. Knowing that Mark and people like him are there, seeing his passion and deft expertise has me strongly reconsidering my Intel-gets-as-little-of-my-money-as-possible policy. It's great stuff for both groups and I hope to see more like it.
@@bringbackdislikebutton6452 I'm in the same situation but for me it's AMD-will-never-get-any-of-my-money-again policy. I even had to ditch my Nvidia-don't-screw-me-with-your-prices-bias somewhat, only a little mind.
This was an awesome interview. As an engineer that used to do life testing on LCC parts/mounting/systems (back in the day) I can relate to a lot of what was talked about. Thanks!
This is the type of marketing these companies should fully focus on. It's amazing hearing from a guy who actually knows what he's talking about! Really valuable content, thanks Roman (and Intel).
Really enjoy stuff like this. Seeing how much thought goes into the design really humanizes the various tech companies' products. It's easy to get carried away with misconceptions about a final product without a detailed explanation like this. Pretty wild how much they've been able to push the silicon. Looking forward to Intel's future products. I wonder if any future advancements will allow a return to lowering voltages while keeping the same performance. I've always been big on perf/watt and loved when we used to be able to socket mobile CPUs into systems.
I was gifted a 12900K by my buddy at Intel. People thought I was insane when I shoved it in a 3L ITX case with a 40mm cooler. "You'll destroy the chip, you're nuking its performance," bla bla bla. Intel laptops have been running at 90+ °C for nearly a decade now; temperature is simply hardly correlated with chip failure, which is already astonishingly rare outside of user error (people used to think the CPU was defective when in reality it was the motherboard nearly every time).

Anyway, my 12900K has been running at 100 °C nearly 24/7 on at least one core. Single-thread I lose about ~10%; fully threaded I score around 80% of an identical chip that can maintain PL2. No big deal. It wouldn't make sense for everyone to buy this way, but in a lot of cases you're better off getting a more expensive chip with a cheap cooler and letting it rip than overspending to cool a weaker chip with lower average clocks and max turbo, just because you psychologically feel like you've unlocked all of the weaker chip's performance.
One of the best videos I have seen on a tech topic. Very refreshing not seeing any graphics, slides, or animations, just a whiteboard and very passionate people.
My main concern with running "at the limit" is twofold. 1 - I like to have long gaming sessions, and I can't imagine being at the thermal limit for 3-4 hours at a time is good. 2 - The CPU is packaged very close to other electronics in the system that may not be able to handle the same temps as the CPU.

I have a 13980HX in my Strix G18 and recently had to have the motherboard replaced. Thank goodness it was under warranty, because I hate to think what it would cost for a board with an i9 and a 4080 soldered onto it. I am almost certain that either the CPU crapped out, or some other chip on the board near it did, because I did what he does and basically didn't pay attention to temps. I now have my multipliers set much lower (x42 as opposed to x52), because ~4 GHz is plenty for gaming. Temps rarely get into the 80s now, and in general the system is much better to use since it's not as hot. Anyway, thanks for the interview; it was really cool to hear from an Intel engineer on what they think about CPU temps.
They keep raising the power limits and making the turbo more aggressive just to be able to claim they've got better performance compared to Ryzen. I am 100% convinced that's all they really care about. I'm not buying all of this "Oh, well, we're just trying to extract every bit of performance for our users..." talk. It's all just nonsense to me. It's still all marketing BS at the end of the day as far as I'm concerned.

Now let's talk actual hardware, shall we? Intel's stock CPU coolers are still pretty shit, just like they've been for the past, what, 15 years now? Can we just agree on that? How long is the fan going to last with their new stupid turbo constantly revving it up and down really fast? Why do you think all those shitty little fans on video cards break so damn fast? Can you even put a small case fan on an Intel cooler radiator, and how? How about the CPU socket power delivery on the motherboard? Are those little beer-keg-looking things next to the socket sensitive to temperature? Yeah, I'm obviously talking about the capacitors. Very sensitive?!? OMG!!! And no, it doesn't matter if they're polymer (solid state), they still age rapidly at high temperature, just like your phone's battery. Like he doesn't know that. Like he doesn't understand that their new shitty CPUs are going to cut into the lifespan of your mobo. I don't even want to start talking about your PSU and the transient current problems caused by these new CPUs.

He knows, they all know, they just don't fucking care! They need to claim they're better than AMD, just like Nvidia has to claim they have the fastest video card on the market. Doesn't matter if it uses 450W, electricity's basically free, son. Bunch of bastards, all of them... And no, I'm not saying AMD's much better. They did the whole power thing too in the FX days.
They do almost all of the same crap Intel does too, but at least they're not charging you extra for SMT, for an unlocked multiplier, or for unlocking extra features and performance on your server CPU. Intel is literally trying to push microcode microtransaction upgrades for CPUs right now. I know it's beside the point, but still, just think about that...
There's a lot of marketing BS here. Of course you want your temps as low as possible and stable, because it affects your fan curves and the longevity of components. The fact that it's within spec, and that they designed it to survive the 2-3 years they care about, doesn't change that. It's great that it can squeeze everything out of the chip, and it's interesting from an engineering perspective, but stressing components this hard isn't good for consumers.
@@davidcobra1735 Single-core performance does not scale with higher power limits, so your comment makes no sense. The raised limits are there to increase multi-core performance: at the same power consumption a 7950X will always beat a 13900K in multi-core, but regardless of power, the 13900K will always beat the 7950X in single-core.
@@otozinclus3593 That's what I meant. Like you can't tell. And you know very well they mostly stress the fact that they have a single-core performance lead again in their marketing, and pretty much nothing else. Do you just want to be argumentative, or feel like a wise guy, or something? Just spare me. I edited my previous comment to better phrase things.
@@davidcobra1735 You said in your comment, "They increase power consumption to claim higher single core performance," and I corrected that by saying that the power consumption has nothing to do with single-core performance.
This is the side of the equation that we often miss and need in order to establish a rounded understanding and opinion of products. At least for people like me that don’t/can’t commit the time to dive into the technical material available online. While you are one of the most capable and honest reviewers out there, hearing from the creators’ perspective as well as yours as an enthusiast end user adds immense value. Great work to the both of you and thank you for your time and insight.
It's great as an engineer. Everyone has to deal with management wanting a number to be better. "Sure I can put the sensor somewhere else, but you really don't want to do that."
This is certainly an interesting insight. While I'm still not a fan of running cpus close to their temperature limits, I guess we can make peace with it so long as they're truly designed for it, and there's no negative impact on their lifespan.
I would love to see a follow-up interview with an Intel engineer explaining the current situation with their high-end CPUs degrading, and whether they still approve of CPUs running at Tj max for prolonged periods of time. People who buy, or intend to buy, their products deserve to know this.
First, I want to be clear that I love Intel. But this way of putting things is very optimistic. The problem isn't that the CPUs are running at 100 °C; the problem is that while in the past you could reach max turbo speeds with a $50 air cooler, now you have to spend $100-200 on an AIO and it still won't hold those speeds at all times.

I moved from a 9700K to a 13900KS. I cooled and even overclocked my 9700K with a be quiet! Dark Rock Pro 4. Now I can't reach the advertised speeds with the same cooler; even a 13600K will make any air cooler struggle. People aren't mad that their CPUs are running at 100 degrees, they are mad because they have to spend $200-300 extra on a good AIO, or over $1000 on a custom loop, just to get the advertised speeds. I paid for a 13900KS, but if I don't have a very good cooler, it will be slower than a 12900K.

For example, I have a friend who bought a 13600K. He insisted that we just get the best air cooler (yeah, a D15) and hope it would be enough; he is using TG Extreme as TIM. The CPU still throttles. In normal circumstances throttling should not happen at all, and if the CPU is rated to turbo to 5.2 or 5.5 GHz it should be able to reach that. Instead of getting 24,000 pts in Cinebench R23 like the people with custom loops and good AIOs, he is getting 21-22,000, which is around what an i5-13500 gets.

Running at 100 °C is fine... if the CPU runs at 200-250W, not at 320-400W. Saying "it is designed to run at 100 °C because you will get the best performance" isn't a reason not to focus on power efficiency.
There's also the fact that this gen of CPU's hasn't been out for long enough to test the company's claims. They SAY that these high temperatures are """"fine"""", but ARE they? We don't know. Taking the word of someone selling a product that their product is perfect is always a losing gamble.
I'm in a similar situation to your friend. I bought a 14700 and a CM Hyper 620S, then realised the CPU kept throttling at 100 °C. I just set PL1 lower, at 167W, and now the CPU package temperature is around 79 °C (ambient is usually 30 °C). I should have just saved my money and purchased a mid-range processor. Maybe I will purchase an AIO cooler 5 years down the road and consider that my "upgrade".
No. 95 °C should never be "crap, I'm wasting performance, I need to overclock until I hit 100 °C!" I don't want degraded CPUs and 50 dB jet-engine fans to keep things cool! I tuned mine down and it never hits above 70 °C, to keep it quiet.
Awesome video, you can tell there's a lot more behind what's being discussed here. My main concern with temperatures isn't how hot my CPUs get, it's how reliable they'll be at those temperatures. For example, if I want a laptop to survive 10 years, but it runs at 103 °C on a hot day, I want to know what's happening to the projected lifespan of my components.
Great interview. Decades ago I worked in the semiconductor side of the business; while he probably can’t get into too much detail, I’d love to know more about how they’ve addressed a lot of the material science issues we had trying to mitigate Black’s Equation, especially at these power densities. It absolutely blows my mind that any of this stuff works, when we were amazed (at the time) that we could hit 300Mhz on a 0.50 μm process.
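For anyone who hasn't come across it, the Black's Equation mentioned above models electromigration-limited median time to failure as MTTF = (A / J^n) · exp(Ea / (kT)). A minimal sketch of the relationship; the fitted constants here (n = 2, Ea = 0.9 eV) are textbook-style illustrative assumptions, not figures for any actual process:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def black_mttf(j, t_kelvin, a=1.0, n=2.0, ea=0.9):
    """Black's Equation: MTTF = (A / J^n) * exp(Ea / (k*T)).

    j        -- current density (relative units; A absorbs the scaling)
    t_kelvin -- absolute junction temperature in kelvin
    a, n, ea -- fitted constants (illustrative values, not real process data)
    """
    return (a / j ** n) * math.exp(ea / (K_BOLTZMANN_EV * t_kelvin))

# Relative lifetime at the same current density, 70 C vs 100 C junction temp:
ratio = black_mttf(1.0, 273.15 + 70) / black_mttf(1.0, 273.15 + 100)
print(f"electromigration lifetime is ~{ratio:.1f}x longer at 70 C than 100 C")
```

The point the commenter is making falls straight out of the exponent: both higher current density (power density) and higher temperature shrink MTTF multiplicatively, which is why power density at modern nodes is such a materials problem.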
This is fascinating. I got a new Intel processor and it was running at 95 under intense gaming and I was like OMG it’s going to blow up but actually this gives me confidence that it’s all ok.
Higher operating frequency means higher temperature, and it's the higher operating temperature which reduces the life of a semiconductor device. You can go look it up online.
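The temperature-lifetime relationship this comment points at is commonly modelled with an Arrhenius acceleration factor. A hedged sketch; the 0.7 eV activation energy is a generic textbook assumption, not a value for any particular CPU:

```python
import math

K_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(t_use_c, t_stress_c, ea_ev=0.7):
    """Arrhenius acceleration factor between two junction temperatures.

    AF = exp((Ea / k) * (1/T_use - 1/T_stress)), temperatures in kelvin.
    AF > 1 means the hotter condition ages the part AF times faster.
    """
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_EV) * (1.0 / t_use - 1.0 / t_stress))

# How much faster a part ages at 100 C than at 70 C, under these assumptions:
print(f"{arrhenius_af(70, 100):.1f}x")
```

This is the standard reliability-engineering lens on the debate in this thread: running hotter does accelerate wear-out mechanisms, but whether that matters depends on how much lifetime margin was designed in to begin with.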
@@Akkbar21 I correcting the error made by the original poster. I am not commenting on how long people keep their electronic devices. That's a separate matter. And I have personally seen equipment containing semiconductor devices last for 25 years and still be in use. So the idea the life time is not important is incorrect.
@@deang5622 I've got my CPU overvolted at 1.45V and it's been running now for 5 years. You snowflakes need to grow some balls. But I also delidded and applied liquid metal to the die itself, touching the IHS, and I'm running a Noctua NH-D15. As long as temperatures don't cause thermal throttling, you are going to be fine.
Thermal management engineering can be REALLY, REALLY hard. Measuring stuff where the MEASUREMENT ITSELF affects or alters your results: that's also really hard, but really fun as well. When I had to do that for my undergrad research (measuring the contact angle of a sessile droplet), there wasn't really a way of doing it that allowed making physical contact with the droplet. So, given the refractive index of the liquid in question, we were able to use lasers and a little bit of trig to measure it. I *wished* that the general population understood more about the über-technical details that go into engineering something.
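As a side note for the sessile-droplet example: I can't reconstruct the laser method described above, but a common non-contact shortcut uses the spherical-cap approximation, where measuring only the droplet's height h and base radius r from an image gives the contact angle as theta = 2·atan(h/r). A small sketch under that assumption:

```python
import math

def contact_angle_deg(height, base_radius):
    """Contact angle of a sessile droplet, assuming a spherical-cap shape.

    Valid for small droplets where gravity flattening is negligible;
    theta = 2 * atan(h / r), returned in degrees.
    """
    return math.degrees(2.0 * math.atan(height / base_radius))

# A droplet as tall as its base radius is a hemisphere: contact angle 90 deg.
print(contact_angle_deg(1.0, 1.0))  # 90.0
```

This illustrates the general point the commenter is making: with a little geometry, you can extract the quantity you want without touching, and thereby disturbing, the thing you're measuring.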
I don't see why the general population would need to know these details if they are not engineering these components themselves. That is why we have division of labour and expertise, so that people not directly involved in something don't need to waste time studying it. We also let doctors learn about medical issues instead of studying 10 years ourselves to understand the details of our anatomy and how some disease might develop.
@@cyberfunk3793 You'd be AMAZED at how many people pretend to know things when they've read very little about them and lack the fundamental, first-principles background knowledge behind said "thing". Three things:

1) There are principles of engineering that a LOT of people don't understand, due in no small part to the fact that scientific literacy in Canada and/or the US is generally very, very poor.

2) Add to that the fact that people like to *think* they know things while simultaneously failing to realise how LITTLE they know, given the little reading they've done and the homework they HAVEN'T done on these topics. (I used to rail on the reviewers at the former computer website PimpRig, which re-branded itself as PCApex, over their reviews of CPU heatsinks and fans, because they wouldn't state their ambient temperature and would ONLY report the temperature they saw in whatever software monitoring tool they were using at the time. I excoriated them for it so much that the owner, whose wife was one of the reviewers behind said poor CPU HSF review, would try to come to her defence (which was understandable), but both of THEM failed to grasp the TECHNICAL aspects of heat and mass transfer, which is one of the courses you need to take in order to graduate as a mechanical engineer.)

There are a LOT of people who like to THINK they know things, vs. people who ACTUALLY know things. It is only when they are challenged by the latter, sometimes with REALLY, REALLY basic questions about the very thing they're talking about, that you can ascertain which side of the Dunning-Kruger plot they're on: the right side (high degree of knowledge, high confidence) vs. the left side (low degree of knowledge, high confidence).
(I've challenged people who believe in chemtrails to provide the pressure and/or mass flow rate of the supposed "sprayer", along with the material properties of the spray itself, so that I could simulate the dispersion pattern using computational fluid dynamics (CFD), and those people are NEVER able to answer those very simple, basic questions. The same is true of 9/11 truthers: I've literally interacted with "Architects and Engineers for 9/11 Truth" and said, "you know you can simulate that using an explicit dynamics program like LS-DYNA?" I've even asked them for their LS-DYNA models so that I could review them, and of course they come back with some bullshit answer/excuse like it takes too much time, etc. Meanwhile, HOW much time are they spending trying to find said "9/11 truth"???)

So this matters because there are people who THINK they know things, and, people being people, they will STILL try to argue with one of the guys who was actually responsible, in part, for the development and engineering of the very thing being discussed. The ones that learn from this are the ones who won't bother trying to challenge an engineer on the product development team. Those that do fail to realise they're fighting a really steep uphill battle, because there are simple questions he can ask that they simply won't be able to answer, having never realised they had a low degree of knowledge to begin with.

3) I work in engineering for battery electric vehicles. You'd be AMAZED at the gap between what people pretend to know and what they actually know. Yes, ideally we should be able to trust the division of labour and expertise. But then you get people like Wendy Wright (see her "debate" against Richard Dawkins, who has a Ph.D. in zoology), who at one point basically says, "why should I trust you?" My response to people who DON'T trust the experts is for them to run the labs themselves, just like in the high school chem and bio classes that, in all probability, they never took because they were "too cool for school".

This is why videos like these are important. They won't stop the low-knowledge, high-confidence people from being who they are, but they DO make the whole discussion go a lot faster if you can just send someone the link, and then it's up to them whether they watch it or not. My favourite quote, which I made up, is: "if I don't say/write it, it's my fault. If I write/say it and they don't read it, then it's theirs." The same applies to this video. You can't make people watch it, but you can sure as heck point out and highlight the critical and relevant parts of it in the discussion you're having with that person.
@@ewenchan1239 I agree that basic knowledge, like good fundamentals in reading, writing, math, and physics, is something the education system should give to as many people as possible. I also think, though, that it's common human nature to pretend to know things, and with specialized knowledge like what's in this video, it's not even desirable that 99.9% of the population educate themselves on the topic; they should rather spend their time on something more directly applicable to their own lives, unless they actually enjoy the topic and reading about it interests them. Another thing is that in pretty much any field other than mathematics and some really settled issues in physics, there are contrary opinions on many questions even among experts. Take medicine and/or nutrition, for example: you can get polar-opposite opinions from experts who have studied the topic for years, so it's easy to see why an amateur might think they are qualified to challenge other, more educated people on it. If the field internally doesn't have a solid evidence-based consensus, it's unreasonable to expect it from the layman confused by the differences in expert opinion.
@@cyberfunk3793 Three things:

1) Engineering (in this case) is not the same as medicine and/or nutrition. Unless you're debating the axioms of mathematics and/or quantum physics, there's relatively little debate about the math used in engineering, or about classical mechanics, which dominates here (with some exceptions like turbulence, but that isn't what's being discussed).

2) Re: human interest. I would guess one of two things: either a) people like to argue because they like to be right (regardless of their level of competency in the field), and/or b) people like to think and/or pretend that their competency in a given field is higher than it actually is, because that provides a sense of confidence while requiring as little work as possible. And as long as they don't come across people whose actual competency exceeds theirs, their lower competency is likely to go unchallenged, if not indefinitely, then for a VERY, VERY long time. That is, until they come across somebody who IS willing to call them out on it, and who also doesn't abide by the common social etiquette of NOT calling people out. (It has been my experience that more people focus on the social-etiquette element than on why the person didn't put more work into increasing their competency in the first place.)

3) Re: medicine/nutrition. a) People's most common exposure to medicine and/or nutrition is when they have the common cold and/or the "flu" (which may or may not ACTUALLY be the influenza virus).
People also like to take complicated, multivariate subjects and distill them into something "simple" they can understand, which CAN work sometimes, but the simplification comes with all of the gross assumptions that are an integral part of it, which people conveniently, and often, forget.

I remember the master's-level international business course I took: out of a class of 26, two of us were mechanical engineers, two were industrial engineers, and the rest were business majors. One of the girls, during our final presentation for an international business game we were playing for the class, lamented that she hadn't been trained to think like an engineer as part of her business degree. And I remember telling the class, during our presentation, "forget the math. The point of the game was that everything affects everything else." That's when the professor stopped me mid-presentation and said, "EXACTLY! That was the whole point of the game/exercise." (Multivariate optimisation is something I'm used to, due in part to my training.)

Thus, to the point about medicine (e.g. the common cold and/or "the flu"): the real answer is that it is multivariate (hence the DDx). For a common cold/flu we COULD run the labs, but it is generally NOT cost-effective to do so, as the labs may only confirm what the doctor was going to tell you to do anyway, yielding little additional insight. And just as people generally suck at systems analysis, medicine is no different. For any condition more complicated than that, unless it has been studied, we might never know what the real answer is, so the best we can do is try to get close (enough) to it.
Again, people like to be able to point to ONE thing as the cause, rather than a whole SLEW of things that all contribute. And as SARS-CoV-2 has shown, even when experts DO study something, that doesn't necessarily mean the average layperson will abide by the results of those studies either.

Re: nutrition. The answer hasn't changed in hundreds (since modern medicine) or even thousands of years: everything in balance/moderation. It's not rocket science. Of course, that doesn't excuse the deliberate and corrupt lies promulgated at the hands of capitalism (e.g. the sugar industry blaming obesity on the consumption of fats rather than on the consumption of sugar). (But capitalism lying to us, repeatedly, is a subject for another discussion, another time.)

If you read the paper on why salt is bad for you, what they actually did in that experiment was force-feed mice/rats about 15,000 TIMES the recommended dosage (in mg/kg) and then found that, surprise surprise, the animals started having health issues. Well... yeah. No shit. If we consumed 15,000 TIMES more of something than we were supposed to and suffered adverse health effects, it would be no wonder. Similarly, there is such a thing as water toxicity (drinking too much water can lead to death), just as there is oxygen toxicity (part of the reason Earth's atmosphere is only 21% oxygen by volume). Too much oxygen, and you can die from that as well.

Whenever there is some "fad" diet, I always tell my wife to ask the question, "what are they trying to sell (to you)?" (Because if the person doing the selling REALLY cared about you and your well-being, they'd offer their stuff up for free.
But the truth of the matter is that they couldn't give a shit about you; they're selling it to you because their motivation is to make money, and whether you actually get better is COMPLETELY irrelevant and immaterial to their enterprise.)

Getting back on topic, though: presumably the people watching this video are doing so because they ARE interested in it. Similarly, the way Roman asked his questions was SPECIFICALLY designed to dispel and debunk some of the myths that can be relatively easily debunked with a little basic engineering and heat-and-mass-transfer knowledge, which, again, most people are too lazy to go get. (They can watch the lectures on MIT OpenCourseWare, as an example. But most won't, cuz people are generally lazy. High confidence, low competency, corresponding to low effort.)
@@ewenchan1239 "(i.e. unless you're debating the axioms of mathematics and/or quantum physics, the math that is used in engineering, as well as, predominantly, classical mechanics - there's relatively little debate about that"

A question like "is it a good idea to run a CPU as hot as possible to gain maximum performance" isn't that close to those physics fundamentals. How the chip might degrade over time due to heat can probably be modelled with math, but to confirm it, real-world samples of that chip probably have to live through that timespan. Also, a hot CPU might affect other components inside the case if cooling isn't efficient enough, and a hot-running CPU most definitely affects fan noise. If they talked about the effects on noise, I must have skipped that part, because I don't remember seeing it. So an engineer might be correct about what heat does to the chip itself over a short timespan, yet not know for certain the long-term effects before having years of experience with said chip, nor the short-term effects on things like noise that people typically want to avoid. So there is still a lot of room for debate even on this topic, and it's not just easily settled with math.

"People's most common exposure to medicine and/or nutrition"

Medicine was used just as an example of a field that requires a lot of expertise yet where many issues are debated even among experts, so a layman flipping a coin can have almost as good a chance of being correct on some specific questions as someone with a degree in the field. I don't think the flu is how regular people are most exposed to these unsettled issues; it's things like what they themselves eat and whether it's healthy. One year, for example, research finds eggs are unhealthy; then another finds the opposite, until it reverses again at some point. 'And I remember telling the class, during our presentation "forget the math.
The point of the game was that everything affects everything else."'

I remember that in my first introductory logistics course in business school (it might have been something else; I can't remember, because it's more than 20 years ago now), we were already doing optimisation and running solvers in Excel and some other software suite whose name I also can't remember, while in my electrical engineering studies the topic wasn't even touched in the first years. I'm guessing what people study in business school depends a lot on the major/minor and the optional courses a person takes. There are certainly plenty of math courses for those interested in them, e.g. for people majoring in finance, while others studying, say, communication or marketing might hardly be exposed to math at all. I also have experience in multivariate optimisation, because it has been my income source since about 2010: running simulation/optimisation software that I wrote for the financial markets.

"And just like how people generally suck at systems analyses, medicine is no different."

I think the biggest reason there is confusion in medicine/nutrition is that the test subject is us. It's neither easy nor ethical to run controlled trials lasting 20+ years in which people are fed only one thing. Diseases take a long time to develop, and experimenting on humans is difficult for many reasons, so epidemiology is typically used instead of randomised clinical trials to figure out the health effects of a certain diet or food item.

"Of course, that doesn't include the deliberate and corrupt lies that were promulgated at the hands of capitalism (e.g.
the sugar industry blamed obesity on the consumption of fats rather than on the consumption of sugar).” Typically it has been under socialism that the largest lies have been spread by government institutions, but sure, a for-profit company might have an incentive to fund and publish results that are more favourable to its business. It seems to me it doesn't really matter whether it's fats or sugar; it's the total calories, which have grown since the '70s, that are the cause of the obesity. And in fact sugar doesn't appear to be an independent risk factor even for things like diabetes, while saturated fats actually are, so even this issue isn't as clear as many people think. "(But capitalism lying to us, repeatedly, is a subject for another discussion, another time.)" It's not the economic system that is lying to us, but for sure for-profit companies might think they have an incentive to cheat. Typically the lies come out in the end and hurt the dishonest business, when people are able to vote with their money, unlike in socialism where there is no such market mechanism by which businesses get punished. “If you read the paper on why salt is bad for you” Well, this isn't based on a single study but on a lot of epidemiology where the effects of sodium on blood pressure have been observed. For many people with hypertension, lowering salt intake seems to help, but it also seems too low a salt intake can be an issue for other reasons. Again, a hard thing to determine perfectly, as long-enough controlled randomized experiments on humans are hard to conduct. 'I always tell my wife that whenever there is some "fad" diet, I always tell her to ask the question "what are they trying to sell (to you)?"' Obviously companies selling something have an incentive to fund research that is favourable to them; that is why, when reading studies in, for example, nutrition, it's important to understand the full conflicts of interest the authors might have.
It doesn't mean that everyone doing research in the field is corrupt, though, and a lot of research has been done in universities, for example, that isn't so affected by the private sector.
This is a fantastic interview, with great depth to his points. Thank you for making this content. I know this guy is telling the truth: I've spent years doing distributed computing projects on BOINC and have had various types of processors, from P3 to Alder Lake, running at load 24/7. All the processors still work, no problems.
As a Haswell laptop owner, I had plenty of times where the CPU+GPU were saturated and running at 100C, to the point that it was burning my hands and the exhaust felt like a hair dryer 😁😁 It was quite funny to see the fuss about AMD's 95C limit, which they hit aggressively to push frequency under load.
Having chips running this high isn't a bad thing… it's amazing these tiny machines work under such incredible heat cycles!! Amazing work, that's for sure!!
If it's truly not a bad thing, why did they thermal throttle at 80C for so many years?? Doesn't make much sense to leave all that performance on the table when they would have been just fine running above 80C...
@@bobbygetsbanned6049 Because they didn't have the systems in place to very accurately 'guess' the actual hotspot temperature (which could be a lot higher and actually dangerous).
These interviews with engineers are a great thing for us enthusiasts: a lot of clarification is given, uninformed commentary gets dispelled, and the information helps consumers understand why architectures run the temps they do, which can be peace of mind for some. Great series Der8auer, keep 'em coming 💪🤯🥰🤩👍
The power density is something interesting to me. That is one of the biggest things I've noticed with the last few generations of CPUs: just how fast they vary in temp. I remember back in the day with a good custom loop you would start a benchmark and temps would slowly start to rise until they leveled off. Then after the load ceased, temps would drop back down at a good pace, but not instantly. With the recent generations you can be at a cool idle temp, start a benchmark, and almost instantly the chip is over double its idle temp and pushing up on its thermal limit. Then as soon as the load ends it almost instantly drops double digits. My observations are with well-designed EK loops. Perhaps it's more accurate and faster sensors in combination with the increased power and power density. This behavior is what makes me think future improvements will literally have to come from a performance-per-watt perspective, as it seems the power density is already flirting with the limits of what our TIM and other thermal materials are capable of. Also, I trust that they believe the chips can be run up to their limit with no issues, but I still personally would love to be able to tweak for efficiency and build a proper loop to get at least quoted performance or slightly better while also cooling well enough to stay a decent bit below that thermal limit. Maybe that's wanting to have my cake and eat it too in today's world, but I'll keep trying to make everything I own better than it was when I bought it, to the limits of my abilities. It's just the enthusiast nature.
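The near-instant swings described above fall out of a first-order lumped thermal model: steady-state temperature is ambient plus power times thermal resistance, and the approach to it has time constant tau = R_th × C_th, so a tiny hotspot thermal mass means a tiny tau. A rough sketch (all numbers here are invented for illustration, not measured from any real chip):

```python
# First-order lumped thermal model: dT/dt = (P*R_th - (T - T_amb)) / (R_th * C_th)
# Higher power density concentrated in a small hotspot means a small effective
# thermal mass C_th, so the time constant tau = R_th * C_th shrinks and the
# die temperature jumps almost instantly when load starts or stops.

def simulate(power_w, r_th, c_th, t_amb=25.0, seconds=60, dt=0.01):
    """Euler-integrate the die temperature under constant power for `seconds`."""
    t = t_amb
    for _ in range(int(seconds / dt)):
        t += dt * (power_w * r_th - (t - t_amb)) / (r_th * c_th)
    return t

# "Old" chip: big thermal mass under the sensor -> slow, gradual rise.
old = simulate(power_w=100, r_th=0.5, c_th=60.0, seconds=5)   # tau = 30 s
# "New" chip: tiny hotspot thermal mass -> near-instant rise.
new = simulate(power_w=200, r_th=0.4, c_th=2.0, seconds=5)    # tau = 0.8 s
print(f"old chip after 5 s: {old:.1f} C")  # still far below its 75 C steady state
print(f"new chip after 5 s: {new:.1f} C")  # essentially at T_amb + P*R_th = 105 C
```

With these made-up values the "old" chip is still tens of degrees from steady state after 5 seconds, while the "new" one has effectively already arrived, which matches the benchmark behavior the comment describes.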
Yeah, nowadays CPUs are just designed to hit thermal limits before power limits. This is basically what OCing was about. There is very little headroom for OC in modern CPUs with common thermal solutions. Going to need to buy that compressor cooler :D
I can't really read the comments much. I have been a machinist for a good amount of time and do very high-tolerance work, and most commenters just remind me of customers telling me what they need and then telling me how to do the work when they have no idea what they are talking about. (hurts my head) I very much appreciate this video. This man has spent much of his professional career doing this, and it shows within the first 2 minutes that he knows exactly what he is talking about. Thank you
The thing is, as a home user, the thing I care more about is perf per watt, not absolute max peak temp. This is almost entirely about my _home_ cooling, not my PC's safe temps. If I can get 95% of the performance (and currently, in some loads, over 100%) by choosing a specific platform... I'm picking the one that doesn't hotbox my room. Same with GPUs: sacrificing a couple % max fps lowers temps (and noise) dramatically. Tl;dr: I haven't moved away from Intel out of feeling it runs chips too hot for safety. I've done it for personal comfort.
Undervolting is the new overclocking, but yeah I have to agree. It's not pleasant if ambient temps are already at 21C or higher and your case is dumping heat into the room like a radiator 🥵 Noise is also way way higher in those situations. At 19-20C ambient my undervolted rig has 98-99% the perf of stock (with boost) while CPU & GPU don't push past 45C. That's at 50-60% fan speed tops. Cool, quiet, basically the same performance. It's a win/win for the end user. I understand they wouldn't want to artificially limit their components and for the sake of extreme overclocking and enthusiast space they should push for the most they can on any given SKU so I guess it's a balancing act. Thankfully we have undervolting, custom fan curves, and a boatload of thermal solutions to mitigate these rising temps.
This is a great video, information like this is invaluable to us power users. Definitely would welcome more videos with engineers, especially this dude!
It's not the fault of the CPU or GPU makers, it's the fault of the case makers. No case manufacturer has created the Window Air Conditioner / Computer Case combo... or the Mini Fridge / Case combo
As a diehard AMD user I've got to say that interview was fantastic, very interesting, a lot of passion for his work and clear no BS answers. Large respect.
@@Kholaslittlespot1 Not always. My 13600KF died; the mobo was giving it 1.5V on default settings. I had it undervolted, and after 3-4 months some cores died
Fast forward a year and boom, Intel released a statement in April 2024 having manufacturers throttle things back. The recommended max is now 400 watts vs the Asus setting of 511. And I am seeing crashes when hitting 100C. Mind you, this is watercooled and has been from day 1
I KNOW the CPU "works" at those temps. I just don't think any of us have good reason YET to trust a statement from Intel or AMD that the CPU is going to last until it is fully obsolete. Especially if you actually use that CPU heavily loaded every day at those temps. If the CPU is pegged at its loaded throttle point (95C, 100C, whatever it is) and you just leave it there all day every day... what's the MTBF? Frankly, that needs to be fully calculated and disclosed at these insane temperatures. It's impossible to believe that degradation isn't substantial at 100C.
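For context on why that disclosure is tricky: reliability engineering usually expresses temperature-driven degradation as an Arrhenius acceleration factor, and the answer hinges on per-mechanism activation energies that vendors don't publish. A hedged sketch (the 0.7 eV activation energy below is a commonly quoted generic placeholder, not an Intel or AMD figure):

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(t_use_c, t_stress_c, ea_ev=0.7):
    """Arrhenius acceleration factor between a 'use' and a 'stress' temperature.
    ea_ev is an assumed activation energy; real values are mechanism-specific."""
    t_use = t_use_c + 273.15      # convert to Kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1 / t_use - 1 / t_stress))

# How much faster might temperature-driven wear-out proceed at 100 C vs 70 C?
# With these assumptions it's several-fold, which is why the question matters.
print(f"{arrhenius_af(70, 100):.1f}x faster")
```

The point is not the exact number (it swings wildly with the assumed activation energy) but that a 30-degree difference plausibly multiplies degradation rates, which is exactly why people want MTBF-at-temperature published rather than inferred.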
Good info. I understand his point about scavenging as much potential as possible. I still feel a bit apprehensive about running my CPU at rated temps. I could be wrong, yet I believe "high" temps (north of 80c) must adversely affect chip life.
That's understandable. But you may also want to take into account the history of how this belief appeared. In older times, sensors weren't as integrated into the die as they are now, so they reported lower temperatures, like VRM temp. You'd see one value, but the actual temp under the package could be 15-20 degrees higher. Also, in older times the number of temp sensors per die/core was lower, and hotspots weren't as openly readable. Modern reported temps are much closer to reality than what came before. But I can still see why temps above 90 degrees are unsettling to observe. Issues at those temps may not even relate to die longevity itself; they can be more in the area of interaction with thermal paste, like pump-out and component separation.
@@DimkaTsv Sensor count. Yes, that makes sense. My old CPU (Sandy Bridge 2600K) appears to run cooler, even with a mild overclock. It does have a much lower sensor count. I undervolted my new CPU (Ryzen 5800X) because of the higher apparent temps. I may have been a bit hasty. Cheers.
The problem is the heat being dumped into the room! That's another reason why you hear they're 'too hot'. Having said that I love the technical stuff being provided here
But isn't that about the actual power consumption? I.e., a CPU running at 95C but pulling 100W dumps less heat than a CPU running at 85C but pulling 150W.
Die temperature and thermal density have nothing to do with the amount of heat dumped. It's like comparing a bowl of soup to a heated floor, or to a room heater: the floor is colder, but it dumps MUCH more heat into the room than a bowl of soup at 90 degrees. A heater can be as small as that bowl of soup and still push out more heat. CPUs and GPUs are effectively 100%-efficient heaters, because all the power they draw turns into heat; performance is just the desired side effect. That means how much heat is dumped into the room depends exclusively on how much power the die consumes. A high core/die temperature is just a side effect of high thermal density or slow cooler heat dissipation. Even if you use an absolutely insane water cooler with coolant held to within 0.1 degrees at 5 degrees (like the chillers used for lasers), the CPU can still throttle. A 13900K hit 63 degrees in CB at stock under those conditions, and it still throttled with some OC despite such a cooling solution. At that point the issue lies in the physics of heat transfer.
@@DimkaTsv @2intheampm512 But if you have 2 systems with all variables the same (CPU watts, cooler, etc.), and the only difference is the CPU max temp limit...
@@fumo1000 Doesn't matter. If the CPU wattage is the same, the overall heat output will be the same. CPU temps, given the same power consumption, depend only on the efficiency of heat transfer and the cooler's heat dissipation. That doesn't change the amount of heat, just the speed of its removal from the CPU. It's like trying to chill a bowl of soup by placing it in cold water versus leaving it in the air: the heat in the soup is the same, but it will cool down faster, and to lower temps, in cold water.
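The physics being argued in this thread reduces to two independent relations: heat into the room equals electrical power drawn, and steady-state die temperature equals ambient plus power times the cooling path's thermal resistance. A minimal sketch with invented numbers:

```python
def die_temp(power_w, r_th_c_per_w, t_ambient=25.0):
    """Steady-state die temperature: T = T_amb + P * R_th.
    R_th is the total die-to-air thermal resistance in C/W (illustrative)."""
    return t_ambient + power_w * r_th_c_per_w

# Two hypothetical systems:
# A: 100 W CPU on a weak cooler (high R_th) -> hotter die, LESS room heat.
# B: 150 W CPU on a strong cooler (low R_th) -> cooler die, MORE room heat.
temp_a = die_temp(power_w=100, r_th_c_per_w=0.70)
temp_b = die_temp(power_b := 150, r_th_c_per_w=0.40) if False else die_temp(150, 0.40)
print(temp_a)  # 95.0 C, while dumping 100 W into the room
print(temp_b)  # 85.0 C, while dumping 150 W into the room
```

So the hotter-running system can heat the room less; the heat output tracks wattage alone, and cooler quality only moves the die temperature.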
Very well explained. I remember when Ryzen 3000 came out and people were going nuts over the voltages, because they spiked whenever they were observed, as the sensors were being probed. I also remember, back in my college days, dripping water onto a CPU without a cooler; they used to get so hot.
@@The_Man_In_Red Yeah, I still don't see the point of it for learning; if they wanted to prove how hot a CPU can get without a cooler, I feel just saying so would have been enough lol. But I guess it did something, because I remember the class very well. I doubt they'd be allowed to do it these days for health and safety reasons. That teacher was so cool. We even took the top off an HDD to watch it spin. For networking, the teacher had us bring our PCs into school and hook them all up for a LAN party! It worked very well as a teaching aid; everyone that day learnt to troubleshoot and network.
Clearly Intel is at the end of its rope and pushing its chips to the limit; it has to choose between its own survival and that of its CPUs. Then along comes the laptop manufacturer who wants to stay under 10 mm thick and save as much copper as possible... The system reaches 100C after two seconds of activity? It's okay, the CPU takes care of it with thermal throttling. You paid $1000 for a CPU but it works like a $200 one? IT'S ALL OK! Your two-year-old laptop suddenly dies? What's the matter, the new version is out! Don't worry, it's all normal, keep playing as the Titanic sinks
If you're referring to the recent Intel chip failures, those were caused by incorrect voltage limits set by buggy CPU microcode. They weren't temperature related, which is what this video is discussing. You can still run your chips up to the thermal limit allowed by the cutoff circuit, as explained in the video.
This is so incredible and informative. I totally understand the newfound anxiety over 100-115C temp limits, but with the new sensors and testing it just makes sense to gauge more effectively. Of course lower temps are generally going to be better for fan RPM / longevity, but it's a complex topic, and for high performance this totally makes sense. The only real downside I have noticed myself is that it makes controlling fan RPM more difficult, because in my experience I could lower my RPM greatly and still get similar temps on a lot of chips. On my old 3600 the thing was silent, with low temps. I guess this is just the price of having high-performance parts, which I am new to experiencing.
The problem is not the CPU temperature in itself but that it controls our fan speed dynamically meaning higher temperature = higher fan speed = more noise. Performance is desirable, noise isn't 🤷♂
I don't really agree with "if the CPU is not at its limit, you are missing out on performance". It all depends on how much performance it is and what the downsides are. If I have to double the power draw for just 10% more performance, then that is clearly not worth it for me. I prefer saving energy and 'losing' a little bit of performance. If it were ONLY temperature and not power draw (which of course is not how physics works), then for sure, I wouldn't care about temperature either
You have a contradictory argument, though. If your workload doesn't require the potential of whatever CPU you have, you could have saved money and bought a lower-tier CPU. This discussion relates more to people buying top-tier CPUs without even using 10/20/30/40/50% of their potential, whilst handicapping the CPU to get lower temps.
@@prisonmike9823 My example was halving the power draw for 90% of the performance (which is what Ryzen 7000 does). I very much doubt there is a lower tier that still has 90% of the performance of the higher tier. Also, as I said, I don't do it for the temperatures, but for sustainability and energy-usage reasons. If we use double the world's energy for 10% more performance in our tech, it would be a quick way to never solve the global warming problem
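Using the commenter's own numbers (90% of the performance at half the power; treat the scores and wattages below as illustrative, not benchmarks), the efficiency gain is easy to quantify as performance per watt:

```python
def perf_per_watt(score, watts):
    """Simple efficiency metric: benchmark points per watt consumed."""
    return score / watts

# Hypothetical stock vs power-limited ("eco") configurations:
stock = perf_per_watt(score=100, watts=230)  # full power draw
eco   = perf_per_watt(score=90,  watts=115)  # ~90% perf at half the power
print(f"stock: {stock:.3f} pts/W, eco: {eco:.3f} pts/W")
print(f"eco is {eco / stock:.1f}x more efficient")  # 1.8x
```

This is why "not at the limit = wasted potential" and "best perf/watt" are two different optimization targets: the last few percent of performance are disproportionately expensive in watts.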
It would have been nice to confirm whether the MTBF increases with lower temps... Silicon, like every product, has a lifespan. If you're changing your CPU every 4 years there is no problem running "hot", but if you want it to last more like 8 to 10 years, the lower the temp the better.
You have to take into account not only the CPU but the surrounding components on the motherboard as well. Output filtering capacitors sit very close to the CPU socket, especially on ITX boards. The general rule of thumb is that every 10 Celsius increase in a capacitor's operating temperature will halve its lifespan.
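That 10-degree rule of thumb (a simplified Arrhenius approximation commonly quoted for electrolytic capacitors) can be sketched as follows; the 5000 h @ 105 C rating is a generic example figure, not any specific part:

```python
def cap_lifespan_hours(rated_hours, rated_temp_c, actual_temp_c):
    """10-degree rule of thumb for electrolytic capacitors:
    lifespan halves for every 10 C above the rated temperature
    and doubles for every 10 C below it."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# A hypothetical 5000 h @ 105 C rated capacitor:
print(cap_lifespan_hours(5000, 105, 105))  # 5000.0 h at rated temp
print(cap_lifespan_hours(5000, 105, 85))   # 20000.0 h when run 20 C cooler
print(cap_lifespan_hours(5000, 105, 115))  # 2500.0 h when run 10 C hotter
```

It's only a rule of thumb, but it shows why hot air recirculating around the socket matters even if the CPU die itself tolerates the temperature.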
Easy solution: CPU manufacturers could provide long-term (e.g. 5-year) warranties when the CPU is run under the recommended temp - guidance plus a nice feeling of safety for the user.
You can't get that unless you give Intel/AMD access to your CPU's activity and behavior, letting them validate that you never went above that temperature. I don't think it's worth opening that can of worms. People just need to get educated on the fact that there's no difference in wear between a CPU running at 90 degrees and one at 50 degrees if their behaviors are the same.
@@joemarais7683 You can't definitively state that. Temperature is absolutely a factor in silicon degradation and we don't have any historical reference of how these temperatures are going to impact the longevity of these CPUs over time.
@@TheGameBench Has anyone actually had a CPU fail without doing something inadvisable to it? It's the most reliable part of a computer; that's why older motherboards hold their value: they die far before the CPU does.
What about longevity of the part? Will running more often at the limit cause it to fail earlier? I definitely don't upgrade often, and I'd like my cpu to last a decade at least. 15 years would be nice.
I was wondering the same thing. I'm no expert but if I had to guess more heat would degrade the cpu faster but probably not enough to make much of a difference long term. I guess we'll find out 10 years from now.
Exactly this. Pushing things to their limit is stupid beyond words. Everyone drooling over this interview is completely ignoring the long term, acting like a bunch of sheep who have never looked into things, parroting each other. We all knew everything in this video before the interview; there was nothing new presented here. Take good care of things, undervolt/underclock, and they'll last a lifetime. Do what Intel and AMD are doing and you'll be contributing your PC/laptop to e-waste after the warranty or "lifetime" ends. Use Asus to speedrun that process.
I can’t speak for the others but I’ve personally never had issues with the CPU running at 100 degrees. I’ve been a laptop guy since forever and the older laptops used to sit at 100 degrees all day everyday while gaming and not one died or got damaged or had issues. The silicon will outlast your device even at those temps. Sure, undervolting is good. But I started undervolting mostly to reduce fan noise than to cut temps. To this day, my Helios 300 runs at minimum fan speed while gaming and CPU stays at more than 95 degrees. I can increase fan speed to cut the temps but it doesn’t matter really. Hope my answer helps.
It's fine to just say all of this, but look at all the reports of these chips crashing in Unreal Engine games with stock settings... The fixes are: undervolt it, lower clock speeds, set power limits, etc... My fix is asking for a refund, tbh
I'm sorry, this is just not OK. NOTHING works well at 80 degrees! Why don't you just tell the truth? You haven't worked on your CPUs to fix the problem, and maybe you can't?! But it's more important to make more money, because $450 for an overheating CPU is just not enough... "Behind every great invention, there's an engineer who thought, 'I can make this better!'" (they never do)
Read the data sheet for the device! Every manufacturer of integrated circuits produces one. The data sheet contains an electrical specification section. I know, I used to write them.
Just a random August 12th, 2024 comment here. I would have loved to speak to Mark now, after this fiasco/scandal over breaking CPUs. I would be way more direct and simply state that, in my experience, riding temps even in the high 80s can and did degrade older Intel CPUs. I am on my 2nd 13900K, and even with the 1st, degraded chip I was undervolted from day 1. How many people are fixing cores like me, and undervolting like me? Probably about 3% of people. I have respect for Intel and AMD engineers, but this beating around the bush is not reflective of any chip, current or previous. We are balls to the wall with these "stock overclocks" without even touching the chip. We are now required to undervolt with both AMD and Intel. Both companies are destroying their own chips: the 95C idea on AMD, and the overheating 13900K and 14900K on Intel. Hopefully this Intel fiasco will make them think about power efficiency first and increase IPC without burning the silicon off.
My concern is with thermal expansion. What will reaching 95C do long term for wear and tear, and what about the rapid temp swings in chips? Rapid changes are also harder on materials. Will these two factors manifest in a shorter CPU lifespan if you use its full potential?
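For what it's worth, thermal-cycling fatigue (e.g. in solder joints and package interconnects) is often modelled with a Coffin-Manson relation, where cycles-to-failure scale with an inverse power of the temperature swing per cycle. A sketch with made-up constants (real coefficients and exponents are material- and package-specific):

```python
def cycles_to_failure(delta_t_c, coeff=4.0e5, exponent=2.0):
    """Coffin-Manson form: N_f = C / (delta_T)^n.
    coeff and exponent are illustrative placeholders, not real material data."""
    return coeff / (delta_t_c ** exponent)

# With an exponent of 2, doubling the temperature swing per cycle
# (e.g. 30 C swings vs 60 C swings) cuts the predicted cycle count 4x:
mild = cycles_to_failure(30)
harsh = cycles_to_failure(60)
print(mild / harsh)  # 4.0
```

So the concern in the comment is directionally right: it's the size and frequency of the swings, not just the peak temperature, that fatigue models penalize.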
Older-generation builders are maybe the cause of this "off-spec" temperature mania. We learned to keep CPUs under 80°C and lower, because we knew what it looked like when you turned an older system on without a cooler: the die just shrinks, discolors, and quits. So now they see these "unusual" temperatures and panic. We all have to get used to the higher temp specs. Great video. Thanks.
So are these temperatures safe? Let's say I run Handbrake all the time converting to HEVC and the CPU runs really hot: is it safe for the chip long-term, or does it lower its lifespan? For example, my 2500K has been running perfectly fine for 12 years, but it runs cool, far away from 70-100°C.
I would guess that the "untold truth" in this new way CPUs tend to work is that they will burn out way faster than older CPUs did... So, for example, if an Intel 1st or 2nd gen had been designed this way, it might be burned out instead of running right now, 12/13 years later. The fun thing is that an overall shorter CPU lifespan would raise demand for newer CPUs, since the used market wouldn't be as great as it is today, and this is exactly the type of thing that improves corporate profit at the expense of users' economy and natural resources
They're not "designing them to run at these temperatures"; they're doing damage control/mitigation/workarounds because their current process is horribly power-inefficient. Yes, AMD also runs high temps with current designs, but if you delid them both, the AMD is still in the 70s at max load while drawing nearly half the power. Also, "wasting potential if you're not running at the limit"? Maybe for the 0.1% elite overclocker, but for anyone running workloads, or an everyday user, 200MHz less will yield barely anything, with a massive power difference. To me this feels like guerrilla marketing for an inefficient node process. Putting the sensor further from the hotspot is great design /s... That just lends the perspective of "we don't actually know what the outcome will be long term, and if we report lower temperatures we'll sell more chips. We don't actually care what's going on in the core temperature-wise, and neither should you." As user monkeys... we don't care either. We just have expectations, because of all the previous lies over the last 3 decades that these companies have conditioned us to believe about operating temperatures and electromigration. First it was "DON'T DO THIS FOR TOO LONG", now it's "IT'S FINE, IT'S DESIGNED, IT'S ENGINEERED TO". No matter what these companies try to peddle, we still know the cooler a system runs, the longer it lasts. Temperature accelerates entropy.
My issue with "modern" CPUs is less that they run hot and more that crutches are built in. Ex: the new IHS on Ryzen acts as a blanket on the chip, preventing you from removing the heat and improving potential. Ex 2: from Sandy Bridge to Ivy Bridge, Intel changed from solder to a crap paste that acted as a blanket, resulting in the same situation as Ryzen. Both examples are bad engineering. The only way to fix these creates high risk on the consumer's side: you remove the IHS and direct-mount. But you risk breaking chips just to achieve what should have been correct from the factory. If they want it to run hot to utilize greater potential, cool, I'll buy a better cooling solution. But if they put a blanket on the chip, there is no cooling solution that will resolve the issue unless I risk damaging the part and being out money and time.
Would love to see an interview about overclocking and their opinions about what is the ideal max voltage for overclocking depending on which type of cooling used
Intel needs to just offer the CPUs pre-delidded… maybe the KS CPUs? Some of us just want to use liquid metal by default. The PS5 already has liquid metal, and Asus does it on their laptops. Liquid metal is the next thermal paste for these higher temperatures…
Why would you delid the KS series? If you think about it, they are binned CPUs, which means they require less voltage for most likely the same frequencies. I would rather delid a normal K series instead. None of my K series scored above 90 points in the Asus BIOS, while the KS scored above 110 points. Might not be appealing for everyone, but I'm running a 13900KS undervolted, with low load-line calibration, in a best-case scenario... Managed to pass the R23 multicore without crashing
@@Need4FPS Because binned CPUs like the KS can overclock a bit better and pull more power. Not every K owner is going to want to mess with liquid metal, but the overclocking community likes the KS series since the baseline is higher clocked. Also, since it's a limited production run, Intel would benefit from that. They can't sell all CPUs delidded; it would not make sense for them
@@Multimeter1 The overclocking community is so small that it would just be additional cost for Intel to set up a manufacturing and packaging line for CPUs without the heat spreader. Remember that they need to test the CPUs before packaging, and in this case they'd need to do it all without the heat spreader. So that is: alteration of the manufacturing line to accommodate the fork for CPUs without a heat spreader (i.e. covering SMDs etc.), separate QA procedures, separate packaging. A lot of resources for the few bucks more they would make from such CPUs. Completely not worth it.
This is nice and all, but in the real world, you don't notice the performance difference between 5.8 Ghz and 6 Ghz but that power and temperature difference is very significant. Running 5.8 at ~70 degrees is definitely better than 6 at 100.
We need more interviews with Mark. He's a little unsettled to start, but then his passion kicks in and it's all gravy. Nerdy, informative, and entertaining.
Could not agree more
100%. Passionate and informative. This guy is great
I didn’t observe him to be unsettled
Engineers should rule the world.
Idk about unsettled; more like he's very careful not to say too much, and is calculating where and how to steer the conversation, given the limits that he, or Intel, has imposed upon him.
The moment Mark talks about solving a problem as an "interesting game" it truly shows that he enjoys his work, respect.
Not only does it prove he loves his work, it's a statement only a person with super high intelligence would have uttered about solving a CPU-operations-level problem.
Forgot to ask you
Bravo to Intel for allowing this type of interview and, in fact, all of the behind-the-scenes material they've shared. It personally makes me appreciate them that much more... Thank you.
I guess this interview makes up for the years of anti-consumer practices? Or, you said "that much more"?? Well, glad they've been looking out for at least you personally all this time.
was an interesting interview tho
@@XX-121 I think you're looking through polarized glasses and not at reality. Every company, especially multi-billion-dollar companies, has a mandatory agenda of profitability if they're publicly traded. Listen, I'm the first to be thrilled that AMD has finally come booming out of the closet with a competitive CPU that has pushed the market to benefit consumers. So is it Intel's fault, or the lack of competition, that you're upset with?
_Just food for thought. A little off topic. This is what happens when you have mega mergers and or very little competition. Keep that in mind._
Its cool, but it wont make me buy their products
@@DJaquithFL Yeah, companies exist to make a profit. And AMD competition (which is on and off through the years) certainly helps us consumers.
As somewhat of a corporate gun-for-hire nerd myself: a multi-billion-dollar company does not let a geek out into the wild if they won't say the right things. Not that he isn't stating facts or sharing cool info, but he is being paid to be there to talk tech. He is a great presenter
This is so awesome, the community has been STARVING for media like this for so long, this is a great trend, but I can't believe it took this long. Thanks so much to Mark for doing this, we appreciate the time and your enthusiasm, Intel needs to share more of you with the world.
I think so many people want to have interviews like this.
But big companies won't allow it.
The reason this happened might be the CHIPS Act from the government
Thanks der8auer and Intel for enlightening us. We need more such interviews
@@smkslpsd Are you lost?
@@smkslpsd tf
Thank you Mark & der8auer! A *very* interesting and informative interview. I was completely engrossed by the discussion. I would love to see more of Mark with these types of behind the scenes technical discussions.
Do you know Marks full name by any chance?
I would honestly love to listen to him about voltages and longevity expectations. Sounds like a cool topic.
if you listen to your professors, you will be him.
Yeah, I want him to answer why my 13600K takes 1.43V at default!
@@Golden2Talon because the E cores are really inefficient in terms of power usage while the P cores need as much power as the 12400.
@@YourFriends223 Riiiiight 😂 you think everyone can become an engineer? Think again
@@Golden2Talon You shouldn't be running at default anyway. The chips come overvolted from the factory. Just undervolt and you'll be fine
This Intel guy does great interviews. Hope to see more from him!
This is the kind of transparency that consumers need and some have been asking for, for a long time! Please keep up the amazing and informative videos!
I work at Intel and have been following this channel since the beginning. I wish Intel would link or post your videos on our employees website. It's very uplifting information during a tough time.
Intel management really chapped my ass over the Quad-core-for-a-decade thing enough to force me out of the PC hobby. Knowing that Mark and people like him are there, seeing his passion and deft expertise has me strongly reconsidering my Intel-gets-as-little-of-my-money-as-possible policy. It's great stuff for both groups and I hope to see more like it.
@jessie james, cool! Intel Inside! :)
I really hope you do work at intel. Are you in an engineering role?
Great idea, let me make it happen!
@@bringbackdislikebutton6452 I'm in the same situation but for me it's AMD-will-never-get-any-of-my-money-again policy. I even had to ditch my Nvidia-don't-screw-me-with-your-prices-bias somewhat, only a little mind.
This was an awesome interview. As an engineer that used to do life testing on LCC parts/mounting/systems (back in the day) I can relate to a lot of what was talked about. Thanks!
This is the type of marketing these companies should fully focus on. It's amazing hearing from a guy who actually knows what he's talking about! Really valuable content, thanks Roman (and Intel).
Really enjoy stuff like this. Seeing how much thought goes into the design really humanizes these tech companies' products. It's easy to get carried away with misconceptions about a final product without a detailed explanation like this. Pretty wild how much they've been able to push the silicon. Looking forward to Intel's future products.
Wonder if any future advancements will allow any return to lowering voltages while keeping the same performance. I've always been big on perf/watt and loved when we used to be able to socket mobile cpus into systems.
Having actual chats with engineers is just gold. Hope to see more of this in the future!
I was gifted a 12900K by my buddy at Intel. People thought I was insane when I shoved it in a 3L ITX case with a 40mm cooler. "You'll destroy the chip, you're nuking its performance," bla bla bla. Intel laptops have been running at 90+°C for nearly a decade now; temperature is simply hardly correlated with chip failure, which is already astonishingly rare outside of user error (people used to think the CPU was defective when in reality it was the motherboard nearly every time).
Anyway, my 12900K has been running at 100 °C nearly 24/7 on at least one core. Single-threaded I lose about 10%; fully threaded I score around 80% of an identical chip that can maintain PL2. No big deal. Wouldn't make sense for everyone, but in a lot of cases you're better off getting a more expensive chip and a cheap cooler and letting it rip than overspending to cool a weaker chip with lower average clocks and max turbo, just because you psychologically feel like you've unlocked all of the weaker chip's performance.
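The "around 80% of multicore performance while power-limited" observation above lines up with a common back-of-the-envelope model: dynamic power scales roughly with f·V², and voltage roughly tracks frequency, so multicore performance scales with about the cube root of sustained power. A hedged sketch of that rule of thumb (the 241 W figure is the 12900K's rated maximum turbo power; the cube-root model is a community heuristic, not an Intel formula):

```python
def relative_multicore_perf(sustained_watts: float, rated_watts: float) -> float:
    """Rough estimate of the fraction of multicore performance retained
    when a CPU is power-limited below its rated turbo power.

    Assumes the classic rule of thumb: dynamic power ~ f * V^2 with V
    roughly tracking f, so power ~ f^3 and perf ~ power^(1/3).
    A back-of-the-envelope model only, not a vendor specification."""
    return (sustained_watts / rated_watts) ** (1 / 3)

# e.g. a 241 W-rated part held to ~125 W sustained by a small cooler:
retained = relative_multicore_perf(125, 241)  # roughly 0.80
```

Under this model, halving sustained power costs only about 20% of multicore throughput, which is why the comment's numbers are plausible.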
One of the best videos I have seen on a tech topic. Very refreshing not seeing any graphic slides or animations, just a whiteboard and very passionate people.
2:06 intel probably wishes they hadn't said that
My main concern with running "at the limit" is twofold. 1 - I like to have long gaming sessions, I can't imagine being at the thermal limit for 3-4 hours at a time is good and 2 - the CPU is packaged very close to other electronics in the system that may not be able to handle the same temps as the CPU.
I have a 13980HX in my Strix G18 and recently had to have the motherboard replaced. Thank goodness it was under warranty, because I hate to think what it would cost for a board with an i9 and a 4080 soldered onto it.
But I am almost certain that either the CPU crapped out or some other chip on the board near the CPU did, and it was because I did what he does and basically didn't pay attention to temps.
I now have my multipliers set much lower (x42 as opposed to x52) because ~4 GHz is plenty for gaming. Temps rarely get into the 80s now, and in general the system is much nicer to use as it's not as hot.
Anyway, thanks for the interview; it was really cool to hear from an Intel engineer what they think about CPU temps.
This is great technical information without the marketing bs. Please give us more of this type of content, thank you.
They keep raising the power limits and making the turbo more aggressive just to be able to claim they've got better performance compared to Ryzen. I am 100% convinced that's all they really care about. I'm not buying all of this "Oh, well, we're just trying to extract every bit of performance for our users..." talk. It's all just nonsense to me. It's still all marketing bs at the end of the day as far as I'm concerned.
Now let's talk actual hardware shall we?
Intel's stock CPU coolers are still pretty shit, just like they've been for the past, what, 15 years now? Can we just agree on that? How long is the fan going to last with their new aggressive turbo constantly revving it up and down? Why do you think all those shitty little fans on video cards break so damn fast? And can you even put a small case fan on an Intel cooler's heatsink, and how?
How about the CPU socket power delivery on the motherboard? Are those little beer-keg-looking things next to the socket sensitive to temperature? Yeah, I'm obviously talking about the capacitors. Very sensitive?!? OMG!!! And no, it doesn't matter if they're polymer (solid state), they still age rapidly at high temperature, just like your phone's battery. Like he doesn't know that. Like he doesn't understand that their new shitty CPUs are going to cut into the lifespan of your mobo.
I don't even want to start talking about your PSU and the transient current problems caused by these new CPUs.
He knows, they all know, they just don't fucking care! They need to claim they're better than AMD. Just like Nvidia has to claim they have the fastest video card on the market. Doesn't matter if it uses 450 W, electricity's basically free, son. Bunch of bastards, all of them...
And no, I'm not saying AMD's much better. They did the whole power thing too in the FX days. They do almost all of the same crap Intel does, but at least they're not charging you extra for SMT, or for an unlocked multiplier, or for unlocking more features and performance on your server CPU. Intel is literally trying to push microcode microtransaction upgrades for CPUs right now. I know that's beside the point, but still, just think about that...
There's a lot of marketing BS here. Of course you want your temps as low and as stable as possible, because it affects your fan curves and the longevity of components. The fact that it's within spec and they designed it like that to survive 2-3 years doesn't change the facts. It's great that it can squeeze everything from the chip, and it's interesting from an engineering perspective, but it's not good for consumers to stress components this much.
@@davidcobra1735 Singlecore performance does not scale with higher Powerlimits. Your comment makes no sense. It is there to increase multicore performance, as on same Power consumption a 7950x will always beat a 13900K in multicore, but regardless of the power, the 13900K will always beat the 7950x in singlecore
@@otozinclus3593 That's what I meant. Like you can't tell that. And you know very well they mostly stress the fact they have a single core performance lead again in their marketing and pretty much nothing else. Do you just want to be argumentative or feel like a wise guy or something? Just spare me.
Edited my previous comment to better phrase things.
@@davidcobra1735 You said in your comment "They increase power consumption to claim higher single core performance "
And I corrected that saying, that the power consumption has nothing to do with single core performance
This is the side of the equation that we often miss and need in order to establish a rounded understanding and opinion of products. At least for people like me that don’t/can’t commit the time to dive into the technical material available online. While you are one of the most capable and honest reviewers out there, hearing from the creators’ perspective as well as yours as an enthusiast end user adds immense value. Great work to the both of you and thank you for your time and insight.
It's great as an engineer. Everyone has to deal with management wanting a number to be better. "Sure I can put the sensor somewhere else, but you really don't want to do that."
This is certainly an interesting insight. While I'm still not a fan of running cpus close to their temperature limits, I guess we can make peace with it so long as they're truly designed for it, and there's no negative impact on their lifespan.
I would love to see a followup interview with an Intel engineer explaining the current situation with their high-end CPUs degrading and whether they still approve of CPUs running at Tj max for prolonged period of time. People that buy or intend to buy their products in the future deserve to know this.
:D :D :D :D
This interview aged like milk.
Cool bit. Would be interesting to also know about the process of thermal throttling & points of physical failure on killed CPUs
If he could do some live explanations and demos. With some sacrificial CPUs. That would be cool :)
"This is how they're designed," etc.
First I want to make clear that I love Intel. But this way of putting things is very optimistic. The problem isn't that the CPUs are running at 100 °C; the problem is that while in the past you could reach max turbo speeds with a $50 air cooler, now you have to spend $100-200 on an AIO and it still won't run at those speeds all the time. I moved from a 9700K to a 13900KS; I cooled and even overclocked my 9700K with a be quiet! Dark Rock Pro 4. Now I can't reach the advertised speeds with the same cooler, and even a 13600K will make any air cooler struggle. People aren't mad that their CPUs are running at 100 degrees, they are mad because they have to spend $200-300 extra on a good AIO, or over $1000 on a custom loop, just to get those advertised speeds. I paid for a 13900KS, but if I don't have a very good cooler, it will be slower than a 12900K.
For example, I have a friend who bought a 13600K; he insisted that we just get the best air cooler (yeah, the D15) and hope it would be enough, and he's using TG Extreme for TIM. The CPU still throttles. In normal circumstances, throttling is something that should not happen at all, and if the CPU is rated to turbo to 5.2 or 5.5 GHz it should be able to reach that. Instead of getting 24,000 pts in Cinebench R23 like people with custom loops and good AIOs, he is getting 21-22,000, which is around what an i5-13500 gets.
Running at 100 °C is fine... if the CPU runs at 200-250 W, not at 320-400 W. Saying "it is designed to run at 100 °C because you will get the best performance" isn't a reason not to focus on power efficiency.
There's also the fact that this gen of CPUs hasn't been out long enough to test the company's claims. They SAY that these high temperatures are """"fine"""", but ARE they? We don't know. Taking a seller's word that their product is perfect is always a losing gamble.
I'm in a similar situation to your friend. I bought a 14700 and a CM Hyper 620S, and realised the CPU kept throttling at 100 °C. I've now set PL1 lower, at 167 W, and the CPU package temperature sits around 79 °C (ambient usually 30 °C).
Should have just saved my money and purchased a mid-range processor. Maybe I will purchase an AIO cooler 5 years down the road and consider that my "upgrade".
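For anyone wanting to try the same PL1 cap on Linux without touching the BIOS, here is a minimal sketch using the kernel's intel-rapl powercap interface. The sysfs path below is the usual location for package 0, but it can differ per system, writing requires root, and this is an illustration rather than a supported tool:

```python
# Sketch: read/set the long-term package power limit (PL1) via the Linux
# intel-rapl sysfs interface. Path is the common one for package 0 on
# Intel systems; it may differ on yours. Writing needs root privileges.
RAPL_PL1 = "/sys/class/powercap/intel-rapl/intel-rapl:0/constraint_0_power_limit_uw"

def watts_to_microwatts(watts: float) -> int:
    """RAPL expresses power limits in microwatts."""
    return int(watts * 1_000_000)

def set_pl1(watts: float) -> None:
    """Write a new PL1 value (requires root; takes effect immediately)."""
    with open(RAPL_PL1, "w") as f:
        f.write(str(watts_to_microwatts(watts)))

# set_pl1(167)  # cap sustained package power at 167 W, as in the comment above
```

Note that motherboard firmware or other tools (e.g. thermald) may overwrite this value, so a BIOS-level limit is the more durable option.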
No. 95 °C should never be "crap, I'm wasting performance, I need to overclock until I hit 100 °C!" I don't want degraded CPUs and 50 dB jet-engine fans to keep things cool! I tuned mine down and it never hits above 70 °C, to keep it quiet.
This was a great watch, would love to hear more about longevity in correlation with temps
Big shout out to Intel for doing these reviews. As a consumer I really appreciate these insights.
Awesome video, you can tell there's a lot more behind what's being discussed here. My main concern with temperatures isn't how hot my CPUs get, it's how reliable they'll be at those temperatures. For example, If I want a laptop to survive 10 years, but it runs at 103C on a hot day, I want to know what's happening to the projected lifespan of my components.
It will still last more than 10 years, unless you haven't used any TIM or mounted a cooler at all.
Great interview. Decades ago I worked in the semiconductor side of the business; while he probably can’t get into too much detail, I’d love to know more about how they’ve addressed a lot of the material science issues we had trying to mitigate Black’s Equation, especially at these power densities. It absolutely blows my mind that any of this stuff works, when we were amazed (at the time) that we could hit 300Mhz on a 0.50 μm process.
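For readers who haven't met it, Black's equation models the mean time to failure from electromigration as MTTF = A * J^(-n) * exp(Ea / (k*T)), where J is current density and T is absolute temperature. A minimal sketch with assumed, textbook-style constants (A, n ≈ 2, and Ea ≈ 0.7 eV are illustrative placeholders, not Intel's process numbers):

```python
import math

# Boltzmann constant in eV/K, used so Ea can be given in electron-volts.
K_BOLTZMANN_EV = 8.617e-5

def black_mttf(j_amps_per_cm2: float, temp_k: float,
               a: float = 1.0, n: float = 2.0, ea_ev: float = 0.7) -> float:
    """Black's equation: MTTF = A * J^(-n) * exp(Ea / (k*T)).
    A, n, and Ea here are illustrative defaults, not real process data."""
    return a * j_amps_per_cm2 ** (-n) * math.exp(ea_ev / (K_BOLTZMANN_EV * temp_k))

# Relative lifetime at the same current density, 85 C vs 100 C die temp.
# A (the process constant) cancels out in the ratio.
ratio = black_mttf(1e6, 273.15 + 85) / black_mttf(1e6, 273.15 + 100)
```

With these assumed constants a 15 °C rise in die temperature shortens the modeled electromigration lifetime by roughly 2.5x, which is why power density and the material-science mitigations the commenter mentions matter so much.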
This is fascinating. I got a new Intel processor and it was running at 95 under intense gaming and I was like OMG it’s going to blow up but actually this gives me confidence that it’s all ok.
CPUs start to melt above 250 degrees Celsius; meanwhile people are so scared of running their CPUs at 70 degrees lmao
This did not age too well. Maybe a little follow up is needed. "Go ahead up until PL2 if your system can handle it" they said...
Intel Engineers 🤡
Really interesting. For sure a conversation on oc voltages vs life expectancy would be a good video.
Higher operating frequency means higher temperature.
It's the higher operating temperature that reduces the life of the semiconductor device.
You can go look it up online.
@@deang5622 I mean, yeah. 15 years instead of 20, maybe? Who's gonna use an Alder Lake CPU for that long?
@@Akkbar21 I am correcting the error made by the original poster. I am not commenting on how long people keep their electronic devices. That's a separate matter.
And I have personally seen equipment containing semiconductor devices last for 25 years and still be in use. So the idea the life time is not important is incorrect.
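The temperature dependence being debated in this thread is usually quantified with the Arrhenius acceleration factor from reliability engineering. A minimal sketch, assuming an illustrative activation energy of 0.7 eV (a common textbook value for silicon wear-out mechanisms, not a figure from Intel):

```python
import math

# Boltzmann constant in eV/K.
K_EV = 8.617e-5

def acceleration_factor(t_use_c: float, t_stress_c: float,
                        ea_ev: float = 0.7) -> float:
    """Arrhenius acceleration factor: how many times faster a thermally
    driven wear-out mechanism proceeds at t_stress_c than at t_use_c.
    Ea = 0.7 eV is an assumed, illustrative activation energy."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_EV) * (1 / t_use - 1 / t_stress))

# Under this simplified model, the same silicon held at 100 C instead of
# 70 C wears out roughly `af` times faster:
af = acceleration_factor(70, 100)
```

The point isn't the exact number (which depends entirely on the assumed Ea and failure mechanism) but that the relationship is exponential, which is why both "it lasts decades anyway" and "temperature shortens lifetime" can be true at once.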
@@deang5622 I've got my CPU overvolted at 1.45 V and it's been running for 5 years now. You snowflakes need to grow some balls.
But I also delidded it and applied liquid metal directly to the die where it touches the IHS. And I'm also running a Noctua NH-D15.
As long as temperatures don't cause thermal throttle you are going to be fine.
These types of video interviews are so interesting, I would love to see them turn into a regular series.
Thermal management engineering can be REALLY, REALLY hard.
Measuring stuff where the MEASUREMENT ITSELF, affects or alters your results -- that's also really hard, but also really fun as well.
When I had to do that for my undergrad research (measuring the contact angle of a sessile droplet), there wasn't really a way of doing it without making physical contact with the droplet.
So, given the refractive index of the liquid in question, we were able to use lasers and a little bit of trig to measure it instead.
I *wish* the general population understood more about the über-technical details that go into engineering something.
I don't see why the general population would need to know these details if they are not engineering these components themselves. That is why we have division of labour and expertise, so that people not directly involved in something don't need to waste time studying it. We also let doctors learn about medical issues instead of studying 10 years ourselves to understand the details of our anatomy and how some disease might develop.
@@cyberfunk3793
You'd be AMAZED at how many people pretend to know things, when they've only read very few things about it and then lack the fundamental, background, first principles knowledge that went in, behind said "thing".
Three things:
1) There are principles of engineering that a LOT of people don't understand, due in no small part, to the fact that scientific literacy in Canada and/or the US, is generally, very, very poor.
2) Then you add that to the fact that people like to *think* that they know things, whilst simultaneously, failing to realise how LITTLE they know, based on the LITTLE amount of reading that they've done, and the homework that they HAVEN'T done, with respect to these topics.
(I used to rail on the reviewers at the former computer website PimpRig, which later re-branded itself as PCApex, over their reviews of CPU heatsinks and fans, because they wouldn't state what their ambient temperature was, and they would ONLY report the temperature they saw in whatever software monitoring tool they were using at the time.)
I used to excoriate them for it, so much so that, one of the reviewers being the owner's wife, he would try to come to her defence over the poor job done on said CPU HSF review (which was understandable), but then both of them FAILED to grasp the TECHNICAL aspects of heat and mass transfer, which is one of the courses you need to take in order to graduate as a mechanical engineer.
There are a LOT of people who like to THINK they know things (vs. people who ACTUALLY know things).
And it is only when they are challenged by the latter, sometimes with REALLY, REALLY basic questions about the very thing they're talking about, that you are able to ascertain which side of the Dunning-Kruger plot they're on: the right side (high degree of knowledge, high confidence) vs. the left side (low degree of knowledge, high confidence).
(I've challenged people who believe in chemtrails to provide the pressure and/or mass flow rate of the supposed "sprayer", along with the material properties of the spray itself, so that I could simulate the dispersion pattern using computational fluid dynamics (CFD), and those people are NEVER able to answer those very simple, basic questions. The same is true with 9/11 truthers: I've literally interacted with the "Architects and Engineers for 9/11 (truthers)" and said "you know you can simulate that using an explicit dynamics program like LS-DYNA?" I've even asked them for their LS-DYNA models so that I could review them, and of course they come back with some kind of bullshit answer/excuse like it takes too much time, etc.)
(Meanwhile, HOW much time are they spending on trying to find out said "9/11 truth"???)
So, this is needed because there are people who THINK they know things, and, people being people, they can STILL try to argue even with one of the guys who was actually responsible, in part, for the development and engineering of the very thing they're talking about.
But the ones who learn from this are the ones who won't bother trying to challenge one of the engineers on the product development team; those who do fail to realise that they're fighting a really steep, uphill battle, because there are simple questions he could ask that they simply won't be able to answer, having failed to realise that they had a low degree of knowledge to begin with.
3) I work in engineering for battery electric vehicles.
You'd be AMAZED at how many things people pretend to know vs. what people actually know.
Yes, ideally, we should be able to trust the division of labour and expertise.
But then you get idiots like Wendy Wright (and her "debate" against Richard Dawkins, who has a Ph.D. in zoology), whereby she, at one point, basically says "why should I trust you?"
To which, my response to people who DON'T trust the experts, is for them to run the labs themselves, just like their high school chem and bio classes that in all probability, they never took, because they were "too cool for school".
This is why videos like these are important.
It isn't going to stop people who have a low degree of knowledge and high confidence from being who/what they are, but it WOULD make the entire discussion go a lot faster if you could just send them the link to this video, and then it's up to them whether they want to watch it or not.
My favourite quote that I made up is "if I don't say/write it, it's my fault. If I write/say it and they don't read it, then it's theirs."
The same applies to this video.
You can't make people watch the video, but you can sure as heck point out and highlight the critical and relevant parts of the discussion you're having with that person, using this video.
@@ewenchan1239 I agree that basic knowledge, like good fundamentals in reading, writing, math and physics, is something the education system should be able to give to as many people as possible. I also think, though, that it's common human nature to pretend to know things, and with specialized knowledge like in this video it's not even desirable that 99.9% of the population actually educate themselves on the topic; they'd rather spend their time on something more directly applicable to their own lives, unless they actually enjoy the topic and reading about it is something that interests them.
Another thing is that in pretty much any field other than perhaps mathematics and some really settled issues in physics, there are contrary opinions on many questions even among the experts. Take medicine and/or nutrition, for example: you can get polar-opposite opinions from experts who have studied the topic for years, so it's easy to see why an amateur might think they are qualified to challenge other, more educated people on the topic. If the field internally doesn't have a solid, evidence-based consensus, it's unreasonable to expect it from laymen confused by the differences in opinion among the experts.
@@cyberfunk3793
Three things:
1) Engineering (in this case) is not the same as medicine and/or nutrition. (i.e. unless you're debating the axioms of mathematics and/or quantum physics, there's relatively little debate about the math used in engineering or, predominantly, about classical mechanics, with some exceptions like turbulence, but that isn't what's discussed here.)
2) re: human interest
I would have to guess one of two things:
either a) people like to argue about stuff because they like to be right (regardless of level of competency in the field) and/or b) people like to think and/or pretend that their level of competency in any given field/subject matter is higher than what their actual level of competency is, in said actual field/subject matter, because the latter provides them with that sense of confidence whilst also requiring the person to put in as little work as possible to achieve that level of confidence.
And so long as they don't come across people whose actual competency exceeds theirs, the chances that they are going to get called out on their low(er) level of competency is likely to go unchallenged, if not indefinitely, then for a VERY, VERY long time. That is, until they come across somebody who IS willing to call them out on their lack of competency, and who also doesn't abide by the common, social etiquette of NOT calling people out.
(It has been my experience that more people tend to focus on the social etiquette element of it rather than why didn't the person put more work into increasing their level of competency in said any given field and/or subject matter.)
3) re: medicine/nutrition
a) People's most common exposure to medicine and/or nutrition is when they have the common cold and/or the "flu" (which may or may not ACTUALLY be the influenza virus).
People also usually like to take complicated, multi-variate subjects, and distill that into something "simple" that they can understand, which CAN work, sometimes, but also people tend to neglect that with the simplification, comes ALL of the gross assumptions that are an integral part of said simplification, which people conveniently, and often, neglect to remember.
I remember the master's level international business course that I took - out of a class of 26, 2 were mechanical engineers, 2 were industrial engineers, and the rest were business majors. One of the girls, during our final presentation for an international business game that we were playing (for the class), lamented about how she wasn't trained to think like an engineer, as a part of her business degree. And I remember telling the class, during our presentation "forget the math. The point of the game was that everything affects everything else." And that's when the professor stopped me in the middle of my presentation, and was like "EXACTLY! That was the whole point of the game/exercise."
(Multivariate optimisation is something that I'm used to, due in part to my training.)
Thus, to the point about medicine (e.g. common cold and/or "the flu"), the real answer is that it is multivariate (hence ddx), but also, for common cold/flu, we COULD run the labs, but it is generally NOT cost effective to do so as the labs may only confirm what the doctor is likely going to tell you what to do anyways, and yields little, additional insight.
And just like how people generally suck at systems analyses, medicine is no different. For any condition that is more complicated than that, unless the condition has been studied, there are some medical conditions that we might never know what the real answer is or might be, so the best that we can do is to try and see if we can get close (enough) to what the real answer is.
Again, people like to be able to point to ONE thing that's a cause, rather than a whole SLEW of things, all of which contribute to the cause.
And as SARS-CoV-2 has also shown, even when experts DO study something, that doesn't necessarily mean the average layperson is going to abide by the results of those studies either.
re: nutrition
The answer to this hasn't changed in hundreds (since modern medicine) or even thousands of years: everything in balance/moderation.
It's not rocket science.
Of course, that doesn't include the deliberate and corrupt lies that were promulgated at the hands of capitalism (e.g. the sugar industry blamed obesity on the consumption of fats rather than on the consumption of sugar).
(But capitalism lying to us, repeatedly, is a subject for another discussion, another time.)
If you read the paper on why salt is bad for you, what they actually did in that experiment was force-feed mice/rats about 15,000 TIMES the recommended dosage that rats/mice are supposed to have (in mg/kg), and then they found that, surprise surprise, the mice/rats started having health issues. Well... yeah. No shit. If we consumed 15,000 TIMES more of something than we were supposed to and ended up with adverse health effects as a result, it's no wonder. And that's how they found that out.
Similarly, there is such a thing as water toxicity (i.e. drinking too much water can lead to death), just as there's also oxygen toxicity (which is a part of the reason why Earth's atmosphere/ambient air is only 21% oxygen by volume).
Too much oxygen, and you can die from that as well.
Whenever there is some "fad" diet, I always tell my wife to ask the question "what are they trying to sell (to you)?"
(Because if the person who is trying to do the selling REALLY cared about you and your well being, they'd offer their stuff up for free, if they REALLY cared. But the truth of the matter is that they really couldn't give a shit about you, which is why they are selling it to you because their motivation is to make money, and whether you actually get better or not is COMPLETELY irrelevant and immaterial to their enterprise.)
Getting back on topic though: presumably, the people who are watching this video are doing so because they ARE interested in it.
Similarly, the reason why Roman asked the questions in the manner that he did is SPECIFICALLY designed to dispel and debunk some of the myths that could be relatively, easily debunked with a little bit of basic engineering/heat and mass transfer knowledge, that again, most people are too lazy to want to get.
(They can watch the lectures on MIT OpenCourseWare, as an example. But most won't, 'cause people are generally lazy. High degree of confidence, low competency, corresponding to low amounts of effort.)
@@ewenchan1239 “ (i.e. unless you're debating the axioms of mathematics and/or quantum physics, the math that is used in engineering, as well as, predominantly, classical mechanics - there's relatively little debate about that*
A question like whether it's a good idea to run a CPU as hot as possible to gain maximum performance isn't that close to those physics fundamentals. How the chip might degrade over time due to heat can probably be modelled with math, but to confirm that, real-world samples of the chip probably have to go through that timespan. Also, I'm guessing a hot CPU might affect other components inside the case if cooling isn't efficient enough, and a hot-running CPU most definitely affects fan noise. If the guys talked about the effects on noise, I must have skipped that part, because I don't remember seeing it. So an engineer might be correct about what heat does to the chip itself in a short time span, but not know for certain the long-term effects on the chip before having years of experience with it, nor the short-term effects on things like noise that people typically try to avoid. So there is still a lot of debating possible even on this topic, and it's not just easily settled by math.
"People's most common exposure to medicine and/or nutrition"
Medicine was used just as an example of a field that requires a lot of expertise yet where many issues are debated even among those experts, so a layman flipping a coin can have almost as good a chance of being correct on some specific questions as someone with a degree in the field. I don't think the flu is how regular people are most exposed to these unsettled issues, but rather things like what they themselves eat and whether it's healthy or not. One year, for example, research finds eggs are not healthy; then another finds the opposite, until it gets reversed again at some point.
'And I remember telling the class, during our presentation "forget the math. The point of the game was that everything affects everything else."'
I remember that in my first introductory course on logistics in business school (it might have been something else, I can't remember because it's more than 20 years ago now) we were already doing optimisation and running solvers in Excel and some other software suite whose name I also can't remember anymore, while in my electrical engineering studies the topic wasn't even touched in the first years. I'm guessing what people study in business school depends a lot on the major/minor and what optional courses a person takes. Certainly there are a lot of math courses for those interested in them, for people majoring in, for example, finance, while others studying communication or marketing might hardly be exposed to math at all. I also have experience in multivariate optimisation, because it has been my income source since about 2010, running simulation/optimisation software that I wrote for the financial markets.
“And just like how people generally suck at systems analyses, medicine is no different."
I think the biggest reason there is confusion in the field of medicine/nutrition is that the test subject is us. It's not easy or ethical to run controlled trials lasting 20+ years where people are fed only one thing. Diseases take a long time to develop and experimenting on humans is difficult for many reasons, so epidemiology is typically used instead of randomised clinical trials to figure out the health effects of a certain diet or food.
“Of course, that doesn't include the deliberate and corrupt lies that were promulgated at the hands of capitalism (e.g. sugar blamed obesity on the consumption of fats rather than on the consumption of sugar).”
Typically it has been socialism where the largest lies have been spread by government institutions, but sure, a for-profit company might have an incentive to fund and publish results that are more favorable to its business. It seems to me it doesn't really matter whether it's fats or sugar; it's the total calories, which have grown since the 70s, that are the cause of the obesity. And in fact sugar doesn't appear to be an independent risk factor even for things like diabetes, while saturated fats actually are, so even this issue isn't as clear as many people think.
"(But capitalism lying to us, repeatedly, is a subject for another discussion, another time.)"
It's not the economic system that is lying to us, but for sure for-profit companies might think they have an incentive to cheat. Typically the lies come out in the end and hurt the dishonest business, when people are able to vote with their money, unlike in socialism, where there is no such market mechanism for punishing businesses.
“If you read the paper on why salt is bad for you”
Well this thing isn’t based on a single study but a lot of epidemiology where the effects of sodium on blood pressure have been observed. For many people with hypertension, it seems lowering their salt intake helps but it also seems too low of salt intake can be an issue also for other reasons. Again a hard thing to determine perfectly as long enough controlled randmomized experiments on humans are hard to conduct.
'I always tell my wife that whenever there is some "fad" diet, I always tell her to ask the question "what are they trying to sell (to you)?"'
Obviously companies selling something have an incentive to fund research that is favorable to them; that is why, when reading studies in nutrition for example, it's important to understand the full conflicts of interest the authors might have. It doesn't mean that everyone doing research in the field is corrupt, though, and a lot of research is done in universities, which isn't so affected by the private sector.
This is great content, like the last one. I could listen to you two having an in depth tech conversation for hours
This was great, it means so much more hearing "this is fine, we design it like this on purpose and here's why" from an actual engineer
This is a fantastic interview, with great depth behind his points. Thank you for making this content. I know this guy is telling the truth: I've spent years doing distributed computing projects on BOINC and have had various types of processors, from P3 to Alder Lake, running at load 24/7. All the processors still work, no problems.
Intel is so cool to take the time talking about this.
Thank you der8auer and Intel
They have to, because their CPUs are impossible to cool.
As a former Haswell laptop owner, I had plenty of times where the CPU+GPU was saturated and running at 100C, to the point it was burning my hands, with the exhaust feeling like a hair dryer 😁😁
It was quite funny to see the fuss about AMD's 95C power limit which they hit aggressively to push frequency under load.
Having chips running this high isn't a bad thing… it's amazing these tiny machines work under such incredible heat cycles!!
Amazing work that’s for sure!!
If it's truly not a bad thing, why did they thermal throttle at 80C for so many years?? Doesn't make much sense to leave all that performance on the table when they would have been just fine running above 80C...
@@bobbygetsbanned6049 Because they didn't have the systems in place to very accurately 'guess' the actual hotspot temperature (which could be a lot higher and actually dangerous).
I want this engineer to explain the instability I am experiencing right now...
I really like this kind of videos. Very informative and makes you feel like an insider.
So as was first suspected this video/message was pure gaslighting of the consumer
These interviews with engineers are a great thing for us enthusiasts: a lot of clarification is given, uninformed commentary is removed, and the information helps consumers understand why architectures run the temps they do, which can be peace of mind for some. Great series, der8auer, and keep 'em coming 💪🤯🥰🤩👍
The power density is something interesting to me. It's one of the biggest things I've noticed with the last few generations of CPUs: just how fast they vary in temp. I remember back in the day with a good custom loop you would start a benchmark and temps would slowly rise until they leveled off. Then after the load ceased, temps would drop back down at a good pace, but not instantly. With recent generations you can be at a cool idle temp, start a benchmark, and almost instantly the chip is over double its idle temp and pushing up on its thermal limit. Then as soon as the load ends it almost instantly drops double digits. My observations are with well-designed EK loops. Perhaps it's more accurate and faster sensors in combination with the increased power and power density. This behavior is what makes me think future improvements will literally have to come from a performance-per-watt perspective, as power density already seems to be flirting with the limits of what our TIM and other thermal materials are capable of. Also, I trust that they believe the chips can be run up to their limit with no issues, but I still personally would love to tweak for efficiency and build a proper loop that gets at least quoted performance, or slightly better, while also cooling well enough to stay a decent bit below that thermal limit. Maybe that's wanting my cake and eating it too in today's world, but I'll keep trying to make everything I own better than it was when I bought it, to the limits of my abilities. It's just the enthusiast nature.
Good old days :)
you can tweak TCC under load till you get quoted performance.
you can also cap power.
@@satibel Yes, capping power has worked great for me; I haven't noticed any significant performance loss either.
Yeah, nowadays CPUs are just designed to hit thermal limits before power limits. This is basically what OCing was about. There is very little headroom for OC in modern CPUs with common thermal solutions.
Going to need to buy that compressor cooler :D
@@rkan2 Linus managed to hit thermal limits on a 5 kW chiller, so it seems at that power density you can't.
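The power-capping advice in this thread can be sanity-checked with a back-of-the-envelope model: dynamic power scales roughly with f·V², and near the top of the voltage/frequency curve V has to rise roughly linearly with f, so power scales roughly with f³. A small clock reduction then buys a large power reduction. The sketch below uses that textbook cube-law approximation with illustrative numbers, not measurements of any specific chip:

```python
# Back-of-the-envelope: why capping power costs little performance.
# Assumes dynamic power ~ f * V^2 and that V scales roughly linearly
# with f near the top of the V/F curve, so power ~ f^3.
# A crude approximation, not measured silicon data.

def relative_power(freq_ratio: float) -> float:
    """Power relative to stock for a given frequency ratio (f / f_stock)."""
    return freq_ratio ** 3

for pct in (100, 95, 90):
    r = pct / 100
    print(f"{pct}% clocks -> ~{relative_power(r) * 100:.0f}% power")
# 100% clocks -> ~100% power
# 95% clocks -> ~86% power
# 90% clocks -> ~73% power
```

Under this model, giving up 5% of the clocks saves roughly 14% of the power, which matches the thread's experience that power caps barely dent performance.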
I can't really read the comments much. I have been a machinist for a good amount of time doing very high-tolerance work, and most commenters just remind me of customers who tell me what they need and then tell me how to do the work when they have no idea what they're talking about (hurts my head). I very much appreciate this video. This man has spent much of his professional career doing this, and it shows within the first 2 minutes that he knows exactly what he is talking about. Thank you
The thing is, as a home user, the thing I care more about is perf per watt, not absolute max peak temp. This is almost entirely about my _home_ cooling, not my PC's safe temps.
If I can get 95% of the performance (and currently in some loads over 100%) by choosing a specific platform ... I'm picking the one that doesn't hotbox my room. Same with GPU and sacrificing a couple % max fps to lower temps (and noise) dramatically.
TL;DR: I haven't moved away from Intel out of feeling it runs chips too hot for safety. I've done it for personal comfort.
Undervolting is the new overclocking, but yeah I have to agree. It's not pleasant if ambient temps are already at 21C or higher and your case is dumping heat into the room like a radiator 🥵
Noise is also way way higher in those situations. At 19-20C ambient my undervolted rig has 98-99% the perf of stock (with boost) while CPU & GPU don't push past 45C. That's at 50-60% fan speed tops.
Cool, quiet, basically the same performance. It's a win/win for the end user. I understand they wouldn't want to artificially limit their components and for the sake of extreme overclocking and enthusiast space they should push for the most they can on any given SKU so I guess it's a balancing act. Thankfully we have undervolting, custom fan curves, and a boatload of thermal solutions to mitigate these rising temps.
This feels like a professor casually giving an overview of a subject... enough to pique curiosity but not enough to bore you. Just awesome
This is a great video, information like this is invaluable to us power users. Definitely would welcome more videos with engineers, especially this dude!
This all makes a lot of sense, just wished we'd have gone a bit more into automatic throttling of the CPU as well as noise when it comes to temps.
It's not the fault of the CPU or GPU makers, it's the fault of the case makers.. No case manufacturer has created the window-air-conditioner/computer-case combo.. Or the mini-fridge/case combo
As a diehard AMD user I've got to say that interview was fantastic, very interesting, a lot of passion for his work and clear no BS answers. Large respect.
Yeah, please talk to Mark about CPU voltages, OC and life span of the CPU. That would be super interesting!
OC = less life
CPUs are just so much tougher than people give them credit for though. Unless it's XOC type stuff.
@@Kholaslittlespot1 Not always. My 13600KF died; the mobo was giving it 1.5 V on default settings. I had it undervolted from day one, and after 3-4 months some cores died
@@_sneer_ ouch. Some of these board manufacturers need to be held accountable for the ridiculous default settings.
@@Kholaslittlespot1 yeah, MSI is garbage. Only a little better than Asrock
Fast forward a year and boom, intel released a statement in April 2024 having manufacturers throttle the voltage. Recommended voltage max is now 400 watts vs Asus setting of 511. And I am seeing crashes when hitting 100c. Mind you this is watercooled and has been from day 1
Watts is not "voltage", it's power.
I KNOW the CPU "works" at those temps. I just don't think any of us have good reason YET to trust a statement from Intel or AMD that the CPU is going to last until it is fully obsolete. And especially if you actually use that CPU heavily loaded every day at those temps. If the CPU is pegged at its loaded throttle point (95C, 100C, whatever it is) and you just leave it there all day every day... how long until MTBF? Frankly, that needs to be fully calculated and disclosed at these insane temperatures. It's impossible to believe that degradation isn't substantial at 100C.
Great video. Nice to have an actual engineer who knows what's really going on inside an Intel CPU and is willing to share a slice of his expertise.
Good info. I understand his point about scavenging as much potential as possible. I still feel a bit apprehensive about running my CPU at rated temps. I could be wrong, yet I believe "high" temps (north of 80C) must adversely affect chip life.
It's understandable.
But you may also want to take into account the history of how this belief appeared.
In older times, sensors weren't as integrated into the die as they are now, so they reported lower temperatures, like a VRM temp. You see one value, but the actual temp could be 15-20 degrees higher under the package.
Back then the number of temp sensors per die/core was also lower, and hotspots weren't as openly readable.
The modern reported temp is much closer to reality than what came before.
But I can still see why temps above 90 degrees can be unsettling to observe. Issues at those temps may not even relate to die longevity itself; they can be more in the area of interaction with thermal paste, like pump-out and component separation.
@@DimkaTsv Sensor count. Yes, that makes sense. My old CPU (Sandy Bridge 2600K) appears to run cooler, even with a mild overclock. It does have a much lower sensor count. I under-volted my new CPU (Ryzen 5800X) because of higher apparent temps. I may have been a bit hasty. Cheers.
This was good, really good. Thanks der8auer for bringing such amazing content for us. We need one such interview with amd people as well, if possible.
The problem is the heat being dumped into the room! That's another reason why you hear they're 'too hot'.
Having said that I love the technical stuff being provided here
But isn't that about the actual power consumption? Ie. a CPU running at 95C but pulling 100W dumps less heat than a CPU running at 85C but pulling 150W.
Die temperature and thermal density have nothing to do with how much heat is dumped.
It's as if we compared a bowl of soup to a heated floor, or to a room heater.
The floor is colder, but it dumps MUCH more heat into the room than a bowl of soup at 90 degrees. Heaters can be as small as that bowl of soup and still push out more heat as well.
CPUs and GPUs are 100% efficient, because all the power they draw goes into heat; performance is just the desired side effect. That means how much heat is dumped into the room depends exclusively on how much power the die consumes.
A high core/die temperature is just a side effect of high thermal density or slow heat dissipation by the cooler. Even with an absolutely insane water cooler whose coolant is held to within 0.1 degrees at 5 degrees (like coolers used to chill lasers), a CPU can still throttle. A 13900K, for example, sat at 63 degrees in Cinebench at stock under those conditions and still throttled with some OC, despite such a cooling solution. At that point the issue lies in the physics of heat transfer.
@@DimkaTsv @2intheampm512 But if you have 2 systems with all variables the same (CPU watts, cooler etc), and the only difference is the CPU max temp limit......
I actually suck at understanding heat, thermal dynamics or whatever
@@fumo1000 Doesn't matter. If the CPU wattage is the same, the overall heat output will be the same.
CPU temps, given the same power consumption, depend only on the efficiency of heat transfer and the cooler's heat dissipation. That doesn't change the amount of heat, just the speed of its removal from the CPU.
It's like trying to chill a bowl of soup by placing it in cold water versus leaving it out in the air. The thermal energy in the soup is the same, but it will cool down faster, and to lower temps, in cold water.
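The point made in this sub-thread, that heat dumped into the room depends only on power drawn and not on die temperature, can be put into numbers by applying energy = power × time. The wattages below are illustrative:

```python
# A CPU is effectively a 100%-efficient space heater: every watt drawn
# ends up as heat in the room. Die temperature tells you nothing about
# total heat output. Wattages here are illustrative, not measurements.

def heat_joules(watts: float, seconds: float) -> float:
    """Energy dumped into the room over a period: E = P * t."""
    return watts * seconds

hot_die = heat_joules(100, 3600)   # 100 W chip sitting at 95 °C for an hour
cool_die = heat_joules(150, 3600)  # 150 W chip sitting at 85 °C for an hour
print(hot_die, cool_die)  # 360000 540000
```

The "cooler" 85 °C chip heats the room 50% more, because it draws 50% more power; the die temperature never enters the calculation.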
Great video, I enjoyed Mark's comments and enthusiasm.
Very well explained. I remember when Ryzen 3000 came out and people were going nuts over the voltages, because they were spiking when being observed as the sensors were being probed. I also remember, back in my college years, dripping water onto a CPU without a cooler; they used to get so hot.
Why did you run a CPU without a cooler haha
@@The_Man_In_Red college experiment in class learning it was with old hardware and controlled with drips. Cool visual demo.
@@AdamsWorlds Ahhhh, neat
@@The_Man_In_Red Yeah, I still didn't see the point of it for learning; if they wanted to prove how hot a CPU can get without a cooler, just saying so would have been enough lol. But I guess it did something, because I remember the class very well. I doubt they'd be allowed to do it these days for health and safety reasons. That teacher was so cool. We even took the top off an HDD to watch it spin. For networking, the teacher had us bring our PCs into school and hook them all up for a LAN party! It worked very well as a teaching aid: everyone that day learnt to troubleshoot and network.
You guys are the heroes we need. I could watch this as a podcast forever.
Clearly Intel is at the end of its rope, pushing its chips to the limit; it has to choose between its own survival and that of its CPUs.
Then along comes the laptop manufacturer who wants to stay under 10 mm thick and save as much copper as possible....
The system reaches 100C after two seconds of activity? It's okay, the CPU takes care of it with thermal throttling. You paid $1000 for a CPU but it performs like a $200 one? IT'S ALL OK! Your two-year-old laptop suddenly dies? What's the matter, the new version is out!
Don't worry, it's all normal, keep playing as the Titanic sinks
Amazing work really! I love the direction der8auer's videos have been going lately. Keep it up!
2024... well this crap aged like milk......
If you're referring to the recent Intel chip failures, those were caused by incorrect voltage limits set by buggy CPU microcode. They weren't temperature related, which is what this video is discussing. You can still run your chips up to the thermal limit allowed by the cutoff circuit, as explained in the video.
This is so incredible and informative. I totally understand the newfound anxiety of 100-115c temp limits but with the new sensors and testing it kind of just makes sense to gauge more effectively. Of course lower temps are generally going to be better for fan RPM / longevity but it's a complex topic and for high performance this just totally makes sense.
The only real downside I have noticed myself is that it makes controlling fan RPM more difficult, because in my experience I could lower my RPM greatly and still get similar temps on a lot of chips. My old 3600 was silent with low temps. I guess this is just the price of having high-performance parts, though, which I am new to experiencing.
By the time overheating causes degradation, your pc will be technologically obsolete.
It's great to see Mark again, fantastic interview
The problem is not the CPU temperature in itself but that it controls our fan speed dynamically meaning higher temperature = higher fan speed = more noise.
Performance is desirable, noise isn't 🤷♂
I don't really agree with 'if the CPU is not at its limit, you are missing out on performance'. It all depends on how much performance it is and what the downsides are. If I have to double the power draw for just 10% more performance, then that is clearly not worth it for me. I prefer saving energy and 'losing' a little bit of performance. If it were ONLY temperature and not power draw (which of course is not how physics works), then sure, I wouldn't care about temperature either
You have a contradictory argument, though. If your workload doesn't require the potential of whatever CPU you have, you could have saved money and bought a lower-tier CPU.
This discussion relates more to people buying top-tier CPUs without even using 10/20/30/40/50% of their potential, while handicapping the CPU to get lower temps.
@@prisonmike9823 My example was halving the power draw for 90% the performance (which is what Ryzen 7000 does). I very much doubt there is a lower tier that still has 90% of the performance of the higher tier.
Also as I said, I don't do it for the temperatures, but for sustainability and energy usage reasons. If we use double the world's energy for 10% more performance in our tech, it would be a quick way to never solve the global warming problem
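The trade-off described above, roughly 90% of the performance at half the power (the commenter's Ryzen 7000 example), can be expressed as a perf-per-watt ratio. A minimal sketch with those illustrative numbers:

```python
# Perf-per-watt comparison for the "90% performance at half the power"
# example. Scores and wattages are illustrative, not benchmark results.

def perf_per_watt(score: float, watts: float) -> float:
    """Efficiency as normalized benchmark score per watt drawn."""
    return score / watts

stock = perf_per_watt(100, 200)  # normalized score 100 at 200 W
eco = perf_per_watt(90, 100)     # 90% of the score at half the power

print(f"efficiency gain: {eco / stock:.1f}x")  # efficiency gain: 1.8x
```

Giving up 10% of the performance here buys an 80% improvement in efficiency, which is the core of the commenter's argument.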
I wish I'd seen this closer to publication, I've been wondering this FOR YEARS, and it's nice to have the issue put to rest in my mind.
It would have been nice to confirm whether the MTBF increases with lower temps... silicon, like every product, has a lifespan. If you change your CPU every 4 years there is no problem running "hot", but if you want it to last more like 8 to 10 years, the lower the temp the better.
You have to take into account not only the CPU but the surrounding components on the motherboard as well. Output filtering capacitors sit very close to the CPU socket, especially on ITX boards. The general rule of thumb is that every 10 Celsius increase in a capacitor's operating temperature halves its lifespan.
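That rule of thumb, capacitor life halving for every 10 °C rise, is an Arrhenius-style approximation; real datasheet curves vary by part. As a hedged sketch with illustrative ratings:

```python
# The "10 °C rule" for electrolytic capacitors: expected life roughly
# doubles for every 10 °C below the rated temperature, and halves for
# every 10 °C above it. An approximation only; check the part's datasheet.

def cap_life_hours(rated_hours: float, rated_temp_c: float, actual_temp_c: float) -> float:
    """Estimated life using the 2^(ΔT/10) rule of thumb."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# A hypothetical 10,000 h / 105 °C-rated capacitor running at 85 °C:
print(cap_life_hours(10_000, 105, 85))  # 40000.0
```

So a board that keeps its VRM caps 20 °C cooler could, by this crude rule, roughly quadruple their expected life.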
WHY wasn't this recommended to me 2 months ago when it dropped!?
This is an amazing interview!
Easy solution: CPU manufacturers could provide long-term (e.g. 5-year) warranties for running the CPU under the recommended temp: guidance, plus a nice feeling of safety for the user.
You can't get that unless you give Intel/AMD access to your CPU's activity and behavior and let them validate that you never went above that temperature. I don't think it's worth opening that can of worms. People just need to be educated on the fact that there's no difference in wear between a CPU running at 90 degrees and one at 50 degrees if their behaviors are the same.
There's no problem needing a solution. The engineer explained why this is normal.
Changing the warranty from 3 years to 5 years make that big a difference to you?
@@joemarais7683 You can't definitively state that. Temperature is absolutely a factor in silicon degradation and we don't have any historical reference of how these temperatures are going to impact the longevity of these CPUs over time.
@@TheGameBench Has anyone actually had a CPU fail without doing something inadvisable to it? It's the most reliable part of a computer; that's why older motherboards hold their value, they die far before the CPU
This is so awesome to listen to.
Having a specialist (two, actually) talk about the genre (PC tech) I love.
What about longevity of the part? Will running more often at the limit cause it to fail earlier? I definitely don't upgrade often, and I'd like my cpu to last a decade at least. 15 years would be nice.
I was wondering the same thing. I'm no expert but if I had to guess more heat would degrade the cpu faster but probably not enough to make much of a difference long term. I guess we'll find out 10 years from now.
@@rileyxbell heat can kill anything
Exactly this. Pushing things to their limit is stupid beyond words. Everyone drooling over this interview is completely ignoring the long term, acting like a bunch of sheep who have never looked into things, just parroting each other. We all knew everything in this video before the interview; there was nothing new presented here.
Take good care of things, undervolt/underclock, and they'll last a lifetime; do what Intel and AMD are doing and you'll be contributing your PC/laptop to e-waste after the warranty or "lifetime" ends. Use Asus to speedrun that process.
I can’t speak for the others but I’ve personally never had issues with the CPU running at 100 degrees. I’ve been a laptop guy since forever and the older laptops used to sit at 100 degrees all day everyday while gaming and not one died or got damaged or had issues. The silicon will outlast your device even at those temps. Sure, undervolting is good. But I started undervolting mostly to reduce fan noise than to cut temps. To this day, my Helios 300 runs at minimum fan speed while gaming and CPU stays at more than 95 degrees. I can increase fan speed to cut the temps but it doesn’t matter really. Hope my answer helps.
I guess this video aged extremely well /s
It's fine to just say all of this but look at all the reports of these chips crashing in unreal engine games with stock settings... fixes are undervolt it, lower clock speeds, set power limits etc... my fix is asking for a refund tbh
Indeed, this video didn't age well.
Thank you Mark & der8auer! Please don't hesitate to get more technical. We can take it. :)
Great discussion to dispell myths, you both cover these issues very well and casually.
The honest answer about the hot spots and where they could place sensors was eye opening. Great interview!
I'm sorry, this is just not OK; NOTHING works well at 80 degrees!
Why don't you just tell the truth? You haven't worked on your CPUs to fix the problem, and maybe you can't?!
But it's more important to make more money, because $450 for an overheating CPU is just not enough..
“Behind every great invention, there's an engineer who thought, ‘I can make this better!’” (They never do.)
Oh damn, I guess you know better than Intel engineers
@@blargface1561 lol /s
Brilliant. Thank you. These guys, along with the NASA engineers are like unsung rock stars to me.
Would love an interview with an intel engineer on safe voltages.
Read the data sheet for the device!
Every manufacturer of integrated circuits produces one.
The data sheet contains an electrical specification section. I know, I used to write them.
Just a random August 12th, 2024 comment here. I would have loved to speak to Mark now after this fiasco/scandal over breaking CPUs
I would be way more direct and simply state that in my experience, riding temps even in high 80s can and did degrade older Intel CPUs
I am on my 2nd 13900k, and even with 1st degraded chip I was undervolted from Day 1
How many people are fixing cores like me, and undervolt like me?
Probably about 3% of people
I have respect for Intel and AMD engineers, but this beating around the bush isn't reflective of any chip, current or previous.
We are balls to the wall with these "stock overclocks", without even touching the chip.
We are now required to undervolt with both AMD and Intel.
Both companies are destroying their own chips: the 95C idea on AMD, and the overheating 13900K and 14900K on Intel.
Hopefully, this Intel fiasco will make them think about power efficiency first, and increase IPC without burning the silicon off.
My concern is with thermal expansion. What will reaching 95C do long term for wear and tear, and also the rapid temp swings in chips. Rapid changes are also harder on materials. Will these two factors manifest in a shorter CPU lifespan if you use it's full potential?
I’ve heard it’s a difference of a cpu lasting 15 years instead of like 20. So it’s not really relevant tbh
Another very interesting and enjoyable tech talk with the guy who makes the parts! Nice job!!!
I didn't know Weird Al Yankovic was an engineer at Intel.
His mind is too powerful to be restrained to music alone!
Was just about to post something similar.
We shouldn't be surprised though, after all he made 'all about the pentiums' back in the day.
The older generation of builders may be the cause of this "off-spec" temperature mania. We learned to keep CPUs under 80°C or lower, because we knew what it looked like on an older system when we powered the computer on without a cooler: the die just shrinks, discolors, and quits. So now they see these "unusual" temperatures and panic. We all have to get used to the higher temp specs. Great video. Thanks.
So are these temperatures safe? Let's say I run Handbrake all the time converting into HEVC and the CPU runs really hot; is it safe for the chip long-term, or does it lower its lifespan? For example, my 2500K has been running perfectly fine for 12 years, but it runs cool, far away from 70-100°C.
I would guess that the "untold truth" in this new way CPUs tend to work is that they will burn out way faster than older CPUs did...
So, for example, if an Intel 1st or 2nd gen had been designed this way, it might be burned out instead of still running right now, 12/13 years later.
The fun thing is that an overall shorter CPU lifespan would raise demand for newer CPUs, since the used market wouldn't be as great as it is today, and this is exactly the type of thing that improves corporate profit at the expense of users' economy and natural resources
der8auer is so happy to talk with someone who speaks his language :D
Will there be a pt 3? It was a great interview!
They're not "designing them to run at these temperatures"; they're doing damage control/mitigation/workarounds because their current process is horribly power-inefficient. Yes, AMD also runs high temps with current designs, but if you delid them both, the AMD is still in the 70s at max load while drawing nearly half the power.
Also, "wasting potential if you're not running at the limit"? Maybe for the 0.1% elite-overclocker user base, but for anyone running workloads, or an everyday user, 200 MHz less will cost barely anything while making a massive power difference.
To me this feels like guerrilla marketing for an inefficient node process. Putting the sensor further from the hot spot is great design /s... it just lends the perspective of "we don't actually know what the outcome will be long term, and if we report lower temperatures we'll sell more chips. We don't actually care what's going on in the core temperature-wise, and neither should you."
As user monkeys... we don't care either. We just have expectations because of all the lies over the last 3 decades that these companies have conditioned us to believe about operating temperatures and electromigration. First it was "DON'T DO THIS FOR TOO LONG"; now it's "IT'S FINE, IT'S DESIGNED, IT'S ENGINEERED TO".
No matter what these companies try to peddle, we still know the cooler a system runs, the longer it lasts. Temperature accelerates entropy.
this is so much fun to watch! 20min video that went real quick for me
My issue with "modern" CPUs is less that they run hot and more that crutches are built in.
Ex: the new IHS on Ryzen acts as a blanket on the chip, preventing you from removing the heat and improving its potential.
Ex 2: from Sandy Bridge to Ivy Bridge, Intel changed from solder to a crap paste that acted as a blanket, resulting in the same situation as Ryzen.
Both examples are bad engineering. The only way to fix them creates high potential risk on the consumer's side: you remove the IHS and direct-mount, but you risk breaking chips just to achieve what should have been correct from the factory.
If they want it to run hot to utilize greater potential, cool, I'll buy a better cooling solution. But if they put a blanket on the chip, there is no cooling solution that will resolve the issue unless I risk damaging the part and being out money and time.
What a lovely person! a pleasure to hear somebody speak about their subject :)
Would love to see an interview about overclocking and their opinions about what is the ideal max voltage for overclocking depending on which type of cooling used
The max voltage will be the one just below the one that destroys your CPU. 👍
The Mark interview saga has been absolutely fantastic. Please, more interviews like this.
also thank you for the wonderful content lately Roman.
Intel needs to just offer the CPUs pre-delidded… maybe the KS CPUs? Some of us just want to use liquid metal by default. The PS5 already uses liquid metal, and Asus does it in their laptops. Liquid metal is the next thermal paste for these higher temperatures…
Production wise the cost would be insane!
It would actually save them a bit of money. They don’t have to solder the KS processors
Why would you delid the KS series?
If you think about it, they are binned CPUs, which means they need less voltage for most likely the same frequencies.
I'd rather delid the normal K series instead.
None of my K-series chips scored above 90 points in the Asus BIOS, while the KS scored above 110 points.
Might not be appealing for everyone, but I'm running a 13900KS undervolted, with low load-line calibration, as a best-case scenario..
Managed to pass the R23 multicore without crashing
@@Need4FPS Because binned CPUs like the KS can overclock a bit better and pull more power. Not every K owner is going to want to mess with liquid metal, but the overclocking community likes the KS series since the baseline is clocked higher. Also, since it's a limited production run, Intel would benefit from that. They can't sell all CPUs delidded; it wouldn't make sense for them
@@Multimeter1 The overclocking community is so small that it would just be additional cost for Intel to set up a manufacturing and packaging line for CPUs without the heat spreader. Remember that they need to test the CPUs before packaging, and in this case they'd need to do it all without the heat spreader. So that is: alteration of the manufacturing line to accommodate the fork for lidless CPUs (i.e. covering SMDs etc.), separate QA procedures, separate packaging. A lot of resources for the few bucks more they would make from such CPUs. Completely not worth it.
This is nice and all, but in the real world you don't notice the performance difference between 5.8 GHz and 6 GHz, while the power and temperature difference is very significant. Running 5.8 at ~70 degrees is definitely better than 6 at 100.
Would love to keep seeing interviews with Mark. Really enjoying these.