I learned 10x more from this than school
Go to a better school.
12:42
"There is still plenty of room at the bottom"
──Jim Keller
This man is a legend...
Saved AMD Radeon(?), now saving Intel. He is the chosen one.
@@OninDynamics He did not save AMD. Remember the RX series, how hot they ran, and how buggy the drivers were?
24:19 I wonder if Intel is thinking about making some ARM cores. They should be.
Intel has made a new line of processors which will challenge and beat ARM: its Lakefield lineup.
@@InterstellarLord Intel is big enough to support multiple ISAs; look at their R&D compared to any ARM vendor, AMD, and a fabrication plant combined. Point being, I hope they see that they shouldn't tie themselves to the x86 ball and chain forever.
@@anamikadas2888 Half the MP performance of a Snapdragon 8cx. That's far from the most impressive ARM effort either; let's see Apple's in a few months. It's a start, but I don't see why a company the size of Intel can't start to weigh its future and become ISA agnostic.
www.tomshardware.com/news/intel-core-i5-l15g7-lakefield-cpu-rivals-qualcomm-snapdragon-835#:~:text=With%20a%20single%2Dcore%20score,workloads%20are%20another%20story%20though.&text=For%20reference%2C%20the%20Snapdragon%208cx,the%20remaining%20at%201.8%20GHz.
Intel messed up when they declined to make chips for the iPhone. Opportunity lost.
@@doctorpanigrahi9975 lol no, margins are very small for chip partners, so no one cares. Just like Nvidia didn't make the PS4 and Xbox One chips but still has 4 times more market share than AMD. Think about that...
2025 is going to be nuts.
I think Zuckerberg learned why Napoleon said, "I'd rather have a lion leading donkeys than a donkey leading lions."
Here's your lesson, Raja Koduri: the one who assesses the assessee must be more qualified, otherwise the assessor judges the assessee to be a fool.
Who's the fool? The assessor that didn't know that!
Thank you so much for the valuable information, and proud to be part of Intel.
Enjoy sinking to the bottom of the pile with this crap.
Whatever?
@@phoenixzappa7366 why so salty!!?
ALWAYS MORE POWER
Amazing.
never expected i can watch this on youtube! thanks for sharing
Great keynote
In the end, he didn't mention the most important thing: performance per watt, or performance per cost.
I wouldn't have known about Frances Allen without watching this, thank you. Adding Compute Disruptions to the Performance Democratization slide was helpful 21:42 and acknowledging Developers role as well as ARM and Android leadership in Mobile/Cloud era demonstrates how everyone in the industry benefits and contributes to this amazing growth.
Make sure no core left behind too!!
This dude thinks he knows more about processes than me.
Informative. When can we expect Intel to develop a quantum computing processor for commercial launch? In the next 10 years?
Quantum computing only exponentially speeds up certain tasks and operations, which are not that significant for the normal everyday user, and it's a big if whether it will ever be commercialized.
@@vlada881 Yeah, certain tasks for the time being. Later on, it might be useful for certain engineering applications that we do every day. It may not be that important for content creators and daily uses like browsing and such. Well, we never know what will happen next... in the future.
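To put rough numbers on the "only certain tasks" point: for unstructured search, a quantum computer running Grover's algorithm gets only a quadratic speedup, not an exponential one. A small Python sketch of the textbook query-count estimates (illustrative arithmetic only, not a quantum simulation):

```python
import math

def classical_queries(n: int) -> int:
    # Classical unstructured search over n items: O(n) queries in the worst case.
    return n

def grover_queries(n: int) -> int:
    # Grover's algorithm needs about (pi/4) * sqrt(n) oracle queries.
    return math.ceil((math.pi / 4) * math.sqrt(n))

for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"n={n}: classical ~{classical_queries(n)}, Grover ~{grover_queries(n)}")
```

Even at a billion items, that's roughly a 40,000x reduction in queries, not the exponential win you get from Shor-style algorithms on their specific problems, which is why the speedup is so workload-dependent.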
I predict most of our computing will be done in the cloud (server farms), which are a lot of quantum computers working together as one massive computer for the world to use to work and share things together.
Our personal devices are just a gateway to connect to such massive computing devices in the cloud.
If we evolve the performance of computing for one massive computer, both hardware and software, we will achieve a massive scale of performance.
Just some thoughts in my head at least. 🤷🏼♂️
It's very important that we all work together to achieve greatness. 🤓🌎🌍🌏
Mind = blown. I am very nerdy about this field, and I want to know every detail about the new tech that's cooking.
If you can't beat the RTX 3090, you're fired!! lol
Haha, is Raja the head GPU man for Intel?
Is this intro for a dope health insurance?
Are we on the wrong path in memory? Long ago, more DRAM was desperately needed and cost optimization was the first priority. In the SDRAM/DDR era, the priority has been bandwidth for throughput first, cost second. For the last several years, systems have had far more memory than really needed, and DRAM cost is essentially irrelevant (for professional use). Today, much of the capability of modern cores is wasted on no-op cycles waiting for memory access. There should be an option for low-latency DRAM in servers with ECC memory. Currently we only have the standard 14–14.5 ns CAS, RCD, RP latencies; gamers can get latency down to 8.5 ns. Let's cut the L3 miss penalty down from 18+ ns in the high-core-count die.
Also, what code uses SIMD/AVX but cannot run on the GPU? If very little, the GPU is the right place for SIMD, and let's return the core to general purpose.
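For anyone wondering where those 14.5 ns vs 8.5 ns numbers come from: a DRAM timing like CL is quoted in memory-clock cycles, and DDR transfers twice per clock, so you can convert to nanoseconds from the transfer rate. A quick sketch (the DIMM speeds/timings are typical illustrative values, not specs for any particular module):

```python
def cas_ns(cl_cycles: int, transfer_rate_mt_s: int) -> float:
    """Convert a CAS latency in cycles to nanoseconds.

    DDR moves two transfers per clock, so the memory clock in MHz
    is half the transfer rate in MT/s; 1 cycle = 1000 / clock_MHz ns.
    """
    clock_mhz = transfer_rate_mt_s / 2
    return cl_cycles / clock_mhz * 1000

# Typical server ECC DIMM vs a tuned gaming kit (illustrative numbers):
print(round(cas_ns(21, 2933), 2))  # DDR4-2933 CL21 -> ~14.32 ns
print(round(cas_ns(14, 3200), 2))  # DDR4-3200 CL14 -> ~8.75 ns
```

Which matches the comment's point: server JEDEC-standard timings land around 14–14.5 ns, while tuned consumer kits get under 9 ns at the cost of ECC and validation margin.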
Bruh better get amd
I would like a more generalized ISA that is not garbage, which works on CPUs and also GPUs.
The problem with supporting ninjas is just that I call them "js people", who make tons of useless abstractions.
She kinda reminds me of Guerrilla Games' fictional character Dr. Elisabet Sobeck.
CPU performance increase in the last 10 years was barely 2 to 3x? These days 2x in transistor density leads to 5-10% more speed, so 50x is not that much, especially if that goal will take more than a decade to reach. You can only achieve additional exponential performance increases in certain tasks and operations. By its definition, "Moore's law" is already dead. ua-cam.com/video/Vu3YWNbXkKk/v-deo.html Now it's all about special-purpose computing, AI, ML, and DL, which for normal users is not so significant. Personal computing (desktops and laptops) has less and less importance for tech companies, and they are diversifying their business models.
Only if you look at Moore's law as an increase in speed. However, there are gains in how much data you can process in a given period of time. More cores/threads to crunch more data is where Moore's law is still very much alive.
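The gap the two comments above are arguing about is easy to quantify. If density doubles roughly every two years but each doubling only buys 5-10% single-thread speed, a decade looks like this (back-of-envelope arithmetic with those assumed rates, nothing more):

```python
# ~10 years at one density doubling per ~2 years:
doublings = 5

transistors = 2 ** doublings        # transistor budget grows geometrically
speed_low = 1.05 ** doublings       # 5% single-thread gain per doubling
speed_high = 1.10 ** doublings      # 10% single-thread gain per doubling

print(transistors)                  # 32x more transistors
print(round(speed_low, 2))          # ~1.28x single-thread speed
print(round(speed_high, 2))         # ~1.61x single-thread speed
```

So the transistor budget still grows ~32x while single-thread speed grows well under 2x, which is exactly why the extra transistors go into more cores and accelerators rather than a faster single thread.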
"I predict most of our computing will be done in the cloud (server farm's) which are allot of quantum computers working together to be a massive computer for the world to use to work and share things together.
Our personal device's are just a gateway to connect to such massive computing device's in the Cloud.
If we evolve the performance of computing for one massive computer both hardware and software we will achieve a massive scale of performance.
Just some thoughts in my head at least. 🤷🏼♂️
It's very important if we all word together to achieve greatness. 🤓🌎🌍🌏"
Love it! Buy INTC!!
All I hear is complaints? The guy did good. For abstraction design, start deploying your ninjas... and try to loosen the constraints as generically as possible during builds.
Let's pray no one dies in the wafer.
Imagine hiring this guy after his failed performance at AMD.
Intel Xe graphics is the biggest failure...
Lol.
Just look at what happened to AMD after he left... Good luck, Intel!
What do you mean by that?
Most of AMD's successful architectures are from him, including Navi and Vega.
Move to 5/3 nm now; 10 nm is killing Intel. IPC should be improved. What the hell are the engineers doing? Or stop selling processors. The only leverage that Intel has is stability.
Apple silicon is gonna eat Intel alive 🤭, with its incredible performance per watt! 😂
Apple only sells to Apple.
@@Anenome5 but kills others
@@prithiviraj3348 so why do macs only have 10% market share?
Macs aren't going to dominate market share, ever.
@@xsuploader All these years they used inferior Intel chips; now they have their own weapon in hand. So just wait and watch! They are gonna take over the laptop/notebook market, but definitely not the desktop market, unless they come up with amazing GPUs.
@@prithiviraj3348 I doubt Apple will ever have a majority share in either the desktop or laptop market.