ARM got a big boost from Apple and Android. Intel has to do something about this, otherwise they'll end up like the dinosaurs. And RISC-V is perfect for that: they can keep up with the times without giving up their main product.
@@GaryExplains Yes. I now see how my comment made that impression; I should've worded it better. I was just saying that the x86 architecture has been RISC-like internally for almost a decade. And besides, why would Intel abandon their own x86 ISA for the RISC-V ISA when it is the micro-architecture that counts?
I think it is the architecture that counts first, then the microarchitecture. Also, Intel's need to translate variable-length instructions into fixed-length RISC-like micro-ops is a handicap in terms of power and performance.
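The decode handicap is easy to illustrate with a toy Python sketch. This is not real decoder logic, and the x86 instruction lengths below are invented for illustration; the point is only that fixed-length encoding lets you find every instruction boundary independently, while variable-length encoding forces sequential decode of everything before it.

```python
# Toy sketch of why fixed-length encoding helps the decoder.
# (The instruction bytes/lengths below are made up for illustration.)

# RISC-V base ISA: every instruction is 4 bytes, so the Nth
# instruction starts at byte N*4 and decoders can work in parallel.
fixed_stream = bytes(16)  # pretend these are four 4-byte instructions
boundaries_fixed = list(range(0, len(fixed_stream), 4))

# x86: instructions are 1 to 15 bytes long, so instruction N's start
# is only known after sequentially decoding instructions 0..N-1.
x86_lengths = [1, 3, 2, 6]  # hypothetical lengths found during decode
boundaries_x86 = []
offset = 0
for length in x86_lengths:
    boundaries_x86.append(offset)  # known only after decoding predecessors
    offset += length

print(boundaries_fixed)  # [0, 4, 8, 12]
print(boundaries_x86)    # [0, 1, 4, 6]
```

Real x86 front-ends work around this with predecoders and micro-op caches, which is exactly the extra power and area the comment above is referring to.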
Arm and RISC-V are both RISC architectures, but that is where the similarities stop. Both Ford and BMW are car makers, but that is where the similarities stop. I suggest you watch my Arm vs RISC-V video.
I believe Intel will be very successful balancing its manufacturing and design businesses. Samsung designs its own chips and also manufactures OEM chips, and so far that hasn't been any more of an issue than it is for other chip designers and manufacturers. At this stage, I am surprised it took Intel this long to come up with this idea. I have a Xiaomi phone, and it is basically a Samsung phone assembled by Xiaomi: from the screen, to the memory, to the Qualcomm chip manufactured on Samsung's fabrication process, Samsung wins even if you don't buy from them. Intel should have done this years ago. We need more competition in this space.
System on a package greatly increases wafer yield because smaller chips are individually less likely to contain a physical defect. If each blank wafer of silicon contains one defect and you make a giant IC that uses the whole wafer, then you have a 100% loss rate; but if you can fit 4 ICs per wafer, the one defect will only fall on 1 of the 4 chips. Even if the final package requires 4 chips to equal the monolithic design, the 4-per-wafer method still only has 25% wastage. This is actually a big part of the cost savings created by shrinking transistors, and thus IC size, for a given design. There are limits of course: at some point the chips are so small that the defect loss percentage is outweighed by economies of scale, handling/packaging costs, and efficiency losses due to inter-chip connections. (The 386 had L1 cache as a chiplet, but the clock speed was low enough that this didn't hurt; the 486's cache was moved on-die.) Reduced transistor size also makes the product sensitive to smaller defects, but this is not relevant to SoaP as the lithography doesn't change. (It could even use previous-gen larger features while still keeping the chiplet count per wafer very high.)
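To put rough numbers on that argument, here is a short Python sketch using the standard Poisson defect-yield model. The defect density and die areas are illustrative, not real process data:

```python
import math

def die_yield(defects_per_cm2, die_area_cm2):
    # Poisson model: probability that a die of this area has zero defects
    return math.exp(-defects_per_cm2 * die_area_cm2)

D = 0.1  # defects per cm^2 (illustrative)

mono = die_yield(D, 8.0)      # one monolithic 8 cm^2 die
chiplet = die_yield(D, 2.0)   # one 2 cm^2 chiplet (four needed per package)

print(f"monolithic yield:  {mono:.1%}")     # ~44.9% of big dies usable
print(f"per-chiplet yield: {chiplet:.1%}")  # ~81.9% of chiplets usable

# Total silicon per package is the same either way, but with chiplets
# you only scrap the ~18% of small dies that are bad, instead of
# scrapping a whole large die whenever a single defect lands on it.
```

Note that the probability of all four chiplets being good works out to the same number as the monolithic yield; the saving comes from scrapping only the one bad chiplet rather than the entire die's worth of silicon.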
I remember having a small discussion about the future of RISC-V under another of your videos a couple of years ago. My thought was that RISC-V would have a bright future, but you disagreed and wrote that it would find only very limited use cases, including academia. Anyway, another good, short video.
I think it will solidly be the third place ISA for everything. It'll crowd out anything that isn't x86 or ARM, then start creeping up into other places. As for consumer-facing devices where it's the main processor, I think it will take a while for that to happen, but if it does, it will be the embedded small devices market that does it first - and it will probably run Android. Alibaba recently ported Android to RISC-V, and Android was designed to be multi-platform. There are also geopolitical factors to consider. The USA keeps on messing with China's ability to import chip fabs or use American intellectual property, and foreign nations aren't happy about being dependent on American chip designers for their computing needs. RISC-V has officially been made the "National ISA" for India (whatever that entails), and China is already investing in it. Those last details are probably the main thing to make RISC-V big in the world if ever it does become big.
Embrace and extend? Maybe just a _few_ special operations, you know for performance. Everyone is welcome to adopt our backwards compatible breaking changes. ;)
It would be incredibly short-sighted to ignore RISC-V. Also, if Intel wanted to make a bunch of money, they should start making the chips that are causing the automotive shortages: smaller dies and probably bigger traces. The market is hot, and if they could start making them right now while there is big demand, they should make out pretty well.
Good move for Intel. They can fab these on their old nodes instead of throwing them out. Take this market and grow it instead of letting TSMC and Samsung take it. It has the potential to become bigger than ARM, or at least equal.
You are oversimplifying Intel. Since they bought Altera, they also have an FPGA business, and FPGAs often run softcores. The Altera Nios softcore is a bit dated, and the Altera (now Intel) tools also support RISC-V. It makes 100% sense to join.
I can see the value to Intel. It allows them to maintain the older process nodes as they invest in newer processes. As their own products shift gradually, the older fabs can be kept busy rather than dwindling as the legacy Intel parts reach end-of-life. It will also allow them to leverage their FPGA fabric technology to allow bespoke hybrid SoCs with hard logic cores (CPU, GPU, MCU, RAM, etc.) which may give them a leg up over AMD+Xilinx, who don't have their own chip fabs. The experience gained, and the range of applications, when working with partners/customers may reveal new market trends that might affect the product roadmaps so that they aren't caught flat-footed. I am not a fan of Intel architectures, but I do see this as a good move on their part.
Well, one thing you are missing is that Intel was going to invest in SiFive, which is a RISC-V-based microprocessor design company. Intel decided against it and is now following RISC-V more closely, so I guess they are planning on going into RISC-V-based processors. Maybe they want to wait and see what happens there first. As far as I remember, Qualcomm has already invested in SiFive, and I'm not sure why Intel decided against it.
This is the long-term path forward for Intel, as x86 is not suitable for all these new markets that need specialized chips. And if they're smart, they will be doing projects behind the scenes to develop RISC-V designs that could ultimately compete in the same markets as x86.
Keen to know whether Intel Foundry has had any noteworthy sales of RISC-V chips, now 30 months on from this excellent Feb 2022 video 👍 P.S. Han Solo did not shoot first... it was Greedo!
RISC-V is an architecture, not a design. I have a series of videos explaining RISC-V, and also two comparison videos: Arm vs RISC-V, and Intel vs RISC-V. You might find them useful.
@@GaryExplains Alright I guess what I meant to ask is, do you think the RISC-V architecture has hopes of being superior to the architectures of current leading ARM chips?
As I say, those videos I made really are your best starting point. I am not trying to avoid your question, but there is much more in the videos than I can write here. But as a very short answer, it isn't the architectures themselves that are good or bad or superior, but the microarchitectures (i.e. the design of any one chip). At the moment there are no RISC-V processors that compete with the leading Arm designs or with Apple's designs.
RISC-V's ISA is open source, that is all. The microarchitecture (i.e. the actual chip design) is not, and even if it were, there is no way to verify that what is in a chip is the same as the public release of the code. So yes, it is very possible.
For Intel, and the tech industry in general, 1 billion isn't that much, but it does mean they're no longer on life support and can continue for at least 10 years without much worry.
A normal Intel CPU with a full RISC-V core inside would let developers use their Intel machine as the development machine and the RISC-V core in that same Intel CPU to test the RISC-V software.
7:05 I don't think comparing RISC-V to Itanium is quite the best comparison. For one, Itanium was invented by Intel, while RISC-V was not, and their design philosophies are almost complete opposites: one's RISC, and the other is VLIW. Intel tried their darndest to replace x86 in the '80s with a variety of microarchitectures so they could make embedded devices and microcontrollers, but it never panned out because their ideas were too complicated. Eventually they just said screw it and tried to make Atom cores to fill that niche, but those never caught on much either. Itanium is their biggest ISA failure, but by no means their only ISA failure. RISC-V is unique in that its initial success was rather accidental, and its continued success is the result of many different companies adopting it. I see Intel's liking for RISC-V as their attempt at completing the small-chip ambitions they failed at repeatedly. Every company tries to expand its business, and they typically try to do so by expanding into related businesses first. It's a logical move. AMD still has a lot of market share it could take in its main competencies, and it's cozied up to ARM, so I don't think we'll see them be as aggressive in adopting RISC-V.
Itanium used a VLIW architecture, which was difficult to write compilers for. So the ISA did matter in that instance. VLIW only found commercial success in GPUs at the time due to their highly parallel design. 64-bit x86 did kill Itanium, but it's not the entire story. RISC-V offers things that Itanium didn't. It's more efficient, it's well-designed, easier to write compilers for by a long shot, and it's open source, so its success isn't tied to just one or two companies.
They've been putting plaster over the top of a 16-bit '70s ISA that was hideous to begin with (and that they've been unable to escape because of backwards compatibility) for so long that it's now a bloated, teetering monster of 1500+ instructions. It should have died decades ago!
It just seems there's a big push for energy efficiency, and perhaps resource efficiency, in the development and distribution of digital systems. Good news imho, irrespective of the parochial views in the comments. It sounds like x86 is going to end up as a legacy architecture kept around for wide compatibility. Efficiency is the way forward. Sounds like the Han Solo shot-first quip predicted a lot of the comment reactions.
Itanium was a great CPU, much better than x86; it had security built right into the design. The compilers were difficult to get right, and Intel wanted too much for the CPU. What really killed it, though, was AMD: they came up with a 64-bit extension set that meant businesses didn't need to retest everything all over again but could migrate slowly over time to 64-bit.
Intel lives. I predicted this: Apple will have to keep Intel on the roadmap. Although, Apple could certainly adopt RISC-V directly and cut Intel out of the equation.
I don't see this as a surprise. Last I checked they're a chip company with an aim to MAKE MONEY. They probably don't care what's in em. If people started eating carrots, McDonalds would sell the McCarrot.
The biggest threat to Intel's dominance of the CPU space is (obviously) ARM - and the biggest threat to ARM's dominance of the non-Intel CPU space is RISC-V - therefore, Intel invests in RISC-V to undermine ARM - simples, innit? Plus, nVidia has just made their big play for ARM (and failed), so this RISC-V investment may have also been a response to the possible nVidia-ARM merger ...
I wonder when/if Intel (or AMD, or anyone else) is ever going to actually build an ARM CPU for Windows, so we can see if Windows machines can also improve in terms of both performance and energy consumption... even though, clearly, Windows will never be a really stable system due to all the sorts of hardware it has to accommodate.
@@GaryExplains Maybe Intel ME is going to be ported to RISC-V, and their new processors are going to have their low-level vulnerabilities on RISC-V instead. Like saying that switching to a Volvo will mean you have your car crashes in a Volvo. Sound logic.
Many of these changes are coming from the fact that AMD, who went fabless years ago, has had a lot of success. Look at the combined market cap of those two businesses. The other factor is that ARM, a CPU design and IP company, has had so much success. ARM is going to go public now that the NVDA merger has been stopped. Intel will no doubt spin the foundry business off, just as AMD did years ago, and compete as a company that more closely resembles ARM. The combined shareholder value unlocked will be compelling. If they don't do it on their own, then an activist investor will come in to make it happen, as $INTC stock has been stagnant for ages.
I expect you are close, but I don't think Intel will spin off the fab division all at once; instead they will eventually sell off the older fabs as they become redundant.
Good for Intel, bad news for ARM!! Intel finally realized that they have to adopt RISC-V, which is a smart move: with RISC-V, Intel can be inside everything, from microcontrollers to supercomputers. For ARM it is a bad day. ARM's days are numbered; once RISC-V smartphones are available, ARM won't have much left to compete with. It is just a matter of time!!
Although it's a good thing to happen, if RISC-V gets serious support (at least a quarter of ARM's), all that ARM needs to do is open-source their ISA and just license their CPU cores.
@@ksrikar6668 I don't think ARM can open-source their ISA. Even with billions of products shipped with ARM, it still cannot earn enough money; that is the reason SoftBank is trying to sell it. Nvidia buying ARM should have been approved. ARM won't be competitive alone, and RISC-V poses a huge threat to ARM.
@@alexpaul1665 But let's say ARM didn't open-source it but made it free to access. They could do what Qualcomm is doing: make custom cores and sell them, while letting others use their ISA however they want. Because rewriting software is such a big deal, most would prefer to build custom ARM cores.
@@ksrikar6668 Almost 25 billion devices have shipped with ARM, but ARM's revenue is $2 billion; this shows that ARM's business model simply doesn't work. A pure IP-providing model won't work, and with RISC-V it will be even harder to be a pure IP provider, because no one wants to hand their fate over to somebody else. I believe ARM is also desperate to sell itself, because the RISC-V tide is coming for ARM. Regarding software, mobile phone software is not that difficult. On PC we want long-term backward compatibility, and in industry and science everyone wants new hardware with long-term backward compatibility, because some software takes years to improve. On a smartphone that is not a huge deal; no one uses their phone for more than five years. Once there is a competent RISC-V solution, the Android ecosystem will be ready in 1-2 years.
@@GaryExplains Let me explain... because these other companies are much more transparent and much more trustworthy than Intel. History has shown us how dark Intel is, and until very recently they were still exhibiting the same behaviors as before.
@@GaryExplains Because they will finally move away from a nearly 50-year-old architecture. And if they make a serious move to RISC-V, they will surely be able to innovate beyond what they have planned for the next 5 years. We could see a leap in tech of 10-15 years within the span of 5 to 6 years.
It may be really dangerous for RISC-V to accept Intel's contribution. History has seen this kind of Intel activity kill the very thing it joined. In the late 90s there were a few big players in the server market, and Intel was arguably not one of them. A big one was HP with their PA-RISC processors. Intel initiated a venture with HP and other server vendors to create a new ISA and new chips called Itanium, which was to become the future for servers. The chips were promised to support different instruction sets and backward-compatible emulation. HP shut down PA-RISC research and pinned a lot of hope on this, but Intel put almost nothing into Itanium development, doing its own separate work on Xeons. HP shipped Itanium servers that were real crap, and Intel built its server position with x86 Xeons. It was, in fact, an Intel cheat. I hope the RISC-V story will be different.
I had my first ever internship at Intel; there are a lot of brilliant and hard-working people there. Personally, I would say the foundry was the single point of failure that was letting the whole development process down. I was at the company when there was a lot of introspection going on. It is good that Intel is opening up and the foundry is innovating and setting ambitious targets. Hopefully they can achieve them with the new investment and plants.
When Intel got competition from AMD, isn't it weird how fast they were able to catch up? Isn't it obvious that Intel was holding back and drip-feeding us innovation? If this is obvious, how can a company like that maintain support from consumers? They drew first blood.
Completely agree.
@@levingthedream You're right that it's no coincidence AMD kicked the hornet's nest. There were major leadership shakeups that brought many of the engineers from Intel's juggernaut days, like Drs. Gelsinger and Kelleher, back to the board after they realized Ice Lake was going to flop. I'm surprised Sunil Shenoy's name wasn't mentioned here. They recently brought him back after IFS invested in SiFive, and that was the clue that Intel would be joining RISC-V development.
@@levingthedream They weren't. As the OP said, the foundry side of their business basically held everything else up, and forced them to release stuff like Rocket Lake, which they backported to the 14nm process node because they still weren't able to get sufficient volume production and yields out of their 10nm SuperFin process, or whatever they've rebranded that to now.

It's also worth mentioning that while the Zen series of architectures has been great, AMD's return to competitiveness was also heavily influenced by the fact that they were using TSMC's 7nm node, which gave them a big boost in performance per watt, while Intel was stuck on 14nm and had to push that process node to the absolute limit. It's honestly quite astonishing how much they were able to get out of 14nm.

Basically, while yes, Intel definitely did get fat, lazy, and complacent to some degree, it was really a perfect storm: Intel screwed up on the foundry side while AMD managed to create one of their greatest architectures ever, and TSMC's 7nm process ended up being even better than expected. Intel was able to recover relatively quickly because it was only a single aspect of their business that was really holding everything else up.

So much of the performance improvement we see in these chips still comes off the back of process node improvements. While there are certainly architectural improvements and optimizations at the chip design level, it's really the efficiency improvements and transistor density increases that account for a significant amount of the performance gains. It's not like Intel was making poorly designed chips so much as they were making well-designed chips on an older process that, in some cases, the chips weren't even designed for in the first place. Given the nature of Intel's issues, I think the time it took them to recover is about what was expected.
I'm just glad AMD was able to give them the kick in the pants they needed to start having some actual ambition again. Pat G has set some very aggressive targets for the future. Whether they can execute on them is another question, but if they can get even 80% of the stuff he's talked about done by 2027, it should be quite something.
@@levingthedream Competition is always good for everybody. Most people view Intel's lack of innovation as Intel's own fault, and to a good extent it is. However, AMD is the one that pioneered the 64-bit x86 architecture (as seen in the suffix amd64, even on Intel systems). Why do so few point the finger at AMD for failing to innovate for so long that Intel could dominate the market? "Drew first blood" is a rather incorrect oversimplification of the actual history.
The utter defeat at the hands of AMD caused a lot of internal management changes. Bob Swan was a businessman and didn't know much about technology. There was a lot of sugarcoating of projections in middle management, leading to the eventual 14nm fiasco.
Can’t wait to watch Die Hard on Christmas Eve on my Risc V SoP made by Intel in the future.
all while sitting on the back of the pig that can fly
You will, on your 2025 iPhone/iPad or Apple's AR glasses. Apple is already testing RISC-V and planning to move to it.
RISC-V should get a proper low-loss x86 translation/emulation extension, which is critical for RISC-V's success in app compatibility and gaming.
I doubt we'll see RISC-V as the main CPU; rather, it's all those embedded uses you don't see where it will become prevalent. Things like I/O controllers and suchlike.
@@Chalisque Long-term, the x86 architecture is dead. That's just a fact, and the success of the Apple M1 design shows that moving beyond it is very possible. But we can't just jump to ARM/RISC-V (RISC) on Windows devices, so we will absolutely get some sort of HYBRID approach. Keep in mind this isn't going to happen quickly.
INTEL is already coming out with x86 CPUs that have different cores (normal + efficiency cores), so Windows is being coded to be SMART about recognizing different cores. So one option is to have x86 + ARM cores, or we could have x86/ARM merged.
I also expect MACHINE LEARNING to figure out very efficient ways to optimize software translation layers to work with dedicated hardware translation. Machine learning can also recode software, and can theoretically even partially thread code that was written single-threaded (yep, it's been done)... so in THEORY we may get programs converted from x86 to ARM/RISC-V with some threading added to help reduce single-core performance bottlenecks.
@@Chalisque I think RISC-V will dominate everywhere, as many big manufacturers are joining RISC-V development.
Since it's already possible to game on ARM processors without relying on emulation, I don't think that's true anymore. It's just a matter of building the platform and then seeding the market with some strategic investments in game ports.
Question: why would Intel invest in RISC-V when they own x86? Wouldn't the market moving to RISC-V be bad news for Intel?
I like watching Die Hard at Christmas so hopefully I should like Intel joining RISC-V!
😂
Obviously it's no Elf!
@@GaryExplains Time Stamp??
Awesome channel Gary. I'm glad to have found you, pretty good content. Thanks!
I know nothing about the microprocessor industry, but did do contract manufacturing. Often people would bring me something that works great in a prototype, but can't be manufactured in the thousands and would have to be redesigned so it could be manufactured. Often the first design would be great, but the engineers that designed it did not take into consideration manufacturing their great thing. Keeping chip design and manufacturing together might address that problem.
I could see Intel using RISC-V for 'ultra-efficient cores' if RISC-V + emulation is still more power-efficient than what x86 can do.
I think you don't have knowledge about RISC and CISC, or the MIPS architecture. Please read carefully about microprocessor structure and ISA structure.
As far as I recall from reading a book and comments on YouTube, x86 is already emulated.
@@alexmartian3972 Emulation and physical design are totally different. Not all of those things are possible through emulation, simulation, and synthesis.
@@alexmartian3972 I don't believe it's overly efficient though. I think they only just did x64 too.
@@jatinder640 Not all... sure, not quantum, but e.g. a Turing machine can simulate an "ordinary" architecture and its code.
That's gotta be the best intro on this channel! xD Sticking to the lighter side of things, I did see a YouTube video from a channel that does critical analysis of movies, discussing the classification of Die Hard. The YouTuber argued for Die Hard to be considered a Christmas movie, backing it up by saying that the overarching themes in the movie certainly match those of a Christmas movie. A few of the themes he pointed out were family, love, reunions, mending things that were broken, etc.
Jim Keller said that the best way to make a very powerful CPU is going RISC-V.
It will be interesting to see how up-to-date the node tech is that Intel will allow possible competitors to use in their fabs.
What a confusing opening, mixing true things like pigs being able to fly (just put them on an aeroplane) and Die Hard being a Christmas movie with untrue things like Han not shooting first.
*GARY!!!*
*GOOD MORNING PROFESSOR!*
*GOOD MORNING FELLOW CLASSMATES!*
Stay safe out there everyone!
MARK!!!
Die Hard is a Christmas film and you can't change my mind.
Eh, catch someone unaware and use a good bone saw. You can change anyone's mind that way.
Hi @Gary Explains, I have a question: is RISC-V going to be competing against the ARM architecture? Like in the mobile segment etc.?
Companies like SiFive are competing directly with Arm in many markets. When SiFive releases a new CPU core it compares the performance and efficiency directly with Arm's processors. There are no smartphones using RISC-V yet.
According to RISC-V International's website, Alibaba finished porting Android to RISC-V. I don't think they'd do that unless they intended to do something with it. So we're going to see RISC-V in some kind of Android device, whether it be a wristwatch, Android TV, or something else.
I would like to know more about these flying pigs that you mentioned. ...
RISC-V won't make much money unless it becomes fast and enters the mobile and PC markets. How often do you buy your TV or hi-fi system?
I love this channel, thanks for sharing.
If this really happens, will we still be able to upgrade our devices' CPU, GPU, RAM etc.?
Or will it just be an SoC?
If what really happens?
@@GaryExplains if Intel moves to RISC-V from x86-64
It won't, not in the next 10 years.
That doesn't have much to do with it. You can attach a Radeon Graphics card to a SiFive board if you want. Those pieces of hardware attach to motherboards using different hardware standards, so the rest is up to software support.
I figured it was just a case of "the enemy of my enemy is my friend." By dumping money into the RISC-V movement, they potentially weaken one of their bigger design/IP competitors, ARM. They've already been known to include ARC cores in their own System on Package products and RISC-V is another viable option for them.
ARM got a big boost from Apple and Android. Intel has to do something about this, otherwise they'll end up like dinosaurs. And Risc-V is perfect for that. They can keep up with the times without giving up their main product.
thanks for the video!
You're welcome
Die Hard is a Christmas movie. x86 are already RISC-like in the backend.
RISC-like in the pipeline and RISC-V are two very different things.
@@GaryExplains Yes. I now see how my comment made that impression. I should've worded better.
I was just saying the x86 arch has been RISC-like for almost a decade. And besides, why would Intel just abandon their own x86 ISA
for the RISC-V ISA when it is the micro-architecture that counts?
I think it is the architecture that counts first, then the microarchitecture. Also, Intel's need to translate variable-length instructions into fixed-length RISC-like micro-ops is a handicap in terms of power and performance.
Die Hard is a damn fine Christmas Movie, so I'm all for open source RISC V.
So how long will it take to move to an Arm design completely?
Also, will x86 die gradually?
I know not instantly, but gradually.
Move to Arm design? Don't you mean move to a RISC-V design?
@@GaryExplains Well, given your expertise, is it OK to think x86 will not be as prominent 10 years down the line? Everything's moved over to Arm or RISC-V.
@@adityay525125 Aren't both similar?
Both are reduced instruction sets.
Arm and RISC-V are both RISC architectures, but that is where the similarities stop. Both Ford and BMW are car makers, but that is where the similarities stop. I suggest you watch my Arm vs RISC-V video.
It's interesting when Intel x86 joins hands with a RISC architecture; the new RISC age is here!
Die hard is obviously not a christmas movie. It's THE christmas movie!
😂
Han Solo shot first in the original. When Star Wars was rereleased years later, one of the changes made had Greedo shooting first and missing.
Die Hard IS a Christmas movie.
You, Sir, have made a VERY powerless enemy. Huff!
😂
I believe Intel will be very successful balancing its manufacturing and design businesses. Samsung designs its own chips and manufactures OEM chips, and so far this hasn't been any more of an issue than for other chip designers and chip manufacturers. At this stage, I am surprised it took Intel this long to come up with this idea. I have a Xiaomi phone, and it is basically a Samsung phone assembled by Xiaomi. From the screen, to the memory, to the Qualcomm chip manufactured on Samsung's fabrication process, Samsung wins even if you don't buy from them. Intel should have done this years ago. We need more competition in this space.
I hope you are right!
Die Hard is not A Christmas movie. Die Hard is THE Christmas movie.
Wonder if they will license the i960 on their fabs. That's a far more interesting value proposition.
System on a package greatly increases wafer yield because smaller chips are individually less likely to contain a physical defect. If each blank wafer of silicon contains one defect and you make a giant IC that uses a whole wafer then you have a 100% loss rate, but if you can fit 4 IC per wafer then the one defect will only fall on 1 of the 4 chips. Even if the final package requires 4 chips in order to equal the monolithic design, the 4 per wafer method still only has 25% wastage.
This is actually a big part of the cost savings created by shrinking transistors, and thus IC size, for a given design. Though there are limits of course; at some point the chips are so small that the defect-loss percentage is outweighed by the economies of scale, handling/packaging costs, and efficiency losses due to inter-chip connections. (The 386 had L1 cache as a chiplet, but the clock speed was low enough that this didn't hurt; the 486's cache was moved on-die.)
Reduced transistor size also makes the product sensitive to smaller defects, but this is not relevant to SoaP as the lithography doesn't change. (Or it could even use previous gen larger features while still keeping chiplet count per wafer very high.)
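To make the yield arithmetic in the comment above concrete, here is a minimal Python sketch. The worst-case wastage figures match the commenter's one-defect example exactly; the Poisson yield model is a standard approximation, but the defect density and die areas used below are made-up illustrative numbers, not real fab figures:

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Fraction of good dies under a simple Poisson defect model: Y = exp(-D*A)."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

def wastage(defects_on_wafer: int, dies_per_wafer: int) -> float:
    """Worst-case wastage if every defect lands on a distinct die."""
    return min(defects_on_wafer, dies_per_wafer) / dies_per_wafer

# The commenter's example: one defect somewhere on the wafer.
print(wastage(1, 1))   # monolithic whole-wafer die -> 1.0 (100% loss)
print(wastage(1, 4))   # four chiplets per wafer    -> 0.25 (25% loss)

# Poisson model with an assumed defect density of 0.5/cm^2:
# a quarter-size chiplet yields far better than a large monolithic die.
big_die  = poisson_yield(0.5, 6.0)   # ~5% good dies
chiplet  = poisson_yield(0.5, 1.5)   # ~47% good dies
print(big_die < chiplet)
```

The exponential form is why chiplets pay off: halving die area more than halves the expected defect loss, which is the effect the comment describes.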
I do remember having a small discussion about the future of RISC-V under another of your videos a couple of years ago.
My thought was that RISC-V would have a bright future, but you disagreed and wrote that it would find very limited use cases, mainly academic ones.
Anyway another good, short video.
My views on RISC-V haven't changed much. You still can't actually buy anything useful. Microcontrollers and academics are still the main use cases.
I think it will solidly be the third place ISA for everything. It'll crowd out anything that isn't x86 or ARM, then start creeping up into other places. As for consumer-facing devices where it's the main processor, I think it will take a while for that to happen, but if it does, it will be the embedded small devices market that does it first - and it will probably run Android. Alibaba recently ported Android to RISC-V, and Android was designed to be multi-platform. There are also geopolitical factors to consider. The USA keeps on messing with China's ability to import chip fabs or use American intellectual property, and foreign nations aren't happy about being dependent on American chip designers for their computing needs. RISC-V has officially been made the "National ISA" for India (whatever that entails), and China is already investing in it. Those last details are probably the main thing to make RISC-V big in the world if ever it does become big.
Embrace and extend? Maybe just a _few_ special operations, you know for performance. Everyone is welcome to adopt our backwards compatible breaking changes. ;)
Obviously die hard is a Christmas movie, what else would it be
The finest Christmas movie ever made! Yippee Ki Yay!
It is. Can't change my mind about this one.
Christmas party, the hohoho on the dead body, romance fulfilled etc.
Christmas movie.
It would be incredibly short-sighted to ignore RISC-V. Also, if Intel wanted to make a bunch of money, they should start making the chips that are causing the automotive shortages: smaller dies and probably bigger traces. The market is hot, and if they could start making them right now while there is big demand, they should make out pretty well.
The only way to get ahead of ARM in low power design.
So interesting, I hope they can make it work. Kinda chips convergence
All that's old is new again. Chiplets... we called this hybrid technology in the 1970s.
Umm Die hard is a christmas movie. It and Home Alone are played every year around christmas time here
Good move for Intel. They can fab these on their old nodes instead of throwing them out. Take this market and grow it instead of letting TSMC and Samsung take it. It has the potential to become bigger than ARM, or at least equal.
smart move from intel's side!
I am sorry but Die Hard just is a Christmas movie, live with it! haha ;)
You are oversimplifying Intel. Since they bought Altera, they also have an FPGA business.
FPGAs often run softcores. The Altera Nios softcore is a bit dated, and Altera (now Intel) tools also support RISC-V.
It makes 100% sense to join.
I can see the value to Intel. It allows them to maintain the older process nodes as they invest in newer processes. As their own products shift gradually, the older fabs can be kept busy rather than dwindling as the legacy Intel parts reach end-of-life.
It will also allow them to leverage their FPGA fabric technology to allow bespoke hybrid SoCs with hard logic cores (CPU, GPU, MCU, RAM, etc.) which may give them a leg up over AMD+Xilinx, who don't have their own chip fabs. The experience gained, and the range of applications, when working with partners/customers may reveal new market trends that might affect the product roadmaps so that they aren't caught flat-footed.
I am not a fan of Intel architectures, but I do see this as a good move on their part.
Well, one thing you are missing is that Intel was going to invest in SiFive, which is a RISC-V-based microprocessor design company. Intel decided against it and is now following RISC-V more closely; I guess they are planning on going into RISC-V-based processors. Maybe they want to wait and see what's going to happen there first.
as far as I remember Qualcomm has already invested in Sifive, and not sure why Intel decided against it
No, not missing, I did a whole video about the rumor that Intel was trying to buy SiFive.
this is the long term path forward for Intel as x86 is not suitable for all these new markets that need specialized chips. and if they're smart, they would be doing projects behind the scenes to develop RISC-V that could ultimately be in the same markets as x86
Keen on whether Intel Foundry has any noteworthy sales re: Risc-V chips, now 30 months on from this excellent feb 2022 video 👍
P.S. Han Solo did not shoot first . . . It was Greedo!
Is the RISC-V a significant design improvement over current ARM?
RISC-V is an architecture, not a design. I have a series of videos explaining RISC-V, and also two comparison videos: Arm vs RISC-V, and Intel vs RISC-V. You might find them useful.
@@GaryExplains Alright I guess what I meant to ask is, do you think the RISC-V architecture has hopes of being superior to the architectures of current leading ARM chips?
As I say, those videos I made really are your best starting point. I am not trying to avoid your question, but there is much more in the videos than I can write here. But as a very short answer, it isn't the architectures themselves that are good or bad or superior, but the microarchitectures (i.e. the design of any one chip). At the moment there are no RISC-V processors that compete with the leading Arm designs or with Apple's designs.
RISC-V is a nice, new, clean architecture designed by nice, new, clean people. Also, you don't need to pay money to do your own microarchitecture.
So @T you planning on designing your own RISC-V microarchitecture any time soon?
Does *RISC-V* give end users the choice to allow backdoors and undocumented "features" known only to govts?
Give users the choice to allow backdoors? By "users" do you mean designers?
risc-v is open source so this should not be possible afaik.
RISC-V's ISA is open source, that is all. The microarchitecture (i.e. the actual chip design) is not and even if it was there is no way to verify that what is in a chip is the same as the public release of the code. So, it is very possible.
Why don't they make GPUs that are RISC-V with 1 GB of cache and accelerated math and geometry processing?
Because CPUs and GPUs are very different, you don't need a CPU ISA inside of a GPU. What purpose would it serve?
Intel is sabotaging arm while looking virtuous...haha
Han Solo shot first. I hope to see the chip shortage come to an end soon.
For Intel and the tech industry in general, 1 billion isn't that much, but it does mean they're no longer on life support and can continue for at least 10 years without much worry.
Who was on life support?
A normal Intel CPU with a full RISC-V CPU alongside it, allowing developers to use their Intel CPU as their development machine and the RISC-V CPU inside it to test the RISC-V software.
Still 3 years behind. And it isn't in any actual products.
@@GaryExplains hmmm Risc v Kickstarters should be a thing
7:05 I don't think comparing RISC-V to Itanium is quite the best comparison. For one, Itanium was invented by Intel, while RISC-V was not, and their design philosophies are almost complete opposites. One's RISC, and the other is VLIW. Intel tried their darndest to replace x86 in the 80's with a variety of microarchitectures so they could make embedded devices and micro-controllers, but it never panned out because their ideas were too complicated. Eventually they just said screw it, and tried to make Atom cores to fill that niche, but those never caught on much either. Itanium is their biggest ISA failure, but by no means their only ISA failure. RISC-V is unique in that it's initial success was rather accidental, and it's continued success is the result of many different companies adopting it. I see Intel's liking for RISC-V as their attempt at completing their small chip ambitions that they failed at repeatedly. Every company tries to expand its business, and they typically try to do so by expanding into related businesses first. It's a logical move. AMD still has a lot of market share it could overtake in its main competencies, and it's cozied up to ARM, so I don't think we'll see them be as aggressive in adopting RISC-V.
Itanium failed because of x86. Any desktop or server CPU that Intel creates on RISC-V will fail for the same reason. It wasn't about the ISA.
Itanium used a VLIW architecture, which was difficult to write compilers for. So the ISA did matter in that instance. VLIW only found commercial success in GPUs at the time due to their highly parallel design. 64-bit x86 did kill Itanium, but it's not the entire story. RISC-V offers things that Itanium didn't. It's more efficient, it's well-designed, easier to write compilers for by a long shot, and it's open source, so its success isn't tied to just one or two companies.
1:53 Lovely squeeing noise :)))
So what I understood from this video, Gary wants S.O.A.P. (system on a package)
You mean SoP akin to SoC. SOAP actually stands for Simple Object Access Protocol. It is an XML thing.
@@GaryExplains It was just an attempt at some humor. Anyway love your vids, great information. Thank you for that Sir.
Gary, the jazz hands are getting out of control.
Lol what a day to be alive.
I think they know x86 can't stand for long
They've been putting plaster over the top of a 16-bit 70s ISA that was hideous to begin with, one they've been unable to escape because of backwards compatibility, for so long that it's now a bloated, teetering, 1500+ instruction monster. It should have died decades ago!
There just seems to be a big push for energy efficiency, and perhaps resource efficiency, in the development of digitally distributed systems. Good news IMHO, irrespective of the parochial views in the comments. It sounds like x86 is going to end up more of a legacy option whose strength is wide compatibility. Efficiency is the way forward. Sounds like the Han Solo "shot first" quip predicted a lot of the comment reactions.
Oh man, I want RISC-V to beat the Acorn nuts.
Why?
Itanium was a great CPU, much better than x86.
It had security built right into the design.
The compilers were difficult to get right, and Intel wanted too much for the CPU.
What really killed it, though, was AMD: they came up with a 64-bit extension set that meant businesses didn't need to retest everything all over again but could migrate slowly to 64-bit over time.
Regarding the pigs flying... they still go cargo; no airline will allow a support pig in the passenger area 😁
Intel lives. I predicted this. Apple will have to keep intel on the roadmap. Apple could certainly adopt RISC-V directly and cut Intel out of the equation.
I don't see this as a surprise. Last I checked they're a chip company with an aim to MAKE MONEY.
They probably don't care what's in em.
If people started eating carrots, McDonalds would sell the McCarrot.
The biggest threat to Intel's dominance of the CPU space is (obviously) ARM
- and the biggest threat to ARM's dominance of the non-Intel CPU space is RISC-V
- therefore, Intel invests in RISC-V to undermine ARM
- simples, innit?
Plus, nVidia has just made their big play for ARM (and failed), so this RISC-V investment may have also been a response to the possible nVidia-ARM merger ...
Will we be seeing fanless intel in the future?
I wonder when/if Intel (AMD or any other) is ever going to actually build an ARM CPU for Windows, so we can see whether Windows machines can also improve in terms of both performance and energy consumption... Even though, clearly, Windows will never be a really stable system due to all sorts of hardware it has to accommodate.
Qualcomm will be the first with its Nuvia based CPUs. I have several videos about it.
You don't build hardware FOR software, you build software for hardware. Windows can run and has run on an ARM CPU; it was compiled specifically for ARM.
Am I the only one wanting to know about the flying pigs? ;-)
Put them on a plane.
am I suppose to risc v
Yes, with enough propulsion pigs can fly.
Ah, yes, Esperanto AI :)
Die Hard was a love story; John loves to shoot everyone he can. And it was Chewbacca who shot first, but that was a different movie, a Swedish porno lol.
Gotta backdoor them chips somehow.
How does Intel joining RISC-V increase the possibilities of backdoors?
@@GaryExplains maybe Intel ME is going to be ported to RV and their new processors are going to have their low level 0 vulnerabilities on RISC-V.
Like saying switching to a Volvo will make you have car crashes in a Volvo. Sound logic.
@@nextlifeonearth I’m assuming they will not put their ME in products they make for other people, that would just not make much sense
@CapnTates That scenario is a possibility for EVERY chip maker. Nothing special about Intel or RISC-V in that sense.
Many of these changes come from the fact that AMD, who went fabless years ago, has had a lot of success. Look at the combined market cap of those two businesses. The other factor is that ARM, a CPU design and IP company, has had so much success. ARM is going to go public now that the NVDA merger has stopped. Intel will no doubt spin the foundry business off just as AMD did years ago and compete as a company that more closely resembles ARM. The combined shareholder value unlocked will be compelling. If they don't do it on their own, then an activist investor will come in to make it happen, as $INTC stock has been stagnant for ages.
I expect you are close, but I don't expect to see Intel spinning off the fab division all at once, but instead eventually sell off the older fabs at some point as they become redundant.
The dark forces.
Good for Intel, bad news for ARM!! Intel finally realized that they have to adopt RISC-V, which is a smart move. With RISC-V, Intel can be inside everything, from microcontrollers to supercomputers. For ARM, it is a bad day; ARM's days are numbered. Once RISC-V smartphones are available, ARM does not have much to compete with. It is just a matter of time!!
Although it's a good thing to happen if RISC-V gets serious support, even a quarter of what Arm has, all that Arm needs to do is open source their ISA and just license their CPU cores.
@@ksrikar6668 I don't think ARM can open source their ISA. Even with billions of products shipped with ARM, it still can't earn enough money; that is the reason SoftBank is trying to sell it. Nvidia's purchase of ARM should be approved. ARM won't be competitive alone; RISC-V poses a huge threat to ARM.
@@alexpaul1665 But let's say ARM did not open source it but made it free to access. They can do what Qualcomm is doing: making custom cores and selling them, while letting others use their ISA however they want. Because rewriting software is such a big deal, most would prefer to build custom ARM cores.
@@alexpaul1665 arm revenue is still pretty high despite being a standalone company
@@ksrikar6668 Almost 25 billion devices have shipped with ARM, but ARM's revenue is $2 billion; this shows that ARM's business model simply won't work. A pure IP-providing model won't work, and with RISC-V it will be even harder to be a pure IP provider, since no one wants to hand their fate over to somebody else. I believe ARM is also desperate to sell itself, because the RISC-V tide is coming for ARM. Regarding software, mobile phone software is not that difficult. On PC we want long-term backward compatibility; in industry or science, everyone wants new hardware to have long-term backward compatibility, because some software takes years to improve. On a smartphone, that is not a huge deal; no one uses their phone for more than five years. Once there is a competent RISC-V solution, the Android ecosystem will be ready in 1-2 years.
Intel joined so they can put some Intel-made backdoor into that "open source" CPU. I'd never buy a RISC-V chip made by Intel, and neither should you.
And how do you know that is not also the case for every CPU from AMD or SiFive or Samsung, etc?
@@GaryExplains Let me explain... because those other companies are much more transparent and much more trustworthy than Intel. History has shown us how dark Intel is, and until very recently they were still exhibiting the same behaviors as before.
LOL, if you believe that then you are naive. The other companies just haven't been caught yet.
@@Traumatree If AMD were Intel, they'd do just the same. AMD is forced to innovate or go bankrupt.
If intel truly starts using RISC-V and abandons x86, I will worship them as the lord himself!
Why?
@@GaryExplains Because they will finally move away from a nearly 50-year-old architecture. And if they make a serious move to RISC-V, they will surely be able to innovate beyond what they have planned for the next 5 years. We could see a leap in tech of 10-15 years within the span of 5 to 6 years.
Intel buys Arm 😄
no we don't wanna see that
Nvidia just lost out by the looks of it; personally I would prefer to see Arm stay independent.
I see Gary has just done a video on the Arm/Nvidia purchase.
ua-cam.com/video/wvEyT3ZbIhE/v-deo.html
@@PlanetCypher_ yes we would all love to see that
Buying the competitor haha 🤣
Talk about risky business.
Gary failed to explain how pigs could fly. Not a single word. Disappointed.
😂
The flying pig got a gig at a Pink Floyd concert. Don't shine Gary on, you crazy diamond!
Han Solo totally shot first.
no one cares for die hard movies anymore. Ask your kid if he watched it.
AMD and Intel trading blows.
It may be really dangerous for RISC-V to accept Intel's contribution. History has examples of this kind of Intel involvement killing the very thing it joined. In the late 90s there were a few big players in the server market, and Intel was arguably not one of them. A big one was HP with their PA-RISC processors. Intel initiated a venture with HP and other server vendors to create a new ISA and new chips called Itanium, which were to become the future of servers. The chips were promised to support different instruction sets and backward-compatible emulation. HP closed its PA-RISC research and pinned a lot of hope on this, but Intel put almost nothing into Itanium development, doing its own separate work on Xeons. HP shipped Itanium servers that were real crap, and Intel built its server position with x86 Xeons. It was, in fact, Intel's cheat. I hope the RISC-V story will be different.
Listen, stop starting Sh1t Gaz. Han shot first, end of
🔫
Intel already tried to do non-x86 and failed, so this is its next failed attempt. Stop paying money for the stupid big/small hybrid PC; it is stupid.