Impressive storytelling. It started with Andras and ended with him while telling the whole story of Intel. I really love the quality of your content.
00:00:00 - Bill Shockley invents the transistor and founds Shockley Semiconductor
00:05:00 - Andras Grove and others found Fairchild Semiconductor and develop the first silicon semiconductor chip
00:10:00 - Bob Noyce, Gordon Moore and team succeed in reducing the instability of transistors and price their products low to accelerate demand and gain market share
00:15:00 - Intel develops the 3101 SRAM chip, 1103 DRAM chip, 8080 microprocessor, 4004 microprocessor for personal computer, Altair 8800 and other computing products
00:20:00 - Intel founded in 1978 by Moore and Bob Noyce, growth slows in the early 1980s due to competition and financial difficulties
00:25:00 - Intel faces challenge to stay afloat in the memory chip market, switches to microprocessor production and becomes the world's largest microprocessor manufacturer
00:30:00 - Intel's 386 processor is a major success but IBM challenges its dominance, leading to significant challenges for Intel's CEO, Andy Grove, to keep the company on top.
This part is wrong: "Intel founded in 1978 by Steve Jobs and Bob Noyce, "
Yeah, well spotted it was founded by Moore and Noyce.
DO YOU KNOW who Steve jobs was?
Edit your comment about your note on 20:00 - Intel NOT founded by Steve Jobs.
Intel was founded in 1968, not 1978
"Profiles by John Coogan".... very well researched, the wait was worth it
What secret about Intel is he telling? I actually couldn't get it!!
@@diptyprakashswain1121 13:45 ...it's a biography of Andy Grove. He's the secret
I have worked as an Integrated Circuit Layout Designer since 1983. All those chips designed between the first Fairchild chip in the 1950s and the early 1980s were drafted on paper or Mylar on a drafting table using pencils. Computers purpose-built to assist layout design were only first introduced in the 1980s.
Just designed an entire 16-bit RISC CPU in Verilog as a personal toy project. Took about 4 weeks to get it running on an FPGA, though that time included writing an assembler. I cannot even begin to imagine what it would be like to tape that out on paper.
Another exciting event happened following the release of the Pentium: the floating-point divide disaster. It forced Intel in 1994 to buy back $475 million worth of processors produced with the flaw. I went to work at Intel in 1995, and all the employees were given a Pentium processor encased in plastic and turned into jewelry (tie tacks, lapel pins, cuff links, etc.), and they sold a fair number of the defective devices as curiosity items. They were extraordinary to look at, with a gold and rainbow sheen.
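For anyone curious, the flaw was easy to demonstrate at home: one particular operand pair hit the missing entries in the divider's lookup table. Here is a minimal sketch in C of the classic community check (my own illustration, not Intel's official test): a correct divider gives a residue of zero, while a flawed Pentium famously returns about 256.

#include <stdio.h>

/* Classic Pentium FDIV check: 4195835 / 3145727 triggers the bug,
   so the quotient comes back slightly wrong and x - (x / y) * y
   is visibly nonzero on a flawed chip. */
int main(void) {
    volatile double x = 4195835.0;  /* volatile keeps the divide at runtime */
    volatile double y = 3145727.0;
    double residue = x - (x / y) * y;  /* ~0.0 on a correct divider */
    printf("residue = %f -> %s\n", residue,
           residue < 1.0 ? "FDIV OK" : "flawed FDIV unit");
    return 0;
}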
I think that was one incident that helped cause Intel's loss of dominance. The memes (although they weren't called that) were devastating, and the slogan "Intel Inside" became a warning.
Man this was a thrilling history covering so many milestones.
Your effort and presentation made it a gem. It will remain a gem.
Bunch of mistakes though.
@@reptilespantoso What were some of the biggest ones, and when did they occur?
Andy and Robert fired then rehired themselves not to showcase some formality, or what we'd call PR these days. They did it to sacrifice a lot of their own seniority benefits and adhere to the same rules as the new hires. In doing so, they put themselves in a position to go along with the change and inspire the rest of the company to vigorously do the same: to be active and agile like a startup. It could also be that they meant anyone can be fired if they don't serve the interests of the company. That could be the precedent for why they brought back the current CEO, among other reasons.
This is a wonderful recounting of a major part of the world’s recent technological history. I was a young engineer at a company called Melpar, a small defense contractor in Northern Virginia, when the Intel 4004 came out. My boss asked me if I thought I could replace a big box we made for the government with the newfangled microprocessor. Not knowing whether I could or not, of course I said yes, I can do that. And I did. I designed the hardware and software in nine months. It was an exciting time to be in technology.
Love love love your content. As a huge fan of “Computer Chronicles” Gary Kildall episodes this was VERY well researched. Love it!!!
Interesting story. My family’s first PC was a 486, and my parents ended up choosing an AST over a Compaq. This was 1990 or 1991 in ZhongGuanCun, Beijing China, and that computer cost about 2500 USD. The company Lenovo was a spin-off from the Institute of Computing, where both of my parents worked. Lenovo was originally called LianXiang. Their main early product was a Chinese input method, which allows you to type one Chinese character and be prompted with words and phrases that start with that character, thus the name LianXiang, which roughly means neural association or connection, an indexing and search process that occurs in your brain.
Wow, haven't heard someone mention a 486 on here ever! I started on a Commodore 64, then moved up to an Amiga 600 (couldn't afford the 1200), then I finally got a 386 with 1 meg of RAM! By then I was flying, progressing to the 486 with (I think) 2 megs of RAM. I remember having a Mario game that spanned 4 floppy disks. All running on DOS and Windows 3.0/3.1. I remember when Windows 95 came out and you got the free Weezer music video. I used to sit there defragging the hard drive. Simple times! The good ol' virgin days!
@@curtis24-7 I made a pit stop with the Amiga 500. And only for Monkey Island 2 did I buy a second disk drive. Fun times. Kids of the 90s had a great time.
My parents' first computer was a Packard Bell with a 486. I don't remember a lot of the specs, but it had Windows 3.1 and I think the hard drive was 20 MB. Kind of crazy to think that a lot of processors have even larger amounts of cache today.
Oh my God, thank you so much John for this incredible documentary, I had to watch it twice without skipping any ads👌
I wish more younger people found this stuff as interesting as I do. I would kill for a job in any of these lines of work; even if it's something very minuscule, it would feel awesome working for/towards something that is literally revolutionary 😭🔥
Someone has got to do those sorts of jobs, it might as well be you. Go for it, I say!
Instead of killing someone, go study computer science and/or electrical engineering. AMD, Intel, and NVIDIA need all the talent they can get, as they need to consistently come out with better and better products just to stay afloat.
Your mindset is "literally revolutionary." I encourage you to pursue your goal, relentlessly. Don't get distracted...🇺🇸 😎👍☕
It's like the production value increases with each video. Kudos my dude!
Gordon Moore attended a lecture given by C. Harry Knowles (Bell Labs) where the fundamental concepts that became "Moore's Law" were presented. Somewhere in history, Knowles's roots in Moore's Law have been forgotten.
I soldered my first computer together in 1977. It was an 8008! It had 1K of RAM. A couple of years later I had a NorthStar running a Z80A with 16K and TWO (count them, 2) 360K double-sided drives!
You are one of the best storytellers on YouTube. Thank you so much 😊
This video has nothing to do with the title. 😐😐😐
John, I just came onboard to your channel. Huge fan of the thoughtful and thorough illumination you lend to what is already highly intriguing subject matter. Thanks so much!
This is the highest-TV-standards series led by a charismatic entrepreneur that we never knew we wanted, but we all needed. John! Make an episode about your purpose!
My first job at 16, in 1986, was at Marconi in the UK. I went on a wire-bonding machine course and met a man who worked in a factory in Wales. I'm sure it was Intel. He said they were making parallel processors but the scrap rate was up in the 90% range. The place I worked at made Silicon-on-Sapphire space chips.
Pushing the algorithm ❤️
Need 7 words or more to affect the algorithm
Always insane quality!
29:48 Not a 30386. It was the 80386. I thought I was hearing things! 😁
I was paying attention back then... In fact, I bought Andy Grove's book, "Only the Paranoid Survive." It's a fantastic read if you buy it.
Had a Deskpro 386 when I was 10; I was the only kid in class with a PC. Can't believe it was 10 years old at the time! (1994). Glad my Dad saw the future.
I worked for Intel in the 1990s as a senior engineer. The one decisive aspect of working there was the fact that they relied on the workers to make the decisions and define the technology in the next generation of product.
An amazing video as usual 🔥🔥🥳 BTW, could you make a video on how chatbot-integrated search engines will change internet-based advertising ✌️
Slight correction. They were working on the 80386, not the 30386. How did the 80286 (and 80186) factor into this story? It's also worth discussing the difference between SX and DX, or DX2, and that the 80486 integrated the Floating Point coprocessor as well as introducing microcode to the design. By the P6 this was effectively giving a CISC instruction set on what is fundamentally a RISC processor with pipelines and out-of-order execution.
Not sure that microcode bit is 100% correct. Even an 8086 has some control run by non-upgradable microcode within the CPU. From a CPU designer's perspective, it is still microcode even if it isn't re-writable. A better classification for what was marketed as "Intel Microcode" would be to call it firmware IMO.
@@richardbaird1452 does the 8086 have microcode? I don't think that's right. Microcode wasn't used until the Pentium Pro, as I understand it. CISC CPUs before that used a conventional state machine to sequence the load-execute cycle. Instructions were mapped out on the die in regions specific to the instruction being used. Microcode which was updatable came later. Without on-die cache, it wasn't really a thing, because that is what allowed reordering and pipelines.
@@R.B., the 8086 absolutely does have microcode, as do all of its follow-ons (incl. the 8088), as well as the 68000 and its descendants. It may have been the first microcoded microprocessor, though I'm not positive about that. You can actually see it on a micrograph of the die (it is usually labeled "Control Store" and just looks like a huge table). It takes up about 1/5th of the die. People have even attempted decompiling it based on the images.
At that point it was used as a way to allow complex things (like DIV, pre-fetch, etc...) with limited die space, rather than for performance enhancement via controlling caches, multiple execution unit synchronization, etc... Have a look at the previous microprocessors (6502, 8080, 8085, Z80); none of them have those relatively complex features, primarily because they essentially use a PLA for instruction decode instead of a microcode engine and control store.
For the 8086, they simply couldn't do all the things they wanted to do directly in circuitry because there wasn't enough room on the die. Different justification, same solution.
@@richardbaird1452, thanks for that additional perspective. I haven't decapped an 8086 myself, and my understanding mostly stems from my education a couple of decades ago, when I created a 4-bit microcontroller as one of my labs. The notion that microcode came to the x86 design later was reinforced by other things I had read over the years. It makes sense that instructions like MUL and DIV might be implemented with microcode, but I wasn't aware that Intel did that as early as the 8086.
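To make the "control store" idea concrete: in a microcoded machine, decode is essentially a table lookup into a small on-die ROM program, one micro-op sequence per architectural instruction, whereas a PLA decoder drives control lines directly from the opcode bits. A toy sketch in C (the opcodes and micro-ops here are entirely made up for illustration; this is nothing like Intel's actual control store):

#include <stdint.h>
#include <stdio.h>

/* Toy model of a microcode control store: each architectural opcode
   points at a short micro-op sequence held in on-die ROM. */
enum uop { UOP_LOAD_A, UOP_LOAD_B, UOP_DIV_STEP, UOP_STORE, UOP_END };

static const enum uop control_store[] = {
    /* entry 0: a simple MOV needs only two micro-ops */
    UOP_LOAD_A, UOP_STORE, UOP_END,
    /* entry 3: DIV repeats a shift-subtract step; complex instructions
       like this are exactly what microcode makes cheap in die area */
    UOP_LOAD_A, UOP_LOAD_B, UOP_DIV_STEP, UOP_DIV_STEP, UOP_STORE, UOP_END,
};

/* The "decoder" of a microcoded machine is little more than this
   opcode -> entry-point lookup. */
static int entry_point(uint8_t opcode) { return opcode == 0x01 ? 0 : 3; }

int main(void) {
    for (uint8_t op = 0x01; op <= 0x02; op++) {
        printf("opcode 0x%02x:\n", op);
        for (int pc = entry_point(op); control_store[pc] != UOP_END; pc++)
            printf("  micro-op %d\n", control_store[pc]);  /* stand-in for datapath work */
    }
    return 0;
}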
Nice. What drives a lot of these entrepreneurs to start their own company is the observation by William Nordhaus, Yale economist, that a pioneer inventor captures less than 5% of the value of their invention. The rest goes to society, second movers who come up with improvements, patent lawyers, and the M&A crowd. For example (one of many), Nobel Prize winner Kary Mullis's PCR DNA replication invention, the basis for much of today's biotechnology, made Mullis something like $10k from his company (Cetus). Mullis made more from the Nobel Prize (which few inventors will win) and from being an expert witness in patent litigation. One reason it pays to be a gatekeeper (doctor, lawyer, businessperson) rather than an innovator. Speaking as somebody who spent years working in Silicon Valley (as a gatekeeper; it allowed me to retire in my 40s).
I was a programmer back in the nineties, on Xenix 286, SCO & Interactive Unix, later Linux. I lived the history. Great story!
Man I learn so much from these. You're the best John!!
John, your work is simply brilliant. As always, thank you very much for this - yet another Coogan gem. Bravo! 👏👏🔥🔥
Well done. I lived this story during my 30 yrs in SV.
The story is good, but the visuals are weird. The chip on the finger was a BellMac from a Bell Labs fab, not Fairchild. Also, not sure why the old footage of Brattain circa '52 was included with the Fairchild discussion. The footage was quite random. Noyce wasn't the sole inventor of the IC, but he did take Hoerni's planar process to come up with the truly fully planar IC, slightly behind Kilby's hybrid IC. That Cray MCM footage was bizarre at that point in the timeline - wrong decade. Also the Apple Mac in the 1982 Intel discussion was weird, as it came out in 1984 and used Motorola chips.
Excellent stuff. With each date, I track back in time and think “what computer was I using?”.
You’re dropping multiple videos lately, all amazing
You did an amazing job captivating what most would consider a boring topic.
The pace of this guys storytelling is perfect!
Hands down best storyteller on youtube. Just can’t get enough. So engaging.
I hate to say it, but you're wrong about a whole bunch of the history. The first Intel 16-bit processor was the 8086, but they also made the 8088, which was designed to be compatible with the Z80's support chips. The Z80 was based on the 8080 but enhanced in a variety of ways, with more & wider registers so it could execute some 16-bit opcodes, & it thus became the heart of most business computers, which used the Control Program/Monitor (CP/M) operating system. Meanwhile Apple was building its computers on the 6502 processor, which was adapted from the Motorola 6800-series processors, which is why the first Lisas & Macs were based on 68000-series processors.
So the first IBM PCs were actually 8-bit on the bus, as they used the 8088 processor; it was only with the PC/XT & PC/AT that they switched to 16-bit. The next chip Intel brought out was the 80186, which integrated the 8086 & some support chips. But the big jump was the 80286, their first 32-bit chip. The problem is that it had two modes, 1) real mode & 2) protected mode, but due to a design flaw it couldn't switch back out of protected mode without a reset. IBM still used the chip, which ran OS/2. OS/2 was the first Mac-style GUI-based interface, but it was written with Microsoft. Except Microsoft called its version NT, & it was based on the Digital VMS operating system. Microsoft also created an OS that ran on top of MS-DOS, which was also a graphical user interface, known as Windows. Eventually the OS/2 effort was folded into Windows New Technology. Meanwhile the PS/2 introduced a bunch of new technology, like the Micro Channel bus & the PS/2 ports.
What made the 386 good was that it not only fixed that flaw, it also allowed multiple rings of security, locking viruses out of getting control of the computer.
The history of the IBM PC and the mistakes IBM made is the history of the growth of Microsoft, which sold both PC-DOS to IBM and MS-DOS to all the other people making IBM-compatible PCs.
BTW, Compaq made its original money by bringing out the first transportable PC, similar to the Osborne.
Was the 80286 actually 32-bit? I see e.g. EAX first in the 80386. It had a 24-bit memory bus and virtual memory, but not a 32-bit ALU.
And AFAIK the 80386 wasn't even the first 32-bit microprocessor; some HP, Motorola, or even AT&T one was.
@@mikhailryzhov9419 I had a 286 PC compatible running OS/2. Yes, it was 32-bit. But it wasn't the first 32-bit in common usage: the Motorola 68000 series, Zilog Z8000, things from Nat Semi. The advantage for the first IBM PC of using the 8088 was that it could use the commonly available support chips of the Z80 (which was a clone, plus enhancements, of the Intel 8080).
@@annakissed3226 Can you elaborate? The basic arithmetic registers are still 16 bits. This is the quote from the Programming Reference Manual: “This manual describes the 80286, the most powerful 16-bit microprocessor in the 8086 family, and the 80287 Numeric Processor Extension (NPX).” The 286 had protected mode, but it was still 16-bit.
@@mikhailryzhov9419 If you say so; when it was sold to me, it was sold as being 32-bit. But it was a long time ago & I could well be wrong and you right!
Thank you for the great story, well told. But I would have also told about Intel's practice of keeping the better-developed AMD parts off the counter in the shops by paying a bribe fee. Not that I know firsthand; I learned this from another YouTube video. But please, can you tell about Intel's plans up to 2030? I am all curious.
Awesome storytelling. Thanks for putting all the pieces of Silicon Valley’s history together, John.
When are you writing the one book to bind them all?
Can’t see anyone better positioned to write it. I’d rather have a street-credible entrepreneur write it than a journalist type like Ashlee Vance.
It's so interesting that the situation Intel found itself in years ago is repeating itself nowadays: success leads to complacency, complacency leads to failure.
What I learned in an electronics course 20 years ago is that Intel continued to grow in microprocessor design thanks to the US Army. The deal: the Army funded Intel to develop better microprocessors; Intel provided the Army with top-of-the-line product, with exclusivity until the next generation was developed; then the next gen went to the Army and the n-1 product entered the market. I don't know if it's true, but that may explain how Intel managed to create the 386, then the 486, then the Pentium at that rate.
All tech is from the military, it's a Lizard outfit 🙊
I love your storytelling, John. So compelling.
This is one of the most fascinating stories I have heard. Even though I knew every bit of this story and more, I couldn't stop myself from watching this video.
Intel is arguably the most important company in world history. And Andy Grove the best manager to date.
Your content doesn't miss I swear
Completely excellent documentary. Thank you - on my way to Penang. This really helps understand the background to the semiconductor industry and the stresses involved.
Great video. A better title would have been: “the history of intel”
Really fine presentation. I was wondering if you plan to tell a similar story about Texas Instruments, or should I say Geophysical Service Inc. Some of the success of Compaq came from engineers who worked at TI and shifted to Compaq. TI's and Intel's fabrication efforts parallel each other, along with their microprocessor designs.
Excellent storytelling!!! Great work!!
Wow! What a great storyteller. You should narrate every tech documentary.
The Intel Core 2 Duo E8500 sold very well because many people told other people Intel was very durable, and to this day I am still using mine with a new AFOX LGA 775 motherboard that has only a one-year warranty, with an Intel Pentium E6600 as a standby sidekick. For playing computer games I have an AMD 4600G sitting near my left leg. Intel customers whose Core 2 Duo E8500 computers are so durable that they still don't die, even when they overclock the CPU & feed it the cheapest SGD 10 thermal paste with no expiry date, end up going and buying an AMD 4600G anyway. When my Intel Pentium 4 2.4B GHz had no Rambus motherboard left for repair, I checked the CPU temperature: it was 20 degrees above normal (because I had overclocked it from 2.4 to 2.53 GHz), and it became unresponsive with strange artifacts on the monitor, so it must have been ruined.
Finally......the wait is over
At 6:06, you note that Berkeley, CA and Mountain View, CA are 500 miles from each other "down the road." Actually, it's more like 50 miles (across San Francisco Bay from each other, with a short road trip in between).
Sadly, my '386 was faster than my '486. The only desktops I know of in use now are by gamers and businesses. It's pretty cool to have lived through the history of the PC, but sad to see that the newer laptops are practically disposable, with hard drives soldered onto their motherboards.
That's what I always understood made Intel the smarter choice... They had a patent on the instruction set that made it faster per clock cycle than AMD. AMD's slower speed was because they had to use an instruction-set back-end workaround.
While I agree Andy Grove was the most productive leader in the history of Intel, there are too many errors and omissions in this video for me to recommend it. In all candor, I was a member of the opposition (Zilog) for a few years. Yet he did give me the best business compliment I ever received, for recommendations given while consulting for Intel.
A fine gathering, a wonderful convention; a good time was had by all 🙃
Am reading "Chip War" right now and this dovetails PERFECTLY with it!! Perfect content with perfect timing! Thank you!
Good read
I have been thinking about photonics. It's used to make logic gates with light/lasers instead of silicon transistors. That might help with hyperscalers and AI. Researching the best companies. Curious to hear input from someone who is well read.
Not so perfect, since he left out the Clipper/Intergraph/Intel parts of the story.
Man that was an amazing story and you are an amazing narrator
Great video, great editing style, great everything, but please use a de-esser; watching this with headphones almost made me deaf. I had to get an EQ extension and drown out the highs just to watch.
@6:03 Since when is Berkeley to Mountain View 500 miles??? More like 50.
Very informative, and well presented. Thank you so much!
Mountain View is not 500 miles south of UC Berkeley
I never thought I would use any computer without an Intel processor in 20 years. After Intel refused to recall my 14900K or refund anything whatsoever, I switched to AMD Ryzen, the 7800X3D, and I won't look back until Intel proves competitive again against AMD and NVIDIA. I was such an Intel shill that I had experimented with AMD before, with a 7950X (no 3D V-Cache), and sold it just to get Intel Raptor Lake processors. That was a huge mistake, to put it bluntly.
Intel's refusal to recall the degrading Raptor Lake and Raptor Lake Refresh processors says it all: disgusting overvolted settings, straight out of the box, that degrade the processor, which Intel knowingly deceived customers about, customers who lost business because of Intel's lack of foresight. Clock speed alone doesn't determine how fast your CPU is. AMD understands this effortlessly.
AMD Ryzen will forever be in my primary PC. Intel lost that right after two decades of increasing power inefficiency in spite of excellent performance. AMD Ryzen delivers on all fronts: performance and power efficiency.
Impressive quality in this storytelling! I really enjoyed it - well done, John. Can you please do the story of AMD: how they remained in the shadow of Intel for so many decades, and how a Taiwanese-born female CEO did what András István Gróf did - turn the ship around and surpass Intel?
EXCELLENT! One of the best videos I've seen! Greetings from Argentina 🇦🇷
Nice shot of Hack Reactor at 3:29!
Intel missed: mobile (tried but failed), GPU (tried but failed), and now AI (never saw it coming!). Any other company would've been dead by now....
Incredible presentation of this story
The fact that we are already at robotic AI and such in a matter of a few decades is absolutely astonishing and terrifying.
This was a remarkable video, bravo
"Second sourcing," not "secondary."
The essence of the idea is that there be two secure possibilities, not that there is a primary and a secondary, a truth which one very much wishes to keep quiet.
Outstanding work 😎
Thank you
Am I stupid or what? Why wasn't I already a subscriber to this channel a long time ago? YouTube, you've been failing me!
What’s your projection for this current saga? I own shares… do I DCA or do I sell at a loss?
And Wintel is about to die, because Microsoft wants to make their own systems just like Apple does.
And what does Microsoft want to use for their processor? An ARM RISC processor. However, today's RISC is nothing like the 80s-90s RISC. CISC still has big advantages, even though the instruction logic uses a lot more transistors..
Berkeley and Mountain View are not 500 mi apart. More like 50. Also, Gordon Moore did not coin the term "Moore's Law." Although the concept was his, he was actually somewhat embarrassed that it came to bear his name.
Sometimes I feel
like William Shockley
Sometimes I feel
Like my only friend
Is the Intel computer
The CPU of Angels
Lonely as I am
Together we cry
Were they “embedding transistors on silicon wafers”? Or etching silicon to produce transistors?
"2:10 he was generally regarded as the sole inventor of the transistor" False. The three made important contributions, and mostly Bardeen, a 2 time physics Nobel prize winner (the only person to ever achieve it)
Very well produced video, but do you have sources for your claims, e.g. about Shockley's story?
Intel has always had my business. I've tried several AMD units and they don't seem to last or be nearly as stable.
Great work once again!... mega inspiring, even though I don't compete in the silicon wafer department 😏
FYI: It is only 50 miles from Berkeley to Mtn. View.
A good video overall, but I can't stop noticing the rampant copyright infringement going on; about 2/3 of the video is copyrighted footage that is not fair use.
Watching this on a laptop that has an Intel Core i5 10th-generation, trippy
The 1103 was the first big cash cow for Intel, but competitors could copy it at lower cost; within ten years, it faded. Intel's next cash cow was a type of non-volatile memory, EPROM, needed for the BIOS code of PCs and many other applications. For about 5 years, Intel was very successful with EPROMs, and this kept Intel alive until the 286 took off. It is a forgotten product now, being embedded in many follow-on products, but it was lifeblood for Intel at a critical period.
How did competitors develop an IBM / Intel compatible computer?
30:03: it had four GIGAbytes of memory. 4KB is 4 thousand bytes, 4GB is four billion bytes, a million times greater.
This is how one company is responsible for changing the world. One company leads to another innovative company, and they all revolutionize the world we live in today. The same happened with the PayPal mafia. And the same is happening with internet companies and AI companies today.
I enjoyed the programme immensely, but I disagree with your view that William Shockley was considered the single inventor of the transistor. That is not the consensus view. Many would suggest that the premier thinker was John Bardeen and the practical laboratory worker was Walter Brattain. The equipment that features in the famous transistor picture belonged to Brattain. Shockley made a lot of enemies, but Bardeen made a lot of friends on his way to a second Nobel Prize.
Also, the world’s first semi-programmable digital computer was the British Colossus (1943). The ENIAC featured several years later.
Also,
Could you provide references to the books and articles where you got the history? I am really interested in reading that, especially the business side of Fairchild.
Excellent video. I enjoyed it immensely.
Excellent video as always 👌
Thanks John for such an Amazing content
👏👏👏👏 amazing job. Thank you John!
So what RAM are they thinking of changing to? It's one of the old ones, I believe; I read it somewhere. Just wondering if anyone else heard about this, which one it is, and why.
I mean, an optical computer company that can build and sell one looks like the only true way to move forward; they are already hitting their limits at 5nm and 3nm. So a personal optical computer is the only way, not smaller and smaller and hotter and hotter.
I work there. I'm just watching to see what our secret is.