Another exciting event happened following the release of the Pentium: the floating-point divide disaster. In 1994 it forced Intel to buy back $475 million worth of processors produced with the flaw. I went to work at Intel in 1995, and all the employees were given a Pentium processor encased in plastic and turned into jewelry (tie tacks, lapel pins, cuff-links, etc.), and they sold a fair amount of the defective devices as curiosity items. They were extraordinary to look at, with a gold and rainbow sheen.
I think that was one incident that helped cause Intel's loss of dominance. The memes (although they weren't called that) were devastating, and the slogan "Intel Inside" became a warning.
Man this was a thrilling history covering so many milestones.
Your effort and presentation made it a gem. It will remain a gem.
@@reptilespantoso What were some of the biggest ones, and when did they occur?
"Profiles by John Coogan".... very well researched, the wait was worth it
@@diptyprakashswain1121 13:45 ...it's a biography of Andy Grove. He's the secret.
This is a wonderful recounting of a major part of the world’s recent technological history. I was a young engineer at a company called Melpar, a small defense contractor in Northern Virginia, when the Intel 4004 came out. My boss asked me if I thought I could replace a big box we made for the government with the newfangled microprocessor. Not knowing whether I could or not, of course I said yes, I can do that. And I did. I designed the hardware and software in nine months. It was an exciting time to be in technology.
This is the highest-TV-standard series, led by a charismatic entrepreneur, that we never knew we wanted but we all needed. John! Make an episode about your purpose!
Andy and Robert fired and then rehired themselves not to showcase some formality, or what we call PR these days. They did it to sacrifice a lot of their own seniority benefits and adhere to the same set of rules as the new hires. In doing so, they put themselves in a position to go along with and inspire the rest of the company to vigorously do the same: be active and agile like a startup. It could also be that they meant anyone can be fired if they don't serve the interests of the company. It could be the precedent for why they brought back the current CEO, among other reasons.
I have worked as an Integrated Circuit Layout Designer since 1983. All those chips designed between the first Fairchild chip in the 1950s and the early 1980s were drafted on paper or mylar on a drafting table using pencils. Computers purpose-built to assist layout design were only first introduced in the 1980s.
Just designed an entire 16-bit RISC CPU in Verilog as a personal toy project. Took about 4 weeks to get it running on an FPGA, but that time included writing an assembler. I cannot even begin to imagine what it would be to tape that out on paper.
00:00:00 - Bill Shockley invents the transistor and founds Shockley Semiconductor
00:05:00 - Andros Grove and others found Fairchild Semiconductor and develop the first silicon semiconductor chip
00:10:00 - Bob Noyce, Gordon Moore and team succeed in reducing the instability of transistors and price their products low to accelerate demand and gain market share
00:15:00 - Intel develops the 3101 SRAM chip, 1103 DRAM chip, 8080 microprocessor, 4004 microprocessor for personal computer, Altair 8800 and other computing products
00:20:00 - Intel founded in 1978 by Moore and Bob Noyce, growth slows in the early 1980s due to competition and financial difficulties
00:25:00 - Intel faces challenge to stay afloat in the memory chip market, switches to microprocessor production and becomes the world's largest microprocessor manufacturer
00:30:00 - Intel's 386 processor is a major success but IBM challenges its dominance, leading to significant challenges for Intel's CEO, Andy Grove, to keep the company on top.
This part is wrong: "Intel founded in 1978 by Steve Jobs and Bob Noyce, "
Its like the production value increases with each video. Kudos my dude!
I wish more younger people found this stuff as interesting as I do. I would kill for a job within any of these lines of work, even if it’s something very minuscule; it would feel awesome working for/towards something that is literally revolutionary 😭🔥
Someone has got to do them sort of jobs, it might as well be you, go for it I say!
Instead of killing someone go study computer science and/or electrical engineering. AMD, INTEL, and NVIDIA need all the talent they can get as they need to consistently come out with better and better products just to stay afloat.
Your mindset is "literally revolutionary". I encourage you to pursue your goal relentlessly. Don't get distracted...🇺🇸 😎👍☕
Interesting story. My family’s first PC was a 486, and my parents ended up choosing an AST over a Compaq. This was 1990 or 1991 in ZhongGuanCun, Beijing China, and that computer cost about 2500 USD. The company Lenovo was a spin-off from the Institute of Computing, where both of my parents worked. Lenovo was originally called LianXiang. Their main early product was a Chinese input method, which allows you to type one Chinese character and be prompted with words and phrases that start with that character, thus the name LianXiang, which roughly means neural association or connection, an indexing and search process that occurs in your brain.
Wow, haven't heard someone mention a 486 on here ever! I started on a Commodore 64, then moved up to an Amiga 600 (couldn't afford the 1200), then I finally got a 386 with 1 meg of RAM! By then I was flying, progressing to the 486 with (I think) 2 megs of RAM. I remember having a Mario game that was spread over 4 floppy disks. All running on DOS and Windows 3.0/3.1. I remember when Windows 95 came out and you got the free Weezer music video. I used to sit there defragging the hard drive. Simple times! The good ol' virgin days!
@@curtis24-7 I made a pit stop with the Amiga 500. And only for Monkey Island 2 did I buy a second disk drive. Fun times. Kids of the 90s, great time.
My parents' first computer was a Packard Bell with a 486. I don't remember a lot of the specs, but it had Windows 3.1 and I think the hard drive was 20 MB. Kind of crazy to think that a lot of processors have even larger amounts of cache today.
John, I just came onboard to your channel. Huge fan of the thoughtful and thorough illumination you lend to what is already highly intriguing subject matter. Thanks so much!
You are one of the best storytellers on UA-cam. Thank you so much 😊
Love love love your content. As a huge fan of “Computer Chronicles” Gary Kildall episodes this was VERY well researched. Love it!!!
Always insane quality!
Oh my God, thank you so much John for this incredible documentary, I had to watch it twice without skipping any ads👌
Excellent stuff. With each date, I track back in time and think “what computer was I using?”.
Man I learn so much from these. You're the best John!!
I soldered my first computer together in 1977. It was an 8008! It had 1K of RAM. A couple of years later I had a NorthStar running a Z80A with 16K and TWO (count them, 2) 360K double-sided drives!
Gordon Moore attended a lecture given by C Harry Knowles (Bell Labs) where the fundamental concepts that became "Moore's Law" were presented. Somewhere in history Knowles roots in Moore's Law have been forgotten.
You’re dropping multiple videos lately, all amazing
Great video. A better title would have been: “the history of intel”
An amazing video as usual 🔥🔥🥳 Btw, could you make a video on how chatbot-integrated search engines would change internet-based advertising ✌️
John, your work is simply brilliant. As always, thank you very much for this - yet another Coogan gem. Bravo! 👏👏🔥🔥
I worked for Intel in the 1990's as a Senior engineer. The one decisive aspect of working there was the fact that they relied on the workers to make the decisions and define the technology in the next generation of product.
Had a deskpro 386 when I was 10, I was the only kid in class with a PC; can’t believe it was 10 years old at the time! (1994). Glad my Dad saw the future.
My first job at 16, in 1986, was at Marconi in the UK. I went on a wire-bonding machine course and met a man who worked in a factory in Wales. I'm sure it was Intel. He said they were making parallel processors but the scrap rate was up around 90%. The place I worked at made silicon-on-sapphire space chips.
Very informative, and well presented. Thank you so much!
Awesome storytelling. Thanks for putting all the pieces of Silicon Valley’s history together, John.
When are you writing the one book to bind them all?
Can’t see anyone better positioned to write it. I’d rather have a street-credible entrepreneur write it than a journalist type like Ashlee Vance.
Excellent storytelling!!! Great work!!
Could you provide references to the books and articles where you got the history? I am really interested in reading that, especially the business side of Fairchild.
Well done. I lived this story during my 30 yrs in SV.
Your content doesn't miss I swear
Thanks John for such amazing content
You did an amazing job making what most would consider a boring topic captivating.
The pace of this guy's storytelling is perfect!
Very well produced video, but do you have sources for your claims, e.g. about Shockley's story?
👏👏👏👏 amazing job. Thank you John!
Were they “embedding transistors on silicon wafers”? Or etching silicon to produce transistors?
Nice shot of Hack Reactor at 3:29!
well delivered content. Big up
I was a programmer back in the nineties, on Xenix 286, SCO & Interactive Unix, later Linux. I lived the history. Great story!
This is very well produced
Finally......the wait is over
I love your storytelling, John. So compelling.
It's so interesting that the situation Intel found itself in years ago is repeating itself nowadays: success leads to complacency, complacency leads to failure.
Thank you for the great story, well told. But I would also have mentioned Intel's practice of keeping the better-developed AMD off the counter in the shops by paying what amounted to a bribe. Not that I know firsthand; I learned this from another UA-cam video. But please, can you tell us about Intel's plans through 2030? I am all curious.
Nice. What drives a lot of these entrepreneurs to start their own company is the observation by William Nordhaus, Yale economist, that a pioneer inventor captures less than 5% of the value of their invention. The rest goes to society, second movers who come up with improvements, patent lawyers, and the M&A crowd. For example (one of many), Nobel Prize winner Kary Mullis's PCR DNA replication invention, the basis for much of today's biotechnology, made Mullis something like $10k from his company (Cetus). Mullis made more from the Nobel Prize (which few inventors will win) and from being an expert witness in patent litigation. One reason it pays to be a gatekeeper (doctor, lawyer, businessperson) rather than an innovator. Speaking as somebody who spent years working in Silicon Valley (as a gatekeeper; it allowed me to retire in my 40s).
Absolutely brilliant, Thank you for sharing ❤❤
How did competitors develop an IBM / Intel compatible computer?
A fine gathering, a wonderful convention; a good time was had by all 🙃
great video, great editing style, great everything but please use a de-esser, watching this with headphones almost made me deaf, I had to get an eq extension and drown out the highs just to watch
Thanks John......for your amazing content
Wow! What a great storyteller. You should narrate every tech documentary.
What’s your projection for this current saga? I own shares…do i DCA or do I sell at a loss?
Hands down best storyteller on youtube. Just can’t get enough. So engaging.
I was paying attention back then.... In fact, I bought Andy Grove's book, "Only the Paranoid Survive." It's a fantastic read if you buy it.
Slight correction: they were working on the 80386, not the 30386. How did the 80286 (and 80186) factor into this story? It's also worth discussing the difference between SX and DX, or DX2, and that the 80486 integrated the floating-point coprocessor as well as introducing microcode to the design. By the P6, this was effectively giving a CISC instruction set on what is fundamentally a RISC processor with pipelines and out-of-order execution.
Not sure that microcode bit is 100% correct. Even an 8086 has some control run by non-upgradable microcode within the CPU. From a CPU designer's perspective, it is still microcode even if it isn't re-writable. A better classification for what was marketed as "Intel Microcode" would be to call it firmware IMO.
@@richardbaird1452 does the 8086 have a microcode? I don't think that's right. Microcode wasn't used until the Pentium Pro as I understand it. CISC CPUs before that used a conventional state machine to select the load execute cycle. Instructions were mapped out on the die in regions specific for the instruction being used. Microcode which was updatable came later. Without on die cache, it wasn't really a thing because that is what allowed reordering and pipelines.
@@R.B., the 8086 absolutely does have microcode, as do all of its follow-ons (incl. the 8088), as well as the 68000 and its descendants. It may have been the first microcoded microprocessor, though I'm not positive about that. You can actually see it on a micrograph of the die (it is usually labeled "Control Store" and just looks like a huge table). It takes up about 1/5th of the die. People have even attempted de-compiling it based on the images.
At that point it was used as a way to allow complex things (like DIV, pre-fetch, etc.) with limited die space, rather than for performance enhancement via controlling caches, synchronizing multiple execution units, etc. Have a look at the previous microprocessors (6502, 8080, 8085, Z80): none of them have those relatively complex features, primarily because they essentially use a PLA for instruction decode instead of a microcode engine and control store.
For the 8086, they simply couldn't do all the things they wanted to do directly in circuitry because there wasn't enough room on the die. Different justification, same solution.
@@richardbaird1452, thanks for that additional perspective. I haven't decapped an 8086 myself, and my understanding mostly stems from my education a couple of decades ago, when I created a 4-bit microcontroller as one of my labs. The notion that microcode was added to the x86 design later was reinforced by other things I had read over the years. It makes sense that instructions like MUL and DIV might be implemented with microcode, but I wasn't aware that Intel did that as early as the 8086.
Man that was an amazing story and you are an amazing narrator
Great video, thanks!
This is one of the most fascinating stories I have heard. Even though I knew every bit of this story and more, I couldn't stop myself from watching this video.
Intel is arguably the most important company in world history. And Andy Grove the best manager to date.
This video has nothing to do with the title. 😐😐😐
This was a remarkable video, bravo
EXCELLENT! One of the best videos I have ever seen! Greetings from Argentina 🇦🇷
Excellent video as always 👌
Outstanding work 😎
Thank you
So what was the secret?
Really fine presentation. I was wondering if you plan to tell a similar story about Texas Instruments, or should I say Geophysical Service Inc. Some of the success of Compaq came from engineers who worked at TI and shifted to Compaq. TI's and Intel's fabrication efforts paralleled each other, along with their microprocessor designs.
...nice work sir.
The story is good, but the visuals are weird. The chip on the finger was a BellMac from a Bell Labs fab, not Fairchild. Also, not sure why the old footage of Brattain circa '52 was included with the Fairchild discussion. The footage was quite random. Noyce wasn't the sole inventor of the IC, but he did take Hoerni's planar process and come up with the truly fully planar IC, slightly behind Kilby's hybrid IC. That Cray MCM footage was bizarre at that point in the timeline: wrong decade. Also the Apple Mac in the 1982 Intel discussion was weird, as it came out in 1984 and used Motorola chips.
The fact that we are already at robotic AI and such in a matter of a few decades is absolutely astonishing and terrifying.
Mountain View is not 500 miles south of UC Berkeley
That's what I always understood to be what made Intel the smarter choice... They had a patent on the instruction set that made it faster per clock cycle than AMD. AMD was slower because they had to use an instruction-set backend workaround.
Great work once again!... Mega inspiring, even though I don't compete in the silicon wafer department 😏
Excellent video. I enjoyed it immensely.
Excellent video!
I work there. I'm just watching to see what our secret is.
At 6:06, you note that Berkeley, CA and Mountain View, CA are 500 miles from each other "down the road." Actually, it's more like 50 miles (across San Francisco Bay from each other, with a short road trip in between).
Am reading "Chip War" right now and this dovetails PERFECTLY with it!! Perfect content with perfect timing! Thank you!
I have been thinking about photonics. They are used to make logic gates with light/laser instead of silicon transistors. They might help with hyperscalers and AI. Researching best companies. Curious to hear input from someone who can read.
Not so perfect, since he left out the Clipper/Intergraph/Intel parts of the story.
Great job
What’s the secret? I just wanna know if it’s a buy
Incredible presentation of this story
"Second sourcing," not "secondary."
The essence of the idea is that there are two secure sources, not that there is a primary and a secondary, a truth which one very much wishes to keep quiet.
Am I stupid or what? Why wasn't I already a subscriber of this channel a long time ago? UA-cam, you've been failing me!
Amazing.
What were those computers doing without satellites?
What I learned in an electronics course 20 years ago is that Intel continued to grow in microprocessor design thanks to the US Army. The deal: the Army gave Intel funds to develop better microprocessors, and Intel provided the Army with top-notch product exclusively until the next gen was developed; then the next gen went to the Army and the n-1 product entered the market. I don't know if it's true, but that may explain how Intel managed to create the 386, then the 486 and Pentium, at this rate.
Awesome video 😎👍
sadly, my '386 was faster than my '486; the only desktops I know used now are by gamers and businesses; it's pretty cool to have lived through the history of the PC but sad to see the newer laptops are practically disposable w/ hard drives soldered onto their motherboards
Excellent
Absolutely fantastic
Watching this on a laptop that has an Intel Core i5 10th-generation, trippy
The Intel Core 2 Duo E8500 sold very well because many people told people Intel was very durable, and I am still using one today with a new Afox LGA 775 motherboard that has only a one-year warranty, with an Intel Pentium E6600 as a standby sidekick. For playing computer games I have an AMD 4600G near my left leg. Intel customers with Core 2 Duo E8500 computers found them very durable; they still didn't spoil even when overclocked and fed the cheapest SGD 10 thermal paste with no expiry date, and those customers end up buying the AMD 4600G. When my Intel Pentium 4 2.4B GHz had no Rambus motherboard available for repair, I checked the CPU temperature: it was 20 degrees above normal because I had overclocked it from 2.4 to 2.53 GHz, and it became non-responsive with strange artifacts on the monitor, so it must be spoiled.
Fascinating
Subscribed!
I hate to say it, but you're wrong about a whole bunch of the history. The first Intel 16-bit processor was the 8086, but they also made the 8088, which was designed to be compatible with the Z80 support chips. The Z80 was based on the 8080 but was enhanced in a variety of ways, with more and wider registers so it could execute some 16-bit opcodes, and thus it became the heart of most business computers, which used the Control Program/Monitor (CP/M) operating system. Meanwhile Apple was building its computers on the MOS 6502 processor, which was adapted from the Motorola 6800 series, which is why the first Lisas & Macs were based on 68000-series processors.
So the first IBM PCs were actually 8-bit, as they used the 8088 processor; it was only with the PC/XT & PC/AT that they switched to 16-bit. The next chip Intel brought out was the 80186, which integrated the 8086 & some support chips. But the big jump was the 80286, their first 32-bit chip. The problem is that it had two modes, 1) real mode & 2) protected mode, but due to a design flaw it couldn't switch between them; IBM still used the chip, which ran OS/2. OS/2 was the first Mac-style GUI-based interface, but it was written with Microsoft. Except Microsoft called its version Microsoft NT, & it was based on the Digital VMS operating system. Microsoft also created an OS that ran on top of MS-DOS, which was also a graphical user interface, also known as Windows. Eventually OS/2 was combined into Windows New Technology. Meanwhile the PS/2 introduced a bunch of new technology like the PCI bus & USB
What made the 386 good was that it not only fixed the flaw, it also allowed multiple rings of security, locking viruses out of getting control of the computer
The history of the IBM PC and the mistakes IBM made is the history of the growth of Microsoft, which sold both PC-DOS for IBM and MS-DOS to all the other people making IBM-compatible PCs
BTW Compaq made its original money by bringing out the first transportable PC, similar to the Osborne 2
Was the 80286 actually 32-bit? I see e.g. EAX first in the 80386. It had a 24-bit memory bus and virtual memory, but not a 32-bit ALU.
And AFAIK the 80386 was not even the first 32-bit microprocessor; some HP, Motorola, or even AT&T one was.
@@mikhailryzhov9419 I had a 286 PC compatible running OS/2. Yes, it was 32-bit. But it wasn't the first 32-bit in common usage: there were the Motorola 68000 series, the Zilog Z8000, things from Nat Semi. The advantage for the first IBM PC of using the 8088 was that it could use the commonly available support chips of the Z80 (which was a clone, plus enhancements, of the Intel 8080)
@@annakissed3226 Can you elaborate? The basic arithmetic registers are still 16 bits. This is the quote from the Programming Reference Manual: “This manual describes the 80286, the most powerful 16-bit microprocessor in the 8086 family, and the 80287 Numeric Processor Extension (NPX).” The 286 had protected mode, but it was still 16-bit.
@@mikhailryzhov9419 if you say so; when it was sold to me, it was sold as being 32-bit. But it was a long time ago & I could well be wrong and you could be right!
Impressive storytelling. It started with András and ended with him, while telling the whole story of Intel in between. I really love the quality of your content.