Occasionally crafting such a sentence feels really satisfying, and for a moment I hoped that it was your creation. The quote is from G. Michael Hopf. It probably fits many extraordinary companies of our time, like SpaceX. Intel helped AMD in the early days; maybe AMD can help Intel today.
The term CPU is older than Intel, which "merely" made the significant step of building a single-chip CPU. Arguably, the major competitor to Intel is not so much AMD or Apple but TSMC, which leads the chip-manufacturing technology field. Without this, AMD would not be storming forward; still, it was a great move for them to go fabless, since they lacked the major financial resources for what is now GF.
AMD only went fabless because they were on the brink of bankruptcy, so they sold off their manufacturing division as GlobalFoundries. Maybe it ended up being beneficial in the end but, don't get me wrong, it was not intentional.
@@kamil.g.m do you understand that AMD's superior chips (at the time) couldn't sell because of this? No one displayed their chips in stores, and there were no contracts from PC makers. Intel paid money incentives (bribes) on top of selling chips at a loss just to kill AMD. AMD sank to rock bottom, racking up immense debts. It's a miracle they could survive and revive. Stop the alternative facts.
Busicom vitch: Where the logic? Faggin: (shows block diagram made in one/two nights) Busicom vitch: This bad! I hate it! Where the logic? Faggin: Uh um...I don't have any... Busicom vitch: You bad you bad! Faggin: Look, I just got the job yesterday! Busicom vitch: You late! 😂😂😂
We've started to make the switch to AMD for processor-intensive tasks without paying a fortune. People do forget, however, the AMD "incompatibility" years. Yikes on that.
I love AMD for desktops, but I find a lot of AMD's mobile offerings lacking sometimes, though that could just be that the companies I buy from short the RAM. 3 gigs of usable RAM on a computer running Windows, Hewlett-Packard, when RAM for laptops is expensive? I'd like to know what you were smoking and if I can have some
@Jimbo Bimbo Not all CPUs are microprocessors. The guy who made this video seems to think the two terms are synonyms. At the time of the 4004, CPUs were made of separate components. In the PDP/11, for example, the CPU comprised several circuit boards within the computer. So, to answer your question, the 4004 is a CPU, but it wasn't the first CPU. It was the first microprocessor.
Also, the binary states of a computer do NOT represent the state of the transistors. The transistors are used to make logic gates (AND, OR, NOT), and the inputs and outputs of those are simple binary values. These gates are then combined into a complex system of logic that performs calculations using those gates, and the data in those calculations are the 1s and 0s your computer deals with.
@@thep751 A transistor doesn't store anything. It's just an electrical switch. Logic gates don't store anything either. You need a flip-flop or a latch for that, which are made from logic gates, which in turn are made from transistors.
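To make the storage point concrete, here is a minimal, purely illustrative sketch (my own example, not from the video) of how a storage element can be built from nothing but gates: an SR latch made of two cross-coupled NOR gates.

```python
# Minimal sketch: an SR (set/reset) latch built from two cross-coupled NOR gates.
# Gates alone are stateless; the feedback loop between them is what "remembers" a bit.

def nor(a, b):
    """NOR gate: output is 1 only when both inputs are 0."""
    return 0 if (a or b) else 1

def sr_latch(s, r, q=0, qn=1):
    """Settle the cross-coupled NOR pair for given Set/Reset inputs.
    q/qn carry the previously stored state into the feedback loop."""
    for _ in range(4):              # iterate until the feedback loop stabilizes
        q_new = nor(r, qn)
        qn_new = nor(s, q_new)
        if (q_new, qn_new) == (q, qn):
            break
        q, qn = q_new, qn_new
    return q, qn

q, qn = sr_latch(s=1, r=0)               # Set: the latch now stores 1
q, qn = sr_latch(s=0, r=0, q=q, qn=qn)   # inputs released: the value is retained
print(q)  # 1 -> the bit is "remembered" by the feedback loop, not by any single gate
```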
Dagogo, you are amazing! I don't know how you do it, but keep it up and make it last. Your research skills and storytelling are amazing (for lack of a better word)
I'm really excited to hear you have a book. Over the last two weeks I have become completely hooked on your programming! Very fascinating and insightful.
Whenever people summarize how transistors made CPUs possible, they always say something to the effect of "they act as switches which are 1s and 0s, and by packing them all together CPUs can then process billions of these binary digits"... but beyond being switches/binary digits, it really needs to be added that "when placed together in groups, with certain ones being controlled, they can perform calculations". That small little tidbit is the missing layer that really helps explain (still on a very high/simple level) *why* these things, when put together, make a working CPU.
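As a minimal illustration of that missing layer (an example of my own, with the gate functions standing in for little circuits of transistor switches), here is a one-bit half adder: a handful of "switches" arranged into gates already doing arithmetic.

```python
# Minimal sketch: a 1-bit half adder built only from AND/OR/NOT "switch" logic.
# Each function stands in for a small circuit of transistors acting as switches.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    # XOR composed from the basic gates: (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    """Add two 1-bit numbers: the sum bit comes from XOR, the carry bit from AND."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} = carry {carry}, sum {s}")
# 1 + 1 = carry 1, sum 0  -> binary arithmetic, straight out of grouped switches
```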
Good work Dagogo! - a really interesting episode! Having lived through all that you talk about and being involved with a lot of it, it was great to see you consolidate the history into a digestible couple of episodes!
Oh
The other episodes aren’t even out. How can you know that
I used the 4004 in a couple of embedded designs, bounced around at several PC companies and worked at AMD from 1993-2020 mostly trying to catch up to Intel. Great video; you have encapsulated the history very well. Ordered your book :-)
We should connect, Philip R.
@@w43o21l2f ? To talk about electronics?
I can't even imagine how the people who started Fairchild and Intel must feel about the drastic change their invention brought to the world. It wouldn't be an exaggeration to say it was as revolutionary as the invention of the wheel.
I agree
Me too
It's up there. I think that is fair. If not the wheel, it is on par with the invention of electricity, which perhaps may be even more significant.
Noyce would've been awarded a shared Nobel prize with Jack Kilby for the invention of Integrated Circuit if he was still alive in 2000.
It really isn't exaggeration. Literally every industry runs on their invention today.
Coldfusion on Intel while cooking, nice day to everyone from Greece
edit: wow, so many kind people around the globe watching, keep it on guys
You are cooking lunch while I am cooking dinner, have a good day and evening.
Good night from New Zealand, he makes great videos to watch before bed.
Nice day from New Delhi. Working remotely from my roof - on blockchain technology
Nice day to you too from India
Have a great day, from Bangladesh.
I never stop being amazed at how many incredible, talented people UA-cam has enabled to emerge on their own. Look at the quality of this video! 20 years ago you'd only get this on TV... maybe. You'd have to pitch the show to some producer and jump through all the hoops. Now you just create a video and the world decides; nobody is holding you back.
@@omardelmar Have you even seen the Discovery channel lately? There are many YT channels that offer content that TV can only dream of.
@@omardelmar discovery and history channel is shit compared to videos like these
I get what you're saying, but it depends on the nature of the content. If it's too far into the field of wrong-think, a whole channel can be removed by the gods of gügl
@@omardelmar It doesn't all have to be hyper-jazzy CGI, does it? Does most modern mean most bestest??
Shame the first 30s mix up the definitions of microprocessor and CPU. All microprocessors are CPUs, and the first commercial microprocessor was Intel's. However, CPU just describes the core compute hardware: the logic gates that make up the fetch-decode-execute loop, the ALU and such. It does not describe whether it is a single item or a bunch of hardware taking up a room. Not all CPUs are microprocessors, as in all on one "computer chip".
I was an ASIC Engineer through the '80s and '90s. In the late '90s Intel approached me about employment. Their salary offer was half of what I was making at the time. I told them no and why, and their response was "but we're Intel."
Damn, they're already arrogant in the 90s lmao
Also cool story m8, good for ya for declining them
Money Talks, BS Walks! 😎
It's fascinating to realize that Fairchild Semiconductor is the O.G. startup that defined the 'startup culture' as we know it today.
Yes, every company has a starting point (either big or small), but the contemporary term "start up" as a culture (especially in the tech industry) is an entirely different story.
For example, agriculture has been around for tens of thousands of years, but the modern Agricultural Revolution (between the mid 17th and mid 19th century) made it much more EFFECTIVE and EFFICIENT because of the increasing availability of labor and land.
This parallels the current tech industry. The discovery of silicon as a semiconductor makes everything much more efficient. And history has told us that everything will eventually be replaced with a more efficient version of itself (mostly from an economic standpoint).
But this idea of "startups" is going to be problematic if we glorify and romanticize it too much. It'll create another bubble.
I'd argue Shockley Semiconductor Laboratory was the real start of it.
Just ONE of the OG's
Fairchild was the start of Silicon Valley. It wasn't the start of startup culture. That goes back 850 years to the start of commercial banks, then termed money lenders.
One could even argue that it really goes back to the start of capitalism 3000 years ago, where individuals would go into business for themselves selling food, wares or services en masse. Like, everyone did it.
Fairchild didn't invent startup culture; it was itself a part of startup culture. Certainly startup culture was around in tech long before Fairchild. Think Alexander Graham Bell, Thomas Edison, Nikola Tesla, etc. All had their own startups that turned into famous, groundbreaking companies that are still with us today
And the Fairchild compressor is still one of the most sought after pieces of recording outboard gear in audio engineering. No computer has ever been able to recreate its tone.
@@tjmarx you misunderstood what they were referring to as startup culture.
Startup culture is the culture of new companies being fairly loosely organized more like a group of friends with a common project or like a college dormitory than a top down corporate hierarchy.
Start up culture is associated with terms like synergy, dynamic, open, free flow, creative, etc.
These are the kinds of companies that replace chairs with bean bags and stairs with slides to be slightly hyperbolic.
Sometimes I'm just amazed at the fact that content like this is free. Thank you, ColdFusion.
Edit: It's honestly hilarious how butthurt so many people are by simple words. You must have a great life 🤣
You are easy to please
@@alexanderv.8961 dude, I'm not your dad
It’s not free he puts an ad in the very front and other ads in the middle
@@Speedster189 have any of those ads convinced you to buy anything? If not it's nothing more than an inconvenience... Regardless, you don't HAVE to pay anything here
If something is free, YOU are the product
At the very beginning, and throughout this video, they confuse a "CPU" with "a CPU on a chip". The term CPU has been around since the 50s. It just means "central processing unit". All computers have them. Intel's big innovation was to put an entire CPU on a chip.
Yes, Intel made the first... possibly... there were others... microprocessor.
They also seem to confuse:
* printed circuit boards with integrated circuits(!)
* hard drives with magnetic core memory
* Altair 8800 with Apple I (built around 6502, not 8008 or 8080)
* PC sales with CPU sales... (there were > 25 different CPU architectures in the 1980s/90s)
Also nothing about the roles of Texas Instruments & Hewlett Packard in transistor and IC technology, or Datapoint & Zilog in CPU architecture and personal computers. They skip the pivotal 8086 and 286 and pretend most x86 processors were called Pentium ("synonymous with PCs"... sigh). Not a word about all failed Intel projects either (iAPX432, i860, i960, Itanium, ...).
Noticed the same, gave me instant toothache. Even more so when it seems that so few people notice it.
Advanced Micro Dachines
I'm no expert but I was thinking the same. Seems like a lot of misinformation and mix up of terms here.
7:15 AMD stands for “Advanced Micro Devices”
I just realized, now I want a _RYZEN_
Yep, Advanced Micro Machines are just little racecars with mods :)
"AMD = advanced micro machines"" pepeLaugh hes doesnt know
@@henrik1743 oh no no no
pepeLaugh this guy doesn't know
Intel Salesman: We have the hottest Chips on the market. (Quite Literally).
14nm: *exists*
Intel: “I can milk you”
Oh, so good.. Touche 😋 Don't be stealing anymore of my sarcasm.
They do have 10nm & 10nm+ you know.. :P
But I prefer AMD
Consumer: *powdered* milk doesn't count....
Intel and the rest don't measure the same way. Intel's 10nm is roughly equal to AMD's 7nm; it's visible at the transistor density level. Anyway, Intel needs more fabs (like all the others) to meet demand, so using others' infrastructure is what they should've done years ago
@@Kabodanki You can't measure nm in different ways. A nm is a nm. But they can often cheat with the numbers.
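For what it's worth, here is a rough back-of-the-envelope comparison using approximate peak transistor-density figures as widely reported in public process coverage; the exact numbers are assumptions on my part, not from the video or the comments above.

```python
# Rough comparison of peak transistor density (million transistors per mm^2),
# using approximate, publicly reported figures (assumed values, not official specs).
densities = {
    "Intel 14nm": 37.5,    # approx. MTr/mm^2
    "Intel 10nm": 100.8,   # approx. MTr/mm^2
    "TSMC 7nm":   91.2,    # approx. MTr/mm^2 (the node used for AMD's 7nm chips)
}

for process, d in densities.items():
    print(f"{process}: ~{d} MTr/mm^2")

# The point: the marketing "nm" label no longer maps to one physical dimension;
# Intel's "10nm" lands in the same density ballpark as TSMC's "7nm".
print(f"Intel 10nm / TSMC 7nm density ratio: {100.8 / 91.2:.2f}")
```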
I can recall that the original IRON-MAN comics always bragged that Stark's armor was powered by transistors.
Lol
And that it ran on batteries lmao
YES
I watch MANY channels here on UA-cam across MANY different topics, but I gotta hand it to you, you are without a doubt one of the BEST content providers I've ever come across. Your research is impeccable; your topics are unique & highly relevant; your editing, graphics & narration are spot on; and your content output is surprisingly frequent & consistent. I can't say just how much I value and appreciate all the effort you put in for the benefit of us viewers. Cheers my friend!
Yes but I'm definitely not sure about his recent apple video
One of the channles i deactivated my AdBlock on. Thats the least he deserves!
I totally agree
Sure
This video is disappointing for how badly fact-checked and half-heartedly researched it is.
It's always so disheartening to hear that if you look back far enough behind every major human achievement there was a terrible person who everyone despised but without them it may never have happened.
Besides being difficult to work with, Shockley became a proponent of racism and eugenics.
And who it may be here?
@@MJ-uk6lu William Shockley, the founder of Shockley Semiconductor Laboratory, which begat Fairchild, etc. You weren't paying attention.
@@briansmith8967 And how was he terrible? Poor personality? Maybe. But does that make a person terrible?
@@MJ-uk6lu I guess you didn’t see my earlier comment of how he was a racist and believed in eugenics. Pay attention.
This is very reminiscent of a documentary called “something ventured” one of my all time favs, can’t wait for part 2!!🥂
Share a link if its still available.
OK, I guess I can stop looking for it now.
Intel minds-sharing propaganda as it's best. 😬 😷
Dagogo has been known to copy other documentaries almost verbatim before
@@hatchetscoured history is still the same story; it's how you present it that makes it interesting
2021: "there is no reason anyone would want an atomic bomb in their home"
2050:
atoms are just a theory
I will come back to this comment
Hahahahaha
insert 2050's nano fusion reactors, lease the power of the sun into your families hands, for generations to come
@@Zero11s 🤦🏽♂️🤦🏽♂️🤦🏽♂️
It's simple, I see new cold fusion I click. Dagogo killing the informational yt game, can't wait for pt. 2!
it's simple, i see intel and laughing stock in the same title, i click
It is as simple as this
@@eldenyo it's as simple as that?
Had to like to disrupt the 666
@@furn2313 Thank you!
I'm a lifelong technology enthusiast who became interested in electronics as a child growing up in the 1970s. My electronic technician career began in 1985 after completing an associate degree program in electronics technology from DeVry Institute of Technology which later became DeVry University. The history of the CPU, microprocessor, integrated circuit, Fairchild Semiconductor and Intel, as presented here, is pretty accurate. A well made documentary to quickly present the birth and evolution of modern digital electronics.
ColdFusion should make documentaries. This is the best channel on UA-cam.
Not really too many mistakes and omissions. For example: Lilienfeld had working transistors (FETs) as early as 1927, and a patent in 1925. He was Austro-Hungarian. Also, Fairchild as a company has not existed since 2016, when it was bought by ON Semiconductor (which used to be Motorola). John B. Goodenough was instrumental in the invention of RAM; he should also have been mentioned. And last but not least, AMD is Advanced Micro Devices. Those are the technical screw-ups, not the grammatical ones like "could not be understated" instead of "could not be overstated".
How do they have 3 million subscribers and have such terrible audio quality? Seriously, spending $100 on a decent mic with a pop filter and soundproofing in their recording area could make it sound almost professional. Hell, even just hang some blankets on the walls or something.
CPU engineers get to say they literally engineer logic for a living
There is a provability and pureness to building with logic that is so much saner than dealing with, say, a politician or a surly bartender.
Never too big to fail. just more LIKely.
ah yes, my neighbour was a professor of logic
@@oldskooldriver9379 have you ever dealt with modern chips? You no longer bring pure logic onto the chip; it is more like designing a very power-efficient city of logic skyscrapers. It is all about bringing energy consumption down and spreading it in such a way that you won't overheat. Also, since sub-10nm manufacturing is such a diva, you have to plan for some of your skyscrapers not working due to manufacturing errors and plan accordingly.
Btw. that is why there is always a cheap GPU for every expensive one.
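To make the "some skyscrapers won't work" point concrete, here is a hedged sketch of the classic Poisson die-yield model often used for this kind of planning; the defect density and die areas below are made-up example numbers, not real process data.

```python
import math

# Sketch of the classic Poisson die-yield model: yield = exp(-area * defect_density).
# The numbers below are invented examples, not real fab data.

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Fraction of dies expected to come out with zero killer defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

defect_density = 0.002   # example: 0.2 defects per cm^2 -> 0.002 per mm^2
for name, area in [("small GPU die", 200), ("big GPU die", 600)]:
    y = poisson_yield(area, defect_density)
    print(f"{name} ({area} mm^2): expected perfect-die yield ~{y:.0%}")

# Bigger dies catch more defects, so designers add redundancy and sell
# partially disabled dies as cheaper parts -- hence a cheap GPU for every expensive one.
```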
@@keineangabe1804 that's super interesting. So GPUs are just very specific combinations of logic gates? Like, is a ray tracing core a very specialized version of a rasterization core?
Those last remarks, "I don't have a large team, just me and an editor that helps from time to time", got me in tears ;-;
My father was a young engineer with IBM in the 1950’s when he was recruited by Shockley to join his venture in California. My father declined Shockley’s offer and stayed with IBM. I think he made the right decision.
Depends on his skills and entrepreneurship. Picking a different path in life is a parallel universe with its own ups and downs.
Fun Fact: Pat Gelsinger was Intel’s first CTO. There is some hope that a man like him is the right person to bring Intel out of the mess it’s in right now.
Edit: Turns out he’s not *the* first. Correction’s in the replies
Oh
@@Custmzir oh what
They should bet big on RISC-V. Investing in x86 is probably pointless now.
@@thebritishindian1 they just now build hybrid chip
@@thebritishindian1 “is probably pointless now,” might be an understatement 🙃🙃🙃. We just entered into that next phase of power to performance liftoff.
Everyone: You can't get blood from a stone
Intel: But we can squeeze the blood from 14nm
The thing is, most 7nm+ nowadays are actually 14nm-sized with 7nm performance, instead of true 7nm.
@@ravenkk4816 well, modular is the word you're looking for. Yes, they're at 14nm size, but there is a moderate difference in size and performance between them, which is why Intel is struggling and AMD is surging. By the time Intel reaches 7nm, AMD's betting on 5nm. Lisa Su is truly a Jesus for AMD and it shows in their products, which I respect
What’s this 14nm?
@@yelectric1893 just think of it as the physical size of individual transistors .
Fun fact: 14nm is roughly equal to 70 silicon atoms side by side.
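Quick sanity check on that figure, assuming an effective silicon atom-to-atom spacing of roughly 0.2 nm (an approximation on my part):

```python
# Back-of-the-envelope check: how many silicon atoms fit across 14 nm?
# Assumes an effective atom-to-atom spacing of roughly 0.2 nm (approximation).
feature_nm = 14
si_atom_spacing_nm = 0.2
print(round(feature_nm / si_atom_spacing_nm))  # ~70 atoms side by side
```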
I was looking for part 2 then realized this was uploaded 2 hours ago.
Haha same! Cant wait for the 2nd part
Bharat, you're a conqueror. Really nice;;
lol
same ! :D :D :D
I worked at Intel, from 1978-80 and stood in line behind Ted Hoff, at a scrap pc board sale that the company held for engineers and technicians to get rid of the "dog" boards. (Broken). He was in line in front of me and talking to a younger engineer, whose posture told me he held Ted in high regard. So I consciously took a mental photo of his face and name. A few years ago I saw his face in a web page. That's him, I thought. It was a link to an obituary. I remembered his name too, so I read the article and it said that he was the creator of the first microprocessor. My gut told me that I was standing in front of greatness and it was wonderful to finally learn who he was. I was a test and repair technician working on their microprocessor development system. I fixed the boards I bought. 🙂
FAIRCHILD. What a name, considering the story behind.
Like they don't fucking care what you think
Fairchild Channel F
@@omniyambot9876 ???
Maybe they thought of the Fairchild name as irony for their nickname
@@omniyambot9876 yup hahah
Monopoly has made them forget why they're doing this in the first place
Yep! They became a financial institution.
intel will become nokia
@J Fz not only that. They have brand recognition
@@fearlessviper6407 I agree
@J Fz Only way to save themselves is to get into RISC-V or ARM chips for now. x64 is just dead and long overdue to be discarded.
It's unbelievable that all of this is made just by a single person. I mean how can a single person be capable of doing this much work for a video. There's the editing, recording and all that shit. And yet the content quality doesn't decline at all. Man keep on with your good work cuz we all love u and will support you.
Every youtuber got a group behind em
@@meghanachauhan9380 i think coldfusion is only run by 1 person and not a team...
I think people don't understand what you originally meant.
Video editing isn't very hard, but yes he does quality videos.
Check Lemmino. He’s a one man team.
Before watching I predict the end of the story reads " Then they stopped hiring engineers and started hiring bean counters"
This is actually happening. I am currently finishing up a PhD in computer systems, and doing research on CPU design, and I can tell you that NO ONE wants to go work for Intel. Even if they offer six figures.
The company currently has a reputation for just trying to milk their legacy products and buy up companies to kill competition. Look at what they are doing in the server market- every time a competing product comes out, they buy up the company and warehouse the product on the books. AMD is making some progress right now in that field, and Apple as well. However, it took huge amounts of money for those companies to overcome the near monopoly Intel has on the market.
The new CEO they just hired has been talking about this problem in fact, so I have hope for the company if he can follow through. If he can get the brains back in the company, they can surely succeed again.
Same as Boeing. It's weird that engineers get kicked out when companies succeed; after all, it's the brilliant product engineering that made them succeed in the first place. Seems weird that they so easily forget this
They only have 10,000 engineers and do you think they will hire accountants to design chips?
“Moore on this later”
Nice pun
You mean "noyce" pun right?
moore law is dead or is it XD
He threw that in a couple of times right?
@@SacredDaturaa eggxactly
@@SacredDaturaa robert noyce
Me: reads title
Intel executive: reads title
Intel: files lawsuit
Judge: I got no Intel on this.....
Get out
@@potatodroid2 U mad bro? :))))
Not sure why, but this was too funny. 😂😂
GOLD!! Thank you!
I used to have an Atari 2600... then a Commodore 16, then a Commodore 64. I am lucky to have grown up knowing these machines... I wish I could see just a short 200 years into the future to see how far we get.
I never clicked something LITERALLY SO FAST in my entire life
Why: Because ColdFusion barely uploads
Not true, for the quality of the content provided I would say Dagogo uploads often!
@@Parky_T he/she said "barely uploads" then why is it not true?
Literally? How?
AMD has left Intel in the rearview. It will soon be "who is Intel?" if they don't get it together.
I use the threadripper and it beats out the i9 that my wife has. Houdini and Adobe products run so much smoother and faster rendering on the AMD.
But when he does, the quality and information provided is world class.
AMD is out of stock, but Intel has plenty of laugh in stock
NOICE
Noice
I wouldn't be 'laughing' at a company such as Intel, given that YOU have done next to zero in your life, and they invented the practical CPU chip. All company fortunes go up and down regularly. Stop falling for bullshit tripe, and try to find yourself a life before it's gone.
@@cjay2 Found the fanboy
@@cjay2 it's true but you're bugging if u thought he meant it like that it's just a quick roast
They deserve this. Having people pay more for basically the same 4C8T CPU every year is karma.
Karma is inevitable in every aspect of life. It's just another example.
No-one is forcing you to buy a new one each year.
@@mikitz when the cpus ship with backdoors in them that can only be fixed by buying a new one then yeah they kinda are
I remember working in the warehouse for intel back in the 90’s and picking my first £1,000,000 order of pentium chips. Good company to work for.
When a video with a good title starts with a complete nonsense statement: Stopped watching.
- Intel invented the "CPU on a chip" - CPUs existed before
- NO, NO, NO - applications did not require customized circuit boards. Programmable computers (von Neumann machines) existed from the very start. Von Neumann machines and custom circuits always co-existed and still co-exist today.
- Intel's innovation was about cost reduction. The single-chip CPU was much, much less powerful in those days than the mainframe and multi-chip CPUs.
#KnowYourHistory #TheMoreYouKnow
Yep. ColdFusion gets many details wrong. It's understandable, because most people do not understand the semiconductor industry.
Same I managed exactly 21 seconds, if he gets something so fundamental wrong it's not worth me wasting my time watching.
Yes, I noticed the same thing, and put a comment in immediately. I also watched the rest of the video. And I think it's actually pretty good, despite that mistake. Heck, this is really a stupid mistake. I just looked up CPU in the dictionary. It says "abbreviation for central processing unit". Nothing about it being on a chip.
The point is that single-chip microprocessors caused an important advance in what kinds of things it made sense to do with general-purpose processors instead of using custom circuits designed for specific tasks.
0:36 "The impact of the invention of the CPU can't be understated." I'm pretty sure you mean it can't be overstated.
Yep you're right. This kind of thing drives me crazy in these UA-cam vids.
Up there with “could care less” instead of “couldn’t care less” drives a person batshite crazy lol
@@prepperjonpnw6482 i hate all double negatives in English. they confuse foreigners like me.
lol
@@mkl126 There is no double negative in "couldn't care less"
This is the one and only UA-cam channel where I even love the sponsor advertising.
I honestly thought you must have had a production team this whole time I've watched you. Amazing content and editing, really. Congrats on the 3M subs, well deserved.
"Advance Micro Machines - Even known as AMD" - Guess you mean Advanced Micro Devices
I also got lost there and was wondering how come it's advanced micro machines. It should have been advanced micro devices.
I guess he confused it with IBM (International Business Machines).
It is called '...Machines' in an alternative reality where Altraide is from... Lol.
Micro Machines is an SNES game; I used to play it.
18:25 He knows what AMD is. He showed a picture of when AMD started late in the video. 18:25
I just want to forget this episode so that I can watch it again along with episode 2 🔥
to be honest, this is definitely not news that Intel is falling behind. What's shocking is that they can keep doing that to the point that ColdFusion publishes a video about it. This means something.
7:17 Advanced Micro *DEVICES* - *AMD*
They're also referred to as Advanced Micro
micro machines made me giggle ngl
Yeah I was wondering if anyone else noticed that!
@@tim2tupman yes I did. How bout Pussycom @ 12:48 minutes!! Maybe he was too BUSY! Lol!!!
This video is aging really well, we are gonna need an intel comeback update story in a few years!!!
Not gonna lie, Title got me clicking the video fast.
Poor Intel
@@cc-000 yeah in india also
I feel like the craziest part of this is the fact that they went from computers the size of a fridge straight to these tiny chips, with nothing in between.
Not quite. There's an intermediate step. Mainframes had CPUs the size of a refrigerator. Then came *minicomputers* (now midrange systems) that shrunk the CPU to several boards inside of a unit the size of five pizza boxes mounted on a rack inside a cabinet of the size of a refrigerator.
@@1960ARC Don't care about your storybook. This vid is about tech development. The comment was about tech. There was never any need to shove myth and magic into this.
@@cobra6481 I was giving a response to the crazy aspect which is not technology.
Not liking my comment is your problem not mine. It was a snake that made man fall into sin.
@@1960ARC Care to provide any proof for any aspect of your comments?? 🤷🏻♂️🤦🏻♂️
@@1960ARC The snake you mean is REGRET. It could kill your happiness, but you could do evolution.
*Lisa Su has entered the chat
AMD : STONKS BIG
Intel was an admired company in my day, and it still is, as it revolutionized the way we do our jobs, making the complex simple.
Chips are a fascinating product. India does not have one decent manufacturer, and to grow as an economy we need a semiconductor industry to be established here.
The story is inspiring, like it is with the Apple Macintosh. Let's see the next episode. Cheers.
5:48 John Oliver did age VERY well! (2nd on the right)
@@alycia5532 if there's meant to be a joke there, I don't get it. sorry. xD (I'm german)
Hey, just letting you know that, that account isn’t me! Please always look for the verification symbol.
@@ColdFusion report it
Nahh.. He looks more like Messi with glasses...
@@ColdFusion I reported it, it was on my comment as well.
I don't know about laughing stock, 50% PC market share and 80% laptop market share doesn't seem that bad to me
From 85% to 50% in just a few years
But as the days go by, more of the pie is taken from them. Laptop manufacturers are going AMD right now, which means in the future more of the pie will be taken from Intel. Intel has already lost a big chunk of the pie, and with the rise of ARM chips like the M1, it would put Intel into deeper trouble if they don't make a move right now. Market share has no relation to being a laughing stock; I mean, everybody thinks EA is an evil company yet people still buy from them.
I have 22nm
@@groszak1 I'm on 12nm.
@@carjac820 true
I stopped whatever I was watching when I saw ColdFusion notification. It is that good.
Dagogo, there are few things as reassuring as a delivery as sober as yours. Thank you and congratulations, for grasping credibility so well. I look forward to reading your book.
"You are watching ColdFusion TV" has an altogether separate fan base!
Im addicted to that line too :)
@@PhilthySteel me too! 😂
Laughing stock is a bit of hyperbole, but they are definitely having a rough time
They're laughing alright. Laughing at record setting profits they just posted.
Well, they were purposely slowing down what they're actually able to do so that they would be able to produce chips for longer. They were worried about Moore's Law.
It backfired because the rest of the industry caught up and overtook them.
@@bgill7475 source?
@@ashdoglsu no.... that was AMD that posted record profits....
Did you watch Intel's last investor meeting??? Where they said they could not compete till 2022, if not 2023?
Or the investor meeting ~2 years ago..... where they said 10nm was delayed 6 months,
and then the meeting ~1 year ago.... where they said 10nm was delayed 1 year,
and then the meeting ~6 months ago.... where they said 10nm was delayed 1.5 years..... and 7nm is delayed 1 year....
Like.... you can't lie at an investor meeting.....
so when Intel says "we can't compete till 2022".... THEY MEAN IT!
@@ashdoglsu and why is it that the first 3 people they called to replace the current CEO declined?
Surely if Intel was so competitive and profitable.... people would be JUMPING at the chance to be the CEO!!!
And why would the CEO be leaving in the first place???
My respect for Intel has increased multifold after watching this video. I am thankful for the contribution it has made to the computer community, but I will still go on buying AMD because the present is different from history.
Just because some company/person/organisation was great in the past doesn't mean that you need to keep supporting them forever. While it's nice to acknowledge the past, it's the present and future that needs to be focused on, at least in my opinion. (This is an opinion that I try to get across to people, even though it could cost me personally!)
I work on an AMD Ryzen 5, and AMD is hands down the better manufacturer. They are undervalued, advanced, and put more efforts into their products rather than marketing, the perfect combination for a worthy purchase.
'part 1' damn, there's more roasting coming lmao
Calling them Laughing Stock seems quite hyperbolic to me
Last year, people were making memes out of every Intel product.
laughing stock in the DIY space at least
thats just to you then
Did you see how thin, powerful and cheap the AMD 4000 and 5000 series chips are?
They destroy Intel.
11th gen prices were just leaked and it seems that processors are worse than 10th gen and 50% more expensive... they're cashing in HARD because they sell out anyways, and Intel is laughing all the way to the bank... 😁
Narrated documentary-style videos about technology from the mid-1900s give me extreme feelings of nostalgia for some reason.
The only way Intel can rise again is by adopting the RISC-V architecture. When the whole world was moving to the ARM architecture, Intel was still betting on x86.
Forget it. The previous CEO Paul Otellini threw away Intels chance at mobile:
www.extremetech.com/computing/156126-intel-couldve-powered-the-original-iphone-but-decided-against-it-says-ex-ceo-otellini
Some of the best content on YT, period! And a salute for including advertising that's relevant to the subject instead of a hard left turn like most channels do.
Young kids of today's culture laughing at Intel need to watch this and see truly how they changed the entire world. Perhaps the peak of innovation and genius for their time
Intel is the definition of overwhelming innovation...
They are boring
How did you comment an hour before the video was released?
@@mydearfriend007 Patreon perks perhaps?
@@mydearfriend007 that’s what happen when you have the bell notification on
Was
The important part about the first microprocessor chip is that it followed the Von Neumann architecture. Though the 4004 was only a 4-bit machine, its programming followed a method understood by mainframe programmers. Bill Gates wrote the BASIC interpreter on a mainframe so that it could run on an Intel 8080, which was used in the MITS Altair: the first personal computer, on that Popular Electronics magazine cover. So even though the first microprocessor was a revolution, it more importantly wasn't too much of a revolution! Any engineer that understood the inner workings of a mainframe computer understood the first microprocessor.
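For readers who want to see what that shared mental model looks like, here is a tiny, hypothetical Von Neumann-style machine sketched in Python: one memory holding both program and data, and a fetch-decode-execute loop running over it. The three-instruction ISA is invented purely for illustration; it is not the 4004's.

```python
# A toy Von Neumann machine: program and data share one memory,
# and the CPU runs a fetch-decode-execute loop over it.
# The 3-instruction ISA here is invented for illustration only.

def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        opcode, operand = memory[pc]    # FETCH the instruction at pc
        pc += 1
        if opcode == "LOAD":            # DECODE + EXECUTE
            acc = memory[operand]       # load a data word into the accumulator
        elif opcode == "ADD":
            acc += memory[operand]      # add a data word to the accumulator
        elif opcode == "HALT":
            return acc

# Addresses 0-2 hold the program, 3-4 hold the data -- one shared memory.
memory = {0: ("LOAD", 3), 1: ("ADD", 4), 2: ("HALT", None), 3: 40, 4: 2}
print(run(memory))  # 42
```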
It's interesting how the most ingenious inventions are usually born out of fun and risk, and not out of some planned projects.
same with making friends
_"The impact of the invention of the CPU can't be understated..."_
:D
^ Imposter
"They were the first company to invent the CPU"
@@alycia5532 there is one imposter among us
That should read, 'Can't be OVERstated.' Americans' usage of English seems to always reverse the logic of this and other expressions, erroneously.
Last part was nice where you included what every person in the story is doing now
Intel didn't get into RISCy business for their flagship computing... 🤪
for what then? 🤪
Course not. They did it for the same reason everybody else did... Rebecca De Mornay.
Actually they did. Ever since they introduced their own 64-bit processors, they've used a RISC processor at the core and emulated the x86 ISA through firmware (i.e. the "microcode", which isn't actual microcode, just that internal processor's instruction set) ;-)
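As a purely illustrative sketch of that idea (the micro-op names and the split are my own invention, not Intel's actual internal format), a single CISC-style instruction that touches memory can be broken into simpler, RISC-like steps roughly like this:

```python
# Illustrative only: how a decoder might split one memory-touching x86-style
# instruction into simpler, RISC-like micro-ops. The micro-op names and format
# are invented for this sketch; real internal formats are proprietary.

def decode(instruction):
    op, dst, src = instruction.replace(",", "").split()
    if op == "add" and dst.startswith("["):          # add [mem], reg
        addr = dst.strip("[]")
        return [
            f"uop_load  tmp0, {addr}",   # read the memory operand
            f"uop_add   tmp0, {src}",    # do the actual arithmetic
            f"uop_store {addr}, tmp0",   # write the result back
        ]
    return [f"uop_{op} {dst}, {src}"]    # register-only ops map roughly 1:1

for uop in decode("add [rbx], rax"):
    print(uop)
```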
@@TheJamieRamone doesn't that impact performance in a negative way?
That is a very... very good pun
When ever I watch cold fusion videos I don't mind the ads
And why do we not have a movie about these guys? I would not be typing this, and you would not be reading this, without their contribution.
But wouldn't you rather know about the Kardashians instead? Jk. 😂
If that was the case. He wouldn't be here😂
@@yruhatin100 god damn that’s one stale joke
Especially a movie with Shockley character in it, hmm..
I would, using that old BBC computer. In a kinda fonty-font thingy typeset, but legible.
nobody:
Coldfusion Title: Ya roasted!
Like the chips
ltt, gn and other tech channels have been at it for a while.
"Part 1" hahaha
"The impact of the invention of the CPU can't be understated."
This should of course be "can't be overstated". Not sure if there's any way for you to change that part of the video. If not, I'm sure people still understand what you mean.
Wait I'm confused. Could you explain?
@@KentoCommenT It's a stylistic device called litotes. Instead of saying "like", you say not unlike for emphasis. In this case it's saying "can't be understated", instead of saying "can be overstated" which sounds obviously wrong.
"Can't be understated" is technically correct but redundant, because it assumes people are understating intel's impact in the first place, which isn't true.
@@crediblesalamander8056 thanks! english is not my first language so this help out a lot :)
@@crediblesalamander8056 It should say "can't be overstated". To use "can't be understated" is wrong. At least in this particular case.
Actually you're wrong. To say "it can't be understated" in the provided context is to say that understating its impact would be incorrect, which is grammatically correct. For example, "You can't say no to a great deal."
That was beautiful. As an electrical engineer passionate about the history of the industry, I can't thank you enough for the content
Yeah especially regarding the history of the Microprocessor!
Just to set things straight: the Intel 4004 was pathetically weak compared to the computers of its time.
It wasn't really until the mid 80s that microprocessors started to surpass the big room filling computers of the 60s in terms of performance.
The 4004 was designed for a calculator and wasn't very powerful, but that quickly changed. By the 8085 in the late 70s, microprocessors had become really powerful.
We used to run 2048-port Central Office switches on a bunch of 8085s @ 12 MHz in the late 80s, quite comfortably!
@@goofybits8248 Yes, but they were still 8-bit devices. I know there were 16- and 32-bit microprocessors appearing by the late 70s, but they were very weak numerically: no floating point, basic scalar architectures. So you're right that they were perfect for control tasks, but right up until the late 80s there was still a market for big scientific computers because microprocessors wouldn't cut it.
@@goofybits8248 also, wow i never knew there was a 12MHz variant of the 8085
Whoa the last time I came this early, I had to apologise
😂😂😂
legendary comment sir
🤣🤣🤣
@@alycia5532 whasap coldfusion
I still have the Intel catalog with the 4004, 8008 (the 8-bit version of the 4004), 8080, 8080A, and 64-bit memory chips. And I remember my professor at the University of Michigan telling his first-hand story of the ENIAC computer (he had worked on it).
@13:35 The Intel 4004 sold for $60 in 1971 (equivalent to $439 in today’s money - Nov 2022).
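(Quick sanity check on that conversion, assuming a US CPI of roughly 40.5 for 1971 and roughly 298 for November 2022: $60 × 298 / 40.5 ≈ $441, so the ~$439 figure is in the right ballpark; the exact number depends on which index and month you use.)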
"Hard times create strong men, strong men create good times, good times create weak men, weak men create hard times"
Intel, basically.
Whose quote is that? Real good!
Occasionally creating such sentences feels really satisfying, and for a moment I hoped that it was your creation. The quote is from G. Michael Hopf.
It probably fits many extraordinary companies of our time, like SpaceX. Intel helped AMD in the early days; maybe AMD can help Intel today.
@@DarkSkay how did Intel help AMD?
The term CPU is older than Intel, which "merely" made the significant step of putting a CPU on a single chip. Arguably, the major competitor to Intel is not so much AMD or Apple but TSMC, which leads the chip manufacturing technology field. Without this, AMD would not be storming forward; still, it was a great move for them to go fabless, since they lacked the major financial resources needed for what is now GF (GlobalFoundries).
TSMC can be considered the biggest competitor to Intel, but only in manufacturing chips. In designing them, AMD is their biggest competitor
AMD only went fabless because they were on the brink of bankruptcy, so they sold off their manufacturing division as GlobalFoundries. Maybe it ended up being beneficial in the end, but don't get me wrong, it was not intentional.
@@kamil.g.m amd almost went bankrupt because of inhel's dirty bribery of retailers and pc makers.
@@alsetalokin88 AMD almost went bankrupt due to their failure to compete at the time, you can't blame intel for that.
@@kamil.g.m do you understand that amd's superior chips (at the time) couldn't sell because of this? no one displayed their chips in store and no contract from pc makers. inhel paid money incentives (bribes) on top of selling them chips at a loss just to kill amd. amd sank to rock bottom and raking up immense debts. it's a miracle they could survive and revive. stop alternative facts.
‘It can’t be OVERSTATED...” not ‘understated’.
Also, it's Advanced Micro DEVICES, not Advanced Micro Machines.
I was looking for this comment.
Chill
0:36 'the impact of the CPU can't be understated.' You mean can't be overstated.
Busicom vitch: Where the logic?
Faggin: (shows block diagram made in one/two nights)
Busicom vitch: This bad! I hate it! Where the logic?
Faggin: Uh um...I don't have any...
Busicom vitch: You bad you bad!
Faggin: Look, I just got the job yesterday!
Busicom vitch: You late!
😂😂😂
"Advanced Micro Devices" - thus the 'D'
wink wink
AMD is top notch, Intel underestimated the market
We've started to make the switch to AMD for processor-intensive tasks without paying a fortune. People do forget, however, the AMD "incompatibility" years. Yikes on that.
Intel had poor execution; they didn't fail because they underestimated their foe. They spent lots of money to fight ARM and still failed.
@J Fz yes, Intel controlled the market, that's when they fucked up
I love AMD for desktops, but I find a lot of AMD's mobile offerings lacking sometimes, though that could just be that the companies I buy from short the RAM. 3 gigs of usable RAM on a computer running Windows, Hewlett-Packard, when RAM for laptops is expensive? I'd like to know what you were smoking and if I can have some.
this is the most underrated channel on UA-cam
Let's take a moment to appreciate the hard work of these engineers, thanks to whom we have attained today's miraculous technology 👏🏻👏🏻👏🏻👏🏻👏🏻
Yes, we live in a fascinating time, and looking back only makes us yearn for the future! Thanks for this wonderful video.
BTW: Intel did not invent the CPU, they created the first MICROprocessor.
@Jimbo Bimbo Not all CPUs are microprocessors. The guy who made this video seems to think the two terms are synonyms. At the time of the 4004, CPUs were made of separate components. In the PDP-11, for example, the CPU comprised several circuit boards within the computer. So, to answer your question, the 4004 is a CPU, but it wasn't the first CPU. It was the first microprocessor.
Also, the binary states of a computer do NOT represent the states of the transistors. The transistors are used to make logic gates (AND, OR, NOT), and the inputs and outputs of those are simple binary values. These gates are then combined to create a complex system of logic that performs calculations, and the data in those calculations are the 1s and 0s your computer deals with.
Although one transistor can represent and store a one or zero state.
@@thep751 A transistor doesn't store anything; it's just an electrical switch. Logic gates don't store anything either. You need a flip-flop or a latch for that, which are made from logic gates, which in turn are made from transistors.
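If it helps, here's a minimal toy sketch in Python of that last point: a storage element (an SR latch) built only out of NOR gates, which are themselves built out of transistors in real hardware. It's purely illustrative, not how chips are actually described.

# Toy SR latch from two cross-coupled NOR gates.
# The storage comes from the feedback loop, not from any single switch.
def NOR(a, b):
    return 0 if (a or b) else 1

def sr_latch(s, r, q=0, qn=1):
    for _ in range(4):                  # let the cross-coupled gates settle
        q, qn = NOR(r, qn), NOR(s, q)
    return q

q = sr_latch(s=1, r=0)                  # set   -> q becomes 1
q = sr_latch(s=0, r=0, q=q, qn=1 - q)   # hold  -> q stays 1
q = sr_latch(s=0, r=1, q=q, qn=1 - q)   # reset -> q becomes 0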
@@NationalSecessionistForces DRAM cell?
Your segue to the ad was just as impressive as your typically impressive videos. I didn't even realize it was an ad until the last 2-3 seconds.
I have a strange craving for chips.
Have some Lay's
No!.... Don't !!! Those aren't..... Yeah, he ate the chips.
I really love the first part, "You are watching COLDFUSION TV". It's very futuristic and appealing to hear and see.❤❤❤
@@alycia5532 FAKE
Dagogo, you are amazing! I don't know how you do it, but keep it up and make it last.
Your research skills and storytelling are amazing (for lack of a better word).
I'm really excited to hear you have a book. Over the last two weeks I have become completely hooked on your programming! Very fascinating and insightful.
Thanks for the watch! Waiting on the edge of my seat for part 2!!
Fun fact: Warren Buffett had a chance to invest in Intel as a startup valued at $1 million at the time. He refused.
Buffett is against taking risks, and I don't blame him. Look where Intel is now; he made a good decision.
@@KallusGarnet but Buffett would have made tons of cash before Intel went downhill.
@@KallusGarnet You clearly don't understand how anything works.
@@KallusGarnet LMAO "BUFFETT IS AGAINST TAKING RISKS"
The traitor movement is led by someone called Robert Noyce!
“Noice”
Was looking for this comment before I comment 😂" noice "
@@Darthwarrior
Be my guest
“Noice”
No love for Intel...
@@nunyabizness199
Nani?
Noice
Whenever people summarize how transistors made CPUs possible, they always say something to the effect of "they act as switches, which are 1s and 0s, and by packing them all together CPUs can then process billions of these binary digits"... but beyond being switches/binary digits, it really needs to be added that "when placed together in groups, with certain ones being controlled, they can perform calculations". That small tidbit is the missing layer that really helps explain (still on a very high/simple level) *why* these things, when put together, make a working CPU.
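For anyone who wants that missing layer spelled out, here's a tiny toy example in Python: a 1-bit half adder, i.e. a small group of gates that literally performs an addition. Illustrative only, not how real chips are specified.

# Toy half adder: two logic gates grouped together already "calculate".
def AND(a, b): return a & b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    return XOR(a, b), AND(a, b)       # (sum bit, carry bit)

print(half_adder(1, 1))   # (0, 1)  meaning 1 + 1 = binary 10
print(half_adder(1, 0))   # (1, 0)  meaning 1 + 0 = binary 01

Chain enough of these (full adders, multiplexers, registers and so on) and you get the arithmetic unit of a CPU.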
Fairchild: Oui tickle your parts! Oh, the corniness of engineers.
I'm sorry, I laughed at this....
Not gonna lie, Fairchild's logo is timeless.
Thanks for not lying
Having worked at Intel for many years, it wasn't hard to see problems coming up quickly.
All courtesy of Israeli Intelligence via selling technology to China/Russia...
Mind expounding, my friend?
@@tobiramasenju6290 He probably can't because of NDAs. NDAs are no joke.
What department did you work in?
@@Xabier2020 deception
The semiconductor was one of the most amazing creations of the 20th century.