You should do a video on how most USB-C cables are actually USB 2.0, especially the charging cables that come with most phones. I mistakenly assumed any USB-C cable was USB 3.0 and wondered why file transfers were still so slow.
Before the IBM PC, there was another CPU maker that was in just about all the computers people could actually afford. In the early to mid 80s these chips vastly outsold the IBM and clone machines. Yes, it was the 6502, and its variants, from MOS Technology. The 6502 was in the:
Commodore 64
Commodore PET
Apple II series
Atari 8-bit line
Atari 2600/5200/7800
Nintendo NES
BBC Micro
So the early history is quite a bit more complicated. In the early days in the 80s the 6502 machines vastly outsold the IBM clones. PCs won the war for a variety of reasons, but if history had worked out a bit differently, we might all have descendants of the 6502 in modern computers rather than descendants of the 8088.
@@MaddTheSane This would only be the case if the title did not say "Why are there only TWO CPU companies" which means Intel AND AMD. And since they are talking about why there are no other CPU companies, it stands to reason that they mention the other major CPU manufacturer of the 80's and 90's (exclusive OEM to Apple) that didn't make it since that is what the whole video topic is about.
Together AMD and Intel have like 99.9% of the market share. Other companies exist but they aren't relevant and you won't find their parts at Microcenter or Best Buy.
Seems like Intel and AMD have the same kind of professional relationship Pepsi and Coke have. They're "rivals," but they also know that without the other there's no fun competition driving each other's prices down.
I liked that red vs blue reference at the end
Only reason I liked the video
Especially since I was disappointed a second earlier when he said politics... it made the RvB reference much more satisfying.
@@La_sagne Just wait until the second joke is also about politics. That joke aged pretty badly.
I like how the top comment always spoils the best joke
same. also, as of writing this reply, the comment has exactly 117 reactions. I stayed my hand from liking the comment only so I don't ruin this number
"The 1981 IBM Personal computer was the first PC."
"It blew it's competitors out of the market."
Lol I noticed that too
My first PC, in 1991, was a Philips 386SX: 16 MHz, 40 MB HDD, 4 MB RAM, running MS-DOS 4.0 and Windows 3.0.
It was the first computer that made major inroads into homes, other than the Commodore 64.
I interpreted that as meaning the first of its family, what used to be called the “IBM-compatible PC.” Yeah, it’s not literally the first PC, but it’s the first device of the IBM-compatible PC family and was named, “IBM Personal Computer.”
I had a bigger disagreement with the “great value for money” part. The IBM PC didn’t offer great value, which is how Compaq and Dell and so on were able to undercut them so quickly. It was a great financial success because, before IBM entered the market, PCs were great but untrusted. People didn’t want to put their business on a device that cost $1500 (Commodore 64, inflation-adjusted) to $5000 (Apple II, inflation-adjusted) and looked like a toy and came from a hobbyist company. But when serious company IBM came out with a ($4000 inflation-adjusted) device without frivolous bitmapped graphics, then that validated the concept for business users. Thus disproving by example the capitalist theory of the efficient market.
@@biennium992 Great narrative, but I would suggest citing the original price and year, so people can do the inflation adjustment themselves at whatever time they read it.
I like the fact that AMD managed to deadlock Intel into an agreement that ensures they can't just pull the rug out from under them. Intel owns the x86 instruction set and licenses it to AMD, but AMD owns the x86-64 architecture (a.k.a. AMD64) and licenses it to Intel, and Intel's attempts at an alternative 64-bit architecture, like Itanium, failed miserably.
OR maybe (tinfoil hat on) AMD was artificially kept alive and allowed to live just so Intel doesn't trip into US anti-monopoly law which would crush their profits like rent-control .... neah, that's just tinfoil hat bullshit, no way someone would pull anything mischievous for the sake of profit .... large companies never do that.
I suppose you're generally correct, but your math doesn't include approximately 6+ years... the K6-2 (which came out in 1997) was superior to Intel's Pentiums for a few iterations, plus AMD was first to 1 GHz (1999) and first to 64-bit (2003). AMD's multicore strategy was a comparative failure, but Intel took forever to move off 4 cores; if AMD had been able to capitalize on their technical successes more, they would have been competitive sooner. I don't feel like doing the research, but I think you didn't count some other years. When Intel was behind, they maintained mindshare, and likely spent $$$ to aid that; who is actually ahead isn't known or noticed by the general public. Just as Thomas Edison was a great but still inferior engineer compared to Westinghouse, yet most people know the former because of his publicity stunts, etc., and he apparently took some of his engineers' credit. The truth is not easy to arrive at in this world of cheating and collusion.
@@whiteknight7wayne493 The K6-2 beat Intel at one or two price points; Intel ruled the rest. Then after AMD bought ATI, it started to bungle everything, almost died, had to sell off its HQ and its fabs, and was saved by 1. Jim Keller handing them Ryzen (for a fee); 2. the sudden growth of crypto mining, which sent GPU prices and volumes to Mars; and 3. Intel getting so arrogant, complacent, insular, and clueless that it couldn't make any progress at all for nearly a decade.
Similar to how Coke and Pepsi are "rivals" but in reality depend on each other for marketing and popularity
Eventually they'll just merge into 1 company. Governments will find some BS excuse why it's not a monopoly, and that'll be that.
plot twist: the guy is Linus in disguise but more chill
It's a deep fake, secretly the person behind is Linus
Ye
You can tell by the LTT sweater
and funny too
It’s Linus but in the 90’s and funny
1:49 After almost my entire life, I finally know what AMD stands for (Advanced Micro Devices).
Level Up: Intel stands for Integrated Electronics.
@@drright71 nice lol lol
A simple Google search would have fulfilled your quest long before.
@@drright71 *mindblown *
@@md.mehedihasan9348 The thing is, I didn't even know that AMD was an abbreviation.
A couple of years ago: "why is there only one CPU company?"
I'd like to think we've made 100% progress
Underrated comment
I built my first AMD rig in the early 2000s.
Pretty sure AMD has been around for a long long time.
@@DiarrheaBubbles Hasn't been relevant for a while, though.
@@jklusky2425 that's literally refuted by the video tho ;p
200%
fun fact: AMD used to be team green :-)
Can confirm, my stock cooler on my FX-8120 has the green logo. And also runs like shit.
AMD is basically Intel.
Nvidia was the bigger green brand at any time.
ATI was team red. AMD bought team red in 2006.
@@sharcc2511 I also have an 8350 stock cooler. That thing is hilariously bad. It seems like they built it like shit on purpose.
A "low-cost, but powerful" Intel CPU. How times have changed.
"...because of their high performance per watt..."
Actually things were not that low cost back then with floppy drives costing near $300 and an average Apple II running in the $5000 range. Of course DIY PCs did not exist back then to purchase individual CPUs.
They are actually low-cost but powerful now that AMD is king and has kicked prices up by like 50-100 dollars.
Intel chips are way cheaper than the amd counterparts in my country right now and they're actually flush with stock. Alot of people moving to Intel right now for how much cheaper it is and will be for a good few months to a year.
And yet, if you look at what you are feeding into it, it's comparatively quite lame.
TL;DR: Everything but AMD and Intel sucked so badly that they died. The end.
Hey, it's me two years from the future!
Don't take the TL;DR too seriously, it's a SpongeBob reference.
Thanks saved minutes of my life
Thank you
Oh well, I'm gonna watch it anyway
Thx
@@abagel186 welcome
As a computer science student, I think part of the reason they've taken over the market is that it is VERY hard to make a better CPU. If you already have a base, a dedicated team, and resources, it is easier to stay in the market than it is for other companies to enter it, as they won't have the baseline of equipment and reputation.
As a fellow computer science student, I agree with you. It's actually very, very hard to manufacture CPUs cheaply. And creating a new CPU architecture is nearly impossible lol.
As a dentist, I agree
As a highschooler who wants to pursue a field that works with computers, yes
As a human, I totally disagree. Licensing should only have to be purchased until a certain time has passed. After that there is no reason why they can't just copy and paste. There are massive potential profits, and I get the feeling politics is stopping the copy-and-paste from happening.
@@thevindictive6145 Well, Intel did this with the 8088 processor (I think it was that one), and now we have 2 companies ruling the market. Also, Intel's first microprocessor was made exactly 50 years ago, so even with the best patent/copyright laws they may have held patents on it for a very long time, and it is known that companies try to "extend" the lifetime of a patent while nothing is done about it (like printer ink and insulin). In the end you are right: releasing the patents would make it easier, but I still think it would be hard.
Yo I love this guy. His voice is like BUTTER
Still a ripoff from that guy from ltt though.
@@azyrael96 ikr, these guys have literally no shame in stealing voices🙄
Too sweet, even YouTube thinks he's speaking Korean.
@@azyrael96 you are kidding, right?
@@timothynye438 of course he’s kidding
Meanwhile the Chinese happily make x86 CPUs with the old Cyrix/VIA license.
lol
It is looking like x86 cpus will be obsolete soon with the rise of ARM. But I hope that RISC V CPUs replace ARM eventually.
@@GeoTechLand advancements in virtualisation will render cpu architecture obsolete for personal computers.
@@anona1443 > cpu architecture obsolete
That's... not how that works at all.
@@EDToasty I actually meant "immaterial"
Imagine if Qualcomm randomly said that they're gonna start making a PC cpu, the earth would shatter
If Apple hadn't switched to ARM for their M1 chip, I think Apple would have been the one, though. Lol.
@@WhyteLis21 The only problem is price, because... ya know, $1200 for a phone with worse specs than an $800 phone, and NO CHARGER. An Apple CPU would be blueprints only.
@@N3345 👍😁
@@brandaccount4968 Well then, you better keep up. It's not 2012, it's 2021! 😆
Didn’t Qualcomm make the Surface cpu?
*"Red pill or blue pill"*
This shit is relevant everywhere
L vs Light
Literally
Purple pill or gay pill (RGB pill)
Recommended: gay pill
Red wire or blue wire on bomb
Not in the graphics card domain.
I love how Zilog (maker of the Z80 processor) is technically still around and being used.
TI graphing calculators, right?
The IBM PC was anything but affordable at the time. We have the clones to thank for most of the popularity :)
Thank you!
The first IBM PC was nothing like what's described in the video; it was clones and brand recognition that made it popular, and it still took almost 10 years for it to become popular in the home market.
The Gang of Nine made it a thing.
Yeah, it was the fact that the hardware could easily be replicated, and the small company known as “Microsoft” was allowed to license the OS to other companies.
@Scar. The IBM PC wasn’t THAT popular when compared to the IBM compatible (or clone) PC’s
@Scar. I'm saying that IBM may have been the ones to invent the overall PC architecture (the overall platform, not the processor ISA), but theirs were just overly expensive business machines (not surprising given their name: International Business Machines Corp.), not really intended for the home market, before Compaq and several other companies reverse-engineered their platform and made their own machines. IBM made a big mistake by using off-the-shelf components, so the other guys just had to implement their own BIOS that was compatible with the software used by the IBM PC. There's so much more to the whole story, so I highly recommend you spend some time reading up on it. :)
Nobody:
Subtitles: Korean
Edit: erm, they rolled out their own subtitles, so the joke can be considered dead.
nobody nobody tired of this
And... the Korean is gibberish too. It doesn't even seem to be a phonetic transcription, just word salad.
Well, I'm Korean and can at least read it, but it's just words... it doesn't match.
😂
@@JanghanHong it's trying its best, some parts are phonetically similar
One economic principle: Economies of Scale.
Yeah. Silicon engraving machines are expensive.
Also, patent trolling to hell. Trying to make an x86 CPU without getting trounced by Intel and AMD patents is quite impossible. Just ask Nvidia.
@@dan_loup Another Economic Principle: Regulatory Capture.
Not how it works lol
@@devinotero1798 Hahahahahehehehehhohohohowhohowhwohwho 😅🤣🤣😅😂😂😅😅🤣😂😂😅🤣
Riley: Why are there only two cpu companies?
Apple: Who said there were only two?
When apple started to make pc CPUs
@@somabiswas2629 Don’t you mean Mac cpus
@@hexados7479 I mean processors not only for the Mac or iMac, but for many PCs, the way Intel does.
@@somabiswas2629 True. They only make CPUs for Macs, and you can't build a custom Mac.
The Hegelian dialectic. That's why.
(And I'd like to see you build a Mac by yourself.)
Hey! Don't forget Zilog's Z80! The one with built-in support for dynamic RAM. And Fairchild made one CPU often used in early TV game consoles, the ones with cartridges; I think that one was called the F8. Interesting, since the memory chips contained the address register and the CPU just issued a base address and then incremented or decremented it.
"Two CPU companies"
Correction: *Two x86 CPU companies* .
Correction: Two CPU companies that make CPUs that actually work for desktops.
@@wta1518 You have to count laptops too.
@@saricubra2867 Sorry. Two CPU companies that make CPUs that actually work for desktops or laptops without emulation.
@Jobins John Most, not all.
@@wta1518 LOL, the 68000, what an insignificant blip.
When he said "Please colonel, I'm married", I felt that
Really. You felt it.
@@stonethemason12 It's devastating. You're devastated right now.
Remember, KFC made a dating game with the colonel.
@@stonethemason12 Come on, you know it's hip when kids repeat things they constantly see others make in comments like "I felt that" or "plot twist" or "we need to protect him at all costs"
@@jackalsandwolves3693 I was trying to forget about that
3:24 After the "now, of course" I expected Riley to say "its time for the quick bits"
it's*
@@CreeperPookie how old are you? You seem pretentious
If memory serves, Intel wasn't too keen on licencing x86. It took AMD and Cyrix reverse engineering early Intel chips.
And I think IBM x86 CPUs were just rebranded Cyrix ones, because Cyrix desperately needed a fab to make their chips and IBM basically said "Sure, we'll make your CPUs, but half of them are going to be IBM CPUs," and they didn't really have a better option.
No, it took IBM *requiring* a second source - which ended up being AMD.
@@bricefleckenstein9666And Intel bungling that license agreement.
@@blairhoughton7918 More that Intel didn't worry about having AMD as the second source, once they decided to put up with it due to IBM pressure.
@@bricefleckenstein9666 They accepted it then, grudgingly. But once that expired and IBM wasn't driving, Intel sued to stop AMD, and the courts exposed how badly they'd bungled the original agreement. Probably cost them about a trillion dollars over 30 years.
@@blairhoughton7918 The third party agreement didn't EXPIRE.
Intel sued to END it, as IBM was no longer worrying about enforcing it.
The reason for the duopoly/triopoly is very simple and understandable:
Because we allowed competition in markets to be destroyed, and any new third-party development process/company to be slowed down (or completely stopped), through the introduction of patents/licenses.
Protip: there aren't only two CPU companies, even for x86. Vortex86 still exists, for example. VIA might still be alive too. Elbrus 2000 CPUs can also run x86 software through binary translation for any OS (sketched just below this thread).
Yes indeed, but they are not relevant in the Consumer PC space :)
I thought VIA was Nvidia now, or is that just me?
@@arthemis1039 Elbrus WILL be relevant in the consumer PC space. Some russian youtubers already have access to them and are able to play AAA games on them.
@@sharoyveduchi I hope you are right !
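For anyone wondering what "binary translation" actually involves, here is a minimal sketch in C. It is not Elbrus's translator or any real product, just the simplest possible fetch-decode-execute loop for a made-up three-instruction guest ISA; real binary translators go further and cache translated blocks as native host code instead of re-decoding every instruction.

```c
/* Toy illustration of running "guest" machine code on a host CPU.
 * Made-up 3-instruction ISA; not any real product's translator. */
#include <stdio.h>
#include <stdint.h>

enum { OP_LOADI, OP_ADD, OP_HALT };  /* hypothetical opcodes */

int main(void) {
    /* guest program: r0 = 2; r1 = 40; r0 += r1; halt */
    uint8_t code[] = { OP_LOADI, 0, 2, OP_LOADI, 1, 40, OP_ADD, 0, 1, OP_HALT };
    int32_t regs[4] = {0};
    size_t pc = 0;

    for (;;) {                        /* fetch-decode-execute loop */
        switch (code[pc]) {
        case OP_LOADI: regs[code[pc + 1]]  = code[pc + 2];       pc += 3; break;
        case OP_ADD:   regs[code[pc + 1]] += regs[code[pc + 2]]; pc += 3; break;
        default:       printf("r0 = %d\n", regs[0]); return 0;   /* prints 42 */
        }
    }
}
```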
Both Apple and Microsoft are moving to Arm. So in a couple of years we will have many more options for the PC
RISC-V is on the horizon, and might change the Intel-AMD duopoly. It's an interesting time we live in
Please, I have a lot of hope for a future with RISC-V.
Eh, it would need to implement or emulate x86, which has a licensing issue that courts haven't pronounced themselves on yet. Although hopefully Apple's M1, which also runs x86 code through translation, might give them the right nudge.
If I was AMD or Intel, I'd be researching and developing RISC-V for a post-x86 world.
Haha good one
@@Blaze6108 All x86 patents expire in 2026, I believe. That means full emulation is possible then. RISC-V seems to be the future of all CPUs, including IoT, cloud, and desktop, if the funding is there.
Because nobody else has enough cash and/or the right connections for the hardware to make them
ua-cam.com/video/dQw4w9WgXcQ/v-deo.html
Unless it's Elon musk ofc :p
Imagine if he could though oml
bruh
More like licensing. The idea that implementing an instruction set would be a copyright or patent violation is completely insane to me, but because it is legally untested ground, no one wants to take the risk, and the judicial branch has never taken a position despite how obviously important this is. Government inaction is great, isn't it?
That is not correct. Cash is not the prime factor here.
The issue is that the x86 architecture is needed, and it is licensed by Intel. So yes, you need money, but even then you need a ton of it for all the engineers and whatnot. You can also just make RISC-V CPUs, or ARM CPUs. The problem is that they don't directly support x86 programs. Their instruction sets are different; they use a different basic language for the CPU, if that makes it easier to understand.
Every program needs to be compiled for them, with fingers crossed that you won't have to make massive changes.
But even just looking at x86, we had problems with AMD Ryzen, which for some programs took years to fix, because of differences like the Infinity Fabric.
It's not just money. It's a lot of weird things.
And the problem is that x86 is a very old (and shitty, imo) architecture that is owned by one company, which theoretically can be forced to license it to others, but the cost of entry is just way too high.
If it were an open-source architecture like RISC-V, that would make it easier for competitors to arise.
Which also makes me wonder why Apple didn't go this route.
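To make the "different basic language" point concrete, here is one trivial C function and the commands to compile it for two ISAs. The cross-toolchain name riscv64-linux-gnu-gcc is an assumption (package names vary by distro); any installed RISC-V cross-compiler works the same way.

```c
/* isa_demo.c -- one C source, two machine languages.
 * Assumed toolchain names (vary by distro):
 *   gcc -S isa_demo.c -o x86.s                   # native x86-64 assembly
 *   riscv64-linux-gnu-gcc -S isa_demo.c -o rv.s  # RISC-V cross-compile
 * Diffing x86.s against rv.s shows entirely different instructions for
 * identical source, which is why an x86 binary can't run on RISC-V
 * without recompilation or emulation. */
int add(int a, int b) {
    return a + b;   /* x86-64 emits e.g. leal/addl; RISC-V emits addw */
}
```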
Lol at the subtitles dubbing “non-x86” as “NaN x86” at 3:42
So what's funny? ...Mistakes are everywhere.
@@anmol9886 Still funny to me.
You guys should do a video on the other processors that made personal computers popular, like the Zilog Z80, MOS Technology 6502, Motorola 68000, and even ARM itself. Intel wasn't the only CPU manufacturer in the market back in the 70s and 80s.
You should check out the channel LowSpecGaming; it covers all of the processors you mentioned (including ARM) other than the Motorola 68000.
Wait. Isn't Cyrix pronounced like """"Sairix""""?
Who are you asking?
Samir Naga... Naga ... Naga... Not gonna work here anymore, anyway
See-rix
I knew someone who worked for them, he pronounced it Sigh-Ricks. And yes, I am old AF.
LOL I don't think anybody knows
I've heard all 3:
sir-ix
seer-ix
sy-rix
To this day I was convinced it's pronounced "Ci-Rix," like the Y in Cyprus is an "I". Was I wrong my whole life?
No, just some people are too lazy to find the correct pronunciation: ua-cam.com/video/Malr_1Gx62Q/v-deo.html
We should retaliate by pronouncing Intel as "EYEN-tel" (as in Einstein) the same manner as Hank Marvin used to pronounce Jean Michel Jarre as "Gin Mitchel Jar... Eh" because Jarre himself pronounced Hank as "Onk" :)
This reminds me of how I say the company Asus' name as Ay-suss (A + Suss, as in sustainable) instead of Linus' Ay-Soos. What Linus says sounds odd to me.
That "Seericks" pronunciation made me question my tech trivia knowledge base for a moment... 😅
Incidentally, in my neck of the woods Asus is usually pronounced "Ah-Soos" due to language convention... 😅
I think it just depends on your region/country.
Heard a guy the other day pronounce Linux with a long "I" sound like Linus except with an "X"
YouTube: this video has 13 comments!!
Me: can I see them?
YouTube: No.
@@applesilicon6863 The algorithm is so drunk that it's influencing YouTube's code.
@@applesilicon6863 yeet
On mobile it can mismatch comments with the wrong videos, it's pretty weird for a few minutes while you try to figure what the heck people are talking about.
Simply put, making CPUs is hard. Very hard. It is said that Intel always aims to make the best of their CPUs, e.g. the i9, in each gen, because only 10% of the silicon dies are fully functional i9 CPUs. The rest get discarded or repurposed (binned and sold as lower-tier chips). Photolithography is such a delicate process that it often goes wrong.
I can also attest to how hard even designing a CPU is. In my university, my electronics professor used to make his students assemble a simple ALU (the section of a CPU dedicated exclusively to doing math) on a breadboard as a lab practice. He says he stopped doing it because students burnt a lot of the ICs needed for the practice. It's not that the uni couldn't afford to replace them; it's that it happened every single semester. That serves as a testament to how easy it is to get it wrong.
And this is without circuit optimization. Our CPUs would be over twice as large and expensive if not for circuit optimization.
I'm never surprised that no one ever competes with Intel, AMD, or the other microchip companies. It's not just that there's a lot of expensive paperwork in the middle (patents, copyright, trademarks, licenses); it's also that making a CPU is one of the most difficult tasks in existence.
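For scale, here is roughly what that breadboard exercise computes, sketched in C: a one-bit ALU slice (AND, OR, and a full adder built from gates) rippled four times into a 4-bit ALU. The opcode numbering is made up for the illustration.

```c
/* Gate-level sketch of a 4-bit ALU: op 0 = AND, 1 = OR, 2 = ADD. */
#include <stdio.h>

/* one slice: two input bits plus carry-in; returns result bit, updates carry */
static int alu_slice(int a, int b, int op, int *carry) {
    switch (op) {
        case 0: return a & b;                        /* AND gate */
        case 1: return a | b;                        /* OR gate  */
        default: {                                   /* full adder from gates */
            int sum = a ^ b ^ *carry;
            *carry  = (a & b) | (*carry & (a ^ b));  /* carry-out */
            return sum;
        }
    }
}

static int alu4(int a, int b, int op) {
    int carry = 0, out = 0;
    for (int i = 0; i < 4; i++)                      /* ripple through 4 slices */
        out |= alu_slice((a >> i) & 1, (b >> i) & 1, op, &carry) << i;
    return out;
}

int main(void) {
    printf("5 + 9 = %d\n", alu4(5, 9, 2));   /* 14 */
    printf("5 & 9 = %d\n", alu4(5, 9, 0));   /* 1  */
    printf("5 | 9 = %d\n", alu4(5, 9, 1));   /* 13 */
    return 0;
}
```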
The "How did this happen?" in the beginning remembered me of bill wurtz's "history of the entire world, i guess".
I remember AMD struggling for years to compete with Intel. Now AMD processors are so powerful!
I remember Cyrix too... it's been a long time since I heard or read that name.
Original title in case it got changed: Why are there only two CPU companies?
They do seem to be renaming video titles pretty often these days.
Just seeing what works best with YT analytics
you can also see the original title if you have notifications on
Why would it get changed?
Please fix your stuff, YouTube. Nobody wants the title-change meta; the only reason it exists is because your algorithm wants it to exist.
I feel like I still don't know why there's only two...
Because there are many more...
Yeah, it barely touched on it, but licenses, and probably that it's a very capital-intensive industry.
@WhatTheHell AmIDoingWithMyLife The US cannot forbid other countries from making CPUs, even x86 ones; China produces x86 CPUs for its own internal market using old Cyrix/VIA tech. The real reason for so few CPU makers is that modern CPUs are VERY VERY hard to do. Not everyone has the tech and the people to do it, whether they cared about patents or not.
Because programmers and operating-system developers used to code for the CPUs of these two companies only. So basically, a CPU without a programmer is useless (see the sketch below).
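A small illustration of that coupling between software and CPU: portable code routinely branches on compiler-defined architecture macros. The macro names below are real GCC/Clang predefines; everything else is a made-up example.

```c
/* How software ends up tied to a CPU family: per-architecture code paths
 * guarded by macros the compiler defines for its target ISA. */
#include <stdio.h>

int main(void) {
#if defined(__x86_64__)
    puts("Compiled for x86-64 (Intel/AMD)");
#elif defined(__aarch64__)
    puts("Compiled for 64-bit ARM");
#elif defined(__riscv)
    puts("Compiled for RISC-V");
#else
    puts("Compiled for some other ISA");
#endif
    return 0;
}
```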
The American implementation of capitalism privileges the convenience of the consumer rather than the health of competing companies. That creates strong winner-takes-all dynamics.
Intel created the x86, and once they grew big enough not to need second-source manufacturers anymore, they took arguably illegal steps (anti-trust law violations, lots of lawsuits and investigations and multi-billion-dollar deals to resolve them) to ensure that companies that wanted the fastest x86 processors couldn’t go to other companies for the not-fastest processors.
AMD limped along on the back of third-tier manufacturers and PC enthusiasts, especially when they merged with NexGen in 1996 and started making processors that were competitive with the Intel processors of the time. The K7 Athlon was sometimes faster than the fastest Intel processors, and was a major turning point. It enabled them to overcome Intel’s monopolistic hold on the PC manufacturers. But AMD is still far smaller than Intel, not capable of producing nearly as many chips.
Basically, Intel’s aggressive and arguably illegal tactics starved the other x86 chip makers so they couldn’t invest the money to keep up, and AMD managed to hang on long enough for the anti-trust cases to resolve in their favor.
Daaaaaamn! Riley coming in with that Linus level segue into the sponsor! Well done, my man.
I hope there's more CPU companies in the future.
There already are. Everyone's phones have CPUs. Almost all of them Arm cores in Qualcomm SoCs fabbed by TSMC, but, hey, it's at least one...
*Arm joined the chat*
Back in the day we pronounced it Ceye - Rix
Nvidia saw this video and didn't get AMD, they got MAD.
I was hoping you would actually discuss some of the "actual manufacturers" and Fabs and how they are "rebranded" as another manufacturer.
When I clicked on this video I was like: why is Linus voicing over this video?
Sameeee
@Hexor nah
3:54
I see a Red vs Blue reference, I get happy.
Is it just me or does the background look very Team Green lol
According to YouTube (and its automatic subtitle system), this video is in Korean.
Holy cow thanks for pointing this out
yea that is so funny
At least they're doing auto captions again. I thought they were giving up on that, as most videos I've been watching lately don't have them, even if they have no other captions.
Here I was thinking that the Pornhub vs. blind people suit would result in sites having to be more accessible.
@@nobody7817 Not at the time when this video was released. It was only Korean.
At least it isn't another video with Vietnamese auto captions.
I would love to see the two companies fight it out in a game of TF2.
You already know why.
Back in the Socket 7 era, AMD, Intel, and Cyrix used to be friends and lived happily together. Then came Slot 1/Slot A, and the rest is history.
Just a quick side note: the Apple PPC G5 was the first 64-bit consumer desktop. And they ditched PowerPC mostly because IBM failed to deliver on a mobile version. We are seeing more ARM-based systems though, which should be interesting. When PowerPC was still a thing, it actually forced Intel and AMD to innovate, so hopefully that will start happening again.
Hello from the future! You were spot on! The high performance-per-watt threat from Apple, Nvidia (Ampere), Broadcom, and others has forced AMD to innovate the most powerful and efficient chips we've ever seen, especially in the mobile and server spaces! And with RISC-V getting a ton of attention lately, it's looking like the processor landscape is shaping up to be more diverse than it's been since the 1980s!
once upon a time, it was Intel and AMD
Now it's just AMD and Intel
Neither is really competitive over here: Intel is really expensive, and AMD is like 50% more expensive than Intel.
But I got a good deal on the used market, so that's a win-win. I basically got a new 6-core CPU, and neither AMD nor Intel got money.
Cyrix is actually still around, sort of. It was acquired by VIA, who still make x86 CPUs today. You don't hear about them because their focus is on cheap CPUs with low power consumption. They did, however, partner with a Chinese government entity (Zhaoxin) to create CPUs that are in the ballpark of the early Core i series in performance.
@@TheCallMeCrazy good to know that. my first one was an IBM Cyrix back in '95
Ask about VR headsets for next tech quickie
It's red vs. blue vs. black; you forgot ARM, the CPU company more famous than these two combined, whose designs are used by Apple (iPhones, Mac mini, Mac Pro), Samsung, MediaTek, Qualcomm, etc. It's also first in the Japanese Fujitsu supercomputer, with 52 cores and 152 nodes.
As someone who worked for an Intel competitor in the early 1980s, I can say from experience that it takes more than a spiffy chip to succeed in the general-purpose MPU market. The support that potential customers looked for was staggering: availability of engineering samples, development systems, a selection of operating systems and application software, manufacturer's technical support, and so on. But THE most important criterion in selecting an MPU was whether the customer believed the manufacturer could deliver working chips in the needed quantities at the target price when they went to production.
This was the driving motivation of IBM in selecting the 8088. IBM engineers were not thrilled with the 8088 for several reasons, but IBM decision makers believed Intel could produce the 8088 in the required numbers as promised, and it offered an upgrade path to the 80286, introduced in 1982 and later used in the AT. Once IBM, at that time the most respected name in computers, blessed the Intel offering, and a huge third-party effort to produce software and hardware add-ons emerged, it was pretty much game over for competitors, as no other general-purpose market was nearly as big as PCs.
Some companies did well in the workstation market, but the volume was much lower, and hence even those were doomed. Even the Motorola 68k family, which was clearly architecturally superior to the x86, fizzled out by the 1990s and was discontinued. Some of the competitors morphed their offerings into special-purpose MPUs for high-volume but low-glamour roles like copiers and faxes, but the large-volume PC market was owned by Intel.
So, basically... intellectual "property" protected by the State. That's why.
The State. That thing is an old, archaic, barbarous relic. Ancient, outdated, and obsolete! Unlike the gold that I hold. 😀
I also thought about that. Maybe there are more, but the other ones aren't as good.
TL;DR: It's expensive and hard to become a cpu company
like really expensive and hard
For sure. I studied ECET in college and my professors summed up how hard any electronic chip fabrication was. Something along the lines of, "The first chip cost 2 million dollars, the rest are $1 each" (worked numbers below this thread).
@@Frizzy9000 Nanoscale transistors and fabrication.
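The "$2M first chip, $1 each after" quip above is ordinary fixed-cost amortization; here is the arithmetic with the quote's numbers (illustrative only, not real industry data).

```c
/* Fixed-cost amortization: one-time design/mask cost spread over volume. */
#include <stdio.h>

int main(void) {
    double nre = 2e6;        /* one-time design + mask cost ("first chip") */
    double marginal = 1.0;   /* per-unit manufacturing cost afterwards */

    long volumes[] = { 1000, 100000, 10000000 };
    for (int i = 0; i < 3; i++) {
        long n = volumes[i];
        double unit_cost = (nre + marginal * n) / n;
        printf("%10ld units -> $%.2f per chip\n", n, unit_cost);
    }
    return 0;
}
/* 1000 units -> $2001.00; 100000 -> $21.00; 10000000 -> $1.20,
 * which is why only high-volume players can compete. */
```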
Surprised Amazon hasn't tried to get involved. They seem to like destroying every other industry.
@@TraumaER yeah, except the phone industry
especially the phone industry
If you're industrial or military, you can get yourself a very expensive and very secure CPU from a smaller company. Some of them can emulate x86 and be used in workstations, it's just a matter of cost.
You're forgetting the third CPU company, ARM, whose designs are used by Apple (iPhones, Mac mini, Mac Pro), Samsung, MediaTek, Qualcomm, etc. It's also first in the Japanese Fujitsu supercomputer, with 52 cores and 152 nodes. Also, the 12-core Snapdragon X Elite has beaten the M2 Pro in benchmarks, along with some Intel 12th-gen i7 and AMD Ryzen 7 5th-gen CPUs.
no one:
The set: casually has $3k worth of CPUs lying around.
Are those 2 cpus really worth that much?
@@Clangdon0148 Yeah, 10900k about $600, plus a threadripper, which depending on the model could go anywhere from $1400 to $4000
@@_rileyweaver_637 ah ok, I couldn’t tell what the Ryzen cpu was
A better question would be: why are GPUs so expensive? Like, come on, $500 would get you the best 6 years ago.
Oh, you can thank markets and Bitcoin for that. Bitcoin created the first overwhelming demand for GPUs which, paired with ebay, showed sellers how much they could charge for GPUs while still moving units. Why keep selling top tier units when you can price the 2080 Ti at $1200 and still sell tons with a higher margin and greater profits? So you can thank cryptocurrencies and gamers who will pay out the ass for hardware from scalpers for the endlessly inflating GPU prices.
Was just wondering this. Bought a $200 laptop today that runs like garbage but for $400 I could have gotten top of the line back in 2010.
@@olsonbryce777 Top of the line for 400$ laptop? That’s bullshit, even in 2010.
@@Demmrir Ethereum created the demand for GPUs. Bitcoin was mined on CPUs first, then ASICs.
Integers: whole numbers with no decimal. Floating point: numbers with a decimal point. It's simple, they should have just said that :P (float demo below this thread)
@Lycan
And the emphasis on maintaining a consistent data type and never intermingling floats unless you really want those imprecisions.
Cyrix didn't expect simulations and their critical need of floats.
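The "imprecisions" being warned about are easy to demonstrate; a classic example in C (any IEEE-754 machine behaves this way):

```c
/* Neither 0.1 nor 0.2 is exactly representable in binary floating point. */
#include <stdio.h>

int main(void) {
    double a = 0.1 + 0.2;
    printf("%.17g\n", a);                             /* 0.30000000000000004 */
    printf("%s\n", a == 0.3 ? "equal" : "not equal"); /* not equal */

    float f = 16777217.0f;   /* 2^24 + 1: too big for a float's 24-bit mantissa */
    printf("%.1f\n", f);     /* 16777216.0 -- the integer was silently rounded */
    return 0;
}
```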
Remember Cyrix chips? They knew of DLC before Steam did. I started out with a 486DLC, man!
That giggle at the start makes me WANT TO WORK THERE. It seems so fun.
This is why we need ARM and RISC 😁👍🏻
ARM is RISC ...
ARM = Nvidia
Both Intel and AMD CPUs are internally RISC-like (they decode x86 into micro-ops). Ryzen even has a separate ARM CPU for its Platform Security Processor.
@@TakZ000 Modern x86 CPUs are CISC on the outside, though.
There is still Elbrus.
Yes, it is shitty, always a few years behind, and you cannot even cool it with a liquid vodka cooler,
but hey, technically it is a cpu company.
@@TheLoy71 and he is not American.
Alternative title: The pals that hate each other
I had an EMachines with a Cyrix. It was terrible. Moving to an AMD K6-2 blew me away.
Do a video on why we don't see horizontal PCs anymore!
I love those!
Because it takes less space on the floor or desk? Or what answer are you anticipating?
oh yeah.....riley!!!!
For a sec, I thought it was theRadBrad.
@@edeshkumar9686 me too
Why do you have theradbrad profile picture
@@alertsaucer4104 he's my inspiration. So I used it
Technically, VIA also still exists in the x86 space.
Yes they do; unfortunately, "technically they still exist" is about all you can say about them now, with their last CPU coming out in 2011 with low performance and a higher TDP than the Atoms of the time, and the new one they were working on in late 2019, for release in late 2020, suffering from the curse that was 2020.
Yeah, I remember them in the early 2000s in some laptops, usually cheaper than their Intel counterparts. Even back then, Intel's prices were higher than the competition's.
But now they're practically gone from the consumer PC space. They probably still make chips for appliances and small devices, especially IoT, but not for general-purpose computing anymore, or at least not for most PCs.
"Why there is only two."
"Because they bought other companies and/or underpriced their products. And sometimes did a better product."
That's the right answer. It's the same answer to "why is Oracle so big and famous in the database market, like MS SQL Server?"
bought*, That's*, missed a quote before "That's*"
@@CreeperPookie Thanks, Mr. Spell Checker
@@AlexeiDimitri No problem.
In Intel’s case, there were arguably illegal contracts with customers. Like how Microsoft made it expensive for large PC companies to preinstall operating systems other than Windows, Intel made it expensive for large PC companies to offer PCs with CPUs other than Intel.
You are really rocking that Ned Flanders look! If you keep growing the moustache, you may eventually achieve a full Yosemite Sam.
Damn, that Cyrix CPU brings back memories... my first PC had that one!
...And tbh it DID run quite well for the time. Used that thing till the Pentium II came out, and later switched to AMD (an overclocked AMD Duron being my first 1 GHz CPU).
My second CPU was a Cyrix. I tried to upgrade my 166 MMX with a Cyrix Socket 7 chip running at 233 MHz or something... it was fine, but literally no faster. Got a P2 and that was that.
@@davidjgreer9477 Same, my first was a 486DX 66mhz, and the second one was a Cyrix 5x86 running at I think 100mhz. My friend had an actual Pentium 100 and it was noticeably better at running Quake than mine, even though according to Cyrix they should have been identical. I ended up going back to Intel and staying there. They're reliable.
You forgot to mention that Intel created its own non-backward-compatible 64-bit processor (Itanium) and launched it a couple of years before AMD64, but it tanked and never got widespread support.
Thank God
Who's watching this after the release of the Qualcomm X Elite?
Lmao, there are 100+ CPU companies, it's just that their CPUs are nowhere near as good as the ones from AMD or Intel.
I got the Pulseway ad that Linus made even though I'm using an ad blocker. Linus knows how to work his way around IT things.
You guys should do a video about the differences between using a headphone port on your PC, audio interface, and dedicated DAC (Digital [to] Analog Converter)
3:54
"Hey."
"Yeah?"
"You ever wonder why we're here?"
That's one of life's great mysteries
Ha!
This doesn't cover even 5% of it
Long story short, Intel made sure there was no competition in the mid-'00s by selling its CPUs at a big loss to the big OEMs, which led to even AMD and VIA getting pushed out of the market almost completely. AMD has finally recovered, and VIA/Zhaoxin is getting close enough to Intel to compete again.
yea, this is like buzz feed for tech enthusiasts, lol
There are only two companies making x86 CPUs because of patents. It's practically illegal for any other company to make x86 CPUs, because they would be violating an Intel or AMD patent. Same thing with ARM, PowerPC, Snapdragon CPUs, etc. They are all monopolies.
@@owainkanaway8345 Well yeah, patents limit use of x86, but somebody could implement the original x86 without breaching patents, since the older patents have expired. x86-64, on the other hand, is harder, as those patents won't expire for a pretty long time.
I feel bad for linus, he has to keep reading these bot comments
is this bot comment too?
It's cute that you think Linus doesn't have people to do that for him.
@Yitzy He wouldn't have time to do anything else. So I don't take that literally.
@@patrickweaver1105 Ppl believe the dumbest of things...
@@bsigns1935 nah b
Wow, integer and floating point, terms I haven't heard in decades.
I see Techquickie getting all "Between Two Ferns" back there...
apple watching this: :/
Apple makes chips only for themselves
They have complete control over hardware and software
Plus a mid range CPU from AMD is better than an M3 chip
Has anyone noticed that the three “teams” in the tech world (nvidia, intel and amd) are RGB?
Not only that, Nintendo, Xbox, and PlayStation are RGB too.
AMD is green. RTG is Red.
Holy music stops
Apple: Am I a joke to you?
Most People: Yes.
Very yes.
IBM: All the yes.
Definitely yes
Apple never gave its CPUs to anyone. If it had, most flagships would be using its A-series chips rather than Snapdragon.
Doesn't Apple have the single most powerful CPU that you can put in a phone?
@@cybercery5271 Kinda. Qualcomm is a little bit better now with the S21.
You should do a video on how most USB-C cables are actually USB2; especially the charging cables that come with most phones. I mistakenly assumed any USB-C cable was USB3 and wondered why file transfers were still so slow.
Apple: *"there is another"*
I think Apple also uses Intel CPUs... correct me if I'm wrong.
@@usuckmf3488 They also develop their own M chips
@@דיפי That's not desktop yet
@@deadlypyre with the mac mini yes
@@mastenn those aren't meant to be actual desktops, bruh
"At least THIS Red vs Blue fight won't involve politics"
BUUUUUUUUUUURRRRRNNNN ROOSTERTEETH!!!
Has literally nothing to do with rt lol
The short answer is that anti-monopoly laws aren't remotely effective, and most politicians are too weak or corrupt to dare break up the monopolies.
"two" != "mono"
I swear I heard of a desktop CPU manufacturer called VIA in the 2000s, but it didn't seem to last long and disappeared not long after I heard about it.
Before the IBM PC, there was another CPU maker whose chips were in just about all the computers people could actually afford. In the early to mid '80s this chip vastly outsold the IBM and clone machines.
Yes, it was the 6502, and its variants from MOS Technology. The 6502 was in the:
Commodore 64
Commodore PET
Apple II series
Atari 8 bit line
Atari 2600/5200/7800
Nintendo
BBC Micro
So the early history is quite a bit more complicated. In the early days in the '80s, the 6502 machines vastly outsold the IBM clones. PCs won the war for a variety of reasons, but if history had worked out a bit differently, we might all have descendants of the 6502 in modern computers rather than descendants of the 8088.
Surprised you did a history of CPU manufacturers and didn't mention Motorola.
This was only a history of x86 CPUs.
@@MaddTheSane That would only be the case if the title didn't say "Why are there only TWO CPU companies," meaning Intel AND AMD. Since they're talking about why there are no other CPU companies, it stands to reason that they'd mention the other major CPU manufacturer of the '80s and '90s (exclusive supplier to Apple) that didn't make it, since that's the whole topic of the video.
3:57 ah Linus taught him well.
Riley, it's not "See-rix"
This was too quick :) Not a word about VIA, and in the '80s there were many clones behind the Iron Curtain.
You forgot the Russian one: the Elbrus-8S.
Together AMD and Intel have like 99.9% of the market share. Other companies exist but they aren't relevant and you won't find their parts at Microcenter or Best Buy.
Seems like Intel and AMD have the same kind of professional relationship Pepsi and Coke have. They're "rivals," but they also know that without the other there's no fun competition keeping each other's prices in check.
It's not about fun. If there's only one giant, everybody wants to be the next giant or the giant killer.