Yeah, you have a couple of facts wrong there. #1 - Intel didn't licence x86 to AMD because it wanted to avoid being a monopoly, it was forced to do so by IBM in the 80s because IBM wanted a second source for CPUs in case Intel was unable to deliver for whatever reason. Since it was the 80s, what IBM wanted, IBM got, no questions asked. #2 - In the early 2000s, they didn't release a crapload of technology and marketing in a landslide, they conducted illegal antitrust practices and skewed their compiler to hogtie AMD and VIA CPUs. How did you manage to leave that out of it? Oh wait, I know. THIS CHANNEL HAS BEEN BROUGHT TO YOU BY INTEL.
I love this channel! Almost EVERY video you guys upload (like 9/10) is a video that I say "I have to watch that right now" as soon as I see it in my sub box. I've only been subbed for like 2 weeks or so now, but this channel has definitely become one of my favorite channels. Answer to your question in the video: I learn a ton from damn near every video you guys upload, and I especially love all the tech videos. Keep doing what you're doing, cuz I know at least I, myself, will be here for quite a long time now! :D
Been subbed for about two weeks or so, your content is pretty decent, and you don't try to be excessively funny and maintain the right amount of 'seriousness'. Keep it up guys!
this is the best video I've seen from Gameranx in a long time! This video had a LOT better writing than the previous ones, and it's also great to hear that Jake doesn't sound as awkward and insecure as he usually does!
I'm a Computer Engineer and IC designer, and I can say from what I've learned so far that the third generation of Intel's Core processors is built on 22nm lithography. The interesting fact is that dimensions like these are smaller than the wavelength of visible light. Another interesting thing is that the gate oxide (the insulator that shapes the transistor's behavior) at such a small lithography is on the order of a few atoms thick, so you can imagine how hard it is to build the processors we have today. ^^
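The scale the comment describes is easy to sanity-check. A rough sketch (all figures here are ballpark assumptions, not actual process data):

```python
# Rough scale comparison for the numbers in the comment above.
# All values are textbook ballpark figures, not process specifics.

process_node_nm = 22           # 3rd-gen Intel Core lithography
visible_light_nm = (380, 700)  # approximate visible wavelength range
gate_oxide_nm = 1.0            # assumed: gate oxides on the order of ~1 nm
si_atom_spacing_nm = 0.235     # approximate Si-Si bond length

print(f"22 nm is ~{visible_light_nm[0] / process_node_nm:.0f}-"
      f"{visible_light_nm[1] / process_node_nm:.0f}x smaller than visible light")
print(f"a ~{gate_oxide_nm} nm oxide is roughly "
      f"{gate_oxide_nm / si_atom_spacing_nm:.0f} atoms thick")
```

Which is why optical lithography at these nodes needs tricks well beyond "shine light through a mask."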
technically the i stands for "inside", "Intel Inside" being their slogan: Intel Core Inside 7 (product number), 7 being the series, the product number being the 4790, 6700, or whatever. At least that's how I understand it. Sure, the official reason is probably "because fuck it, why not", but I've always liked to think of the i as a stand-in for "inside".
Awesome video, great commentary, and yes - many of these facts were indeed new to me. I especially liked how you explained the weird symbiotic relationship among these two chip-producing powerhouses.
I feel the need to bring up a correction - to my knowledge, Moore's law still applies in early 2016, as Intel is still consistently shrinking dies (although they're reaching the limits of silicon). The clock speed of a CPU does not necessarily correlate directly with the size of a transistor or the power of a processor; hence a 2 GHz processor from 10 years ago can be much slower than a 1 GHz processor from the present day.
Some Weeb no... All CPU records are done with LN2, and Skylake doesn't get even close to 8 GHz, let alone 10 GHz. AMD still holds the record for the highest clock speed.
Well, if you multiply your cores by their GHz then you get over 10 GHz :). Having all CPU cores at 10 GHz or higher will still take a long time, though... I was disappointed that AMD's 14nm RX 480 struggles to even hit 1400 MHz on the die while Nvidia is more around 2000 MHz...
here is a fact: both Intel and AMD moved their main mind-blowing CPU design work to Israel. Intel did it first, then along came AMD. So whether you're buying an Intel CPU or an AMD CPU or GPU, that technology comes from Israel.
Renewable energy ftw - my respect for Intel just increased. I love my G3258: cost $70 AUD, clocked at 3.2 GHz, OC'd @ 3.8 GHz, 24°C with an Intel liquid cooler at night
Garrett Austin and that's why both Intel and AMD have slowed GHz increases - so that they can stay at a standard 65 watts or 95 watts without blowing up your board and that aftermarket heat-pipe heatsink you just had to have
Actually, the issue is more to do with stability than anything else. An 850W PSU would be more than enough to (theoretically) power a CPU at over 10GHz. When it comes to temperatures, if the CPU is consuming at most 400W of electricity, that means it is generating roughly 400 joules of thermal energy per second. The thermal dissipation properties of liquid nitrogen would more than suffice to keep the CPU cool. The actual issue is down to the architecture of the CPU, i.e., either the memory controller on the die cannot cope with the bandwidth, or the transistor gates themselves simply cannot switch at 10GHz.
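The power-to-heat reasoning above can be checked with back-of-envelope arithmetic. A sketch, assuming a textbook figure of ~199 kJ/kg for liquid nitrogen's latent heat of vaporization:

```python
# Back-of-envelope check of the comment's numbers: a CPU drawing 400 W
# dumps ~400 J of heat per second. Assuming LN2's latent heat of
# vaporization is ~199 kJ/kg (a textbook ballpark), how fast does it boil?

cpu_power_w = 400.0                    # electrical draw ≈ heat output, J/s
ln2_latent_heat_j_per_kg = 199_000.0   # assumed ballpark value

boiloff_kg_per_s = cpu_power_w / ln2_latent_heat_j_per_kg
print(f"LN2 boil-off: {boiloff_kg_per_s * 1000:.1f} g/s "
      f"({boiloff_kg_per_s * 3600:.1f} kg/h)")
```

A couple of grams per second is trivially manageable in an overclocking session, which supports the point that cooling capacity is not the bottleneck.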
you know what, I'm very happy I subscribed to your channel. You guys are awesome - you not only keep uploading videos about gaming, you upload videos about technology too, and that's what I like about you. You should start contests on your website if you have one - a quiz contest would be best, I think, where you ask about gaming, technology, or your channel. It's a fun way to learn, and you could also set a prize for it. Whatever you do, I love you guys, keep it up.
That jet shouldn't have been retired... with the proper upgrades it would completely destroy the F-18s (and those upgrades would've been cheaper to do, too...).
Moore's Law has actually kept pretty consistent and probably will for the next few years. The gigahertz of a processor does not come from Moore's law - it is the speed at which the internal clock of the CPU runs, NOT the number of transistors on the chip. Fun facts
Actually, the max clock speed of a processor is directly affected by transistor size. MOSFET transistors work by physically moving electrons to create or destroy conducting paths. To increase clock speed, you can increase how hard you push/pull the electrons (increase voltage), reduce the electrons' resistance to movement (chemistry stuff), or reduce the distance the electrons need to move (node size). There is a side effect where reducing size also reduces max voltage, but that isn't what stopped clock speed increases. Due to low yield rates and a sharp drop in power efficiency around 4GHz, it just isn't economical to go past 4GHz. Intel tries to get around this with an extensive binning process, but the fact remains that for each 4.7GHz processor they produce, they make many more that hang around 3GHz.
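The distance/delay argument above can be sketched with a toy timing model: the clock period must be at least as long as a signal takes to cross the longest chain of gates in one pipeline stage. Both input numbers below are illustrative assumptions, not real process figures:

```python
# Toy model: max clock is bounded by propagation delay through the
# critical path of one pipeline stage. Smaller transistors -> shorter
# per-gate delay -> higher achievable clock. Numbers are illustrative.

gate_delay_ps = 10.0    # assumed per-gate switching delay, picoseconds
gates_per_stage = 25    # assumed logic depth of the critical path

period_ps = gate_delay_ps * gates_per_stage   # minimum clock period
f_max_ghz = 1000.0 / period_ps                # 1000 ps = 1 ns; 1/ns = GHz
print(f"max clock ≈ {f_max_ghz:.1f} GHz")
```

With these made-up figures the ceiling lands right around the ~4 GHz wall the comment describes; halving the gate delay would double it, which is the size-to-speed link in a nutshell.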
There's a big difference between buying energy credits and actually being green. The number of documentaries I've seen on this kind of shit... do some searching, but simply put, energy credits are what would allow a Hummer to claim it's just as green as a Smart car, or a coal plant to claim it's just as green as a solar plant.
Here's some more I don't think most people know/remember:

Intel started as a memory manufacturer. Their market was to replace core memory with transistor memory chips (DRAM, SRAM...). Core memory consisted of little magnetic cores carefully woven one bit at a time by hand, mostly by women.

The lithography masks for Intel's 4000 series of chips, including the 4004, were made by hand, by cutting and carefully placing little strips of masking material under a microscope.

Floating point math used to be a coprocessor chip with the x87 instruction set. It was on the 486 (DX version) that you could finally get an integrated x87 floating point unit.

The latency of RAM, measured in clock cycles, has ballooned ferociously: from running synchronously with the CPU with exactly one clock cycle of latency, to several hundred clock cycles of latency today. L1 caches were originally an add-on chip soldered to the motherboard; then they were integrated and L2 cache became a thing. Then L2 caches were integrated and L3 cache became a thing. Then L3 caches were integrated. Then memory controllers were integrated to reduce latency.

RAM isn't really random access anymore; it is much faster to read linearly through memory than randomly. A lot of the complexity of CPUs relates to dealing with RAM latency: the CPU speculatively prefetches memory from RAM, executes instructions out of order, and predicts branches and executes them speculatively (if it mispredicts, it has to undo everything). If you read randomly from RAM, the CPU can't predict, and your code may run more than 100 times slower than if everything were packed into one contiguous piece of memory and read sequentially.

x86 CPUs are internally RISC CPUs, and the internal instruction set changes whenever Intel feels like it without affecting programmers. Externally they behave as if they were x86 CISC CPUs, but there's a big decoder that turns everything into RISC-like instructions (micro-ops), which is what actually gets executed internally.
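The sequential-vs-random point above can be demonstrated, at least roughly, even from Python. Interpreter overhead hides most of the cache effect (in C the gap can be 10-100x), but walking a large array in shuffled order still tends to come out slower, because every step is a cache-unfriendly jump:

```python
# Rough demo of sequential vs. random memory access. Same elements,
# same total work -- only the visit order differs.
import random
import time

n = 2_000_000
data = list(range(n))
seq_idx = list(range(n))        # visit in order
rand_idx = seq_idx[:]
random.shuffle(rand_idx)        # visit in shuffled order

def walk(indices):
    t0 = time.perf_counter()
    total = 0
    for i in indices:
        total += data[i]
    return total, time.perf_counter() - t0

seq_total, seq_t = walk(seq_idx)
rand_total, rand_t = walk(rand_idx)
assert seq_total == rand_total  # identical work, different order
print(f"sequential: {seq_t:.3f}s, shuffled: {rand_t:.3f}s")
```

For a dramatic version of the effect, the same experiment in C with a pointer-chasing loop shows the prefetcher's impact far more starkly.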
+Decan Andersen Implying they run at 100% load all the time. Implying everyone optimises their stuff well enough that it doesn't rely almost solely on the first core.
HEY, be honest, are you sponsored? Because from what I've seen and experienced, AMD in A LOT OF WAYS makes better products than Intel. Higher temp thresholds and faster CPUs are the two I've heard about most in the tech field. I'm not saying I'm Mr. Techwiz and know it all - just going by word of mouth, honestly. Never done a comparison personally, so I know I'm kinda blowing smoke out my ass; I've just heard lots of bad things about Intel, from its business practices to the unnecessary limitations on its hardware.
2:02 - 2:05 and we have them, if you multiply the clock speed by the core count: by that math, the first CPUs with a 10 GHz clock speed were actually made in 2007 (e.g. Core 2 Quad Q6700 - clock per core is 2.66 GHz; multiply it by 4 and you get 2.66 GHz x 4 = 10.64 GHz). So Intel and AMD BOTH actually BROKE the 10 GHz barrier in 2007.
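Worth noting: multiplying cores by clock only describes aggregate throughput on perfectly parallel work. For a single task, Amdahl's law gives the actual speedup; a minimal sketch (the 95%-parallel figure is an assumption for illustration):

```python
# Cores x clock only "adds up" if the workload is perfectly parallel.
# Amdahl's law gives the real speedup when only part of the work can
# be split across cores.

def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

cores, p = 4, 0.95  # assumed: 95% of the work parallelizes
print(f"{cores} cores, {p:.0%} parallel: {amdahl_speedup(p, cores):.2f}x")
# about 3.48x, not 4x -- a quad-core 2.66 GHz chip is not a 10.64 GHz chip
```

Only with `parallel_fraction = 1.0` does the speedup equal the core count, which is why the "multiply cores by GHz" math doesn't hold for real single programs.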
Triggered Autism The "i" in front of Phone is meant to represent the supremacy of the iPhone over the competitors. The iPhone being the true and only intelligentPhone.
Wow, I'm surprised that I haven't come across this channel earlier. This is exactly how I like my tech: a lot of facts, truth & humor in not too many minutes. Makes it interesting, educational & funny. Subbed.
Intel on Intel, alright - I'll play. They did make a chip that went into the flight control of the F-14 - that's true. Go back further. In the early days of integrated circuits, they'd make a chip for a calculator company with functions to add, subtract, multiply and divide, done with fixed digital logic. So Busicom, a Japanese calculator company, requested a new chip with some more functions (sqr, sqrt, log etc...), and they presented a circuit diagram, suggesting "like this." It was a pretty big, nasty circuit. Ted Hoff, Federico Faggin and their team had been playing with the idea of putting not part, but all of a processor on one integrated circuit. This resulted in something a bit more complex again than what Busicom had asked for, but it could do any simple math function and be given fresh microcode in a later model to add more, and it was all new and shiny, and engineers liked that sort of thing. So they showed it to Busicom, who were super impressed, and they ordered it. Officially it wasn't a computer part - it was the "works" of a simple calculator - but it was integrated onto a single chip, and it worked the way a processor core does, not the way a pile of discrete digital logic does. That was 1971, and that was the 4004, Intel's first processor on one chip. They also put a quartz timepiece on a single silicon chip, which brought them into digital watches for a while (the Microma venture) - though before PCs, Intel's real money came from memory chips.
Fun fact: the factories that manufacture CPUs are among the cleanest places on Earth. A tiny speck of dust or dirt is all it takes to cause a defect in the manufacturing of these computer chips.
in your video on the history of Intel, you left out a very important chapter. A company by the name of Zilog also made 8-bit CPUs that powered several CP/M computers. My first "personal computer" was powered by an 8080 chip, and I later upgraded to Zilog's faster Z80, which was pin-compatible. Unlike IBM PC compatibles, those computers used 8" floppy drives, and double-sided double-density disks held 860KB as opposed to the PC's 360KB. The CP/M system I had came equipped with 64KB of memory and ran at 4.77 megahertz. Running CP/M 2.2, it could recognize a hard drive partition of 32 megabytes, just like MS-DOS. These systems used serial terminals for input/output, so they weren't capable of even ASCII graphics. Just a small bit of computer history for anyone who's interested.
For 2:27 - yeah, 3 GHz processors are now "the default", but there's a catch at the chemical and physical level. Instead of making one core go faster, we just add another core, so the entire processing load is spread across multiple cores. Sure, you can still overclock a CPU if you do it properly and carefully.
10GHz is claimed to be roughly the physical limit the electrons allow us, but we'll probably never reach that mark. The clock speed record was achieved in a lab using extreme sub-zero cooling (liquid nitrogen/helium), stands around ~8.4GHz on an AMD FX chip, and even that only ran for a short while.
Wow! I must've gotten this as a kid. But I always assumed Intel was short for Intelligent, which I thought was a fitting name for a company. *Mind Blown*
Cool summary. The answer to #1 is lithography: think movie projector, but backward! Instead of using optics to magnify a small image and project it on a wall/screen, you take a big image (the mask) and project it down to a tiny, tiny size. Yeah - easier said than done when you have 3 billion transistors and your traces (wires) are 14nm wide, but that's the principle.
2:01 I've been saying this for years: Moore's "law" hasn't held true for almost 20 years now, and yet every time I point this out people jump down my throat and call me a moron. We should be in the fucking hundreds of TERAHERTZ by now if this "law" held true, not just 10 GHz
I gotta give you mad props for using the word "probably." In fact, I didn't know any of these things. The number of channels that skip that word (e.g. 10 Things You Don't Know About Porcupines), and then proceed to give a list containing two things I hadn't known about porcupines, make me want to coat their porcupine in lemon juice and then hit them with the porcupine.
I haven't seen a comment about this yet, so here goes. Moore's law is about the number of transistors on a chip, not the clock speed of the chip. With more transistors, a chip can do more in one clock cycle than a chip with fewer transistors running at a higher clock speed.
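The count-based version of the law is easy to project forward. A sketch using the 4004's roughly 2,300 transistors (1971) as the baseline and the usual two-year doubling period:

```python
# Moore's law as usually stated: transistor count doubles roughly
# every two years. Projecting from the 4004 (1971, ~2300 transistors):

def moore_projection(base_count, base_year, year, doubling_years=2.0):
    return base_count * 2 ** ((year - base_year) / doubling_years)

est_2016 = moore_projection(2300, 1971, 2016)
print(f"projected transistors in 2016: {est_2016:.2e}")
# lands around ten billion -- the same ballpark as real 2016-era chips,
# which is why the count-based law held far longer than clock speeds did
```

Running the same projection on clock speed instead (from the 4004's ~0.74 MHz) is exactly how people get the "we should be at terahertz" confusion the parent comment is correcting.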
some intel on intel huehuehiehuehuehue hue hue hue
''Noyce'' one
I hate you both
+Lorenzo Little
+alice hendriks I can't think of any "Moore" puns
+UltimateFusion6 I'd be Moore than willing to help with that ^_^
The reason Moore's law no longer applies is that Intel has reached the smallest transistor size that can practically be made of silicon. What's interesting is that it wasn't technology that impeded Moore's law - it was the physical material it was being made of that stopped progress. 14nm is said to be the smallest manufacturable size - anything smaller is too difficult to manufacture, too fragile, or, most importantly, hits a drop-off in efficiency for the size.
The next big step would be to _stack_ 14nm transistors into a package - something that was realized only a few years ago. Two or three iterations of that and we might breach 10GHz. But by then we should have quantum computing :D
What about using carbon nanotube transistors? I heard IBM made a 7nm transistor with them. Also I read somewhere that silicone transistor size can be smaller, but then you enter the realm of quantum tunnelling, or something like that.
+Patrick Mathew not exactly, but there are indeed problems tied to wave properties of electrons
+Patrick Mathew they made a 7nm chip using silicon. It was treated differently but it was still based on silicon.
+Patrick Mathew silicon SI LI CONNNNNN.
Silicon and silicone are two completely different things.
Patrick Mathew Wow, salty much? Just trying to help a fellow person out from future embarrassment. Jesus.
I can play chess without a CPU.
*facepalm*
+holdmybeer Impossibru!
so you play chess without using your brain? because your brain is basically a biological computer, and all computers have a cpu. it's even been estimated that a human brain has an average of 25 petabytes of memory. (1 petabyte is 1024 terabytes, for those that don't know)
Bishop Munoz
I bet you're fun at parties.
so you use an APU
I will say a few things. AMD had 50% of the market in early 2000. Not only that. Even after Intel got the 64-bit license they struggled to compete with the Athlon. Stop pretending like Intel is a god. It's not. I will admit that Intel probably spared AMD at the start.
Luka David Torkar just like Microsoft saved Apple's ass in... '97 I think it was. Nobody said Intel was a god, though, but atm they have the market and the more powerful CPUs. By a significant margin
G33RsofDeath True. But some of them also cost a lot more - I mean, a dual core for 150 €?! He made it sound as if Intel is playing with AMD and letting it catch up because they don't want to finish it. If Intel had the choice, they would destroy AMD by lowering prices on their i3s and i5s. The main reason Intel sets its CPU prices so high is that they're afraid of becoming a monopoly - that would lead to the company being split up. They already had problems with the law when they tried to push AMD as low as they could, got caught, and now have to pay over 1 billion dollars to the EU.
Luka David Torkar intel will never fully kill amd because there are laws against monopolies and they would end up having to pay a fucking shit ton of money in fines, so it's not in their interest
G33RsofDeath Well they almost did when Bulldozer came out...
i smell an AMD fanboy from a mile away.
I'm an AMD guy. I like the idea of having an 8-core 4ghz processor.
+Zenn which can't beat intel's 6-core 3ghz processor lmao
Correction - not actually 8 cores bud!
And news reports say they technically don't have eight cores
+Liza Shy Stupid? Yes, facts are stupid.
+SnazzyMacGuy Actually, they are 4 modules that function as 8 cores...
you forgot to mention that intel paid the producers of aida64 to make it so that AMD chips run slower, effectively making it seem that they were much faster than AMD, which wasn't the case. AMD just won over a billion dollars in a lawsuit because of this.
+FullMetal Ballsack source link please. Otherwise I call fanboy bullshit.
Donnie www.anandtech.com/show/3839/intel-settles-with-the-ftc
From the article: "Intel reworked their compiler to put AMD CPUs at a disadvantage. For a time Intel's compiler would not enable SSE/SSE2 codepaths on non-Intel CPUs; our assumption is that this is the specific complaint. To our knowledge this has been resolved for quite some time now."
FullMetal Ballsack so they didn't pay money to Aida64 for anything. They just enabled certain features of their proprietary software to work only on their platform. You know who else does that and nobody bats an eye? nVidia with CUDA.
In this case, you can always compile your program with GCC or MSVC and compare how well it runs on different hardware platforms. Nobody is forcing anyone to use the Intel C Compiler. In fact, I've seen few builds out there that use that compiler, even though it offers superior features to the free alternatives. One of the very few is the x265 HEVC codec, built by some Russians and hosted on an unofficial website - and they offer GCC and MSVC builds as well, so you can always choose which one works best for you.
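The compiler behavior in the quoted settlement boils down to a dispatch question: gate the fast codepaths on the vendor string, or on the actual feature flags? A toy model of the difference (function names here are illustrative, not real ICC internals):

```python
# Toy model of the dispatch issue discussed above. Checking the vendor
# string instead of the feature flags is what locked SSE2-capable
# non-Intel CPUs out of the fast paths.

def pick_codepath_by_vendor(vendor, has_sse2):
    # the contested behavior: fast path gated on vendor, not capability
    return "sse2" if vendor == "GenuineIntel" and has_sse2 else "generic"

def pick_codepath_by_feature(vendor, has_sse2):
    # the fix: ask the CPU what it supports, ignore who made it
    return "sse2" if has_sse2 else "generic"

amd_cpu = ("AuthenticAMD", True)   # an SSE2-capable AMD chip
print(pick_codepath_by_vendor(*amd_cpu))   # penalized despite supporting SSE2
print(pick_codepath_by_feature(*amd_cpu))  # gets the fast path
```

In real native code the feature check is a CPUID query (e.g. GCC's `__builtin_cpu_supports("sse2")`), which works identically on Intel, AMD, and VIA parts.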
Donnie idk how you can make excuses for Intel after they did this. It's super fucked up, but okay whatever you say.
I have an i5 4690k myself so I'm not an AMD fanboy I just hate when business do shady shit like this. It's bad for everyone.
FullMetal Ballsack it's fucked up - they betrayed the customers that bought the Intel Compiler. But to say they paid money to a famous testing suite to prefer their product is light years away from that. They did not do that. What they did was tune their own product to favor themselves. There can be a number of reasons, one of which is to look better than the competition.
If the Intel Compiler were a free product, this move would have been a perfectly logical one. After all, why spend additional money to make your product compatible with competing products? But here they actually take money for it, and do not fully deliver.
How about the fact that Intel has actually given money to Anita Sarkeesian? What about THAT FACT?
+Chip Hitler no one cares. We're only talking about video games.
+gameranx yes.
+gameranx Actually, there's quite a lot of people that care.
+gameranx Well said.
So. No one's bothered by the fact that they've given money to someone who's had a known pedophile moderate their twitch chat? I find that hard to believe.
You forgot to mention that Intel was sued for anti-competitive practices many times. Intel essentially bribed OEMs not to sell AMD chips; this is the primary reason Intel has such a market share.
not to mention that amd processors before FX were lacking severely in power efficiency and performance, but also, without x64 Intel's Core series would've been nothing by now
@@pinkyfloo8814 you don't have a clue, right? Before "FX" AMD had the Athlon and Athlon XP, which won hands down on IPC and power consumption against Intel
Oh btw, 64-bit x86 was accomplished by AMD; Intel didn't, and wouldn't have known how to do it
@@pinkyfloo8814 🤣🤣🤣
@@IronMetal and wtf did I say, asshole? Without x64 they would be nothing. Never ever implied that they created x64...
And don't come at me with that Athlon bullshit, they were lacking.
The P4 was surely hotter than your mother's prolapse, but then the Core 2 Duo and Core 2 Quad came out to smash AMD's ass.
Nowadays it's the opposite and AMD couldn't be smashing Intel harder... or probably it can, with the upcoming 6000 series
Intel didn't license x86 to AMD to let them stay in business - that was just an added bonus for AMD later. AMD started out as a second manufacturer for Intel, since Intel couldn't produce enough of their product, so they had to license x86 for AMD to legally do this.
+Zane Erickson Well, they both started out of Fairchild, after Noyce took the Traitorous Eight and left Shockley. In some ways it's kind of like two siblings who hate each other, but still help one another out.
+Zane Erickson It wasn't that they couldn't make enough if they wanted. It was more that IBM required at least 2 different suppliers of chips, so Intel dragged AMD onto the wagon.
+Zane Erickson I have a PC with 8088 chip and it is stamped AMD.
+louis tournas 8086 you mean? I used to have one as well (and it was 12 MHz, twice as fast as Intel's).
+Zane Erickson - Intel licensed x86 for tons of reasons; a big one more recently was to get a license for EM64T from AMD themselves (that's right - AMD developed the 64-bit extensions, and did it much better than Intel). What the author of the video did not mention was that Intel's own attempt at 64-bit was a complete disaster (IA-64).
Also in the 2000s they had to pay AMD tons of money *not* to save them but to settle the lawsuits AMD was throwing at them for buying off OEMs like Dell (which still doesn't sell much AMD) in the Pentium 4 era (another dark age for Intel)...
In fact, of all the CPUs so far, Intel's Core architecture (i3/5/7) was the only one getting significantly ahead in terms of *gaming* performance per clock in the past 20 years (including the PII and PIII days, when they had to face the K6-2 and 3 and 3DNow).
Also, what we gained from Intel ending the Socket 7 era was the introduction of the much more stable chipsets (LX, BX, ZX), which back then were always a nightmare to deal with...
I can't wait for Intel or Apple to start making diodes (will buy a whole lot of them just for the brand)!
P.S. I'm on an i7 4770K and yet can't wait to see what Zen will achieve (and will probably get my hands on one soon after it comes out), cause I'm sick of these +5% incremental upgrades Intel taunts us with...
+Zane Erickson The licensing had more to do with IBM's involvement, where they basically wanted AMD to keep Intel in check by adding competition. AMD used to be just a parts manufacturer, but they eventually were allowed to modify the x86 architecture in whatever way they wanted, which resulted in stuff like K6.
#6 is false. AMD created many of the technologies Intel uses. While AMD licenses x86, Intel could not produce 64-bit chips without the cross-license agreement the two companies share, meaning that without this agreement Intel would also be unable to produce (or sell the currently produced) chips. The claim that Intel 'allowed' AMD to survive is false, because it implies control over their fate; what actually happened was that the companies got into a war and were forced to litigate the technologies into an agreement. The agreement is mutually necessary, not something that Intel created or put into place. Agreement between the two: www.sec.gov/Archives/edgar/data/2488/000119312509236705/dex102.htm
Your profile pic I am near fidel castro
@@chickeninabox planning on seizing the means anytime soon? ;)
Yeah, you have a couple of facts wrong there. #1 - Intel didn't licence x86 to AMD because it wanted to avoid being a monopoly, it was forced to do so by IBM in the 80s because IBM wanted a second source for CPUs in case Intel was unable to deliver for whatever reason. Since it was the 80s, what IBM wanted, IBM got, no questions asked. #2 - In the early 2000s, they didn't release a crapload of technology and marketing in a landslide, they conducted illegal antitrust practices and skewed their compiler to hogtie AMD and VIA CPUs. How did you manage to leave that out of it? Oh wait, I know. THIS CHANNEL HAS BEEN BROUGHT TO YOU BY INTEL.
I love this channel! almost EVERY video u guys upload (like 9/10) is a video that I say "I have to watch that right now" as soon as I see it in my sub box. ive only been subbed for like 2 weeks or so now, but this channel has definitely become 1 of my favorite channels. Answer to ur question in the video is, I learn a shit ton from damn near every video u guys upload and I especially love all the tech videos u guys upload. keep doing what ur doing cuz I know at least I, myself, will be here for quite a long time now! :D
+adirtybirdy T
Gotta give it to ya, you took the words right out of my mouth.
Please do NVIDIA facts!!!
so...
iGameranx
lmao
iRanx
They should do it
Been subbed for about two weeks or so, your content is pretty decent, and you don't try to be excessively funny and maintain the right amount of 'seriousness'. Keep it up guys!
10 INTEL facts? then do a 10 AMD facts and 10 NVIDIA facts
+Cesar Alvarez HD definitely!
+gameranx Could you then do nVidia facts after that?
gameranx Awesome!
+gameranx Let's do an AMD facts after this!
+Cesar Alvarez HD Yes!!!!!!!!!
do nvidia facts!!!
+mahtipiirakka but they're still okay, they do use less power for about the same performance
+Julien Nogueras but they're still okay, they do use less power for about the same performance
That is exactly why it is interesting.
+Julien Nogueras my Nvidia GTX 970 is doing pretty well
+mahtipiirakka YES
Ill rate it 3.5/4.0
Can we get a top ten on AMD?
+Jacob Jock I would like this as well.
It's in the works, trust me.
ha burn
+owl “1080p” gamer That is what AMD chips do...
+Ritu Chauhan that's really only what the average person needs. Not everyone can drop $400 on ONE monitor for 1440p, let alone 4K.
this is the best video I've seen from Gameranx in a long time! and this video had a LOT better writing than the previous videos, and it's also great to hear that Jake doesn't sound as awkward and insecure as he usually does!
I'm a Computer Engineer and IC designer, and I can say from what I've learned up to today that the third generation of Intel's processors is built on 22nm lithography. The interesting fact is that dimensions like these are thinner than visible light's wavelength. Another interesting thing is that the gate oxide (the insulator that defines the transistor's behavior) at such a small lithography is on the order of a few atoms thick, so you guys can imagine how hard it is to build the processors we have today. ^^
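For anyone curious, the scale comparison above is easy to sanity-check in a few lines of Python (the wavelength range and lattice constant below are standard textbook values, not from the comment itself; the 1 nm oxide thickness is an illustrative assumption):

```python
# Back-of-envelope check of the scales mentioned above.
visible_light_nm = (380, 700)   # approximate visible wavelength range
feature_nm = 22                 # 22 nm process node
si_lattice_nm = 0.543           # silicon lattice constant

# A 22 nm feature is far smaller than any visible wavelength:
print(feature_nm < visible_light_nm[0])   # True

# A gate oxide around ~1 nm is only a handful of atomic layers:
oxide_nm = 1.0
layers = oxide_nm / (si_lattice_nm / 2)   # rough atomic-layer estimate
print(round(layers))                      # ~4 layers
```

The exact layer count depends on the oxide material and thickness, but the "few atoms" claim holds up.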
technically the i stands for "inside", Intel Inside being their slogan. Intel Core Inside 7 (product number), 7 being the series, the product number being the 4790, 6700, or whatever. At least that's how I understand it. Sure, the official reason is probably "because fuck it, why not", but I've always liked to think of the i as a stand-in for "inside"
thefilmdirector1 so 6950x = Intel Core Inside 9 = i9 cpu
You didn't even understand what he tried to say. The 6950X is an i7 anyway
thefilmdirector1 nope, it doesn't work like that. Remember the i7 920 and i7 2600/K. Not even a single 7
the 'idick"
The reading comprehension here is appalling..
Great vid, but the audio cuts almost overlap on your voice in parts
Intel is bae
+Screaming Tater Tots tru
Michael is bae
Intel is poop? Btw Bae means poop in a Scandinavian country
Shadowthief Gaming No shit Sherlock
+Screaming Tater Tots What's bae?
Great video. Love that this one is a little extra nerdy and off the usual gaming topics.
Awesome video, great commentary, and yes - many of these facts were indeed new to me. I especially liked how you explained the weird symbiotic relationship among these two chip-producing powerhouses.
To me, the "i" in i3/i5/i7 stands for "Intel". Simple.
And I'm watching this video having all AMD hardware on my computer...
Gio good
Gio Good choice
I thought Intel's name was taken from "intelligence"
oh i never thought of that
same... the truth is disappointing... :(
Delicious Kawaiigami yea right
I feel the need to bring up a correction - to my knowledge, Moore's law still applies in early 2016, as Intel is still consistently shrinking dies (although they're reaching the limits of silicon). The clock speed of a CPU does not necessarily correlate directly with the size of a transistor or the power of a processor; hence a 2 GHz processor from 10 years ago can be much slower than a 1 GHz processor from the present day.
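The clock-speed point above can be sketched roughly as performance ≈ IPC × frequency (instructions per clock times clocks per second). A toy Python example with made-up numbers, purely for illustration:

```python
# Hypothetical illustration: performance depends on work done per cycle
# (IPC) as well as clock speed. The IPC figures below are invented.
def relative_perf(ipc, clock_ghz):
    # Roughly: billions of instructions retired per second.
    return ipc * clock_ghz

old_cpu = relative_perf(ipc=0.5, clock_ghz=2.0)   # older 2 GHz design
new_cpu = relative_perf(ipc=2.5, clock_ghz=1.0)   # modern 1 GHz design

print(new_cpu > old_cpu)   # True: the "slower" chip does more per second
```

Real IPC varies per workload, but this is why MHz-for-MHz comparisons across generations are misleading.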
Great videos Jake! Keep up the great work man!
Best channel ever
+dragoloco700 thanks!
i chose intel because of being green, blood free and pushing forward.
and I am quite disappointed by the AMD CPU in my laptop.
Good reason dude
DO TOP 10 AMD FACTS :D
Cool Video, Very Interesting. Keep Making these awesome videos.
Great commentary. Funny and enjoyable, and these days most videos fail in doing that.
AMD reached a little over 50% market share in the mid 2000s, and it could have been more, had Intel NOT paid many OEMs NOT to use AMD hardware...
We have 10ghz CPU's.
If you're brave enough.
And you have water cooling.
Some Weeb water cooling cant handle 10GHz, you gotta have LN2 for that.
*****
Water cooling can totally handle 10ghz if you're using skylake.
Some Weeb no... All CPU records are done with LN2, and Skylake doesn't even get close to 8 GHz, let alone 10 GHz. AMD still holds the record for the highest GHz.
Dude, the CPU frequency isn't the point; the voltage is. If you want to reach a really high frequency you have to switch to LN2 anyway.
Well, if you multiply your cores by their GHz, then you get over 10 GHz :).
Now, having all CPU cores actually at 10 GHz or higher will still take a lot of time...
I was disappointed that AMD's 14nm RX 480 struggles to even hit 1400 MHz on the die while Nvidia is more around the 2000 MHz mark...
here is a fact: both Intel and AMD moved their main mind-blowing CPU work to Israel. Intel did it first, then along came AMD. Here is a fact: whether you're buying an Intel CPU or an AMD CPU or GPU, that technology comes from Israel.
Jews are smart like ayn rand
Another fantastic video. Keep 'em rolling!
Renewable Energy ftw, My respect for Intel just increased.
I love my G3258. Cost $70 AUD, clocked 3.2 GHz, OC'd @ 3.8 GHz, temp 24°C with an Intel liquid cooler at night
Just appreciate amd for making it cost low :)
That's a bit of an oxymoron
+unsigned int fat
introducing the iVac the first apple product that doesn't suck
Bobtheguy sports and gaming I have so many questions...
actually we can make 10 GHz CPUs, but the issue is the power it would take to run them
Kameron Swan no the issue is temperatures
and the only way it gets that hot is by having that much power pumping through it
Garrett Austin and that's why both Intel and AMD have slowed GHz increases, so that they can keep to a standard 65 watts or 95 watts without blowing up your board and that aftermarket heat-pipe heat sink you just had to have
Actually, the issue is more to do with stability than anything else. An 850W PSU would be more than enough to (theoretically) power a CPU at over 10 GHz. When it comes to temperatures, if the CPU is consuming at most 400W of electricity, that means the CPU is generating roughly 400 joules of thermal energy per second. The thermal dissipation properties of liquid nitrogen will more than suffice to keep the CPU cool. The actual issue is down to the architecture of the CPU, i.e., either the memory controller on the die cannot cope with the bandwidth, or simply the transistor gates themselves cannot oscillate at a frequency of 10 GHz.
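The power side of this thread follows the usual dynamic-power relation P ≈ C·V²·f (switched capacitance, voltage squared, frequency). A rough Python sketch with purely illustrative values; real chip capacitances and voltages differ:

```python
# Dynamic CPU power roughly follows P = C * V^2 * f.
# The capacitance and voltages below are made up for illustration.
def dynamic_power(c_farads, volts, freq_hz):
    return c_farads * volts**2 * freq_hz

base = dynamic_power(1e-9, 1.2, 4e9)      # ~4 GHz at 1.2 V
# Reaching 10 GHz usually also requires more voltage, so power grows
# much faster than the 2.5x frequency increase alone:
pushed = dynamic_power(1e-9, 1.6, 10e9)   # ~10 GHz at 1.6 V

print(round(pushed / base, 2))            # 4.44x the heat to dissipate
```

This is why frequency records need exotic cooling: the voltage bump needed for stability multiplies the heat output quadratically.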
then why don't they make the die out of copper instead of tin-bismuth
ANOTHER *BREAKING* FACT:
Intel thinks people shit money and charges $200 for an i5.... YEY! -_______-
+YellowPeaches Compared to cost of a lot of other items now, that's fairly low
+YellowPeaches I'm sorry? Do you expect to go to Walmart and buy an i5 for $20?
+YellowPeaches they're reliable and it's worth the price. Lower power consumption, cooler running, better performance over all.
do u even realize how these things get made... lots of cost, and as far as performance goes, Intel is the way to go... on a budget tho, go for AMD...
+Noah S. IK, but an AMD chip costs almost half as much and is also very reliable and good.. FYI
you know what, I am very happy I subscribed to your channel.
you guys are awesome, you not only keep uploading videos about gaming, you upload videos about technology too, and that's what I like about you guys.
you should start contests on your website if you have one; a quiz contest is better I think, where you can ask about gaming, technology, or about your channel.
it's a fun way of learning, or you can also set a prize for it.
well, whatever, I love you guys, keep it up.
this was all new info to me, some of it was amazing! Keep up the great work.
ALL HAIL THE ALL MIGHTY TOMCAT!!!
That jet shouldn't have been retired... with the proper upgrades it would completely destroy the F-18s (and it would've been cheaper to do those too...).
meanwhile IBM, GlobalFoundries, and Samsung are making 7nm chips and Intel is still struggling to make 10nm possible :3
Weibin Zhou this is for full-sized CPUs, right? not something that fits in a user's hand....
Devesh Singh nope, GF is helping AMD make GPUs :3
:3
You were joking right??? They do have it... do some research...
they don't. do some research
this is moore noyce than i already have in my head. a lot of words for 7 minutes
This was put together nicely, good stuff.
Moore's law has actually stayed pretty consistent and probably will for the next few years. The gigahertz of a processor does not come from Moore's law; it is the speed at which the internal clock of the CPU runs, NOT the number of transistors on the chip. Fun facts
Actually, the max clock speed of a processor is directly influenced by transistor size. MOSFET transistors work by physically moving electrons to make or break connections. To increase clock speed, you can increase how hard you push/pull the electrons (increase voltage), reduce the electrons' resistance to movement (chemistry stuff), or reduce the distance the electrons need to move (node size).
There is a side effect where reducing size also reduces max voltage, but that isn't what stopped clock speed increases. Due to low yield rates and a sharp drop in power efficiency around 4 GHz, it just isn't economical to go past 4 GHz. Intel tries to get around this with an extensive binning process, but the fact remains that for each 4.7 GHz processor they produce, they make many more that hang around 3 GHz.
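For reference, Moore's actual observation (transistor counts doubling roughly every two years) is just compounding. A quick Python projection, using the commonly cited ~2,300 transistors of the 1971 Intel 4004 as a starting point:

```python
# Moore's observation: transistor counts double roughly every two years.
# The projection below is just exponential compounding from the 4004.
def projected_transistors(start_count, start_year, year, doubling_years=2.0):
    return start_count * 2 ** ((year - start_year) / doubling_years)

count_2016 = projected_transistors(2300, 1971, 2016)
print(f"{count_2016:.2e}")   # on the order of 10^10
```

That lands in the low tens of billions, which is broadly the right ballpark for the biggest 2016-era chips, while clock speeds stalled near 4 GHz. That's the whole point of the thread: the law tracks transistor count, not gigahertz.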
Bullshit, the i is from I, Robot, the book by Isaac Asimov!
+Vinnie J you must be at least 50 years old to know that book, and no, I am only 23, but my father has that book and he is 55
911Maci Yeah, I know it through my dad too, but there is no way people at Intel and Apple don't know that book.
+Kian Gurney or i is for intel. 'Intel 3", "Intel 5", "Intel 7", etc
+danodude but what does the i in intel stand for?
Jack Le qjfopiqurqofipkajfdovjeqrdaf
There's a big difference between buying energy credits and actually being green. The number of documentaries I've seen on this kind of shit.... do some searching, but simply put, energy credits are what would allow a Hummer to claim it's just as green as a Smart car, or a coal plant to be just as green as a solar plant.
I'd say that they're still pretty clean, considering that they have multiple solar power plants on-site..
How about: Intel bribed Dell with $800 million not to buy AMD chips
Here's some more I don't think most people know/remember:
Intel started as a memory manufacturer. Their market was to replace core memory with transistor memory chips (DRAM, SRAM...). Core memory consisted of little magnetic cores carefully woven one bit at a time by hand, mostly by women.
The lithography masks for Intel's 4000 series of chips, including the 4004, were made by hand, by cutting and carefully placing little strips of masking material under a microscope.
Floating-point math used to be a coprocessor chip with the x87 instruction set. It was with the 486 (DX version) that you could finally get an integrated x87 floating-point unit.
The latency of RAM measured in clock cycles has ballooned ferociously, from running synchronously with the CPU with exactly one clock cycle of latency, to several hundred clock cycles of latency today. L1 caches were originally an add-on chip soldered to the motherboard; then they were integrated and L2 cache became a thing. Then L2 caches were integrated and L3 cache became a thing. Then L3 caches were integrated. Then memory controllers were integrated to reduce latency.
RAM isn't really random access anymore; it is much faster to read linearly through memory than randomly. A lot of the complexity of CPUs relates to dealing with RAM latency: the CPU speculatively prefetches memory from RAM, executes instructions out of order, and predicts branches and executes them speculatively (if it mispredicted, it has to undo everything). If you read randomly from RAM, the CPU can't predict, and your code may run more than 100 times slower than if everything were packed into one contiguous piece of memory and read sequentially.
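The sequential-vs-random effect is easy to observe yourself. A rough Python sketch (interpreter overhead shrinks the gap compared to C, and the exact ratio varies by machine, but the ordering effect is real):

```python
import random
import time

# Same total work, two different access orders over the same array.
N = 1_000_000
data = list(range(N))
seq_idx = list(range(N))
rand_idx = seq_idx[:]
random.shuffle(rand_idx)

def walk(indices):
    total = 0
    for i in indices:
        total += data[i]
    return total

t0 = time.perf_counter(); s1 = walk(seq_idx);  t1 = time.perf_counter()
t2 = time.perf_counter(); s2 = walk(rand_idx); t3 = time.perf_counter()

print(s1 == s2)   # True: identical result, different traversal order
print(f"sequential {t1 - t0:.3f}s vs random {t3 - t2:.3f}s")
```

The random walk is typically measurably slower because the prefetcher can't help, exactly as described above.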
x86 CPUs are internally RISC CPUs, and the internal instruction set changes whenever Intel feels like it, without affecting programmers. Externally they behave as if they were x86 CISC CPUs, but there's a big decoder that makes RISC-like instructions (micro-ops) out of everything, and those are what end up being executed internally.
2:33 where can I get this image ?
wallpaperscraft.com/download/processor_cpu_upgrade_installation_chip_robot_5633/1920x1200
renewable energy, i think im inlove with intel
well, 3 GHz x 4 cores is technically 12 GHz
No.
+Decan Andersen No, it's not straight multiplication, you moron.
+Decan Andersen Implying they work at 100% all the time
Implying every shit optimises their stuff that it doesn't almost solely use only the first core
Sticking 4 cars together doesn't make them go 4 times as fast.
UA-camwolltemeinennamennichtalsomachichihnlang but what if you glued them together and hooked the control to a single car, what if
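The reason cores don't simply add up is captured by Amdahl's law: speedup is capped by the fraction of work that can't be parallelized. A small Python sketch:

```python
# Why "4 cores x 3 GHz = 12 GHz" is wrong: speedup from extra cores is
# limited by the serial fraction of the workload (Amdahl's law).
def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even with 90% parallelizable code, 4 cores give ~3.1x, not 4x:
print(round(amdahl_speedup(0.90, 4), 2))        # 3.08
# And no core count ever beats 1 / serial_fraction:
print(round(amdahl_speedup(0.90, 10_000), 2))   # 9.99, approaching the 10x ceiling
```

So a quad-core at 3 GHz behaves nothing like a single 12 GHz core unless the workload is embarrassingly parallel.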
HEY, be honest, are you sponsored? Because from what I've seen and experienced, AMD in A LOT OF WAYS makes better products than Intel. Higher temp thresholds and faster CPUs are the two I've heard of most in the tech field. I'm not saying I'm Mr. Techwiz and know it all, just going by word of mouth honestly. Never done a comparison personally, so I know I'm kinda blowing smoke out my ass; I've just heard lots of bad things about Intel, from its business practices to the unnecessary limitations on its hardware.
+Evan Park not at all. What if I told you we're just recognizing our PC gamer audience and that we have an AMD video in the works as well?
Fucking hell, tell me, what strain did you smoke?
amd is worse
+Nick Schiener on what basis? emulation?
***** in overall performance disregarding price
all of tbe the facts were new to me, and they all were really interesting! keep up the good work!
2:02 - 2:05 and we have them, if you multiply the clock speed by the core count: you could say the first CPUs with a combined 10 GHz were actually made in 2007 (e.g. Core 2 Quad Q6700 - clock per core is 2.66 GHz; multiply it by 4 and you get 10.64 GHz). So Intel and AMD both BROKE the 10 GHz barrier, actually, in 2007.
First
At being a loser ☹️
Dood you're not a loser
+The VulpesFox but if you do, do a back flip
+Brian Junker i fist old people
+Brian Junker that's not what he meant by jump. He means don't jump off a freaking building to his death! Seriously, don't commit suicide
+Grant Ess he knows exactly what he means, but he's saying that he should do a backflip before his death XD
iPhone means idiot phone
Triggered Autism why does this comment exist
Triggered Autism
The "i" in front of Phone is meant to represent the supremacy of the iPhone over the competitors. The iPhone being the true and only intelligentPhone.
Radu Gabriel *9years ago.
Cloud Man
You mean SINCE 9-10 years ago.
yolo swacgity
Delete your comment please, you are making a fool of yourself.
AMD Please.
Wow, I'm surprised that I haven't come across this channel earlier. This is exactly how I like my tech.
A lot of facts, truth & humor in not too many minutes. Makes it interesting, educational & funny. Subbed.
How to start fanboy flame war:
Step 1: "[Brand X] is better than [Brand Y]"
Step 2: Wait...
Step 3: ????
Step 4: PROFIT!!!!
We Always Love watching your videos!
We had no idea about the standard socket!
Intel on Intel, alright - I'll play. They did make a chip that went into the flight control of the F-14 - that's true. Go back further. In the early days of integrated circuits, chipmakers would build a calculator company a chip with functions to add, subtract, multiply and divide, done with discrete digital logic. The Japanese calculator maker Busicom requested a new chip set with some more functions (sqr, sqrt, log etc...) and presented a circuit diagram, suggesting "like this." It was a pretty big, nasty circuit. Moore's team had been playing with the idea of putting not part, but all of a processor on one integrated circuit. This resulted in something a bit more complex than what Busicom had asked for, but it could do any simple math function and be given fresh microcode in a later model to add more, and it was all new and shiny, and engineers like that sort of thing. So they showed it to Busicom, who were super impressed, and they ordered it. Officially it wasn't a computer part - it was the "works" of a simple calculator, but it was integrated into a single chip, and it worked the way a processor core does, not the way a group of discrete digital logic does. That was about '71 (the 4004), and that was Intel's first processor on one chip. They also made a quartz-crystal timekeeping chip on a single piece of silicon, which brought them to digital watches at a fraction of the price of Texas Instruments or Seiko.... Before PCs, Intel made decent money (relatively) through digital watches and calculator chips.
Dude you're awesome. Great channel
Fun fact: the factories that manufacture CPUs are among the cleanest places on earth. A tiny speck of dust or dirt is all it takes to cause a defect in the manufacturing of these computer chips.
LAWL love your voice man, keep up the good videos!!
in your video on the history of Intel, you left out a very important chapter. A company by the name of Zilog made the Z80, a CPU that powered many CP/M computers. My first "personal computer" was powered by an Intel 8080 chip, and I later upgraded to Zilog's Z80, which was software-compatible and slightly faster. Unlike IBM PC compatibles, those computers used 8" floppy drives, and double-sided double-density disks held 860KB as opposed to the PC's 360KB. The CP/M system I had came equipped with 64KB of memory and ran at a speed of 4.77 megahertz. Running CP/M 2.2, it could recognize a hard drive partition of 32 megabytes, just like MS-DOS. These systems used serial terminals for input/output, so they were capable of little beyond ASCII graphics. Just a small bit of computer history for anyone who is interested.
Thanks for the facts. It was a fast way to understand more about a leader in computing technologies
PLEASE DONT STOP MAKING VIDEOS I LOVE THEM SO MUCH
i don't know why i keep coming back to this channel, i do find the topics really interesting
you are doing great bro :D
this is why i love you guys..... you give me information on shit i never cared about, and now I CARE ABOUT IT (keep it up)
Dunno what to comment so here's the recipe for a glass of water:
- Glass
- Water
Instruction
- Put water into glass
Enjoy!
+Haripazha Aezakmi How original...
+Haripazha Aezakmi thanks for the tips
All of this was new info for me :D. Thanks for those awesome 10 Facts!
great video man I did learn a lot thanks
Oh man, I love these vids u guys do.
For 2:27 - yeah, 3 GHz processors are now "the default", but there is a catch at the chemical and physical level. Instead of making a core go faster, we just add another core.
This way, the entire processing load is handled by multiple cores.
Sure, you can overclock a CPU if you do it properly and carefully.
Good vid, can you make one on AMD/ ATI?
Hahaa love your work fella, great vid
im new to pc building and gaming so i didnt know any of this about intel thanks for the video man.
I like how their "first chip" had the AMD symbol on it, as the two companies worked reasonably closely back then, I believe
everything was new for me, and I like the humor of the narrator. I'm already a sub, but I'll give a like
Would love to see 3DFX facts and ATI (RIP) facts. Good work.
10 GHz is actually about the physical limit that electrons allow us, but we will probably never reach that mark. The record was achieved in a lab using liquid nitrogen cooling and was around ~7 GHz, and it still died after a short while.
The only thing affected by quantum tunneling is transistor size
Great information guys!
I love these historical outlines. Great stuff.
next top 10 should be 10 things you probably didn't know about gameranx.
Wow! I must've gotten this as a kid. But I always assumed Intel was short for Intelligent, which I thought was a fitting name for a company.
*Mind Blown*
Cool summary.
The answer to #1 is lithography:
Think movie projector, but backward!
Instead of using optics to magnify a small image and project it on a wall / screen, you take a big image (the mask) and project it down to a tiny, tiny size.
Yeah. Easier said than done, when you have 3 billion transistors and your traces (wires) are 14nm wide, but that's the principle.
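The projection idea can be put in numbers. Modern steppers commonly use a 4:1 reduction, and the 193 nm ArF light source is actually much wider than the features it prints (a quick Python sanity check; the 4:1 ratio is the common figure, not something from the video):

```python
# Steppers project the mask at a reduction (commonly 4:1), so a feature
# drawn at 56 nm on the mask lands at 14 nm on the wafer.
mask_feature_nm = 56
reduction = 4
wafer_feature_nm = mask_feature_nm / reduction
print(wafer_feature_nm)              # 14.0

# The catch: the 193 nm ArF light used is *wider* than the feature,
# which is why immersion lithography and multi-patterning are needed.
light_nm = 193
print(round(light_nm / wafer_feature_nm, 1))   # ~13.8x the feature size
```

That mismatch between wavelength and feature size is why "just project it smaller" stopped being simple well before 14nm.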
2:01 I've been saying this for years: Moore's "law" hasn't held true for almost 20 years now, and yet every time I point this out people jump down my throat and call me a moron. We should be in the hundreds of TERAHERTZ by now if this "law" held true, not just 10 GHz
A top 10 AMD facts would be nice. As far as I remember, AMD has brought many innovations to the CPU field.
I gotta give you mad props for using the word "probably." In fact, I didn't know any of these things. The number of channels that skip that word (e.g. 10 Things You Don't Know About Porcupines), and then proceed to give a list containing two things I hadn't known about porcupines, make me want to coat their porcupine in lemon juice and then hit them with the porcupine.
I'm so glad i subbed to you today
I haven't seen a comment about this yet, so here goes: Moore's law is about the number of transistors on a chip, not the clock speed of the chip. With more transistors, a chip can do more in one clock cycle than a chip with fewer transistors running at a higher clock speed.
But small transistors can switch faster.
Could someone explain to me why the integrated circuits at 7:00 are rainbow-colored?
Dude, I didn't know any of this. Awesome :)