Is it the End for Moore's Law? - Computerphile
- Published 5 Oct 2024
- Moore's Law has held true for 40 years, but many say it will soon end - Can chip designers avoid the laws of physics? Professor Derek McAuley explains how chips are built.
Domino Addition - Numberphile : • Domino Addition - Numb...
EXTRA BITS: • EXTRA BITS - Silicon C...
/ computerphile
/ computer_phile
This video was filmed and edited by Sean Riley.
Computer Science at the University of Nottingham: bit.ly/nottscom...
Computerphile is a sister project to Brady Haran's Numberphile. See the full list of Brady's video projects at: bit.ly/bradycha...
At 2:28 I thought it was an editing error and the video had started replaying
Good video, but the editing is a bit off it seems.
The video just starts mid-sentence, without an introduction, and then some of the opening parts are reused toward the 3-minute mark. Strange.
I wish Brady would stop doing those intros with a bit from the middle of the film - I don't know what purpose it serves apart from confusing the viewer ( you think you're supposed to know what they're talking about but it's ripped out of context). This technique may work for some entertainment stuff, to whet your appetite, but I don't find it useful for educational videos.
The start of this video seems to be mis-edited
It's great to actually hear a real computer scientist talk about Moore's Law, instead of reading about it from 'wanna-be scientists' on various tech sites who clearly have no idea what they are talking about.
Great video.
Still a great video! And now, two years later, we're starting to see 16nm chips, with 14nm & even 10nm on the way. Moore's Law continues to hold true! How low can we go??
One of the biggest costs in the semiconductor industry is actually the rate of progress itself. If you have to upgrade your production line every 18 months to have a sellable product, then you only have 18 months to recover your costs. If computing power per area hits a wall, the cost per area will plummet and people will just start building computers with more physical silicon in them.
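The cost-recovery point above can be sketched with toy numbers. To be clear, the fab cost and output figures below are made-up assumptions purely for illustration, not real industry data:

```python
# Toy illustration of the comment's point: the faster a production line
# must be replaced, the less time there is to amortize its cost.
# All numbers below are invented for illustration only.

FAB_COST = 5_000_000_000      # assumed cost of a production line, USD
CHIPS_PER_MONTH = 10_000_000  # assumed chip output per month

def amortized_cost_per_chip(upgrade_cycle_months: int) -> float:
    """Fab cost spread over every chip made before the line is obsolete."""
    return FAB_COST / (CHIPS_PER_MONTH * upgrade_cycle_months)

for months in (18, 36, 120):
    print(f"{months:3d}-month cycle: ${amortized_cost_per_chip(months):.2f} per chip")
```

With these made-up numbers, stretching the cycle from 18 months to 10 years cuts the amortized fab cost per chip by more than a factor of six, which is the commenter's "cost per area will plummet" scenario.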
There are three massive areas in their infancy that will save the future of computing capabilities: 3D processors, neural networks, and quantum computers. I wouldn't worry about hitting a wall until these areas are better researched.
I doubt the improvements will continue to follow such a predictable curve as Moore's Law gave but we're nowhere near the limits of computation.
I remember 1997, when the first 90nm chips were coming out but most things were still 150nm. I was at a presentation by a fellow from ASML, who already predicted back then that they had a roadmap down to 22nm, but expected to hit the end of Moore's law after that.
One of the best explanations of the lithographic process I've seen
Would love to see a Computerphile explaining 'block chains' as used by bit-coin.
we definitely should have one about cryptocurrencies!
"bit-coin"? What is this, 2012?
Block-chains are more computationally interesting than cryptocurrency. The implications are larger: decentralized trusted ledgers via block-chains.
How is it not general CS? You can use it for literally ANY transaction based protocol. Call it by a generic term, but I'd assume people would want to know the origin as being bit-coin. Just my opinion.
Yes Computerphile should definitely explore the realms of Bitcoin, I own a UK Bitcoin exchange which is currently being tested. I also happen to be looking for a good programmer. > BTCSense
Thanks for the great vids, but the editing on this one was really confusing. I thought something had glitched and I hit refresh in my browser because he was repeating himself. But anyway, great stuff, one of my favorite channels on the tube!
Was told by my profs 10+ years ago that we were already at the point that Moore's law was about to end. Take it with a pinch of salt.
I'd still be pretty content about maxing out Moore's Law. When we have a limit to the expansion of processing power we could very well consolidate our social structures around the computer; just like with printing. It went through a formative process and then settled into a set form.
Don't suppose there's more footage of him explaining how a semiconductor works?
This is the perfect time to move on to how logic gates work to make "magic" happen on the computer. ;)
Why there are 40 seconds of totally random talk at the beginning of the video is beyond me. Why not start the video when he actually starts talking about the subject matter?
+colorschan Apparently it was the closest they managed to a "short summary of things to come". They have those on most computerphile videos. I suppose it is meant to create expectancy in the listener, but in this case, you're right, it doesn't work out that well, as it is just a snippet that reappears later.
I feel like a lot of these episodes start off in the middle of a one-sided conversation. There's practically zero background given at the beginning. It's usually given about halfway through, making the earlier stuff useless unless you go back and rewatch it. I still enjoy them, but I'd appreciate it if you could start the videos off a little differently.
Well, nothing lasts forever, but just as transistors replaced vacuum tubes (as I'm sure vacuum tubes replaced something before it), something will inevitably replace transistors as well and continue the trend until computers literally cannot get any smaller due to the physical limitations of the universe. Even then, who knows. I guess the real question is how much computational power do you need? Eventually I think it will come down to just programming them to make them run more efficiently.
I believe Ray Kurzweil predicts the Singularity will hit around 2039-ish, and at that point computers will (theoretically) begin to invent ways of improving themselves. At that point, all bets are off. Most people, including Mr. Kurzweil, predict that when computers start to upgrade themselves, they will do so at a rate much faster than that of Moore's Law. Too fast, in fact, for a regular human to comprehend.
0:17 and 2:22 are the same scene?
matrix glitch in this video?
I really loved the last couple minutes of the video describing how the laws of quantum mechanics begin to interfere with our computing as we begin working in such low scales.
I imagine 3D chips will become the next step, as that's probably the easiest to achieve once we figure out the heat situation. Quantum computing is a fantastic concept, but that's for our children's children's children to play with, not to mention that it computes in a completely different way, using qubits rather than our current classical bits.
Molecular computing would be a cool next step, considering we have been able to make transistors the size of carbon atoms, but that has its own world of instability problems.
New materials like graphene might help us for a while, and more efficient CPU architectures will also do the trick.
Pretty sure doping is not done using ions; that would literally make the material negatively or positively charged, whereas in a transistor the P-type simply has a hole where an electron can bond and the N-type has an unbonded electron, but the overall charge is still neutral.
Moore's law in its technical sense, yes, the end is nigh. But speaking in principle, no. There'll be shortcuts which make an effective circuit a lot smaller than it actually is. ARM processors are a prime example of how the principles behind Moore's law can carry on.
The end of Moore's law has been predicted since the beginning of Moore's law, though
Extremely interesting ...again. Thanks Brady.
George Edwards thanks for watching - though no credit to me, my mate Sean makes most of the computerphile videos.... >Brady
Whoops. Thanks be to Sean. Cheers, mate.
***** Well then tell Sean we said thanks!
So I'm a bit confused about how having fewer components in your circuit increases cost (the left side of the bowl of the Moore's law graph)
Stacking chips to have dense computing? How will that impact the heat transfer rate in computers?
We could also start building optical computers that use light instead of electric impulses.
I could listen to this fella talk for hours, interesting stuff :)
There is also the question of how much computing power we actually need. I mean, most applications hardly use all the power that hardware provides nowadays. Sure, there are always areas where you can't get enough computing power, like cracking encryption or real-time rendering of a photorealistic 3D environment, but these are not things that "Joe Average" would use or need on their desktop PC, laptop or tablet. So basically, for normal standard usage (except gaming of course, but that's one of those special applications I mentioned above), PCs that are 5 years old and older are most of the time more than powerful enough, and even a machine that is 10 years old or more can still be of decent use for lots of things. That wasn't the case in the year 2000, when a 10-year-old PC was practically useless for most things you might want to do with a computer. The oldest PC I'm currently using is a P3 from 1999. OK, it serves only as a print server in my wireless local network, but it is set up to be able to use the web as well, although it is dead slow if one actually tries that (even with the now 4-year-old Xubuntu 10.04 LTS on it).
so what's the actual fabrication limit before we start getting issues with the transistors? 15nm? 9nm? less?
I have been wondering about this for a while, myself.
I sort of lost focus when he started explaining how basic doping is done, though. I know that's not what the video was about, but his explanation was so clear to me, more so than any other I've encountered. Well done, Prof. McAuley.
Very interesting. Two small points, though -- there is not much point in having a taster at the beginning of a nine-minute video, especially if you are going to repeat it less than two minutes in. And it is disconcerting to have the speaker looking up at the interviewer -- far better to get down to his level.
What are your thoughts on graphene-based ICs? I know a lot of research has been done on it, but will it end up coming to fruition?
It seems like it might take a while to refine new manufacturing processes for graphene chips, so even if it does eventuate, how long might we be waiting for it?
This was a great episode, I wish you'd do more interesting stuff like this.
Why is every professor always showing a bipolar junction transistor to explain digital circuits? They are almost never used in integrated digital circuits. Is it just that they can claim it's a US invention instead of a German one?
They always start with "the first transistor was invented in the US in 1946"... and then when you learn about field-effect transistors, you learn one was patented by a German in 1926. Is it only me, or does 1926 come before 1946?
We are starting to "run into it/up against it" with current components. I wonder if a video might be done with Prof. McAuley on the rowhammer exploit.
i would like to see more videos like this on the physical parts of computers
Awesome video, I am sure we will think of something to solve our issues. Die stacking of course has its own problems, mainly heat build-up in the center of the stack. IBM was experimenting with 'computer blood', a liquid that flows through the stack and delivers current while removing heat. Can't wait. :P
I wish to see more about CPU architecture from Computerphile.
One current solution Prof. McAuley neglects is using different semiconductors that have better frequency response. You don't have to make the transistors smaller when you can really crank up the clock signal reliably.
Then you hit the thermal problem, and you will not get any speed advantages.
That can be solved with different transistor geometries.
No you can't. If the whole chip uses more power, then you hit the thermal barrier anyway.
The driving force today is the thermal ceiling. That's why GPUs run at 1 GHz instead of 4 GHz: at 1 GHz they use less than a quarter of the power. Making a chip four times larger but running at a quarter of the frequency produces less thermal power, enabling manufacturers to increase the size of the chip further.
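The quarter-power claim above roughly follows from the standard CMOS dynamic-power approximation P = C * V^2 * f; the capacitance and the voltage/frequency pairs in this sketch are assumed ballpark values for illustration, not measured GPU figures:

```python
# Rough sketch of why lower clocks save so much power. CMOS dynamic
# power is roughly P = C * V^2 * f, and lowering the clock usually also
# lets you lower the supply voltage (DVFS), so power falls faster than
# linearly with frequency. All numbers below are illustrative
# assumptions only.

def dynamic_power(c_farads: float, volts: float, freq_hz: float) -> float:
    """Classic switching-power approximation for CMOS logic."""
    return c_farads * volts ** 2 * freq_hz

C_EFF = 1e-9  # assumed effective switched capacitance, farads

p_fast = dynamic_power(C_EFF, volts=1.2, freq_hz=4e9)  # 4 GHz at 1.2 V
p_slow = dynamic_power(C_EFF, volts=0.9, freq_hz=1e9)  # 1 GHz at 0.9 V

print(f"4 GHz: {p_fast:.2f} W, 1 GHz: {p_slow:.2f} W, "
      f"ratio: {p_slow / p_fast:.3f}")
```

Even with voltage held constant, quartering the frequency quarters the dynamic power; once the voltage is lowered too, the ratio drops well below a quarter, which is the commenter's point.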
Moore's Law was not coined by Moore. The term "Moore's law" was coined around 1970 by the Caltech professor, VLSI pioneer, and entrepreneur Carver Mead in reference to a statement by Gordon E. Moore.
Thank you for this video! I'm going to start requesting for an electronics channel now :-)
I've always wondered this, though I suspect the answer is more complex. Why can't we just have a larger CPU/chip? I see the value of being small, but if we just made it twice as thick, wouldn't it be about twice as fast? I could have a smaller radiator, fan, etc. and a faster computer. Why are all the chips so small, why not larger?
Because the larger it is, the further information such as electricity has to travel, and therefore the slower it is.
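The distance argument above can be checked with a back-of-the-envelope calculation; the assumption that on-chip signals travel at about half the speed of light is a rough ballpark for illustration, not a process parameter:

```python
# Back-of-the-envelope on why bigger chips mean longer signal delays.
# On-chip signals travel at some fraction of the speed of light; the
# 0.5c figure here is an assumed ballpark value.

C_LIGHT = 299_792_458.0       # speed of light in vacuum, m/s
SIGNAL_SPEED = 0.5 * C_LIGHT  # assumed on-chip propagation speed, m/s

def crossing_time_ns(die_size_m: float) -> float:
    """Time for a signal to cross the die once, in nanoseconds."""
    return die_size_m / SIGNAL_SPEED * 1e9

for size_cm in (1, 10):
    t = crossing_time_ns(size_cm / 100)
    print(f"{size_cm:2d} cm die: {t:.3f} ns ({t * 4:.2f} cycles at 4 GHz)")
```

Under these assumptions, crossing a 10 cm die costs a couple of 4 GHz clock cycles of pure travel time, while a 1 cm die fits comfortably inside one cycle, which is one reason chips stay small.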
Could we not just make the chips bigger? They don't take up much space in a computer compared to other components.
Finding the best Micro-architecture will be our next challenge I believe. Often times we create things without understanding their full potential. MRI machines provide a great example of this. We didn't even know we could use the machine to perform fMRI procedures until well after they had hit the market.
Wow havent seen dot-matrix printer paper in a long time.... I don't know why but a part of me is glad it's still around
that video sums up my first lesson of electrical engineering
If we can't decrease the size of transistors, why can't we just increase clocks and the number of cores (with larger CPUs), like we did at the beginning of the CPU race (minus the cores part)?
You should do a video about extreme overclocking, explaining how size matters in terms of heat, and how both relate to MHz. I think it would be interesting for people that don't know how that works... although if you are subscribed to this channel, you probably do.
How would you even detect an error from such a single thing? And to correct it, wouldn't that require simply more of them?
The moiré effect on his shirt mesmerises me
So is the Windows blue screen a quantum glitch?
I read an article about the actual wiring in the circuits: the wires are so thin that the electrons flowing through them cannot travel side by side. There is not enough room. They have to travel single file, and that is a problem because it means the wire size has reached its limit. It can go no smaller. I don't know what the long-term solution is, but that is an example of just how the laws of physics are screwing up Moore's Law.
Even if we do reach the end of Moore's Law (and I doubt we will any time soon) we will still keep discovering newer and more efficient means of programming. I mean, look at the brain: it's big, it's slow, and yet it performs amazing functions nigh unfathomable to the modern computer scientist. I wouldn't be too worried.
This... could be a way... to go to Quantum Computing.
The end of Moore's law will be a boon for consoles, as well as poor countries.
We can probably go beyond the 5 nanometer limit by using smaller atoms or using elementary particles, but at this point it may not even be necessary to have so much power.
how would we continue moore's law AFTER quantum computing?
You can keep making them smaller but keep them "thick" (on the z axis) so they can continue to work.
I'm really excited for the 3D technology, I actually have a job lined up in that field after I graduate in a month! Woot Woot!
Will we move to multi-CPU systems when processors hit their limit? Will we go from the traditional one CPU to, like, 4 12-core Atom processors?
We have already been moving away from CPU-heavy solutions for about 10 years now. What has held us back on PC (and Xbox) is DirectX forcing the CPU to run in lockstep with the GPU. Now with DX12 this problem is mitigated: the GPU is more CPU-independent, so you basically won't have to use a high-spec CPU to get a great frame rate.
Do you normally repeat footage, Brady?
why not mention quantum computing on this subject
My question is: why is it called Moore's Law instead of Moore's Theory?
My gut reaction: we need to find something other than the transistor, then.
Nowadays IBM is down to 8nm.
Enjoyed this. Thanks #DerekMcAuley and #Computerphile .
The reason for Moore's law ending is not quantum tunneling. It is the power wall and the exponential growth of power density.
www.eetimes.com/document.asp?doc_id=1259039
Could Graphene Chips running at 100s ghz help?
Have a look at nano tubes (graphene) we could go a bit further with that
Intel already has 5nm chips running in the lab. This will ensure Moore's law continues for at least 10 more years.
So basically, if quantum computing doesn't pick up in the next decade or two, we're going to need either CPUs with ridiculous amounts of cache or potentially hundreds of gigabytes of ECC RAM to do simple computing tasks
how far are we from that limit?
RAUL FERNANDEZ Moore's law should end around 2020, so we are pretty close. This doesn't mean that computers aren't going to get faster; computers will continue advancing.
Lucas Physics Next up; quantum processing :D
Do they literally mean stacking?
wow, stumbled across this when we're at 12nm (4.2ghz)/14nm (5ghz) to 7nm (2.5ghz ARM)
Intel i5s run at 3 to 3.4 GHz at 22nm.
In 2010 IBM constructed a graphene transistor with a clock of 300 GHz at 120nm.
Looks like graphene will replace silicon.
So should it still be called a law?
He repeated the part about the ARM processor
Ehh, not so much stacking of chips, as there will be static and heat interference problems. It's more likely that curved chips and dedicated piping will be the next best method. That takes advantage of third-dimensional real estate while dispersing heat more evenly; you can fit three times more chip on a tube of the same width, grade-school math. And taking advantage of how differently isolated atoms act compared to groups of atoms, it could be easy to create multi-element transistors only a few atoms wide, using the quantum-mechanical variables to our advantage.
3D computing, multi-core systems, light computing...
There is plenty of stuff we don't use yet, so I don't think it's over yet.
Multicore systems are an alternative to higher clock speeds, not to higher component densities. Also, you're just assuming some technology will step in and everything will continue as usual. There's no law of nature to ensure this. In fact, if there is any such thing as a physical law in this then that exponential growth will at some point stop. Same goes for economic growth BTW. People just assume it will keep growing because it did so for a couple of decades. Stupidest mistake ever.
What if we stayed at 14 nm and made the chips bigger? And... if radio can get to the moon and back in 2.3 seconds and electrons move at 60% of the speed of light, why oh why does it take my computer 60 seconds to boot?
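On the moon figure above: a quick check with the average Earth-Moon distance puts the round trip at roughly 2.6 s, so 2.3 s is in the right ballpark (the distance varies over the orbit). Slow boots, for what it's worth, are dominated by disk I/O and software initialization rather than signal propagation.

```python
# Round-trip time for a radio signal to the moon and back, using the
# average Earth-Moon distance (the actual distance varies with the
# orbit, so the comment's 2.3 s figure is roughly in range).

C_LIGHT = 299_792_458.0        # speed of light in vacuum, m/s
MOON_DISTANCE = 384_400_000.0  # average Earth-Moon distance, m

round_trip_s = 2 * MOON_DISTANCE / C_LIGHT
print(f"Round trip to the moon: {round_trip_s:.2f} s")
```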
Okay, how about a positronic computer system.
Without a "Quantum Leap" we're pretty much at the end of the line. Don't hold out much hope for "quantum computing" in the near future, particularly in relation to "normal" computing. Perhaps molecular "printing" (which doesn't exist yet) will keep Moore's law alive for another generation... that seems the only hope.
What about heat?
what about quantum computers?
I believe Moore's law will continue, just slower, as most of the quantum doors in applicable silicon have been opened, but there will be more doors to open.
If it's slower, it doesn't follow the Moore's law anymore.
Well, there is always the promise of CPUs made of carbon nanotubes and quantum computing. Wonder when that's going to happen...
why would we make computers out of carbon nano tubes?
Converting to micro vacuum tubes might bring costs down
As I see it, until we find an alternative to binary computing or electrically driven computers, the size of chips will increase to meet ever greater demands.
But one day someone really bright will figure out a new breed of computing that won't be limited by size but by how the component is used. Programmers capable of working with this new technology will be kings of the world.
Cadde "But one day someone really bright will figure out a new breed of computing that won't be limited by size but by how the component is used."
Can you explain what you mean by this? It sounds like kumbaya to me.
mountainhobo If I could explain it then I wouldn't be explaining it... I would be getting rich off it.
I really wish I was that bright, but I am not.
22 nm that's like from here to Australia !!!!!
Brady, either I'm losing the plot or the video starts again a few mins in?
Technically, the drawing at 4:00 is of a diode, but otherwise I like this vid
Basically, it only lets electricity flow just in one direction
Alright, an electric check valve
we still have quantum computers
No more Moore? Say it isn't so!
I love these videos, but please stop copying a 15s clip from the middle and putting it at the beginning. It's really weird to run into it a second time mid-video.
A couple of thoughts. One person criticized the processing of the human brain because the neurons 'leak' their signal; it seemed to me this could actually be helpful. You mention 3D and architecture. My engineer mind tells me that cost savings will still be accomplished here by changing the design fewer times, and then, of course, by adding more of the currently discrete components into the architecture. Specializing the architecture to the task, such as having a video-editing processor and a slightly different gaming processor, etc. Efficient cooling can also yield performance gains, possibly to the point of using liquid and ultimately super-chilled architectures, especially with all the discrete components under the chip cover. It seems to me that we could actually use chemistry itself to make the transistors, and/or we could get a transistor with more states (0, 1, 2), and then possibly quantum computing. Still, some theoretical limits remain daunting, like weather prediction and global-warming modeling.
Until new materials and techniques correct for the tunneling effect.
Multiprocessors and asynchronous processors are one solution.