Correct me if I'm wrong, I looked into this recently… but we can't physically "see" individual atoms because they're mostly empty space (and something to do with the wavelength of light). We can only observe the effect of individual atoms on their surroundings, which is what our images show.
He seemed to be hinting that there's a way to exploit the laws of quantum mechanics (instead of being hindered by them) within the context of classical computing. I'd be very interested to learn more about that possibility.
5:40 "as a physicist, it's not that you understand it. you just get used to it" wise words. I tried understanding the wave nature of electrons, lost half of my hair just getting my head around it, and I'm not even a physicist. Physics can be addictive. Also, it can be intuitive and unforgivingly confusing at the same time. I know a person who'd agree with that last statement. That is, Mr. Erwin Schrödinger.
I swear every time a computerphile video ends, when I hear those beeps, I start singing "Askepios" by the Mars Volta. I love it. At the end of each video I literally start singing "I'll be there waiting..." and start asking myself "damn what song is that?"
With feature size shrink, we're already running into really tough problems. Notice how there haven't been many processor design improvements lately (pipelining being one past example); manufacturers have instead resorted to parallelization (multiple cores on a single die) and increasing clock frequency. Plus we're starting to see feature sizes so small that operating one component affects the operation of adjacent components (e.g. rowhammer in memory chips). 14nm may be the practical limit for feature size.
Funny thing is, average clock frequency is actually decreasing, mainly because of the popularity of notebooks. Power consumption and heat output rise faster than clock frequency, so it's more power-friendly to have slower parallel circuits instead of one fast circuit. There are very few algorithms which can't be parallelized. The circuit development simply mirrors the demand. Even with that in mind, microchips are still the fastest computing machines we can build today. For example, a single consumer-grade CPU is more powerful than all quantum computers built to this date combined, and the quantum computers are all built to perform specific tasks. Biocomputers have only recently been built as a proof of concept.
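The slower-but-parallel trade-off can be made concrete with Amdahl's law; a minimal Python sketch (the parallel fractions here are illustrative, not measured):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Maximum speedup when only part of a workload can be parallelized."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

# A 95%-parallel workload on 8 cores falls well short of 8x:
print(round(amdahl_speedup(0.95, 8), 2))  # 5.93
# A 50%-parallel workload barely benefits at all:
print(round(amdahl_speedup(0.50, 8), 2))  # 1.78
```

This is why "very few algorithms can't be parallelized" matters: the serial fraction, however small, caps the gain from adding cores.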
Intel started 10nm transistor production a couple of weeks ago; their 10nm CPUs are coming out next year. IBM said they managed to make a 7nm transistor, so we will for sure go at least to 7nm. Below that we'll have to see, since quantum effects start there.
Nadir Jofas , I'm referring to the pursuit of more processing per unit of time. At first, chip designers thought of things like pipelining, branch prediction, and so forth, and physical/fabrication improvements to allow clocking the parts faster. For a relatively long time now, there hasn't been any improvement in that sort of design, so now parallelization seems the only practical avenue left. But as others point out, for some problems that doesn't help because they're not intrinsically able to be parallelized. Still, for other problems/workloads, we're trying die shrink to get more cores in a single package.
It's funny that the process for etching those silicon wafers is basically the same as making printing plates. I don't understand why the electron beam is so slow though. Like, I get that it has to scan the entire surface, but doesn't a CRT TV do that with a much larger surface many times per second?
Yeah, but I'm saying an electron beam can refresh an entire TV screen like 60+ times per second, right? So I don't see why it would take so long to scan a tiny silicon wafer.
Copydot This is different from the beams in a TV. Because of the sensitivity of the chip, only low charge density electron beams can be used. I just looked it up and was surprised to find the process takes days.
The main difference is that in a CRT you only need to hit a few points on the phosphor layer. On a chip, you have to hit every point very precisely, at much higher resolution. Another thing is that if you used the same high-power technology as in CRTs, you'd actually dope the silicon with electrons or change the bonds between atoms.
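The "takes days" figure mentioned above is easy to sanity-check with a back-of-envelope estimate (all numbers below are illustrative, not any real tool's specs):

```python
# Rough direct-write e-beam time for one wafer, serially, spot by spot.
wafer_area_mm2 = 300**2                         # bounding square of a 300 mm wafer
pixel_nm = 10                                   # one 10 nm x 10 nm exposure spot
pixels = wafer_area_mm2 * (1e6 / pixel_nm)**2   # spots to visit: 9e14
rate_hz = 1e9                                   # optimistic 1 GHz spot rate
hours = pixels / rate_hz / 3600
print(f"{hours:.0f} hours")                     # 250 hours, i.e. ~10 days
```

Even with a generous billion exposures per second, visiting every 10 nm spot one at a time takes days, whereas a CRT only sweeps a few hundred coarse scan lines 60 times a second.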
So... a moving electron creates a magnetic wave/wake in its passing that expands radially. Can you take advantage of these wakes and layer the silicon wafers and their circuitry so that layer two provides power to layers 1 and 3 as well (with less current/voltage, of course, but still enough to open or close gates)? Perhaps even have currents passing laterally through the wafers, connecting circuits on all layers at differing voltages, with three-way gates that open differently depending on whether the currents are direct or eddy currents, etc. Would these methods be innovative for the industry, or are they already in use?
Hearing about 14 nanometers being the smallest you can go, while my CPU is made with a 7-nanometer process, just shows how fast technology moves on. This is from 4 to 5 years ago, and at that time their ultimate goal was 13.5 or something like that! Science moves fast!
This is great info. Since I come from a physical science background, this shows the application of what I learned. I wish they had a class on the physics of computers.
My personal guess on what happens when the industry hits the wall is that they'll figure out ways to build them larger. They are already doing this with RAM, and once they figure out how to build chips which emit almost no heat, they will do so with processors as well.
Speaking of the 'integrated whole'... Years ago I read Richard Feynman's 'Lectures on Computation' (certainly less popular than his lectures on physics, but Feynman did a bit of computer science as well) and he described a model of computation that theoretically required zero energy. I don't know enough quantum physics to understand the exact mechanism (it had to do with particles moving from an excited state to one of multiple rest states I believe) but I've never heard about it anywhere else. I would love to hear some academicians talk about this idea. Or about amorphous computing, although that is still a fairly specialist field... Oh, but memristors! I'm sure you could easily do a video about them! Please don't get distracted by the whole "neuromorphic processors" stuff, that's all anyone ever talks about. Talk instead about the impact they could have on computer architecture - Everything, CPU, registers, L1 cache, L2 cache, L3 cache, RAM, mass storage, all of it could be done with a single large array of memristors. And portions of the array could dynamically change from providing memory functionality to providing computational capacity faster than a RAM read. That could change everything!
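The "zero energy" model mentioned above connects to Landauer's principle: only *irreversibly* erasing a bit carries a minimum thermodynamic cost of kT·ln 2, which is exactly what reversible-computing schemes like the one Feynman described try to avoid. At room temperature that floor is tiny (a quick check using standard constants):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300              # room temperature, K

# Landauer limit: minimum energy dissipated per irreversibly erased bit.
landauer_joules = k_B * T * math.log(2)
print(f"{landauer_joules:.2e} J per erased bit")  # ~2.87e-21 J
```

Real transistors today dissipate many orders of magnitude more than this per switching event, which is part of why the reversible-computing idea keeps resurfacing.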
I believe what you are essentially talking about is quantum computing. It uses photons as information. By emitting light onto surfaces at the atomic scale you 'bounce' the photonic energy into an atom, causing the electrons 'orbiting' the nucleus to jump up energy levels. When atoms become more complicated these energy levels gain sublevels. The energy a photon carries, which equals the Planck constant times the frequency of the light, is the smallest obtainable quantum of energy. This energy can be transported just like electrons (i.e. information) and can be stored in the energy levels (i.e. memory). If you can read/write (manipulate) photons you can make a computer using the lowest resolution our universe has to offer.
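The photon-energy relation the comment leans on, E = hf = hc/λ, is easy to check numerically for the wavelengths actually used in lithography:

```python
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
eV = 1.602176634e-19 # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """E = h*c/lambda, converted from joules to electron-volts."""
    return h * c / (wavelength_nm * 1e-9) / eV

print(round(photon_energy_ev(193), 2))   # 193 nm DUV litho: ~6.42 eV
print(round(photon_energy_ev(13.5), 1))  # 13.5 nm EUV: ~91.8 eV
```

The jump from ~6 eV to ~92 eV per photon is why the video describes EUV as "very short wavelengths and high energies".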
Hello there! I am not educated in science or anything related to the topic; I am an analog photographer. I think I understand the basic function of what he is explaining, but I have one question the answer was not given to. In photography the resolution on the film is partly due to the size and shape of the silver crystals that later turn into metallic silver. What is the light-sensitive layer on the wafers made out of? How is it able to render such fine detail that most of the worry expressed in this video goes into making the exposure wavelength smaller? That would be very interesting to me. Thanks! I really liked this video of yours, truly amazing.
It might just not be so interesting, because it's an excellent conductor, but not necessarily a good SEMI-conductor. To be more precise, graphene doesn't have what's called a band gap (unless put under mechanical strain, or stacked in several layers). In the end, it's still too complicated to make and does not (theoretically) lead to much performance improvement anyway...
I don't know why, but he reminds me of Roy from The IT Crowd. Just, of course, more intelligent than "Hello IT, have you tried turning it off and on again?"
There are metal layers on top of the transistors. E.g. Intel 45nm has 9 metal layers. You've always needed room for these, ever since the invention of the microchip. So chips were never flat.
Yeah, I know that. That's not what I meant. More along the lines of what memory chips have been going through lately, with 32 or 48 layer stacking, except for CPUs. TSV, etc.
I was wondering, what is the limiting factor in hard drive size, both physical and storage related? If you could do a video explaining, that would be appreciated.
Amazing video! I have a question about Professor Moriarty's explanation of how the semiconductor industry is able to create such precise patterns on transistors. When the two offset masks are placed over the silicon wafer and light is shone, how is it that the light is able to deterministically etch a pattern? Why would it not behave like a wave/particle in the double slit experiment and diffract into a probabilistic wave pattern on the wafer?
We are getting limited designing new electronics because of shot noise. The transistor grid is so small, and so close to the size of an atom, that the oscillation of the atoms themselves affects the transistors, generating "shot noise". So it's impractical to build transistors any smaller even if we could. That is why Ivy Bridge and Skylake run hotter, besides the fact that they draw more power. We've pretty much reached a point of diminishing returns. It will be interesting to see if we hit a dead end for a while; the next 10 or 20 years will tell.
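For a feel of the scale, the standard shot-noise formula says the noise current grows only with the square root of the signal current, so it matters relatively more as currents shrink. A sketch with illustrative numbers (not from any specific process):

```python
import math

q = 1.602176634e-19  # elementary charge, C

def shot_noise_rms(current_a, bandwidth_hz):
    """RMS shot-noise current: sqrt(2 * q * I * bandwidth)."""
    return math.sqrt(2 * q * current_a * bandwidth_hz)

# 1 uA through a junction, observed over a 1 GHz bandwidth:
i_n = shot_noise_rms(1e-6, 1e9)
print(f"{i_n:.2e} A")  # ~1.79e-8 A, i.e. ~1.8% of the signal
```

Halve the signal current and the noise only drops by ~1/√2, so the signal-to-noise ratio gets worse as devices shrink.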
"You don't understand it, you get used to it" sooo relatable
Yep
females xD
Isn't it like that for everything we "know"? Classical physics is just as strange, we are just used to it.
I don't see why it's so common to have a problem understanding quantum physics.
Absolutely. I think our limited perception and understanding is a huge barrier to our learning process. We need to start disregarding these impediments, such as the fact that our everyday reality doesn't match some of our scientific observations, or otherwise we won't advance. If it were obvious, it wouldn't be exciting. "If it's easy, it's not worth it".
"You don't understand it, you just get used to it"
Probably one of the best physics quotes ever
Yeah that is actually very profound , it keeps you going.
It sounds like the food that's served in US prisons and jails!
John von Neumann was the first one to say it. Give him the credit.
Phil, aren't you too large to call yourself a nanoscientist?
Dad, I told you not to make these jokes in YouTube comments!
its okay, he identifies as a nanoscientist!
cissized pig
He was referring to little phil... downstairs.
If he's a nanoscientist, I hope I never meet even a microscientist, let alone a full scientist!
My favorite connection from physics to electronics, is the fact that quantum tunnelling effects are at the heart of how flash memory and EPROMS work.
It seemed like the video ended while he was still explaining something.
Prof. Moriarty actually never stops talking. The best you can do is turn off the camera just as he changes topics.
He was about to unify classical and quantum physics
He was about to show his pee pee
This is one of my all-time favorite videos on YouTube! I have watched this video about 7 times now and I just absolutely love how well Mr. Moriarty explains semiconductor transistor manufacturing.
More of this guy please.
I don't always watch computerphile (over my head) but you could put Phil Moriarty in a video about paint drying and I would watch it........oh wait.
I want more of this guy. He could start his own channel and just talk and I think he'd have thousands of subscribers;)
He has his own channel. Called Phillip Moriarty.
He does have his own channel! Moriarty2112, or you could follow Sixty Symbols where he has many videos about physics!
You should watch Sixty Symbols, he features in literally several dozen videos there.
+Phi6er Aww really? I liked this guy. :/
seconded
Prof Moriarty is one of my favorite guests on any of the “phile” videos. Awesome guy and very good at breaking things down to a level I can understand.
Why don't they just download more RAM into the electron beam to make it go faster?
I'll create a GUI interface using Visual Basic to see if I can track down an IP address for the download.
CSI?
Usual Hollywood hacker nonsense, that particular excerpt is from CSI:NY.
Justin Bell I thought so
"we need to hack faster !!!"... 3 people typing at once... on the same keyboard
Being a chemist, having just listened to a physicist talk about mechanics for the purpose of computing, I just realized that the electron couldn't care less about how it's manipulated and by whom.
This was a really informative video. Some illustrations/animations to visualize what he said would have made the video even better, although I understand that they take quite some time to make.
Check out en.wikipedia.org/wiki/Photolithography
i agree. His experience as a professor really shines through.
I'm not sure if there's a definite correlation, but all the physics professors I had in college were the best teachers I ever had. Along with their ability to explain things they were passionate/excited about their field
I love how excited he gets to answer each question and you can tell it’s genuine too
That was a brilliantly clear and energetic overview of modern chipmaking. Professor Moriarty explained how a silicon transistor works, but didn't label it as such. The silicon substrate is formed into transistors by adding impurities (doping).
I usually understand just 25% of these talks but I just love this channel and will keep coming back to it again and again. Thank you for this!
hi there camera man!
Hello!
Was the camera man sitting on a basketball?
not enough giggles for that to be the case.
And this is why movies tend not to use real mirrors!
movies use real mirrors, they just don't have the camera face on with the mirror
Watching this in 2019, they are now manufacturing 7nm microprocessors, how things move on.
Intel 14nm has 8nm-wide fins in a FinFET transistor; I think they are making features a bit smaller than 7nm in the absolute sense.
And with what he calls extreme UV (EUV).
@SuperTanner But how can they go smaller than atoms?
And tomorrow morning a 5nm machine’s getting delivered to my home.
Let’s come back in a year, see where we’re at
we are in 5nm stage now
Very good speaker, very well explained, and engaging. This really helped me understand exactly what the subject was about. I'd love to have this guy as a teacher.
2019 update: 5nm in the works, 7nm in production (AMD Ryzen 3000 series, e.g.)
So if 14nm = 50 atoms, 7nm = 25 atoms, 5nm = 17 atoms. Getting there
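Using the video's "14 nm ≈ 50 atoms" as the implied atomic spacing, the arithmetic above can be checked directly (with straight rounding, 5 nm comes out closer to 18 atoms):

```python
# The video's "14 nm ~ 50 atoms" implies ~0.28 nm per silicon atom,
# close to the real spacing between atomic planes in silicon.
spacing_nm = 14 / 50

for node_nm in (14, 7, 5):
    atoms = round(node_nm / spacing_nm)
    print(f"{node_nm} nm -> ~{atoms} atoms across")
```

Either way, the point stands: at these node names a feature would be only a few dozen atoms wide, if the names still described real dimensions.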
3nm in lab
yet they are too inefficient
That's not the case there. 5nm isn't exactly 5nm. It does not represent a geometric dimension on the transistor. It's just the technology's name. You can call it "marketing". The real limit for the geometric dimensions of a transistor is 7nm, nothing more than that.
For silicon ^^
One of my favorite Computerphile videos. Great explanation of concepts I've always wanted to understand. Thank you!
I can never get enough of Professor Moriarty. Such a fantastic and interesting person!
You should have seen him in Sherlock Holmes
I brought up the fact that Chemistry and Physics and ultimately everything that follows is a seamless whole just divided into digestible parts to a chemistry instructor once, and he almost flipped his lid. It was almost the same reaction from the physics department, yet they worked together constantly on things, though the chemists tend to be more reserved and the physicists tended to let their reservations go a bit, especially on things that went "boom". Nice to hear the Professor say nearly the same thing, about the relationship.
Just a silly question: doesn't diffraction screw up the lithographic process, considering that light has to go through such tiny apertures?
Yes, which is why you use light of very small wavelengths.
The wavelengths they're using in mass production today are not that small. I believe 193nm is still standard, even though they're making features as small as ~14nm (actually many parts of a "14nm" process are not 14nm, but it's all pretty much to scale compared to older process nodes like 22nm).
The next step is (supposed to be) EUV, where they do drop to very short wavelengths and high energies, as discussed in the video. They are having a lot of issues getting that to work for mass production though.
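The mismatch between 193 nm light and ~14 nm features is why tricks like multiple patterning are needed. The Rayleigh criterion gives the rough single-exposure limit; the k1 and NA values below are typical published figures, not any specific tool's specs:

```python
# Minimum printable half-pitch (Rayleigh criterion): CD = k1 * lambda / NA.
def min_feature_nm(wavelength_nm, k1, numerical_aperture):
    return k1 * wavelength_nm / numerical_aperture

# 193 nm immersion lithography, aggressive k1, NA ~1.35:
print(round(min_feature_nm(193, 0.28, 1.35)))  # ~40 nm
# EUV at 13.5 nm, NA ~0.33:
print(round(min_feature_nm(13.5, 0.4, 0.33)))  # ~16 nm
```

A single 193 nm exposure bottoms out around 40 nm, so anything finer on a "14nm" process comes from multi-patterning, phase-shift masks, and similar resolution-enhancement techniques.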
The diffraction of light really is a limiting factor when reducing the size of features. One of the ways semiconductor manufacturers get around this is phase-shift masking, which Prof. Moriarty explained as two masks just slightly offset from each other.
I didn't get how shifting the two templates helps
+sewer renegade There are actually several ways of making phase-shift masks. I think Professor Moriarty is conveying the general concept: the edges of the photomask phase-shift the light passing by, so that when the light reaches the photoresist on the wafer, the edges of the patterns are enhanced by constructive and destructive interference of the light waves, making features smaller than the wavelength of light possible.
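The underlying effect is plain two-wave interference: a π phase shift between two equal-amplitude transmitted waves produces darkness where they overlap, and that enforced dark line is what prints the sharp edge. A tiny numeric check:

```python
import math

def combined_intensity(phase_shift_rad):
    """Intensity of two equal-amplitude coherent waves: |1 + e^(i*phi)|^2."""
    wave2 = complex(math.cos(phase_shift_rad), math.sin(phase_shift_rad))
    return abs(1 + wave2) ** 2

print(round(combined_intensity(0), 2))        # in phase: 4.0 (bright)
print(round(combined_intensity(math.pi), 2))  # pi shift: 0.0 (dark edge)
```

A phase-shift mask engineers exactly this π difference between light passing through adjacent regions, so the boundary between them is forced dark regardless of how much the individual beams have diffracted.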
I really admire Phil Moriarty's ability to talk around naming concepts like quantum tunneling, and keep his talk on track even with tangential questions.
To elaborate on the "Layers" question, yes, it's very much done in layers. In fact, even the first layer wasn't fully described here. After the exposed (or unexposed, depending on the process) areas of the polymer are washed away, another layer of some material is applied. Then the places where the polymer remained are washed away, leaving the new material only in the gaps. The material can be dopants for the underlying silicon, metal layers to connect components, insulating layers to separate things, etc. It can even be exposed to etchants, rather than a new material, to remove whatever layer is showing.
Three years ago they were talking about 14 nm, today we're talking about 5 nm; so this video becomes history in less time than it would take to study Electronics.
Unfortunately the 5nm and 7nm are really brand names rather than actual feature sizes - they are usually a much more refined 10nm process capable of higher densities, eliminating issues with previous 10nm and 14nm processes.
Essentially we are right up against the limits of manipulable sizes when it comes to computing - almost every improvement now comes from extremely complex and well-designed architecture - cache, for instance, which is what AMD rides on for its superior performance of late with the 5000 and upcoming 6000 series chips
@@aravindpallippara1577 You need to search before replying, and address what was written.
Plants are being constructed in Arizona, Tainan, etc. with 2 nm coming in 2025 - call it creative naming or fudging on the numbers - each new plant builds a smaller process.
People don't invest and spend over a hundred billion just to convince you; whether or not you're convinced doesn't matter.
Instead the money is spent to place billions of transistors in the space previously occupied by one transistor, decades ago.
They really are getting smaller.
Wow, that was barely within what I could follow all the way through. The few times I started to get lost he stopped and explained it a bit more. Very well done, and I even picked up a few new things.
Moriarty is one of my favorite science communicators
The energy of his explanations.... HE should be duplicated in every kindergarten, school, and high school... NOBODY could fail to be energized and curious with the way he talks about and explains things! His energy with language would make me a damn poet!!
You said the wavelength is a limit, but the Nobel Prize last year was for the invention of microscopes which overcome this wavelength barrier. I think they used the light emitted by proteins and blocked the light emitted by neighbouring proteins, so that the resolution was down to one protein. There might be a way to use that for making smaller chips.
these videos are like the best thing in my life sometimes. thanks for continuing to make them. :)
this was incredible to watch. the passion and conviction he showed was amazing
My "like" happened at 7:05. The "quantum computing" question wasn't very well informed, though it may have highlighted a common misconception. The answer, spurred from the question, about how "classical" computing must necessarily exploit the quantum nature of matter if it intends on reaching ~1nm scale features is totally spot on. Looking forward to the next ~5min of video!
"If you find that confusing.. Good" - soo funny and soo true
God, I love how professor Moriarty explains stuff !
This is probably my favorite video on the channel.
Physics AND computer science, all with Professor Moriarty. Great combo, I really enjoyed this episode. Thanks guys!
This is a great effing interview.. awesome enthusiasm and passion.
I thought that the wave interference of the photons (as in, the interference you see with the double slit experiment) would become a problem. But instead you could create a mask that actually utilizes this phenomenon to create interference patterns that match the target pattern on your silicon sheet.
Great video; thanks Computerphile and Dr. Moriarty :)
Nice to see you again Dr Phil!
I love how you can see how passionate this man is about what he does.
I'd love to hear Prof Moriarty talk about spintronics and photonics if you have him on again.
This material is way over my head, but this video was fascinating. Thanks for putting it together.
8:43 ish - How do they make the masks with features that small?
This guy is magic, and clearly loves his stuff. Excellent.
This interview elevated my understanding of how we're able to manipulate atoms. Thank you.
My fiancé worked for Intel; he was sent around to various clean rooms and such to work on the computers running the scanning electron microscopes they were using for debugging chips.
The whole process is pretty cool to me, as I just stopped learning the abstractions of the CPU at the logic gates, and VHDL design.
A curious wonder, what kind of feature size would a hobbyist be able to achieve? I mean, there's that guy who built a macro-computer by using full chips for his transistors, and I know most of us are better off using FPGAs anyways. But say someone wanted to get into etching their own silicon, what do you think would be the range of quality that they could get to?
I'm just glad there's someone who can see that physics, chemistry, and computer science are integral to each other, rather than the usual 'brinkmanship' you see in these fields.
"Quantum Mindset" #bandname
+MaxPower ^ rofl
But damn if you try to find the location and time of any given concert in particular.
+scabbynacker "Are you thinking with quanta yet?" #tagline
I don't know if I understood even half of the video, but what I do know now is that things just do their thing in physics.
It's very interesting to dive into the physics and chemistry of electronic computing, it's not a subject I've explored much as a computer scientist.
This is brilliant! I would be really glad if you made more videos on this topic
Professor Moriarty.
Who's first thought isn't Sherlock Holmes.. :)
Sorry just had to mention it.
Videos like this immunize me from the Kruger-Dunning Effect.
Same here, my friend. I had to look up the effect, so I'm even more ignorant ;-)
putting you in your place is a simpler way of saying it
If you think it did that, you should be worried.
sugarfrosted Okay... it was just a booster.
Or even the Dunning-Kruger Effect.
I'd love to hear an update, now that Industry has processors at the 5 nm level.
Welcome back professor. I had wondered about your absence.
Big man working on small stuff, respect
1:46-1:56 this triggers me. Why is the smaller reference box not level with the bottom?
_WHY?!_
Like, I get it, they re-used the animation frame and just replaced the abbreviated measurements rather than making a new one each time but
*_WHY?!_*
11:35 and further:
Shouldn't it be "integrated whole"? English is not my primary language.
please, more on the physics of computer hardware! there's been so many amazing inventions and discoveries through the years in the semiconductor industry so we can use computers as we know it...
silicon is reflective? MIND BLOWN.
I really liked this sort of unplanned interview
Watched extra bits, still want more.
I see Phil Moriarty. I watch. I upvote
Indeed, his videos are better, because he is actually more specific than others...
And that's how you identify a redditor.
I see Phil Moriarty. I upvote. I watch
I love how excited he got for the silicon question.
Correct me if I'm wrong, I looked into this recently… but we can't physically "see" individual atoms because they're mostly empty space and smaller than the wavelength of visible light. We can only observe the effect of individual atoms on their surroundings, which is what our images show.
Please do an updated one of these about the current physics of the newest chips
He seemed to be hinting that there's a way to exploit the laws of quantum mechanics (instead of being hindered by them) within the context of classical computing. I'd be very interested to learn more about that possibility.
5:40 "As a physicist, it's not that you understand it. You just get used to it." Wise words. I tried understanding the wave nature of electrons and lost half my hair just getting my head around it, and I'm not even a physicist. Physics can be addictive. Also, it can be intuitive and unforgivingly confusing at the same time. I know a person who'd agree with that last statement: Mr. Erwin Schrödinger.
I swear every time a computerphile video ends, when I hear those beeps, I start singing "Askepios" by the Mars Volta.
I love it.
At the end of each video I literally start singing "I'll be there waiting..." and start asking myself "damn what song is that?"
With feature size shrink, we're already running into really tough problems. Notice how there haven't been many architectural processor improvements lately (pipelining being one example of such an improvement); designers have resorted instead to parallelization (multiple cores on a single die) and increasing clock frequency. Plus we're starting to see feature sizes so small that operating one component affects the operation of adjacent components (e.g. rowhammer in memory chips). 14nm may be the practical limit for feature size.
Funny thing is, average clock frequency is actually decreasing, mainly because of the popularity of notebooks. Power consumption and heat output rise faster than clock frequency, so it's more power-friendly to have slower parallel circuits instead of one fast circuit. There are very few algorithms which can't be parallelized. The circuit development simply mirrors the demand.
Even with that in mind, microchips are still the fastest computing machines we can build today. For example, a single consumer-grade CPU is more powerful than all quantum computers built to this date combined. And the quantum computers are all built to perform specific tasks. Biocomputers have only recently been built as a proof of concept.
Intel started 10nm transistor production a couple of weeks ago; their 10nm CPUs are coming out next year. IBM said they managed to make a 7nm transistor, so we will for sure go at least to 7nm. Below that we'll have to see, since quantum effects start to dominate there.
I am pretty sure that the average clock speed has been stable for years.
Nadir Jofas, I'm referring to the pursuit of more processing per unit of time. At first, chip designers thought of things like pipelining, branch prediction, and so forth, and physical/fabrication improvements to allow clocking the parts faster. For a relatively long time now, there hasn't been any improvement in that sort of design, so now parallelization seems the only practical avenue left. But as others point out, for some problems that doesn't help, because they're not intrinsically able to be parallelized. Still, for other problems/workloads, we're trying die shrink to get more cores in a single package.
rchandraonline Ah dammit youtube. I was referring to KohuGaly
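The tradeoff this thread is debating (many slower cores vs. one fast core) is captured by Amdahl's law; a minimal sketch, where the parallel fraction `p` is an illustrative assumption and not a figure from the video:

```python
# Amdahl's law: overall speedup from n cores when only a fraction p
# of the work can be parallelized (the rest stays strictly serial).
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelizable, 8 cores give
# well under an 8x speedup.
print(amdahl_speedup(0.95, 8))        # ~5.9x

# And the ceiling as n grows is 1 / (1 - p) = 20x,
# no matter how many cores you add.
print(amdahl_speedup(0.95, 1_000_000))
```

This is why "very few algorithms can't be parallelized" and "parallelization doesn't help some problems" can both be true: even a small serial fraction caps the achievable speedup.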
This man is brilliant and captivating. Would definitely enjoy more videos featuring him.
4:56 You are doing it wrong !
You're supposed to say "You can't break the laws of physics!".
XD
It's funny that the process for etching those silicon wafers is basically the same as making printing plates.
I don't understand why the electron beam is so slow though. Like, I get that it has to scan the entire surface, but doesn't a CRT TV do that with a much larger surface many times per second?
A couple seconds for a process is slow when you literally have to make millions of these per day.
Yeah, but I'm saying an electron beam can refresh an entire TV screen like 60+ times per second, right? So I don't see why it would take so long to scan a tiny silicon wafer.
Copydot This is different from the beams in a TV. Because of the sensitivity of the chip, only low charge density electron beams can be used. I just looked it up and was surprised to find the process takes days.
The main difference is that in a CRT, you really only need to hit a few points with phosphor layers. On a chip, you have to hit every point very precisely, at much higher resolution. Another thing is that if you used the same high-power technology as in CRTs, you'd actually dope the silicon with electrons or change the bonds between atoms.
Ahh, I see. I figured it had something to do with the resolution, but I wouldn't have thought about doping. Thanks
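A back-of-envelope calculation shows why serial direct-write is so much slower than a CRT raster; all numbers below are rough assumptions for illustration, not figures from the video:

```python
import math

# Rough assumptions: a 300 mm wafer, 10 nm pixels, and a low-current
# beam averaging ~10 ns per pixel (a 100 MHz pixel rate).
wafer_radius_m = 0.15
pixel_m = 10e-9
pixels_per_second = 100e6

# Total pixels to address on the wafer surface.
pixels = math.pi * wafer_radius_m**2 / pixel_m**2
days = pixels / pixels_per_second / 86_400
print(f"{pixels:.2e} pixels, ~{days:.0f} days per wafer")
```

A CRT only has to refresh on the order of 10^5 to 10^6 phosphor dots 60 times a second; a wafer at this resolution is on the order of 10^14 pixels, hence days rather than milliseconds, which is why optical projection through masks (exposing everything at once) wins for mass production.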
Really cool to see semiconductor fab processing explained here!
So... a moving electron creates a magnetic wave/wake in its passing that expands radially. Can you take advantage of these wakes and layer the silicon wafers and their circuitry in a way that layer two could be providing power to layers 1 and 3 as well (of course with less current/voltage, but still enough to open or close gates)? Perhaps even have currents passing through the wafers laterally, connecting circuits on all layers at differing voltages, with three-way gates that open differently depending on whether the currents are direct or eddy currents, etc.
Would these methods improve or be innovative to the industry, or are they already in use?
This is my favorite video in a while.
Hearing about 14 nanometers being the smallest you can go, while my CPU is made on a 7-nanometer manufacturing process, just shows how fast technology moves on. This was 4 to 5 years ago, and at that time their ultimate goal was 13.5 or something like that! Science moves fast!
This is great info. Since I come from a physical science background, this shows the application of what I learned. I wish they had a class on the physics of computers.
Need to do an update video with Phil now that it's 2020.
My personal guess on what happens when the industry hits the wall is that they'll figure out ways to build them larger.
They are already doing this with RAM, and once they figure out how to build chips which emit almost no heat, they will do so with processors as well.
Speaking of the 'integrated whole'... Years ago I read Richard Feynman's 'Lectures on Computation' (certainly less popular than his lectures on physics, but Feynman did a bit of computer science as well) and he described a model of computation that theoretically required zero energy (the idea is known as reversible computing). I don't know enough quantum physics to understand the exact mechanism (it had to do with particles moving from an excited state to one of multiple rest states, I believe), but I've never heard about it anywhere else. I would love to hear some academics talk about this idea. Or about amorphous computing, although that is still a fairly specialist field... Oh, but memristors! I'm sure you could easily do a video about them! Please don't get distracted by the whole "neuromorphic processors" stuff; that's all anyone ever talks about. Talk instead about the impact they could have on computer architecture: everything, CPU, registers, L1 cache, L2 cache, L3 cache, RAM, mass storage, all of it could be done with a single large array of memristors. And portions of the array could dynamically change from providing memory functionality to providing computational capacity faster than a RAM read. That could change everything!
I believe what you are essentially talking about is quantum computing. It uses photons as information. By shining light on surfaces at the atomic scale, you 'bounce' the photon's energy into an atom, causing the electrons 'orbiting' the nucleus to jump up energy levels. When atoms become more complicated, these energy levels gain sublevels. The energy a photon carries, which equals the Planck constant times the frequency of the light, is the smallest quantum of energy obtainable at that frequency. This energy can be transported just like electrons (i.e. information) and can be stored in the energy levels (i.e. memory). If you can read/write (manipulate) photons, you can make a computer using the lowest resolution our universe has to offer.
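For reference, the photon energy the comment above alludes to is E = hf = hc/λ; a quick check for the 13.5 nm extreme-ultraviolet wavelength used in modern lithography:

```python
# Photon energy E = h * c / wavelength.
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

wavelength = 13.5e-9  # EUV lithography wavelength, m
E = h * c / wavelength
print(f"{E / eV:.1f} eV per photon")  # ~91.8 eV
```

Shorter wavelengths mean smaller printable features but much more energetic photons, which is part of what makes EUV optics and resists so hard to engineer.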
Really like how you explain things, hope you do more videos in the immediate future!
Hello there! I am not educated in science or anything related to the topic; I am an analog photographer. I think I understand the basic function of what he is explaining, but I have one question the answer to which wasn't given. In photography, the resolution on the film is partly due to the size and shape of the silver halide crystals that later turn into metallic silver. What is the light-sensitive layer on the wafers made of? And how is it able to render such fine detail that most of the worry expressed in this video goes into making the exposure wavelength smaller? That would be very interesting to me. Thanks! I really liked this video of yours, truly amazing.
Was Roy off the IT Crowd based on this guy?
What about graphene? Are we getting closer to making chips with it, or is it still too difficult?
It might just not be so interesting. Because it's an excellent conductor, but not necessarily a good SEMI-conductor.
To be more precise, graphene doesn't have what's called a band gap (unless put under mechanical strain, or stacked in several layers). In the end, it's still too complicated to make and doesn't (theoretically) lead to a lot of performance improvement anyway...
ah i see, you appear to be a fellow VLSI engineer as well....
I don't know why but he reminds me of Roy from The IT Crowd
Just of course more intelligent than, "Hello IT, have you tried turning it off and on again?"
Similar accent
He's also got the Roy-esque quality of talking about intricate stuff in a non-jargon way
Was about to comment this, brilliant.
...and gestures.
I was wondering who he reminded me of!
I wish they explained all these details in computer science classes at universities.
It would be cool if Computerphile did an episode on 3D chips. With physics wall coming soon, the only place will be up.
There are metal layers on top of the transistors; e.g. Intel's 45nm process has 9 metal layers. You've always needed room for these, since the invention of the microchip. So chips were never flat.
Yeah, I know that. That's not what I meant. More along the lines of what memory chips have been going through lately, with 32- or 48-layer stacking, except for CPUs. TSV, etc.
I was wondering, what is the limiting factor in hard drive capacity, both physical and storage related? If you could do a video explaining, that would be appreciated.
What about using serial beams in parallel with one another?
This guy won't shut up. I love it!!!!
We missed you!
How long will it be before we start to see scanning tunneling microscopes that don't cost an arm and a leg?
Amazing video! I have a question about Professor Moriarty's explanation of how the semiconductor industry is able to create such precise patterns on transistors. When the two offset masks are placed over the silicon wafer and light is shone through, how is it that the light is able to deterministically etch a pattern? Why would it not behave like a wave/particle in the double-slit experiment and diffract into a probabilistic wave pattern on the wafer?
We're getting limited in designing new electronics because of shot noise. The transistor grid is so small, and so close to the size of an atom, that the oscillation of the atoms themselves affects the transistors, generating "shot noise", so it's impractical to build transistors much smaller even if we could. That's why Ivy Bridge and Skylake run hotter, besides the fact that they draw more power. We've pretty much reached a point of diminishing returns. It will be interesting to see whether we hit a dead end for a while; the next 10 or 20 years will tell.
I learned SO much from this. Thank you.
I also love his passion. It excited me to learn this.
4 years later apple is putting 5nm chips in phones