Enjoyed the video? Consider subscribing. If you want to watch other videos on semiconductors, check out the playlist: ua-cam.com/play/PLKtxx9TnH76QEYXdJx6KyycNGHePJQwWW.html
Yeah. I’d love to do a video about it someday.
Thank you for doing this. My dad was trained as a radio technician, and I was always fascinated by the colour-coded resistors, transistors, and LEDs. I couldn't pursue the subject academically, but this just filled a gap in my understanding of the subject.
@obimk1 Do you know how many CPUs the scan machines (the kind that cost $50M-$150M to buy) make in one year? Is it like half a million 5900X CPUs from one machine per year? I would like to know more about it, but it seems hard to get info on it. Thanks
great channel! please keep making videos like this.
Excellent John
Thanks!
Dear Asianometry,
It’s videos like these I appreciate the most from creators like you. I worked on a chip integration team for a while, and I’m truly humbled by how capable modern IC design is. From my desk it just looked like lines of Verilog code, but I knew someday it would be turned into nanoscopic transistors on a silicon wafer, and that’s amazing.
It seems today’s generation is more interested in software but few appreciate that hardware has its own story to tell. I hope your content inspires younger people like myself to pursue a career in hardware.
As someone who is mostly interested in software, I gotta say I'd have to be 100 times smarter, at the very least, to design any sort of chip. Sincerely, hats off to everyone who has worked and works with hardware and its optimization.
@@lucamagnani5243 I am 15 and have been working with both software and hardware/electronics for the last couple of years. It really isn't all that hard if you study electronics a bit and learn the basic components and circuit design/theory. Start with simpler designs for fun, make small projects which you can actually create in real life, and just progress with time. I still am nowhere near able to design complex circuits, but it's fun. But yes, many people in the younger generation, including me, are usually more into software development. I'm assuming it's because we already passed the "hardware boom" and have come to a point where we all take electronics/hardware for granted, so there aren't many people who think "OOO I CAN MAKE IT BIG WITH ELECTRONICS"; we just deal with what we can at home, aka software.
I think more people are interested in software because it's way more accessible. There's tons of free tools to write, compile, and execute software. There's lots of great tutorials on software development. Learning basic programming is pretty straightforward - many languages are basically readable without much fundamental knowledge, and modifying programs is approachable. Furthermore, software is a much higher abstraction, and it's easier to accomplish goals by writing software.
Hardware has none of that going for it. There are far fewer tools, there's a smaller pool of talent working on hardware, hardware requires at a minimum basic circuits knowledge, and logic gates aren't intuitive to look at. There's no easy way to design a chip and then get it - sending a design off to a fabricator to make one for you is insanely expensive; fabrication only becomes cost-efficient at scale. Even if you do fabricate one (and it actually works, which is another big ask without a huge QA team), then what? Sure, basic ICs can perform simple functions, but realistically the chip you need is either already made or is so hyper-specific that you'd likely be better off with an FPGA or (more approachably) a microcontroller like an Arduino or a mini computer like a Raspberry Pi. And for those, you're just doing some basic circuits and programming.
As far as careers go, it's doable, but my understanding is that a career in chip design requires a PhD. It's pretty challenging. The undergrad program of electrical/computer engineering requires some very intense math courses, and they aren't for everyone. Grad school for that discipline is (I can only imagine) insane.
So yea, while it'd be great to get more young people interested in hardware, it's probably not going to happen. Software is easier to learn on your own, easier to implement, and in significantly greater demand in the job market.
I've been painstakingly trying to learn electronics, and I'm currently in the hardware portion of my learning. Remember, I'm doing this on my own through YT. Is it common to understand hardware but not software? Or to understand how an old VCR works but not the circuit side of things? I look at circuits and get brain fog. I'm slowly coming around, but I can still look at a circuit board out of its case and not be able to tell you what it goes to or what its functions are until it's back in its case and I can see the machine it works for. I've been collecting electronics that people discard by the side of the road. Some work fine, just old; some are junk but have components to fix something else. I see where the future in the trades is heading. I'm an old oilfield/construction person, but I couldn't wire up a house safely without fearing that place burning down.
Trip down memory lane for me. I was lucky enough to experience what your father did. In 1976, as a fresh EE graduate, I started my education once again, working for Fairchild in a small design office in Bristol in the UK, designing SSI/MSI CMOS logic.
Because of the small circuits and lack of EDA tools, logic/circuit/layout design was done by hand. The final product of design was then drawn by a team of draftsmen on mylar and digitized using a CALMA Graphics system comprising a digitizing table (a work of art) and a TEK storage display driven by a gorgeous NOVA 1200. We had to boot the system from paper tape on a TTY; I can still hear the clatter of the TTY reading the tape at 50 baud. Everything was manually checked, and when you were happy, or ran out of time, the data was written to magnetic tape and sent to Mountain View for fabrication. Six weeks later the packaged chips would return, and in those days it was not unusual to have major problems with a design.
I finally made it to the US HQ, so my time in that little design office in the centre of Bristol was just the beginning of a great journey for which I am so grateful.
Dang, dad was a chip designer. Lucky! I'd have loved to be able to talk electronics/computers with my dad.
I talk about electronics/computer/phone with my dad all the time
Mostly have to explain the same things over and over 😐
I have no chip-designer dad, but I am one.
Back in the day, it was a different animal all together
I wish I had a dad who gave a shit about me.
The only talking my dad did with me was with his fists... NOW IM IN TECH BABY.
This channel is so awesome: no annoying 5-minute product placements, no dumb influencer stuff, just well-researched facts and a sympathetic dude presenting them. Nice work
Man, the speed and consistency of your videos is amazing. Yours is one of the few channels that always puts out well-researched videos that don't just sensationalize things for views. I checked your channel for a new video last night and knew that by the time I got up there would be a new one :)
I am an analog RF ASIC designer. I still draw schematics, and I hand-draw critical layouts.
@Yog Sothothologist So bro?
@Yog Sothothologist ah yes, of course, a race superiority stereotype, what a great metric to measure skill by, definitely used appropriately in practice and never used to justify racial discrimination /S
but do you have to, or do you mean hand-drawn but with CAD or something?
@Yog Sothothologist i am very tolerant, but that was not funny in any way.
@@randomnessslayer I am normally very tolerant; free speech is essential, and it is not illegal to be mean, but I don't understand the fun in insulting a different race. For me it is OK if people make fun of me and my race, if there is a catch and they are funny and make me laugh. It is not even funny. Just mean to Asians.
Detailed, truthful and understandable but not boring. Thanks for a great video!
this is a "spinach" video, what a waste of time.
“The Mythical Man-Month” is still my favourite book! It's timeless! It applies to all design stages, from software to silicon gate design.
Another outstanding video. As one who struggled to explain silicon issues to customers (my longest role was in Product Reliability for Automotive chips) I'm consistently amazed (and jealous) of the detail and range of topics you discuss.
Thanks. I'm studying EE and basic circuit design, and I had no idea how deep this rabbit hole goes, especially when it comes to modern nanometer-scale lithography. Makes sense that they wouldn't do all of these billions of transistors by hand every single time.
This is the first time I got a video recommendation about something I had no idea how it works. And after watching this, it really adds to my knowledge about chip making and design.
My father was a chip designer at Lays
My father couldn't get lays
I actually worked on implementing pathfinding algorithms for route tracing, and it's incredibly tricky, especially given the limited space and the fact that traces are not allowed to intersect. It's an NP-complete problem, meaning that, as far as anyone knows, the optimal trace configuration (the one with the shortest total trace length) can only be guaranteed by effectively trying all configurations. I tried coming up with algorithms that come pretty close to the optimal configuration.
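For anyone curious what that looks like in practice, here is a minimal sketch of the classic BFS maze-routing idea (Lee's algorithm) that real routers build on. The grid size, net list, and one-net-at-a-time blocking rule below are purely illustrative, not how any production router is configured:

```python
from collections import deque

def route(width, height, blocked, start, goal):
    """Lee's algorithm: BFS finds a shortest rectilinear path on a grid
    from start to goal while avoiding cells used by existing traces."""
    if start in blocked or goal in blocked:
        return None
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []                      # walk parents back to start
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < width and 0 <= nxt[1] < height
                    and nxt not in blocked and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None                            # no legal route remains

# Route nets one at a time; every finished trace blocks the next net.
nets = [((0, 0), (4, 3)), ((0, 3), (4, 0))]
blocked = set()
for start, goal in nets:
    path = route(5, 4, blocked, start, goal)
    print(f"{start} -> {goal}: {path}")
    if path:
        blocked.update(path)               # traces may not intersect
```

Routing nets sequentially like this is a greedy heuristic: an early trace can wall off a later net entirely, which is why net ordering and rip-up-and-reroute strategies matter so much in real tools.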
Did you publish them?
You should come work for the CCP
@@JohSebBac China has EDA tools down to 14nm this quickly after the US bans.
It took me several hours to watch this 12-minute video 😂 Every 5 seconds there’s something interesting that I just have to google or at least write down to google later 😂 Great stuff!
I know that most comments appreciate your research work rather than the video itself, but the quality of your videos is impressive, up to the point that it is the first channel that I have recommended to my friends.
Feels like a presentation you could give at a big event. Beautiful.
Good video again. I would like to see more videos on the whole chip design/manufacturing process. Thanks
Siemens EDA (mostly former Mentor Graphics) is a big competitor to Cadence in the analog/mixed-signal and physical verification flows
Didn't know that Cadence was such useful software. We have a software-simulation course this semester, and we only worked on simple circuits like adders and simple gates.
Daaamn, this video should be shown in the course introduction class!
Great video
Greatly enjoyed this. Thank you. Takes me back to the 80's. Interesting to hear how you speak of your dad. I felt the same about mine, being exposed to electronics early.
Thanks John :) This was really interesting!
Although digital circuits are designed with hardware description languages nowadays, analog circuits are still mostly designed on the schematic level and can’t be easily automated.
Phenomenal introduction, bro! I'd always wondered what the chip design pipeline is like; this has gotta be the first video where I got a true picture in my mind of what actually goes on behind computer engineering
I have been searching for months for a video on how processor chips are designed! Finally found it! Thanks a lot!
2:42 that's a multiplexer, not a memory circuit.
true
This and pronunciation turned me off from the video. Made me feel like I was watching a training session from an intern.
Have another like for the reminder of Digital Electronics 101.
What a great and informative video.
If I may add some more points (I'm an IC Designer in the semicon industry myself):
- Mentor Graphics is another big EDA tool vendor (though recently bought by Siemens)
- Most of the automation effort has advanced digital circuit design, which occupies probably >50% of the area of the chips released today. Analog design, on the other hand, is extremely difficult to automate. Some effort has been made, but I still don't know of a complex chip designed solely by a machine as of now. Even though digital might seem like "the advanced technology", analog design is still needed, as you just CANNOT interface the real world with a purely digital chip.
- That's the reason why people say the semicon industry's dream is to move the digital circuitry as close to the input as possible, in order to save costs and automate everything.
- Some big semiconductor chip companies in the world have actually made efforts to build their own in-house EDA software in order to save money on 3rd-party EDA licensing. Though, AFAIK, only the simulation engine, not so much the actual GUI.
BONUS:
- Your video thumbnail is a screenshot of KiCad, which is free PCB software. I think a screenshot of the Cadence environment would've been a better fit (hehehe)
do you do digital or analog?
@@karamany9870 analog
@@DJTrancenergy and how’s the future of analog compared to digital? Because I’m currently choosing a specialisation in my degree
@@karamany9870 I can't possibly answer that in a few lines and not willing to put a wall of text explaining my opinion.
Choose what you like the most! Don't be miserable doing what you don't like. Plus, no one knows if you'll end up in engineering or eventually moving to some more managerial roles.
One of my friends worked at TI in analog design and has now switched to Intel. He always said analog circuit design is like black magic: even the most skilled designers are not always sure of the full implications of the chips they design. What do you mean by moving the digital circuitry closer to the input, though? Digital logic would still have to be implemented via analog design.
Verilog and VHDL took most of our fun away. I liked the schematic method of design a lot. We used the same workstation for layout as well.
Which flavor of unix/linux did you use?
@@allentchang Hummm.... back in 1987 it was LSI workstations, if I recall. I don't know the operating system, but I believe they were Tektronix graphics terminals (Zilog). They were not fast, but very high resolution for the day. In 1993 it was Apollo workstations (Seagate), which were running Mentor. It certainly ran a Unix variant, but it was an unusual one. Into the new century, it's all been Verilog using Xilinx software (various startups), running on Windows (does Xilinx even run on Linux/Unix?). Our fabs also tell the story: last century it was custom fab (Zilog), then AT&T fab (Seagate), then after that probably TSMC, I don't recall.
Afternote: Actually I do recall. At Zilog the Tek terminals were driven by racks and racks of LSI-11s, a PDP-11 that fit in a single or double RU. I remember because we had a big serial-port mux that would let you connect to any of the machines. I used to write scripts that would start jobs on multiple machines overnight, which was the only way to get reasonable simulations of chips. I believe they were running Unix. Our chip simulations were done on a custom gate-level simulator that I learned a lot from, since it would simulate things like domino logic.
And yes, I am old.
@@scottfranco1962 Used Mentor Graphics for digital design in college around 1996. Also used Avanti Hspice in circuit courses. Half of the college’s workstations seemed to be some Unix variant while the other half were SunOS / Solaris, which were much newer. Linux wasn’t embraced until roughly a decade later. Now Mentor got eaten by Siemens and Avanti got eaten by Synopsys. In any case they sure know how to pick weird tool names: Dracula, Assura, Hercules, Calibre, Spectre. They should have stuck with things related to spices, cinnamon, and cloves.
@@allentchang So I have not been part of direct IC design for a lot of years. Last time was at Seagate, but we didn't do the layouts or see the layouts (sad). I imagine that analog is still schematics. I think what happens now is that the Verilog tools deliver mostly automated layouts, with "fixer" layout people correcting automation problems and doing special layouts, again including analog. A company like Intel or AMD might typically use a mix of both; for example, if the design calls for, say, a cache, that's going to be mostly laid out by hand for a single cell and then replicated many times. The author of this video series is not quite correct that everything is standard cell. Even in the old days, when we used cells, they were usually from a library of custom layouts made for that particular design. I guarantee you the latest x86 processor does not use standard cells. The other issue is drive lines. One thing an IC designer does a lot is change the drive strength within the chip to match the drive to the length of the line, so that it does not drive it too little or too much, and to meet the rise-time requirements.
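For readers who haven't met the drive-strength point before, here is a rough sketch of the sizing trade-off using an Elmore-style lumped-RC delay estimate. Every constant below is invented for illustration and does not come from any real process design kit:

```python
R_WIRE = 0.5        # ohms per um of wire (illustrative)
C_WIRE = 0.2e-15    # farads per um of wire (illustrative)
C_LOAD = 2e-15      # receiver input capacitance in farads (illustrative)

# Candidate drivers: (name, effective output resistance in ohms)
DRIVERS = [("x1", 4000.0), ("x2", 2000.0), ("x4", 1000.0), ("x8", 500.0)]

def elmore_delay(r_driver, length_um):
    """Delay of driver + distributed wire + load, in seconds."""
    r_wire = R_WIRE * length_um
    c_wire = C_WIRE * length_um
    # The driver charges all downstream capacitance; the wire's own
    # resistance sees half its distributed capacitance plus the load.
    return r_driver * (c_wire + C_LOAD) + r_wire * (c_wire / 2 + C_LOAD)

def pick_driver(length_um, target_s):
    """Smallest (cheapest) driver that still meets the delay target."""
    for name, r_driver in DRIVERS:
        if elmore_delay(r_driver, length_um) <= target_s:
            return name
    return None  # no single driver works: buffer or split the wire

for length in (50, 200, 800):
    print(f"{length:4d} um wire -> driver: {pick_driver(length, 30e-12)}")
```

A toy version, but the shape is right: longer lines need stronger drivers to meet a delay/rise-time target, and past some length no single driver suffices, so you insert buffers or split the wire.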
@@scottfranco1962 An analog engineer has to work closely with a layout engineer to make sure sources of signal mismatch are kept to a minimum and that the SPICE models work as promised under certain layout conditions. Working with only schematics is no longer the way to go.
Very well narrated video, the best content on YouTube... YouTube algorithm, please give this man a medal... 🏅
This is essentially the video I was looking for but never even tried to find. Back when I was younger I used to build circuit boards. I would create a schematic and then try to transfer that schematic to a double sided trace layout that could be etched onto a copper clad board using photo etching and chemicals to dissolve away unwanted copper. I often wondered how complicated it must be to do that on 6 layer boards or even internally on an integrated circuit. And now with CPUs being so complicated I knew there had to be some sort of automated algorithm that could drastically assist the engineer.
Your videos about processors are unreal. Better than almost anything on YouTube; maybe only those conference videos are better at explaining stuff (but they take an hour to finish).
Fantastic channel. Well researched and produced highly relevant content! Well done
I feel sad when people only acknowledge two major EDA companies, Cadence and Synopsys. They always tend to forget Mentor Graphics (now Siemens EDA), the next big giant, competing neck and neck with the above two.
Mentor Graphics sucks. Siemens has a lot of work to do to bring that troubled toolchain up to modern standards.
@@brianransom16 they use Cadence, not Mentor Graphics.
@@byugrad1024 calibre
Mentor sucks. Period.
Synopsys and Cadence aren't just both based in the U.S., they're actually only 10 km apart in Silicon Valley.
Nice video! One small nit: the image for your “memory circuit” isn’t a memory circuit. I was on the edge of my seat wondering if you’d cover the importance of the hierarchy of abstraction. I needn’t have worried. I worked at Honeywell in the mid ‘70s on an EDA simulation tool, while a coworker made use of our Multics system for his thesis work on circuit board layout, because of the large address space. That drove home for me the complexity of layout problems. I’ve been a bystander since, but one thing I’ve observed is that the use of standard cells reduces the layout complexity. If a cell groups 10 components, layout deals with 1/10 the number of items. Since layout complexity grows combinatorially or worse, that factor-of-10 savings becomes much, much more. It addresses complexity both for the designer and for the algorithms the EDA software uses. As complexity grows, you add more levels of abstraction. You might arrange memory, I/O, and CPU subsystems on the chip based on the area needed, route interconnections, then lay out the interconnections between each subsystem’s cells, with the cells being pre-routed, or perhaps with alternate selections of cells of different shapes, to fit together like Tetris. Without the levels of abstraction, modern chips would be impossible.
I’m always impressed by the depth and production of your videos. Making things accessible to a wide audience is a challenge. I think you do a great job.
Glad I’m not the only one that noticed. It looks more like a mux to me.
@@anonymous.youtuberPretty much, yes.
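To put a number on the standard-cell arithmetic a few comments up: assuming, purely for illustration, that placing n movable items means searching on the order of n! orderings, one level of hierarchy collapses the search space by thousands of orders of magnitude:

```python
import math

def log10_factorial(n):
    # ln(n!) via lgamma, converted to base 10
    return math.lgamma(n + 1) / math.log(10)

FLAT = 1000                       # place 1000 gates individually
print(f"flat design:  ~10^{log10_factorial(FLAT):.0f} orderings")

CELLS, PER_CELL = 100, 10         # same gates grouped into 100 cells
hierarchical = log10_factorial(CELLS) + CELLS * log10_factorial(PER_CELL)
print(f"hierarchical: ~10^{hierarchical:.0f} orderings "
      "(lay out each cell once, then place the cells)")
```

Real placement cost models aren't factorial, but the compounding works the same way, and it repeats at every additional level of hierarchy.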
Thank you for the information. I had been looking for days for a video about how processors and circuits are designed, and I found it on your channel. I'm also researching photolithography and found several videos related to the topic here. Greetings from Brazil.
0:19: Hmm... that looks like a common-emitter BJT amplifier with emitter degeneration... 2:15: EOR for exclusive OR instead of XOR? 2:42: Doesn't a memory circuit contain feedback (i.e., output directly or indirectly connected back to the input)?
2:42 is a mux, not memory
There are several mislabeled circuits. Maybe this is the decoder for a memory? It is not the actual memory cell, which is the most important part!
Instructive. In the late '80s, entropy methods were used to improve the heat control of those chips. That had an impact on layouts. I'm sure those principles survived. AI will soon lift chip design to unimaginable heights. Hope you'll talk about that aspect soon. Thank you.
AI is currently optimizing CPU designs.
This is how we get 25% energy efficiency with every other generation.
And 25% more pricey. XD
source?
@Aaron Speedy Price going down is evidence of improvement, not of AI involvement in that improvement.
@Hatwox I dunno, man. In school, I learned that the benefit of miniaturization is cheaper chips: if you can fit more transistors into less space, and have the tech to do so, the chip is cheaper on a per-transistor basis. A PC is not a chip; a PC has many chips on its various components. A PC now will have more chips than before, but the chips themselves are cheaper when they are made more efficiently; the cost comes from the additions that are available because of the efficiency gain. If that doesn't make sense, it's because I'm a dumb dumb.
Trust me, AI is just another tool, not very different from log tables or protractors.
Great video on EDA. Easy to understand and well organized video. Thank you my friend.
Is it possible to be a self-taught chip designer?
@@Thelearntosell Probably not. This expertise requires quite a bit of education and training to get into.
@@kayrealist9793 I am an electrical engineer who graduated recently; I studied electronics and DLD-like courses
@@CoruscationsOfIneptitude Sir, can you give me a link, as I am confused about where to start, please?
I'm fascinated by this stuff. Thank you. +New sub
I did some end-user testing for an EDA startup, this is bringing all the memories back.
This is nitpicking, but the image you show as a "memory circuit" is a multiplexer, it doesn't store any state of its own. It can be used, however, to fetch data from a memory bank / register (among many other uses). Still loved the video!
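To make the nitpick concrete, here is a tiny behavioral sketch of the difference: a mux is a pure function of its present inputs, while a memory element needs a feedback path so its output depends on history. Gate-level detail is deliberately simplified to behavior here:

```python
def mux4(d, sel):
    """4:1 mux: the output is fully determined by the present inputs."""
    return d[sel]

class SRLatch:
    """Behavioral SR latch: the feedback path is what stores the bit."""
    def __init__(self):
        self.q = 0
    def step(self, s, r):
        if s and not r:
            self.q = 1          # set
        elif r and not s:
            self.q = 0          # reset
        # s == r == 0: output feeds back, holding the previous value
        return self.q

print(mux4([0, 1, 1, 0], sel=2))  # combinational: same inputs, same output

latch = SRLatch()
print(latch.step(s=1, r=0))  # set  -> 1
print(latch.step(s=0, r=0))  # hold -> still 1: that is memory
```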
Mindboggling and almost scary in its complexity. Great video!
At 2:40 that's not memory. That's a demux or mux, so it's just logic.
it's a mux
Very nice. Exactly the type of video I was looking for.
10:02 Did you really pay for a Unicode emoji? You can literally type it in a text area and just scale it up lmfao what
I’ve worked from the other end making modules and design rules for design software. Designers used the software, and told us when they needed a new or custom module to make the ic a customer wanted.
I am a backend semiconductor engineer in the fab. This is a good overview of the whole process. I had heard some of the terms but didn't know what they were or their purpose. Now I have a general idea.
You only mentioned digital design (AND/OR gates, etc.). Analog design may be different; I hear analog design is more difficult, and good analog circuit designers are hard to find.
I am an RF test engineer by trade, and my RF peers pronounce those PCB or silicon layer-to-layer interconnects as "vee-ya", not "vay-ya" (via). I am not sure about chip designers' (and sons') pronunciations, though. :)
Great stuff. Highly appreciated, as always!
Thanks very much for this video! Reminded me about this topic!
One of my favorite channels. I always look forward to your videos
I worked for 6 yrs at Xilinx as FPGA Product Application Engineer. Was fun for a while but now I've moved into software development.
Amazing video! I love it! You are a brilliant man! I must have watched 10 of your videos today! :p
The book about the IBM 360 OS (The Mythical Man-Month) has a very interesting cover design. It shows a giant sloth slowly sinking into a tar pit.
Nowadays there is a lot more in the chain of process. I work at a company where we design such chips, and there are quite a few steps and a lot of work until a tapeout (actually getting it made at, like, TSMC) :)
I would not be surprised if this channel has a million subscribers by next year
EDA encompasses more than just IC design. Would be cool if you did a video more focused on some other aspects of EDA.
Thankful the YouTube algorithm just popped this into my feed. Great video
thanks for your service, Great videos overall
I really wish RISC-V would catch up to the industry leaders so we can finally have good open-source CPUs...
Awesome. I feel enlightened.
Wow, this is what I was searching for for so long!
Thank you for your channel. An extremely grateful subscriber added.
Wow incredibly complex; nice basic overview!
Actually I'm an embedded systems engineer, and I use VHDL in everything I do... There is no way to do our work without it
OK, this explains a lot! I've been wondering about your background as a combination super history/society sleuth + semiconductor expert.
That explains the knowledge of the chip production process.
Just discovered your channel, love it !
I think you went back and forth between chip design and PCB design, which are two completely different fields.
Awesome video, thanks for sharing
During the '90s I was helping design circuitry with the software PADS; I don't know if it still exists.
Mentor Graphics, now a Siemens Company
Very educational. Thanks
Very cool, thanks!
Thank you for the explanation about the chip crisis. Great video!
Thank you for such a good channel!
Just discovered this channel through this video, really engaging content! Subbed.
As a software engineer, this kind of thing feels like a whole new horizon, and it sounds really interesting! I wonder what kind of major this is taught in (just in case I ever get the motivation, and money, to take another degree 😅)
computer engineering, my friend :D
@@unkownuser8455 can confirm, it was a dope 2 years when I was still in the CE major - before I chose to switch to SE solely because of the differing depth of the math classes required (SE was easier, in most regards).
I do wonder what it would be like if I stuck with CE... but then again maybe that just means I should see if I can break into the dev world of EDA software :)
Electronic engineering. You typically need a master's to be able to design at the transistor level
Thanks for all the information, and great videos on how this industry works and develops 👍
An absolutely fascinating and educational video. You have really brought me up to speed on how the electronics industry functions today with this and your other videos on design and manufacture of chips. Any thoughts on the effect of US trade sanctions and how it is forcing TSMC to locate leading edge fab facilities in the US as opposed to Taiwan?
For the good of the world we need to get as much chip manufacturing and design out of Taiwan before China attempts to invade... it's coming thanks to Biden.
It blows my mind the work that goes into just designing the chip alone.
Videos (like this) on the tools used to generate technology are fantastic. The simulation tools (e.g. Ansys HFSS, or Sonnet) would be of significant interest to me.
My dad was a peasant. I became a farmer. And my son is gonna be a battery in the matrix.
Great video and introduction to the modern design process it seems.
You are the god of electronics. Thank you for the video and channel.
Open-source tools tend to improve with usage from the public. For a parallel, look at the improvements in KiCad since ordering PCBs cheaply became accessible, or how the open-source FPGA tools are also evolving. Sadly, chip fabrication is still prohibitively expensive for amateurs, so we won't see advances there.
Thank you for producing this!
TYSM for the video! Quick question about chip manufacturing process though:
I know that the projection optics shrink the chip design down to a small scale, but how do you manufacture the mask that is used to pattern the laser? If there are billions of transistors, how do they cut each one out in the mask?
Thank you for the video.
That was a really cool and interesting video, to say the least!
If you are interested about VLSI history I recommend reading "Coping with the Complexity of Microprocessor Design at Intel - A CAD History" paper.
The alien look of CPU architecture is the first thing that made me want to study electronic engineering
Great presentation!... although little attention was paid to the circuit simulation & testing stages of the design process, which, regardless of flavor (analog | digital | mixed), are perhaps the most sophisticated and resource-consuming tasks. Perhaps this could be the theme of the next video! :)
Using circuits to design circuits I love that
Have you thought about a video on the role of ATE hardware and software in chip manufacturing?
Whoa!! I have a small version of one of those in a plastic case. I never knew what it was, but it's cool that I found this video. I was totally like "huh, I've seen that before."
My grandpa used to work somewhere where I got to see one up close when I was 5 or 6. I totally didn't understand it, but it looked cool. 4:05
2:36 isn't that a 4:1 multiplexor? Does it have memory?
Thanks for the video