For anyone who thinks that hardware chip language sounds interesting, you ought to look into FPGAs. You can program them directly, and have them emulate virtually any other microchip, given that it has the specs to do the job.
They're also obscenely expensive at the gate counts needed for projects of any reasonable complexity. Maybe worth it for prototyping, but not cost-effective for mass production.
Rob Mckennie You're gonna have to qualify that with exactly how many logic gates you're talking about. There's low-end for everything. If you want to simulate something on the order of a Cortex-A series, it will cost dearly.
Mike Trieu Why would you want to emulate a device of that kind of complexity? I don't think the kind of hobbyist to whom I was recommending an FPGA would have any kind of need to emulate a modern CPU.
Rob Mckennie Oh, well you didn't clarify "hobbyist" in your OP. Also, "hobbyist" is somewhat of a nebulous term, too, when you have folks like Bunnie Huang literally reverse-engineering his own CPUs for the Novena project.
+Blackwater Park For an explanation of why 'don't' is perfectly acceptable, see the Wikipedia article on "Collective noun[s]," particularly the section titled "Metonymic merging of grammatical number." Or, if you don't want to be bored to tears, you could just take my word for it :) .
I don't know if it's some weird difference between British and American grammar, but it should read, at least in American English, "ARM doesn't make chips.".
Nope. The future is looking to be FPGA-ASIC hybrids, with home computers turning into terminals on a mainframe. I wish it were otherwise. I love open source hardware, but look up the Xilinx-Amazon partnership that was recently announced. Then ask why Intel bought out Altera. The end of Moore's Law will result in the expansion of FPGAs. FPGAs are capable of running a local user interface with remote software faster than a consumer-grade system with local storage. Why the shift? Proprietary software security. You won't have access to the complete software on a terminal-based system. The future will be the 1960s mainframe computer, version 2.0... if open source projects like the Linux kernel don't take most of the market share first. -Jake
@@UpcycleElectronics So you expect everyone to just throw their freedom in the garbage? No sane person will use centralized resources for private purposes. Yeah, the Linux kernel is ubiquitous today and going strong.
So ARM don't make computer chips? That's funny. In my city there's a company called LEG, and they do neither. They just help you find an apartment where you can put your computer.
The paragraph below is mostly taken from the response I gave to an earlier comment asking the same question. The short answer is that, in English, the subject of the title sentence, "ARM," is what's known as a collective noun. In at least one variety of English - British - it is considered acceptable (though it is not required) for collective nouns, even in their singular form, to be used with plural verb forms. Thus we get "ARM Don't Make Computer Chips." For a more detailed explanation of why "don't" is perfectly acceptable, see the Wikipedia article on "Collective noun[s]," particularly the section titled "Metonymic merging of grammatical number." Or, if you don't want to be bored to tears, you could just take my word for it :) .
The annotation over the Raspberry Pi video takes you to "Why Computers Use Binary", and the annotation over the "why use binary" video takes you to the Raspberry Pi video.
A question about people like these ARM architects: what kind of education do they need to have the knowledge required for such a job? I know it's probably not business administration, but beyond that, not much. I'm just asking out of curiosity, because the things he talks about seem to span both hardware and software and other stuff.
Interesting. I've been wondering why the market is flooded with new ARM chips for Raspberry Pi clones, cheap smartphones, tablets, and embedded devices like cash registers, climate-control units and such. Someone must have been testing these things for multi-purpose use before releasing them to the public at dumping prices.
"in Taiwan, in Austin Texas, and China and you know, all those sorts of places" I'm not sure whether to be offended by that sentence because I can't tell what sort of place that list is representative of.
+RobloxDev It's actually a perfectly valid construction in some forms of English, though it may not be common in your country of origin. Check out the other comment thread discussing this question, a couple of us have posted more info to try and clear this up.
Really, really, thanks. I was baffled about how they can build that kind of chip, and now I know how they do it. The programming language seems so "logic"-al. This is awesome :)
I am looking forward to how powerful processors will be ten years from now. I spend a lot of time rendering 3D graphics on my home computers, but there are some types of animations I tend to avoid simply because I know they will take years to render. Hopefully one day, even these will be possible in just a few minutes on home computers.
Sorry, in 10 years these computers won't need you any more to do the animation.
+Physics Videos by Eugene Khutoryansky
If Moore's law still holds for the next ten years in terms of speed doubling every 18 months (which is not actually what Moore said but whatever) and that's becoming somewhat of a big if, something taking 2 years will still take over a week.
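The arithmetic behind that "over a week" claim can be checked with a quick back-of-the-envelope Python sketch; the 18-month doubling period and 10-year horizon are the figures from the comment above, not anything Moore actually stated:

```python
# Back-of-the-envelope check: if speed doubles every 18 months,
# how long would a 2-year render take on hardware 10 years from now?
doubling_period_years = 1.5
years_ahead = 10

# Number of doublings in 10 years, then the cumulative speedup.
speedup = 2 ** (years_ahead / doubling_period_years)

render_days_today = 2 * 365  # a 2-year render, in days
render_days_future = render_days_today / speedup

print(f"speedup: {speedup:.1f}x")                        # ~101.6x
print(f"future render time: {render_days_future:.1f} days")  # ~7.2 days
```

So even a hundredfold speedup only brings a 2-year render down to about a week, which matches the comment's estimate.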
Ten years from now we'll be seeing the first consumer quantum computers, possibly with light-based processors. Moore's Law may or may not apply, since it's already started to fail -- two years to doubling instead of eighteen months. No telling what effect new paradigms and technologies will have on it.
batfan1939 In ten years there will be things for consumers that we don't even know how to build in a lab yet? I don't think so. It's not even clear optical computers provide any advantage over electronic computers. Quantum computers will probably always be some kind of coprocessor in general-purpose computers.
+Physics Videos by Eugene Khutoryansky Can't you just get it rendered on a render farm?
One of the first ARM employees did a talk at my school a year and bit ago for an engineering submodule. Seemed like a genuinely nice guy and knew his IP about as intimately as it's possible too. Spent a while talking to him about my projects afterwards, truly fascinating fella!
Computerphile mouse game: Find a tight space on the video and try keeping the mouse cursor inside it. Guaranteed fun on any Computerphile video even if you don't know anything about the topic.
+DJDavid98 Great, now I can't un-see that :P
+DJDavid98 lol, never noticed that they were all shaky cam....
I think this is one of the few classes of video where it's helpful, as it creates artificial action and engages the viewer more than static shots.... Interesting.... I wonder if it's intentional...
+KevintheBooth It's a feature!
@@PIXELST0RM link?
ah ok, so ARM simulates it in Minecraft, that answers a lot of questions right there :D
Hrm. Rather than tape on acetate, Steve Furber [formerly of ARM] said, in one of Computerphile's previous videos, the "tape out" comes from the fact that a _computer tape_ was sent in the post from the design house to the fab.
+lmiddleman Yeah - I need to know which one is correct!
+Greyarea23 It should be a computer tape storing the GDS file used to make the mask set.
+Eddy Yau I think I was wrong, actually, though I haven't had complete confirmation. I think it does refer to the computer tape going out the door...
+Jem Davies Since 16FF+ is very expensive, were the 500 test chips produced by MPW?
I concur. Chips were always designed in CAD. You can download open-source tools and see the design of the 6502 chip, for example, on a normal PC now!
"Taiwan, Austin Texas and China... ... those sorts of places."
HAHAHAHA
High tech manufacturing zones = those sorts of places.
@@tylerdurden3722 cheaper place to produce = those sorts of places.
@@tukangbobo Afganistan, Venice, Antarctica = those sorts of places
why Texas tho, isn't it expensive to produce it in the U.S ?
@@tylerdurden3722 Atlantis, Haiti, and Mars = those sorts of places.
Amazing video! Thanks for the wonderful overview of ARM. I might just apply for a job there and see what happens :)
For those who want to know the actual languages (like VHDL, Verilog, etc.) and the software (Quartus, Altium, etc.):
Exactly what I suspected.
Thank you.
This is my new favorite YouTube channel.
I want a photolithograph for Christmas.
+The Hoax Hotel Totally agree with the commenter above me. If you like the idea of designing your own chips, an FPGA could be the way for you.
+The Hoax Hotel Well, you might also want a microwave oven full of oxygen and a massive metal chamber from the 60s, along with an IBM 5150, and other computers to transfer data between 5.25 and 3.5 inch floppies. That's how they do it at "Canada's Capital University"
Yeah, well I want a unicorn, but you don't see me gettin one of those any time soon, do ya?
+The Hoax Hotel If you're rich, you can use an old process to do it. It costs much less.
+Meep Walrus Well I never. Hipster Canada.
Nice video, sums up my short 3-year career and current Master's studies! Will apply to ARM for an internship this summer! Wish me luck :).
Now what I actually want to know is what is the language used to make chips like that. And what does it look like.
+William Young VHDL, Verilog or some other hardware description language. Here's an overview: en.wikipedia.org/wiki/Hardware_description_language
I think the best way to learn such a language is to buy a cheap FPGA and program it.
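To give a feel for what "describing hardware" means without assuming a Verilog toolchain, here is a tiny Python sketch (an illustration only, not an HDL): it builds a 1-bit full adder purely out of gate functions and then exhaustively simulates it, the way an HDL testbench would:

```python
# Hardware description is about wiring logic together, not writing
# sequential instructions. Model each gate as a pure function...
def xor_gate(a, b): return a ^ b
def and_gate(a, b): return a & b
def or_gate(a, b):  return a | b

# ...then "wire" them into a 1-bit full adder.
def full_adder(a, b, cin):
    s1 = xor_gate(a, b)
    total = xor_gate(s1, cin)                            # sum bit
    carry = or_gate(and_gate(a, b), and_gate(s1, cin))   # carry-out
    return total, carry

# Exhaustively simulate every input combination, like an HDL testbench.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, c = full_adder(a, b, cin)
            assert 2 * c + s == a + b + cin  # adder obeys binary addition
print("full adder verified")
```

In a real HDL the same structure would be written as concurrent signal assignments and verified by a simulator, but the mindset, which is composing gates and checking every case, is the same.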
+William Young - VHDL is a wonderful language for programming FPGAs. I don't know how well it scales to a massive commercial product like the ARM architecture, but it works wonders for smaller projects.
+IchBinEin The language itself should work just fine even on massive projects like a Cortex-A57. The synthesis and place & route software is another story, though. No doubt it will be vastly different from what you get for FPGAs.
+IchBinEin Verilog is more common than VHDL, as far as I know.
GREAT video. This is the sort of stuff I am really interested in!
You guys need to go over the basics of hardware description languages sometime, don't just leave us hanging.
Interesting stuff. And knowing how many things use ARM chips.. good to know.
There are definitely inventions that wouldn't run as nicely without these low power chips.
I accidentally compiled source code on my GPU [Nvidia Optimus] and boy, it was fast as anything. The amazing thing was I was using the Bumblebee bridge on Arch, and used the command primusrun make. I didn't know I could do that.
What kind of code were you compiling?
The whole testing-CPUs bit is pretty fun. They are just so complicated that full simulations are way too slow. And by that I mean they run at a couple of hertz; compare that to the gigahertz the actual processor runs at. And even partial simulations are only somewhat better, although there are specialized and horribly expensive machines that kind of help.
And then there are those bugs that are way too obscure to ever be found by humans and just randomly testing stuff is really slow... Point is, the first second after turning a cpu on might do more testing than the entire development up till then.
I have some serious respect for the people that somehow manage to make them work almost all the time.
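The gap the comment above describes is easy to quantify with rough, assumed numbers (say an RTL simulation crawling at 100 Hz versus a 2 GHz chip; both figures are illustrative, not from the video):

```python
# How long would it take a slow software simulation to cover the
# cycles the real chip executes in its very first second?
sim_hz = 100        # assumed simulation throughput, cycles per second
chip_hz = 2e9       # assumed real silicon clock, 2 GHz

cycles_in_first_second = chip_hz           # cycles the chip runs in 1 s
sim_seconds = cycles_in_first_second / sim_hz
sim_days = sim_seconds / 86400

print(f"{sim_days:.0f} days of nonstop simulation")  # ~231 days
```

So with these assumptions, the chip's first second of operation corresponds to the better part of a year of simulation time, which is why first power-on exercises the design more than all pre-silicon testing combined.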
"As we set about designing the arm, we didn't really expect to pull it off" ...
Good show!
I'd like to hear Mr. Davies and Dr. Moriarty discus project difficulties of working with materials near quantum boundaries. It might be patently hilarious.
Taiwan and Austin Texas... aaahhh... those sorts of places. Got it.
+Brascofarian Yea... i thought the same things... places that are..... ? what.. warm and far away?
It's not well explained, but some people will know Taiwan, Austin, areas of China, etc. to be huge tech manufacturing locations. Just one of those assumptions made that turns out not to be that clear.
+matsv201 And at least one of them you run the risk of being shot in!
TheBluMeeny I would say that might be two. Taiwan is somewhat of... maybe not the Texas, but more of a miniature US of Asia.
Taiwan used to be China. But when Mao occupied China, all the industrialists fled to Taiwan with the navy. This left communist China without a navy, so they simply could not attack Taiwan.
matsv201 I am aware of the history of Taiwan. I was just making a light hearted joke, nothing more.
A great explanation. Thanks for posting. Good stuff to know.
But they didn't show the most interesting bits: What does the special language he's talking about look like? What is it? What simulation software do they use, and on what principles does it operate? I am disappointed :(
Simply awesome interview! Thanks!
I used to write software, and that could always be debugged or patched, but I can't imagine 'writing' hardware where, as he says, once you hit 'print' it's a million bucks spent, boom just like that. Talk about job stress.
Similarly, hardware also gets debugged and patched
tea cheers for light wave radio four quadrant pixel dance?
3:13 Talking about Minecraft was a bad move. Never do that with old techies, just keep it simple.
Never do that with anyone.
Sorry for OT question. Is it "ARM (=firm - singular, it) doesn't" ... or is it "ARM (=people in ARM - plural,they) don't"? Or both?
So does this mean that two phones from different manufacturers that have the same ARM CPU can potentially have chips fabbed in completely separate places, and contain completely different overall chips with different components? That's very interesting. I wonder how much impact the separate chip fabbing may have on the end product.
Out of curiosity, is it public information whether ARM engineers use VHDL, Verilog or another internal logic design language/tool to develop and test their IP? :) (I hope rubber tapes are out of use now... :D)
This is the first time I've been told I can write hardware like code... ++Coding Skills
From the title I wasn't optimistic but this was actually really interesting. Nice video!!
how do they test the next generation of processor if they have to simulate it on older tech?
Who’s here after Apple silicon
You people at computerphile should interview Alberto Sangiovanni Vincentelli or someone from the Electronic Design Automation community! :)
Did he mention what hardware simulation language they use? Does anyone use Verilog in industry?
Are companies likely to edit the IP cores, or directly integrate them into large SoCs as is?
I would love to do this kind of thing in the future. What do I have to study in college to be a CPU architect?
what billion does he mean? us billion or european billion?
+Lobster with Mustard and Rice US billion, 1000 millions.
Stoppi thanks
I thought the European billion was outdated and people now just use 1,000 million?
Mat Smith XyllianPC is right. Numberphile has made a video on it. "How big is a billion". It's interesting, pls watch it
+Lobster with Mustard and Rice What a backwards-ass continent. You guys can keep trying to change simple things that have been established for a long time. But there is only one type of billion: 1,000 millions.
wait... at the start did he say they ship a billion CPUs a month?
Surely he meant to say a million....
he definitely meant a million lol
+Charles Miller It really is a billion. A couple of years ago I remember they sold 7 billion in a year. No doubt they sell a few more now. Think each mobile phone, tablet and many other devices contain several ARM based chips.
+Charles Miller ua-cam.com/video/1jOJl8gRPyQ/v-deo.html
+Christy Salter Actually they shipped 0 CPUs, but they shipped 12 billion licenses.
I would presume this includes microcode licenses as well as core licenses.
The issue is that AMD runs licenses of Intel's and Intel runs (microcode) licenses of AMD's. It's not x86 licenses, those expired long ago. But it's x64, SSE, MMX, 3DNow! and other licenses.
+Joe Holland He said month
would they make the different 'blocks' from scratch each time? or would they just copy and paste from previous designs.
+Raymond “RooMan” Lobban They would obviously re-use the majority of their previous designs and only improve upon a few areas in each iteration. Designing chips nowadays isn't really that different from writing software. Instead of C++ or Java, they use VHDL or Verilog. As you said, they simply copy the parts of the design that haven't changed.
A question for Jem or anyone else in the know; What “special language” was he referring to that ARM programmers use to design chips? What would be the typical route of someone wanting to get such a job with ARM?
Hope I am not too late, but hopefully someone else might benefit. He means languages like Verilog, SystemVerilog and VHDL, and their corresponding IDEs, e.g. Vivado.
I have an interview for an intern role at ARM today and tomorrow.
Great interview.
I actually saw someone use a checkbook to pay at Walmart the other day so there's at least one guy who still does it.
Would be nice to know what language they use ..
"I am the Architecht. I created the Matrix. I've been waiting for you.
He's probably not saying it, but what he is talking about is using FPGAs with Verilog or the Xilinx equivalent.
+Joseph Nicholas Sorry, in the end he did mention FPGA modeling. ARM Cortex chips are great!
I know Intel has had a few silicon bugs. Never heard of ARM having them. I know there are device specific silicon limitations/errata (specifically in the microcontroller space) for ARM, but likely more due to manufacturer faults than CPU design. So how many design bugs does ARM have/had?
+leppie The mainstream-discussed Intel bugs were always in microcode (Pentium FDIV, Halt and Catch Fire, Skylake), not the chips themselves. ARM chips don't use microcode; they are therefore much simpler and easier to test. (This has a lot to do with the fact that Intel uses CISC instructions while ARM uses RISC. Intel's instructions are designed to be easy to use, while ARM's are designed to be easy to implement. Since almost nobody programmes in assembly anymore, guess which is better for today's world.)
On the other hand, there were a few bugs during the years as well. If you can get hold of ARM documentation, look for errata.
+Jan Sten Adámek Thanks for the feedback. I don't know much about the microcode aspect, but was the FDIV Pentium bug not fixable with microcode? I would not have expected a recall/class-action suit if it was fixable by microcode, unless the fix was actually too large to be addressed in microcode storage.
leppie It was not possible to update microcode in those early Pentiums; the next-gen Pentium Pro was the first that could be updated by software.
+Jan Sten Adámek And again I learn something new :D
+leppie There was a pretty interesting talk on chip design at 32C3, which also touches on the hardware-bug / microcode stuff. watch?v=eDmv0sDB1Ak
Can you do a video about matlab?
It always entertains me hearing about using computers to design and make computers.
Actually, there's manual composition of IP by people at the chip level too.
Can you make a video on how computers generate random numbers?
+Computerphile And the foundries get the chip-making machines from ASML :) (except for 5% of the market), and that's how the city of Veldhoven transformed into a manufacturing plant XD
Fascinating system ... and it's actually working!
The title sounds wrong, even if grammatically correct
+ahenryb1 Good on you for at least acknowledging that your intuition of how English works isn't a universal law.
+ahenryb1 Depends on which side of the pond you're on. In American English, "ARM" is a company, and thus a singular entity, so properly conjugating "do not" yields "doesn't" instead of "don't". In British, "ARM" might be plural, because British English is different, in which case "don't" would be correct.
"ARM does not fabricate chips" or "ARM only designs chips" would have been fine titles too.
In American English, the phrase "ARM don't make computer chips" is what we call "Hickish"; it sounds as if it's in a southern vernacular, much like "I ain't gonna eat them chips".
Ian Walker And yet our version of English in the U.S. is only one of the many equally valid variations on the language.
*doesn't
Scartch Sketch,etchy pig sales?
Advanced Risc Machines. It's a plural.
So Jem Davies and Steve Furber disagree on the origins of the phrase "tape out". In another Computerphile video, Furber says that "tape out" comes from writing the design schematic to tape that'll then be shipped to the foundry to make masks. Given that Furber was there and doing it, I'm going to put my bet on him. (Which video? Dunno, I've watched a few, and there aren't transcripts to search, and I'm too lazy to find out which one.)
The guy asking the questions sounds a lot like Prof. Brian Cox!
Hey guys I am really interested in computers and I want to learn (at least basic things) about programming. Can you recommend a website or a youtube channel for that?
+Mert Oral I highly recommend that you do that! Sorry, but I don't know of any website or YouTube channel off the top of my head. My suggestion is to pick a programming language (Java or C++ are good choices for a beginner) and search for beginner tutorials on Google. I've been programming since I was 8, and even though it's not related to my occupation or studies, I do it on a daily basis as a hobby that has brought me a lot of joy. The good thing about programming is that once you fully master one of the languages, it becomes trivial to learn and even master the rest (in a matter of weeks).
+Mert Oral Handmade Hero (handmadehero.org) and Learn Python the Hard Way (learnpythonthehardway.org) are what I can think of right now that are both completely free and get you started without you having to dig through stuff.
Nand to Tetris (nand2tetris.org) covers building your own hardware from Nand gates (and DFFs for memory) up through assembly, compilers, operating systems, and games, but you need to buy the book for everything past chapter 6. The first six chapters are available for free online and cover most of the hardware and introduce you to the assembly language used in the book.
You could also look at reddit.com/r/learnprogramming for more resources. The sidebar on the right has a lot of information, so read it before you post anything.
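To give a taste of what Nand to Tetris builds in its first chapter, here's a minimal sketch in Python standing in for the book's HDL — every other basic gate derived from NAND alone (the function names are my own, not the book's):

```python
# Building basic logic gates out of NAND alone, in the spirit of the
# early chapters of Nand to Tetris. Python booleans stand in for wires.

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    # NAND with both inputs tied together inverts the signal
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    # AND is just an inverted NAND
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    # De Morgan: a OR b == NOT(NOT a AND NOT b)
    return nand(not_(a), not_(b))

def xor(a: bool, b: bool) -> bool:
    # true when exactly one input is true
    return and_(or_(a, b), nand(a, b))

# quick truth-table check against Python's own operators
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor(a, b) == (a != b)
```

The course then repeats this trick at every layer: gates into adders, adders into an ALU, and so on up to a working computer.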
Is the interviewer Mat Watson of CarWow?
Why do these videos have such bad camera shake? Is it on purpose? What is so hard about setting the camera on a tripod?
So do they literally program with code like you would with Java? How do they know what to improve on?
For anyone who thinks that hardware description languages sound interesting, you ought to look into FPGAs. You can program them directly and have them emulate virtually any other microchip, provided the FPGA has the specs to do the job.
They're also obscenely expensive at the gate counts needed for projects of any reasonable complexity. Maybe worth it for prototyping, but not cost-effective for mass production.
Rob Mckennie You're gonna have to qualify that with exactly how many logic gates you're talking about. There's low-end for everything. If you want to simulate something on the order of a Cortex-A series, it will cost dearly.
Mike Trieu Why would you want to emulate a device of that kind of complexity? I don't think the kind of hobbyist to whom I was recommending an FPGA would have any kind of need to emulate a modern CPU.
Rob Mckennie Oh, well you didn't clarify "hobbyist" in your OP. Also, "hobbyist" is somewhat of a nebulous term, too, when you have folks like Bunnie Huang literally reverse-engineering his own CPUs for the Novena project.
I have the same ShoreTel phone sitting next to me in my office right now.
computer chips are like potato chips
except not edible
+Austin Pinheiro but they taste the same right?
jklw10
nop
yeah but computer chips are editable
Malappapas
only if you're runnin' on a PotatoOS
Papalu Pikalo? Is that you?
shouldn't it be 'doesn't' in the title and if not, why?
+Blackwater Park For an explanation of why 'don't' is perfectly acceptable, see the Wikipedia article on "Collective noun[s]," particularly the section titled "Metonymic merging of grammatical number." Or, if you don't want to be bored to tears, you could just take my word for it :) .
Great video. I really enjoyed it.
how does one get into this writing hardware thing?
What program or language is used to write hardware? I would like to try to architect myself a microprocessor
Ørjan Solli VHDL, Verilog, SystemVerilog
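For a flavour of what those hardware description languages look like, here's a tiny Verilog sketch — a hypothetical 2-to-1 multiplexer, my own toy example, not taken from any real ARM design:

```verilog
// A 2-to-1 multiplexer. Hardware is *described*, not sequenced:
// a synthesis tool turns this description into actual gates.
module mux2 (
    input  wire a,
    input  wire b,
    input  wire sel,
    output wire y
);
    // continuous assignment: y tracks its inputs at all times
    assign y = sel ? b : a;
endmodule
```

Unlike Java, the statements here describe wiring that all exists at once; there's no instruction pointer stepping through them one by one.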
I don't know if it's some weird difference between British and American grammar, but it should read, at least in American English, "ARM doesn't make chips.".
Zak Zennii according to Cambridge it doesn't matter in British English.
I knew there was a magic box! I knew it!
I think, in the near future, there will be a movement of people calling for an end to proprietary hardware.
LegIt?
Nope. The future is looking to be FPGA-ASIC hybrids, with home computers turning into a terminal on a mainframe. I wish it was otherwise. I love open source hardware, but look up the Xilinx-Amazon partnership that was recently announced. Then ask why Intel bought out Altera. The end of Moore's Law will result in the expansion of FPGAs. FPGAs are capable of running a local user interface with remote software faster than a consumer-grade system with local storage. Why the shift? Proprietary software security. You won't have access to the complete software on a terminal-based system. The future will be the 1960s mainframe computer, version 2.0... if open source projects like the Linux kernel don't take most of the market share first.
-Jake
@@UpcycleElectronics So you expect everyone to just dump their freedom in the garbage? No sane person will use centralized resources for private purposes. Yeah, the Linux kernel is ubiquitous today and going strong.
@@GeorgWilde is 2% ubiquitous?
So ARM don't make computer chips? That's funny. In my city there's a company called LEG, and they do neither. They just help you find an apartment where you can put your computer.
Since people like correcting the grammar in the title of this video:
ARM ne fabrique pas les puces d'ordinateur.
+Meep Walrus I think this should be the right title. No more confusion with don't/doesn't
Nevermind that stuff, what I want to know is how he lost only the tip of ONE finger in the middle of his hand o.O
English question: Shouldn't it be "doesn't" instead of "don't"?
The paragraph below is mostly taken from the response I gave to an earlier comment asking the same question. The short answer is that, in English, the subject of the title sentence, "ARM," is what's known as a collective noun. In at least one variety of English - British - it is considered acceptable (though it is not required) for collective nouns, even in their singular form, to be used with plural verb forms. Thus we get "ARM Don't Make Computer Chips."
For a more detailed explanation of why "don't" is perfectly acceptable, see the Wikipedia article on "Collective noun[s]," particularly the section titled "Metonymic merging of grammatical number." Or, if you don't want to be bored to tears, you could just take my word for it :) .
The annotation over the Raspberry Pi one takes you to the "why computers use binary" video, and the annotation over the "why use binary" one takes you to the Raspberry Pi video.
A question about people like these ARM architects: what kind of education do they need to have the knowledge required for such a job? I know that it's probably not business administration, but other than that, not much. I'm just asking out of curiosity, because the things he talks about seem to include both hardware and software and other stuff.
Gediminas B A bachelor's in electrical engineering / computer engineering, or, where I live, a bachelor's in electrical engineering and then a master's in computer engineering.
If I recall correctly, "ARM don't make chips/computer chips" is improper grammar.
You should have substituted "doesn't" for "don't".
ender_scythe doesn't matter in British
But the ARM architecture is too closed, and then changing from one OS to another will not be possible!!!!
Interesting. I've been wondering why the market is flooding with new ARM chips for Raspberry Pi clones, cheap smartphones, tablets, and embedded devices like cash registers, climate-control devices and such. Someone must have been testing these things for multi-purpose use before releasing them to the audience at a dump price.
"in Taiwan, in Austin Texas, and China and you know, all those sorts of places"
I'm not sure whether to be offended by that sentence because I can't tell what sort of place that list is representative of.
+Trabber Shir The sorts of places with booming tech sectors.
+Adam Leuer Hehe...Texas is certainly *booming*
Get it, cause guns and 'splosions.
Another video idea: how to design a CPU.
I've noticed that about 30% of the comments so far are about linguistic concepts. Seriously, people?
What's that sound at 8:54 ?
That is the sound a phone makes in the UK if you leave it off the hook... >Sean
Are you trying to say ARM as if it's a plural? I'm pretty sure that "Don't" in the title should be "Doesn't"
+RobloxDev It's actually a perfectly valid construction in some forms of English, though it may not be common in your country of origin. Check out the other comment thread discussing this question, a couple of us have posted more info to try and clear this up.
Glad this channel isn't grammarphile. People would be shitting bricks at that video title.
Is it rude to ask what's up with his right arm middle finger?
I've seen logic done with Lego pneumatics..
Really, really, thanks! I was baffled by how they can build that kind of chip; now I know how they do it. The programming language seems so "logic". Doing this is awesome :)
I wish I was smart enough to work for these people.
Fascinating!
I wish there were subtitles on this video; it seems interesting, but I'm deaf...
Well, lucky for you, the new Google Pixels have a Live Caption feature that captions everything on the fly.
A tablet is a computer.
SFP sopwith camel pixel wrote slate,yers sincerly,F.Leghorn
This video is gold. ARM is going public!
awesome video
I wish my computer was running on an ARM chip.
Very interesting!
Until recently...
That is just amazing...