I'm absolutely fascinated by these old mechanical computers. There was no software back then to design them, the device was designed within someone's imagination. Truly incredible.
They were designed on paper using methods and instruments that were practiced for years. Imagination alone was never enough. We have always used tools to supplement the deficiencies of our bodies and minds. Software is merely the latest tool.
Your comment doesn't make much sense. Even right now when you are designing something you need to visualize the outcome before writing the code for the task. So this has been happening in the past and in the present and will continue to happen in the future. You can't design something without imagining it in your head first.
A mechanical computer is simply a scale model simulator of the process under consideration. Its components are assumed to represent all of the forces involved in the real world process, but some aspects might have been overlooked. As the narrator said, an analogue computer is specific to the problem being solved, hence there is no concept of software.
The ball and disk integrator actually blew my mind. I cannot believe such a thing has existed and I only ever heard about it now. What a beautiful machine.
For those interested, May I suggest books 📚 like “1800 mechanical movements, devices and appliances” which give an amazing insight into the variety of complex mechanical movements possible in that realm.
@@tomorrow6 Yes that Ball and Disk was really neat, I thought I knew about all the fundamentals in this area. That 1800 movements book is really cool too. Any other sources of roots concepts like that, feel free to post'em : ) also hoping I don't miss it when the new part 2 video comes out
Honestly, I don't think it would be too hard to come up with this idea. It follows pretty naturally from the fundamental theorem of calculus, which basically says that you have to translate the height of the input into the slope of the output. I was more impressed by the pulley thing. Now that is neat!
Yea no doubt. The more I learn the more it becomes apparent our whole human world is a massive construction standing on the shoulders of countless geniuses. It’s just mind boggling how clever the differential machine is alone.
it also helps to understand that because mathematics dogma was driven towards physical representations of equations before the advent of digital computing, that it would be a lot more intuitive for people at the time to make mechanisms that perform equations !
My mom used to work on an analog computer in the 1940's. She worked in a comptometer office when they got this "new machine." She never called it a computer but as She described it my mouth nearly dropped. I was in college learning programming at the time and we had recently gone over the history of computing. She said it had a bunch of wires and plugs and dials and flashing lights. Her boss couldn't figure it out so he gave her the manual to figure it out.
@@SirUncleDolan When I started in DP, I did, too. And if you dropped a deck, you better hope someone keyed sequence numbers in 73-80. In the mainframe world, The Librarian was the first source manager that made it big and provided a decent backup and maintenance mechanism and made card decks pretty much obsolete.
When I was learning to be an engineer back in the early '70s, analog computers were on the way out the door. Large-scale integration was beginning, and Moore's Law was a new concept that my professor's predicted was going to revolutionize computers. Fifty years later I am retired after a career in digital computing, and now I find that analog is making a comeback. I am looking forward to part 2; I know enough about analog computer that I can anticipate some of the application for which they will be useful. I suspect the improvements in electronics, and perhaps even 3D printing of components will produce new and sophisticated analog machines. Makes me wish I was 20 again, so I could have a second career in analog! Please keep making these videos - you are doing valuable work. 👍
Absolutely. Although I personally suspect that 3D printing will take a back seat to modern CNC machining -- "computer numerical control" -- use a digital computer to control a lathe or mill (or yes, a 3D printer) to build the parts for an analog computer. 3D printing plastic parts is great, but I'd rather have a precision machined piece out of aluminum or brass. And just as an example (that I googled; I am not an expert in either field), a typical 3D printer has a resolution of 140 microns, while precision machining can get below 5 microns
Nice that you share this with us! What a time to have been in digital computing, all those changes. I hope you see still more evolution happening. Must be exciting, for me it is anyway, though I'm only 40 years old:)
Chris from "Clickspring" is building an Antikytherean mechanism and has been building it using period tools and techniques to the best of the experts knowledge...the ww1/2 analog firing computers are still incredibly advanced and smartly built tbh. They are just insanely accurate for as little input as they take and what they can interpret and output for solutions
As a US Navy reactor operator of 60s-era nuke submarines, I am recalling that subs had a large number of analog computers, from bow to stern, so to speak. In my own training for my specialty, we were told of magnetic amplifiers (mag-amps) used in German gun directors, that still worked perfectly after being recovered from sunken ships. Part of my work was checking and correcting as needed, the micrometer settings of certain variable circuit components in a particular analog computer that was absolutely vital to the operation of a nuclear submarine engineering plant. In the meantime, we did manual calculations with a slide rule and graphs, to determine when the reactor should go critical. (all of these submarines were turned into razor blades decades ago, so no useful classified information is in this remark)
the classic submariner's lament...when a sub is decommissioned and turned into so much scrap metal, the end result of which is the sub is remanufactured into other steel products...we say, she was turned into razor blades (a trivial end to a once formidable machine. @@tfuenke
What a coincidence! Just this week I started listening to a few of your songs again :D But you're right, he does have a very good way of explaining and visualizing complex topics.
This was astoundingly relevant to--almost a summary of--my History of Science: The Digital Age course, for which I have a final for tomorrow. This video is practically a 'further reading' section. Thank you for this.
Good luck! Although it depends what part of the world you live in, a final on the week of Christmas sure is rough. Hope you at least get New Years off!
This presentation, on this channel, may be my all-time favorite. As a software engineer, I am blown away and humbled by the innovations of people like Lord Kelvin. I absolutely loved the organization and flow of this presentation.
Agreed! Their entire production is the very highest in quality. Just reviewing the references on this 1 video alone says a lot. It really is one of the best science channels on UA-cam. When I think about this other youtuber...a 20-something year old turd who's worth $20M from making imbecilic, "how dumb can we get in 20 minutes on this video?" while he's driving a Bently down Sunset Blvd in We-Ho, w/ his model girlfriend, who wouldn't give him the time of day except for his $20M, his Bently, his mansion & his fame. No wonder this country can't have nice things.
@@MrDAMS1963 Which country are you talking about? If you're in the USA you already have some of the nicest things in history. If you want to make history, change the world. If you want to make money, appeal to the average masses.
I used and helped develop analog computers in 1960's and early 70's. Gun aiming analog mechanical computers used gears and wheels . It considered ship vector and speed and distance to target. The disk/ball mechanism for integration mentioned were also part of it. Electronic ones used tube then transistor operational amplifiers with resistor input and feedback for aritmatic plus capacitor feedback for calculus anf special diode networks in input for trig. And so on. Moon landing simulator we made combined analog (to "think" ) and digitals with control for in/ out. Civil engineers and auto companies used analog pc's to optimize suspensions and roadside slope grading. Plotters as large as beds drew plans and curves far more accurate than primitive digitals. I used m9 in USAF.
The firing solution computer on the USS North Carolina is impressive. It also considered temperature, barometric pressure, delta in altitude (for targets on cliffs), rotation of the earth, and roll of the ship.
And now a digital computer, very small can do all this calculation very quickly, less costly and much easier to maintain. I did two tours on DDGs that were converted DLGs and we had a 5" gun that used this old analog system, but everything else ran on digital computers. The accuracy of the 5" gun wasn't great. I did a tour on an FFG with a 3" gun and it could put a round through a window of about 1sq. m at quite a distance. It ran on a digital computer. I was a DS, Data Systems Tech, and worked with these digital computers and different systems involved in CIC. Out system changed names a couple times and eventually it got eaten up into an integrated weapons system when AEGIS came about. I REALLY enjoyed working on this equipment (1980 - 2000) and I've followed computer technology ever since. The need for ever more powerful computers does not require that a transistor keeps shrinking and this is a fallacy that different people who follow computer technology have spread (what we do when we can no longer shrink a transistor). The fact is when a transistor hits that point it will use such a small amount of electricity that ICs can become much larger as long as clock speeds are low enough. But this brings up the problem of failure rate for die being produced from a wafer, where the larger the die is the higher in percentage is for unusable die from the wafer, since every wafer has imperfections. So we have now moved to chiplets, so a CPU can be made of multiple chiplets and this solves so many problems, along with being able to stack one die on top of another, and eventually probably being able to produce multi layer die that I see most of this talk about the need to move away from digital along with moving away from silicon wafers as sensationalism and nothing more. A bigger problem for digital computers has been and will be how fast memory is. CPUs have to do a LOT of work to try to predict what data/instructions will be needed and fetch it from memory and load it into cache on the CPU which works much faster. If memory could be sped up to where it could run even 1/2 the speed of the CPU cores that amount of performance boost would be VERY impressive and it would simplify the CPU because you wouldn't need so much branch prediction and prefetching, flushing an instruction pipeline with a branch prediction is wrong, etc........
@@tango_uniform Also pitch, wind direction and speed, for a gun if I remember correctly. Even in launching SAR missiles, you need pitch and roll to get the missile going in the correct trajectory
I hope you have watched the video narrated by Spock titled “The Last Question”. If youre an AC pioneer, you’ll appreciate it. You guys are the golden age of engineering. Hats off.
In 2000s, I helped to disassemble a big modular analog computer from 70s. It was pretty universal in its day, but mostly used to model the transients during a big electric motor startup. A big 20x20 switchboard, few electric drums for variable coefficients, everything an engineer could have wished for. As the students, our task was to grab the equations it was modelling and port it into MATLAB. A paltry Pentium-2 PC was quite faster and much more precise than that giant machine. After the model was ported, we together with the faculty's staff disassembled the computer and moved its modules into the humidity-controlled storage. They wanted to preserve them for history. Wonder if it's still in there...
It's so crazy how adding or multiplying sine waves, something that's as simple as punching values into a calculator today, used to require some unbelievable engineering. I mean, just the notion of such an advanced mechanical computer makes my head hurt. The things we do today would be seen as magic many years ago. Great video!
You need to punch in a whole lot of numbers in that calculator to do a fourier transformation. The fact that it was analog is ideal for fourier transformations instead of doing a lot of calculations the next calculation comes from turning the wheel slightly. No wonder these where still used in the 1960'ties.
Obviously there must be an external input, but you don't necessarily need to do everything manually, like in a calculator. If you already know the formulas that you will follow, you can create a web panel / etc to facilitate it to the user.
The analog computers used by navies during the First and Second World Wars are amazing technological feats. The inputs are the continuous, relative motions of opposing ships and the outputs are a synchronous stream of firing solutions.
@@byrnemeister2008 It became a important piece of equipment to the Allies due to it’s increased accuracy with pinpointing the exact location for detonation.
The US Navy was still using pure analog computers for gun fire control into the 70s and possibly early 80s. In the early 70s I worked on the fire control radar for the Tartar missile system and at that time our system was a mixture of analog and digital.
So much goes over my head, b/c mathematics proved that my brain was deficient in something. I maxed out intellectually/academically at _pre_ Algebra. It was heartbreaking and I'll never be completely be rid of the shame and guilt, b/c I excelled at writing and English. Bc/ of my math and analytical weakness, my IQ is only 80, despite the fact that I managed to finish 2 college degrees. So, degrees don't really mean anything (unless they have ivy league backing).
I was a field engineer for Electronics Associates Inc., the largest manufacturer of analog computers in the 60's and 70's. I traveled all over, but the last year was at NASA Ames. Among other things, the navigation 8-ball used in the Apollo program was developed on our computers. They were state of the art at time time. We later combined them with digital computers. The complex computations were handled with ease on the analog portion and the raw number crunching was done on the digital computer. All together we had around 15 large scale analog computers on site at NASA Ames filling entire rooms and involved in all aspects of their various missions from spaceflight to life science studies. Analog computers speak to engineers in their language--mathematics. Digital computers require interpretation between languages.
4:15 It's pretty mind boggling that a telegraph cable was laid across the Atlantic ocean in the 1850's. I'd love to learn the details of that endeavor someday.
In brief - yes, it worked. But ... it was shoddily made, and leaked water. Thus, the signals were very weak. Two solutions were proposed. One was to use a mirror to amplify the weak signals by reflecting a dot on a wall. The second, advanced by an arch-rival, was to use high voltage. The arch-rival won the debate .. and burnt out the cable in a few days. This is in a short story is told by Neil Stephenson - "Mother Earth Mother Board" from 1996
Bet you can find at least half a dozen videos on this topic. Now is the time to pray 🙏🤞as my luck is usually terrible when it comes to winning bets (though logically speaking I should win).
@@adblockturnedoff4515 Except Neil Stephenson is a great writer. He even starts out the story with this convoluted Victorian introduction 'wherein it is explored...'
So Lord Kelvin was the first guy to think of a system to compute the FFT! How come I never heard about this in engineering school. Before DFTs and FFTs, there existed AFT (Analog Fourier Transform). Mind blown...🤯. And what an elegant construction. These concepts should be used to teach mathematics and engineering in STEM. The genesis of the thinking behind any mathematical and engineering breakthrough goes way beyond equations and has real world analogies that are much easier to understand. Brilliant video. Thanks for everything you do, Veritasium. 🎉🙏🏽
Precisely this, had they taught me the history behind the math in high school I would have been more interested. As opposed to asking the teachers what sine and cosine is actually used for and getting a non answer.
@@m1ndk1ller My teacher used to say just study whatever is in your book dont ask useless questions, these people just don't know the importance of all this and just made us into calculators.
For those of us, who were never explained the reasonings of math, or having a dictionary/thesaurus, to find out what the differing parts of math are/were for, it is all but impossible to see something that does not have any tangible roots for yo draw an image within our minds, … I have never understood the practicality of algebra, any further than Einstein’s time x mileage, =‘s mph, … drawing a picture inside my mind of what something should look like, (such as using the “carpenter’s rule of thumb”), … is the one sure way, that I have used in creating doorway’s that are genuine in their appearance, & setting up a baseball/soccer field in proper angular accordance with the rules, … flat lining a parking lot, with a slope of 2 tenths of an inch from the highest point of the area, to the sewer drains, have ensured that water would not collect, and flood patrons vehicles in abundant rainstorms, … running a leveled string line, ensures grade at marked consideration points, for the backfill, grading, of stone, & legitimacy of pavement, whether poured, or heat adhered such as asphalt, … making an unbalanced wall look flat is also to be had, provided shims are properly placed, & utilized throughout construction, … painting said walls & ceilings can also be sight affected by how much paint is applied with a roller, & in what direction, it is rolled out, with textures, notwithstanding, …
My grandad was a bomber pilot in the RAF in WW2, his crew was occasionally given new devices to test, one of them was a machine that contained a moving map that was small enough to strap to their thigh. This map would rotate around rollers and update to show the terrain he was flying over regardless of visibility to the ground. The whole crew thought it was incredible and provided great feedback to the engineers that developed it. But as was often the case it was taken away after testing and they never got it back for the rest of the war. Always been curious what that machine was, how it worked and why it was taken away if it seemed to be working.
Many incredible inventions and discoveries made during the war were never actually put into use with front line forces because it was feared they would do more harm than good if they fell into the hands of the Axis powers. This was especially true of inventions like the one you describe. A key part of the aerial defence of the UK was the use of decoys on the ground to send German bombers using visual navigation off-course - at various points during the Battle of Britain entire fake towns and other features were created for this purpose. If the device you describe were to found by the Germans in a crashed plane then it could be reverse engineered and the accuracy of their bombers would improve ten fold, even in weather conditions where our own fighters could not see the enemy. Worse still, what if these same computers were adapted into guidance systems for the German V1 and later V2 unmanned missile and rocket attacks? The navigation abilities of V1 were notoriously bad, which is why it was largely fired at targets so big that they were hard to miss, i.e. London. They weren't precision weapons, but what if this technology was available to the Germans? That development may have altered the outcome of the war. These are just hypothetical reasons, I'm sure someone knows the real reasons why the device your grandfather tested was never seen again. However these types of very real fears were the case for many inventions being kept on a shelf marked Top Secret instead of being used.
It was taken away and given to the Nazis to help them build up the Nazi War Machine! Then, our inventions that the Nazis was using made them look like a genius, when it was in fact American genius. There are spies in our Military Industrial Complex who sit in the highest military and civilian positions waiting to transfer American military and civilian secret to our enemies for a handsome price!
I dropped out of school, regrettably, but whenever I find your channel while scrolling, I always seem to pay more attention than I did in school. You've taught me more than most of the people who were paid to teach me, and for that I appreciate you V, keep up the good work!
I've done a teardown video of a B52 bomber astro compass analog computer and it's glorious how these thing can compute sinusoidals and do integration etc.
Why destroy the old mechanism? None are being made any more today. Is there a security reason for it’s destruction? Or am I mistaken and you meant that you took it apart to learn about it’s function?
@@wynfrithnichtwo8423 Yeah, you've missed a term... "Teardown" involves (specifically) taking carefully apart so that it can be rebuilt... including any and all repairs or replacements... so that when you are finished with a "full operation" you have two majority stages... "Teardown" and "Rebuild". If you'd like a specific reference in literature, Hayne's or Chilton's or "Climber" (for motorcycles) offer "Complete Teardown and Rebuild" Guides to automobiles, motorcycles, and ATV's... and there are probably other publications... SO you can order or download based on whatever vehicle you'd like by Manufacturer, Year, Model, and any additional nomenclature... AND read up on just about anything you'd ever want to know about it... Some of us "restorers" have to take things apart in a "teardown" operation, to study them. We usually have to take pictures prolifically of the project as we do so, and then study those pictures and the components to decipher how to get them back together to make them work "like new" again... and other times to simply fabricate a new one... occasionally (when we're lucky) better than the original. Nobody is willingly tossing one (essentially) into a chipper. I promise. We have more respect for the past and the heroes who lived it, than that. We're just as fascinated as you are! ;o)
I do want to point out on the rather incredibly improvement of shells/kill ratios of Allied AA guns: that wasn't solely down to better maneuvering of the guns. Planes are fast and small, so hitting them in three dimensional space is really difficult. The Americans and the British jointly developed the proximity fuse, basically a way for the shell to detect when it was near an enemy plane and detonate, and this at least removed the third dimension from their aim, making them much more accurate.
Oh from one of the Curious Droid channel video yeah it basically the fuse that make allied guna more accurate and with the additional improvement with equipment to calculate the trajectory it makes more accurate guns
Proximity fuses were far more important than the rare analogue control of a aa gun. As you still had to aim the gun at the plane yourself for the analogue computer to attempt to correct for speed, distance and wind. Our own analogue computer in a trained operator (called a brain) did this just as well with some intermittent flack rounds for the adjustments to be made. Proximity rounds that actually went off weather they hit or just went near was a game changer (though these were only in use on the large calibre aa's not the light machine gun type).
Right, we need a comparison of before/after the proximity fuse was implemented, which I understand to be very close to the end of the Pacific theater. I checked: first use January 1943.
In absolute awe of these old computers. The Rotary Ball Integrator is beautiful. Back in college, it was all about learning the symbology of how to solve integral and differential equations, and although we were told we needed to understand conceptually what we were doing, we didn't spend any time on it. Seeing these machines add all those waves makes that visual representation all the more fascinating and gives you a sense "aha! so that's what it means to add waves and take the integral"
I don't think I really appreciated what integration was doing until I took a chemistry lab in college with an outdated FTIR that could only print to paper. We had to cut different peaks and weigh them to compare against each other.
@Grace Jackson I never get this about educational systems, they all start backwards and expect people to be good at grasping things out of a vacuum. I tought myself the most complex stuff possible without any help from teachers or experts. When I analysed how I did it it was always "backwards" to how school tought you: 1. I saw a problem, 2. I broke it down to the basic most important aspects and (...) then I calculated it. I was always very happy with my solutions (they worked perfectly). (...) = 1. as an amateur having absolutely no clue on deeper scientific stuff, 2. looking up the formulas necessary for the calculation (or in case of language learning - learning the grammar), 3. learning on my own how to apply the formulas, 4. adding formulas together to solve the problem. Why could I never do this stuff in school or university? Because I only had a fuckton of puzzle pieces and absolutely no picture in mind of what they are supposed to ressemble. If you have a problem - like the tidal waves - you very clearly have data on what to expect and compare to your solution. I wish school would start out teaching the actual real world application FIRST (yes, it is extremely complicated) and break it down in smaller and smaller parts that can be tought individually. You can always see that you are in "chapter 3" of the "tidal wave problem" and where this fits in the bigger picture and what is left to do. At first you think "Sh*t I'll never be able to learn this clusterf**k?!", but having this picture of what is expected is giving such a great motivational reward when you realize you solved this HUGE problem on your own. How does school do it? They teach you all the tiny pieces with NO explenation of what they are for. Then at the end you get to solve a problem that is laughably easier than tides calculation and you still don't know how to solve it, because you never had to apply the puzzle pieces to real world problems. School... every time... I hate it so much.
As an electrical engineering student in the early 80's, I worked at a steel plant where I helped fix an analog (tube) computer that controlled a mill which cold-rolled aluminum. Just when we figured out what was wrong with it, they decided to use digital instead.
The problem with analog computers is that there will always be some slop in the connections between parts; the problem with digital computers is that always there will be cosmic particles to malfunction it. :-)
What a great talk. I remember hearing a story about analog computers for use in naval gunnery; they were used to keep the guns level at the target in rolling seas. When they tried to replace them with digital computers they found the digital computers were too slow to compensate for the movement of the ship, and had to go back to analog controls. It took a couple of decades of digital computer development before the digital computers were fast enough to replace the analog ones.
I’m glad you brought that up. This video states the problems with analog but did not mention the huge advantage and that is speed. An analog computer can solve VERY quickly, which is why they were so effective for use on gun sights against dive bombers.
If he born today... Elon musk would have become 2 most richest person.... Or at least, Nobal would have started giving Noval prized for maths and cs.... Just to give him....
@@darkblaze1594 A differentiator would be relatively easy to design compared to an integrator. Without having any prior knowledge of such designs, I can imagine one possible way it would work. If you have a machine which plots waves on a chart, you attach a weight to the swinging arm which plots the line, in order to increase its inertia. Then you measure the force on that arm, which is proportional to the acceleration of the needle, which is of course its derivative or second derivative, depending on what you were measuring.
As a person into computer science myself, I love looking at the old ways of doing things. Seeing the pulleys and the ball integrators was incredible, and I can't wait for the next video.
@@Hshjshshjsj72727 There are actually continuous data structures in computer science research, so an analog system is not exactly necessary to store analog data.
Analog devices like the Fourier analyzer are so astounding to me for how creative the inventor had to be. One of the coolest is a "planimeter", which measures the area of an arbitrary shape by tracing around its perimeter.
@@LucasPlay171 Possible but not the easiest solution-it's much quicker and cheaper to cut the shape out and weigh it and then compare it with the weight of a unit square. Years ago one of my chemistry teachers explained to me that that's how they were taught to take the area beneath the graphs their instruments produced-as chemists they had high resolution, sensitive balances to hand.
I had no idea about the history of tide computations. The presentation of the Fourier application is the best video that I ever saw. Even if I hadn't taken Calculus 30 years ago, there would have been a lot of understanding imparted from the stunning visual aids. The video also provides an appreciation of the genius (1% inspiration and 99% perspiration) of earlier generations.
Claude Shannon is by FAR my favourite mathematician, and, in my opinion, the single most underrated. Very happy to see him get a mention here! His work enabled the ENTIRETY of modern computing, communication, and information theory. Plus, he has a lot of other really cool work, such as the sampling theorem (bane of audiophiles everywhere). And better yet, he did a significant portion of his important work as a master's student!
And Shannon's Law is still used to analyse the capacity of an electronic communications channel C = W log2(1 + S/N ), where C is the channel capacity in bits per second, W is the bandwidth in hertz, and S/N is the signal-to-noise ratio
Hey brother, all these information theory I have those topics in my UG course... dumb faculties don't care about explaining how all equations are actally used in realworld... but I'm intrested in exploring on them to get a strong understanding.... where can I start exploring to understand everyting that Shannon presented?? can you point me to some direction ?
My physics teacher used to tell us that digital computers would never catch on because of their limited accuracy, and those analog computers were going to be big. That was in the UK in the 1960s. Since then, I have been laughing about that and thinking “How wrong can you be!!”. But, well, maybe he knew something… let’s wait and see.
@jshowa o Yeah, if a finite amount of frequencies is present. Even a simple step function has infinite amount of frequencies present. The Nyquist theorem is more about the resolution limit of discrete approximations
I remember my teachers all tell me how important it is to understand basic math because we wouldn't be walking around with a calculator in our pocket... Growing up with computers, it's crazy how much it's changed, but also how much of it is still the same.
As a computer engineer, I was always told that every one has always dreamed to build analog computer because of the enormous performance potential over digital computers. However, it's not that engineers are dumb, that's not that simple. There is the precision and noise problem that you pointed out, digital component can easily erase noise and restore a digital signal which is very interresting to transmit data over the distance. But in that's not the most important point. We are curently already creating quantum compter which require extreme precision, but there isn't any decisive project for analog computers yet. In fact in the case of computing the true game changer is timing! Most analog components have time based behaviours, they use analog values of physical constants like voltage, intensity, etc. that doesn't change instantly because they tightly binded to physics laws. When you have a few component moving over a few minutes like old computers presented in the video, it's no big deal, but the nightmare start when you try assembling million, billion or even more of them and syncronizing them at the micro-second or nano-second. And th is not a trivial problem. On the other side, the main advantage of digital gates is that they are syncronized on a clock. There is a regular clock in your computers (today usually around 4 GHz) and all the memory cells that store bits of data in your computer change there state at the same time when this clock tic. The clock period is designed to be long enough to be sure that every electronic component in the whole computer chip had the time it needed to update its electric state and stabilize its value. Some components would probably take less time than that, but in the end we are garanteed that all electronic signals are stable and reliable. Just with that hypothesis all simulation tools and computer designer can make an enormous simplification by just ignoring all time related physical behaviour and instead focus on a much much much simpler finit state automata. If you look at a processor simulation tool used by computer architects you would not see anything related to voltage, intensity, or physic equations, because all of that is neglected and instead we can focus on much simpler binary signals. In fact I lied a bit, in practice at the level of miniaturization we reached, we still had to deal with noise, interferance, etc. but that's mostly solved globaly by building robust components and adding shielding around chips when needed, an not by solving the exact physical equaltion between every single electronic components. It's not that no body tried to do analogus computer but, in history, since the transistor was invented, it has always been simpler and more efficient to reduce complexity with digital computing and compensating by an increased clock speed, adding more parallel transitors computation or a more optimized chip architecture, that trying to work out the complexe equations of billions of analogous electronic components. It's a little bit like an intersection with traffic lights. It's theoritically possible that everyone pass the intersection at almost full speed by precisely computing everyone's speed. But since cars became widely used, we all accepted that its safer to sync everybody using traffic lights in a binary way.
What about optical computers? Some startups and research groups are investigating light as a signal and calculation medium, using wave interference to perform calculations. While most of these are digital, I think there's potential to make analog optical computers without as many of the downsides you mentioned. I wonder if that's what part 2 of this video will be about, but I could be wrong.
If I had a billion laying around I would simply start a non-profit organization to preserve and fund more of these analog computers. You know, just in case of... electronic doomsday, massive EMP's, GRB's or any of such.
The rotary ball integrator, oh God, what a fascinating piece of machinery! Can someone guide me as to where I can get my hands on one? Not only would it be an amazing prop to have on my desk but it would serve well as a teaching aid to Calculus students so they can develop an appreciation for the physical "analog" of the integration process.
Kelvin's tide machines are my single most favourite things. I remember seeing them for the first time in the London Science Museum over 30 years ago. Small nitpick. The pics you show when mentioning Colossus are actually the Bombe machines used to crack Enigma, not the machine used to crack the Lorentz codes. Everyone forgets Konrad Zuse and his early computers. These were earlier and more advanced than the ENIAC. But since he was German, and developed them during WW2, history has not been kind.
There was more than just war as a reason why Zuse's computers didn't go thru the roof. However some companies used it back then and made big profits because of that and are still in business today.
If I'm not mistaken, technically, Zuse's computer was not fully Turing-complete. Don't get me wrong, it was a miracle of engineering nonetheless, but I think it's rightful that ENIAC gets the credit for its achievement.
@just some guy tired of life Bro why would you expect someone to expect details on how an anti aircraft gun worked in literally 5 minutes? This video is NOT about teaching you how an anti aircraft gun works, there are other videos and books on the topic. Sine and Cosine waves are literally taught in school, you wouldn't expect him to teach us how to do carry division in a 20 minute video on analog and digital computers lol. He gave examples of analog and digital computers, which you can research yourself if you find them interesting. I don't think you understand the point of the video: Not to explain everything perfectly, but generate the interest in YOU to learn them yourself.
@just some guy tired of life Well yes you're completely right, but I'm not expecting to learn all from a single video it's just that now I know stuff that didn't know it existed, in order to have deep knowledge about it I must investigate by myself....
Absolutely great video! The first person who added sine waves with Scotch Yoke was not William Thomson (Lord Kelvin) but Francis Bashforth (1819-1912). He built a Fourier synthesizer in 1845 for purely math purposes (equation solver). When Kelvin's tide predictors started to become famous he tried to make his priority known but it didn't work very well. His paper "A description of a machine for finding numerical roots of equations" was reprinted in 1892.
I'm totally fascinated by that analog pully frequency sumarization device and that mechanical fourier transformation. Such a beautifully simple solution.. and man, that was just 126 years ago... Mind=Blown.
I had a lab exercise to simulate a control system stability with an analog computer. The analog computer was built using vacuum tube op-amps. Their signal range was +/- 100 V. A bit later, where I worked, my boss had also earlier used the same analog computer and we started discussing if we could build our own, but with recently introduced semiconductor op--amps. Well, that discussion resulted in a task for me to design and build one. What I ended putting together was a box with 10 op-amps, a bunch of stackable resistors and capacitors and a few special units, and a panel with banana sockets for the stackable components. All along with amplifier symbols on the panel. The signal range was reduced to just +/- 10 V. My boss was soon happily simulating not only his control systems, but also extrusion and cooling processes for high voltage cable insulations. That was just before I got transferred to another division of the company, where new challenges were waiting me in a new R&D lab. So I myself never had the opportunity of really learning to use "my" analog computer in anything but simple servo loop simulations (optimizations).
I am a meteorology student and my university is very old fashioned. I had to do a lab course for meteorological instruments, and one of them involved using an analog computer to measure the humidity. The device contained human hairs and the computer was able to measure how much the hairs stretch and draw a graph of the humidity. The tutor asked me beforehand what I expected to see, and I said an exponential curve, and the thing really did draw a perfect exponential curve. I remember both my partner and I were kind of fascinated by it even though nobody would ever actually use such an archaic device anymore. Thinking about analog computers reminds me of another professor who claimed that every single physical thing that happens is a measurement, which is why the wave function in quantum mechanics collapses when it's observed. Thinking about it, building devices to mimic these measurements that God and the universe is constantly making makes a lot of sense.
And now imagine that in your brain such images of the world are created by a variety of interactions between neurons. I think human brain is some kind of an analog computer, too. It's really fascinating! ❤
hair hygrometers are archaic? Out of the mouths of babes. Humidity transmitters using horse tail hair are still in use today and are generally more stable than the electronic sensors. The horse hair does not stretch. It elongates as it absorbs moisture. The hair bundle is maintained under a gentle stress, just enough to keep the hairs from becoming loose. The only one I know of that can rival a horse hair hygrometer is the chilled mirror technique. In this device colimated white light is directed at a 45 degree angle to a front surface mirror cooled by a Peltier junction. The light is reflected off the mirror to a photoelectric detector. The control circuitry controls the Peltier device to maintain a slight decrease in light which indicates a thin film of dew is on the mirror. The temperature of the mirror is the dew point. The (analog) electronics can convert the dew point into relative humidity when needed. I have a horse hair dew point transmitter in my display case that is 90s vintage. It's output is to control the current in a current loop to between 4 and 20ma. That's a standard analog measurement range. It used to be 10-50ma but to reduce power consumption in large control systems, the 4-20 ma was standardize. The compliance voltage is usually 45 volts so many consumers of the signal may be connected in series. Voltage operated instruments used dropping resistors across their terminals to develop the voltage. Most indicators and analog control elements use the current signal directly. The entire reactor control and protection system of the Sequoyah nuclear plant near Chattanooga, consisted of a large room full of racks of analog computing elements. Square root extractors to linearize the differential pressure signal across an orifice plate flow measurement. Integrators and differentiators. PID controllers. Alarms (devices which closed contacts when the current exceeded a setpoint, voters (selected the highest or lowest of several inputs) and so on. Construction was started in 1970 and it went commercial in about 1978. It was only 3 years ago that Westinghouse replaced the analog control system with a multiply redundant custom-designed computer system.
When I joined the Navy in 1971, they still operated analog computers in some of their fire control systems. An entrance exam showed some simple gear output questions. I worked on a Norden system that was a true hybrid. It had a few analog computers that operated on gear ratios, electronic pots, accurate voltage references, relays, shaft encoders, etc. to a digital computer that had vacuum tubes, early 1960's transistors, late 1950's developed magnetic core memory, magnetic drum memory with about 2 dozen read/write heads in it. Not an integrated circuit chip in the whole system. This fit inside an attack aircraft that operated off of aircraft carriers plus ordinance and 2 crew. This unlike the aforementioned bombsight, actually got the job done. Yet these break down and needs constant maintenance and like the airframe it is attached to gets old and outdated.
My grandfather built analog computers into battleships in WWII. He described one as being roughly 1000 cubic meters in size with the mechanical components weighing hundreds of tons. He also described them as being accurate to within 60 decimal places and I still don't have a pocket calculator that can do that.
@@nunyabiznez6381 The one he described must have been on land. It couldn't have been additional *hundreds* of tonnes placed inside a WW2 battleship. The Mark 37 system fire control system in the Iowa class battleships, for example, weighed 1 400 kilogrammes.
Man. The quality and exactness of your visuals have literally exponentially increased over time. New videos are simply amazing.... It always makes me wonder whether I should start a career in research
I appreciate the compromise you’ve come to around clickbait. You use a clickbaity title, but not deceptively so. Rarely if ever do I feel like you’ve over promised. You’re really setting a gold standard in how to make great content on the internet today. Thank you for all your hard, and thoughtful work!
In order to get popularity in the algorithm (and compete), clickbait is a necessary evil, sadly. I don't like it either, but I'm perfectly fine with channels I want to see succeed, using it to stay relevant and gain viewership, even if they (slightly) over-promise in the title. The more people that watch Veritasium, the better, at least in my opinion.
Spent a huge chunk of my sophomore years in 2001 on learning 741 OP-amp, professors told us it was used for making computers and calculators during the 60s. But later the entire ecosystem of analog computers were completely replaced by boolean logic circuits. But some type of computations, especially numerical integration/differentiation can be done really really fast and quite accurately on a very simple 741 OP-amp configurations (even you can make one now, just order the components over ebay and won't cost more than 10$ with batteries) but for some reason people don't use that technology anymore. Would be really excited to see how they are coming back. I'm hyped for the next installments.
The ball and disc system is used in the transmissions of many snowblowers but the ball is another smaller disc. This gives a neutral position, a variable forward speed and a variable reverse speed. Pretty amazing tech.
I had never really thought about the digit in digital, so I had always considered digital to be synonymous with electrical, and analogue with mechanical. Your mechanical digital computer taught me something new today.
Be amazed then - it simply comes from "digit" as in "finger". That which you use to count up to ten, for you have ten of them. Which made "digit" just another word for "number between one and ten, that we use to assemble really large numbers made of multiple digits". So "digital" is just another word for "numeric", as in "based on numbers".
Analogue means 1 to 1. Every button or input corresponds with a specific output. I don't understand the mechanical digital computer. If we didn't assign 1 and 0 to the outputs, would it still be digital?
When I started in Engineering school, there were generic purpose electric analog computers. I never got to use them, because they were phased out before I got to that level, but I think they were used by students of differential equations to prove their models. Instead of coding, they worked by patching components with wires, such that they could combine the effects of different analog elements, to obtain an output curve to the input they provided. Something to an electronic version of the tidal wave prediction machine, but configurable to different sequences depending on the patching.
A version of Kelvin's ball-and-disc integrator was also used in the fire control of large battleship guns during WWII and after. There were tube-type electronic computers, but they could not survive the concussion of a salvo of four 16" guns firing simultaneously. The one that I saw in class used a wheel rather than a ball for the output.
I wondered why no mention was made of analog electronics in this video (or at least it was glossed over relatively quickly). A big step in the chain of technology was from electromechanical devices to devices that were purely electronic, but not yet digital.
Reminds me of an internship I had while I was in engineering school: I worked for the Geophysical Institute in Alaska, with a group that analyzed synthetic aperture RADAR data. Back then, gigabyte file sizes were still pretty hard to work with and the data took hours to process, something that most SAR devices do in real time now (I think). They said that in the early days of SAR, they used some kind of optical analog computer that used lenses and mirrors to do all of the calculations. Fascinating.
What is a synthetic aperture..like as opposed to an organic or naturally occuring aperture? Or wjat kind of radar technically has some kind kf aperture I'm curjous and not familiar? I'm only thinking of like a radar satellite dish or antenna I guess.
@@leif1075 the “synthetic” means that it’s calculated rather than natural. Maybe a bad analogy would be: if you have a camera with a tiny lens but you moved it along a really accurate path while taking a thousand images, if you used computers to stitch them all together and overlay them, you could achieve an image that was larger than your mini camera could take. Radar images are way more accurate with a very large dish or antenna so they can “resolve” small targets. Well, big antennas and dishes like that are expensive and hard to put on airplanes and even harder to put into space. So, some smart people figured out that if you took a smaller antenna, and made send out radar pulses almost continuously, and then did some tricky math, you ended up being able to get really good resolution by “synthesizing” the “aperture” of the radar.
I was a Fire Control Technician (gunnery computers) in the US Navy in 1959. The computers were all analog and had rolling ball integrators as shown in this video. The adders and subtractors were miniature differential gears identical to the differential which drives the rear wheels of your car. At that time we never thought of solving the Fire Control problem digitally, using transistors. 🇺🇸
Wait….. ¿You put a personalised AD in your video to promote a new channel? ¿¿¡¡HOW DID YOU DID THAT?? you genius!! ¡I never saw that before! that was very original and I specifically love how you did because you started the ad with “so it looks like you are about to watch a Veritasium video” using the 5 second wait to skip the AD to attract the interest of the viewer to listen to the ad and not skip the add immediately ¡I loved you guys a great! :D
I would love to see a veritasium / clickspring crossover. The connection between the theory in physics to the the real world in machining, has a lot of merit for each.
Photonic computers is an interesting type of analog computer that's just taking off. By making a microchip with what can be described as tunable partially reflecting mirrors, we can make a system that's essentially a matrix multiplication. Because the chip only changes to be re-programmed, it can crunch through multiplications as fast as you can put them in. Constant time, or even faster depending on how you look at it.
Even normal computers can crunch through multiplication/instructions as fast as you can put them in. problem is, they get hot LOL. photonic seems good tho for a certain types of calculations tho
@@benjaminmuller9348 Voltage and current is a problem since you need big power sources to run fast, moreover there is a point when it just too much voltage through the semi conductor and at that point you get a shortcircuit, so there is a physical limit that we are already hitting since our electronics are a few atoms wide now days.
Driving me crazy waiting for part 2 of this. This video has ignited my imagination like CRAZY. It is amazing to live in these times with such amazing learning resources.
I come from at least 5 generations of clockmakers. My dad was the last generation to actually be a clockmaker and I am a software engineer. Perhaps my grandkids will be clockmakers again :-)
That was amazingly cool. Doing a discrete Fourier transform by hand is hard enough, but designing a machine to do it with nothing but gears and pullies just blows my mind. We have it so easy now with digital computers and software. We can program anything we want as sloppily as we want and a compiler will find a way to get a CPU to do the work. With analog you can't succeed without a complete understanding of every aspect of what you're building.
When I started my career, 1980. Analog controls were phasing out and digital just starting. The digital computer we used was a custom design, 2, 16 bit shift registers into a single bit comparator, with carry. We had 11 instructions, but only 4 were used mainly. I asked why we moving towards digital when obliviously it was so rudimentary? The answer, because we could change control equations with software, whereas changes to analog systems, always involved changes to hardware. We had a group that modeled differential equations with this big patch board with op-amps, lags, leads and gains. Analog controls are differential equation computers.
The obvious answer is to use a mix of parts, so digital can do what it does best, and analog can do what it does best, and the two can talk to each other. Integration is an ass to do digitally. All your physics calculations are integrals. Want to know when two moving objects will collide? That's an integral. Want to know how something will accelerate or bounce? That's an integral. Want to procedurally deform terrain without needing a million triangles? Integrals have you covered. Just add a hill to the equation. And digital SUCKS ASS at it, needing specialized graphics cards and huge memory to do raytracing. In analog? It's a few amps and a coil of wire. You don't get perfectly crisp results, but it's a PHYSICS AND LIGHTING ENGINE, you didn't WANT perfectly crisp results.
@@williambarnes5023 -- Can you keep a tiny ball from slipping when it's spinning fast enough to do a couple million integrations per second? Eventually we might have thousands of nano balls or equivalent piezoelectric structures, but hyperthreading will be here for a while yet.
Back around 1970 there were surplus bomb sights available by mail-order, and I got a few parts from one - a 24 volt motor and a couple of really nice worm gears with screw actuators with feedback to track air pressure governed by a metal bellows. It was like a big mechanical servo to adjust for altitude, and that was just a small part of it. Don't know if it was a Norden. Ten years earlier, I was about 10 years old prowling around a local business and found an aerial bomb in their dump. I remember thinking it was like a kiddie bomb, but heavy steel, maybe 6-7 inches long with 4 fins inside a 2-3 inch cylindrical cowl - the cowl being the same diameter as the rest. So I brought it home. When my mother saw it, the blood drained from her face. She knew a bomb when she saw one and knew what they could do, having spent the war in London during the blitz. It turned out it was being used as a harmless paper weight and somebody got tired of it and tossed it out. I don't recall where it went. I think maybe she also threw it out because it made her uneasy.
Something to note is that it's not really Moore's law that's ending. The real issue stems from the death of Dennard scaling which is a core part of Moore's observations and a big player in the financial decisions when producing transistors. Something that's been really interesting to see is the research and work into Memristors as a form of analog computing or even being combined with traditional CMOS logic, however, there are many other interesting "Post Moore" computing techniques so I can't wait to see what's discussed in part 2.
@@rolly6020 But quantum computing only has 3 values. That's still a significant upgrade from 2, but it's still a far cry from the hundreds, thousands, etc., that you can theoretically get from an analog computer. The main drawback to analog systems is the lack of accuracy due to loss, but one solution I thought of is already being used for quantum systems: Superconducting. Why go for 3 when you can go for way more with basically the same restrictions (i.e. having to use superconducting wiring)?
@@rolly6020 Maybe I should have explained better. Moore's law is still far from ending. Like I was saying the end of Moore's law has grossly been misattributed to issues in the computing field whereas the real issue is the end of Dennard scaling. We can still shrink the transistor and double the number of transistors on chip and will be able to do that for quite a bit longer, the real issue stems from errors when considering tunneling and also issues with leakage. These issues make the doubling of transistors much less efficient due to the increased amount of computation now needed for error correction due to the previous issues I referenced. This directly describes Dennard's scaling, we can no longer shrink a transistor while maintaining the same power density as before. Poor Moore get's all the bad rep when it's not really his observations which are at fault lol.
@@XxCODSMACKxX Yes, in any particular area where physics begins to run into barriers like single molecule atomic sized computing switches, there must be found whole other paradigms to end run the problem. And sometimes those don't exist, I expect, or at the very least will require a whole new aspect of knowledge of reality to advance further, like quantum mechanics did for us recently.
I'm no computer nerd but we could use spintronics or use a different coding system like the theoretical sloot digital coding system; making better use of the technology we have now. Still, I find analog computers very fascinating.
When I began as an undergrad 50 years ago, after experiencing numerical analysis with mechanical calculators, then on an Elliot 903, we moved on to analogue computers. As a Physicist I wondered why anybody would want to programme a digital computer to solve differential equations when it took a couple of minutes to patch the equation into the analogue box, and results came out instantly as the variables were changed - rather than punching a new paper tape, queueing to feed it in, then after dealing with error messages trying again and getting the result. I spent two 'gap' years working on British Government work (the Official Secrets Act prevents me speaking further), and when I returned to university the analogue devices had been scrapped. I have a permanent eBay watch for an EAI TR20 - the device on which I lost my analogue virginity.
@@benwhitehair5291 back in the days, degrees actually meant something. The higher educated were really elites and forefront of knowledge, and there aren't as many of them. Thus, they have more opportunities to work with governmental agencies that needs them.
I think, like the video points out, that when exactness and reproducibility are required, digital computers have an advantage. Also the limitations of creating accurate and precise analog components at the time is probably why digital took over almost everywhere. Part 2 will probably explain why analog might be making a comeback these days.
@@gottagoMS123 degree inflation is such a weird thing. On the one hand its amazing that the general population has easier access to the forefront of human knowledge. At the same time though, it has dumbed down a lot of it to the point I know many with "degrees" that probably dont deserve them...
I have both Fourier analysis and Laplace transform exams this semester, and didn't had any clue what I was calculating, coincidentally this vid gave me a very good visual example! Veritasium never disappoints ❤️
What a failure of teaching! Actually my pure math class was where I first encountered it and he was a terribke teacher... when people got confused hed say "its simple!!!!" Was diff EQ prof. Then all the bazillions of EE and esp. Signal processing classes I later took it got to being like do you really need to repeat this for the 600th time during the first 2 weeks of the semester? Became trivial It also is critical to quantum theory, the wave equation. Dirac equation is the truer one vs shroedinger. The one who conceived of the delta dirac function that makes so much of fourier analysis so useful i.e. FT of impulse response = frequency response... where convolution in time is multiplication in frequency basically the fundamental theorem of signal processing in LTI systems
And Z transform is the digital version of Laplace. Never really worked with Laplace in real life but Z transforms are very useful. Something like imagine the s=jw axis being rotated on itself into the unit circle of the complex Z domain where z = exp(jw)
Modern digital oscilloscopes and spectrum analyzers does Fourier analysis in real-time. My own oscilloscope (Rigol DS1054Z) makes 1G (1 000 000 000) samples per one second and its make Fourier from this on real time, and everything is done by very small CPU with ARM core - approximately 1cmx1cm (half inch x half inch) size, and uses couple wats of energy. This is huge incredibile. I know, there are exists much faster and more precise oscilloscopes, but they are very expensive.
Apart from small op-amp based reconfigurable machines, real analog computers are superfast and consume very little power. They have been used a lot by military, mainly by radars. I've been designing SAW processors with TeraFlops performance back in 80s. But the amount of second order effects, the need to intimately understand lots of various physics involved, the time required to and the plain hardship of "debugging" ie polishing of the repeatable design makes it totally unsuitable for mass production. There are not very many people in the world capable of doing the work.
From what I know the general idea between digitial and analog is: Digital is just less efficient then physicaly implementing whatever you want to simulate directly, but a lot more general. Trowing all your effort into optimising 1 digital solution is better then optimising an infinate amount of analog solutions. (The engineering that goes into lithography is just insane.)
Its not that, you need abstraction, and to build off your previous work, to actually have that be effective. Like everyone would be writing their own code in litteral assembly, but people need to be more abstract when making more and more complex programs, and thats why you have stuff like Python being used so often now.
@@someonespotatohmm9513 like you can probably have a decent enough analogue system and then work off abstracting it, and make some version of HDL for it, but i feel like its still gonna cancer to work with.
@@someonespotatohmm9513 There's also the sampling and resolution problem. For audio signals, most radio signals, and below, that's not a huge problem anymore, but once you're dealing with higher-band radio signals, and ESPECIALLY visual signals, it gets trickier and trickier.
there's no chance it would be worth it now... you where in the era of 8 bit micros with like 16-64 KB of memory evolving to 16 bit with about 512KB to 1MB typical ram
I wonder what would have happened if the ship with the Antikythera device hadn't sunk... would it have inspired more technological advancement? Or would it have been melted down and forgotten, and only by it sinking was it preserved?
it was likely just one of several such devices, it is only interesting because people had forgotten and reinvented the technology. more useful than the Rosetta Stone back then less useful now.
I think it would in fact belong to the group of "that which could've furthened human knowledge, had it not been destroyed"- e.g. Alexandria's library, or Alan Turing's life
It was definitely not a one-off, it is way too complex for that. Presumably there were at least dozens of preceding devices built over years, maybe centuries, as the creators learned and refined their craft. They were clearly rare and expensive - only toys for the rich or the ruling class. So the seeds of more technological advancement were available, but it obviously needed more than just those seeds to sprout into anything like an industrial revolution - the ancient world was just so different from western societies of the past few centuries; for one thing they had pervasive slavery so there was not much demand for machines to do work.
Ancient Greece was the most dramatic Greek tragedy. Split into science and mysticism factions, the mystics eventually won. Makes me wonder if we're headed for the same fate today in this country. Ignorant mob rule destroys the educated intelligentsia that drives progress.
I'm shocked you didn't mention Chris over at Clickspring, as far as I'm aware, he's one of the only people to have actually made a replication of the Antikythera mechanism using similar methods to what the original creator may have used. It is an absolutely fantastic series and I cannot recommend it enough
@@150cameron In the few videos he as put out (mostly on the second chanel), he has commented a few times saying that more is coming. He's already poured thousands of hours into it both making and studying/writing about it, and he seems to be on the next phase which is going to take many more hours of work. oh and then going through all that footage and making videos for us. Absolute legend if you ask me.
@@150cameron He made some significant discoveries during his investigation of the machine which lead to him writing scientific papers instead of machining gorgeous parts. I think he figured out it is based on lunar movement, not solar, or something like that, basically upending what everyone thought the machine was for and how it worked. He's been releasing detailed videos highlighting certain previously made components for the Antikythera machine and assuring us there is more to follow.
@King Pistachion It was the first thing i thought when i saw the beginning of this video. Clickspring's Antikythera mechanism is just amazing. Loved the whole series and his research on ancient tools technology. I really hope it will be mentioned at least on this channel too.
Worked with a US Navy Gun Fire Control Computer for years in the mid 70's, before digital was common. They worked very well, but as explained, didn't always give the same exact answer twice in a row. Close, but not exact. That's why regular tests and calibrations were required. Since a projectile, unlike a guided missile, cannot change course in flight, having to consider ship pitch and roll, ship speed and direction, muzzle velocity, target altitude, range, direction and speed, time of flight and be able to predict where the projectile had to be to meet the target at a future position took a lot of timing motors, their associated gears and shafts, ballistics cams and their associated followers, synchros, resolvers, summing networks, indicator dials and hand crank inputs. My particular computer was about 6 ft tall, 2 ft deep and 10 ft wide. That could probably be replaced today by a computer the size of a typical tablet. Old systems like those have long been replaced by digital systems.. but the old engineering solutions were innovative and impressive.
actually all replaced by a single chip about the size of a pinhead. The apple watch probably has 1,000 times more computing power than your original mechanical computer.
Is there any resources out there which show the damage caused by big naval shells when they land and explode. Lots of videos of the guns firing. Not much of the shells landing.
@@n-da-bunka2650 Yes.. the chip can do the calculations, quickly and accurately, but to have a complete system, you need a way to input info from various sensors, a way to input information manually, a way to see the results and a way to get the results to the equipment that will actually fire the projectile.. that's why I suggested a "tablet" as the minimum size device to accomplish those tasks.. Also, even though military digital equipment is "hardened", the old analog computers were naturally immune to EMP events.
To be fair, the amount of general purpose digital computing power needed to replace a dedicated function electromechanical device is orders of magnitude more computing power than actually done with those old dedicated devices. The design and build of electromechanical computers like the fire control computer from ww2 and immediate postwar stuff like the Salems are almost beautiful in how it all goes together and moves and does what it does how the do it. It really is something when you notice that an actual USN video meant to inform sailors on the very very basics of how they function is the length of a tv documentary, but containing zero filler, just to describe the basic movements of the cams and whatnot, is telling as to how mechanically complex they are..... All of that working together to do something which back in 1990 couldn't be done well enough with digital stuff to justify their replacement, is a testament to how good they can be, and are like a symphony of movement that makes the most glorious of triple expansion engines envious of them. That said, apparently Isambard Kingdom Burnel's propeller used on his 1845 SS Great Britain, which was the first screw prop ship to be a proper regular bluewater ship transiting the Atlantic is only a few percent more efficient in how well it moved water as compared to a moderrn screw prop, and that was built more with a combo of Brunel's brilliance and determination to do whatever he set his mind to, and some trial and error...... so take all of this as you will. (Can you imagine what Brunel could design and build with the tools we have today? Remember this is the guy who designed, built, launched (after some difficulty) and operated a ship with a displacement of around 33k tons, 19k tons gross tonnage, not surpassed in gross tonnage until 1901 and in terms of volume, 6 times larger than anything else ever, back in 1853 with the SS Great Eastern)
Until the 1980’s or so, automobile automatic transmissions were actually analog computers. They ran on oil and transmission fluid and were quite complex. The shifting logic was performed by the transmission fluid. This is why fixing them back then often necessitated a visit to a transmission specialist who only worked on gearboxes. Once it became electronic (and then digital) it all changed.
However, there are automobiles where you can manually switch gears (manual transmission), which are a lot less complex, require a lot less maintanace and are drastically more ecological. Unfortunately, the vast majority of car drivers are lazy asses and learning to use manual drive (which is the standard in many countries) is a bit too difficult for them.
With manual transmissions, the driver controls the shifting of gears - both timing when to shift and also how to shift. No need for any kind of computer. With an automatic transmission, some control logic is needed. This used to be analog and later became electronic, ie digital. I love driving manual but fewer manufacturers are offering manual gearboxes these days. Electric motors produce large torque at low revs and have pretty much no need for gears so with these cars there may be no gearbox at all, and hence no associated computer. Enjoy shifting manually while you can. NB It gets worse. As cars become autonomous, you not only won’t shift gears, you won’t drive at all! No fun!
@@GP-qi1ve You say that manual transmissions are "more ecological", but overall they're not. Automatic transmissions do a better job of consistently keeping the engine in the most efficient power band, resulting in increased fuel economy and lowered emissions/CO2. While it's true that a conscientious, knowledgeable driver with a manual transmission can outperform an automatic transmission in that respect, that doesn't describe the great majority of drivers nor how most people with manual transmissions choose to drive. And I say that as someone who can drive a stick and enjoys it. Furthermore, hybrid cars are even more ecological, and manual transmissions are not even feasible on a hybrid drivetrain.
@@GP-qi1ve I learned my lesson in to NOT buck the system when my '81 manual lost an output bearing and a used tranny was 2x the cost of a used auto trans (& took 6 mos. to find 150 miles away) and just a new bearing/ output shaft was 2X the cost of the whole used tranny (auto trans. was 1/4 cost than for new parts). Supply & demand works for the lazy asses (I got lazy fast).
My favourite cross-over between the analogue and digital worlds is the old cellular technology CDMA, it blew my mind when we learned about it at uni. Basically it allows all mobile phones to communicate at once, without interfering with eachother because it uses the interference itself to encode the signal. Each device is assigned a unique digital code, the device then XORs the data to be transmitted with the code and those bits are transmitted through the air, which interferes with other devices communicating on the same channel. To decode the data for a specific transmitting device, the receiving base station multiplies the transmitter's unique code with the raw signal, and out pops the transmitted data. Constructive interference produces a signal with levels above and below 1 and 0, but when multiplied with the transmitter's code, any values over 1 are interpreted as a 1, and any values below 0 are interpreted as a 0.
Okay that is genuinely impressive and some damn cool and clever technology. It also explains to me why CDMA tends to be used for longer distances or in noisier environments than most other cellular data technologies. Thank you for sharing! :D
@@leonsteffens7015 Yeah, basically each device has a code that is pairwise orthogonal with every other device's code so the inner product produces the component for that device only
The Antikythera mechanism and the so called ‘Hero’s engine’ make me think of a quote that read: “If ancient civilisations had any idea of how much potential their technologies held, we would already be exploring the neighbouring stars” (Arthur C. Clarke)
I can't believe I'd never heard of Kelvin's integrator before. So elegant! I almost thought it wasn't real, that they'd never built one - and then you showed a picture. Just amazing! I'd imagine the main problem would be getting the bounds right so that you never hit the limits and ended up clipping the values.
Yeah, I was thinking about that. The mechanism behind the integrator is basically a CVT or continuously variable transmission. You could perhaps account for that by swapping out the ball for one with a smaller radius. This way, having it move farther from the center would yield a much greater angular velocity, and thus upper and lower limit.
Okay, I really like the way this guy does sponsored content. Instead of slapping it in the middle where it interrupts content, he puts it at the end! That's a very respectable thing to do! And for my opinions on the rest, it's really interesting seeing how mechanical computers work! I've always had a fascination with stuff like clockwork, and what is a clock but a basic mechanical computer? Thank you for this video!
What a fascinating vid! I worked at IBM in the 70s/80s, and there was always some analog computing going on still, inside the lab, with all sorts of strange tools. Oscilloscopes hooked to sensors measured all sorts of things I did not understand at the time, but I puzzled through the operation of our cathode ray scopes and figured out how to troubleshoot the operation of the first (or second) Robotic Arm, used then to pump out disks and diskettes by the boatload. What a wild time!
As someone who's OBSESSED with both computers and mathematics, I can't believe I never knew like, 80% of this before. My area of study for computational devices in college was always like, digital-style computers from the Lovelace era and onward, so those tide-predicting devices were something never brought up! How fascinating!
"Brought up"? So, your learning is based on the whims of others. Maybe you should actually learm how to learn. Don't just sit there waiting for service.
@@mikemondano3624 This seems like one of those "i didn't even know of it's existence in which to research further to understand it's function". Don't fault him/her for not knowing something, isn't that why we are here?
One of the craziest things I ever did in my electrical engineering classes was building an FM radio receiver. The professor was describing how FM radio worked and what we would need to build to process it. I raised my hand and asked "Wait, so you need to take the derivative of the FM signal to get the original sounds, right?" "Yes, that's right." "So this circuit... does calculus on the input?" "Uh, you could say it that way." A person can differentiate in minutes. A digital computer can differentiate in milliseconds. Analog electronics can differentiate at the speed of light.
@@billballinger5622 The output of an analog computer happens at the same time as it receives the inputs. There's no "calculation" time really. A simple example is a rope on a pulley that lifts a weight. When you pull down on the rope, the weight goes up at the same rate you pull. Near-Instantaneous input/output.
@@billballinger5622 Processors process using semiconductors and logical gates, so they have processing time. Analog signals simply are transformed from input to output due to interactions between matter and energy
@@eaglekepr "There's no "calculation" time really." Analog circuits don't respond instantly either. They have capacitance (resists change in voltage) and inductance (resists change in current). Which is faster, analog or digital, is not a question that can be answered in generality. It depends on the exact details of the implementation, the specific operation you're trying to run (certain things are a better fit for one computational model or another), and how much inaccuracy you're willing to tolerate.
This is why I love mechanical analog machines. With the simple movement of hundreds of gears, pinions, discs, balljoints, and multiple other hard components, a machine is made to do one task incredibly accurately. Sure, in an advanced scale, these machines take up huge room while an SSD fits on a thumbnail, but this is all visual. With some understanding of mechanisms, you can see exactly how the machine functions, and with some understanding of mathematics, understand what the machine's job is.
One of my hobbies is making music with synthesizers. It's a golden age for both types: analog and digital. Often they are both, having digital control of analog sound generation and filtering. The unpredictability of the analog side is considered to be "musical".
in the 1970's, I worked with analog equipment whose main component was the op-amp. It was amazing with it's calculating ability and I thought at the time it would revolutionize electronics. However, digital came out and simplified understanding the circuits for the common man.
This video gave me chills, each story was awe inspiring, shows much of the current technology we are taking and using for granted are the result of handful of extremely dedicated, hardworking, usefully creative people and an unknown Force which is making it happen. And as always the transitions in the video were smooth.
Just discovered you, and absolutely loving what I’ve seen so far. The discussions they inspire in the comments are pretty entertaining too. Please please please consider writing books for schools, and/or making videos specifically for school curriculums. You’d make classes infinitely more engaging, and possibly inspire some future careers. Seriously. Your content fascinates all levels of expertise and knowledge - I was not expecting quite such a mix. That is a real talent. I’d love to see kids actually enjoying classes, and more importantly, they would too.
While I heard of Analog computers in the past, I was so amazed by an analog Integrater. As long as we need to solve a specific problem, Analog works well. Though in digital the same machine can be used to solve various problems with the simple change in variables or expressions.
I wonder if an analog machine could be made with similar capability for a set of problems? Obviously there are limits to what analog machine could do but I'm sure some of the variable things could be sorted out. What purpose this would have other than a thought problem... I have no idea.
At their peak, analog computers were programmable. They had started using electric versions of the old gears (as seen in this video). These fundamental components could then be assembled and reassembled into different analog computing circuits. This was done by physically moving patch cables to wire them in different assemblies---programming! So, assuming that modern technology enables doing this without moving a bunch of wires by hand, programmable analog computers should be quite feasible. (I'm pretty confident they exist, but I haven't followed this subject, myself.)
@@chrismiddleton398 reprogrammable circuits! I knew i had seen something similar to this principle, but I couldn't point my finger to what As far as i know the idea is a circuit that can rewire itself to implement a specific computation, as long as the machine is constant then the result is close to the reality
I remember learning about these analog computers back in highschool. It's always fascinated me that such devices have the same functions as our digital computers today. Amazing
Analogue machines in the old style were objects of great beauty, and in many cases the result of superb craftsmanship. They were however very expensive, and had to be lovingly maintained, since they were prone to malfunction due to the build up of dirt. Inevitably too they gave inaccurate results. A secondary effect was cumulative error as a result of rounding errors in a lengthy calculation. On a personal level I used a slide rule for many years in my early career as a design engineer, but was blown away by my first electronic calculator - a Texas Instruments machine which by modern standards would be considered pathetic. That was in the 1970's. I shall look forward to viewing a further piece on a possible comeback for analogue computers, but am sceptical that they can ever overcome the problems inherent in this kind of machine.
One problem using any kind of Mechanical Gears in any Machine, Is Gear Lash. Also the the Gear Lash gets Larger the longer it is used. That’s the biggest problem with Mechanical Components is Wear and Tear, and the inherent inaccuracies it produces. Even in the Guns that are Controlled by Analog Computers, it still takes Gears to Rotate, set the Elevation, or any kind of movement. Over Time they Wear. So then what used to be reasonably Accurate becomes Inaccurate.
@@joeybobbie1 I've never personally heard the term ' Gear Lash ', but have heard of ' backlash '. As I recall it refers to either lost motion along the gear train, OR torque multiplication if the train is driven from the output end, which can strip the whole train in a severe case. Parenthetically an old friend of mine, an inveterate collector of good things, had purchased several gear train components on the US military surplus market. These objects were things of both beauty and interest, crying out to be re-used in some newly conceived mechanism.
@@t.me_s_petizioni_2220 In English, "wear" is sometimes used as a synonym for "use". So "wear and tear" means "use and degradation", to describe an object breaking down over time.
I'm absolutely fascinated by these old mechanical computers. There was no software back then to design them, the device was designed within someone's imagination. Truly incredible.
They were designed on paper using methods and instruments that were practiced for years. Imagination alone was never enough. We have always used tools to supplement the deficiencies of our bodies and minds. Software is merely the latest tool.
I'm sure the ancient alien astronauts visiting us had workstations on their UFOs that could run Autocad.
Your comment doesn't make much sense. Even right now when you are designing something you need to visualize the outcome before writing the code for the task. So this has been happening in the past and in the present and will continue to happen in the future. You can't design something without imagining it in your head first.
A mechanical computer is simply a scale model simulator of the process under consideration. Its components are assumed to represent all of the forces involved in the real world process, but some aspects might have been overlooked. As the narrator said, an analogue computer is specific to the problem being solved, hence there is no concept of software.
Ok
The ball and disk integrator actually blew my mind. I cannot believe such a thing has existed and I only ever heard about it now. What a beautiful machine.
For those interested, May I suggest books 📚 like “1800 mechanical movements, devices and appliances” which give an amazing insight into the variety of complex mechanical movements possible in that realm.
@@tomorrow6 Yes that Ball and Disk was really neat, I thought I knew about all the fundamentals in this area. That 1800 movements book is really cool too.
Any other sources of roots concepts like that, feel free to post'em : )
also hoping I don't miss it when the new part 2 video comes out
I WANT ONE!
Same thought I had. I had never heard of this. it's such a brilliant idea
Honestly, I don't think it would be too hard to come up with this idea. It follows pretty naturally from the fundamental theorem of calculus, which basically says that you have to translate the height of the input into the slope of the output.
I was more impressed by the pulley thing. Now that is neat!
As a total software person I find these mechanical devices so fascinating and clever. Those people coming up with them were geniuses.
Yea no doubt. The more I learn the more it becomes apparent our whole human world is a massive construction standing on the shoulders of countless geniuses. It’s just mind boggling how clever the differential machine is alone.
This is more like mechanical clock
Talk about “hard-coded” 😉
it also helps to understand that because mathematics dogma was driven towards physical representations of equations before the advent of digital computing, that it would be a lot more intuitive for people at the time to make mechanisms that perform equations !
same here man, I cannot even understand how would they come up with those elegant solutions, but they did and it moved humanity forward!
My mom used to work on an analog computer in the 1940's. She worked in a comptometer office when they got this "new machine." She never called it a computer but as She described it my mouth nearly dropped. I was in college learning programming at the time and we had recently gone over the history of computing. She said it had a bunch of wires and plugs and dials and flashing lights. Her boss couldn't figure it out so he gave her the manual to figure it out.
A lady in my writers group used to program plug boards (for card sorting applications, IIRC). Hasn't been that long ago!
My great aunt Vera loaded programs into computers.
With _punch cards._
@@SirUncleDolan When I started in DP, I did, too. And if you dropped a deck, you better hope someone keyed sequence numbers in 73-80. In the mainframe world, The Librarian was the first source manager that made it big and provided a decent backup and maintenance mechanism and made card decks pretty much obsolete.
I would love to interview your mom for my PhD thesis on gender technology and labor!
@@coexistordontexistLMAO
When I was learning to be an engineer back in the early '70s, analog computers were on the way out the door. Large-scale integration was beginning, and Moore's Law was a new concept that my professor's predicted was going to revolutionize computers.
Fifty years later I am retired after a career in digital computing, and now I find that analog is making a comeback. I am looking forward to part 2; I know enough about analog computer that I can anticipate some of the application for which they will be useful. I suspect the improvements in electronics, and perhaps even 3D printing of components will produce new and sophisticated analog machines.
Makes me wish I was 20 again, so I could have a second career in analog!
Please keep making these videos - you are doing valuable work. 👍
Absolutely.
Although I personally suspect that 3D printing will take a back seat to modern CNC machining -- "computer numerical control" -- use a digital computer to control a lathe or mill (or yes, a 3D printer) to build the parts for an analog computer.
3D printing plastic parts is great, but I'd rather have a precision machined piece out of aluminum or brass.
And just as an example (that I googled; I am not an expert in either field), a typical 3D printer has a resolution of 140 microns, while precision machining can get below 5 microns
You rule!
Thank you for your work, sir.
Nice that you share this with us! What a time to have been in digital computing, all those changes. I hope you see still more evolution happening. Must be exciting, for me it is anyway, though I'm only 40 years old:)
Can you give a young engineer like me a bit of perspective on what's coming up? I'm at least a year away from graduation and can really use some help!
Chris from "Clickspring" is building an Antikytherean mechanism and has been building it using period tools and techniques to the best of the experts knowledge...the ww1/2 analog firing computers are still incredibly advanced and smartly built tbh. They are just insanely accurate for as little input as they take and what they can interpret and output for solutions
He's also writing a paper about it as he's figured some stuff about it no knew previously if I remember rightly
That's a beautiful channel.
Ha I was going to mention this. :3
Interesting
Very, very interesting my brother
I exclaimed at my TV when you showed the rotary ball integrator. What a beautiful system!
my small brain cannot understand how this mathematically works
@@marc-antoineb.2125 same here
@@marc-antoineb.2125 same, but the pulley and the ball thing looks cool
I got chills when they turned it into a Fourier transformer by oscillating the platform. Ingenious, and so simple!
@@wsshambaugh also with the pulley system to add all the functions
As a US Navy reactor operator of 60s-era nuke submarines, I am recalling that subs had a large number of analog computers, from bow to stern, so to speak. In my own training for my specialty, we were told of magnetic amplifiers (mag-amps) used in German gun directors, that still worked perfectly after being recovered from sunken ships. Part of my work was checking and correcting as needed, the micrometer settings of certain variable circuit components in a particular analog computer that was absolutely vital to the operation of a nuclear submarine engineering plant. In the meantime, we did manual calculations with a slide rule and graphs, to determine when the reactor should go critical. (all of these submarines were turned into razor blades decades ago, so no useful classified information is in this remark)
Razor blades?
the classic submariner's lament...when a sub is decommissioned and turned into so much scrap metal, the end result of which is the sub is remanufactured into other steel products...we say, she was turned into razor blades (a trivial end to a once formidable machine. @@tfuenke
That's truly fascinating.Thank you for ur service
Thank you for your service, swabee! Charlie Lord
USN
You can only get stories like that on UA-cam. Thanks mate.
Your clarity and efforts are always appreciated. - When I see a new Veritasium video, I'm glued to my screen.
Awesome to see an analog/digital artist like yourself appreciate the building blocks of what enables you to master your craft like you have :)
Love your music!
What a coincidence! Just this week I started listening to a few of your songs again :D But you're right, he does have a very good way of explaining and visualizing complex topics.
Oh my god! I'm a huge fan of your work.
@OVERWERK when are you gonna put the nth ° back up on spotify?
This was astoundingly relevant to--almost a summary of--my History of Science: The Digital Age course, for which I have a final for tomorrow. This video is practically a 'further reading' section. Thank you for this.
Good luck
You will ace it
Good luck! Although it depends what part of the world you live in, a final on the week of Christmas sure is rough. Hope you at least get New Years off!
Hmm yes indubitably
You are as lucky as Lord Kelvin.
This presentation, on this channel, may be my all-time favorite. As a software engineer, I am blown away and humbled by the innovations of people like Lord Kelvin. I absolutely loved the organization and flow of this presentation.
Agreed! Their entire production is the very highest in quality. Just reviewing the references on this 1 video alone says a lot. It really is one of the best science channels on UA-cam.
When I think about this other youtuber...a 20-something year old turd who's worth $20M from making imbecilic, "how dumb can we get in 20 minutes on this video?" while he's driving a Bently down Sunset Blvd in We-Ho, w/ his model girlfriend, who wouldn't give him the time of day except for his $20M, his Bently, his mansion & his fame. No wonder this country can't have nice things.
@@MrDAMS1963 Which country are you talking about? If you're in the USA you already have some of the nicest things in history. If you want to make history, change the world. If you want to make money, appeal to the average masses.
Praise Bob
I used and helped develop analog computers in 1960's and early 70's. Gun aiming analog mechanical computers used gears and wheels . It considered ship vector and speed and distance to target. The disk/ball mechanism for integration mentioned were also part of it. Electronic ones used tube then transistor operational amplifiers with resistor input and feedback for aritmatic plus capacitor feedback for calculus anf special diode networks in input for trig. And so on. Moon landing simulator we made combined analog (to "think" ) and digitals with control for in/ out. Civil engineers and auto companies used analog pc's to optimize suspensions and roadside slope grading. Plotters as large as beds drew plans and curves far more accurate than primitive digitals. I used m9 in USAF.
The firing solution computer on the USS North Carolina is impressive. It also considered temperature, barometric pressure, delta in altitude (for targets on cliffs), rotation of the earth, and roll of the ship.
And now a digital computer, very small can do all this calculation very quickly, less costly and much easier to maintain.
I did two tours on DDGs that were converted DLGs and we had a 5" gun that used this old analog system, but everything else ran on digital computers. The accuracy of the 5" gun wasn't great. I did a tour on an FFG with a 3" gun and it could put a round through a window of about 1sq. m at quite a distance. It ran on a digital computer.
I was a DS, Data Systems Tech, and worked with these digital computers and different systems involved in CIC. Out system changed names a couple times and eventually it got eaten up into an integrated weapons system when AEGIS came about.
I REALLY enjoyed working on this equipment (1980 - 2000) and I've followed computer technology ever since.
The need for ever more powerful computers does not require that a transistor keeps shrinking and this is a fallacy that different people who follow computer technology have spread (what we do when we can no longer shrink a transistor). The fact is when a transistor hits that point it will use such a small amount of electricity that ICs can become much larger as long as clock speeds are low enough. But this brings up the problem of failure rate for die being produced from a wafer, where the larger the die is the higher in percentage is for unusable die from the wafer, since every wafer has imperfections. So we have now moved to chiplets, so a CPU can be made of multiple chiplets and this solves so many problems, along with being able to stack one die on top of another, and eventually probably being able to produce multi layer die that I see most of this talk about the need to move away from digital along with moving away from silicon wafers as sensationalism and nothing more.
A bigger problem for digital computers has been and will be how fast memory is. CPUs have to do a LOT of work to try to predict what data/instructions will be needed and fetch it from memory and load it into cache on the CPU which works much faster. If memory could be sped up to where it could run even 1/2 the speed of the CPU cores that amount of performance boost would be VERY impressive and it would simplify the CPU because you wouldn't need so much branch prediction and prefetching, flushing an instruction pipeline with a branch prediction is wrong, etc........
@@tango_uniform Also pitch, wind direction and speed, for a gun if I remember correctly. Even in launching SAR missiles, you need pitch and roll to get the missile going in the correct trajectory
I hope you have watched the video narrated by Spock titled “The Last Question”. If youre an AC pioneer, you’ll appreciate it.
You guys are the golden age of engineering. Hats off.
In 2000s, I helped to disassemble a big modular analog computer from 70s. It was pretty universal in its day, but mostly used to model the transients during a big electric motor startup. A big 20x20 switchboard, few electric drums for variable coefficients, everything an engineer could have wished for.
As the students, our task was to grab the equations it was modelling and port it into MATLAB. A paltry Pentium-2 PC was quite faster and much more precise than that giant machine. After the model was ported, we together with the faculty's staff disassembled the computer and moved its modules into the humidity-controlled storage. They wanted to preserve them for history. Wonder if it's still in there...
It's so crazy how adding or multiplying sine waves, something that's as simple as punching values into a calculator today, used to require some unbelievable engineering. I mean, just the notion of such an advanced mechanical computer makes my head hurt. The things we do today would be seen as magic many years ago. Great video!
This comment sparked a big boy thought in my brain. Thank you!
You need to punch in a whole lot of numbers in that calculator to do a fourier transformation. The fact that it was analog is ideal for fourier transformations instead of doing a lot of calculations the next calculation comes from turning the wheel slightly. No wonder these where still used in the 1960'ties.
Any sufficiently advanced technology is indistinguishable from magic.
(Science fiction writer Arthur C. Clarke's 3rd law)
Obviously there must be an external input, but you don't necessarily need to do everything manually, like in a calculator.
If you already know the formulas that you will follow, you can create a web panel / etc to facilitate it to the user.
Isn't this undermining the fact that a calculator is an even greater feat of engineering?
The analog computers used by navies during the First and Second World Wars are amazing technological feats. The inputs are the continuous, relative motions of opposing ships and the outputs are a synchronous stream of firing solutions.
The Norden bomb site. The third most expensive weapons program of WW2. An analog computer with 50 variables.
@@byrnemeister2008 It became a important piece of equipment to the Allies due to it’s increased accuracy with pinpointing the exact location for detonation.
@@byrnemeister2008 that hole was filled in immediately the war ended...
@@byrnemeister2008 different tech entirely
The US Navy was still using pure analog computers for gun fire control into the 70s and possibly early 80s. In the early 70s I worked on the fire control radar for the Tartar missile system and at that time our system was a mixture of analog and digital.
This is my favorite of all the videos you've ever done.
This is the best crossover.. SRV and Computer Science
@@HolyPro7 I know right? Wasn't expecting to see him here haha
So much goes over my head, b/c mathematics proved that my brain was deficient in something. I maxed out intellectually/academically at _pre_ Algebra. It was heartbreaking and I'll never be completely be rid of the shame and guilt, b/c I excelled at writing and English. Bc/ of my math and analytical weakness, my IQ is only 80, despite the fact that I managed to finish 2 college degrees. So, degrees don't really mean anything (unless they have ivy league backing).
600th like. 👍
Mine too! Fascinating subject, great editing, and plenty of real-world applications. A wonderful piece.
I was a field engineer for Electronics Associates Inc., the largest manufacturer of analog computers in the 60's and 70's. I traveled all over, but the last year was at NASA Ames. Among other things, the navigation 8-ball used in the Apollo program was developed on our computers. They were state of the art at time time. We later combined them with digital computers. The complex computations were handled with ease on the analog portion and the raw number crunching was done on the digital computer. All together we had around 15 large scale analog computers on site at NASA Ames filling entire rooms and involved in all aspects of their various missions from spaceflight to life science studies. Analog computers speak to engineers in their language--mathematics. Digital computers require interpretation between languages.
4:15 It's pretty mind boggling that a telegraph cable was laid across the Atlantic ocean in the 1850's. I'd love to learn the details of that endeavor someday.
In brief - yes, it worked. But ... it was shoddily made, and leaked water. Thus, the signals were very weak. Two solutions were proposed. One was to use a mirror to amplify the weak signals by reflecting a dot on a wall. The second, advanced by an arch-rival, was to use high voltage. The arch-rival won the debate .. and burnt out the cable in a few days.
This is in a short story is told by Neil Stephenson - "Mother Earth Mother Board" from 1996
Before artificial satelites intercontinental telephone calls were possible thanks to cables lying deep at the bottom of the sea.
Bet you can find at least half a dozen videos on this topic. Now is the time to pray 🙏🤞as my luck is usually terrible when it comes to winning bets (though logically speaking I should win).
@@adblockturnedoff4515 lo. You're right.
@@adblockturnedoff4515 Except Neil Stephenson is a great writer. He even starts out the story with this convoluted Victorian introduction 'wherein it is explored...'
So Lord Kelvin was the first guy to think of a system to compute the FFT! How come I never heard about this in engineering school. Before DFTs and FFTs, there existed AFT (Analog Fourier Transform). Mind blown...🤯. And what an elegant construction. These concepts should be used to teach mathematics and engineering in STEM. The genesis of the thinking behind any mathematical and engineering breakthrough goes way beyond equations and has real world analogies that are much easier to understand. Brilliant video. Thanks for everything you do, Veritasium. 🎉🙏🏽
Precisely this, had they taught me the history behind the math in high school I would have been more interested. As opposed to asking the teachers what sine and cosine is actually used for and getting a non answer.
@@m1ndk1ller My teacher used to say just study whatever is in your book dont ask useless questions, these people just don't know the importance of all this and just made us into calculators.
@@anuragtekam9376 I need to understand the math not just how to do it.
This should be the best (and maybe the only) way to teach math - with real world examples
For those of us, who were never explained the reasonings of math, or having a dictionary/thesaurus, to find out what the differing parts of math are/were for, it is all but impossible to see something that does not have any tangible roots for yo draw an image within our minds, … I have never understood the practicality of algebra, any further than Einstein’s time x mileage, =‘s mph, … drawing a picture inside my mind of what something should look like, (such as using the “carpenter’s rule of thumb”), … is the one sure way, that I have used in creating doorway’s that are genuine in their appearance, & setting up a baseball/soccer field in proper angular accordance with the rules, … flat lining a parking lot, with a slope of 2 tenths of an inch from the highest point of the area, to the sewer drains, have ensured that water would not collect, and flood patrons vehicles in abundant rainstorms, … running a leveled string line, ensures grade at marked consideration points, for the backfill, grading, of stone, & legitimacy of pavement, whether poured, or heat adhered such as asphalt, … making an unbalanced wall look flat is also to be had, provided shims are properly placed, & utilized throughout construction, … painting said walls & ceilings can also be sight affected by how much paint is applied with a roller, & in what direction, it is rolled out, with textures, notwithstanding, …
My grandad was a bomber pilot in the RAF in WW2, his crew was occasionally given new devices to test, one of them was a machine that contained a moving map that was small enough to strap to their thigh. This map would rotate around rollers and update to show the terrain he was flying over regardless of visibility to the ground. The whole crew thought it was incredible and provided great feedback to the engineers that developed it. But as was often the case it was taken away after testing and they never got it back for the rest of the war. Always been curious what that machine was, how it worked and why it was taken away if it seemed to be working.
Sounds fascinating, commenting to see if anyone that knows replies.
Analog computer
Many incredible inventions and discoveries made during the war were never actually put into use with front line forces because it was feared they would do more harm than good if they fell into the hands of the Axis powers. This was especially true of inventions like the one you describe. A key part of the aerial defence of the UK was the use of decoys on the ground to send German bombers using visual navigation off-course - at various points during the Battle of Britain entire fake towns and other features were created for this purpose. If the device you describe were to found by the Germans in a crashed plane then it could be reverse engineered and the accuracy of their bombers would improve ten fold, even in weather conditions where our own fighters could not see the enemy.
Worse still, what if these same computers were adapted into guidance systems for the German V1 and later V2 unmanned missile and rocket attacks? The navigation abilities of V1 were notoriously bad, which is why it was largely fired at targets so big that they were hard to miss, i.e. London. They weren't precision weapons, but what if this technology was available to the Germans? That development may have altered the outcome of the war.
These are just hypothetical reasons, I'm sure someone knows the real reasons why the device your grandfather tested was never seen again. However these types of very real fears were the case for many inventions being kept on a shelf marked Top Secret instead of being used.
It was taken away and given to the Nazis to help them build up the Nazi War Machine! Then, our inventions that the Nazis was using made them look like a genius, when it was in fact American genius. There are spies in our Military Industrial Complex who sit in the highest military and civilian positions waiting to transfer American military and civilian secret to our enemies for a handsome price!
I fail to see how that would work other than using air speed which would be pretty inaccurate considering air speed isn't the same as ground speed.
I dropped out of school, regrettably, but whenever I find your channel while scrolling, I always seem to pay more attention than I did in school. You've taught me more than most of the people who were paid to teach me, and for that I appreciate you V, keep up the good work!
No worries, school does not have anything to do with wisdom, inteliigence etc. and the knowledge can be fill in
Our traditional school system wastes at least 5 years of your time. Self educate - that is what one does at varsity in any case.
I've done a teardown video of a B52 bomber astro compass analog computer and it's glorious how these thing can compute sinusoidals and do integration etc.
Gonna have to watch that
I rly love your videos.
Oh shid you guys are the ones melting wrenches with the giant transformer. Good on you brethren
Why destroy the old mechanism? None are being made any more today. Is there a security reason for it’s destruction? Or am I mistaken and you meant that you took it apart to learn about it’s function?
@@wynfrithnichtwo8423 Yeah, you've missed a term... "Teardown" involves (specifically) taking carefully apart so that it can be rebuilt... including any and all repairs or replacements... so that when you are finished with a "full operation" you have two majority stages... "Teardown" and "Rebuild".
If you'd like a specific reference in literature, Hayne's or Chilton's or "Climber" (for motorcycles) offer "Complete Teardown and Rebuild" Guides to automobiles, motorcycles, and ATV's... and there are probably other publications... SO you can order or download based on whatever vehicle you'd like by Manufacturer, Year, Model, and any additional nomenclature... AND read up on just about anything you'd ever want to know about it...
Some of us "restorers" have to take things apart in a "teardown" operation, to study them. We usually have to take pictures prolifically of the project as we do so, and then study those pictures and the components to decipher how to get them back together to make them work "like new" again... and other times to simply fabricate a new one... occasionally (when we're lucky) better than the original.
Nobody is willingly tossing one (essentially) into a chipper. I promise. We have more respect for the past and the heroes who lived it, than that. We're just as fascinated as you are! ;o)
I do want to point out on the rather incredibly improvement of shells/kill ratios of Allied AA guns: that wasn't solely down to better maneuvering of the guns. Planes are fast and small, so hitting them in three dimensional space is really difficult. The Americans and the British jointly developed the proximity fuse, basically a way for the shell to detect when it was near an enemy plane and detonate, and this at least removed the third dimension from their aim, making them much more accurate.
Oh from one of the Curious Droid channel video yeah it basically the fuse that make allied guna more accurate and with the additional improvement with equipment to calculate the trajectory it makes more accurate guns
Just don't let it detect the ground!
Proximity fuses were far more important than the rare analogue control of a aa gun. As you still had to aim the gun at the plane yourself for the analogue computer to attempt to correct for speed, distance and wind. Our own analogue computer in a trained operator (called a brain) did this just as well with some intermittent flack rounds for the adjustments to be made. Proximity rounds that actually went off weather they hit or just went near was a game changer (though these were only in use on the large calibre aa's not the light machine gun type).
Right, we need a comparison of before/after the proximity fuse was implemented, which I understand to be very close to the end of the Pacific theater. I checked: first use January 1943.
Germans invading:
Allies with new proximity shells: laughs in galaga.
How people actually came up with these things out of thin air is absolutely astonishing to me. Pure brilliance
In absolute awe of these old computers. The Rotary Ball Integrator is beautiful. Back in college, it was all about learning the symbology of how to solve integral and differential equations, and although we were told we needed to understand conceptually what we were doing, we didn't spend any time on it.
Seeing these machines add all those waves makes that visual representation all the more fascinating and gives you a sense "aha! so that's what it means to add waves and take the integral"
I don't think I really appreciated what integration was doing until I took a chemistry lab in college with an outdated FTIR that could only print to paper. We had to cut different peaks and weigh them to compare against each other.
@Grace Jackson I never get this about educational systems, they all start backwards and expect people to be good at grasping things out of a vacuum. I tought myself the most complex stuff possible without any help from teachers or experts. When I analysed how I did it it was always "backwards" to how school tought you: 1. I saw a problem, 2. I broke it down to the basic most important aspects and (...) then I calculated it. I was always very happy with my solutions (they worked perfectly).
(...) = 1. as an amateur having absolutely no clue on deeper scientific stuff, 2. looking up the formulas necessary for the calculation (or in case of language learning - learning the grammar), 3. learning on my own how to apply the formulas, 4. adding formulas together to solve the problem.
Why could I never do this stuff in school or university? Because I only had a fuckton of puzzle pieces and absolutely no picture in mind of what they are supposed to ressemble. If you have a problem - like the tidal waves - you very clearly have data on what to expect and compare to your solution.
I wish school would start out teaching the actual real world application FIRST (yes, it is extremely complicated) and break it down in smaller and smaller parts that can be tought individually. You can always see that you are in "chapter 3" of the "tidal wave problem" and where this fits in the bigger picture and what is left to do. At first you think "Sh*t I'll never be able to learn this clusterf**k?!", but having this picture of what is expected is giving such a great motivational reward when you realize you solved this HUGE problem on your own.
How does school do it? They teach you all the tiny pieces with NO explenation of what they are for. Then at the end you get to solve a problem that is laughably easier than tides calculation and you still don't know how to solve it, because you never had to apply the puzzle pieces to real world problems. School... every time... I hate it so much.
As an electrical engineering student in the early 80's, I worked at a steel plant where I helped fix an analog (tube) computer that controlled a mill which cold-rolled aluminum. Just when we figured out what was wrong with it, they decided to use digital instead.
What was wrong with it? Did you get told?
I think that is partly your fault, in that you could have diagnosed it much quicker by declaring "it's fucked", lol.
The problem with analog computers
is that there will always be some slop in the connections between parts; the problem with digital computers is that always there will be cosmic particles to malfunction it. :-)
The problem with the universe is
Reminds me of that time in a Super Mario 64 speed run where Mario jumped extra high and the only explanation we have is cosmic particles lol
@@iainbrady3629 lmao fr
that escalated quick and make me laugh lol
@@iainbrady3629 How do I Google it? 🤔🕵️♂️
it would be interesting to see if you could create a computer partially analog and partially digital to create a computer free of all issues
What a great talk.
I remember hearing a story about analog computers for use in naval gunnery; they were used to keep the guns level at the target in rolling seas. When they tried to replace them with digital computers they found the digital computers were too slow to compensate for the movement of the ship, and had to go back to analog controls. It took a couple of decades of digital computer development before the digital computers were fast enough to replace the analog ones.
I’m glad you brought that up. This video states the problems with analog but did not mention the huge advantage and that is speed. An analog computer can solve VERY quickly, which is why they were so effective for use on gun sights against dive bombers.
A mechanical/analog integrater? That’s badass-level of genius!
Now makes me wonder how a Differentiater would look like...🤔
Ikr!!
If he born today... Elon musk would have become 2 most richest person....
Or at least, Nobal would have started giving Noval prized for maths and cs.... Just to give him....
@@darkblaze1594 A differentiator would be relatively easy to design compared to an integrator. Without having any prior knowledge of such designs, I can imagine one possible way it would work. If you have a machine which plots waves on a chart, you attach a weight to the swinging arm which plots the line, in order to increase its inertia. Then you measure the force on that arm, which is proportional to the acceleration of the needle, which is of course its derivative or second derivative, depending on what you were measuring.
@@iankrasnow5383 A speedometer is operating as a differentiator, showing the time variation of the distance (or the rotation of the cable).
As a person into computer science myself, I love looking at the old ways of doing things. Seeing the pulleys and the ball integrators was incredible, and I can't wait for the next video.
I think it's also the new maybe cuz we wanna upload minds and brains are analog mostly not digital
@@Hshjshshjsj72727 There are actually continuous data structures in computer science research, so an analog system is not exactly necessary to store analog data.
@@shamsow that's really cool thanks for letting me know :) I'm very interested in uploading the mind and immortality
Analog devices like the Fourier analyzer are so astounding to me for how creative the inventor had to be. One of the coolest is a "planimeter", which measures the area of an arbitrary shape by tracing around its perimeter.
That's possible?!!?!!!!
Thus you could, in theory, calculate the area of an island by tracing the shore line, correct?
@@LucasPlay171 Possible but not the easiest solution-it's much quicker and cheaper to cut the shape out and weigh it and then compare it with the weight of a unit square. Years ago one of my chemistry teachers explained to me that that's how they were taught to take the area beneath the graphs their instruments produced-as chemists they had high resolution, sensitive balances to hand.
@@StormWarningMom Look up the "coastline paradox".
Well, is a Planimeter really that awesome? I can accomplish the same thing with a string and a ruler
I had no idea about the history of tide computations. The presentation of the Fourier application is the best video that I ever saw. Even if I hadn't taken Calculus 30 years ago, there would have been a lot of understanding imparted from the stunning visual aids. The video also provides an appreciation of the genius (1% inspiration and 99% perspiration) of earlier generations.
balls
Claude Shannon is by FAR my favourite mathematician, and, in my opinion, the single most underrated. Very happy to see him get a mention here! His work enabled the ENTIRETY of modern computing, communication, and information theory. Plus, he has a lot of other really cool work, such as the sampling theorem (bane of audiophiles everywhere). And better yet, he did a significant portion of his important work as a master's student!
Boole? Turing? And many many others that we don't even know like engineers and scientists are god like on logics.. They should be credite.
And, he went to Michigan!! Go Blue! BBA-Accounting '96
And Shannon's Law is still used to analyse the capacity of an electronic communications channel C = W log2(1 + S/N ), where C is the channel capacity in bits per second, W is the bandwidth in hertz, and S/N is the signal-to-noise ratio
There's a movie biography of Shannon, "Bit Player". Check it out.
Hey brother, all these information theory I have those topics in my UG course... dumb faculties don't care about explaining how all equations are actally used in realworld... but I'm intrested in exploring on them to get a strong understanding.... where can I start exploring to understand everyting that Shannon presented?? can you point me to some direction ?
My physics teacher used to tell us that digital computers would never catch on because of their limited accuracy, and those analog computers were going to be big. That was in the UK in the 1960s. Since then, I have been laughing about that and thinking “How wrong can you be!!”. But, well, maybe he knew something… let’s wait and see.
@jshowa o Yeah, if a finite amount of frequencies is present. Even a simple step function has infinite amount of frequencies present. The Nyquist theorem is more about the resolution limit of discrete approximations
I remember my teachers all tell me how important it is to understand basic math because we wouldn't be walking around with a calculator in our pocket...
Growing up with computers, it's crazy how much it's changed, but also how much of it is still the same.
maybe he was talking about quantum computers...
currently optical "computers" seem to be the next thing for deep learning. Not because of accuracy, but because of power efficiency.
False like the y2k bug ...digital computers will always be key
As a computer engineer, I was always told that every one has always dreamed to build analog computer because of the enormous performance potential over digital computers. However, it's not that engineers are dumb, that's not that simple.
There is the precision and noise problem that you pointed out, digital component can easily erase noise and restore a digital signal which is very interresting to transmit data over the distance. But in that's not the most important point. We are curently already creating quantum compter which require extreme precision, but there isn't any decisive project for analog computers yet.
In fact in the case of computing the true game changer is timing! Most analog components have time based behaviours, they use analog values of physical constants like voltage, intensity, etc. that doesn't change instantly because they tightly binded to physics laws. When you have a few component moving over a few minutes like old computers presented in the video, it's no big deal, but the nightmare start when you try assembling million, billion or even more of them and syncronizing them at the micro-second or nano-second. And th is not a trivial problem.
On the other side, the main advantage of digital gates is that they are syncronized on a clock. There is a regular clock in your computers (today usually around 4 GHz) and all the memory cells that store bits of data in your computer change there state at the same time when this clock tic. The clock period is designed to be long enough to be sure that every electronic component in the whole computer chip had the time it needed to update its electric state and stabilize its value. Some components would probably take less time than that, but in the end we are garanteed that all electronic signals are stable and reliable. Just with that hypothesis all simulation tools and computer designer can make an enormous simplification by just ignoring all time related physical behaviour and instead focus on a much much much simpler finit state automata. If you look at a processor simulation tool used by computer architects you would not see anything related to voltage, intensity, or physic equations, because all of that is neglected and instead we can focus on much simpler binary signals.
In fact I lied a bit, in practice at the level of miniaturization we reached, we still had to deal with noise, interferance, etc. but that's mostly solved globaly by building robust components and adding shielding around chips when needed, an not by solving the exact physical equaltion between every single electronic components.
It's not that no body tried to do analogus computer but, in history, since the transistor was invented, it has always been simpler and more efficient to reduce complexity with digital computing and compensating by an increased clock speed, adding more parallel transitors computation or a more optimized chip architecture, that trying to work out the complexe equations of billions of analogous electronic components.
It's a little bit like an intersection with traffic lights. It's theoritically possible that everyone pass the intersection at almost full speed by precisely computing everyone's speed. But since cars became widely used, we all accepted that its safer to sync everybody using traffic lights in a binary way.
What about optical computers? Some startups and research groups are investigating light as a signal and calculation medium, using wave interference to perform calculations. While most of these are digital, I think there's potential to make analog optical computers without as many of the downsides you mentioned. I wonder if that's what part 2 of this video will be about, but I could be wrong.
@@areadenial2343 I'd like to hear this dudes opinion on this, educate us good sirs
@Kasonnara
Do u think that there is problems today that could only be truly solved by some powerful analog computer?
If I had a billion laying around I would simply start a non-profit organization to preserve and fund more of these analog computers. You know, just in case of... electronic doomsday, massive EMP's, GRB's or any of such.
So round abouts are analog traffic lights?
This video made me remember the clocktowers of Neal Stephensons Anathem.
Still a favourite book.
The rotary ball integrator, oh God, what a fascinating piece of machinery! Can someone guide me as to where I can get my hands on one?
Not only would it be an amazing prop to have on my desk but it would serve well as a teaching aid to Calculus students so they can develop an appreciation for the physical "analog" of the integration process.
@@mattmurphy7030 please do sir
@@mattmurphy7030 Yes, please do! That would be fantastic!
@@mattmurphy7030 I have one and was thinking about building one
@@mattmurphy7030 yes please do
@@mattmurphy7030 The trick with a 3d printed version is making sure the balls and rollers don't slip and slide past each other.
Kelvin's tide machines are my single most favourite things. I remember seeing them for the first time in the London Science Museum over 30 years ago.
Small nitpick. The pics you show when mentioning Colossus are actually the Bombe machines used to crack Enigma, not the machine used to crack the Lorentz codes.
Everyone forgets Konrad Zuse and his early computers. These were earlier and more advanced than the ENIAC. But since he was German, and developed them during WW2, history has not been kind.
This comment really needs to be higher.
There was more than just war as a reason why Zuse's computers didn't go thru the roof. However some companies used it back then and made big profits because of that and are still in business today.
Das ist gut! Forgets? Who the hell is taught anything anymore?
If I'm not mistaken, technically, Zuse's computer was not fully Turing-complete. Don't get me wrong, it was a miracle of engineering nonetheless, but I think it's rightful that ENIAC gets the credit for its achievement.
Let's vote this to the top.
The talent this guy has is just incredible, he can teach us some incredible stuff and at the same time making us enjoy the process.
Totally, he has come so far from the early days! The new animated bits are a nice touch too.
Except about how electricity in the wires works... 😏
Scammers
@just some guy tired of life Bro why would you expect someone to expect details on how an anti aircraft gun worked in literally 5 minutes? This video is NOT about teaching you how an anti aircraft gun works, there are other videos and books on the topic. Sine and Cosine waves are literally taught in school, you wouldn't expect him to teach us how to do carry division in a 20 minute video on analog and digital computers lol. He gave examples of analog and digital computers, which you can research yourself if you find them interesting. I don't think you understand the point of the video: Not to explain everything perfectly, but generate the interest in YOU to learn them yourself.
@just some guy tired of life Well yes you're completely right, but I'm not expecting to learn all from a single video it's just that now I know stuff that didn't know it existed, in order to have deep knowledge about it I must investigate by myself....
Absolutely great video! The first person who added sine waves with Scotch Yoke was not William Thomson (Lord Kelvin) but Francis Bashforth (1819-1912). He built a Fourier synthesizer in 1845 for purely math purposes (equation solver). When Kelvin's tide predictors started to become famous he tried to make his priority known but it didn't work very well. His paper "A description of a machine for finding numerical roots of equations" was reprinted in 1892.
If my old teachers/professors had led with this fascinating usage of Fourier Series I may have been far more enthralled for the process.
I'm totally fascinated by that analog pully frequency sumarization device and that mechanical fourier transformation. Such a beautifully simple solution.. and man, that was just 126 years ago... Mind=Blown.
I had a lab exercise to simulate a control system stability with an analog computer. The analog computer was built using vacuum tube op-amps. Their signal range was +/- 100 V. A bit later, where I worked, my boss had also earlier used the same analog computer and we started discussing if we could build our own, but with recently introduced semiconductor op--amps. Well, that discussion resulted in a task for me to design and build one. What I ended putting together was a box with 10 op-amps, a bunch of stackable resistors and capacitors and a few special units, and a panel with banana sockets for the stackable components. All along with amplifier symbols on the panel. The signal range was reduced to just +/- 10 V. My boss was soon happily simulating not only his control systems, but also extrusion and cooling processes for high voltage cable insulations. That was just before I got transferred to another division of the company, where new challenges were waiting me in a new R&D lab. So I myself never had the opportunity of really learning to use "my" analog computer in anything but simple servo loop simulations (optimizations).
I am a meteorology student and my university is very old fashioned. I had to do a lab course for meteorological instruments, and one of them involved using an analog computer to measure the humidity. The device contained human hairs and the computer was able to measure how much the hairs stretch and draw a graph of the humidity. The tutor asked me beforehand what I expected to see, and I said an exponential curve, and the thing really did draw a perfect exponential curve. I remember both my partner and I were kind of fascinated by it even though nobody would ever actually use such an archaic device anymore.
Thinking about analog computers reminds me of another professor who claimed that every single physical thing that happens is a measurement, which is why the wave function in quantum mechanics collapses when it's observed. Thinking about it, building devices to mimic these measurements that God and the universe is constantly making makes a lot of sense.
And now imagine that in your brain such images of the world are created by a variety of interactions between neurons. I think human brain is some kind of an analog computer, too. It's really fascinating! ❤
hair hygrometers are archaic? Out of the mouths of babes. Humidity transmitters using horse tail hair are still in use today and are generally more stable than the electronic sensors. The horse hair does not stretch. It elongates as it absorbs moisture. The hair bundle is maintained under a gentle stress, just enough to keep the hairs from becoming loose.
The only one I know of that can rival a horse hair hygrometer is the chilled mirror technique. In this device colimated white light is directed at a 45 degree angle to a front surface mirror cooled by a Peltier junction. The light is reflected off the mirror to a photoelectric detector. The control circuitry controls the Peltier device to maintain a slight decrease in light which indicates a thin film of dew is on the mirror. The temperature of the mirror is the dew point. The (analog) electronics can convert the dew point into relative humidity when needed.
I have a horse hair dew point transmitter in my display case that is 90s vintage. It's output is to control the current in a current loop to between 4 and 20ma. That's a standard analog measurement range. It used to be 10-50ma but to reduce power consumption in large control systems, the 4-20 ma was standardize. The compliance voltage is usually 45 volts so many consumers of the signal may be connected in series. Voltage operated instruments used dropping resistors across their terminals to develop the voltage. Most indicators and analog control elements use the current signal directly.
The entire reactor control and protection system of the Sequoyah nuclear plant near Chattanooga, consisted of a large room full of racks of analog computing elements. Square root extractors to linearize the differential pressure signal across an orifice plate flow measurement. Integrators and differentiators. PID controllers. Alarms (devices which closed contacts when the current exceeded a setpoint, voters (selected the highest or lowest of several inputs) and so on. Construction was started in 1970 and it went commercial in about 1978. It was only 3 years ago that Westinghouse replaced the analog control system with a multiply redundant custom-designed computer system.
Based on flat non moving earth. Same as astroLabe. FYI
When I joined the Navy in 1971, they still operated analog computers in some of their fire control systems. An entrance exam showed some simple gear output questions. I worked on a Norden system that was a true hybrid. It had a few analog computers that operated on gear ratios, electronic pots, accurate voltage references, relays, shaft encoders, etc. to a digital computer that had vacuum tubes, early 1960's transistors, late 1950's developed magnetic core memory, magnetic drum memory with about 2 dozen read/write heads in it. Not an integrated circuit chip in the whole system. This fit inside an attack aircraft that operated off of aircraft carriers plus ordinance and 2 crew. This unlike the aforementioned bombsight, actually got the job done. Yet these break down and needs constant maintenance and like the airframe it is attached to gets old and outdated.
My grandfather built analog computers into battleships in WWII. He described one as being roughly 1000 cubic meters in size with the mechanical components weighing hundreds of tons. He also described them as being accurate to within 60 decimal places and I still don't have a pocket calculator that can do that.
Cyperpunk in the theatre of the REAL.
@@nunyabiznez6381 The one he described must have been on land. It couldn't have been additional *hundreds* of tonnes placed inside a WW2 battleship.
The Mark 37 system fire control system in the Iowa class battleships, for example, weighed 1 400 kilogrammes.
@@nunyabiznez6381 Yes, and these computers were in the training flicks we saw in avionics school.
Thank you for sharing your story and insight sir
Loved the animations. This was an amazing video
lol you here too
Ted-Ed level stuff
Man. The quality and exactness of your visuals have literally exponentially increased over time. New videos are simply amazing.... It always makes me wonder whether I should start a career in research
Brilliant. As a retired computing science teacher I found this enthralling. Wish it was around about 10 years ago.
Keep this stuff coming
This is a phenomenal explainer!
Maybe a scientist
What’s up checkmark?
Glad to see you watching another great creator. Love both your videos, for slightly different reasons lol
Well you're a good explainer too, you're just usually 3-4 drinks in before you film it.
How to drink bro I'm thirsty
I appreciate the compromise you’ve come to around clickbait. You use a clickbaity title, but not deceptively so. Rarely if ever do I feel like you’ve over promised. You’re really setting a gold standard in how to make great content on the internet today. Thank you for all your hard, and thoughtful work!
There was a video where he explained that he switches thumbnail and title and measures which is the most effective (highest click rate)
@@Halolaloo Yeah I have often noticed his videos changing the clickbaits. I will have to see this video you are talking about, thought I missed it
That's the best way to use clickbait, when you actually deliver on it.
In order to get popularity in the algorithm (and compete), clickbait is a necessary evil, sadly. I don't like it either, but I'm perfectly fine with channels I want to see succeed, using it to stay relevant and gain viewership, even if they (slightly) over-promise in the title. The more people that watch Veritasium, the better, at least in my opinion.
I mean, is it really clickbait if it delivers? Clickbait has a connotation of deception, imo.
Spent a huge chunk of my sophomore years in 2001 on learning 741 OP-amp, professors told us it was used for making computers and calculators during the 60s. But later the entire ecosystem of analog computers were completely replaced by boolean logic circuits. But some type of computations, especially numerical integration/differentiation can be done really really fast and quite accurately on a very simple 741 OP-amp configurations (even you can make one now, just order the components over ebay and won't cost more than 10$ with batteries) but for some reason people don't use that technology anymore. Would be really excited to see how they are coming back. I'm hyped for the next installments.
as an Electrical and Electronic Engineering student, we still learn quite a lot about OP-amp.
@@Fx_Explains haha lol I'm a CS major and I took good amount of EEE courses, those were fun.
The ball and disc system is used in the transmissions of many snowblowers but the ball is another smaller disc. This gives a neutral position, a variable forward speed and a variable reverse speed. Pretty amazing tech.
I had never really thought about the digit in digital, so I had always considered digital to be synonymous with electrical, and analogue with mechanical. Your mechanical digital computer taught me something new today.
Be amazed then - it simply comes from "digit" as in "finger". That which you use to count up to ten, for you have ten of them. Which made "digit" just another word for "number between one and ten, that we use to assemble really large numbers made of multiple digits". So "digital" is just another word for "numeric", as in "based on numbers".
I always thought it had to do with the Greek prefix “di” meaning two or double (as in dichotomy or carbon dioxide) because 1/0 or true/false is a pair
@@AttilaAsztalos so “getting digital” is another term for “getting handsy”?…
now i understand where “compute her” comes from…
🤔😏
Analogue means 1 to 1. Every button or input corresponds with a specific output.
I don't understand the mechanical digital computer. If we didn't assign 1 and 0 to the outputs, would it still be digital?
@@socialist-strong they just have to be 2 different states
When I started in Engineering school, there were generic purpose electric analog computers. I never got to use them, because they were phased out before I got to that level, but I think they were used by students of differential equations to prove their models. Instead of coding, they worked by patching components with wires, such that they could combine the effects of different analog elements, to obtain an output curve to the input they provided. Something to an electronic version of the tidal wave prediction machine, but configurable to different sequences depending on the patching.
I'd love to hear more about that
A version of Kelvin's ball-and-disc integrator was also used in the fire control of large battleship guns during WWII and after. There were tube-type electronic computers, but they could not survive the concussion of a salvo of four 16" guns firing simultaneously. The one that I saw in class used a wheel rather than a ball for the output.
I saw a ‘50’s, US Navy film on the analog gun plotting rooms. I think they could’ve survived direct hits themselves.
I wondered why no mention was made of analog electronics in this video (or at least it was glossed over relatively quickly). A big step in the chain of technology was from electromechanical devices to devices that were purely electronic, but not yet digital.
US subs during WW II used analog computers to compute torpedo courses.
Who created these animations? Very nice work.
Reminds me of an internship I had while I was in engineering school: I worked for the Geophysical Institute in Alaska, with a group that analyzed synthetic aperture RADAR data. Back then, gigabyte file sizes were still pretty hard to work with and the data took hours to process, something that most SAR devices do in real time now (I think). They said that in the early days of SAR, they used some kind of optical analog computer that used lenses and mirrors to do all of the calculations. Fascinating.
What is a synthetic aperture..like as opposed to an organic or naturally occuring aperture? Or wjat kind of radar technically has some kind kf aperture I'm curjous and not familiar? I'm only thinking of like a radar satellite dish or antenna I guess.
So in other words you're a liar liar pants on fire. Got it.
@@youtubeuser206 who's a liar David? If so why?
@@leif1075 the “synthetic” means that it’s calculated rather than natural. Maybe a bad analogy would be: if you have a camera with a tiny lens but you moved it along a really accurate path while taking a thousand images, if you used computers to stitch them all together and overlay them, you could achieve an image that was larger than your mini camera could take.
Radar images are way more accurate with a very large dish or antenna so they can “resolve” small targets. Well, big antennas and dishes like that are expensive and hard to put on airplanes and even harder to put into space. So, some smart people figured out that if you took a smaller antenna, and made send out radar pulses almost continuously, and then did some tricky math, you ended up being able to get really good resolution by “synthesizing” the “aperture” of the radar.
@@toastrecon thanks what's the aperture though. The size of the antenna then or region it transmits across?
I was a Fire Control Technician (gunnery computers) in the US Navy in 1959. The computers were all analog and had rolling ball integrators as shown in this video. The adders and subtractors were miniature differential gears identical to the differential which drives the rear wheels of your car.
At that time we never thought of solving the Fire Control problem digitally, using transistors. 🇺🇸
7:11 Tower's proposed solution is absolutely ingenious, it's mindblowing. Incredibly cool :D
Wait….. ¿You put a personalised AD in your video to promote a new channel? ¿¿¡¡HOW DID YOU DID THAT?? you genius!! ¡I never saw that before! that was very original and I specifically love how you did because you started the ad with “so it looks like you are about to watch a Veritasium video” using the 5 second wait to skip the AD to attract the interest of the viewer to listen to the ad and not skip the add immediately ¡I loved you guys a great! :D
I would love to see a veritasium / clickspring crossover. The connection between the theory in physics to the the real world in machining, has a lot of merit for each.
Photonic computers is an interesting type of analog computer that's just taking off. By making a microchip with what can be described as tunable partially reflecting mirrors, we can make a system that's essentially a matrix multiplication. Because the chip only changes to be re-programmed, it can crunch through multiplications as fast as you can put them in. Constant time, or even faster depending on how you look at it.
Even normal computers can crunch through multiplication/instructions as fast as you can put them in. problem is, they get hot LOL.
photonic seems good tho for a certain types of calculations tho
This is very interesting. Thanks for this info my bro.
@@honkhonk8009 I mean you can't, the clock speed isn't only limited by the cooling system.
@@benjaminmuller9348 The point is information entropy produces heat
@@benjaminmuller9348 Voltage and current is a problem since you need big power sources to run fast, moreover there is a point when it just too much voltage through the semi conductor and at that point you get a shortcircuit, so there is a physical limit that we are already hitting since our electronics are a few atoms wide now days.
Driving me crazy waiting for part 2 of this. This video has ignited my imagination like CRAZY. It is amazing to live in these times with such amazing learning resources.
same, we need it lol
Me too I have been checking youtoob every day
For reals!
He's got me by the balls, milking every drop of anticipation out of me. 😣
Checking every now and then!
You have enemies? Good. That means you've stood up for something, sometime in your life.
I come from at least 5 generations of clockmakers. My dad was the last generation to actually be a clockmaker and I am a software engineer. Perhaps my grandkids will be clockmakers again :-)
That was amazingly cool. Doing a discrete Fourier transform by hand is hard enough, but designing a machine to do it with nothing but gears and pullies just blows my mind. We have it so easy now with digital computers and software. We can program anything we want as sloppily as we want and a compiler will find a way to get a CPU to do the work. With analog you can't succeed without a complete understanding of every aspect of what you're building.
When I started my career, 1980. Analog controls were phasing out and digital just starting. The digital computer we used was a custom design, 2, 16 bit shift registers into a single bit comparator, with carry. We had 11 instructions, but only 4 were used mainly. I asked why we moving towards digital when obliviously it was so rudimentary? The answer, because we could change control equations with software, whereas changes to analog systems, always involved changes to hardware. We had a group that modeled differential equations with this big patch board with op-amps, lags, leads and gains. Analog controls are differential equation computers.
It's funny that the company Analog Devices now manufactures digital components.
let's spare analog computer for some very special calculations
The obvious answer is to use a mix of parts, so digital can do what it does best, and analog can do what it does best, and the two can talk to each other. Integration is an ass to do digitally. All your physics calculations are integrals. Want to know when two moving objects will collide? That's an integral. Want to know how something will accelerate or bounce? That's an integral. Want to procedurally deform terrain without needing a million triangles? Integrals have you covered. Just add a hill to the equation. And digital SUCKS ASS at it, needing specialized graphics cards and huge memory to do raytracing. In analog? It's a few amps and a coil of wire. You don't get perfectly crisp results, but it's a PHYSICS AND LIGHTING ENGINE, you didn't WANT perfectly crisp results.
@@williambarnes5023 -- Can you keep a tiny ball from slipping when it's spinning fast enough to do a couple million integrations per second? Eventually we might have thousands of nano balls or equivalent piezoelectric structures, but hyperthreading will be here for a while yet.
@@HolgerJakobs Yeah, back in the day, my senior group project was designing a Bit Slice Computer. I wrote the OS. We used AMD.
Math cs major here … my head hurts. Wish there was more detail and examples. Will watch again.
6:10
"Then he had a stroke"
*sweating profusely*
"of inspiration"
Fascinating to see the genius behind older technology, and to see what we as a species have been through in order to get where we are today!
All that to get to a generation of Tik-Tokkers. :(
Back around 1970 there were surplus bomb sights available by mail-order, and I got a few parts from one - a 24 volt motor and a couple of really nice worm gears with screw actuators with feedback to track air pressure governed by a metal bellows. It was like a big mechanical servo to adjust for altitude, and that was just a small part of it. Don't know if it was a Norden.
Ten years earlier, I was about 10 years old prowling around a local business and found an aerial bomb in their dump. I remember thinking it was like a kiddie bomb, but heavy steel, maybe 6-7 inches long with 4 fins inside a 2-3 inch cylindrical cowl - the cowl being the same diameter as the rest. So I brought it home. When my mother saw it, the blood drained from her face. She knew a bomb when she saw one and knew what they could do, having spent the war in London during the blitz. It turned out it was being used as a harmless paper weight and somebody got tired of it and tossed it out. I don't recall where it went. I think maybe she also threw it out because it made her uneasy.
Something to note is that it's not really Moore's law that's ending. The real issue stems from the death of Dennard scaling which is a core part of Moore's observations and a big player in the financial decisions when producing transistors. Something that's been really interesting to see is the research and work into Memristors as a form of analog computing or even being combined with traditional CMOS logic, however, there are many other interesting "Post Moore" computing techniques so I can't wait to see what's discussed in part 2.
moores law has already been beaten for a time being, but now heading back to moores law due to quantum computing
@@rolly6020 But quantum computing only has 3 values. That's still a significant upgrade from 2, but it's still a far cry from the hundreds, thousands, etc., that you can theoretically get from an analog computer. The main drawback to analog systems is the lack of accuracy due to loss, but one solution I thought of is already being used for quantum systems: Superconducting. Why go for 3 when you can go for way more with basically the same restrictions (i.e. having to use superconducting wiring)?
@@rolly6020 Maybe I should have explained better. Moore's law is still far from ending. Like I was saying the end of Moore's law has grossly been misattributed to issues in the computing field whereas the real issue is the end of Dennard scaling. We can still shrink the transistor and double the number of transistors on chip and will be able to do that for quite a bit longer, the real issue stems from errors when considering tunneling and also issues with leakage. These issues make the doubling of transistors much less efficient due to the increased amount of computation now needed for error correction due to the previous issues I referenced. This directly describes Dennard's scaling, we can no longer shrink a transistor while maintaining the same power density as before. Poor Moore get's all the bad rep when it's not really his observations which are at fault lol.
@@XxCODSMACKxX Yes, in any particular area where physics begins to run into barriers like single molecule atomic sized computing switches, there must be found whole other paradigms to end run the problem. And sometimes those don't exist, I expect, or at the very least will require a whole new aspect of knowledge of reality to advance further, like quantum mechanics did for us recently.
I'm no computer nerd but we could use spintronics or use a different coding system like the theoretical sloot digital coding system; making better use of the technology we have now. Still, I find analog computers very fascinating.
I know people that cannot set the clock in their car...
When I began as an undergrad 50 years ago, after experiencing numerical analysis with mechanical calculators, then on an Elliot 903, we moved on to analogue computers. As a Physicist I wondered why anybody would want to programme a digital computer to solve differential equations when it took a couple of minutes to patch the equation into the analogue box, and results came out instantly as the variables were changed - rather than punching a new paper tape, queueing to feed it in, then after dealing with error messages trying again and getting the result.
I spent two 'gap' years working on British Government work (the Official Secrets Act prevents me speaking further), and when I returned to university the analogue devices had been scrapped.
I have a permanent eBay watch for an EAI TR20 - the device on which I lost my analogue virginity.
How does one just randomly work for the government like that?
So cool.
@@benwhitehair5291 back in the days, degrees actually meant something. The higher educated were really elites and forefront of knowledge, and there aren't as many of them. Thus, they have more opportunities to work with governmental agencies that needs them.
I think, like the video points out, that when exactness and reproducibility are required, digital computers have an advantage. Also the limitations of creating accurate and precise analog components at the time is probably why digital took over almost everywhere. Part 2 will probably explain why analog might be making a comeback these days.
@@gottagoMS123 degree inflation is such a weird thing. On the one hand its amazing that the general population has easier access to the forefront of human knowledge. At the same time though, it has dumbed down a lot of it to the point I know many with "degrees" that probably dont deserve them...
I have both Fourier analysis and Laplace transform exams this semester, and didn't had any clue what I was calculating, coincidentally this vid gave me a very good visual example! Veritasium never disappoints ❤️
Laplace is everywhere, even in Control Theory in Engineering (a unit I'm seriously struggling to understand!).
What a failure of teaching!
Actually my pure math class was where I first encountered it and he was a terribke teacher... when people got confused hed say "its simple!!!!" Was diff EQ prof.
Then all the bazillions of EE and esp. Signal processing classes I later took it got to being like do you really need to repeat this for the 600th time during the first 2 weeks of the semester? Became trivial
It also is critical to quantum theory, the wave equation. Dirac equation is the truer one vs shroedinger. The one who conceived of the delta dirac function that makes so much of fourier analysis so useful i.e. FT of impulse response = frequency response... where convolution in time is multiplication in frequency basically the fundamental theorem of signal processing in LTI systems
And Z transform is the digital version of Laplace. Never really worked with Laplace in real life but Z transforms are very useful. Something like imagine the s=jw axis being rotated on itself into the unit circle of the complex Z domain where z = exp(jw)
Modern digital oscilloscopes and spectrum analyzers does Fourier analysis in real-time. My own oscilloscope (Rigol DS1054Z) makes 1G (1 000 000 000) samples per one second and its make Fourier from this on real time, and everything is done by very small CPU with ARM core - approximately 1cmx1cm (half inch x half inch) size, and uses couple wats of energy. This is huge incredibile. I know, there are exists much faster and more precise oscilloscopes, but they are very expensive.
Apart from small op-amp based reconfigurable machines, real analog computers are superfast and consume very little power. They have been used a lot by military, mainly by radars. I've been designing SAW processors with TeraFlops performance back in 80s. But the amount of second order effects, the need to intimately understand lots of various physics involved, the time required to and the plain hardship of "debugging" ie polishing of the repeatable design makes it totally unsuitable for mass production. There are not very many people in the world capable of doing the work.
From what I know the general idea between digitial and analog is: Digital is just less efficient then physicaly implementing whatever you want to simulate directly, but a lot more general. Trowing all your effort into optimising 1 digital solution is better then optimising an infinate amount of analog solutions. (The engineering that goes into lithography is just insane.)
Its not that, you need abstraction, and to build off your previous work, to actually have that be effective.
Like everyone would be writing their own code in litteral assembly, but people need to be more abstract when making more and more complex programs, and thats why you have stuff like Python being used so often now.
@@someonespotatohmm9513 like you can probably have a decent enough analogue system and then work off abstracting it, and make some version of HDL for it, but i feel like its still gonna cancer to work with.
@@someonespotatohmm9513 There's also the sampling and resolution problem. For audio signals, most radio signals, and below, that's not a huge problem anymore, but once you're dealing with higher-band radio signals, and ESPECIALLY visual signals, it gets trickier and trickier.
there's no chance it would be worth it now... you where in the era of 8 bit micros with like 16-64 KB of memory evolving to 16 bit with about 512KB to 1MB typical ram
This channel is the gold standard for educational UA-cam
I wonder what would have happened if the ship with the Antikythera device hadn't sunk... would it have inspired more technological advancement? Or would it have been melted down and forgotten, and only by it sinking was it preserved?
it was likely just one of several such devices, it is only interesting because people had forgotten and reinvented the technology.
more useful than the Rosetta Stone back then less useful now.
I think it would in fact belong to the group of "that which could've furthened human knowledge, had it not been destroyed"- e.g. Alexandria's library, or Alan Turing's life
It was definitely not a one-off, it is way too complex for that. Presumably there were at least dozens of preceding devices built over years, maybe centuries, as the creators learned and refined their craft. They were clearly rare and expensive - only toys for the rich or the ruling class. So the seeds of more technological advancement were available, but it obviously needed more than just those seeds to sprout into anything like an industrial revolution - the ancient world was just so different from western societies of the past few centuries; for one thing they had pervasive slavery so there was not much demand for machines to do work.
Ancient Greece was the most dramatic Greek tragedy. Split into science and mysticism factions, the mystics eventually won. Makes me wonder if we're headed for the same fate today in this country. Ignorant mob rule destroys the educated intelligentsia that drives progress.
We have documents of these devices being in the homes of rich Romans. It’s not a unique thing.
13:24
:"source?"
:" I dreamt about it "
I'm shocked you didn't mention Chris over at Clickspring, as far as I'm aware, he's one of the only people to have actually made a replication of the Antikythera mechanism using similar methods to what the original creator may have used. It is an absolutely fantastic series and I cannot recommend it enough
Did he end up finishing that? I haven't seen anything from him in ages!
Yea I was thinking the same.
@@150cameron In the few videos he as put out (mostly on the second chanel), he has commented a few times saying that more is coming. He's already poured thousands of hours into it both making and studying/writing about it, and he seems to be on the next phase which is going to take many more hours of work. oh and then going through all that footage and making videos for us. Absolute legend if you ask me.
@@150cameron He made some significant discoveries during his investigation of the machine which lead to him writing scientific papers instead of machining gorgeous parts. I think he figured out it is based on lunar movement, not solar, or something like that, basically upending what everyone thought the machine was for and how it worked. He's been releasing detailed videos highlighting certain previously made components for the Antikythera machine and assuring us there is more to follow.
@King Pistachion It was the first thing i thought when i saw the beginning of this video. Clickspring's Antikythera mechanism is just amazing. Loved the whole series and his research on ancient tools technology. I really hope it will be mentioned at least on this channel too.
this is a much better take on analog computing than the previous video
I'm impressed by the ingenuity of Kelvin's projects
Thank you!
He would say if he were still alive.
@@kelvinelrick807 Lmaoooo brilliant comment
Worked with a US Navy Gun Fire Control Computer for years in the mid 70's, before digital was common. They worked very well, but as explained, didn't always give the same exact answer twice in a row. Close, but not exact. That's why regular tests and calibrations were required. Since a projectile, unlike a guided missile, cannot change course in flight, having to consider ship pitch and roll, ship speed and direction, muzzle velocity, target altitude, range, direction and speed, time of flight and be able to predict where the projectile had to be to meet the target at a future position took a lot of timing motors, their associated gears and shafts, ballistics cams and their associated followers, synchros, resolvers, summing networks, indicator dials and hand crank inputs. My particular computer was about 6 ft tall, 2 ft deep and 10 ft wide. That could probably be replaced today by a computer the size of a typical tablet. Old systems like those have long been replaced by digital systems.. but the old engineering solutions were innovative and impressive.
actually all replaced by a single chip about the size of a pinhead. The apple watch probably has 1,000 times more computing power than your original mechanical computer.
Replacing innovation seems like it may be a shortcut to disaster in some ways.
Is there any resources out there which show the damage caused by big naval shells when they land and explode. Lots of videos of the guns firing. Not much of the shells landing.
@@n-da-bunka2650 Yes.. the chip can do the calculations, quickly and accurately, but to have a complete system, you need a way to input info from various sensors, a way to input information manually, a way to see the results and a way to get the results to the equipment that will actually fire the projectile.. that's why I suggested a "tablet" as the minimum size device to accomplish those tasks.. Also, even though military digital equipment is "hardened", the old analog computers were naturally immune to EMP events.
To be fair, the amount of general purpose digital computing power needed to replace a dedicated function electromechanical device is orders of magnitude more computing power than actually done with those old dedicated devices.
The design and build of electromechanical computers like the fire control computer from ww2 and immediate postwar stuff like the Salems are almost beautiful in how it all goes together and moves and does what it does how the do it. It really is something when you notice that an actual USN video meant to inform sailors on the very very basics of how they function is the length of a tv documentary, but containing zero filler, just to describe the basic movements of the cams and whatnot, is telling as to how mechanically complex they are.....
All of that working together to do something which back in 1990 couldn't be done well enough with digital stuff to justify their replacement, is a testament to how good they can be, and are like a symphony of movement that makes the most glorious of triple expansion engines envious of them.
That said, apparently Isambard Kingdom Burnel's propeller used on his 1845 SS Great Britain, which was the first screw prop ship to be a proper regular bluewater ship transiting the Atlantic is only a few percent more efficient in how well it moved water as compared to a moderrn screw prop, and that was built more with a combo of Brunel's brilliance and determination to do whatever he set his mind to, and some trial and error...... so take all of this as you will.
(Can you imagine what Brunel could design and build with the tools we have today? Remember this is the guy who designed, built, launched (after some difficulty) and operated a ship with a displacement of around 33k tons, 19k tons gross tonnage, not surpassed in gross tonnage until 1901 and in terms of volume, 6 times larger than anything else ever, back in 1853 with the SS Great Eastern)
Until the 1980’s or so, automobile automatic transmissions were actually analog computers. They ran on oil and transmission fluid and were quite complex. The shifting logic was performed by the transmission fluid. This is why fixing them back then often necessitated a visit to a transmission specialist who only worked on gearboxes. Once it became electronic (and then digital) it all changed.
However, there are automobiles where you can manually switch gears (manual transmission), which are a lot less complex, require a lot less maintanace and are drastically more ecological. Unfortunately, the vast majority of car drivers are lazy asses and learning to use manual drive (which is the standard in many countries) is a bit too difficult for them.
With manual transmissions, the driver controls the shifting of gears - both timing when to shift and also how to shift. No need for any kind of computer. With an automatic transmission, some control logic is needed. This used to be analog and later became electronic, ie digital. I love driving manual but fewer manufacturers are offering manual gearboxes these days. Electric motors produce large torque at low revs and have pretty much no need for gears so with these cars there may be no gearbox at all, and hence no associated computer. Enjoy shifting manually while you can. NB It gets worse. As cars become autonomous, you not only won’t shift gears, you won’t drive at all! No fun!
🧢
@@GP-qi1ve You say that manual transmissions are "more ecological", but overall they're not. Automatic transmissions do a better job of consistently keeping the engine in the most efficient power band, resulting in increased fuel economy and lowered emissions/CO2. While it's true that a conscientious, knowledgeable driver with a manual transmission can outperform an automatic transmission in that respect, that doesn't describe the great majority of drivers nor how most people with manual transmissions choose to drive. And I say that as someone who can drive a stick and enjoys it. Furthermore, hybrid cars are even more ecological, and manual transmissions are not even feasible on a hybrid drivetrain.
@@GP-qi1ve I learned my lesson in to NOT buck the system when my '81 manual lost an output bearing and a used tranny was 2x the cost of a used auto trans (& took 6 mos. to find 150 miles away) and just a new bearing/ output shaft was 2X the cost of the whole used tranny (auto trans. was 1/4 cost than for new parts). Supply & demand works for the lazy asses (I got lazy fast).
Wow these guys were so smart. Trailblazers
My favourite cross-over between the analogue and digital worlds is the old cellular technology CDMA, it blew my mind when we learned about it at uni.
Basically it allows all mobile phones to communicate at once, without interfering with eachother because it uses the interference itself to encode the signal. Each device is assigned a unique digital code, the device then XORs the data to be transmitted with the code and those bits are transmitted through the air, which interferes with other devices communicating on the same channel. To decode the data for a specific transmitting device, the receiving base station multiplies the transmitter's unique code with the raw signal, and out pops the transmitted data. Constructive interference produces a signal with levels above and below 1 and 0, but when multiplied with the transmitter's code, any values over 1 are interpreted as a 1, and any values below 0 are interpreted as a 0.
Okay that is genuinely impressive and some damn cool and clever technology. It also explains to me why CDMA tends to be used for longer distances or in noisier environments than most other cellular data technologies.
Thank you for sharing! :D
Is this technique based on some kind of orthogonality relation?
Sorta reminds me of how WiFi works.
@@leonsteffens7015 Yeah, basically each device has a code that is pairwise orthogonal with every other device's code so the inner product produces the component for that device only
The Antikythera mechanism and the so called ‘Hero’s engine’ make me think of a quote that read:
“If ancient civilisations had any idea of how much potential their technologies held, we would already be exploring the neighbouring stars” (Arthur C. Clarke)
Why obnoxious creator of this channel thinks that his audience have not heard of things and he the only one introduce us to it ???
@@MrUssy101 Woah, easy there, easy. You've been out cold for a couple of days now. Why don't you just relax a second, get your bearings?
@@MrUssy101 because the Entire point of his channel is to introduce people to things they might not have heard of...
It just wasn’t meant to be. Too many variables
@@MrUssy101 your English is terrible. Perhaps you should go find some videos about that instead.
I can't believe I'd never heard of Kelvin's integrator before. So elegant! I almost thought it wasn't real, that they'd never built one - and then you showed a picture. Just amazing! I'd imagine the main problem would be getting the bounds right so that you never hit the limits and ended up clipping the values.
Yeah, I was thinking about that. The mechanism behind the integrator is basically a CVT or continuously variable transmission. You could perhaps account for that by swapping out the ball for one with a smaller radius. This way, having it move farther from the center would yield a much greater angular velocity, and thus upper and lower limit.
Okay, I really like the way this guy does sponsored content. Instead of slapping it in the middle where it interrupts content, he puts it at the end! That's a very respectable thing to do!
And for my opinions on the rest, it's really interesting seeing how mechanical computers work! I've always had a fascination with stuff like clockwork, and what is a clock but a basic mechanical computer? Thank you for this video!
What a fascinating vid! I worked at IBM in the 70s/80s, and there was always some analog computing going on still, inside the lab, with all sorts of strange tools. Oscilloscopes hooked to sensors measured all sorts of things I did not understand at the time, but I puzzled through the operation of our cathode ray scopes and figured out how to troubleshoot the operation of the first (or second) Robotic Arm, used then to pump out disks and diskettes by the boatload. What a wild time!
As someone who's OBSESSED with both computers and mathematics, I can't believe I never knew like, 80% of this before. My area of study for computational devices in college was always like, digital-style computers from the Lovelace era and onward, so those tide-predicting devices were something never brought up! How fascinating!
What do you do for work?
@@claye1205 I'm a manager nowadays, but I also do technical consultant work for a specific software
"Brought up"? So, your learning is based on the whims of others. Maybe you should actually learm how to learn. Don't just sit there waiting for service.
@@mikemondano3624 This seems like one of those "i didn't even know of it's existence in which to research further to understand it's function". Don't fault him/her for not knowing something, isn't that why we are here?
@@mikemondano3624 What a smug, condescending attitude. Bet you're very smart and have lots of friends.
One of the craziest things I ever did in my electrical engineering classes was building an FM radio receiver.
The professor was describing how FM radio worked and what we would need to build to process it. I raised my hand and asked
"Wait, so you need to take the derivative of the FM signal to get the original sounds, right?"
"Yes, that's right."
"So this circuit... does calculus on the input?"
"Uh, you could say it that way."
A person can differentiate in minutes. A digital computer can differentiate in milliseconds. Analog electronics can differentiate at the speed of light.
How is analog
Faster?
@@billballinger5622 The output of an analog computer happens at the same time as it receives the inputs. There's no "calculation" time really. A simple example is a rope on a pulley that lifts a weight. When you pull down on the rope, the weight goes up at the same rate you pull. Near-Instantaneous input/output.
@@billballinger5622 Processors process using semiconductors and logical gates, so they have processing time. Analog signals simply are transformed from input to output due to interactions between matter and energy
@@eaglekepr It doesn’t technically happen instantaneously. It happens at the rate of sound through the object that is moving.
@@eaglekepr "There's no "calculation" time really."
Analog circuits don't respond instantly either. They have capacitance (resists change in voltage) and inductance (resists change in current). Which is faster, analog or digital, is not a question that can be answered in generality. It depends on the exact details of the implementation, the specific operation you're trying to run (certain things are a better fit for one computational model or another), and how much inaccuracy you're willing to tolerate.
This is why I love mechanical analog machines. With the simple movement of hundreds of gears, pinions, discs, balljoints, and multiple other hard components, a machine is made to do one task incredibly accurately. Sure, in an advanced scale, these machines take up huge room while an SSD fits on a thumbnail, but this is all visual. With some understanding of mechanisms, you can see exactly how the machine functions, and with some understanding of mathematics, understand what the machine's job is.
One of my hobbies is making music with synthesizers. It's a golden age for both types: analog and digital. Often they are both, having digital control of analog sound generation and filtering. The unpredictability of the analog side is considered to be "musical".
in the 1970's, I worked with analog equipment whose main component was the op-amp. It was amazing with it's calculating ability and I thought at the time it would revolutionize electronics. However, digital came out and simplified understanding the circuits for the common man.
This video gave me chills, each story was awe inspiring, shows much of the current technology we are taking and using for granted are the result of handful of extremely dedicated, hardworking, usefully creative people and an unknown Force which is making it happen.
And as always the transitions in the video were smooth.
Thanks for a great vid! I always think of it as : Analog tells you how much. Digital tells you how many.
Just discovered you, and absolutely loving what I’ve seen so far. The discussions they inspire in the comments are pretty entertaining too. Please please please consider writing books for schools, and/or making videos specifically for school curriculums. You’d make classes infinitely more engaging, and possibly inspire some future careers.
Seriously. Your content fascinates all levels of expertise and knowledge - I was not expecting quite such a mix. That is a real talent. I’d love to see kids actually enjoying classes, and more importantly, they would too.
Why not bring the videos to the classes!? ^_^
While I heard of Analog computers in the past, I was so amazed by an analog Integrater.
As long as we need to solve a specific problem, Analog works well. Though in digital the same machine can be used to solve various problems with the simple change in variables or expressions.
I wonder if an analog machine could be made with similar capability for a set of problems? Obviously there are limits to what analog machine could do but I'm sure some of the variable things could be sorted out. What purpose this would have other than a thought problem... I have no idea.
At their peak, analog computers were programmable. They had started using electric versions of the old gears (as seen in this video). These fundamental components could then be assembled and reassembled into different analog computing circuits. This was done by physically moving patch cables to wire them in different assemblies---programming!
So, assuming that modern technology enables doing this without moving a bunch of wires by hand, programmable analog computers should be quite feasible. (I'm pretty confident they exist, but I haven't followed this subject, myself.)
@@chrismiddleton398 reprogrammable circuits! I knew i had seen something similar to this principle, but I couldn't point my finger to what
As far as i know the idea is a circuit that can rewire itself to implement a specific computation, as long as the machine is constant then the result is close to the reality
I remember learning about these analog computers back in highschool. It's always fascinated me that such devices have the same functions as our digital computers today. Amazing
This subject is wrinkling my brain. The intelligence and creativity of some of these folks is incredible.
Analogue machines in the old style were objects of great beauty, and in many cases the result of superb craftsmanship. They were however very expensive, and had to be lovingly maintained, since they were prone to malfunction due to the build up of dirt. Inevitably too they gave inaccurate results. A secondary effect was cumulative error as a result of rounding errors in a lengthy calculation.
On a personal level I used a slide rule for many years in my early career as a design engineer, but was blown away by my first electronic calculator - a Texas Instruments machine which by modern standards would be considered pathetic. That was in the 1970's.
I shall look forward to viewing a further piece on a possible comeback for analogue computers, but am sceptical that they can ever overcome the problems inherent in this kind of machine.
Wow, awesome story!
One problem using any kind of Mechanical Gears in any Machine, Is Gear Lash. Also the the Gear Lash gets Larger the longer it is used. That’s the biggest problem with Mechanical Components is Wear and Tear, and the inherent inaccuracies it produces. Even in the Guns that are Controlled by Analog Computers, it still takes Gears to Rotate, set the Elevation, or any kind of movement. Over Time they Wear. So then what used to be reasonably Accurate becomes Inaccurate.
@@joeybobbie1 I've never personally heard the term ' Gear Lash ', but have heard of ' backlash '. As I recall it refers to either lost motion along the gear train, OR torque multiplication if the train is driven from the output end, which can strip the whole train in a severe case.
Parenthetically an old friend of mine, an inveterate collector of good things, had purchased several gear train components on the US military surplus market. These objects were things of both beauty and interest, crying out to be re-used in some newly conceived mechanism.
@@joeybobbie1 What do you mean with WEAR? thanks.
@@t.me_s_petizioni_2220 In English, "wear" is sometimes used as a synonym for "use". So "wear and tear" means "use and degradation", to describe an object breaking down over time.