I was in the U.S. Navy in the time frame than these computers were being designed, and was aboard the first of the Navy's ships to be equipped with solid state computers, the Naval Tactical Data System (NTDS). Like the four different Apollo computers you describe, the several components of the earliest NTDS all used different types of logic, different word structures and and even different logic voltages. As usual at that time the choice was up to the manufacturer with only the intercommunications between devices dictated by the Navy team in charge of the system. There were some strangely named gates used and both positive and negative logic employed. As a technician working with the whole system as installed, it was sometimes a real challenge to follow a chain of commands that were not acting correctly on the display consoles. As with the early PC's reboots were a common necessity. We had two main computers for redundancy, though there were a few single points of failure in the system. Punched paper tape could pile up on the computer room floor trying to troubleshoot. The craziest comment I heard from one of the industrial designers who came aboard to see how this, essentially prototype, system was working, was when he was shown the reverse writing on plexiglass plotting boards that would have to be done manually when the computers were not working. "No, No! he said. Our system is supposed to replace all that!" Thanks for the comprehensive overview.
5:22 -"If they'd let a hacker anywhere near the thing"- *If they take their historical preservation responsibility seriously.* Code is not some throw away element of the past. It is highly refined prose that hundreds of people spent thousands of hours refining. It deserves just as much attention as the hardware if not more. I suck at it, but still... I can appreciate it.
Hear, hear! :) Without the code, the computer is meaningless - it is just a hunk of metal and silicon. Its actual historical function can only be understood by reference to the code which made it perform that function. If that code is locked up inside the hardware, it is "critically endangered" - a mishap could destroy it forever. Every copy of it held externally to the hardware is a vital historical "insurance policy". Arguably, the code is even more important than the hardware: should the hardware be lost, the code would still provide deep insight into the inner workings of the launch vehicle and, indeed, the collective thought processes of the design team and it could still be run under emulation or even on a replica of the hardware if schematics are available or were to come to light in future. Were the code to be lost, any hypothetical replica hardware would still be of little value.
I'm a former NASA contractor and lived NASA until my retirement in 2005. I've been waiting decades for someone to explain the entire Apollo Spacecraft Flight Computers in a manner where everyone watching would understand how they worked. With this excellent presentation, everyone can see where we were back then with our imagination and desire to reach the moon in less than 10 years as a National Goal as set by President Kennedy. Thank you!!!
I discovered and downloaded NASA Technical Note NASA TN D-5869. “Description and performance of the Saturn launch vehicle’s navigation, guidance and control system”. The calculations they did in that digital computer are amazing in number and function!
@@calmvolatility2787, To dissect the joke. The code was called "Corona". To prevent spread of misinformation about of the current pandemic, youtube algorithms try to suppress every mentioning of the sickness in any context. In effect every mention of "corona" would get you "demonetized".
One of my favorite facts about the AGS is that they implemented their read-only memory by omitting the Y addressing wire through the cores holding 0. This made it so that during writeback, these cores were guaranteed to not flip. However, it did mean that a loss of power could leave you with your hard-wired cores reading the wrong thing. To correct this, the first thing the software does when it boots up is "prime" the hardwired cores by writing to every single location.
@@fabiosemino2214 They're much harder to come by, from what I can tell. I currently only know the location of a single one in a private collection outside of museums. And so far we haven't been able to locate any schematics at all for it, which would make things quite a bit harder than we had with the AGC.
@@mikestewart8928 Drop a line to Curious Marc on UA-cam. They have a full set of schematics for the AGC and multiple dumps of the LM code from different versions.
@@fabiosemino2214 yea, but I only hope that when he restores / rebuilds one of those he swaps out all the old paper and electrolytic capacitors which they dind't do with the Teletyper powersupply restore, instead they tried to reform them 🙄That's just asking for trouble 😨
I think that I just earned my geek degree. I actually understood everything that Scott talked about. It helped that I also got to view the core memory last month at the US Space & Rocket Center.
Hi! New subscriber here. It is absolutely necessary to mention the impressive hardware and revolutionary software that supported the Apollo program from the ground back in the Real Time Computer Complex! The RTOS (extended IBM OS/360) was a marvel. The whole thing did things that we take for granted today in modern operating systems but where unheard of back then.
Unfortunately, NASM's LVDC was never loaded with code; I was in contact with them regarding dumping it many years ago and they were open to the idea, but when we did further research to determine what may be on it, we found it was only used for testing.
+3 massive IBM 360 mainframe computers on the ground (rtcc) doing the heavy work. Which are almost always completely ignored (especially in silly comparisations of "computing power of a Apollo mission). A modern smartphone is stilll much better. But things like "a digital clock is better" are just bullshit. Not to mention 3 rather good biological computers on board. And a LOT more on the ground and pre-mission. Also some people find it absolutely incomprehensible that you can do serious math with pen and paper or a slideruler... Sir Isaac does not approve.
@@shadow7037932 It's a function of computing power and time. You can do incredibly complicated calculations on paper. It just takes long. And people in ye olde days were used to that. I never used a slide ruler but calculation aids in various forms are known for millenia. The main work like the trajectories were calculated, checked, tested, simulated and so on months in advance. Often by a (now) kinda famous group of woman which were called "computers". Based on Newtons laws and rules, calculated (published) ironically in 1686. Centuries before the first x86 CPU.
@@shadow7037932 The main human computer didn't have beard, gray or otherwise. She was Katherine Johnson. An African American woman. and oooh boy yeah the original mathematician team was all male.... She was just so good at her job that even in the segregated and male dominated field of the time, she was the one flight crews trusted most. To the point that flight crew astronauts asked specifically, that Johnson be the one to be responsible for calculating and verifying the orbits and maneuvers.
@@aritakalo8011 I think that is a bit of a Gene Kranz thing (...who some think was the only flight director). She wasn't the only woman working on that. Shes just the one most featured in books and documentary. www.history.com/news/human-computers-women-at-nasa
I could imagine building a computer from scratch today would be so hard to do! Gotta give so much respect for the work they put in to create these, which we take for granted every day.
It's actually surprisingly simple because these days you can assume to have access to logic gates that just work "out of the box" - and then all you need to worry about is just the pure binary logic. And designing a CPU using just logic gates is, although not trivial, not incredibly difficult either, it's something many students and hobbyists alike are able to and often do just for kicks, or as personal projects. I've designed a 16-bit CPU from bare-bones, starting with just a NAND-logic gates (which itself is just made out of either 2 or 4 transistors, depending on the semiconductor technology used), and it's actually quite surprising how little it takes to have a fully functional CPU that can do all the common mathematical operators and access RAM & ROM, etc. Back when these guys were doing it though? An absolute nightmare to achieve the same.
Oh, yeah. This takes me back to my Mk. 152 Univac Fire Control Computer in the Navy. 16 bit Simplex, and 32 bit Duplex words. Core memory. Huge power supplies. Chips? What are those fancy new thingy's? Machine code programming. Grace Hopper and her Micro-secant.
What exactly did it do? Did you ever use it in serious combat? Did you like it? What was the worst thing about it? Tell us more before you die and the ancient computers with you :(
@@G4m3G3ni3 The fire control computer interfaced the radar, launchers, and tactical data system together and also gave the missiles rudimentary (compared to today,) guidance orders. The beauty of core memory is they do not require power to retain information. You could shut the power off at any time, come back in six months power up, hit go and the program would restart at the next instruction. But, it is slow, power intensive, and expensive. And huge space wise. The Data converter unit was it's own refrigerator size enclosure, for 16 data channels. Data passed through dozens of 90 conductor armored cables, relay switch boards, and electromechanical switches. The computer on my desk is thousands of time more powerful, compact, and reliable. It also runs on a fraction of the power, and waste heat is still it's biggest enemy. I joke that the biggest 'upgrade' the machine got was when they glued a "SPERRY" tag on the thing when Sperry bought Univac. But the programming was entirely in straight machine code in Octal format, and directly accessible from the front panel. The Terrier missile system used 32K of core ram, half of full capacity, but double what the Tarter system used. There was also a Teletype machine to access sumo sub-programs, and input instruction for training and testing purposes. Maintenance of that was nightmarish! Hope that satisfies your curiosity.
@@kevinbendall9119 I hope you got the paperwork declassifying all that given how the US Navy famously prolongs the classified status of some technology.
13:16 well, we still can't agree on that. While the word size is either 8, 16, 32 or 64 the instruction sizes still don't align. x86_64 have 8 bit up to ridiculously 120 bit for instructions. While integers might be 8 bit, 16 bit ,32 bit or 64 bit. We have two different types of Endianness - one used on ethernet connections, the other one used for basically everything else. We have several different architectures, which use 64 bit and 32 bit integers while using either 32, 48 or 64 bit memory addresses. Our filesystems are sometimes 64 bit, while most USB sticks still cant accept more than 4 GB files, since the filesystem is 32 bit. So yeah.
I don't think the mentioned problems are real/relevant in the modern computing, everything is using 64 bit, even phones. As for pendrives - it's just the stupidity of defaulting to FAT instead of formatting everything in NTFS by default (licensing reasons... ?). That is, unless you're targeting some semi-custom platform like a weirdo router that uses some CPU architecture forgotten by time.
A diverse ecosystem is valuable. That’s why Linux supports something like two dozen different major processor architectures-more than any other OS in history. And it also supports a wide variety of filesystems, not just NTFS.
There was a major schism at the time between 36 and 32 bits; 36-bit octal machines such as the PDP-10 were popular in research circles because a single word gave you ten decimal digits of precision. Xerox PARC, for example, wanted a PDP-10 so badly that when Xerox management denied them permission to buy one, they built their own clone of it out of components and called it the Xerox MAXC. The 32-bit machines, however, were cheaper and IBM-sponsored, so they were more popular with management types and therefore more successful in the market. And that's why everything today is a 32 or 64-bit hexadecimal machine, while 18 and 36-bit octal machines are forgotten. The worst enemy of a superior solution is a cheaper inferior that is just good enough...
LVDC - Launch Vehicle Digital Computer for the Saturn V made at IBM Owego, New York. where Dad later worked on Space Shuttle Flight Computer Boards. During Apollo and Skylab Dad communicated to the LVDC via the DDAS Station in the the different Firing Rooms of the Launch Control Center. On one occasion Dad and a security guard had to go to the top of the Launch Umbilical Tower to change a "card", one bit in the DDAS Bus, that then went across the swing arm to connect to the Instrument Unit in the Third Stage. Keep up the good job Scott.
I'm afraid this video has gone completely over my head. Even with more than one viewing I only understand bits of pieces of what you're saying - but I still really enjoyed it! It's very relaxing to listen to someone who really knows what they're talking about as they take you through a subject you _almost_ understand. It really focuses the mind on 1960's computers instead of present day problems. So thank you for another great video, Scott Manley! The research you do on these videos is incredible.
Well done Scott. I remember reading a very in depth technical article about the guy who resurrected the AGC code directly from the AGC computer at the Smithsonian. That and documentation he found and one of the engineers who had worked on the thing. The culmination is essentially a digital archive of the actual code and documentation for that code. He then went on to build an "emulator" using a raspberry pi I think, and fabricated an exact replica of the DSKY interface and display unit. The whole thing works and he was able to simulate trajectory corrections, LEM landings and launches, Insertion burns, etc. Pretty amazing feat of "digital archeology." Thank Scott for your awesome content!
I am not a "computer nerd" so I have a really hard time understanding a lot of what Scott is saying in these videos about the Apollo Missions' Computer Systems. However, that does not stop me from attempting to learn and understand as much as I can about these systems. I am just fascinated by them and I really appreciate Scott bringing to us these amazing, yet very technical, stories and how these computers were made, how they worked and how they were used to get Humans to the Moon and safely back to Earth. The Core ROPE Memory is the most fascinating of them all to me because the way it was made required a lot of time, dexterity, nimbleness and skill.
Actually by the time of the manned missions making core rope was mostly automated. The machine Scott showed in this video moved the core rope to the correct place, the lady (they were all ladies) just had to push the wire through, then the machine moved to the next place etc.
They were at an auction to get more parts, but couldn't because of course some rich f*ck bought them because of the 50 yo anniversary. Now they will rot in some garbage home of poor garbage taste. I hate these f*cks with a passion. This is why channels like CuriousMarc should be supported, so that they have the money to outbid these rotten, useless pigs. One shinning beacon of this method is Steve1989, who reviews military rations from every era and age, along with military kits and so on. Some of the things he shows, are not even recorded. It took me months to dig out some information on some of the things he showed as physical items. Those rations (insert here, anything of historical value) would've rotten in some dumbass' collection, never to be seen again, were it not for him.
aserta in fairness Marc got access to the AGC from a rich benefactor and was able to keep the code after the restoration. Not all collectors want to lock everything up. But I get your point none the less.
@@dmacpher It wasn't a rich benefactor. It was a guy who bought a pile of Apollo scrap for cheap. He became rich decades after, by selling the perfectly working computer.
As a boy of 10, in July of 1976 my father took our family to Disney for the big 200 year anniversary of our country. After the awesome fireworks show @ Disney World, my father took us to the KSC for a tour of that magnificent place where the moon shots originated. I remember the Guidance ring distinctly as we were encouraged to "touch and feel" history. The sights and smells of the control room, (yes everyone smoked back then and the room smelled like a bar and the paint was all dingy from the smoke as well) I'll never forget going into the assembly building and being awestruck by the sheer size of it!!! (at that time they were transitioning over to the shuttle program and they hadn't gotten to the refit of that building yet which allowed the public a peek inside) Growing up on the gulf side of Fl I wasn't able to witness the Apollo launches, (too young but mom did take us out to the end of our drive so we could watch Apollo 15 (night launch) head off for the heavens), but I did drive over with friends to watch the shuttle launch around 1983, ish... Sorry for the ramble Mr Manly, but I do enjoy your space videos very much!!! :-)
As a fellow computer programming nerd, I found this video so fascinating. I hope you do more in the future on how rockets and other space vehicles utilize programming to complete their missions.
9:11 There was one hardware difference between the CM AGC and the LM AGC: in the Command Module, there were two separate DSKYs. The second one was located in the space behind the main control panel, where the navigator stood while taking sightings through the telescope (and also I think where the crew had to go through to get to the docking hatch).
Thank you Scott for this great video. I am a big fan of the documentation "moon machines" where they also have an episode covering the computers. But not nearly as detailed as you did! Awesome! I wish people nowadays would appreciate much more that nothing on their smartphone would exist without those geniouses who built the first computers!
Alessandro Bianchi by an enormous factor ! The orbit calculations were done on the ground and radioed up. KSP does those as you play and does hundreds of geometric calculations per pixel each second, on top of it.
Computing power isn't everything. It's how you apply it. Here's a simple example. I've a top of the line laptop for work. It has to have as good a computing/computation power i can get, to give me the correct architectural solutions IN a portable format as fast as possible. If i need to know if a beam isn't ok, then i need it fast. THAT is great, it's awesome, and it made my life much easier. But this laptop, for the life of it, cannot interface with the CNC machine that makes the custom fit beams we use in restoring old buildings. It needs another "computer" to run the stepper motors, read the scales, so on. And that's just a modern concept, back then they had to deal with a heck of a lot more things, like vacuum, cooling, heating, fire hazards, robustness. IMO, the computers that were put to work were the best they could be at the job they had to do, after all, you can't run a rocket with a modern laptop, anymore than you can use one of those old computers to run a computational program...but you could make a CNC out of one, and it would be the best CNC money could buy, because precision was adamant.
Except that the Saturn control computer and the ACCs actualy controls the vehicles. Almost anything developed since has more computing capacity and more memory but lacks the I O hardware needed to do anything.
as always your presentations are fascinating. While I knew some of this (having been an avid follower of US space missions as a child) you have once again given us a comprehensive review of all the pertinent elements. appreciate it. makes our shelter in place easier to deal with.
Nicely, Done, Sir. 16-bit, 15-bit, 18-bit. I grew up on a PDP-1 with core memory and 16-bit words with 1 parity bit. Since it was DEC all op coeds were in octal. Great times, include the original Space War code writtien 4k memory.
The AGC NOR gate chips use DTL, not TTL as Scott stated. Furthermore they used the DTL aspect to wire OR signals in some places, so you cannot build a direct copy using TTL chips since they can't do that.
Every single time I discart the "build myself an AGC project" with Arduino or even better an old school 8bit 80's computer or something like that I see a video like this and... holyyy c**** ..... I NEED IT!! ;) Thanks Scott!!
Notice how the Lunar Landing Module maneuvers in outer space were more akin to the movement of the repair pods in "2001:A Space Odyssey" than the space ships in "Star Wars". That underscores Stanley Kubrick's attention to detail.
The OP is talking about how spaceships maneouvre in vacuum versus how aircraft fly in an atmosphere. The “Star Wars” fighters banked and turned like aircraft, not spaceships.
@@romhackstashbox1275 or you are conspiratard, and probably a flat earther too. Get lost. Go read the bible or something and leave the rocket science to the adults.
I know this is a late comment, but perhaps you could give a link/shout out to Curious Mark’s UA-cam channel: his crew painfully documented the acquisition, disassembly, operational checks, code verification and many other activities to basically start from scratch to resurrect the AGC. It is an incredible work of love and time they put into that project, culminating with the simulated Apollo 11 landing on the moon, complete with 1202 error codes. Riveting videos, one short screen I saw on this presentation of yours. Their work combined with yours is a great source of “how it was done” in early manned space flight. Thanks for all of your work, Scott!
Brilliant video Scott...even for us whom have to look up some of the terms...operand?...have watched it twice...keep 'em coming mate..REALLY helps with lockdown in south east London lalala🙃
Last week I watched my Blu-ray disc, Apollo 11. It had been a while since last viewing. That is one heck of a good documentary. The large format film used was a very wise choice back them.
Thank you Scott this was a very interesting video. What really amazes me is how a technology that was wasn't even barely barely understood was used to send humans to the moon and back safely and then of course with Apollo 13 you know to be able to operate on the fly. And to what really amazes me is the fact that our cell phones have more technology and more computer operations and then hit the computers that actually sent Apollo to the Moon.
I was always fascinated by that classic footage of "the ring" being jettisoned during an Apollo mission, so I learned about it! When I finally saw an actual Instrument Unit on display at Kennedy I took about 10,000 photos and really felt like things had come full...circle? LOL One vote for the LVDC here! :)
The lovely interstage separation (skirt sep) shots. Ahhhhhh. As Scott said, that was the skirt surrounding the S-IIs 5 J2 engines. The IU (instrument unit) is where the LVDC and the analog computer were. That sat atop the S-IVB, and stayed with it until it crashed into the moon, or was sent on a wild ride around the sun, as in the case of Apollo 12. The IU had to stay attached, to still control the trajectory and transmit data. In the case of Apollo 13, after the explosion, the IU was causing interference in communication with the LM, because it was essentially hijacking frequencies from the weaker Omnidirectional antennas that were used. They had to push the S-IVB out of the way a bit. Sorry, this turned into a long thing lol
Fun trivia: magnetic-core memory is why the process of (or the file created by) reading out the full contents of a computer's memory is still sometimes called a "core dump", even when the computer uses semiconductor RAM with nary a core in sight.
Excellent explanation, thank you. The Saturn V "analog computer" was the optimal approach for a time when fast A/D and D/A converters were yet to be invented (I spent the good part of 20 yrs of my design career with analog computers). I like the final hyperbole: "when the computer designer couldn't agree on the number of bit in a word". How true. I guess we should say thanks to Mr. Shannon for the eight bit in a byte. Edit: I didn't know Armstrong gimbal-locked the half-LEM before docking. I remember, though, some astronaut asking for a fourth axis, and never understood why MIT didn't allow for it. Gimbal lock is a totally man-made system failure, IMO.
A fourth axis would have added weight. It was a compromise, save weight but have this scenario you need to avoid. Gemini capsules had four axes, but then they didn't care about weight so much.
An interesting fact about core memory is that it couldn't be read without being erased. Reading is done by trying to set the magnetic polarity of a core, and if it then flips from the other polarity, that generates a pulse. Edit: here's a video about it: ua-cam.com/video/p7SkE5pERtA/v-deo.html
"Everything rises and falls on leadership." -John Maxwell If the organization get its people right, the right people in the right roles using the right skills and tools will handle all the technical details.
So much careful design went into it. The two absolutely critical engines that, if they didn’t fire, the men would be left marooned in space or on the Moon-the Service Module and the LM Ascent stage-were hypergolic. That means their ignition only depended on simple on/off valves working properly, not on more complicated throttles, pumps etc. The initial trajectory to the Moon was “free return”, which meant that if nothing further was done, they would simply come back to Earth again. This is what happened with Apollo 13, where the main issue was figuring out how to last the time it took for the trip back.
A little known Apollo trivia tidbit: Raytheon's Phalanx 20mm Close In Weapons used ubiquitously on nearly every US warship, the Block 0 (about 1980) and Block 1 (1988) versions all used a derivative computer that was nearly identical to the AGC right down to the magnetic cores and "rope" memory. The Navy needed something absolutely rock solid reliable and simple enough to withstand nuclear EMP yet be compact enough to stuff into a large can on top of a Vulcan rotary gun. The irony was from Apollo the Phalanx was born...
From a technical standpoint, the integration of the chips in the LVDC was way more of a advancement than the way the AGC was built. Modern computers are way more directly related to the LVDC than to the AGC.
At 5:07 there is a comparison in which the AGC performed an operation in 168 microseconds with 28 bit precision whereas IBM's LVDC required 246 microseconds with only 25 bit precision. Was IBM's position that the extra performance of the AGC was unnecessary to accomplish the mission? The analogy being that a road trip can be successfully completed in a Trabant as the capabilities of a better car wouldn't affect the completion of the road trip. At 5:15 an explanation of this program would interest me if no one else. Surely this one page is not the entire program? Rope memory? Was this similar in any way to the iron toroids in the IBM LVDC memory?
@@HuntingTarg Once they got the computer-radio interface running uploading an approximate state from Houston should be doable. But star realignment was already something they did multiple times on each trip.
@@absalomdraconis I was gonna suggest he bring help. But my comment was already becoming rather lengthy. If nothing else I'd like him to be there to document the work. Furthermore, it would be in the museums interest to facilitate such an operation. They would be curating the original code.
Byte and word lengths were a mess. It felt like I was having to adjust to a different system for every architecture. Things really settled down in the early 80s when personal computing was becoming common if not yet ubiquitous. 8-bit bytes and 16-bit words were all the rage. As welcome as that was, I was more relieved when I stopped having to worry about EBCDIC (or Baudot) popping up when we finally settled on ASCII as our de facto character set of choice for most purposes.
lol. Because I still mess with old microcomputers I still run into things every so often... Like, you wouldn't think so from a modern point of view, but early 80's keyboards can be surprisingly annoying thanks to the missing buttons. Like, on my desk here is an Atari 800XL. Has no equivalent to the 'alt' key on a modern keyboard (though it does have a dedicated key for 'inverted' characters, which on older revisions of the hardware was an 'atari' key.) Control is in a weird location, there's no function keys, arrow keys are bound as secondary functions to something else... Oh, and of course the standard software for entering programs assumes 'overwrite' is permanently active (and there's no insertion mode the way a modern system would tend to default to. To insert characters requires pressing the insert key repeatedly then typing new characters in the space opened up) For good measure, this system's text is encoded in something called 'atascii' (atari character set) Which is similar to ASCII but nonetheless quite incompatible with it anyway. while we're at it, due to the nature of the machine, you're often working with it at a low level. And... Well, not only is the official character set ATASCII which is it's own thing. But the internal numbering of the standard font doesn't even align with the ATASCII code (the OS/BASIC ROM internally translates), and the keyboard scan codes are a 3rd arrangement unrelated to the other two. Because, hey, who needs consistency right?
@@KuraIthys I used to do some programming on the original Atari 800 (in 6502 assembly, of course) and I agree. Trying to translate programs from other computers was always oh-so-much fun. Although, having the complete OS source code was very helpful in circumventing a lot of silliness. It's just what life was like back then. I'd be bouncing between various UNIVACs, DEC Vaxes and PDPs, HP minis, a plethora of 6502, 8080, Z80, 8086 and 68000 PCs all using different monitors and OSs and all in their particular machine/assembly languages. Not to mention the dozen or so "common" higher level languages for various purposes. Without a doubt, one of the funnest times of my life. If you can be on the bleeding edge of a paradigm shift, ride it as long as you're able.
Great video explaining how this worked. If they had a cellphone cradle my phone would have been a great backup system. But the phone would have lost signal like driving into a tunnel. Ahh the good ol days, those guys were brilliant using what was available then. Wow
5:19 The code on the lvdc is probably still technically classified as top secret (And the same algorithms used are probably running on other hardware right now). So you could understand how the ability to guide a device from one point on earth to a distant point on a different, or more importantly the same planet could be considered classified. And publication of the binaries or source code would be in breach of the current weapons export controls. But the reality is that smart people could fully reimplement the same functionality probably slightly less efficient, using more computing power. But the maths involved in all aspects required to implement it is far more accessible now (online) than it would have been half a century ago. I was going to say that it is not rocket science, but it is :)
I remember a TV interview with a programmer responsible for the Apollo moonlanders. He was asked how they managed to get their programs 'bug free'. He answered that they never did! He estimated that there were probably thousands of programming errors remaining in the software which was actually used by the spacecraft.... Yeah, reminds me of the days I spent designing machine code programs for the Motorola 6502 in a Commodore Pet in order to create a system supplying methadon to junkies in the Netherlands....
Speaking solely for myself...dayum! 👍 I knew about the LVDC from a friend who's an IBM bigot, didn't know that the LVDC source was lost to history. I've spent some time reading the online copies of the AGC source; from the standpoint of a former code geek that's just schweet. Well documented, a pleasure to read. Thanks for the look back in time... 👍
Those electroluminescent displays were also not trivial pieces of engineering. All this was actually not long ago, I am in my late forties and I have had core memory in my grubby hands. It used to fill 19 inch racks here on earth. And flashing your dongle in public would buy you a ticket to a stay in club fed.
I still have a layer of CDC Cyber core memory! When the University of New South Wales decommissioned their Cyber 72-26 it's parts were available as souvenirs. As I had been one of the first operators, I got a few bits. Somewhere there may be the logic gate I snagged as well? The 3 foot diameter 808 drive platters made amusing tables for a while, but were too unwieldy to remain in use for long.
Here I am studying flip-flops and counters as if they’re just things we dig out of the ground and stick in computers. I had 8 weeks of analog before getting into digital but still, learning about all this stuff in the physical component level is amazing me every day.
Awesome video. I love hearing about all the crazy competing design concepts used in computers back then. Today everything is basically Intel or ARM. Back then there were different ideas and concepts and a willingness to explore. Fascinating!!!
Actually, there are quite a few architectures in common use. Your wi-fi router is probably running a MIPS processor. SiFive is gathering momentum. And if you look at the world’s fastest supercomputers, you’ll see a few POWER machines near the top. What do these processors have in common? They all run Linux.
@@lawrencedoliveiro9104 Lets not forget RISC-V, the great hope for an open hardware future! Much more common, if humble, is the processor inside all those Arduinos!
Yes Mike, some were fascinating. But some were just frustrating. Have a look at the architecture of the CDC Cyber range. For no particularly sensible reason that I ever heard about, Seymour Cray cursed them with a very limited address size, although the data word length was 60-bits - arranged as octal. They were a dreadfully limited thing to try to use as a more general-purpose computer, or even with really big, complex programs. Then Cray went off to create his dream machine... declaring memory size to be the really big advantage of the Cray-1! I remember reading the article on the notice board as we awaited the arrival of a CDC Cyber 72, and wondering just how this would turn out!
Good god, I just went down the rabbit hole of core rope memory because it's the first time I ever heard that term... Man, what a glorious piece of ingenuity but I gotta say it's probably the most hacky thing I've ever seen. Can't believe something like that got us to the moon.
@@seannot-telling9806 That voltage is still in the range of Computers today....l am 77 years old now....But l still build a Computer now and then....Cheers my friend...!
Wow, I didn't know that they had a backup strapdown IMU. That's very smart, no gimbal lock without gimbals. I was under the impression that strapdown units only really became a thing decades later.
An interesting fact about the LVDC, Scott, is while the blueprints for its' hardware are available the software themselves have been lost (Who wrote the programmes is not known any longer either) however they could be extracted electronically from the LVDCs in the IUs of the two surviving Saturn 5s. However there available online several NASA technical reports which detail the algorithms and equations that formed the basis of the LVDC programmes.
One comment, the LVDC was inherited from the Gemini guidance computer. The divisions used to overheat it so the "developers" needed to throw in a couple of No Action commands there to let it cool - the only way they figured to overcome the issue. MIT, of course, knew better and did what IBM was unable to. Knowing IBM, their tendency of being abandoned in favor of better options remained a policy to this day! ;-)
I was in the U.S. Navy in the time frame than these computers were being designed, and was aboard the first of the Navy's ships to be equipped with solid state computers, the Naval Tactical Data System (NTDS). Like the four different Apollo computers you describe, the several components of the earliest NTDS all used different types of logic, different word structures and and even different logic voltages. As usual at that time the choice was up to the manufacturer with only the intercommunications between devices dictated by the Navy team in charge of the system. There were some strangely named gates used and both positive and negative logic employed. As a technician working with the whole system as installed, it was sometimes a real challenge to follow a chain of commands that were not acting correctly on the display consoles. As with the early PC's reboots were a common necessity. We had two main computers for redundancy, though there were a few single points of failure in the system. Punched paper tape could pile up on the computer room floor trying to troubleshoot. The craziest comment I heard from one of the industrial designers who came aboard to see how this, essentially prototype, system was working, was when he was shown the reverse writing on plexiglass plotting boards that would have to be done manually when the computers were not working. "No, No! he said. Our system is supposed to replace all that!"
Thanks for the comprehensive overview.
Computers and reboots... together forever.
But with me and naps, how can I judge?
M mks mop mind man mmmmmmm mom m know mml NJ mmm blink mom mm MP nmmmn much m ZzzszzZ lml mom LMK blink
5:22
-"If they'd let a hacker anywhere near the thing"-
*If they take their historical preservation responsibility seriously.*
Code is not some throw away element of the past. It is highly refined prose that hundreds of people spent thousands of hours refining. It deserves just as much attention as the hardware if not more. I suck at it, but still... I can appreciate it.
Add a few more zeros to the hours spent to compute and refine the computation. The people involved in the math race were ran ragged by the end.
just recreate it in fpga and boom.
Ctrl+A, Ctrl+C, Ctrl+V
If the code is in there and not examined, it may as well not be in there.
Hear, hear! :) Without the code, the computer is meaningless - it is just a hunk of metal and silicon. Its actual historical function can only be understood by reference to the code which made it perform that function. If that code is locked up inside the hardware, it is "critically endangered" - a mishap could destroy it forever. Every copy of it held externally to the hardware is a vital historical "insurance policy".
Arguably, the code is even more important than the hardware: should the hardware be lost, the code would still provide deep insight into the inner workings of the launch vehicle and, indeed, the collective thought processes of the design team and it could still be run under emulation or even on a replica of the hardware if schematics are available or were to come to light in future. Were the code to be lost, any hypothetical replica hardware would still be of little value.
I'm a former NASA contractor and lived NASA until my retirement in 2005.
I've been waiting decades for someone to explain the entire Apollo Spacecraft Flight Computers in a manner where everyone watching would understand how they worked. With this excellent presentation, everyone can see where we were back then with our imagination and desire to reach the moon in less than 10 years as a National Goal as set by President Kennedy. Thank you!!!
I discovered and downloaded NASA Technical Note NASA TN D-5869. “Description and performance of the Saturn launch vehicle’s navigation, guidance and control system”. The calculations they did in that digital computer are amazing in number and function!
"Notably, the first code flown on an Apollo spacecraft was called..."
DEMONETIZED.
roflmao
"Speaking of programming, about those algorithms..."
After a ggl search I can not find any references to this, do you have any citations?
Calm Volatility
Whoosh!
@@calmvolatility2787, To dissect the joke. The code was called "Corona". To prevent spread of misinformation about of the current pandemic, youtube algorithms try to suppress every mentioning of the sickness in any context. In effect every mention of "corona" would get you "demonetized".
"rope memory had to be hardwired at the facory". this actually gives a new old shine to the word "hardwired"
Wait till you read what ‘patch’ originally meant :)
Hardwired to selfdestruct......
Scott, you rock star, you rocket star. You rock and never gimbal lock.
Cheers man.
Interestingly there is an asteroid called Scottmanley
Spec four gimbals next time, oh just use MEMs.
One of my favorite facts about the AGS is that they implemented their read-only memory by omitting the Y addressing wire through the cores holding 0. This made it so that during writeback, these cores were guaranteed to not flip. However, it did mean that a loss of power could leave you with your hard-wired cores reading the wrong thing. To correct this, the first thing the software does when it boots up is "prime" the hardwired cores by writing to every single location.
I always hope a certain frenchman can get a look into one the AGS...
@@fabiosemino2214 They're much harder to come by, from what I can tell. I currently only know the location of a single one in a private collection outside of museums. And so far we haven't been able to locate any schematics at all for it, which would make things quite a bit harder than we had with the AGC.
@@mikestewart8928 Drop a line to Curious Marc on UA-cam. They have a full set of schematics for the AGC and multiple dumps of the LM code from different versions.
@@fabiosemino2214 yea, but I only hope that when he restores / rebuilds one of those he swaps out all the old paper and electrolytic capacitors which they dind't do with the Teletyper powersupply restore, instead they tried to reform them 🙄That's just asking for trouble 😨
@@allangibson8494 Ahem. Mike is the one in the team who gave us all the schematics and dumped the LM code...
Those were the 13 longest minutes of my life ... Scott took every second and literally crammed it with information, this is good stuff!!
Now days you only need one "computer" to do all the task. Its called MechJeb
does it work in latest build? i thought mechjeb got retired?
Spunkmire yeah it works, there was an update not too long ago
I think that I just earned my geek degree. I actually understood everything that Scott talked about. It helped that I also got to view the core memory last month at the US Space & Rocket Center.
Covered pretty much every concept from college level computer hardware classes.
I just sent the link for this page to my brother in law. He worked as a programmer on this project.
Hi! New subscriber here. It is absolutely necessary to mention the impressive hardware and revolutionary software that supported the Apollo program from the ground back in the Real Time Computer Complex! The RTOS (extended IBM OS/360) was a marvel. The whole thing did things that we take for granted today in modern operating systems but where unheard of back then.
Instead of calling it the Back Up Guidance System they should have called it the Back Up Guidance and Orientation Unit Technology, or BUGOUT.
Unfortunately, NASM's LVDC was never loaded with code; I was in contact with them regarding dumping it many years ago and they were open to the idea, but when we did further research to determine what may be on it, we found it was only used for testing.
yeah more nasa BS.
@@romhackstashbox1275 You sound bitter. That can only mean one thing. You think it's a CONSPIRACCCCCCCCCCYYYYYYYYYYYY!!!! Lol!
@@romhackstashbox1275 More probably IBM bs - they haz secretssses - trixy Hobbitses they isss -
Rom Hack Stash Box don’t fall off the edge when the quarantine is over, lol
+3 massive IBM 360 mainframe computers on the ground (rtcc) doing the heavy work. Which are almost always completely ignored (especially in silly comparisations of "computing power of a Apollo mission).
A modern smartphone is stilll much better. But things like "a digital clock is better" are just bullshit.
Not to mention 3 rather good biological computers on board. And a LOT more on the ground and pre-mission.
Also some people find it absolutely incomprehensible that you can do serious math with pen and paper or a slideruler... Sir Isaac does not approve.
What's more impressive is I'm certain some of those old NASA graybeards designing these things could do rather complex calculation in their heads.
@@shadow7037932 It's a function of computing power and time. You can do incredibly complicated calculations on paper. It just takes long. And people in ye olde days were used to that.
I never used a slide ruler but calculation aids in various forms are known for millenia.
The main work like the trajectories were calculated, checked, tested, simulated and so on months in advance.
Often by a (now) kinda famous group of woman which were called "computers".
Based on Newtons laws and rules, calculated (published) ironically in 1686. Centuries before the first x86 CPU.
@@shadow7037932 The main human computer didn't have beard, gray or otherwise. She was Katherine Johnson. An African American woman.
and oooh boy yeah the original mathematician team was all male.... She was just so good at her job that even in the segregated and male dominated field of the time, she was the one flight crews trusted most. To the point that flight crew astronauts asked specifically, that Johnson be the one to be responsible for calculating and verifying the orbits and maneuvers.
@@aritakalo8011 I think that is a bit of a Gene Kranz thing (...who some think was the only flight director). She wasn't the only woman working on that. Shes just the one most featured in books and documentary. www.history.com/news/human-computers-women-at-nasa
@@5Andysalive Other endeavors before NASA also had dedicated human computers on staff.
I could imagine building a computer from scratch today would be so hard to do! Gotta give so much respect for the work they put in to create these, which we take for granted every day.
It's actually surprisingly simple because these days you can assume to have access to logic gates that just work "out of the box" - and then all you need to worry about is just the pure binary logic. And designing a CPU using just logic gates is, although not trivial, not incredibly difficult either, it's something many students and hobbyists alike are able to and often do just for kicks, or as personal projects. I've designed a 16-bit CPU from bare-bones, starting with just a NAND-logic gates (which itself is just made out of either 2 or 4 transistors, depending on the semiconductor technology used), and it's actually quite surprising how little it takes to have a fully functional CPU that can do all the common mathematical operators and access RAM & ROM, etc.
Back when these guys were doing it though? An absolute nightmare to achieve the same.
Would be great if you could do an explanation of the actual navigation process as opposed to a hardware review. Thanks
Oh, yeah. This takes me back to my Mk. 152 Univac Fire Control Computer in the Navy. 16 bit Simplex, and 32 bit Duplex words. Core memory. Huge power supplies. Chips? What are those fancy new thingy's? Machine code programming. Grace Hopper and her Micro-secant.
What ship where you on?
What exactly did it do? Did you ever use it in serious combat? Did you like it? What was the worst thing about it? Tell us more before you die and the ancient computers with you :(
@@G4m3G3ni3 The fire control computer interfaced the radar, launchers, and tactical data system together and also gave the missiles rudimentary (compared to today,) guidance orders. The beauty of core memory is they do not require power to retain information. You could shut the power off at any time, come back in six months power up, hit go and the program would restart at the next instruction. But, it is slow, power intensive, and expensive. And huge space wise. The Data converter unit was it's own refrigerator size enclosure, for 16 data channels. Data passed through dozens of 90 conductor armored cables, relay switch boards, and electromechanical switches. The computer on my desk is thousands of time more powerful, compact, and reliable. It also runs on a fraction of the power, and waste heat is still it's biggest enemy. I joke that the biggest 'upgrade' the machine got was when they glued a "SPERRY" tag on the thing when Sperry bought Univac. But the programming was entirely in straight machine code in Octal format, and directly accessible from the front panel. The Terrier missile system used 32K of core ram, half of full capacity, but double what the Tarter system used. There was also a Teletype machine to access sumo sub-programs, and input instruction for training and testing purposes. Maintenance of that was nightmarish!
Hope that satisfies your curiosity.
@@kevinbendall9119 I hope you got the paperwork declassifying all that given how the US Navy famously prolongs the classified status of some technology.
@@kevinbendall9119
"core memory is they do not require power to retain information"
Yup. ferrite. little toroidal magnets.
Just incredible detail. Congratulations, 1m subs well deserved. Sincere thanks for a fair few Corona hours well spent.
I've been a programmer for over 30 years now and at 3:36 Scott just starts with some pillow talk, then just carries on talking dirty to me!
13:16 well, we still can't agree on that.
While the word size is either 8, 16, 32 or 64 the instruction sizes still don't align.
x86_64 have 8 bit up to ridiculously 120 bit for instructions. While integers might be 8 bit, 16 bit ,32 bit or 64 bit.
We have two different types of Endianness - one used on ethernet connections, the other one used for basically everything else.
We have several different architectures, which use 64 bit and 32 bit integers while using either 32, 48 or 64 bit memory addresses. Our filesystems are sometimes 64 bit, while most USB sticks still cant accept more than 4 GB files, since the filesystem is 32 bit.
So yeah.
I don't think the mentioned problems are real/relevant in the modern computing, everything is using 64 bit, even phones. As for pendrives - it's just the stupidity of defaulting to FAT instead of formatting everything in NTFS by default (licensing reasons... ?).
That is, unless you're targeting some semi-custom platform like a weirdo router that uses some CPU architecture forgotten by time.
A diverse ecosystem is valuable. That’s why Linux supports something like two dozen different major processor architectures-more than any other OS in history.
And it also supports a wide variety of filesystems, not just NTFS.
There was a major schism at the time between 36 and 32 bits; 36-bit octal machines such as the PDP-10 were popular in research circles because a single word gave you ten decimal digits of precision. Xerox PARC, for example, wanted a PDP-10 so badly that when Xerox management denied them permission to buy one, they built their own clone of it out of components and called it the Xerox MAXC. The 32-bit machines, however, were cheaper and IBM-sponsored, so they were more popular with management types and therefore more successful in the market. And that's why everything today is a 32 or 64-bit hexadecimal machine, while 18 and 36-bit octal machines are forgotten. The worst enemy of a superior solution is a cheaper inferior that is just good enough...
@@randomname3566 fat32 is kinda the lowest common denominator. Apple and MS insist on closed proprietary hyjinx.
There was rather more than just 32 or 36. There were also 24-bit, 48-bit, and 60-bit machines, just to mention a few.
+1 for showing the transcript of Vance Brand on the CAPCOM link. As a family friend - always love seeing him come up in retrospectives.
LVDC - Launch Vehicle Digital Computer for the Saturn V made at IBM Owego, New York. where Dad later worked on Space Shuttle Flight Computer Boards. During Apollo and Skylab Dad communicated to the LVDC via the DDAS Station in the the different Firing Rooms of the Launch Control Center. On one occasion Dad and a security guard had to go to the top of the Launch Umbilical Tower to change a "card", one bit in the DDAS Bus, that then went across the swing arm to connect to the Instrument Unit in the Third Stage. Keep up the good job Scott.
I'm afraid this video has gone completely over my head. Even with more than one viewing I only understand bits of pieces of what you're saying - but I still really enjoyed it! It's very relaxing to listen to someone who really knows what they're talking about as they take you through a subject you _almost_ understand. It really focuses the mind on 1960's computers instead of present day problems.
So thank you for another great video, Scott Manley! The research you do on these videos is incredible.
Well done Scott. I remember reading a very in depth technical article about the guy who resurrected the AGC code directly from the AGC computer at the Smithsonian. That and documentation he found and one of the engineers who had worked on the thing. The culmination is essentially a digital archive of the actual code and documentation for that code. He then went on to build an "emulator" using a raspberry pi I think, and fabricated an exact replica of the DSKY interface and display unit. The whole thing works and he was able to simulate trajectory corrections, LEM landings and launches, Insertion burns, etc. Pretty amazing feat of "digital archeology." Thank Scott for your awesome content!
I am not a "computer nerd" so I have a really hard time understanding a lot of what Scott is saying in these videos about the Apollo Missions' Computer Systems. However, that does not stop me from attempting to learn and understand as much as I can about these systems. I am just fascinated by them and I really appreciate Scott bringing to us these amazing, yet very technical, stories and how these computers were made, how they worked and how they were used to get Humans to the Moon and safely back to Earth. The Core ROPE Memory is the most fascinating of them all to me because the way it was made required a lot of time, dexterity, nimbleness and skill.
Actually by the time of the manned missions making core rope was mostly automated. The machine Scott showed in this video moved the core rope to the correct place, the lady (they were all ladies) just had to push the wire through, then the machine moved to the next place etc.
Let’s get CuriousMarc access to LVDC!!
Fran Blanch gave EEVblog some LVDC chips to test.
They were at an auction to get more parts, but couldn't because of course some rich f*ck bought them because of the 50 yo anniversary. Now they will rot in some garbage home of poor garbage taste. I hate these f*cks with a passion. This is why channels like CuriousMarc should be supported, so that they have the money to outbid these rotten, useless pigs.
One shinning beacon of this method is Steve1989, who reviews military rations from every era and age, along with military kits and so on. Some of the things he shows, are not even recorded. It took me months to dig out some information on some of the things he showed as physical items.
Those rations (insert here, anything of historical value) would've rotten in some dumbass' collection, never to be seen again, were it not for him.
aserta in fairness Marc got access to the AGC from a rich benefactor and was able to keep the code after the restoration. Not all collectors want to lock everything up. But I get your point none the less.
@@dmacpher It wasn't a rich benefactor. It was a guy who bought a pile of Apollo scrap for cheap. He became rich decades after, by selling the perfectly working computer.
The only complete LVDCs with functional code inside are stuck inside Saturn V on display at museums.
As a boy of 10, in July of 1976 my father took our family to Disney for the big 200 year anniversary of our country. After the awesome fireworks show @ Disney World, my father took us to the KSC for a tour of that magnificent place where the moon shots originated. I remember the Guidance ring distinctly as we were encouraged to "touch and feel" history. The sights and smells of the control room, (yes everyone smoked back then and the room smelled like a bar and the paint was all dingy from the smoke as well) I'll never forget going into the assembly building and being awestruck by the sheer size of it!!! (at that time they were transitioning over to the shuttle program and they hadn't gotten to the refit of that building yet which allowed the public a peek inside) Growing up on the gulf side of Fl I wasn't able to witness the Apollo launches, (too young but mom did take us out to the end of our drive so we could watch Apollo 15 (night launch) head off for the heavens), but I did drive over with friends to watch the shuttle launch around 1983, ish... Sorry for the ramble Mr Manly, but I do enjoy your space videos very much!!! :-)
Fantastic episode again!
I know about these computers, but you always manage to provide some new insights and information.
Thanks Scott!
As a fellow computer programming nerd, I found this video so fascinating. I hope you do more in the future on how rockets and other space vehicles utilize programming to complete their missions.
9:11 There was one hardware difference between the CM AGC and the LM AGC: in the Command Module, there were two separate DSKYs. The second one was located in the space behind the main control panel, where the navigator stood while taking sightings through the telescope (and also I think where the crew had to go through to get to the docking hatch).
Thank you Scott for this great video. I am a big fan of the documentation "moon machines" where they also have an episode covering the computers. But not nearly as detailed as you did! Awesome! I wish people nowadays would appreciate much more that nothing on their smartphone would exist without those geniouses who built the first computers!
Basically the apollo program took less computing power than a recreation in kerbal space program
lets put it this way, ive heard claims that the entire Apollo computing system could have been run on an original gameboy, im inclined to believe it
Alessandro Bianchi by an enormous factor ! The orbit calculations were done on the ground and radioed up. KSP does those as you play and does hundreds of geometric calculations per pixel each second, on top of it.
Alessandro Bianchi god damn...... Brain blow
Computing power isn't everything. It's how you apply it.
Here's a simple example. I've a top of the line laptop for work. It has to have as good a computing/computation power i can get, to give me the correct architectural solutions IN a portable format as fast as possible. If i need to know if a beam isn't ok, then i need it fast. THAT is great, it's awesome, and it made my life much easier.
But this laptop, for the life of it, cannot interface with the CNC machine that makes the custom fit beams we use in restoring old buildings. It needs another "computer" to run the stepper motors, read the scales, so on.
And that's just a modern concept, back then they had to deal with a heck of a lot more things, like vacuum, cooling, heating, fire hazards, robustness.
IMO, the computers that were put to work were the best they could be at the job they had to do, after all, you can't run a rocket with a modern laptop, anymore than you can use one of those old computers to run a computational program...but you could make a CNC out of one, and it would be the best CNC money could buy, because precision was adamant.
Except that the Saturn control computer and the ACCs actualy controls the vehicles. Almost anything developed since has more computing capacity and more memory but lacks the I O hardware needed to do anything.
Thanks for documenting, and explaining all this
as always your presentations are fascinating. While I knew some of this (having been an avid follower of US space missions as a child) you have once again given us a comprehensive review of all the pertinent elements. appreciate it. makes our shelter in place easier to deal with.
Nicely, Done, Sir. 16-bit, 15-bit, 18-bit. I grew up on a PDP-1 with core memory and 16-bit words with 1 parity bit. Since it was DEC all op coeds were in octal. Great times, include the original Space War code writtien 4k memory.
The AGC NOR gate chips use DTL, not TTL as Scott stated. Furthermore they used the DTL aspect to wire OR signals in some places, so you cannot build a direct copy using TTL chips since they can't do that.
Another great effort Scott! Again, I learned a lot. Greetings from Arizona.
As somebody with an appreciation for history and electronics- thank you for this video!
Every single time I discart the "build myself an AGC project" with Arduino or even better an old school 8bit 80's computer or something like that I see a video like this and... holyyy c**** ..... I NEED IT!! ;) Thanks Scott!!
Notice how the Lunar Landing Module maneuvers in outer space were more akin to the movement of the repair pods in "2001:A Space Odyssey" than the space ships in "Star Wars". That underscores Stanley Kubrick's attention to detail.
Hm, nope. It looks nothing like that.
The OP is talking about how spaceships maneouvre in vacuum versus how aircraft fly in an atmosphere. The “Star Wars” fighters banked and turned like aircraft, not spaceships.
or nasa copyed kubrick since no one has ever been to space. or wait kubrick worked with NASA.
@@romhackstashbox1275 or you are conspiratard, and probably a flat earther too. Get lost. Go read the bible or something and leave the rocket science to the adults.
Who filmed it though!!!!????
I know this is a late comment, but perhaps you could give a link/shout out to Curious Mark’s UA-cam channel: his crew painfully documented the acquisition, disassembly, operational checks, code verification and many other activities to basically start from scratch to resurrect the AGC. It is an incredible work of love and time they put into that project, culminating with the simulated Apollo 11 landing on the moon, complete with 1202 error codes. Riveting videos, one short screen I saw on this presentation of yours.
Their work combined with yours is a great source of “how it was done” in early manned space flight.
Thanks for all of your work, Scott!
Brilliant video Scott...even for us whom have to look up some of the terms...operand?...have watched it twice...keep 'em coming mate..REALLY helps with lockdown in south east London lalala🙃
The AGS was like the can of spam in the back of the pantry, you think you're never going to need it until the world ends THEN botulism.
Last week I watched my Blu-ray disc, Apollo 11. It had been a while since last viewing. That is one heck of a good documentary. The large format film used was a very wise choice back them.
It's a good job Apollo happened in the 1960s and early 70s. A decade later and it might all have been shot on VHS for posterity.
Excellent presentation. Thank You.
Good stuff as always, thank you for sharing these gems!
Thank you Scott this was a very interesting video. What really amazes me is how a technology that was wasn't even barely barely understood was used to send humans to the moon and back safely and then of course with Apollo 13 you know to be able to operate on the fly. And to what really amazes me is the fact that our cell phones have more technology and more computer operations and then hit the computers that actually sent Apollo to the Moon.
I was always fascinated by that classic footage of "the ring" being jettisoned during an Apollo mission, so I learned about it!
When I finally saw an actual Instrument Unit on display at Kennedy I took about 10,000 photos and really felt like things had come full...circle? LOL
One vote for the LVDC here! :)
That ring is the interstage and isn't related to the Instrument unit.
The lovely interstage separation (skirt sep) shots. Ahhhhhh. As Scott said, that was the skirt surrounding the S-IIs 5 J2 engines. The IU (instrument unit) is where the LVDC and the analog computer were. That sat atop the S-IVB, and stayed with it until it crashed into the moon, or was sent on a wild ride around the sun, as in the case of Apollo 12. The IU had to stay attached, to still control the trajectory and transmit data.
In the case of Apollo 13, after the explosion, the IU was causing interference in communication with the LM, because it was essentially hijacking frequencies from the weaker Omnidirectional antennas that were used. They had to push the S-IVB out of the way a bit.
Sorry, this turned into a long thing lol
Fun trivia: magnetic-core memory is why the process of (or the file created by) reading out the full contents of a computer's memory is still sometimes called a "core dump", even when the computer uses semiconductor RAM with nary a core in sight.
I'm swimming in fun...
Excellent explanation, thank you.
The Saturn V "analog computer" was the optimal approach for a time when fast A/D and D/A converters were yet to be invented (I spent the good part of 20 yrs of my design career with analog computers).
I like the final hyperbole: "when the computer designer couldn't agree on the number of bit in a word". How true. I guess we should say thanks to Mr. Shannon for the eight bit in a byte.
Edit: I didn't know Armstrong gimbal-locked the half-LEM before docking. I remember, though, some astronaut asking for a fourth axis, and never understood why MIT didn't allow for it. Gimbal lock is a totally man-made system failure, IMO.
A fourth axis would have added weight. It was a compromise, save weight but have this scenario you need to avoid. Gemini capsules had four axes, but then they didn't care about weight so much.
I can't guarantee it, but I belive it was Michael Collins who asked for the 4th gimbal for Christmas.
I like 18 bit solution. The more bits the longer the word and the better freedom of expression!
An interesting fact about core memory is that it couldn't be read without being erased. Reading is done by trying to set the magnetic polarity of a core, and if it then flips from the other polarity, that generates a pulse.
Edit: here's a video about it: ua-cam.com/video/p7SkE5pERtA/v-deo.html
Same thing with modern DRAM, reads are destructive.
The more I see from the Apollo missions, the more amazing it is to see they actually made it to the moon and back.
"Everything rises and falls on leadership."
-John Maxwell
If the organization get its people right, the right people in the right roles using the right skills and tools will handle all the technical details.
So much careful design went into it. The two absolutely critical engines that, if they didn’t fire, the men would be left marooned in space or on the Moon-the Service Module and the LM Ascent stage-were hypergolic. That means their ignition only depended on simple on/off valves working properly, not on more complicated throttles, pumps etc. The initial trajectory to the Moon was “free return”, which meant that if nothing further was done, they would simply come back to Earth again. This is what happened with Apollo 13, where the main issue was figuring out how to last the time it took for the trip back.
A little known Apollo trivia tidbit: Raytheon's Phalanx 20mm Close In Weapons used ubiquitously on nearly every US warship, the Block 0 (about 1980) and Block 1 (1988) versions all used a derivative computer that was nearly identical to the AGC right down to the magnetic cores and "rope" memory. The Navy needed something absolutely rock solid reliable and simple enough to withstand nuclear EMP yet be compact enough to stuff into a large can on top of a Vulcan rotary gun. The irony was from Apollo the Phalanx was born...
2:00 , the LVDC ceramic chips were NOT soldered but retained by springy contacts
Watch FranLab for a teardown of an LVDC board.
From a technical standpoint, the integration of the chips in the LVDC was way more of a advancement than the way the AGC was built.
Modern computers are way more directly related to the LVDC than to the AGC.
At 5:07 there is a comparison in which the AGC performed an operation in 168 microseconds with 28 bit precision whereas IBM's LVDC required 246 microseconds with only 25 bit precision. Was IBM's position that the extra performance of the AGC was unnecessary to accomplish the mission? The analogy being that a road trip can be successfully completed in a Trabant as the capabilities of a better car wouldn't affect the completion of the road trip.
At 5:15 an explanation of this program would interest me if no one else. Surely this one page is not the entire program?
Rope memory? Was this similar in any way to the iron toroids in the IBM LVDC memory?
Rope memory also used ferrite cores.
8:37 - So what would they have done if a failure occurred which both crashed the AGC _and_ wiped out its last-good-state information?
Probably they'd have died.
During lunar landing they'd have used the AGS.
The rest of the time they had hours to realign the platform using the stars.
@@scottmanley couldn't they get updated data from ground control (outside of landing/ascent)? Or would the multiple time delays make it corrupted?
@@HuntingTarg Once they got the computer-radio interface running uploading an approximate state from Houston should be doable. But star realignment was already something they did multiple times on each trip.
Start writing to your elected representatives.
Let's get Scott into that museum (so he can extract the code from that guidance computer).
He's not that old
@@eFeXuy Just edited my comment. He would be there to do a job.
@@pentagramprime1585 : Is Scott trained in computer design? He might be a mismatch for the initial extraction job.
@@absalomdraconis I was gonna suggest he bring help. But my comment was already becoming rather lengthy.
If nothing else I'd like him to be there to document the work. Furthermore, it would be in the museums interest to facilitate such an operation. They would be curating the original code.
and then we put it on 4 arduinos and fly to mars.
Nice one.
Make a video about the support computers on the ground next.
Byte and word lengths were a mess. It felt like I was having to adjust to a different system for every architecture. Things really settled down in the early 80s when personal computing was becoming common if not yet ubiquitous. 8-bit bytes and 16-bit words were all the rage. As welcome as that was, I was more relieved when I stopped having to worry about EBCDIC (or Baudot) popping up when we finally settled on ASCII as our de facto character set of choice for most purposes.
lol. Because I still mess with old microcomputers I still run into things every so often...
Like, you wouldn't think so from a modern point of view, but early 80's keyboards can be surprisingly annoying thanks to the missing buttons.
Like, on my desk here is an Atari 800XL.
Has no equivalent to the 'alt' key on a modern keyboard (though it does have a dedicated key for 'inverted' characters, which on older revisions of the hardware was an 'atari' key.)
Control is in a weird location, there's no function keys, arrow keys are bound as secondary functions to something else...
Oh, and of course the standard software for entering programs assumes 'overwrite' is permanently active (and there's no insertion mode the way a modern system would tend to default to. To insert characters requires pressing the insert key repeatedly then typing new characters in the space opened up)
For good measure, this system's text is encoded in something called 'atascii' (atari character set)
Which is similar to ASCII but nonetheless quite incompatible with it anyway.
while we're at it, due to the nature of the machine, you're often working with it at a low level.
And...
Well, not only is the official character set ATASCII which is it's own thing.
But the internal numbering of the standard font doesn't even align with the ATASCII code (the OS/BASIC ROM internally translates), and the keyboard scan codes are a 3rd arrangement unrelated to the other two.
Because, hey, who needs consistency right?
@@KuraIthys A highly modern computing project I follow needed updated patches for EBCDIC compatibility, and that was April 2020.
@@KuraIthys I used to do some programming on the original Atari 800 (in 6502 assembly, of course) and I agree. Trying to translate programs from other computers was always oh-so-much fun. Although, having the complete OS source code was very helpful in circumventing a lot of silliness.
It's just what life was like back then. I'd be bouncing between various UNIVACs, DEC Vaxes and PDPs, HP minis, a plethora of 6502, 8080, Z80, 8086 and 68000 PCs all using different monitors and OSs and all in their particular machine/assembly languages. Not to mention the dozen or so "common" higher level languages for various purposes. Without a doubt, one of the funnest times of my life. If you can be on the bleeding edge of a paradigm shift, ride it as long as you're able.
The analog computers don’t get as much attention and credit. Thanks for pointing them out. A deeper dive into them would be interesting.
Great show 👍
Great video explaining how this worked. If they had a cellphone cradle my phone would have been a great backup system. But the phone would have lost signal like driving into a tunnel. Ahh the good ol days, those guys were brilliant using what was available then. Wow
5:19 The code on the lvdc is probably still technically classified as top secret (And the same algorithms used are probably running on other hardware right now). So you could understand how the ability to guide a device from one point on earth to a distant point on a different, or more importantly the same planet could be considered classified. And publication of the binaries or source code would be in breach of the current weapons export controls.
But the reality is that smart people could fully reimplement the same functionality probably slightly less efficient, using more computing power. But the maths involved in all aspects required to implement it is far more accessible now (online) than it would have been half a century ago. I was going to say that it is not rocket science, but it is :)
I remember a TV interview with a programmer responsible for the Apollo moonlanders. He was asked how they managed to get their programs 'bug free'. He answered that they never did! He estimated that there were probably thousands of programming errors remaining in the software which was actually used by the spacecraft.... Yeah, reminds me of the days I spent designing machine code programs for the Motorola 6502 in a Commodore Pet in order to create a system supplying methadon to junkies in the Netherlands....
Thanks for another great video 🙏
This information is fantastic. So are the comments of the people who worked with computers like these, of that day. I'm amazed
Speaking solely for myself...dayum! 👍
I knew about the LVDC from a friend who's an IBM bigot, didn't know that the LVDC source was lost to history. I've spent some time reading the online copies of the AGC source; from the standpoint of a former code geek that's just schweet. Well documented, a pleasure to read.
Thanks for the look back in time... 👍
Does this include Kerbal Space Pegram Running on my parents Dell computer?
I love the message at the end!
Those electroluminescent displays were also not trivial pieces of engineering.
All this was actually not long ago, I am in my late forties and I have had core memory in my grubby hands.
It used to fill 19 inch racks here on earth.
And flashing your dongle in public would buy you a ticket to a stay in club fed.
Let me guess, the place was somewhere out on the prairie and the thing was almost completely underground right?
I still have a layer of CDC Cyber core memory! When the University of New South Wales decommissioned their Cyber 72-26 it's parts were available as souvenirs. As I had been one of the first operators, I got a few bits. Somewhere there may be the logic gate I snagged as well? The 3 foot diameter 808 drive platters made amusing tables for a while, but were too unwieldy to remain in use for long.
Fascinating how some of those early computers worked...
Here I am studying flip-flops and counters as if they’re just things we dig out of the ground and stick in computers. I had 8 weeks of analog before getting into digital but still, learning about all this stuff in the physical component level is amazing me every day.
So your textbooks no longer include transistor level diagrams of TTL and CMOS NAND gates?
12:09 to 13:00 that is incredible video!
Awesome video. I love hearing about all the crazy competing design concepts used in computers back then. Today everything is basically Intel or ARM. Back then there were different ideas and concepts and a willingness to explore. Fascinating!!!
Actually, there are quite a few architectures in common use. Your wi-fi router is probably running a MIPS processor. SiFive is gathering momentum. And if you look at the world’s fastest supercomputers, you’ll see a few POWER machines near the top.
What do these processors have in common? They all run Linux.
@@lawrencedoliveiro9104 Lets not forget RISC-V, the great hope for an open hardware future!
Much more common, if humble, is the processor inside all those Arduinos!
Yes Mike, some were fascinating. But some were just frustrating. Have a look at the architecture of the CDC Cyber range. For no particularly sensible reason that I ever heard about, Seymour Cray cursed them with a very limited address size, although the data word length was 60-bits - arranged as octal. They were a dreadfully limited thing to try to use as a more general-purpose computer, or even with really big, complex programs.
Then Cray went off to create his dream machine... declaring memory size to be the really big advantage of the Cray-1! I remember reading the article on the notice board as we awaited the arrival of a CDC Cyber 72, and wondering just how this would turn out!
@@John.0z Sorry, yes, I meant RISC-V. SiFive is just one of many companies embracing that architecture.
Super interesting, thanks Scott!
What a great video. Thanks so much
I only understand every other word (on average) in these explanations, but damn if I won't watch every thing you produce, Scott. Cheers!
There were also a bunch of IBM, Univac, and CDC computers supporting operations on the ground.
Good god, I just went down the rabbit hole of core rope memory because it's the first time I ever heard that term...
Man, what a glorious piece of ingenuity but I gotta say it's probably the most hacky thing I've ever seen. Can't believe something like that got us to the moon.
Scott what type of voltages did the computers run on?
DC 13v 12v 6v 5v 3v 2.1 v
@@steveshoemaker6347 Thanks for the info. That is a bit of a range of lower voltage.
@@seannot-telling9806 That voltage is still in the range of Computers today....l am 77 years old now....But l still build a Computer now and then....Cheers my friend...!
Always very interesting Scott. Thank you. Future computers will take us to the stars (well, to the planets orbiting those stars).
10:00 I love how those brilliant engineers that made these hardwares still had a dose of superstition
yep, we've all got a mental glitch of some sort,
Wow, I didn't know that they had a backup strapdown IMU. That's very smart, no gimbal lock without gimbals. I was under the impression that strapdown units only really became a thing decades later.
I barely understood any of this but it's still fascinating stuff and I'm glad you shared it
it never ceases to amaze me how far nothing but 0's and 1's have gotten us...
Gotta love binary, man...xd
My grandpa worked on the Saturn V instrument unit back when he worked for IBM
I've got some very good pictures of the LVDC and IU from Huntsville. Interesting machine.
Amazing that it actually worked
love the footage of the docking manuevers near the moon
@Scott Manley Great video!
I noticed that your signature sign-off sounded a little grim; am I mistaken? Hope you're doing well.
More videos like this!!!
Are those computers the same that Destin showed on @SmarterEveryday?
One of them is.
An interesting fact about the LVDC, Scott, is while the blueprints for its' hardware are available the software themselves have been lost (Who wrote the programmes is not known any longer either) however they could be extracted electronically from the LVDCs in the IUs of the two surviving Saturn 5s. However there available online several NASA technical reports which detail the algorithms and equations that formed the basis of the LVDC programmes.
I wonder if the LVDC code is quietly out of reach because it's still running on some ICBM booster in a silo somewhere...
Hmm, could be. If it works, why reinvent the wheel. Just the hardware has reduced in size.
10:22 Is that really 1,024MHz!? Or is that coma a decimal point (1.024MHz)?
At least one important space system still runs at 1024000Hz.
One MHz, obviously.
I think the gates in the AGC ICs were RTL (Resistor-Transistor Logic), not TTL (Transistor-Transistor Logic)
Do you have a reference for this, because I did look it up in NASA docs and now I can't find it....
One comment, the LVDC was inherited from the Gemini guidance computer. The divisions used to overheat it so the "developers" needed to throw in a couple of No Action commands there to let it cool - the only way they figured to overcome the issue.
MIT, of course, knew better and did what IBM was unable to. Knowing IBM, their tendency of being abandoned in favor of better options remained a policy to this day! ;-)