Hi, thanks for watching. I incorrectly said the landing was on 29 July; it was on the 20th. I just read it wrong on my script! Also, this video is unfortunately attracting a few moon landing deniers. I'll try to remove the more crazy comments, but please ignore them (unless you really want to get into an online argument!)
Although it's understandable enough to keep saying 1969... the truth is that the inertial guidance system (a high-tech gyroscope), the computer and its software were the subject of the first contract issued for the Apollo mission, so their concepts and design are rooted in the key technologies of nearly a decade prior. They partly succeeded because of foresight about what might be available by the time critical implementation commitments came due - e.g. the use of the first integrated circuits. The software was developed over a long time, and only just made the deadlines set for each critical step along the way.
Hopefully a decompiled version will be available, in something like C! Historically we wrote in assembly, until we realised we could write higher-level code like C that compiles down to assembly. Then we added further layers like Python and Java. Java compiles to Java bytecode, which the Java virtual machine then turns into machine code. There's no real reason to hand-write assembly any more, since most modern languages compile down to it anyway - so in a sense we all still "write" assembly, just through higher-level abstractions, you could say.
Thank you for this video. Truly amazing software accomplishment. Amazing that NASA had this one-of-a-kind non-programmer computer-to-human interface. Also, you are beautiful and well spoken. Thank you again. Best Regards.
fwiw: Comanche is the European name for a very famous tribe of Native Americans inhabiting the Great Plains region of what is now the United States. Pronounced kuh-man-chee.
Yes, it was called magnetic core memory. It was really an amazing technology of the day bc it was non-volatile so it served double duty as RAM as well as flash.
@@TheWallReports Magnetic core memory was a step up from using cathode ray tubes to write, store and read data. Thank heavens I never had to use cathode ray tubes for storage, and I have a very hard time imagining what it was like to use mercury delay lines for memory.
I think you guys are getting two different technologies mixed up here. Magnetic-core memory was the read/write technology used for RAM, and while it was indeed hand-assembled, it was done so in a manner that was the same for every cell. Core-rope memory, on the other hand, is the non-volatile ROM technology that was hand-woven by the so-called "little old ladies" at MIT. The former used the cores to store one bit in the ferrite material itself. The latter used the cores to read which wires passed through and which went around (with up to 64 wires per core in the case of Apollo). Two slightly similar, yet completely different, technologies.
There is a 1-to-1 correspondence between the assembly language and machine code. The programming logic is the same for both. Assembly provides human-friendly symbols that the assembler translates 1-to-1 into machine code instructions. Assemblers also provide human-friendly labels for memory addresses like jump/goto targets and data locations. Advanced assemblers also provide "macros" that substitute sequences of assembly instructions with a single command, similar to how macros in office software like word processors and spreadsheets work. Once macro code is substituted and memory address symbols are resolved, again, it's 1-to-1 translation to machine code. Early microcomputers like the Altair and minicomputers like the PDP-11 had front panels with displays a little similar to the AGC DSKY. You could enter the instructions in binary and read results in binary from display lights. The DSKY was more user-friendly (no binary) in this regard as it provided command symbols on the keypad and decimal values for the display and keypad.
One other thing to keep in mind about assembly language is that it's not a single language. Each processor architecture's assembly language is unique; e.g., the AGC's assembly looks completely different from 6502 assembly, which looks completely different from i386 assembly, which looks completely different from ARM assembly, which... you get the idea. This was because assembly instructions map 1:1 to machine instructions, which of course are completely different for different architectures.
@@markrosenthal9108 It's not necessarily 1:1. Assemblers support macros that can turn one assembly statement into 2 or more machine code instructions. MIPS CPUs don't have an instruction for loading a 32-bit constant, but there is a "pseudo-instruction" li, which is turned into lui + ori. The main difference is that you can use all the instructions available, while high-level languages only use a subset and won't let you directly use things like CPU flags. An issue that I have faced is C not letting you detect carry or math overflows without wasting time on unnecessary calculations.
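For what it's worth, newer compilers do expose that now. Here's a minimal sketch, assuming a GCC- or Clang-style compiler that provides __builtin_add_overflow (the surrounding function names are just my own):

```c
#include <stdint.h>
#include <stdio.h>

/* Add two 32-bit values and report whether the result wrapped.
   __builtin_add_overflow is a GCC/Clang extension; on most targets it
   compiles down to the add plus a branch on the carry/overflow flag,
   which is what you'd write by hand in assembly anyway. */
static int add_checked(uint32_t a, uint32_t b, uint32_t *sum)
{
    return __builtin_add_overflow(a, b, sum); /* nonzero on overflow */
}

int main(void)
{
    uint32_t s;
    if (add_checked(0xFFFFFFFFu, 1u, &s))
        printf("overflow, wrapped to %u\n", s);
    else
        printf("sum = %u\n", s);
    return 0;
}
```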
@@markrosenthal9108 It's not quite 1:1, because even without a macro assembler there are tricks you can do with machine code that are difficult or meaningless in assembly, like designing code that executes differently if you jump into the middle of a multi-byte instruction on an architecture with variable-length instructions (like x86 or Z80 or 6502 or 68K).
I just smirk when people say older people don't understand tech. As a 14 year old, I learned about microcomputers on an Intel 8085 development kit. You would write a program on paper using the 8085 assembly instructions, then look up the machine code for those instructions in the manual and enter them using a keypad. Eight years later at university I was using my personal 386 PC to run Mathcad etc. It is amazing how rapidly things are developing. Apparently one cellphone would represent a cost beyond the total world GDP of the 60s, were it possible to construct it using valve tech from that era. Great clip about this famous computer - a testament to pushing the limits of the tech you have available, and to the character of the people faced with solving problems.
My lecturer taught me the concept of memory using a transistor-based flip-flop to store a single bit. Fast forward to the 90s, when I learned to program on a Zilog Z80, using a keypad to produce something on a seven-segment display. Despite how fast technology is evolving, it's comforting to know that Arduino systems and breadboards are still widely available nowadays and youngsters are eager to learn from them.
@@thesoundsmith The KIM-1 was super fun to use. I recently learned there were a bunch of expansions for it. I'd only used the base board with its hex keypad and LED display. Maybe a homemade cassette interface (from BYTE, iirc, but it might have been for the SYM).
Yip! Same for me in 6502, back in 1979: Pure assembly language driving hand-wire-wrapped custom hardware for a microtonal music synthesizer. I entered it into Science Fair, although I probably would have done the project either way.
I started playing with computers around '69/70 and started programming for space systems around '80. When I began, cards were for development and paper tape was for finished code (it was more compact but really really annoying to edit). Fast forward ten years to around 1980 and terminals faster than a TTY were starting to become common, which made all the programmers happier -- even at 300 baud. That said, I was still doing a fair amount of programming with cards well into the 80s. Many programs were developed using FORTRAN to work out the algorithms and logic (C wasn't yet mainstream, nor vetted for space missions where I worked) and chase out a lot of the bugs, but in the end we were still translating that by hand into "optimized" and commented assembly (i.e. just compiling and doing an assembly dump as a shortcut wasn't an option). It was a treat when you got to use an AL you already knew from a previous computer; still, you ended up learning a lot of them.
Programming certainly has changed since then! When I started working I did some assembler, but that work died out and I haven't touched it since. The lowest-level language I use is C, and now using frameworks is important.
Do you still program in Fortran? Will Fortran ever be deprecated? I'll bet it will still be used in the 2100's and beyond. The more time passes, it's less likely to ever be replaced.
@@TerryMundy Honestly, I never loved programming in FORTRAN, but what it does, it does well -- which explains its recent resurgence. Do I still program in it? Nope. I haven't had to for a while. Nowadays I tend to use C-family langs, Python, or a microcontroller assembly.
I have worked on Assembly language programs and it really finds out your weaknesses and forces you to learn. I am not a professional developer BTW, but I do have a passion for embedded systems. I know why they used assembly so much back in the day. It was their only real option. Thank goodness for really good compilers and large memories nowadays. WE ARE SPOILED.
👍🏾True. The thing with assembly language, and the same can be said for machine language, is that the coder has to know & understand the actual CPU & hardware, bc you're working with & manipulating registers & memory directly, more or less, depending on whether virtual memory mode is active.
I still remember my first days meeting with assembly. It was in 3rd grade of elementary school... No, I was not some Sheldon Cooper-level genius in some "gifted kids' school". It was the early 90s, and my school decided to buy modern computers (so 286s instead of Commodore 64s) and start computer science courses for the kids. It was really forward-thinking at the time, sure, but there was one tiny problem. None of the teachers knew ANYTHING about computers, so they assigned the science teacher to learn about computers from books and teach the kids what she learned. Basically she was ahead of the classes she taught by one or two lessons. And what exactly do you think they decided would be the best way to introduce kids to computer science? Yes, exactly what you thought: assembly language. During the first few lessons we learned what peripherals are, then right after that we were introduced to registers, stacks and heaps and data manipulation verbs. Within two months, all the kids had been taken out of that course by their parents.
@@Jenny_Digital Same here. I wrote a few simple arcade games in 6502 back in the 80's on my Atari 800. 6502 was so simple that it made it very difficult to program. At college in 1987 I took an assembly language class which involved programming a mainframe computer. I don't remember the name of the CPU but the assembly was much more advanced making it fairly easy.
@@jp5000able I had an Acorn Electron with only a black and white telly and a tape recorder, but the built-in assembler meant I got my start in 1983 when I was only six. I had no printer so I typed my listings using an imperial widecarriage typewriter my dad had rescued from the tip whilst working for the council. Back then my computer came with a very comprehensive user guide, getting started book and demo tape. Now you get a sheet or two of paper telling you how to plug the thing in and not to open it.
Curious Marc is a treasure trove channel. Going to auctions, nabbing old gear, doing hardware debugging, rebuilding display tech... Then casually piloting a landing. They are amazing.
@Fred-yq3fs Yes, and their series of videos about the Apollo systems is fascinating. At one point they even had access to one of the rope memory "memories" (I guess that's the term?) and reverse engineered it to determine what code was woven into it.
Absolutely true! I learned so much from their hands on restoration of an AGC and a download of its core. They then redid A11 landing with the AGC interpreting inputs and computing outputs.
I went to the Smithsonian a couple of decades ago now and was absolutely stunned by the technology and computers used to go to the moon. It looked absolutely archaic in design. I literally saw soldering points for wires on the green circuit boards. I was told at the time they had triple redundancies because of the anticipated shaking that might loosen the wires. My respect for those astronauts only grew tenfold when I saw that capsule and those components. Now, I have heard that we have to relearn how to go to the moon, because most of those brilliant engineers are now dead and things were kept on paper. Wow. You would have thought that we would have kept and/or recorded these kinds of things, even if only for posterity's sake. Great video, thanks for taking the time to explain it.
We have tons of records from every aspect of Apollo. Drawings, technical reports, etc. What’s gone is the ephemera: not design details, but background of “why did we design it this way”. Relearning how to go to the moon is more a matter of experience: the engineers working on Artemis have tons of experience with Earth orbit, less so with the environment of the moon and its unique challenges.
@@Hobbes746 I think it comes down to two things. Firstly, I heard that the Apollo engines required individual hand tuning, which the engineers knew how to do at the time, but they're all dead and no one knows exactly what they were doing. Second, there's a cascading chain of technical debt. A lot of minor components are surely out of production, or the contractor companies gone. So that minor component needs replacing with something else. But that then affects multiple other components, which also need to be changed to be compatible, which in turn affects other things, and so on, until it'd be simpler to start with a clean slate. To be honest, it's now more likely that Starship will be ready for a lunar mission long before any attempt to rebuild the Saturn V "as is" could ever be ready. Plus Starship will be reusable.
@@calebfuller4713 Yes, those all play a role too. Hand tuning the engines can be learned again, but that takes a few years of hands-on experience, building an engine and then testing it. You also have to consider the change in manufacturing methods: you’d have to redo all the drawings in CAD so you can use modern machines (CNC etc.), you have to take into account that some of the metal alloys used back then have been replaced with new ones with different properties. The list gets long.
@@Hobbes746 Absolutely true! It's part of what I meant by a cascading chain of obsolescence. And then you need to test how the new methods work versus the 1960s methods. Will it blow up or work better than ever? It's not even just the rockets. It's stuff like the control computers. Are you going to dig up a vintage 1960s computer? Or does it all need to be adapted to a modern computer? Which is a whole new project in itself... one that is again maybe harder than just writing new code in a modern language from scratch.
In an age when there are bad actors who seek to revise history or erase it (not for greater accuracy, but for barbaric political reasons), hard copies are good to have, to preserve the sanity of the planet. When the risks of technocratic neo-feudalism or nuclear war are becoming increasingly evident, what we call "records" could become future artifacts that could help people sort out what went wrong.
Margaret Hamilton was the lead software engineer of the AGC. She coined the term "software engineering" and pioneered fault tolerance in software systems. Her ideas about fault tolerance played a crucial role in the graceful handling of the AGC overload right before landing.
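For anyone wondering what "graceful handling of overload" means in practice, here is a toy C sketch of the general idea (entirely my own invention, nothing to do with the real AGC Executive): jobs carry priorities, and when there's no room left, the lowest-priority work gets shed so the critical jobs keep running.

```c
#include <stdio.h>

#define MAX_JOBS 7   /* small fixed job table, as on a memory-starved machine */

typedef struct {
    const char *name;
    int priority;        /* higher number = more important */
    int active;
} Job;

static Job jobs[MAX_JOBS];

/* Try to schedule a job. If the table is full, evict the lowest-priority
   entry -- but only if the newcomer outranks it. This is just the
   "shed low-priority work under overload" idea, not the AGC's algorithm. */
static int schedule(const char *name, int priority)
{
    int lowest = -1;
    for (int i = 0; i < MAX_JOBS; i++) {
        if (!jobs[i].active) { lowest = i; break; }
        if (lowest < 0 || jobs[i].priority < jobs[lowest].priority)
            lowest = i;
    }
    if (jobs[lowest].active && jobs[lowest].priority >= priority) {
        printf("overload: dropping new job %s\n", name);
        return 0;
    }
    if (jobs[lowest].active)
        printf("overload: shedding %s for %s\n", jobs[lowest].name, name);
    jobs[lowest] = (Job){ name, priority, 1 };
    return 1;
}

int main(void)
{
    for (int i = 0; i < MAX_JOBS; i++)
        schedule("radar housekeeping", 1);   /* low-priority work floods in   */
    schedule("guidance update", 10);         /* critical job still gets a slot */
    return 0;
}
```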
Thank you for making this video. For the record, AGC memory didn't use bytes but rather 15-bit words (plus a parity bit) - roughly 2K words of erasable memory and 36K words of fixed rope memory. Very restrictive, and requiring extreme coding efficiency that modern developers don't generally have to worry about.
Amazing! I remember an interview/debriefing with all three astronauts after their return, and Buzz admitted to not releasing the radar task, which he should have done to follow procedure. The result was the two errors. Buzz said that at the time he wasn't going to let go of his ride home, or something to that effect. Who can blame him! What they did is beyond brave! I think you can find this interview on the 3-DVD set NASA released (years ago now). I can also remember watching this in real time back in '69 - it was sooo tense. The fuel almost gone, the computer errors, a last-second decision to manually override the landing to get a better spot. Whew, it was spectacular. Thanks for doing the video - Good job! - Bill
As long as you make comments. Especially in assembler. Joe Armstrong (the inventor of Erlang) made a joke about the commenting behaviour of one of his colleagues: It was one big file of assembly code and in the middle of the file there was just one sentence: " ;And now for the tricky bit!" 😆
As long as what needs to be known can be known, putting in a little humor or a note to future programmers is like putting a note in a bottle and casting it into the sea. Who knows who will eventually read that note, and how this little bit of humanity will affect their appreciation of the code. I think the comments also help document a point in history with slang or literary or movie references. If done tastefully and with a sense of balance, these comments give us insight not only in the code, but of the programmer.
I have always injected some humour into the code and continue to do so - mostly for my own amusement - sometimes as a monument/reminder of some insight that would otherwise get lost in time. I enjoy seeing it in other people's code too, it does have some utility beyond putting a smile on a tired developer's face though... What I have found is that third parties looking at the code latch onto those bits and pieces, they give an otherwise featureless jumble of code some landmarks - which folks can use to navigate the code (and share in the insights of the original developer).
Fascinating, thank you! Yes, I'd held the incorrect belief that the AGC was roughly equivalent to a calculator. I'm a technical person, but not a "computer" person, and I found this video very helpful.
So cool you gave the Wizard of Oz not one but two mentions in your video. One is the "Off to See the Wizard" bit in the code, and the other is that Margaret Hamilton is not only the MIT software engineer but also the name of the actress who played the Wicked Witch of the West.
One of the most interesting YouTube episodes I've ever viewed. I started using assembly language in 1970, and have a very high degree of respect for early coders. Thanks for your research and presentation! I enjoyed it immensely!!
As did I. A little bit of programming in hex-based machine code and then onto Assembly. We thought it was amazing when we got 8k of RAM. How things have changed.
Actor Jack Black's mum worked on the Abort Guidance System (AGS) in the Apollo Lunar Module. In what has to be one of the best quotes ever... "In a memorial tribute, her son Neil notes that she was troubleshooting problems with schematics on the day she went into labor, called her boss to let him know she had fixed the problem and then delivered Jack". The calculations and measurements were also done in metric and displayed in imperial units, as that was simpler.
@@Alkatross you do! All imperial measures in the USA are based on metric standards. The US inch is defined as exactly 25.4mm. However in the UK it's varied over time - at one point 1 inch was even 27mm. Also US military - 1 klick = 1km.
@@mrrolandlawrence i realize it's all the same stuff, but it's just the general knowledge of the measurement that I am missing as a native freedom unitarian.
@@Alkatross yeh that's one thing I'll never understand..... no one uses imperial tons in the USA. It's always lbs, even when it's in the millions. One curse that is worldwide though - TVs, measured in inches everywhere, and car tyres for some reason.
Once in a while I stumble across something like this that really makes up for all the other mediocre junk taking up space on YouTube, and makes me say to myself, "I'm really glad I watched that!" Dee, good job, this is pure gold👌 The content was a fascinating little slice of this awesome historic endeavor! It invokes more appreciation for all those super intelligent people it took to make the program happen.
Great video, thanks! Don Eyles, one of the programmers responsible for those jokes, wrote an interesting book called “Sunburst and Luminary: An Apollo Memoir” about his time working on Apollo.
It's a really good read! Don Eyles was the main author of the Lunar Module landing software and helped pioneer some coding techniques still in use, like memory fetch interleaving - used especially by game developers to get the most out of consoles - as well as interrupt-driven coding, used in OSes. The latter turned out to be key in keeping the information flooding mentioned in the video (indicated by the 1202 alert) from stopping the mission.
I remember machine code and hand compiling 6502 programs into hex! I also remember watching the landing live, aged 12. BTW it was July 20th 1969 not the 29th. As a computer operator back in 1983ish punch cards were still in regular use where I worked. Having recently retired after 40+ years in IT, based on talking to recent graduates I get the impression that basically Assembler is no longer taught in college. This is a mistake; short of some kind of electrical engineering degree there is, in my opinion, no better way to understand what the hardware's doing.
Fairly good description of my start. Watched moon landing, programmed on punch tape between 1972 and 1979, Fortran and punched cards 1980, Computer Science degree included hand compiled 6502, bit slice CPUs, and graphics processors along with Pascal, ADA, APL, Cobol. Finally went to C and Smalltalk.
@@ProfPtarmigan Pretty close. I started as an operator in 79, learnt Cobol & ICL PLAN assembler on the midnight shift from reference manuals. Switched to programming in 83, spent 6 years with a software house (Cobol, RPG2 & Wang Assembler), moved into the airline industry in 89, spent 9 years doing TPF & MVS assembler (both in the UK & US), then switched to Java in 98. Moved into DevOps in 2018 & retired in 2023. Somewhere in there I found time to snag a 1st class hons in, effectively, computer science from the Open University.
I started learning programming when I was 19, in 83. 8086/8088 assembly code was the common language. I still use 8086 assembly, as BASIC is often too limited. I got to watch a Saturn V launch in 71. That is when I got the programming bug. Thanks for this information.
My first programming was less than a decade after the moon landing, and we had a landline from college (UK, so age 16 to 18) to a local university mini-computer. The link terminal had a huge telex keyboard and a paper roll instead of a cathode-ray tube screen, and then you had to wait for most of the lesson (or frequently until the next one) to get the inevitable error message about a typo in line ten, or maybe the results of the program if you had been extra careful. A DSKY like the Apollo had was massively ahead of its time. NASA has large public archives concerning the tasks and planning for the tens or hundreds of thousands of people involved in the project, and the solutions to the new problems that had never previously needed solving.
Catching up was indeed a problem, with no budgets for that sort of thing, as computer use was perceived to be just something for accountants... The more enthusiastic students had Commodore or TRS-80 machines at home, at about the same time as the telex-and-paper terminals in college!
Assembly language also is not just one language. It's a family of languages that are specific to the processor the engineer is writing for. For example the assembly written for an Intel x86 chip would differ from say an AMD x64 chipset. Back in the day each computer had its own version of assembly, so the code written for this flight computer can really only run on this flight computer.
I learned 6502 assembly language progamming on the Apple ][ computer, a decade after the 1969 Apollo 11 moon launch. The page layout and formatting of the Apollo Guidance Computer assembly code is strikingly familiar, and similar to the assembly language syntax and commenting standards I used and encountered in programming Apple ][, Commodore 64, and IBM PC computers in the 1980's and beyond. It's inspirational to see how low-level programming culture and techniques evolved from the earliest examples of embedded systems programming on machines that predated modern microprocessors.
"Comanche" is pronounced as /ko-man'-chee/. "ko" rhymes with "go" but can be softened to "kuh". The second syllable is emphasized; "man" is exactly like "man". And "chee" rhymes with "me". The Comanche tribe of Native Americans here in the USA are headquartered in the state of Oklahoma.
@@msromike123, the typical American pronunciation is /uh-loo'-muh-num/. Emphasis on the second syllable. The last two syllables can be /min-um/ instead. You can blame our first prolific dictionary editor for all that. Old Noah Webster.
Excellent video. It's heartening to see young software developers pay homage to the giants whose shoulders all developers stand on. I was 7 years old when Apollo 11 landed on the moon. Later in high school, I read about the AGC and I was inspired to learn how to program. First course was FORTRAN with punch cards. I worked as a software engineer for 38 years and never forgot the lessons learned from those pioneers. PS I love your South African accent. Keep up the good work!
I remember playing with discarded punch cards as a child back in the early 1980s, I guess. My parents were students at the time & one of their student jobs was loading punch cards into the university's computers (and taking them out again later). They brought home a bunch, mostly to be used as note pad replacements. They also caused my child brain to imagine all kinds of funky things for all those strange hole patterns. Thank you very much for this fascinating & entertaining dive into our past; definitely one of humanity's most uplifting & most impressive achievements ever.
Great summary of the AGC. As others have posted, Curious Marc and team have brought back an AGC and rebuilt the code for multiple missions. In their current videos they are debugging ground radio commands to remotely program the AGC. Given the state of the art back then it was a fantastic accomplishment. Less well known is the LVDC (Launch Vehicle Digital Computer), designed by IBM, that guided the Saturn V and its astronauts to orbit.
Has LVDC been published like the Apollo 11 source? I'd love to see that code. Maybe Curious Marc could restore that hardware for an encore to the AGC project
@@shackamaxon512 Much of the extant LVDC hardware is now a tangled mass of wires because, to keep it light, the LVDC body was built of beryllium. These have degraded to uselessness over time. A shame too; the published specs make it clear the LVDC was also a ground-breaking machine.
Ah, the good old days when programming was creative and fun, when code wasn't shipped until it was just about perfect. Also, the verb/noun paradigm is so well-suited to human command logic. Great video. Enlightening. Entertaining. Well done.
I mean, the noun-verb paradigm was the true genius of the system. It was such an elegant solution to the size and weight limitations necessary in the AGC human interface component. Also, an amazing design in light of the need for the astronauts to be able to accurately interpret the output from, and make input to the AGC in times of extreme stress. Mind boggling!
You can be sure that most of the work on the Apollo software was not fun but very tedious. Those jokes tell a story of exasperation. On the flip side, coding can be very much fun today, and modern software can be far more reliable than what even the most brilliant programmers of the 60s were able to do with their limited tooling. If your experience tells you otherwise, then perhaps you should look around a bit more for better projects, other programming languages, etc.
Ah, the good old days when requirements were so simple and constrained. Even the smallest app today does so many more things than just "Fly rocket to moon". And then a separate program "Land it".
Margaret Hamilton's daughter often went to her mother's work and was allowed to play in a mock-up capsule that contained an AGC. At one point an alarm went off, and Margaret asked what her daughter had done; it turned out she had started program 00, which basically meant that the Saturn was ready for launch. Margaret warned her superiors, but they brushed it off, saying that astronauts would never be stupid enough to do that. Until James Lovell made exactly that mistake during the Apollo 8 mission, when it was already close to the moon. Mission Control was not amused. With much pain and effort they managed to restore the AGC remotely!!!
00 is the AGC idle ("Go to pooh") command. 01-03 are the commands for starting up a Saturn. Also, Lovell got 40 and 41 mixed up during Apollo 13, which they were able to unscramble quickly. You're thinking of another incident (maybe on Apollo 8?) that most of us haven't been exposed to?
@@fredflintstone9609 Yes, as I have already said, it was on the Apollo 8 mission. Lovell himself confirmed it. If I can find the link, I will add it here later.
The 1201 and 1202 alarms were causing AGC reboots during the Apollo 11 landing. This extended the range of the de-orbit and landing engine burn. The low-level assembly language allowed the AGC to keep up with required tasks, even with frequent reboots. Reboots were near instantaneous. I always wanted to find out how long it took Mission Control to isolate the problem to the enabled rendezvous radar. Note: the Apollo 14 landing used a lot of fuel also; they had only a minute of fuel remaining at touchdown. Apollo 14 also had an issue with a landing abort switch with an intermittent short, which had to be worked around with an in-orbit re-write of the Luminary code. Loved watching your video! 👏👏👏
MIT handled the 1201 and 1202 problem isolation and provided the fix just before 11 lifted off the lunar surface; see also George Silver. On 14, Don Eyles wrote some code that lived in R/W memory; rewriting Luminary was not possible as it was in fixed (think ROM) memory. See Sunburst and Luminary by Don Eyles.
Fascinating. The code itself is one of the best records of the mission. I also want to point out the amazing inertial sensors connected to the computer. This combination of brilliant engineering enabled the craft to remain perfectly oriented through all missions. Our phones have a similar device installed, and the Oculus Rift has a pair. A DJI drone hovers perfectly still in all wind conditions (below 25mph) because of its precise, near-weightless inertial sensors. For a computer, in 1969, to observe that data and respond to it accurately is mind blowing.
There is a book about the Apollo guidance computer called "The Apollo Guidance Computer: Architecture and Operation" which is pretty good reading. It had preemptive and cooperative multiprogramming, and an interpreter to implement more complex instructions.
Correct me if I'm wrong, but my understanding of the AGC is that it was NOT just a calculator, as some are led to believe. To believe that would be absurd. What was meant by that statement is that it had computational power in line with that of a calculator, but was far more capable bc of its preemptive & cooperative multi-tasking abilities.
@@TheWallReports I guess the problem is also that different people think of different things when thinking of a calculator. When I think of a calculator I think of a non-programmable device which has keys for a fixed set of mathematical operations and a display capable of showing a single number; however there were other electronics also going under the name of calculator that could be programmed and had graphical displays, and which IMHO would qualify as pocket computers.
@@TheWallReports It looks like a calculator because it uses a VERB/NOUN interface, but it could do a lot of things. The verb was a two-digit code (00-99) that defined the action. The lower codes were for early stages of the mission like prelaunch, while later codes were used for later parts of the mission like navigation, descent and landing back on Earth. This interface was something the developers came up with while expecting something better in the future, but nobody could think of anything better. The CPU had 16-bit registers. With the interpreter they implemented opcodes on double-width registers, doing the trig functions and vector/matrix operations needed for flight control. The CPU had access via I/O to many sensors and engine controls. The AGC was quite capable, but the ground facilities had much more powerful computers to augment it, because there were weight constraints on how big an onboard AGC could be.
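To make the verb/noun idea concrete, here's a toy dispatch table in C - my own illustration, loosely modelled on how the DSKY interface is usually described, not real AGC code. The crew keys in a two-digit verb saying what to do and a two-digit noun saying what to do it to:

```c
#include <stdio.h>

/* Toy verb/noun dispatcher -- purely illustrative handlers. */
static void display(int noun) { printf("display data for noun %02d\n", noun); }
static void monitor(int noun) { printf("monitor noun %02d continuously\n", noun); }
static void load(int noun)    { printf("load new value into noun %02d\n", noun); }

typedef void (*Handler)(int noun);

static Handler verb_table[100];   /* verbs 00-99, mostly empty in this sketch */

int main(void)
{
    verb_table[6]  = display;     /* e.g. "VERB 06" : display decimal */
    verb_table[16] = monitor;     /* e.g. "VERB 16" : monitor decimal */
    verb_table[21] = load;        /* e.g. "VERB 21" : load component  */

    int verb = 16, noun = 68;     /* something like "monitor the landing data" */
    if (verb_table[verb])
        verb_table[verb](noun);
    else
        printf("operator error: unknown verb %02d\n", verb);
    return 0;
}
```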
As a space nerd who spends a horrific amount of time observing the very small details and progress SpaceX is making with Falcon and now Starship, and who also has a physics and engineering background, I think you did a very good job of explaining the mission correctly. So many get parts of this wrong. And you made the inner workings of this very interesting. This is a great video. Have you ever seen the interview Smarter Every Day did with one of the engineers who built the Saturn and integrated all the systems that the code you reviewed ran on and controlled? It's awesome.
We were using punch cards up through 1991 on our ship. We had to learn Hollerith code and even had 5-channel paper tape and magnetic tape. We had guys coming down from various departments, sometimes with several BOXES full of the cards. Every once in a while, they would trip on the ladder (stairwell) coming down to our shop to give us the job! Those cards were stacked sequentially and had NO writing on them. They had to go back to their shop and re-run the entire job again to get the output. :)
So they missed the trick of running a pen down the side of a stack of cards at a slight angle so that even if the cards get shuffled it's quick and easy to get them back into order.
@@Heater-v1.0.0 Our cards were punched and also had the text printed at the top of the card. We were taught to tab over to the comment part of the punch card starting at I believe column 72 and type a sequential number. No one ever did it, since it was so slow to move over there. Dropping your punched card deck was everyone's nightmare.
@@dermick Yeah, we didn't bother numbering our cards when punching in Algol code in 1976. Seems a lot of people didn't cotton on to the trick of running a marker pen down the side of the deck. Anyway, luckily by 1977 we had terminals to a timesharing OS , no more card punching. Then very soon came micro-processors and personal computers. Happy days...
Fun fact: When the space shuttles were built, technicians in Palmdale, California used IBM punch cards as shims to establish gaps between the tiles during bonding operations. As late as 2008 we still found pieces of punch cards between tiles that flew many missions. The punch cards were very consistent in terms of thickness and techs could establish gaps with them to meet strict gap requirements very easily.
I was in the US Air Force, and we used punch cards well into the 80s. It was in the 90s, if I recall correctly, that we switched over to a system of remote terminals. Even then, it was the old system at heart, but you could enter data directly rather than via punch cards. This allowed for real-time updates to maintenance data rather than overnight batch processing.
Yep... I still remember the thrill of the first IBM 3270 terminal in the shop with CICS to enter our code on. The senior programmers hogged it up most of the time...
When I was in the Marine Corps, I was almost a mainframe operator. Still even lists that MOS on my dd214 as a secondary MOS. Luckily, due to some priority changes, I went to school to be a Small Computer Systems Specialist (an IT guy basically), but then ended up actually doing UNIX sysadmin work. I was around a lot of mainframe folks, though, both Marines and civilians.
My wife took a computer programming course at Brown University in about 1973. It involved figuring out what you wanted to do, punching holes in punch cards, then hand simulating each card to remove as many errors as you could, and then taking a box of cards to the computer center, to be fed into the card reader. Generally, the first try it would get a few dozen cards into the run and stop due to an error. It generally took a few hours before they could do your run, so that meant go back and figure out why you got the error that you did, and try to fix it. Then you would take the cards to the computer center and start the cycle again, hoping to get further into the run before it stopped again.
Checking online, it cuts deeper - that "burn baby burn" line was first used during the race riots in 1965 by a Radio DJ named Magnificent Montague in America... Super video, one of my absolute favourite software topics is the AGC. Many thanks for taking your time here, yet getting it all done in under 20 minutes. Well impressed! I am interested in what kind of instruction set they had available to them in terms of a Von Neumann machine, to be writing assembly on a portable computer in the late 60s. And also that the tech had to be shielded from extreme physical forces - ploughing thru Earth's atmosphere. And also, as others have said here in the comments, the actual point-to-point wire-weaving of the memory modules for the AGC, by real seamstresses with actual needles and a ball of wire, is the other most-phenomenal of facts about this software-hardware machine! Thanks for the upload, subbed 👍
I'm 67. At school they stopped the whole school for the lunar landing, wheeled out a television and we watched the landing LIVE. Later on in my career I spent 40+ years in IT, a specialist in aerospace/defence, several languages, no machine code, megabytes of code. Yes, I put in jokes. They can be useful: if an error says "Tony your shorts are on fire", you go straight to that place in the code to fix it; "Error 404" could be anywhere. NOW THE IMPORTANT PART. For Apollo, they taught me that they programmed in triplicate. For a key calculation, three programmers would code the algorithm. The onboard computer would run all three in parallel. If all three did not agree on the result, it would use the majority result from the two that agreed. If all three were different, it was "Shorts on Fire". From my experience they could not run the whole of the system in triplicate (we can now); this would have been for sub-sections of code.
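A minimal sketch of that majority-vote idea in C (my own illustration of triple redundancy, definitely not anything flight-qualified): three independently written routines compute the same value and a voter accepts any two that agree.

```c
#include <stdio.h>

/* Three "independently coded" routines that should compute the same answer. */
static double alg_a(double x) { return x * 0.5; }
static double alg_b(double x) { return x / 2.0; }
static double alg_c(double x) { return x * 0.4999; }   /* the faulty one */

/* 2-of-3 voter: return a value that at least two routines agree on,
   otherwise report the "shorts on fire" case. */
static int vote(double a, double b, double c, double tol, double *out)
{
    if (a - b <= tol && b - a <= tol) { *out = a; return 1; }
    if (a - c <= tol && c - a <= tol) { *out = a; return 1; }
    if (b - c <= tol && c - b <= tol) { *out = b; return 1; }
    return 0;   /* all three disagree */
}

int main(void)
{
    double result;
    if (vote(alg_a(100.0), alg_b(100.0), alg_c(100.0), 1e-9, &result))
        printf("majority result: %f\n", result);
    else
        printf("shorts on fire: no two results agree\n");
    return 0;
}
```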
I'd just like to thank you for a very clear and interesting explanation. I remember watching the moon landing as an 11-year-old kid, then in my last year of middle school learning Fortran programming using the write the code -> convert to octal -> punch cards with an unbent paperclip -> bind your program with a rubber band -> wait for the printout to return from the university the next week method. In the early 80s I was programming simple games in Z80 assembly language for the simple home computers of the day. Persistent memory was cassette tapes!
Note: Assembly code is platform specific. x86 assembly is very different from ARM64 or 6502 assembly. They have completely different syntax, opcodes, registers, etc. Each CPU type's assembly code is essentially its own language, and there is generally no direct parallel between different platforms.
Oldish assembly coder here. Whilst assembly can be different on each platform, the biggest difference is big-endian versus little-endian byte order, where multi-byte values are basically stored backward (kinda). And yes, opcodes, registers, memory etc. are all different, and those can even differ from one chip to the next within the same architecture family.
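If you've never bumped into byte order, it's easy to see from C. A quick sketch (nothing assumed beyond a machine with 32-bit integers) that prints how the current machine lays a value out in memory:

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t value = 0x11223344;
    unsigned char *bytes = (unsigned char *)&value;

    /* Little-endian machines store the least significant byte first,
       so memory reads 44 33 22 11; big-endian machines store 11 22 33 44. */
    printf("in memory: %02x %02x %02x %02x -> %s-endian\n",
           bytes[0], bytes[1], bytes[2], bytes[3],
           bytes[0] == 0x44 ? "little" : "big");
    return 0;
}
```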
I agree they are different, but.... I know Microchip assembly very well, and if I have a quick look at a reference manual, I can easily read and understand any other flavor of assembly.
@@TheHeff76 Endianness is rather insignificant. Big-endian is easier for a programmer to read but slower for the computer (little-endian lets you start doing maths as soon as it's read the least significant byte). Load/store vs register/memory architecture is a more important distinction. The way the 6502 updates flags after every instruction, while the 8080/Z80 only does it on a few, is a larger difference. Even the indexing modes on the 6502 are vastly superior. Trying to switch between 6502 and Z80 (which are both little endian) can be annoying. The 6502 is the best CPU ever.
@@phill6859 I've never done anything with the 6502, but I heard it had very few registers. Also, every instruction updating the flags sounds more like a nightmare to me; what if you want to test more than one flag? Also it means that you'll more often need to store the flag's value somewhere else for later reference.
@@__christopher__ Depending on which instruction is running, only some of the flag bits are changed, not all of them. Normally only logic instructions like and, or, not, rotate, or math like add, sub, compare affected any of the flags; then you use conditional jumps to branch on status. Even though it might refresh the flags on every instruction, it doesn't usually change any of them.
That's Epic; thanks for clearing up the situation between the AGC and the Lunar Module Pilot: Buzz "Computer, tell me the altitude", Computer "Hold that thought Buzz, I'm a little Busy at the moment! (1201 & 1202 alarms)"
Apollo's AGC was the first computer to use ICs (integrated circuits / chips). There was a lot of discussion about this, but the team decided that they were the only way the project could be realized.
Whilst they used some super basic logic gate ICs in the AGC, the real leap goes to the CADC in the Grumman F-14 Tomcat, which was designed around 1968-70 and used the first set of ICs that would in combination be considered a microprocessor, the computer being the only way to make the swing-wing design flyable.
This was so interesting and I'm not a developer. You can take almost any aspect of these first steps in space exploration and have enough material for like 50 feature films. This video does a great job illustrating how ground-breaking this tech was and it's nice to see articulated kudos to all of the brilliant minds that were the OGs of the quest for space. But calling assembly "challenging" is like calling the Burj "tall" 😆Great vid!
I started on 6502, 8088, and then later VAX Macro. I still love assembly language programming; there's something unique in knowing you are literally dealing with the instructions themselves. Now I use Python, which is nice, but it doesn't quite feel the same.
Based on my experience using assembly language in the 1970s, you needed some sense of humor or else you would lose your mind. Most of the humor in my code and those of my colleagues was more like graveyard humor and rather dark. The comments and subroutine names you quoted are much more amusing than the ones we had.
Great video! Loved it, used to enjoy programming my first computer (C64) in machine code. Very impressed with the source code that made sure we REALLY could land on the moon and get back safely!
People might be amazed to know that there is STILL a LOT of IBM BAL assembly code running - especially in large, critical applications where companies deem it just too expensive to rewrite, test, and certify new code. The banking industry is notorious for this - lots of ancient code on HOGAN systems, still running hierarchical databases like IMS...
Hi... thanks for the nice video about the Apollo software (and hardware). I have written a lot of assembly code as a computer engineer, so I can appreciate what it took to do this work. It required a great deal of planning from someone at the top to structure this, given that assembly code by nature does very little per instruction... requiring LOTS of source code. It looks like they broke the code down into many small parts, which makes sense.
I don't recall how they handled decimals, but it is worth noting that if you understand the range of numbers you'll be dealing with and the accuracy you need, there is absolutely no need for floating point. Floating point is way more flexible, since you have both significant digits and the ability to place the point within those digits, but if you don't need to move the decimal, integers are fine - and they would have had a pretty solid idea of the range of numbers possible for any reading or calculation, the accuracy of the inputs, and the usefulness of the outputs. Genuinely amazing code.
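A tiny sketch of what that looks like in C - my own example of scaled-integer (fixed-point) arithmetic, not anything taken from the AGC: pick a fixed scale, keep every quantity as a scaled integer, and only convert when displaying.

```c
#include <stdio.h>
#include <stdint.h>

/* Fixed-point with a scale of 1/1000: the integer 12345 means 12.345.
   If you know the range and precision you need in advance, plain integer
   adds and multiplies are all you need -- no floating point required. */
#define SCALE 1000

typedef int32_t fixed_t;

static fixed_t fx_add(fixed_t a, fixed_t b) { return a + b; }

static fixed_t fx_mul(fixed_t a, fixed_t b)
{
    /* widen to 64 bits for the intermediate product, then rescale */
    return (fixed_t)(((int64_t)a * b) / SCALE);
}

int main(void)
{
    fixed_t altitude = 1234567;   /* 1234.567 units            */
    fixed_t rate     = -2500;     /*   -2.500 units per second */
    fixed_t dt       = 1000;      /*    1.000 second           */

    fixed_t next = fx_add(altitude, fx_mul(rate, dt));
    printf("next altitude: %d.%03d\n", next / SCALE, next % SCALE);
    return 0;
}
```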
This video is brilliant and engaging. I loved watching it even though I'm not a coder. One tiny thing though - the command module was shown without the service module attached. The command module is attached to the service module throughout the mission except for just a few minutes before landing, when separation exposes the hemispherical heat shield on the bottom of the command module.
First I learned a pseudo assembly code the teacher wrote... then Fortran with both large and then small punch cards, then Basic on the IBM terminal, then APL on the terminal. We also learned a smattering of Cobol and Pascal. Later on I taught myself a bunch of flavors of Basic, including Mallard Basic on the Amstrad, AmigaBasic and a compiled Basic for the 68000... as well as C and the 68000 family of assembly. Also on the Amiga I learned E, Pascal, and a bunch of others I can't remember anymore. Then came the MS-DOS era with 6800 asm, MS Basic, Java & Javascript, HTML, Perl, ASP, PHP... now for the past 5 years I've been a dev using C++ with the Qt framework. Now learning C# using VS and the .NET framework.
Assembly is still very much relevant today. In fact, every single time you compile software, you are compiling it into ASM (Assembly) and then the assembler takes over from there.
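You can see this for yourself with any C compiler. A minimal sketch, assuming a GCC- or Clang-style toolchain where the -S flag stops after generating the assembly file (the file name is just an example):

```c
/* scale.c -- compile with:  gcc -O2 -S scale.c
   then open the generated scale.s to read the assembly the compiler
   produced for you (the exact instructions depend on your CPU). */
int scale(int x)
{
    return x * 3 + 1;
}
```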
This is true: if you need efficiency and speed from a CPU and its peripherals, you switch to assembly, write a subroutine and call it from whatever higher-level language you're writing in.
I loved the video and appreciate the background information provided. I should point out that Margaret Hamilton was certainly not standing next to a single listing of the source code. Someone worked out that the entire GitHub repo comes to about 1,751 pages, which would not create a stack that size. The most likely explanation is that it is a combination of test data run against the source, or possibly several printouts stacked for effect.
I can see why there are moon conspiracies. These engineers performed MAGIC. Forget the primitive hardware; the sheer elegance of practical code to control a system of that complexity seems like sci-fi mumbo jumbo.
@@airaction6423 Science hasn’t been prioritized since then. Even back then, it took the Cold War to make governments and their citizens see science as a high priority.
@@airaction6423 The reasoning is written on the wall. The US went there for geopolitical reasons: win the Cold War. It went there after Apollo 11 too, multiple times. Then the space race was won... and the Cold War itself was won. The US government virtually defunded NASA; it had served its purpose. That's why. And it's sad.
Seeing how it is still difficult today to put a craft on the moon (there have been recent failures), let alone a human, it makes NASA’s success in putting 12 people on the moon with 1960s technology even more awe inspiring. When done right, a government agency can do fantastic things. Private enterprise can too, but only if there is a profit motive. Public good vs private profits.
7:45 As someone who has written assembly language programs for 64K machines, I can say that the stack of printouts that Margaret is standing near in this photograph must be for the *entirety* of software for the Apollo program, not just the lunar and/or command module AGCs. A printout of source code for an assembly program (even one heavily-commented) that fits in 73K (3,840 RAM + 69,000 ROM) might require the number of pages of one of the smaller binders.
In those main files in the GitHub repo you can see that Luminary099 had 1743 pages and Comanche055 has 1751 pages, which is like a box of copy paper. That stack looks more like 8k pages or so. I think you're on to something.
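A rough back-of-envelope in C supports that (the ~0.1 mm sheet thickness is my assumption; the page counts are the ones from the repo above): the two flight listings together only come to roughly 35 cm of paper, nowhere near a person-high stack.

```c
#include <stdio.h>

int main(void)
{
    /* Rough estimate only: page counts from the GitHub listings,
       sheet thickness (~0.1 mm fanfold paper) assumed. */
    int pages = 1743 + 1751;          /* Luminary099 + Comanche055 */
    double sheet_mm = 0.1;
    double stack_cm = pages * sheet_mm / 10.0;
    printf("%d pages is roughly %.0f cm (%.0f in) of paper\n",
           pages, stack_cm, stack_cm / 2.54);
    return 0;
}
```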
She has said in interviews that when the reporter and photographer showed up, they came up with the idea of making the "stack of code." So, they went around the room and literally just grabbed whatever binders were there. So, that's probably not all source code. I suspect a lot of the thicker binders are just data dumps and other diagnostic listings. After all, computer only had about 36k of assembled code. Even with copious comments, the source code for that is not going to be a huge pile of paper
Yes. Each book is the source code for one AGC. But “Luminary 99” means the 99th version of Luminary, so there were far more versions than is visible in that stack. For that photo, they just grabbed a bunch of printouts that were lying around.
There's no way that stack is just the code for the command module and the lunar lander. I just downloaded the repo from GitHub, and the whole repo is 177 files @ 3.34MB. Given printing on 8 1/2 x 11 paper that's roughly 342 pages... less than a 500-sheet ream of paper. I'm assuming those printouts were the typical green and white wide tractor feed... so it would take less than 342 pages. The project I'm a dev on is almost 6,000 files at 2.18 GB... which is about 46K pages or 92 reams of paper, and @ 2"/ream that's 182 inches or 15.3 feet. The photo is pure made-up BS.
Great video! This relates to my favorite computer hack on Apollo 14 to bypass the abort button. It’s too much to type in a comment but might be something for you to cover in a future video
The book “Sunburst and Luminary: An Apollo Memoir” by Don Eyles is also an amazing look at the creation of the AGC and specifically the software from the start.
Good video. BUT big ERROR at 0:52. NOT the 29th.... the 20th of July was the landing on the moon. Such a well known fact. So easy to check. Google it. I was watching it at that time. Please correct or check your script. You are normally precise. Now can we trust your facts on the rest if you missed this? Still, I will watch your videos because your presentation is excellent.
I’m reading the book Sunburst and Luminary : An Apollo Memoir by Don Eyles. A really great read on the AGC’s lunar module software. Coding With Dee, you got the explanation for the 1201/1202 alarms spot on, according to Don’s book.
As an ex assembly programmer (on the PDP-11, about a decade after this) I can add that it was good practice to comment every line of code. They were, after all, cryptic instructions involving registers and memory locations. Actually the PDP-11 had a great assembler. All registers could be incremented and used as memory pointers. Once you added macros you were close to C. In fact, C was essentially sugared PDP-11 assembly code.
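That "sugared assembly" kinship is easiest to see in pointer code. A minimal C sketch (the classic string copy) where the loop body maps almost directly onto the PDP-11's autoincrement addressing mode:

```c
#include <stdio.h>

/* Copy a NUL-terminated string. On a PDP-11 the loop body is essentially
   MOVB (R0)+,(R1)+ followed by a branch -- the autoincrement addressing
   mode that C's *dst++ = *src++ idiom grew out of. */
static void copy(char *dst, const char *src)
{
    while ((*dst++ = *src++) != '\0')
        ;
}

int main(void)
{
    char buf[32];
    copy(buf, "hello, PDP-11");
    printf("%s\n", buf);
    return 0;
}
```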
As a fellow PDP-11 programmer I approve this 😁. Just a small addition to the nice comments: I once saw this comment (IMHO also citing Shakespeare): 'Our faults dear Brutus are not in our stars, but in ourselves' - error handling function of the RT-11 operating system. I used RT-11 sometimes, but mainly worked with RSX-11M. All were 'real programmers', not just quiche eaters 🤣.
Likewise... this brings back equal memories of nightmares (when it didn't work) and huge euphoria (when it finally did work). We were interfacing A/Ds to a PDP-11 for data acquisition. We set up "ping pong" buffers so the A/D could be writing to Ping while the PDP-11 read Pong and wrote it to disk... then flip the pointers. Hopefully the "bathtub" never overflowed while filling and draining at the same time. It was those other pesky interrupts from disk writes that could ruin things... now, what was the sustained disk write speed for 8k vs 16k vs 32k block writes???? As I get close to retirement age (actually past it, but still love being an engineer), I'm thinking about getting into Arduinos for Halloween projects.
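For anyone who never met ping-pong buffering, a bare-bones C sketch of the idea (invented names, no real A/D or interrupt handling): one buffer fills while the other drains, then the pointers swap.

```c
#include <stdio.h>

#define BUF_LEN 8

/* The "fill" buffer stands in for the A/D converter writing samples;
   the "drain" buffer stands in for the disk write. In the real system
   the fill happens concurrently in an interrupt handler; here the two
   phases simply alternate to show the pointer flip. */
static int buf_a[BUF_LEN], buf_b[BUF_LEN];

int main(void)
{
    int *fill = buf_a, *drain = buf_b;
    int sample = 0;

    for (int block = 0; block < 3; block++) {
        for (int i = 0; i < BUF_LEN; i++)             /* "A/D" fills one buffer */
            fill[i] = sample++;

        int *tmp = fill; fill = drain; drain = tmp;   /* flip the pointers      */

        printf("writing block %d to disk:", block);   /* "disk" drains the other */
        for (int i = 0; i < BUF_LEN; i++)
            printf(" %d", drain[i]);
        printf("\n");
    }
    return 0;
}
```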
@@EngRMP The Arduino is good. I'd recommend getting some of the clones, particularly the Nano version. I would also recommend using 2.54mm pin header sockets and connectors so the parts can be removed and replaced easily. I generally gravitate towards the ESP32, ESP8266, and the Raspberry Pi Pico; however those devices use 3.3 volt logic vs the Arduino's 5 volt logic. It's easier connecting sensors and output devices to the Arduino, since most are going to be 5 volt logic and you won't require additional hardware. Get yourself a package of about a hundred 2N2222 transistors or some 2N7000 FETs. It's possible to drive higher-power MOSFETs directly, but if I remember correctly 70 mA is about the most you want to source from the ATmega328 and stay within safety parameters. There's no real advantage to using real Arduino boards over the clones. I hope you'll enjoy building projects with Arduino as much as I have. The only limitations are going to be your imagination and your wallet. Oh... I usually buy in quantities of 10, so if something goes wrong I have plenty of spares available without any waiting. Besides, this stuff is cheap. Take care.
I never actually wrote any useful PDP-11 assembly code, but my 2nd-semester programming course in college (Spring 1972) involved writing a PDP-11 simulator in the first half of the course and then a PDP-11 assembler in the second half, so I became fairly familiar with the PDP-11 instruction set. They made some nice design choices. The auto-increment and auto-decrement addressing modes were useful in lots of situations, and having the program counter be one of the 8 registers meant that pc-relative addressing for branches and subroutine calls was available to the programmer (or compiler writer) without it having to be a special distinct addressing mode.
My dad was a technical writer working on the Apollo project for North American Aviation (later North American Rockwell). The Lunar Module (sometimes called the Lunar Excursion Module or "LEM") was built and tested near his office in Downey, California. He would have enjoyed this video.
FYI, "Burn Baby Burn" was a term associated with the 1965 Watts Riots. It was often quoted by college students of the time. The MIT programmers would have been familiar with it's source and meaning.
Don Eyles wrote the descent and landing part of the program. He had a great sense of humour, and I'm certain that he was responsible for most if not all of the crazy comments in the source code.
Great video, thanks for doing this. I'm a big fan of the AGC. It really was an amazing accomplishment, and in many ways it drove the advances in CPU design, computer architectures, and software development that set the stage for the computer revolution that followed. If you are interested in this kind of computer history, there is a book titled Apollo Guidance Computer that goes through its hardware and software in lots of detail, yet remains approachable even if you don't write machine code just for the fun of it. That being said, your video provided some insights that weren't in the book. By the way, I think most calculator comparisons are based on memory or processing power and are rarely accurate.
I went to college in the late 80s. I was the first CompSci class to NOT use punch cards for computer programming classes. My internship was at a company that still used a punch card machine; for FIVE cards per day! So, this technology lingered on and on!
Same. I learned on PDP-11s running RSTS/E in college on the 80s. We’d use VT52 terminals or even the teletype machines in the lab. But on my work experience I had to deal with punched cards and paper tape.
@@TesterAnimal1 Yes, well, this is typical for every generation. At university we mostly learn the cutting-edge science and tooling (but also sometimes got UV-erasable EPROMs for programming an SPS, i.e. a PLC 🙂). When you get to industry, you realize that most of it is at least 10 years behind.
I took a programming course (FORTRAN) at a college in the summer before my senior year of High School. We used punch cards. By the time I started at the same college as a freshman, in 1978, punch cards were obsolete.
Excellent video. For anyone interested in learning more, I recommend the excellent memoir “Sunburst and Luminary” by one of the software engineers, Don Eyles. Also, the BBC podcast “13 minutes to the moon” does an amazing job of putting you right into the drama of those 13 minutes from undocking from the command module to landing on the lunar surface.
Back then very few people knew about computer programming. Very little memory available to store it on. Crazy but true. Amazed that they got there and back.
But but but Stanley Kubrick and the studio! And what about the Van Allen Belt? Huh, what about that pretty lady? And everyone now knows that only computers can do complex math! Got'cha there...
Stanley Kubrick obviously filmed the moon landings, the space agency had no intention of going to the moon. What people don’t realize is that he’s such a stickler for detail that they ended up going to the moon to film in-situ.
What is amazing is how, given the level of computer technology at the time of the Apollo missions, they were able to build all the necessary functionality into a computer that could fit into the Apollo Command Module and the Apollo Lunar Module. Both modules put strict limits on the size and weight of the Apollo Guidance Computer; there was no luxury of large size and heavy weight. So the Apollo Guidance Computer was likely among the smallest and lightest computers of its time.
Yes. The AGC was the first computer *in the world* to use integrated circuits. Those were chips containing only two logic gates each, but they were an advance over the use of individual transistors as was common back then.
Really nice summary. I programmed in assembly in the late 70s and key punch cards were used initially followed by an editor. Used cards, paper tape, mag tape and Mylar tape. Our machine used core memory.
12:59 You cut that quote short. It looks like it was split over a page boundary, if I understand the "Page 393" marker correctly, and the last bit is on the line after that. So, it should end as, "...such abominable words as no Christian ear can endure to hear."
Yeah that kind of bothered me more than it should have. Also not sure why the programmer gave the page number, because that's going to vary in every edition anyway. Instead you go by act, scene and line numbers.
@@prosimian21 It's a digital transcription of the source code that was originally in printed form, so I'm pretty sure the "Page 393" is just marking where page 393 of the code starts, and is nothing to do with the Shakespeare quote.
Great video! I always think of the AGC as an HP-12C on steroids-you input numbers, store them in registers, and call commands to have the "calculator" compute and give you results. The part where it gives the astronaut instructions to correct the trajectory works like that. The astronaut observes the positions of 40+ stars and inputs them into the computer, entering the star number and its position. Then, the AGC calculates the required correction, providing the result in terms of thrust duration and timing. This meant the astronauts had to manually ignite the thruster and control how long it stayed on (which is where the beautiful Omega Speedmaster came into play). The AGC didn’t control the thrusters directly or track positions automatically. I’d love to see what the AGC code would look like in C or another modern programming language.
Uh, no. The AGC _did_ control the thrusters directly and the astro only had to sight/mark on two stars (of 40 possible the AGC knew about). The astros had access to a manual mode where the AGC controlled the thrusters based on the inputs from the controls inside the modules, but this was only used once on 13 because the LM AGC had no software designed for a docked burn.
@@fredflintstone9609 I see. Is it the "Auto maneuver" in this flight log? 022:51:55 Aldrin: Apollo 11... 022:51:58 McCandless: Go ahead, 11. 022:52:02 Aldrin: [Faint] Roger. Standing by for your updates. Over. 022:52:10 McCandless: Okay, 11. This is Houston. At time approximately 22:30 in the Flight Plan, in your post-sleep checklist, and in all other post-sleep checklists, we'd like you to delete the statement that says Auto RCS Jet Select, 16, to On, and what we're doing here is picking this up in the procedure for exiting PTC that's in your CSM checklist. And in the CSM checklist on page Foxtrot 9-8: if you want to turn to that, we'd like to change the order of the steps in that. Over. 022:53:13 Aldrin: [Faint]Stand by. [Long pause.] 022:53:37 Aldrin: [Faint]Okay. Page F9-8. Go ahead. 022:53:40 McCandless: Okay. Right now it reads, "To exit G&N PTC". Then you've got a pen-and-ink change that says, "Auto RCS Select, 12, Main A and B". And you come down to the printed Step 1. We'd like to take and move the "Auto RCS Select, 12, Main A and B" down to be the second step, so the procedure would read, Step 1, "Manual Attitude 3, Accel Command"; Step 2 "Auto RCS Select, 12, Main A B." Step 3 would be, "Verify DAP Load", and so on. Over. 022:54:16 Aldrin: [Faint] Roger. I copy. Is that "Auto RCS Select, 12, Main A, Main B" to be the... [Fades out.]
Assembly is not “a” language, nor is it confined to history books. Every processor architecture has its own unique assembly language designed for the machine code used by that processor. Assembly languages are still widely used today for situations where you need very compact, fast, and/or efficient code for portions (or all) of a program. Also you cut off the Henry VI quote, it had the remainder but whoever put it in that comment left a page number line in the middle of it for some reason. (And, pointless nitpick time,😂 comments in that language are preceded by hashtags, not followed by them. 😉)
I'll program in assembly or direct in machine (or even COBOL) any day over "programming" using Spring, TypeScript and Kotlin. Those are the worst "programming" languages / frameworks ever made.
It's a great computer, incredible for the time. Good fun to program, I made a couple of games for it a few years ago and in terms of raw speed is on par with 8-bit computers from the 80s. I had to put in artificial delays so the games wouldn't run too fast.
It's crazy that we don't even have the ability to replicate that and go back right now. Some astronauts have said our current technology can't do it but we're so close again to making it happen. Think back when so many millions of people had to listen to the radio but now the incredible advancements in video when we go back to the moon is going to be insane! I can't wait.
It would be very unwise to reproduce this 60 year old technology. The goal was to put a man on the moon within 10 years so it didn't have to be that efficient and money wasn't an issue then. They had been there 6 times and there was no need to go back. SpaceX shows how it should be done now. Artemis is somewhere in the middle by using upgraded Space Shuttle hardware.
@@REEREEREE33 you're a crap listener then, because no current space engineer ever said that. What we don't have is the entire extremely expensive supply chain that was built for that project alone, at an enormous cost.
@@vitalyl1327 I doubt cost is the real issue. It's the non-existent ROI. What would anyone currently achieve by leaving orbit? No monetary gain prospects in any way, shape or form. The Americans just wanted to beat the Russians to the moon, so they were willing to spend just about every cent they could dig out of their pockets to make it happen, and make it happen FAST they did. Saying that we can't do it is a pretty fucking stupid thing to say.
I have an affection for that era, as my second computer experience (after the Dartmouth IBM mainframe, at age 12-13) was with a DEC PDP-8e, a moderate upgrade on the tech developed for this AGC. It was far easier with old Teletype terminals and a high-speed optical paper tape reader than dealing with punch cards, or the limits NASA's AGC had of 2K of RAM and 36K of ROM as its storage limit. MIT's work on the AGC no doubt had crossover to DEC engineering. When we think of standard programming languages in the CP/M micro era and since, that was NOT the AGC. The PDP-8e could run BASIC, FOCAL, COBOL, FORTRAN, or a two-pass ASM assembler. NASA's AGC ran AGC4, its own unique hardware-specific code. MIT's engineers amazingly outdid one major NASA contract spec for the AGC: they came out with a 70 lb computer, well under the 100 pound design goal limit. That in itself was amazing at that time. It's been a LONG time since I wrote ASM code. One reason Steve Gibson's hard drive utilities are so efficient is that assembly is one of his long-term specialties, while other code bloats just on GUI interfaces that may hog more than a million times the video RAM on which we ran entire space missions in 1969.
Yeah...I noticed the date glitch: I clearly remembered watching the landing 2 days before my 10th birthday! Great video...really brought it to life!
Even more mind blowing is that the AGC's memory was HAND WOVEN.
Fun fact: the “core dump” term comes from this era, when ram was implemented using magnetic cores.
@@Myndale with core-rope you can talk about 'hard coding' 😂
Kudos for making the distinction between assembly code and machine code. Some today don't even know the difference.
@@markrosenthal9108 Nobody argued that an assembler is like a compiler.
One other thing to keep in mind about assembly language is that it's not a single language. Each processor architecture's assembly language is unique; e.g., the AGC's assembly looks completely different from 6502 assembly, which looks completely different from i386 assembly, which looks completely different from ARM assembly, which... you get the idea. This was because assembly instructions map 1:1 to machine instructions, which of course are completely different for different architectures.
@@markrosenthal9108 It's not necessarily 1:1. Assemblers support macros that can turn one assembly statement into 2 or more machine code instructions. MIPS CPUs don't have an instruction for loading a 32-bit constant, but there is a "pseudo instruction", li, which is turned into lui + ori.
The main difference is you can use all the instructions available, while high-level languages only use a subset and they won't let you directly use things like CPU flags. An issue that I have faced is C not letting you detect carry or math overflows without wasting time on unnecessary calculations.
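For anyone curious, here is a minimal C sketch of the usual workarounds: a manual wrap-around check for unsigned addition, plus the __builtin_add_overflow extension (a GCC/Clang extension, not standard C).

#include <stdint.h>
#include <stdio.h>

/* Two ways to notice a 32-bit unsigned overflow in C without dropping
   to assembly: a manual wrap-around check, and the GCC/Clang
   __builtin_add_overflow extension (not part of standard C). */
int main(void) {
    uint32_t a = 0xFFFFFFF0u, b = 0x20u;

    /* Manual check: for unsigned addition, overflow happened iff the
       result is smaller than either operand. */
    uint32_t sum = a + b;               /* wraps modulo 2^32 */
    int overflowed = sum < a;

    /* Compiler builtin: returns nonzero if the true result didn't fit. */
    uint32_t sum2;
    int overflowed2 = __builtin_add_overflow(a, b, &sum2);

    printf("manual:  sum=%u overflow=%d\n", (unsigned)sum, overflowed);
    printf("builtin: sum=%u overflow=%d\n", (unsigned)sum2, overflowed2);
    return 0;
}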
@@markrosenthal9108 It's not quite 1:1, because even without a macro assembler, there are tricks you can do with machine code, that's difficult or meaningless with assembler, like designing code that executes differently if you jump into the middle of a multi-byte instruction in an instruction set that supports a variable-length instruction set (like x86 or Z80 or 6502 or 68K).
I just smirk when people say older people don’t understand tech. As a 14 year old, I learned about microcomputers on an Intel 8085 development kit. You would write a program on paper using the 8085 assembly instructions, then look up the machine code for those instructions in the manual and enter them using a keypad. 8 years later at university I was using my personal 386 PC to run Mathcad etc. It is amazing how rapidly things are developing. Apparently, building a single cellphone out of the valve tech of the 60’s would cost more than the total world GDP of that era. Great clip about this famous computer. A testament to pushing the limits of the tech you have available and the character of the people faced with solving problems.
My lecturer taught me the concept of memory using a transistor-based flip-flop to store a single bit. Fast forward to the 90s, when I learned to program on a Zilog Z80 using a keypad to produce something on a seven-segment display. Despite how fast technology is evolving, it's comforting to know that Arduino systems and breadboards are still widely available nowadays and youngsters are eager to learn from those.
I'm 82 now. My first computer was a KIM-1.
@@thesoundsmith The KIM-1 was super fun to use. I recently learned there were a bunch of expansions for it. I'd only used the base board with its hex keypad and LED display. Maybe a homemade cassette interface (from BYTE, iirc, but it might have been for the SYM).
Yip! Same for me in 6502, back in 1979: Pure assembly language driving hand-wire-wrapped custom hardware for a microtonal music synthesizer. I entered it into Science Fair, although I probably would have done the project either way.
i'm 15 and i wish i got that 😔
I started playing with computers around '69/70 and started programming for space systems around '80. When I began, cards were for development and paper tape was for finished code (it was more compact but really really annoying to edit). Fast forward ten years to around 1980 and terminals faster than a TTY were starting to become common, which made all the programmers happier -- even at 300 baud. That said, I was still doing a fair amount of programming with cards well into the 80s. Many programs were developed using FORTRAN to work out the algorithms and logic (C wasn't yet mainstream, nor vetted for space missions where I worked) and chase out a lot of the bugs, but in the end we were still translating that by hand into "optimized" and commented assembly (i.e. just compiling and doing an assembly dump as a shortcut wasn't an option). It was a treat when you got to use an AL you already knew from a previous computer; still, you ended up learning a lot of them.
Programming certainly has changed since then! When I started working I did some assembler, but that work died out and haven't touched it since. The lowest language I use is C, and now using frameworks is important.
Dorothy Vaughan, NASA's first African-American supervisor, taught herself Fortran when NASA invested in IBM computers.
Do you still program in Fortran? Will Fortran ever be deprecated? I'll bet it will still be used in the 2100's and beyond. The more time passes, it's less likely to ever be replaced.
@@TerryMundy Honestly, I never loved programming in FORTRAN, but what it does, it does well -- which explains its recent resurgence. Do I still program in it? Nope. I haven't had to for a while. Nowadays I tend to use C-family langs, Python, or a microcontroller assembly.
I have worked on Assembly language programs and it really finds out your weaknesses and forces you to learn. I am not a professional developer BTW, but I do have a passion for embedded systems.
I know why they used assembly so much back in the day. It was their only real option.
Thank goodness for really good compilers and large memories nowadays. WE ARE SPOILED.
👍🏾True. The thing with assembly language, and the same can be said for machine language, is that the coder has to know and understand the actual CPU and hardware, because you're working with and manipulating registers and memory directly, more or less, depending on whether virtual memory is active.
@@TheWallReports in the days when I was writing in 6502, there was no memory protection, no multiply or divide, and no hardware breakpoints.
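To show what "no multiply" means in practice, here is a small C sketch of the shift-and-add multiply routine you would hand-code on a CPU like the 6502; the function name mul8x8 is just an example name, not from any real codebase.

#include <stdint.h>
#include <stdio.h>

/* Shift-and-add multiply: the kind of routine you hand-roll on CPUs
   like the 6502 that have no multiply instruction. It works one bit
   of the multiplier at a time, just as the assembly version would. */
static uint16_t mul8x8(uint8_t a, uint8_t b) {
    uint16_t result = 0;
    uint16_t addend = a;            /* widened so shifts don't overflow */
    while (b) {
        if (b & 1)                  /* low bit set: add current addend */
            result += addend;
        addend <<= 1;               /* next bit is worth twice as much */
        b >>= 1;
    }
    return result;
}

int main(void) {
    printf("13 * 11 = %u\n", mul8x8(13, 11));  /* prints 143 */
    return 0;
}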
I still remember my first days meeting with assembly. It was in 3rd grade of elementary school... No, I was not some Sheldon Cooper level genius in some "gifted kids' school". It was the early 90s, and my school decided to buy modern computers (so 286s instead of Commodore 64s) and start computer science courses for the kids. It was really forward thinking at the time, sure, but there was one tiny problem. None of the teachers knew ANYTHING about computers, so they assigned the science teacher to learn about computers from books and teach the kids what she learned. Basically she was ahead of the classes she taught by one or two lessons.
And what exactly do you think they thought would be the best way to introduce kids to computer science? Yes, exactly what you thought: assembly language. During the first few lessons we learned what peripherals are, then right after that we were introduced to registers, stacks and heaps, and data manipulation verbs. Within about two months, all the kids were taken out of that course by their parents.
@@Jenny_Digital Same here. I wrote a few simple arcade games in 6502 back in the 80's on my Atari 800. 6502 was so simple that it made it very difficult to program. At college in 1987 I took an assembly language class which involved programming a mainframe computer. I don't remember the name of the CPU but the assembly was much more advanced making it fairly easy.
@@jp5000able I had an Acorn Electron with only a black and white telly and a tape recorder, but the built-in assembler meant I got my start in 1983 when I was only six. I had no printer so I typed my listings using an imperial widecarriage typewriter my dad had rescued from the tip whilst working for the council.
Back then my computer came with a very comprehensive user guide, getting started book and demo tape. Now you get a sheet or two of paper telling you how to plug the thing in and not to open it.
Curious Marc is a treasure trove channel. Going to auctions, nabbing old gear, doing hardware debugging, rebuilding display tech... Then casually piloting a landing. They are amazing.
Are they? 😜
It's really fun to see all the old gear they get working. One of my favorite channels.
@Fred-yq3fs Yes, and their series of videos about the Apollo systems is fascinating. At one point they even had access to one of the rope memory "memories" (I guess that's the term?) and reverse engineered it to determine what code was woven into it.
@@emjizone I’d consider them the authority, as they worked with people who built the AGCs.
Absolutely true! I learned so much from their hands-on restoration of an AGC and a download of its core. They then redid the A11 landing with the AGC interpreting inputs and computing outputs.
I went to the Smithsonian a couple of decades ago now and was absolutely stunned by the technology and computers used to go to the moon. It looked absolutely archaic in design. I literally saw soldering points on the green boards for the wires. I was told at the time they had triple redundancies because of the anticipated shaking that might loosen the wires. My respect for those astronauts only grew tenfold when I saw that capsule and those components. Now, I have heard that we have to relearn how to go to the moon, because most of those brilliant engineers are now dead and things were kept on paper. Wow. You would have thought that we would have kept and/or recorded these kinds of things, even if only for posterity's sake. Great video, thanks for taking the time to explain it.
We have tons of records from every aspect of Apollo. Drawings, technical reports, etc. What’s gone is the ephemera: not design details, but background of “why did we design it this way”.
Relearning how to go to the moon is more a matter of experience: the engineers working on Artemis have tons of experience with Earth orbit, less so with the environment of the moon and its unique challenges.
@@Hobbes746 I think it comes down to two things.
Firstly, I heard that the Apollo engines required individual hand tuning, which the engineers knew how to do at the time, but they're all dead and no one knows exactly what they were doing.
Second, there's a cascading chain of technical debt. A lot of minor components are surely out of production, or the contractor companies are gone. So that minor component needs replacing with something else. But that then affects multiple other components, which also need to be changed to be compatible, which in turn affects other things and so on, until it'd be simpler to start with a clean slate.
To be honest, it's now more likely that Starship will be ready for a lunar mission long before any attempt to rebuild Saturn V "as is" could ever be ready. Plus Starship will be reusable.
@@calebfuller4713 Yes, those all play a role too. Hand tuning the engines can be learned again, but that takes a few years of hands-on experience, building an engine and then testing it.
You also have to consider the change in manufacturing methods: you’d have to redo all the drawings in CAD so you can use modern machines (CNC etc.), you have to take into account that some of the metal alloys used back then have been replaced with new ones with different properties. The list gets long.
@@Hobbes746 Absolutely true! It's part of what I meant by a cascading chain of obsolescence. And then you need to test how the new methods work versus the 1960s methods. Will it blow up or work better than ever?
It's not even just the rockets. It's stuff like the control computers. Are you going to dig up a vintage 1960s computer? Or does it all need to be adapted to a modern computer? Which is a whole new project in itself... one that is again maybe harder than just rewriting new code in a modern language from scratch.
In an age when there are bad actors who seek to revise history or erase it (not for greater accuracy, but for barbaric political reasons), hard copies are good to have, to preserve the sanity of the planet. When the risks of technocratic neo-feudalism or nuclear war are becoming increasingly evident, what we call "records" could become future artifacts that could help people sort out what went wrong.
Margaret Hamilton was the lead software engineer of the AGC. She coined the term Software Engineering and pioneered fault tolerance in software systems. Her ideas behind fault tolerance played a crucial role in the graceful handling of the AGC overload right before landing.
Thank you for making this video. For the record, AGC memory didn't use bytes but rather 15-bit words (16 with parity): roughly 2K words of erasable memory and 36K words of fixed rope memory. Very restrictive, and requiring extreme coding efficiency that modern developers don't generally have to worry about.
Curious Marc is bringing these machines back to life
That's how I found him, from one of the videos on the AGC restoration, and since then I've watched every video he publishes.
@@b43xoit Corrected, thanks!
Amazing!
I remember an interview/debriefing with all three astronauts after their return, and Buzz admitted to not releasing the radar task, which he should have done to follow procedure. The result was the two errors. Buzz said that at the time he wasn't going to let go of his ride home, or something to that effect. Who can blame him! What they did is beyond brave!
I think you can find this interview on the 3 DVD set NASA released (years ago now).
I can also remember watching this in real time back in '69 - it was sooo tense. The fuel almost gone, the computer errors, a last second decision to manually override the landing to get a better spot. Whew it was spectacular.
Thanks for doing the video - Good job!
- Bill
I'm all for expressing a sense of humour in code comments.
I have seen a comment in code which was loading and starting some firmware: "demons be gone!"
I am not.
@@msromike123 so... don't...?
As long as you make comments. Especially in assembler. Joe Armstrong (the inventor of Erlang) made a joke about the commenting behaviour of one of his colleagues:
It was one big file of assembly code and in the middle of the file there was just one sentence: " ;And now for the tricky bit!" 😆
As long as what needs to be known can be known, putting in a little humor or a note to future programmers is like putting a note in a bottle and casting it into the sea. Who knows who will eventually read that note, and how this little bit of humanity will affect their appreciation of the code. I think the comments also help document a point in history with slang or literary or movie references. If done tastefully and with a sense of balance, these comments give us insight not only in the code, but of the programmer.
I have always injected some humour into the code and continue to do so - mostly for my own amusement - sometimes as a monument/reminder of some insight that would otherwise get lost in time. I enjoy seeing it in other people's code too, it does have some utility beyond putting a smile on a tired developer's face though...
What I have found is that third parties looking at the code latch onto those bits and pieces, they give an otherwise featureless jumble of code some landmarks - which folks can use to navigate the code (and share in the insights of the original developer).
Fascinating, thank you! Yes, I'd held the incorrect belief that the AGC was roughly equivalent to a calculator. I'm a technical person, but not a "computer" person, and I found this video very helpful.
So cool you gave the Wizard of Oz not one but two mentions in your video: one is the "Off to See the Wizard" bit of the code, and the other is that Margaret Hamilton is not only the name of the MIT software engineer but also of the actress who played the Wicked Witch of the West.
One of the most interesting YouTube episodes I've ever viewed. I started using assembly language in 1970, and have a very high degree of respect for early coders. Thanks for your research and presentation! I enjoyed it immensely!!
As did I. A little bit of programming in hex-based machine code and then onto Assembly. We thought it was amazing when we got 8K of RAM. How things have changed.
Actor Jack Black's mum worked on the Abort-Guidance System (AGS) in the Apollo Lunar Module. In what has to be one of the best quotes ever... "In a memorial tribute, her son Neil notes that she was troubleshooting problems with schematics on the day she went into labor, called her boss to let him know she had fixed the problem and then delivered Jack".
The calculations and measurements were also done in metric and displayed in imperial units, as that was simpler.
I wish we just used metric in the US
@@Alkatross You do! All imperial measures in the USA are based on metric standards. The US inch has always been 25.4 mm. However in the UK it's varied over time - at one point 1 inch was even 27 mm.
also US military - 1 click = 1km.
@@mrrolandlawrence I realize it's all the same stuff, but it's just the general knowledge of the measurement that I am missing as a native freedom unitarian.
so she fixed one problem but created a new one
@@Alkatross Yeah, that's one thing I'll never understand... no one uses imperial tons in the USA. It's always lbs, even when it's in the millions.
One curse that is worldwide though: TVs, measured in inches everywhere, and car tyres for some reason.
Landing was on the 20th. Thanks for highlighting the software/hardware of the effort!
Once in a while I stumble across something like this that really makes up for all the other mediocre junk taking up space on YouTube, and it makes me say to myself, "I'm really glad I watched that!"
Dee, good job, this is pure gold👌
The content was a very fascinating little slice about this awesome historic endeavor! It invokes more appreciation for all those super intelligent people it took to make the program happen.
Great video, thanks! Don Eyles, one of the programmers responsible for those jokes, wrote an interesting book called “Sunburst and Luminary: An Apollo Memoir” about his time working on Apollo.
It's a really good read! Don Eyles was the main author of the lunar landing software and helped pioneer some coding techniques still in use, like memory fetch interleaving (especially by game developers to get the most out of consoles) as well as interrupt-driven coding used in OSes. The latter turned out to be key in keeping the information flooding mentioned in the video (indicated by the 1202 alert) from stopping the mission.
Thank you for explaining this. I was a teenager in 1969 and remember well the issue of the lunar module computer being overloaded but never knew why.
I remember machine code and hand compiling 6502 programs into hex! I also remember watching the landing live, aged 12. BTW it was July 20th 1969 not the 29th. As a computer operator back in 1983ish punch cards were still in regular use where I worked. Having recently retired after 40+ years in IT, based on talking to recent graduates I get the impression that basically Assembler is no longer taught in college. This is a mistake; short of some kind of electrical engineering degree there is, in my opinion, no better way to understand what the hardware's doing.
Fairly good description of my start. Watched moon landing, programmed on punch tape between 1972 and 1979, Fortran and punched cards 1980, Computer Science degree included hand compiled 6502, bit slice CPUs, and graphics processors along with Pascal, ADA, APL, Cobol. Finally went to C and Smalltalk.
@@ProfPtarmigan Pretty close. I started as an operator in 79, learnt Cobol & ICL Plan Assembler on the midnight shift from reference manuals. Switched to programming in 83, spent 6 years with a software house (Cobol, RPG2 & Wang Assembler), moved into the airline industry in 89, spent 9 years doing TPF & MVS assembler (both in the UK & US), then switched to Java in 98. Moved into DEVOPS in 2018 & retired in 2023. Somewhere in there I found time to snag a 1st class hons in, effectively, computer science from the Open University.
I started learning programming when I was 19, in 83. 8086/8088 assembly code was the common language. I still use 8086 assembly, as BASIC is often too limited.
I got to watch a Saturn 5 launch in 71. That is when I got the programming bug.
Thanks for this information.
My first programming was less than a decade after the moon landing, and we had a landline from college (UK, so age 16 to 18) to a local university minicomputer. The link terminal had a huge telex keyboard and a paper roll instead of a cathode-ray tube screen, and then you had to wait for most of the lesson (or frequently until the next one) to get the inevitable error message about a typo in line ten, or maybe the results of the program if you had been extra careful. A DSKY like the Apollo had was massively ahead of its time. NASA has large public archives concerning the tasks and planning for the tens or hundreds of thousands of people involved in the project, and the solutions to the new problems that had never previously needed solving.
Microcomputers showed up in 1972, but it takes a while for education to catch up.
Catching up was indeed a problem, with no budgets for that sort of thing as computer use was perceived to be just something for accountants . . . The more enthusiastic students had Commodore or TRS-80 machines at home, at about the same time as the telex-and-paper terminals in college!
Yep, me too on the landline setup.
We had a modem connected to the phone and then to the line printer keyboard for Fortran assignments at college!
Assembly language also is not just one language. It's a family of languages that are specific to the processor the engineer is writing for. For example the assembly written for an Intel x86 chip would differ from say an AMD x64 chipset. Back in the day each computer had its own version of assembly, so the code written for this flight computer can really only run on this flight computer.
I learned 6502 assembly language progamming on the Apple ][ computer, a decade after the 1969 Apollo 11 moon launch. The page layout and formatting of the Apollo Guidance Computer assembly code is strikingly familiar, and similar to the assembly language syntax and commenting standards I used and encountered in programming Apple ][, Commodore 64, and IBM PC computers in the 1980's and beyond. It's inspirational to see how low-level programming culture and techniques evolved from the earliest examples of embedded systems programming on machines that predated modern microprocessors.
"Comanche" is pronounced as /ko-man'-chee/. "ko" rhymes with "go" but can be softened to "kuh". The second syllable is emphasized; "man" is exactly like "man". And "chee" rhymes with "me". The Comanche tribe of Native Americans here in the USA are headquartered in the state of Oklahoma.
LOL. How do you say aluminum?
@@msromike123, the typical American pronunciation is /uh-loo'-muh-num/. Emphasis on the second syllable. The last two syllables can be /min-um/ instead.
You can blame our first prolific dictionary editor for all that. Old Noah Webster.
@@msromike123 Just like it is spelled. At least you spelled it correctly. 😁
@@msromike123 Why, the same way I say titanuminium, platinuminium, uranuminium, and plutonuminium, of course. 🤪
Excellent video. It's heartening to see young software developers pay homage to the giants whose shoulders all developers stand on. I was 7 years old when Apollo 11 landed on the moon. Later in high school, I read about the AGC and I was inspired to learn how to program. First course was FORTRAN with punch cards. I worked as a software engineer for 38 years and never forgot the lessons learned from those pioneers. PS I love your South African accent. Keep up the good work!
I remember playing with discarded punch cards as a child back in the early 1980s, I guess. My parents were students at the time & one of their student jobs was loading punch cards into the university's computers (and taking them out again later). They brought home a bunch, mostly to be used as note pad replacements. They also caused my child brain to imagine all kinds of funky things for all those strange hole patterns.
Thank you very much for this fascinating & entertaining dive into our past; definitely one of humanity's most uplifting & most impressive achievements ever.
Great summary of the AGC. As others have posted, CuriousMarc and team have brought an AGC back to life and rebuilt the code for multiple missions. In their current videos they are debugging the ground radio commands used to remotely program the AGC. Given the state of the art back then it was a fantastic accomplishment.
Less well known is the LVDC (Launch Vehicle Digital Computer), designed by IBM, that guided the Saturn V and its astronauts to orbit.
Has LVDC been published like the Apollo 11 source? I'd love to see that code. Maybe Curious Marc could restore that hardware for an encore to the AGC project
@@shackamaxon512 Much of the extant LVDC hardware is now a tangled mass of wires because, to keep it light, the LVDC body was built of beryllium. These have degraded to uselessness over time. A shame too; the published specs make it clear the LVDC was also a ground-breaking machine.
Ah, the good old days when programming was creative and fun, when code wasn't shipped until it was just about perfect. Also, the verb/noun paradigm is so well-suited to human command logic. Great video. Enlightening. Entertaining. Well done.
It famously wasn't perfect.
@@phill6859 it never will be
I mean, the noun-verb paradigm was the true genius of the system. It was such an elegant solution to the size and weight limitations necessary in the AGC human interface component. Also, an amazing design in light of the need for the astronauts to be able to accurately interpret the output from, and make input to the AGC in times of extreme stress. Mind boggling!
You can be sure that most of the work on the Apollo software was not fun but very tedious. Those jokes tell a story of exasperation.
On the flip side, coding can be very much fun today, and modern software can be far more reliable than what even those most brilliant programmers of the 60s were able to do with their limited tooling. If your experience tells you otherwise then perhaps you should look a bit more around for better projects, other programming languages, etc..
Ah, the good old days when requirements were so simple and constrained. Even the smallest app today does so many more things than just "Fly rocket to moon". And then a separate program "Land it".
A serialised restoration of an actual Apollo AGC can be viewed here on YouTube. Definitely worth a look.
Currently, CuriousMarc and his team are working on the Apollo radio communications equipment (S-band).
Margaret Hamilton's daughter often went to her mother's work and was allowed to play in a mock-up capsule that contained an AGC. At one point an alarm went off, and Margaret asked what her daughter had done; it turned out that she had started program 00, which basically meant that the Saturn was ready for launch. Margaret warned her superiors but they brushed it off, saying that astronauts would never be stupid enough to do that. Until James Lovell made exactly that mistake during the Apollo 8 mission when it was already close to the moon. Mission Control was not amused. With much pain and effort they managed to restore the AGC remotely!!!
00 is the AGC idle ("go to pooh") command. 01-03 are the commands for starting up a Saturn. Also, Lovell got 40 and 41 mixed up during Apollo 13, which they were able to unscramble quickly. You're thinking of another incident (maybe on Apollo 8?) that most of us haven't been exposed to?
@@fredflintstone9609 Yes, as I have already said, it was on the Apollo 8 mission. Lovell himself confirmed it. If I can find the link, I will add it here later.
ua-cam.com/video/Wa5x0T-pee0/v-deo.htmlsi=yKVtUI_-NUDnxnql?t=2559
The 1201 and 1202 alarms were causing AGC reboots during the Apollo 11 landing. This extended the range of the deorbit and landing engine burn. The low-level assembly language allowed the AGC to keep up with required tasks, even with frequent reboots; reboots were near instantaneous. I always wanted to find out how long it took Mission Control to isolate the problem to the enabled rendezvous radar. Note: the Apollo 14 landing used a lot of fuel also; they had only a minute of fuel remaining at touchdown. Apollo 14 had an issue with a landing abort switch with an intermittent short that had to be isolated with an in-orbit rewrite of the Luminary code. Loved watching your video! 👏👏👏
MIT handled the 1201 and 1202 problem isolation and provided the fix just before 11 lifted off the lunar surface; see also George Silver. On 14, Don Eyles wrote some code that lived in R/W memory; rewriting Luminary was not possible as it was in fixed (think ROM) memory. See Sunburst and Luminary by Don Eyles.
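This is not the actual AGC Executive, but a toy C sketch of the idea described above: on overload, restart and reschedule only the high-priority jobs while shedding the rest. The job names and priorities here are invented purely for illustration.

#include <stdio.h>

/* Toy sketch (not the real AGC Executive): a fixed job table where,
   on overflow, everything is dropped and only the essential jobs are
   rescheduled, in the spirit of the 1201/1202 restart behaviour. */
typedef struct { const char *name; int priority; int essential; } Job;

static Job jobs[] = {
    { "guidance",        10, 1 },
    { "throttle",         9, 1 },
    { "display_update",   3, 0 },
    { "radar_data",       1, 0 },   /* the low-priority flood */
};

static void restart_and_shed(void) {
    puts("overflow: software restart");
    for (size_t i = 0; i < sizeof jobs / sizeof jobs[0]; i++) {
        if (jobs[i].essential)
            printf("  reschedule %s (prio %d)\n", jobs[i].name, jobs[i].priority);
        else
            printf("  shed %s\n", jobs[i].name);
    }
}

int main(void) {
    restart_and_shed();
    return 0;
}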
Fascinating. The code itself is one of the best records of the mission. I also want to point out the amazing inertial motors/sensors connected to the computer. This combination of brilliant engineering enabled the craft to remain perfectly oriented through all missions. Our phones have a similar device installed, and the Oculus Rift has a pair. A DJI drone hovers perfectly still in all wind conditions (below 25 mph) because of its precise, near weightless inertial sensors. For a computer, in 1969, to observe that data and respond to it accurately is mind blowing.
There is a book about the Apollo guidance computer called "The Apollo Guidance Computer: Architecture and Operation" which is pretty good reading. It had preemptive and cooperative multiprogramming, and an interpreter to implement more complex instructions.
Correct me if I'm wrong, but my understanding of the AGC is that it was NOT just a calculator, as some are led to believe. To believe that would be absurd. What was meant by that statement is that it had computation power in line with that of a calculator but was far more capable because of its preemptive & cooperative multi-tasking abilities.
@@TheWallReports I guess the problem is also that different people think of different things when thinking of a calculator. When I think of a calculator I think of a non-programmable device which has keys for a fixed set of mathematical operations and a display capable of showing a single number; however there were other electronics also going under the name of calculator that could be programmed and had graphical displays, and which IMHO would qualify as pocket computers.
@@TheWallReports It looks like a calculator because it uses a VERB/NOUN interface, but it could do a lot of things. The verb was a two-digit code (00-99) that defined the action. The lower codes were for early stages of the mission like prelaunch, while later codes were used for later parts of the mission like navigation, descent, and landing back on Earth. This interface was something the developers came up with while expecting something better in the future, but nobody could think of anything better.
The CPU had 16-bit registers. With the interpreter they implemented opcodes on double-width registers, doing the trig functions and vector/matrix operations needed for flight control. The CPU had access via I/O to many sensors and engine controls.
The AGC was quite capable, but the ground facilities had much more powerful computers to augment it, because there were weight constraints on how big an onboard AGC could be.
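As a toy illustration of the verb/noun idea in C: a small dispatch table keyed by a two-digit verb and noun. The entries are loosely based on commonly cited examples and are illustrative only; don't take them as the exact AGC assignments.

#include <stdio.h>

/* Toy verb/noun dispatch, loosely inspired by the DSKY interface
   described above. Codes and meanings are for illustration only. */
typedef struct { int verb; int noun; const char *meaning; } Entry;

static const Entry table[] = {
    { 16, 36, "monitor mission clock" },
    { 37,  0, "change major mode / program" },
    { 99,  0, "confirm engine enable" },
};

static void key_in(int verb, int noun) {
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
        if (table[i].verb == verb && table[i].noun == noun) {
            printf("V%02d N%02d: %s\n", verb, noun, table[i].meaning);
            return;
        }
    }
    puts("OPR ERR");   /* unknown combination: operator error light */
}

int main(void) {
    key_in(16, 36);
    key_in(12, 34);    /* not in the table -> OPR ERR */
    return 0;
}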
She's not into this; the video is for clicks and talk.
As a space nerd who spends a horrific amount of time following the very small details and progress SpaceX is making with Falcon and now Starship, and with a physics and engineering background, I think you did a very good job of explaining the mission correctly. So many get parts of this wrong. And you made the inner workings of this very interesting. This is a great video. Have you ever seen the interview Smarter Every Day did with one of the engineers who made the Saturn and integrated all the systems the code you reviewed ran on and controlled? It's awesome.
We were using punch cards up through 1991 on our ship. We had to learn Hollerith code and even had 5-channel paper tape and magnetic tape. We had guys coming down from various departments, sometimes with several BOXES full of the cards. Every once in a while, they would trip on the ladder (stairwell) coming down to our shop to give us the job! Those cards were stacked sequentially and had NO writing on them. They had to go back to their shop and re-run the entire job again to get the output. :)
So they missed the trick of running a pen down the side of a stack of cards at a slight angle so that even if the cards get shuffled it's quick and easy to get them back into order.
@@Heater-v1.0.0 Our cards were punched and also had the text printed at the top of the card. We were taught to tab over to the comment part of the punch card starting at I believe column 72 and type a sequential number. No one ever did it, since it was so slow to move over there. Dropping your punched card deck was everyone's nightmare.
@@dermick Yeah, we didn't bother numbering our cards when punching in Algol code in 1976. Seems a lot of people didn't cotton on to the trick of running a marker pen down the side of the deck. Anyway, luckily by 1977 we had terminals to a timesharing OS , no more card punching. Then very soon came micro-processors and personal computers. Happy days...
Fun fact: When the space shuttles were built, technicians in Palmdale, California used IBM punch cards as shims to establish gaps between the tiles during bonding operations. As late as 2008 we still found pieces of punch cards between tiles that flew many missions. The punch cards were very consistent in terms of thickness and techs could establish gaps with them to meet strict gap requirements very easily.
I was in the US Air Force, and we used punch cards well into the 80s. It was in the 90s, if I recall correctly, that we switched over to a system of remote terminals. Even then, it was the old system at heart, but you could enter data directly rather than via punch cards. This allowed for real-time updates to maintenance data rather than overnight batch processing.
As a Cold War veteran myself, I was in the Communications Squadron and I remember it very well.👍🏾
Yep... I still remember the thrill of the first IBM 3270 terminal in the shop with CICS to enter our code on. The senior programmers hogged it up most of the time...
When I was in the Marine Corps, I was almost a mainframe operator. Still even lists that MOS on my dd214 as a secondary MOS. Luckily, due to some priority changes, I went to school to be a Small Computer Systems Specialist (an IT guy basically), but then ended up actually doing UNIX sysadmin work. I was around a lot of mainframe folks, though, both Marines and civilians.
My wife took a computer programming course at Brown University in about 1973. It involved figuring out what you wanted to do, punching holes in punch cards, then hand simulating each card to remove as many errors as you could, and then taking a box of cards to the computer center, to be fed into the card reader. Generally, the first try it would get a few dozen cards into the run and stop due to an error. It generally took a few hours before they could do your run, so that meant go back and figure out why you got the error that you did, and try to fix it. Then you would take the cards to the computer center and start the cycle again, hoping to get further into the run before it stopped again.
Checking online, it cuts deeper - that "burn baby burn" line was first used during the race riots in 1965 by a Radio DJ named Magnificent Montague in America...
Super video, one of my absolute favourite software topics is the AGC. Many thanks for taking your time here, yet getting it all done in under 20 minutes. Well impressed!
I am interested in what kind of instruction set they had available to them, in terms of a von Neumann machine, to be writing assembly on a portable computer in the late 60s. And also that the tech had to be shielded from extreme physical forces - ploughing through Earth's atmosphere. And also, as others have said here in the comments, the actual point-to-point wire-weaving of the memory modules for the AGC, by real seamstresses with actual needles and a ball of wire, is another of the most phenomenal facts about this software-hardware machine!
Thanks for the upload, subbed 👍
Great posting. Felt the need to create a new playlist calling it 'Close to heart'. So packed with good info.
I'm 67. At school they stopped the whole school for the lunar landing, wheeled out a television, and we watched the landing LIVE. Later on in my career I spent 40+ years in IT, a specialist in aerospace/defence, several languages, no machine code, megabytes of code. Yes, I put in jokes. They can be useful: if an error says "Tony your shorts are on fire", in correcting it you go straight to the place in the code; "Error 404" could be anywhere. NOW THE IMPORTANT PART. For Apollo, they taught me that they programmed in triplicate. For a key calculation, three programmers would code the algorithm. The onboard computer would run all three in parallel. If all three did not agree on the result, it would display the majority result; if all three were different, it was "Shorts on Fire". From my experience they could not run the whole of the system in triplicate (we can now); this would be subsections of code.
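A minimal C sketch of that 2-out-of-3 voting idea, under the assumption that results agreeing within a tolerance count as a match; it illustrates the voting concept only, not the actual Apollo implementation.

#include <stdio.h>

/* 2-out-of-3 voting: three independently written routines compute the
   same value; if at least two agree, that answer is used, otherwise
   it's an error ("shorts on fire"). */
typedef struct { int ok; double value; } Vote;

static Vote vote3(double a, double b, double c, double tolerance) {
    Vote v = { 0, 0.0 };
    if (a - b <= tolerance && b - a <= tolerance)      { v.ok = 1; v.value = (a + b) / 2; }
    else if (a - c <= tolerance && c - a <= tolerance) { v.ok = 1; v.value = (a + c) / 2; }
    else if (b - c <= tolerance && c - b <= tolerance) { v.ok = 1; v.value = (b + c) / 2; }
    return v;   /* ok == 0 means no two results agree */
}

int main(void) {
    Vote v = vote3(101.2, 101.3, 250.0, 0.5);  /* one routine disagrees */
    if (v.ok) printf("accepted value: %.2f\n", v.value);
    else      puts("all three disagree: error");
    return 0;
}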
I'd just like to thank you for a very clear and interesting explanation. I remember watching the moon landing as a 11 year old kid, then in my last year of middle school learning Fortran programming using the write the code -> convert to octal -> punch cards with an unbent paperclip -> bind your program with a rubber band -> wait for the printout to return from the university the next week method. In the early 80s I was programming simple games in Z80 assembly language for the simple home computers of the day. Persistent memory was cassette tapes!
Note: Assembly code is platform specific. x86 assembly is very different from ARM64 or 6502 assembly. They have completely different syntax, opcodes, registers, etc. Each CPU type's assembly code is essentially its own language, and there is generally no direct parallel between different platforms.
Oldish assembly coder here. Whilst assembly can be different on each platform, one big difference is big-endian vs little-endian byte order, where multi-byte values are stored in memory in the opposite order (so the bytes read "backward", kinda). And yes, opcodes, registers, memory etc. are all different, but those can even differ between generations of the same architecture.
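A quick C illustration of what endianness actually changes: the byte order of a multi-byte value in memory, not the instructions themselves.

#include <stdint.h>
#include <stdio.h>

/* The same 32-bit value laid out in memory. On a little-endian CPU the
   least significant byte comes first; on a big-endian CPU the most
   significant byte comes first. */
int main(void) {
    uint32_t value = 0x11223344u;
    unsigned char *bytes = (unsigned char *)&value;

    printf("value = 0x%08X, bytes in memory:", value);
    for (int i = 0; i < 4; i++)
        printf(" %02X", bytes[i]);
    printf("\n");   /* little-endian: 44 33 22 11, big-endian: 11 22 33 44 */
    return 0;
}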
I agree they are different, but... I know Microchip assembly very well, and if I have a quick look at a reference manual, I can easily read and understand any other flavour of assembly.
@@TheHeff76 Endianness is rather insignificant. Big-endian is easier for a programmer to read but slower for the computer (little-endian lets you start doing maths as soon as it's read the least significant byte).
Load/store vs register/memory architecture is a more important distinction.
The way the 6502 updates flags after every instruction, while the 8080/Z80 only does it on a few, is a larger difference. Even the indexing modes on the 6502 are vastly superior. Trying to switch between the 6502 and Z80 (which are both little-endian) can be annoying; the 6502 is the best CPU ever.
@@phill6859 I've never done anything with the 6502, but I heard it had very few registers. Also, every instruction updating the flags sounds more like a nightmare to me; what if you want to test more than one flag? Also it means that you'll more often need to store the flag's value somewhere else for later reference.
@@__christopher__ Depending on which instruction is running, only some of the flag bits are changed, not all of them. Normally only the logic instructions like AND, OR, NOT, rotate, or math like add, subtract, compare affected any of the flags; then you use conditional jumps to branch on status. Even though it might refresh the flags on every instruction, it doesn't usually change any of them.
That's Epic; thanks for clearing up the situation between the AGC and the Lunar Module Pilot:
Buzz "Computer, tell me the altitude", Computer "Hold that thought Buzz, I'm a little Busy at the moment! (1201 & 1202 alarms)"
No, a different computer... The AGS was the backup for an aborted landing attempt, to bring them back up to lunar orbit if the PGNCS (called "pings") failed.
Apollo's AGC was the first computer to use ICs (integrated circuits, or chips). There was a lot of discussion about this, but the team decided that they were the only way the project could be realized.
Whilst they used some super basic logic-gate ICs in the AGC, the real leap goes to the CADC in the Grumman F-14 Tomcat, which was designed in ~1968-70 and used the first set of ICs that, in combination, would be considered a microprocessor; the computer was the only way to make the swing-wing design flyable.
This was so interesting and I'm not a developer. You can take almost any aspect of these first steps in space exploration and have enough material for like 50 feature films. This video does a great job illustrating how ground-breaking this tech was and it's nice to see articulated kudos to all of the brilliant minds that were the OGs of the quest for space. But calling assembly "challenging" is like calling the Burj "tall" 😆Great vid!
I started programming in MACRO-11 on Digital PDP-11s. I MISS IT!
I started on 6502, 8088, and then later Vax Macro
I still love assembly language programming, there's something unique knowing you are literally dealing with the instructions themselves
Now I use python which is nice but it doesn't quite feel the same
Nice, this is really gold, lady!😊 I was that kid of 5 who was pulled out of his bed by his parents to watch the landing... I still remember it all.
Based on my experience using assembly language in the 1970s, you needed some sense of humor or else you would lose your mind. Most of the humor in my code and those of my colleagues was more like graveyard humor and rather dark. The comments and subroutine names you quoted are much more amusing than the ones we had.
Great video! Loved it, used to enjoy programming my first computer (C64) in machine code. Very impressed with the source code that made sure we REALLY could land on the moon and get back safely!
People might be amazed to know that there is STILL a LOT of IBM BAL assembly code running, especially in large, critical applications where companies deem it just too expensive to rewrite, test, and certify new code. The banking industry is notorious for this: lots of ancient code on HOGAN systems, still running hierarchical databases like IMS...
Hi... thanks for the nice video about the Apollo software (and hardware). I have written a lot of assembly code as a computer engineer, so I can appreciate what it took to do this work. This required a great deal of planning from someone at the top to structure it, given that assembly code by nature does very little per instruction... requiring LOTS of source code. It looks like they broke the code down into many small parts, which makes sense.
And to think we got to the moon with a computer program written in assembler that supported only 30 to 40 operations, and no floating point math.
It had a funny sort of fixed point system for dealing with non-integers.
I don’t recall how they handled decimals, but it is worth noting that if you understand the range of numbers you’ll be dealing with and range of accuracy you need, there is absolutely no need for floating point.
Floating point is way more flexible since you have both significant digits and the ability to place the point within those digits, but if you don’t need to move the decimal, integers are fine, and they would have a pretty solid idea of the range of numbers possible for any reading or calculation, the accuracy of the inputs, and usefulness of the outputs.
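For readers who haven't met fixed point, here is a minimal C sketch of the general idea (a 24.8 format chosen just for the example; this is not the AGC's actual scaling scheme).

#include <stdint.h>
#include <stdio.h>

/* Minimal fixed-point sketch: store values as integers scaled by 2^8,
   so 1.0 is represented as 256. Multiplication needs a widening
   multiply plus a shift back down. */
typedef int32_t fix8_t;               /* 24.8 fixed point */
#define FIX_ONE 256

static fix8_t fix_from_double(double x) { return (fix8_t)(x * FIX_ONE); }
static double fix_to_double(fix8_t x)   { return (double)x / FIX_ONE; }

static fix8_t fix_mul(fix8_t a, fix8_t b) {
    return (fix8_t)(((int64_t)a * b) >> 8);   /* widen, multiply, rescale */
}

int main(void) {
    fix8_t altitude = fix_from_double(123.5);   /* e.g. metres */
    fix8_t scale    = fix_from_double(0.3048);  /* feet -> metres, say */
    fix8_t product  = fix_mul(altitude, scale);
    printf("%.4f * %.4f ~= %.4f\n",
           fix_to_double(altitude), fix_to_double(scale), fix_to_double(product));
    return 0;
}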
Genuinely amazing code.
This video is brilliant and engaging. I loved watching it even though I’m not a coder..
One tiny thing though-
The command module was shown without the service module attached. Command module is always attached to the service module throughout the mission except for just a few minutes before landing. This exposes the semi spherical heat shield on the bottom of the command module.
One should note that the CM pictured has flown.
The first language I learnt was BASIC. The second was 6502 Assembler. I also programmed on punched cards at school.
First I learned a pseudo assembly code the teacher wrote, then Fortran with both large and then small punch cards, then BASIC on the IBM terminal, then APL on the terminal. We also learned a smattering of Cobol and Pascal. Later on I taught myself a bunch of flavors of BASIC, including Mallard Basic on the Amstrad, AmigaBasic, and a compiled BASIC for the 68000, as well as C and the 68000 family of assembly. Also on the Amiga I learned E, Pascal, and a bunch of others I can't remember anymore. Then came the MS-DOS era with 6800 asm, MS Basic, Java & JavaScript, HTML, Perl, ASP, PHP... now for the past 5 years I've been a dev using C++ with the Qt framework. Now learning C# using VS and the .NET framework.
@@douglascaskey7302 Very nice. I learned C on the Amiga too.
Great video! I love learning about early computing.
They had so little compared to today, but what they had, they made sing.
Assembly is still very much relevant today. In fact, every single time you compile software, you are compiling it into ASM (Assembly) and then the assembler takes over from there.
This is true: if you need efficiency and speed from a CPU and its peripherals, you switch to assembly, write a sub, and call it from whatever higher-level language you're writing in.
Unless your compiler goes direct to machine code. Which most do these days.
@fredflintstone9609 Most compilers still go through ASM, since the assembler stage often applies further optimization.
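As a small illustration of the point in this thread (the filename is made up): with GCC you can stop after the compile step and look at the assembly it emits before the assembler and linker ever run.

/* scale.c -- "gcc -S scale.c" stops after compilation and writes scale.s,
 * the assembly generated for your target CPU; "gcc -c scale.c" then lets
 * the assembler turn that into the object file scale.o. */
int scale(int x) {
    return x * 3 + 1;
}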
I was seven years old and my parents let me stay up to watch it. I still remember how exciting it was. One small step for man....
The source code looks like assembly. 0:21 Nice.
Well it is
I loved the video and appreciate the background information provided. I should point out that Margaret Hamilton was certainly not standing next to a single listing of the source code. Someone computed the entire GitHub repo to contain 1,751 pages, which would not create a stack that size. The most likely explanation is that it is a culmination of test data run against the source, or possibly several printouts stacked for effect.
I can see why there are moon conspiracies. These engineers performed MAGIC. Forget the primitive hardware; the sheer elegance of practical code to control a system of that complexity seems like sci-fi mumbo jumbo.
On the contrary. There are conspiracies because we have not gone there since then with infinitely more powerful, safe and robust equipment
@@airaction6423 look up the Artemis program
@@airaction6423 Science hasn’t been prioritized since then. Even back then, it took the Cold War to make governments and their citizens see science as a high priority.
@@airaction6423 The reasoning is on the wall.
The US went there for geopolitical reasons: to win the Cold War.
It went there after Apollo 11 too, multiple times.
Then the space race was won... and the Cold War itself was won.
The US government virtually defunded NASA; it had served its purpose.
That's why. And it's sad.
@@miguelpadeiro762 I understand your arguments but it still makes little or no sense
Seeing how it is still difficult today to put a craft on the moon (there have been recent failures), let alone a human, it makes NASA’s success in putting 12 people on the moon with 1960s technology even more awe inspiring. When done right, a government agency can do fantastic things. Private enterprise can too, but only if there is a profit motive. Public good vs private profits.
7:45 As someone who has written assembly language programs for 64K machines, I can say that the stack of printouts that Margaret is standing near in this photograph must be for the *entirety* of software for the Apollo program, not just the lunar and/or command module AGCs. A printout of source code for an assembly program (even one heavily-commented) that fits in 73K (3,840 RAM + 69,000 ROM) might require the number of pages of one of the smaller binders.
In those main files in the GitHub repo you can see that Luminary099 had 1743 pages and Comanche055 has 1751 pages, which is like a box of copy paper. That stack looks more like 8k pages or so. I think you're on to something.
It is all of the code for Apollo.
She has said in interviews that when the reporter and photographer showed up, they came up with the idea of making the "stack of code." So, they went around the room and literally just grabbed whatever binders were there. So, that's probably not all source code. I suspect a lot of the thicker binders are just data dumps and other diagnostic listings. After all, the computer only had about 36k of assembled code. Even with copious comments, the source code for that is not going to be a huge pile of paper.
Yes. Each book is the source code for one AGC. But “Luminary 99” means the 99th version of Luminary, so there were far more versions than is visible in that stack. For that photo, they just grabbed a bunch of printouts that were lying around.
There's no way that stack is just the code for the Command Module and the Lunar Lander. I just downloaded the repo from GitHub, and the whole repo is 177 files @ 3.34 MB. Given printing on 8 1/2 x 11 paper that's roughly 342 pages... less than a 500-sheet ream of paper. I'm assuming those printouts were the typical green-and-white wide tractor feed... so it would take even fewer than 342 pages. The project I'm a dev on is almost 6,000 files at 2.18 GB... which is about 46K pages or 92 reams of paper, and @ 2"/ream that's 182 inches or 15.3 feet.
The photo is pure made up BS.
I felt like I was on this mission just based on your description of the code...Excellent!!! 🚀
They do not make such quality any longer -- same as with many, many products.
I heard all that I needed. She was DIRECTOR of the software engineering team. Thanks!
Awesome presentation! Learned a few new things. Thank you so much
Nice deep dive into the code and the functions it was meant to control. The humour of the coders was a nice touch. Thank you for the video.
Great video! This relates to my favorite computer hack on Apollo 14 to bypass the abort button. It’s too much to type in a comment but might be something for you to cover in a future video
The book “Sunburst and Luminary: An Apollo Memoir” by Don Eyles is also an amazing look at the creation of the AGC and specifically the software from the start.
Good video, BUT big ERROR at 0:52. NOT the 29th... the landing on the Moon was on the 20th of July.
Such a well-known fact.
So easy to check. Google it.
I was watching it at that time.
Please correct or check your script. You are normally precise.
Now how can we trust your facts on the rest if you missed this? Still, I will watch your videos because your presentations are excellent.
I'm reading the book Sunburst and Luminary: An Apollo Memoir by Don Eyles. A really great read on the AGC's lunar module software. Coding With Dee, you got the explanation for the 1201/1202 alarms spot on, according to Don's book.
As an ex assembly programmer (on PDP 11 about a decade after this) I can add that it was good practice to comment every line of code. They were after all cryptic instructions involving registers and memory locations.
Actually the PDP 11 had a great assembler. All registers could be incremented and used as memory pointers. Once you added Macros you were close to C. Actually C was essentially sugared PDP assembly code.
As a fellow PDP-11 programmer I approve this 😁. Just a small addition to the nice comments: I once saw this comment (IMHO also citing Shakespeare): 'Our faults dear Brutus are not in our stars, but in ourselves' - error handling function of the RT-11 operating system. I used RT-11 sometimes, but mainly worked with RSX-11M.
All were 'real programmers', not just quiche eaters 🤣.
The register symmetry of the PDP 11 was really nice. The Motorola chip in the early Macs borrowed a lot from that architecture.
Likewise... this brings back equal memories of nightmares (when it didn't work) and huge euphoria (when it finally did work). We were interfacing A/D's to PDP-11 for data acquisition. Set up "Ping Pong" buffers so the A/D could be writing to Ping while the PDP-11 could read Pong; write to disk... then flip the pointers. Hopefully the "bathtub" never overflowed while filling and draining at the same time. It was those other pesky interrupts from disk writes that could ruin things... now, what was the sustained disk write speed for 8k vs 16k vs 32k block-writes???? As I get close to retirement age (actually past it, but still love being an engineer), I'm thinking about getting into Arduino's for Halloween projects.
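A minimal sketch of that ping-pong scheme in C (names, sizes, and the adc_*/disk_write hooks are all hypothetical stand-ins, not the original PDP-11 code):

#include <stddef.h>

#define BUF_WORDS 8192
static short bufA[BUF_WORDS], bufB[BUF_WORDS];

/* Hypothetical hooks: in the real setup the A/D fills one buffer via
 * interrupts/DMA while the CPU is busy writing the other one to disk. */
extern void adc_start_fill(short *buf, size_t n);  /* kick off async fill   */
extern void adc_wait_done(void);                   /* block until it's full */
extern void disk_write(const short *buf, size_t n);

void acquire_forever(void) {
    short *ping = bufA, *pong = bufB;
    adc_start_fill(ping, BUF_WORDS);
    for (;;) {
        adc_wait_done();                  /* ping is now full           */
        adc_start_fill(pong, BUF_WORDS);  /* A/D starts filling pong... */
        disk_write(ping, BUF_WORDS);      /* ...while we drain ping     */
        short *tmp = ping;                /* flip the pointers          */
        ping = pong;
        pong = tmp;
    }
}

If disk_write ever takes longer than one fill, the "bathtub" overflows - exactly the failure mode described above.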
@@EngRMP The Arduino is good. I'd recommend getting some of the clones, particularly the Nano version. I would also recommend using 2.54mm pin header sockets and connectors so the parts can be removed and replaced easily. I generally gravitate towards the ESP32, ESP8266, and the Raspberry Pi Pico; however, those devices use 3.3-volt logic vs the Arduino's 5-volt logic. It's easier connecting sensors and output devices to the Arduino since most are going to be 5-volt logic and you won't require additional hardware. Get yourself a package of about a hundred 2N2222 transistors or some 2N7000 FETs. It's possible to drive higher-power MOSFETs directly, but if I remember correctly about 70 mA is the most you want to source from the ATmega328 and stay within safety parameters. There's no real advantage to using real Arduino boards over the clones. I hope you'll enjoy building projects with Arduino as much as I have. The only limitations are going to be your imagination and your wallet. Oh... I usually buy in quantities of 10 so if something goes wrong I have plenty of spares available without any waiting. Besides, this stuff is cheap. Take care.
I never actually wrote any useful PDP-11 assembly code, but my 2nd-semester programming course in college (Spring 1972) involved writing a PDP-11 simulator in the first half of the course and then a PDP-11 assembler in the second half, so I became fairly familiar with the PDP-11 instruction set. They made some nice design choices. The auto-increment and auto-decrement addressing modes were useful in lots of situations, and having the program counter be one of the 8 registers meant that pc-relative addressing for branches and subroutine calls was available to the programmer (or compiler writer) without it having to be a special distinct addressing mode.
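The auto-increment mode mentioned here is the usual lore behind C's pointer-stepping idiom; a tiny C illustration (the PDP-11 mapping is historical folklore rather than a guarantee of what any modern compiler emits):

/* Copy n bytes; on a PDP-11, *dst++ = *src++ maps naturally onto the
 * auto-increment addressing mode, roughly one instruction per byte. */
void copy_bytes(char *dst, const char *src, unsigned n) {
    while (n--) {
        *dst++ = *src++;
    }
}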
My dad was a technical writer working on the Apollo project for North American Aviation (later North American Rockwell). The Command Module was built and tested near his office in Downey, California (the Lunar Module, sometimes called the Lunar Excursion Module or "LEM," was built by Grumman). He would have enjoyed this video.
FYI, "Burn Baby Burn" was a term associated with the 1965 Watts Riots. It was often quoted by college students of the time. The MIT programmers would have been familiar with it's source and meaning.
Don Eyles wrote the descent and landing part of the program. He had a great sense of humour and I'm certain that he was responsible for most if not all of the crazy comments in the source code.
Great video, thanks for doing this. I'm a big fan of the AGC. It really was an amazing accomplishment and in many ways it drove the advances in CPU design, computer architectures, and software development that set the stage for the computer revolution that followed. If you are interested in this kind of computer history, there is a book titled Apollo Guidance Computer that goes through its hardware and software in lots of detail, yet remains approachable even if you don't write machine code just for the fun of it. That being said, your video provided some insights that weren't in the book.
By the way, I think most calculator comparisons are based on memory or processing power and are rarely accurate.
I went to college in the late 80s. I was the first CompSci class to NOT use punch cards for computer programming classes. My internship was at a company that still used a punch card machine; for FIVE cards per day! So, this technology lingered on and on!
Same. I learned on PDP-11s running RSTS/E in college in the 80s. We'd use VT52 terminals or even the teletype machines in the lab.
But in my work experience I had to deal with punched cards and paper tape.
@@TesterAnimal1 Yes, well, this is typical for every generation. At university we mostly learn the cutting-edge science and tooling (though we also sometimes got optical EEPROMs for programming a PLC 🙂). When you get to industry, you realize that most of it is at least 10 years behind.
I took a programming course (FORTRAN) at a college in the summer before my senior year of High School. We used punch cards.
By the time I started at the same college as a freshman, in 1978, punch cards were obsolete.
Excellent video. For anyone interested in learning more, I recommend the excellent memoir “Sunburst and Luminary” by one of the software engineers, Don Eyles.
Also, the BBC podcast “13 minutes to the moon” does an amazing job of putting you right into the drama of those 13 minutes from undocking from the command module to landing on the lunar surface.
The note "Off to see the wizard" is followed by the instruction to call BURN BABY BURN, so you see.
Back then very few people knew about computer programming. Very little memory available to store it on. Crazy but true. Amazed that they got there and back.
But but but Stanley Kubrick and the studio! And what about the Van Allen Belt? Huh, what about that pretty lady? And everyone now knows that only computers can do complex math! Got'cha there...
Stanley Kubrick obviously filmed the moon landings; the space agency had no intention of going to the moon. What people don't realize is that he's such a stickler for detail that they ended up going to the moon to film in situ.
What is amazing is how, given the level of computer technology at the time of the Apollo missions, they were able to build all the necessary functionality into a computer that could fit into the Apollo Command Module and the Apollo Lunar Module. In both modules the Apollo Guidance Computer had strict limits on size and weight; there was no luxury of a large, heavy machine. The AGC was likely one of the smallest and lightest computers of its time.
Yes. The AGC was the first computer *in the world* to use integrated circuits. Those were chips containing only two logic gates each, but they were an advance over the use of individual transistors as was common back then.
Boeing should have learned a thing or two from those NASA programmers before outsourcing their software development overseas…
It is unfair to judge their entire history with the recent events
They weren't NASA programmers; they were MIT or IBM programmers.
Really nice summary. I programmed in assembly in the late 70s and key punch cards were used initially followed by an editor. Used cards, paper tape, mag tape and Mylar tape. Our machine used core memory.
You can do a lot with 3 terabytes, but you can also do a lot with 3 bytes.
What is a lot you can do with 3 bytes?
Track 16 million packages for starters.
@@automateTec That's 3 bytes times 18 million, 54 million bytes.
54 MB.
@@callykitten5095 not in barcode form :)
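For the arithmetic behind the "16 million packages" point above (a sketch: 3 bytes is 24 bits, i.e. about 16.7 million distinct IDs - not 3 bytes per package):

#include <stdio.h>

int main(void) {
    unsigned bits = 3 * 8;              /* 3 bytes = 24 bits       */
    unsigned long ids = 1UL << bits;    /* distinct values: 2^24   */
    printf("%lu distinct IDs\n", ids);  /* 16777216, ~16.7 million */
    return 0;
}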
Thanks for this video! As an embedded systems engineer I found this super interesting.
12:59 You cut that quote short. It looks like it was split over a page boundary, if I understand the "Page 393" marker correctly, and the last bit is on the line after that. So, it should end as, "...such abominable words as no Christian ear can endure to hear."
Yeah that kind of bothered me more than it should have. Also not sure why the programmer gave the page number, because that's going to vary in every edition anyway. Instead you go by act, scene and line numbers.
@@prosimian21 It's a digital transcription of the source code that was originally in printed form, so I'm pretty sure the "Page 393" is just marking where page 393 of the code starts and has nothing to do with the Shakespeare quote.
Great video!
I always think of the AGC as an HP-12C on steroids-you input numbers, store them in registers, and call commands to have the "calculator" compute and give you results. The part where it gives the astronaut instructions to correct the trajectory works like that. The astronaut observes the positions of 40+ stars and inputs them into the computer, entering the star number and its position. Then, the AGC calculates the required correction, providing the result in terms of thrust duration and timing. This meant the astronauts had to manually ignite the thruster and control how long it stayed on (which is where the beautiful Omega Speedmaster came into play). The AGC didn’t control the thrusters directly or track positions automatically.
I’d love to see what the AGC code would look like in C or another modern programming language.
Uh, no. The AGC _did_ control the thrusters directly and the astro only had to sight/mark on two stars (of 40 possible the AGC knew about). The astros had access to a manual mode where the AGC controlled the thrusters based on the inputs from the controls inside the modules, but this was only used once on 13 because the LM AGC had no software designed for a docked burn.
@@fredflintstone9609 I see. Is it the "Auto maneuver" in this flight log?
022:51:55 Aldrin: Apollo 11...
022:51:58 McCandless: Go ahead, 11.
022:52:02 Aldrin: [Faint] Roger. Standing by for your updates. Over.
022:52:10 McCandless: Okay, 11. This is Houston. At time approximately 22:30 in the Flight Plan, in your post-sleep checklist, and in all other post-sleep checklists, we'd like you to delete the statement that says Auto RCS Jet Select, 16, to On, and what we're doing here is picking this up in the procedure for exiting PTC that's in your CSM checklist. And in the CSM checklist on page Foxtrot 9-8: if you want to turn to that, we'd like to change the order of the steps in that. Over.
022:53:13 Aldrin: [Faint] Stand by. [Long pause.]
022:53:37 Aldrin: [Faint] Okay. Page F9-8. Go ahead.
022:53:40 McCandless: Okay. Right now it reads, "To exit G&N PTC". Then you've got a pen-and-ink change that says, "Auto RCS Select, 12, Main A and B". And you come down to the printed Step 1. We'd like to take and move the "Auto RCS Select, 12, Main A and B" down to be the second step, so the procedure would read, Step 1, "Manual Attitude 3, Accel Command"; Step 2 "Auto RCS Select, 12, Main A B." Step 3 would be, "Verify DAP Load", and so on. Over.
022:54:16 Aldrin: [Faint] Roger. I copy. Is that "Auto RCS Select, 12, Main A, Main B" to be the... [Fades out.]
Assembly is not “a” language, nor is it confined to history books. Every processor architecture has its own unique assembly language designed for the machine code used by that processor. Assembly languages are still widely used today for situations where you need very compact, fast, and/or efficient code for portions (or all) of a program.
Also, you cut off the Henry VI quote; it had the remainder, but whoever put it in that comment left a page-number line in the middle of it for some reason. (And, pointless nitpick time 😂: comments in that language are preceded by hashtags, not followed by them. 😉)
Your explanation of the Error codes was brilliant. Thanks. The reset solution had me shutting down my iPhone and LOL. No traffic up there..
I'll program in assembly or directly in machine code (or even COBOL) any day over "programming" using Spring, TypeScript and Kotlin. Those are the worst "programming" languages / frameworks ever made.
It's a great computer, incredible for the time. Good fun to program; I made a couple of games for it a few years ago, and in terms of raw speed it's on par with 8-bit computers from the 80s. I had to put in artificial delays so the games wouldn't run too fast.
Great video
Programming such computers was a huge challenge, the software had to be very short and efficient, squeezing every line of code to the maximum.
It's crazy that we don't even have the ability to replicate that and go back right now. Some astronauts have said our current technology can't do it, but we're so close to making it happen again. Think back to when so many millions of people had to listen on the radio; with the incredible advancements in video, when we go back to the Moon it's going to be insane! I can't wait.
We absolutely can replicate it, at a fraction of a cost they paid back then. It is just still far too expensive.
It would be very unwise to reproduce this 60 year old technology. The goal was to put a man on the moon within 10 years so it didn't have to be that efficient and money wasn't an issue then. They had been there 6 times and there was no need to go back. SpaceX shows how it should be done now. Artemis is somewhere in the middle by using upgraded Space Shuttle hardware.
@@vitalyl1327 I'll listen to the expert scientists and astronauts who say we don't have the technology to go back right now.
@@REEREEREE33 you're a crap listener then, because no current space engineer ever said that. What we don't have is the entire extremely expensive supply chain that was built for that project alone, at an enormous cost.
@@vitalyl1327 I doubt cost is the real issue. It's the non-existent ROI. What would anyone currently achieve by leaving orbit? No monetary gain prospects in any way, shape or form. The Americans just wanted to beat the Russians to the Moon, so they were willing to spend just about every cent they could dig out of their pockets to make it happen, and make it happen FAST they did.
Saying that we *can't* do it is a pretty fucking stupid thing to say.
I have an affection for that era, as my second computer experience (after the Dartmouth IBM mainframe, at age 12-13) was with a DEC PDP-8e, a moderate upgrade on the tech developed for this AGC. It was far easier with old Teletype terminals and a high-speed optical paper tape reader than dealing with punch cards, or with the AGC's limits of 2k of RAM and 36k of ROM. MIT's work on the AGC no doubt had crossover to DEC engineering.
When we think of standard programming languages in the CP/M micro-PC era and since, that was NOT the AGC. The PDP-8e could run BASIC, FOCAL, COBOL, FORTRAN, or a two-pass ASM assembler and compiler. NASA's AGC ran AGC4, its own unique hardware-specific code.
MIT's engineers amazingly outdid one major NASA contract spec for AGC. They came out with a 70 lb computer, well under the 100 pound design goal limit. That in itself was amazing at that time.
It's been a LONG time since I wrote ASM code. One reason Steve Gibson's hard drive utilities are so efficient is that this is one of his long-term specialties, while other code bloats just on its GUI interface, which may hog more than a million times as much memory, in video RAM alone, as we ran entire space missions on in 1969.