This channel has come so far. I remember when the content was just altering game config files to bring game graphics below their minimum settings and spec, and these documentaries are just as valuable imo.
@@MyScorpion42 Problem is more and more games not allowing editing of the game files, single player games mostly excluded, but even then they continue to make it more difficult, that and most of the methods are kinda same-same anyway since a lot of games are built on existing engines and all you need to do is find the right files. There's only so many different ways of explaining what the same options do and once people know what they do they can just edit it themselves.
I bought the first 6502 product, the KIM-1, directly from MOS, and built up a system around it. Later, I bought a Commodore PET and did a lot on that. Then I got a job at Atari designing games for their machine. I still have my KIM-1, my PET, and my Atari 800. I knew the 6502 instruction set, in machine code, by heart.
My first 6502 was an Ohio Scientific single board computer kit. The graphics weren't great, but good enough that I could write fun games in BASIC. My next computers were Commodores and they had much better graphics systems. Rhe graphic sprites and sound system on the C64 were great.
I had the VIM-1 which was a improved KIM-1 they said if you like KIM you will love VIM I had to program it with hexadecimal numbers (just like the KIM)
I met Chuck Peddle briefly at an event in the fall of 2019 before he passed away. Even at 82, he was still excited about new projects and eagerly shared his experience with us undergrads. A truly gifted and talented engineer. I really wish I could have spoken with him longer
One exception to accuracy. I don’t think any individual was sued. I only saw one lawsuit and none was served to me personally. All our names were mentioned in the lawsuit. I did have the feeling that if things got tough I could be on my own. The suit was filed in Philadelphia in the fall of 1975. It should be on file. I only read the first page. “Motorola took seven man years to do the layout. MOS Technology took one man year,” therefore we stole it. As a matter of record I returned to Motorola in 1977 and worked there till 2010. There is a story there.
@@engineerbot That was page one in the lawsuit. There were many more pages I never read. MOS technology ran out of money. As I remember it the first lawyer bill was $400k. They could not pay a second one like that so they had to settle. Someone else will have to post the terms of the settlement. The possession of damaging material by one of the former Motorola employees, even tho not used or even seen by others, would have made further litigation extremely expensive. Motorola could pay it. MOS Technology could not. I was not privy to these discussions.
@@harryb1251 That's the basic model of operations for large businesses. Make it impossible for your smaller competitor to actually fight it in court even if you're in the wrong because you have an entire office floor for your lawyers on retainer and the small business just doesn't have the liquidity to fight you.
Ah the memories! Taught myself 6502 assembly language as a teenager in the early 80's and wrote my own games cos I couldn't afford to buy them. Never did manage to stop my Pacman ghosts getting stuck in corners.....
Should have prototyped the logic in something like BASIC and then converted it. I couldn't even afford an assembler so I wrote my own BASIC program that converted z80 Assembler instructions (ZX81) into machine instructions that were POKEd directly into memory. I did intend to go a step further and create a BASIC > assembler > machine instructions compiler but I discovered drink and girls and that was the end of that! :)
For me again it was the 6502, writing demo's on the C64 in the late 80's. Not even using an assembler, just punching hex into memory using a monitor.... lol I managed to learn every hex code for every instruction and could just sit their typing away, hex after hex after hex for hours on end. Was 14 or 15 at the time.
@@JoannaHammond. Me too. Punching hex into a keypad - I couldn't afford the full keyboard for my 6502 computer until much later. Codes such as A9 (LDA) are burned into my brain.
I probably still have the 6502 manuals somewhere. One thing I recall that was not mentioned was that the 6502 was designed to be made on a memory production line rather than requiring a more complex chip production line that other micros required, which allowed them to be made more cheaply and leverage exiting production capacity.
I laughed so hard when the Steve Jobs negotiation part came out. That part of the video really capture his voice perfectly. Jobs biography book described how angry he can be when negotiating with someone he dislike. Jack was certainly the type of person that jobs dislike.
@@LowSpecGamer Tramiel (BTW drop the i, it sounds like Tra-MEL per the man himself) was likewise well known for being fiery AF, that negotiation would have been legendary. He was a notorious micromanager and the friction between him and the chairman of the board Irving Gould was not great even though his performance had made Commodore immensely profitable. It's honestly weird looking back at the 70s and 80s in computing and realising just how much larger than life all these people were in the industry, so many hard nosed characters dictating the way the industry would evolve.
Jack was the kind of person most people would dislike, I bet. There were many jerks in that generation of tech CEOs (counting Jobs), but he took the biscuit.
@@ButterfatFarms - *_I don't think he cared what he was selling in the end. It eventually turned out to be consumers electronics, calculators, watches, computers and a few other odd niches._* When I was in my teens -- the mid-1960s -- my father gave me a small "portable" B&W TV. The brand was "Commodore"! After a year or two, the tuner developed a malfunction. This was quite common with the "electro-mechanical" tuners of the day. There were even _stores_ that sold nothing other than tuner parts (for the TV repair trade) -- I remember visiting one in the Bronx, near Yankee Stadium, to buy parts for my mother's Zenith TV (also B&W, as were most televisions of that, to get parts for the _Commodore_ TV, I had to visit the sole supplier, _Commodore,_ which was located at that time deep in the heart of Brooklyn. So, I got onto the subway and began my _long_ travel. When I arrived, I found a place that was like a big warehouse/garage type operation. Definitely _not_ an ounce of "retail" to be found. I bought my parts and was on my way. A few years later, the "pocket calculator" was born. Nifty, but expensive -- something like $400 in 1965 dollars (or whatever year it was in the mid-60s, I can[t remember). But then an inexpensive brand was born, and my father got one for me. Yup, "Commodore" (I have to wonder if the fact that they were located in Brooklyn, and he worked at the Brooklyn Navy Yard, had anything to do with his purchases). Ah, memories... PS: I had the 2nd or 3rd TRS-80 sold in my region. Went in for warranty service nearly 20 times, five main boards, and still ran like a one-legged blind man on a pogo stick. PPS: The one shown in the photo in the video is the Model 3, which came out a few years after the TRS-80 (retro-named the "Model 1").
Excellent retrospective of the exact moment in time that I became a consumer-electronics enthusiast. Jack Tramiel's appearance was a particularly strong flashback moment. A man who's shrewd business acumen bordered on evil, but who understood consumers and made possible products that revolutionized home technology (all the while being myopic about the future of the industry). The man nearly took down Texas Instruments (most assuredly killed their home consumer division), and then bought Atari to price-undercut Apple, IBM and his own former Commodore. The man had huge Cajonas!! It was indeed more than the birth of an industry, it was the birth of our electronic zeitgeist. Thank you, that was 30 minutes well spent.
Motorola has been around since 1928 originally started as Galvin Manufacturing Corporation. In 1930 they sold their first Motorola branded product a Car Radio named Motorola a combination of the terms Motor and 'ola' from the then at the time popular Victrola radio name. And well the rest is history.
@@cybercat1531 after receiving a $500 investment to start Motorola the founders allowed the investors to drive around the block one more time in the car in which the new Motorola radio had been mounted for demonstration! The car caught fire and burned to the ground but the investors had already coughed up the check!
And they wouldn't have bothered except they found the Z80 cartridge for the 64 wouldn't work on the 128, and that incompatibility was considered a bug that they had to fix
Another great video. Without Chuck Peddle there would have been no Commodore 64, no Atari 2600 and that's right, no Apple II! The 6502 was a great, great invention.
It would be much longer until computers would hit the consumer market. Computers would of continued to be big business machines. Its nuts to think about. It was the perfect invention at the perfect point in history.
@@scottfranco1962 I doubt that. MOS were probably not able to put the 8500 (fast) transistors of the Z80 design on a single chip for a reasonable price. Maybe you are confusing MOS with Mostek. Mostek and Synertek were andvanced firms that built the first Z80 chips before startup Zilog had its first plant ready. They continued as second sources for years. The European second source for Z80 was SGS-Thomson (now ST). After a few years (late 1970s) there were also clones from NEC, Sharp, Toshiba, Rohm, GoldStar/LG, Hitachi, and others.
I personally don’t like this direction,what’s that all that sarcasm funny generic talk and all those drawings,it annoys me,as if he tries to make funny out of those incedible stories to force you to take it with a grain of salt,am sorry but if i want to have some fun i will watch a comedy but if i want to watch a history about the past whether it’s politics,economy or about video games,i don’t wanna any genericness in it because i consider that BS.
I never get tired of hearing "cal-coo-lader". I still have an Apple ][ and an Apple //e that works! 6502 was so loved by all kinds of lowspec machines - I remember my Vic20 and C64 fondly, as well. My computer class had an army of Commodore PETs. Very fun video - I've owned a number of PCs with this processor but I never knew its story!! Thank you.
It is awesome that 6502 accepts its 8bit nature and has a ton of instructions which work with 8bit immediates. No weird register pairs. Instead 16 bit values are addressed as memory, just like strings. Clean. I like that little Endian works with the way immediates are read and the way pointers in the zero page are read. Yeah, Intel had carry look ahead, but 8 but ADC saves a few transistors. I think that MOS invested a lot of transistors into cycle efficiency. Unlike RCA 1802 or Intel, a lot of stuff could be done. We have like 4 busses and a huge PLA. Only later the MOS fab produced better and better chips, and soon their clock frequency wasn’t the limit anymore. Rather in the C64 the complex system bus limited speed.
Programming an Apple II in assembly language was exactly where I started. Really loved all the addressing modes and powerful system calls. It wasn't until years later that I wrote my first lines of code in a high-level language...a thermodynamic model of an air-standard Brayton cycle.
Isn’t there this indexed zero page indirect addressing mode, which no other CPU has? It seems to be used for Demo effects only. It is like if on a 68k you could use the 3 lsb of a data register to select one off the 8 address registers to use in a reg-mem instruction. This is insane. I guess 6502 cannot move addresses between address registers like a 68k. I Wonder how you teach a C compiler to use IDs in Y instead of pointers? And use zeroPage+X for lists. Global optimization? Seems like C is the wrong language for 6502. Similarly, why is it so difficult to put pointers to the stack frame on the call stack? Ah, no addressing mode for this. 6502 cannot even peek SP+signed immediate8 .
I'm convinced the 6502 is one of mankind's greatest inventions, and this is the best content I've ever seen on its history. It's such a great story- a team of brilliant underdogs leaving the billion dollar Motorola and joining up with MOS to create a revolutionary processor, accessible to the masses. Also, I love seeing all the programmers leaving comments about what a joy the chip is to program against- I think its greatest strength is how succinct the instruction set is. RISC is beautiful to me. Absolutely brilliant content, sir.
Thank you this - I never looked too deeply into the origin of these chips. I remember as a teen learning to program super-simple assembly programs on my Commodore 64, which used the 6510, and then marveling about 10 years later when I was a junior technician, poking around the guts of a broken old DIN keyboard and realized just 10 years later the chip running the keyboard was a 6502! Going from being the central CPU to just powering the keyboard of an Intel computer is quite the fall from grace. 🙂
yes and no -- the whole point of these cores was they were cheap enough to be used anywhere. Even while IBM PC's were being powered bu the 8088, keyboards were already being powered by the simple-but-not-THAT-simple 8085. There were actually smaller alternatives to the 8-bit boys but nobody remembers them now. Basically calculator chips with minor modifications. Slow mask-programmed 4-bit behemoths found in a lot of instrumentation until they were replaced by application specific chips.
Bought at first a Spectrum Z80 at the time. It feeled so cheap like their tiny rubber keys. Went back and swapped it for a VIC20. Later a commodre64, got in love with the old 6502 and learned to how to program it in assembler. Great times. Doing stuff where the "computer" wasn't designed for. Stepped later over to the 8086.
My first computer was a 6502-based Superboard II from OSI. It ended up 100% overclocked at 2MHz, and I installed a heatsink and a fan to keep it cool enough to run reliably. I converted a B&W TV into a monitor, and built power supplies, and added an expansion board to make up 64KB of RAM in 128 half-Kilobyte 2114 memory chips - and they were not cheap! I resurrected a dead RX02 8" floppy disk drive from a PDP11 with an RS232, and squeezed 960KB of storage onto each IBM Floppy - they cost $28 each!
As a member of the team from Motorola to MOS Technology my compliments on the presentation. You know more than I do about the “big picture” but your telling of the things I do know about is accurate. Thank you
I am going from username alone but are your Harry Bawcom? I am super glad to hear you enjoyed the video. Hopefully you noticed the piece our artist did at 7:28 with the original Motorola team that should have you on it (and we would love to send you a print of it, if interested). Also, let me know if you are ever available for interviews. I would to know more about these events or what you did afterwards! I am available at alex@lowspecgamer.tv
My father was a mechanical engineer who got the first family computer in 1985, a kaypro 2. 2x 5 1/4 floppy drives and no hdd. You had to boot from disk and it could either run cp/m or msdos. Gaming with ascii graphics on a 5"(? Ish?) Monochrome screen. Ah the days of 'ladders'. You can still find it online as a self contained .jar file. When I was in school for electronics one of my instructors was on the Intel design team that broke the Ghz barrier. When asked how they did it he just said 'P. F. M.' Because of that instructor we also had access to a small corner of Intel's educational materials and I was able to download a transistor schematic for the 4004. I remember some of these wild west days from the early 80's from my early youth. Of course someone younger than 10 as I was at the time had no way of comprehending the gravity of what was happening around me, I just accepted adjusted and moved on. You're making me feel old here 😁, playing games on my ti85 in high school trig..... Great content this is worth chronicling as almost every living person today is affected by decisions made over 40 years ago in back rooms and garages used as office space.
@@hicknopunk like I said above, you can find ladders in a .jar file. Did you ever play a game that was a galactic economy Sim? Now THAT was a fun game. Wish I could remember the name of it and find a copy in a jar as well. Good old kaypro. I remember one time my dad took it on a business trip and a coworker of his that had the same machine set it up to allow me and my dad to instant message each other, in like 1985. Take that instant messenger.
@@jonathonschott the games I remember most were infocom games like Zork, Planetfall, Hitchhikers Guide, etc. I also had arcade games made up of all text characters like centipede, pac man, some racing game. I was a kid using it, so I mostly remember playing text adventures at night before my dad had to take the computer back to work on monday
My father was a chemical engineer and in 1986 brought home an IBM XT 286, 2MB ram, 20MB HDD. It was $6,000 that his company, Dupont, paid for. I remember playing flight sim and making pictures by printing out characters on the dot matrix printer. He would log into work remotely with paschal and was a cobal programmer.
20:44 the Apple was little more than a 6502, RAM, ROM and a keyboard. They even made the 6502 responsible for video output. Also, you glossed over the fact that Wozniak could *not* get his design to work. Chick Peddle naively showed them their mistakes. And thus, we are stuck with Apple today.
Today's PCs are just a lot of processor cores, RAM, ROM, and a keyboard. Even GPUs became just more processor cores. Wozniak was right all along: hardware's job is to run software.
@@ischmidt Try to us your smartphone to control a 3d printer, try to use your smartphone to write a real program, and you'll see the difference between a real system with expandable IO and the disposable tinkertoys without it. IO is so easy to overlook but so important.
@@tsm688Smartphones can do both of those things, and they can do them comfortably when you plug in a USB-C hub to add a keyboard, mouse, and display. The entire point of the Arduino and Raspberry Pi ecosystems is precisely that a lot of I/O oriented stuff doesn't need a PC.
@@ischmidt on a platform where programming languages are literally banned? The best and nearly only way to program for phone, is to not use phone to do it...
My very first computer build was the "Microtan 65 " kit from Tangerine Computers in 1980. You had to solder cbip sockets,and everything else, into the PCB and insert the chips into the sockets. It used the 6502 processor. It was pure magic when I powered it on and discovered that it worked.
That was my first computer, too. Bought as a kit in 1979. I am now building a replica of it using the 65c02 and modern memory chips (32kb in one chip is pure luxury, compared with the 2 chips needed for 1kb we had originally).
I started working in that business in 1978. I programmed a Commodore Vic20 (in Basic) to run an electronically controlled cow milker. Rockwell was a second source for the 6502. I programmed a Rockwell AIM65 to run a system to verify that VHS tapes were undamaged. Fotoshop was a company that had kiosks that accepted film for developing, and returned prints in a few days. They were planning to accept orders for rental tapes and fill them in a few days. It was fun to watch 8 specialized VHS testing decks all being controlled by one 6502. Perhaps a video could be done on Colecovision. It was the first reverse engineering of the Atari system, and it was successful for a while I was the software person working on getting the reverse engineered Atari hardware working, but then others did the rest of the project.
I will never understand how these videos haven't popped off yet, especially with the watch time probably being insanely high with people watching the whole thing. Gotta happen eventually
@@LowSpecGamer what could you possibly improve other then your pronunciation (no offense intended, but that's literally the only thing I can think of that could be better, I'm genuinely asking what needs to improve.) These videos are some of my favorite content on the platform, your style is just so great. To be honest, you found and nailed your own unique form of presentation.
@@MrGamelover23 If you actually mean his accent when criticising his pronouciation I'd not have it changed in any way shape or form. He is perfectly comprehensible and on top of that very distinguishable in tone and accent! It adds character! I'd rather have profound substance in content and production value. Since years I'm still here and excited over every video he releases. Other creator who are obsessed over perfect pronunciation and tone (aka like to here themselves talk) but seriously lacking in substance of their content (like e.g. coreteks) quickly lost my interest.
@@glockmanish hey don't get me wrong, this guy's videos are amazing, but I still need subtitles for some parts. This guy makes some of the best content for platform's ever had. I was just asking what could possibly be improved since he said he has a lot to improve?
@@LowSpecGamer hey, I apologize if my earlier comment was insulting or offensive or hurtful in any way shape or form. It was not my intention to be rude or hurtful and I hope that you weren't hurt.
Man was this a fun watch! Jack really is a stubborn business man that really loves calculators, and basically damage many business opportunities because of it.
I think he was just not a “visionary” but a good “businessman” that really kept an eye on what actually made his company money. Many companies over the years have gone bust because they lose sight of what their core business is and Jack needed definitive economic proof that computers were a worthy venture before committing.
@@LowSpecGamer In the end his style of doing business also was the downfall of Jack! Nobody who ever had to deal with him businesswise wanted to do it a second time, just look at the Amiga guys how desperate they were not getting into Tramiels clutches and were saved last minute by Commodore from that fate. Also just look how Epyx was screwed over by Tramiel on the Lynx and basically went bankrupt because of it! In the end Atari under Tramiel stood alone and no one wanted to deal with them anymore.
Your new Videos are GREAT. You went form someone who uploadet something semi interessting every couple of vids to in my playlist next to oversimplified, vsauce and in a Nutshell. You managed to turn your channel into something that is timeless, great job my man.
I'm glad to have watched this video today. I came here from the LowSpecLore playlist link on the "End of Low Spec?" video that I still had an open tab of for some reason. This kind of history lesson is great; I'll keep watching.
Great video! I was wondering how this amazing CPU came to be, especially as I was watching Ben Eater's series about making a breadboard-based computer around it. Some days I miss your old type of content(read: low spec game setup guides), but these videos are great! Totally amazing videos. Earns my like, FWIW!
Incredible deep and insightful documentation, with a great narrative. I am astonished! In 1977 I abandoned my plan to be an electronic engineer to become a computer engineer, at 10 it was easy to pivot ;)
If anything I feel like it’s not only not left us but is only getting much more common. Now that processors are this powerful in general even the low spec ones blow stuff from the not so distance past out of the water
I started with Radio Shack's Color Computer in 1984, ordered with 16Kilobytes of memory, but delivered with 32K! Thought I was in heaven! Can't wait to see your take on the 6809!
I used the 6502 in the early 80s, and loved programming it (in assembler). It had exactly the right instructions to do just what I wanted. Completely fuss free and clean.
objective oriented programming is a fucking lie... hopefully in the next few years we see a trend in indie game developers of coming back to Functional Programming (all functions inside a single file of code, code easier to read, etc)
@@FeelingShred it is good to have a data structure of some sort... but OOP mostly means "37 different warring designer's data structures in a mangled glob that barely functions". Just give people the data, in a simple flat structure, and they can do anything in any language
Commodore and its people were so central to early consumer computing yet so little history reflects this. Thank you for doing your part in helping to educate. How well did they know what consumers wanted? They were the first to one million units and have the world record for most units sold of a given model. RIP Chuck Peddle. Please do a video on how Commodore's Amiga was the first mass market computer with a preemptive multitasking OS. This is also crucially central to modern computing and yet also very much overlooked.
The Amiga was not a Commodore computer. It broke with all the fine traditions: no instantly ready BASIC in ROM, no PETSCII, no backwards compatibility. Not surprising because it was not made by Commodore but was bought in. They should have made an own 16-bit computer backwards compatible with the C64 instead - that could have been an IBM PC (clone) killer.
@@NuntiusLegis It wouldn't have mattered.. The PC didn't win because it was the best, or the most fully-featured. It won because it was the *default.* It didn't matter how good Commodore made the Amiga or a hypothetical C256 or the like, because only they were making it. Just like Apple was the only ones making Macintoshes. *Everyone else was making IBM Clones.* So programmers made more IBM-Compatible software. And in the end, all the fancy hardware in the world can't change the fact that *software* drives hardware sales.
@@watchm4kerWith the C64, Commodore had achieved the world market leadership for micro/home/personal computers, outselling IBM PC clones, Macs, and everything else by far. The C64 was the default. IBM PC clones won the race in the end, because they were continuously developed further in a backwards-compatible way. Had Commodore done the same in time, our current PCs would be descendants of the C64.
@@NuntiusLegis No. They wouldn't. Because companies couldn't make C64 compatibles without running afoul of Commodore. Nobody else could make a C64. Nobody else could make an Apple II. Nobody else could make an Atari 400. Or an Adam. Or an Amstrad. Or an Acorn. But they *could* make a computer that was just as good, if not better, than IBM, and let someone at home use the same software they used at work. Commodore, Apple, Atari, and Acorn were not competing against another company. They were competing against an entire industry, and one they could not compete *within.* (See what happened later with Commodore) If you want a counterargument, though? *The MSX.* That was a computer system designed specifically for licensed manufacture by multiple firms. Sony, Toshiba, Panasonic, and more could make MSX computers, all compatible, all interoperable. Had it been pushed harder in the US, and had Zilog been able to keep pace with Intel's hardware, it might have become the standard for home computers, worldwide. The only company that could stand up to the IBM PC was Apple, and that took a truly Herculean effort to achieve, carving out a niche as the graphics computer of choice for anyone working in art, design, or publishing. Even then, they tried to grow the market by attempting to license out hardware manufacture, but it was too late for anyone to really care. Amiga found a smaller niche as the first low-cost video graphics machine, and it held on amazingly well. Of course the company that really got the last laugh over them *all*... is Acorn.
20:51 Until this moment I was never sure that the first product of Apple Computer was actually called the "Apple 1" rather than just the "Apple" , meaning there was foresight by Jobs as to how progressively introduced models are named.
I can only find resources saying it picked up the one in the name some short time after the initial launch. Looks like in some official context it didn’t always have the one as most people would expect
Thanks for the history lesson. I can recall my first computer, it as the brand new addition to Commodore, the 128 with a new 80 collumn green screen monitor. Of course most of the time I ran it in the 64 mode Go 64! I actually made some money programing in Basic. I had been in an accident and was left disabled from the police department, whille the powers that be were debating my future, I picked up the wonderful manual that came with the 128 and learned Basic. Having been the chief of a small department, I knew all the local businssmen and when they learned I could write basic programs they were at my door requesting programs for their little C=64 machines they had at home to write billing programs, and help run their stores, some owned multiple businsses and needed someone who could program sprad sheets and such. So I made enough money to help keep my children in college while workmans comp decided my future. Now I am about at the end of my days, looking to doctors to extend my life a bit by batteling the cancer that has eaten my right kidney, I look to UA-cam to keep me positive as I face the future. Will I soon be with my wonderful wife of 51 years who lost the battle with cancer 2 years ago, and my two sisters who faced the same fate back in 2019. Or can I move on and watch as my grandsons and great grandson takle the world of the future. Time will tell, they tell me I should know before Christmas....
About the only maxed-out 128 application was some high flying business BBS software. Trying to sell CPM, in 1985!? They really did not read their market for that one.
@@LowSpecGamer Yes, Atari was still reeling from the crash of '84, and had already missed the boat on what would become the NES. They were DONE, and their parent company wanted them gone.
Another problem they faced was RISC's heavy reliant on memory bandwidth since there were simpler but a lot more instructions, they had to wait for someone to create faster memories...
Yes! There are ton of small technical stories, like the process they had to go through to create the masks, that I just could not get into for lack of time
I don’t get this. I thought that RISC has a lot of registers in order not to access memory for data. Large registers and barrel shift to pack data. RISC replaces the microcode ROM with cache RAM for code. RISC tries to avoid wait states, but always had allowed for them in case of a cache miss. MIPS was an experiment to see how we can reduce wait states as low as possible. Just consider DRAM and the address multiplexing. It is kinda double data rate. When MIPS was designed, a load or code fetch was still single cycle. When it came to market in 1987, fast page memory introduced variable latency.
@@ArneChristianRosenfeldt Large registers can only help eliminating load/store operations by few percents not completely, also "RISC replaces the microcode ROM with cache RAM for code." wouldn't that exacerbate the problem, since microcode ROM is faster(at that time) and directly wired to ALUs and can do multiple operations in parallel?
@@niks660097 I don’t get the few percent thing. I may be biased. I try to write assembler for AtariJaguar. Load Store does not block there and two instructions are loaded from RAM per cycle. So as long as I have reg-reg in between, von Neumann goes brrrrr. And a lot of code snippets I saw and my own ideas have a lot of reg-reg instructions. JRISC is cheap and only has one instruction format ( 6Operarstion:5src:5dst ). So there is a ROM which translates the Operation to the control lines. It only has 64 entries because there are only a few instructions and each is single cycle. So and that was a lie. JRISC has multicycle instructions also. RISCV has different instruction formats which slow down the decoding a bit. Still fits in a single cycle for a low enough clock rate. I don’t understand what 6502 does with the instruction opcode before the PLA. I thought it is so smart to emphasise decoding speed. So the opcode is latched and then using large transistors with lots of fan out blasted over the 128 transistors of the PLA for maximum parallel processing. For extra speed this is even a balanced signal almost like ECL. But why the same apply the same urgency for the following cycles of the instruction?
I hyphenated low-mid not meaning low through mid but in the sense that it would replace lower mid range cards like the 2060 which is not a low end card imo. An mx450 or 1660 would be a low end card. Mid range cards would be 60&70 series imo. In retrospect, i may have been a bit too hype saying low-mid. It will only replace lower end cards.
"the display was mostly broken chip" Use to do that all the time when we where making display units back when i worked in electronics. If its a display unit, its most likely broken.
I was a database applications developer until I retired. I rarely worked on any low-level programs (mainly for file conversions), but I quickly became interested in hardware. Unfortunately, I didn’t have the time or energy to learn about hardware - until I retired. I have been having a lot of fun learning about the history of computers and other electronic hardware. Videos like yours are perfect for guys like me!
I had a trainer board that used the 6502, and it had 4K ram! I stored the programs on a cassette tape recorder, and I had a black and green monitor connected, after first using an old RTTY machine!
6502 needs fast memory and originally could not accept another bus master on the 16 bit address bus and only had limited current output. So you better pair it with fast = slow RAM and add bridges for all bus pins. I wonder how the bridge can be fast enough to tell the SRAM that write enable is not meant for it? Address comes a bit before enable, so there is that. I still Wonder if there should be a shadow copy of SRAM in the DRAM extension to correct misguided writes? Kinda write through cache. Write only happen at half the clock rate, or do they? Push program counter is faster, but doesn’t leave SRAM. DRAM reads need to prolong phase 1 of the CPU.
I never saw your earlier stuff but I love this stuff. This is all before my time but it’s fascinating understanding the origin story of so much of what we take for granted. Good videos. Well done.
Oh my god this was one of the most entertaining videos I've seen! Props to the artist(s), the artwork really made it 100x times better. Your content keeps amazing me! :)
The number of personal computers and video game consoles that used the 6502 (or a chip using the same instruction set) shows how it was the right chip at the right time.
Awesome show, many thanks for for amazing effort! Interestingly Acorn, the makers of the 6502 based BBC Micro took some design cues from the 6502 and discussion on risc at universities to design the fore runners to Arm chips, something Bill Mensch (Western Design Center) one of the designers that worked on the 6502, mentioned he wasn't too happy about.
I knew parts of the ARM story but I had NO IDEA they directly took ideas from the 6502. That is brilliant. I am totally going to add that to my eventual ARM video. Thank you for telling me.
6502 sets flag on load. ARM does too? But is totally the wrong thing for RISC and load store. Even flags are not so great of an idea. Now JRISC got all those variants ADD, ADC, ADDtransparent to denote if flags are read or written. Power even has 8 flag registers. Or is this okay? After all CPUs have float regs and vector regs. Vec8 could write all 8 flag registers.
One significant point is that the 6502 was NOT bargin basement enough for Atari 2600. The 6507 was used instead, which is like a severely crippled 6502 with less address lines and one interrupt line.
Just found your channel. I watched your videos on the Z80 and 6502. Very entertaining and informative stuff. Keep up the great work. I look forward to viewing more of your content.
@@christopherlawley1842 I put 16k RAM on mine, basically by soldering an extra 8k on top of the first 8k of ram, worked fine; even with overclocking it x2 :)
Good video. Basically that period of time (1970s), was the Big Bang in terms microprocessors. Prior to that computing was huge mainframes that would take up a large room or an entire floor of office building. Then comes along the Microprocessor that shrinks much of mainframe functionality into a single chip... Steve Wozniak pitched the Apple 1 idea to his employer (HP), and the idea was so foreign to them that they could not see any future in microprocessor technology. To be fair, it was so earlier only a rare few understood it potential. aka, in the early 1900s who would want a horseless carriage (car) when you can just have a horse.
A couple of little details. If I recall, the 6501 or 6502 had one error after layout, a rotate that didn't work properly sticks in my head but that was so long ago. Also, I seem to recall the Atari 2600 used the 6507, a version of the 6502 with a shorter (13 bit) address bus in a 28 pin package (A lower, low spec processors). When larger ROMs became available, they had to resort to bank switching to address them. I used to program in assembly on the Atari 800 (which I still think was a wonderful computer). The one thing I didn't like on the 6502 was the short index registers (8 bit) but at least there were two of them compared to the single one on the 6800. Coming from the Intel world, I was never fond of the bus interface on the 6800 and 6502, I liked the 8080 and Z80 better with MRDn and MWRn instead of the phases. Just what I was used to. To this day, I still find it amazing that a 6502 clocked at less than 2MHz (half of color bust 3.58MHz) could produce such stunning and fluid games on the Atari 800 (granted the simple graphics hardware of the day helped) but there was some really fast code written for them. I had the chance to visit MOS Technology in the late 1970s and it was something going onto that hallowed ground, and meeting some of the people who were designing the next generation. Unfortunately, it is now the closest Super Fund site to my home (an hour away). Even though I work largely with ARM microcontrollers these days, I still miss the old days with these great old chips. I made a lot of embedded systems with the Intel 8085 and loved it. However, once I wrote code for the 8086 and 80196, it was hard to go back to these simpler instruction sets. Great video and pretty accurate and well researched. The drawings are awesome.
This channel has come so far... I just discovered it and looked at your 1st video and then viewed your newest video and then compared the two to confirm you have indeed came far
My first tinkering with micros was with the 8080. I did a bit of exploring the 6502 though. I'm now past 70, giving an indication of how long ago these were a thing.
You do an amazing job of delivering a relatively dry story on paper in a compelling way! The building of tension and focus on the individuals involved is great.
My wife bought me an Atari 400, and I spent HOURS programming 6502 machine language into Strings that BASIC could call. When I went off to BCIT ( BC Institute of Technology ) to try to learn enough to get a job, at graduation I came across an ad for someone who could deal with 65816 code. I found a manual and discovered that it was a beefed up 6502, I applied, got the job ( porting a home banking system to a console machine ), and spent ( following the usual detours ) 25 years working on home banking software. Those were the days.
I bought an MOS Technology 6502 as soon as they announced it. It was $25 and came with two excellent manuals, one hardware and one software. I had just built an 8008 machine. The 8008 was $45 and the 2102 rams were $5 each. That was a lot of money in 1975. My 6502 machine had a front panel with the normal arry of switches so I could fat-finger in a loader. The real advantage over the 6800 was not just price, but the indirect addressing modes in the 6502. It was much more advanced. I was shocked when Motorola came out with their 68000. Why would anyone come out with a register rich machine when the future belonged to compilers (which would never use all those registers) and multi-processing which required fast context switching. It was an interesting time.
Registers are called global variables by some. Similar to the zero page. So with a bad enough coding style you use them up quickly. C and Pascal have structure and a compiler can check spans without any calls and then check for non-overlapping lifetimes of variables ( first and last access ). This can be expanded to parameters and return values. Even in 1993 AtariJaguar came with scratchPad memory which forces you to load small, but mostly complete programs into 4k of SRAM ( like on PET ). And you load data into your 64 registers, which are not that global anymore. Sad, to use a compiler only for those snippets. You have to link them all using a custom tool chain. Ah, and no subroutines allowed. I mean prohibitive slow. All macros and loops.
@@ArneChristianRosenfeldt tons of registers make programs also faster, you do not have to performan constant on off stack operations for storing away data. There is nothing worse than having only a small number of registers, well yes there is something worse, having a small number of registers, and half of them dedicated to just one single task to skimp on internal wiring logic!
@@werpu12 6502 has the decimal logic wired to the Akkumulator, but otherwise it has a register file just like z80 . ARM, MIPS, JRISC, and SNES have special registers for multiply and accumulate. So basically, just as 6502 they have an accumulator. I always thought that maybe they keep carries at every fourth position like Z80 ALU? Also MAC already reads two source registers per cycle. You want two read ports on the file? I may want to add one bidirectional port to also allow two register barrel shift. And division is just slow, so indeed it uses hidden registers. No need to expose those in the ISA, if you have a scoreboard as JRISC does. MIPS does not. MIPS blocks if memory access takes longer than a cycle (plus addressing mode, 5 stage pipeline for memory vs 3 for reg-reg, so actually two cycles? MUL also fits in three cycles ( no addressing mode )). MIPS delegates multi cycle stuff to a coprocessor. Yeah, int, float, and vectors are not that different. For vector add, just block the carry. Same for exponent and mantissa. Modern, deep pipelines can deal with instructions of different execution time. So 64bit pointers, double floats, and MMX all in the same registers! So you don’t like the flag register? RISCV is for you! I think that the forth port to the file could be used to set or get a carry from to any register for ADC. Power has 6 flag registers. I just want them aligned to the file. Stack pointer is a special purpose register as is instruction pointer, but these are loved.
The funny thing is that the 6502 wasn’t even that low spec. It depended on faster memory due to having fewer register than its competitors (6800, Z80, 8080), but it could do twice as many memory cycles vs. the Z80 at the same frequency, so assuming your memory could keep up, it was way more cost effective despite being only marginally slower.
Pretty "low spec" with just a 8-bit stack pointer though. And memory couldn't keep up. Because the 6502 did not use the available memory speed very efficiently; only a fraction of a cycle was allowed for memory access, due to the transistor level design. A 6502 at 1MHz needed a 300ns memory, which is less than 1/3 of a clock cycle. In contrast, the Z80 allowed two full clock cycles for memory to respond. So using the same speed memory, the Z80 could be clocked around six times as fast as the 6502. This meant that ordinary 8-bit operations as well as normal jumps, calls, returns, etc. could be 2-3 times as quick on the Z80, using the exact same speed (and therefore cost) RAM and ROM. But the big difference was in 16-bit and 32-bit arithmetics (such as for floating point). The Z80 could do a 16-bit addition in 11 cycles, while the 6502 needed at least 20 (much longer!) cycles for the same thing. Being able to be clocked faster, less cycles for 16-bit add/sub, and several internal 16-bit registers meant that the Z80 could be about 12 times as fast as the 6502 in practical "number crunching" tasks. Again using the same speed, (cost) memories.
@@herrbonk3635 The practical speed difference is nowhere near what you indicate. z80 11 cycles for a 16 bit add is only for adding bc, de, or the stack pointer to hl (or adding hl to itself). Doing the same adds to ix or iy is 15 cycles. And you have to get the data into the right registers somehow to start with. With as few registers as both the z80 and 6502 have, that usually means the data comes from and goes to RAM. The 20 cycles for the 6502 is to add an arbitrary pair of zero page locations (there are 256 of them) to another arbitrary pair of zero page locations, and store the result in possibly yet a 3rd arbitrary pair of zero page locations. So your 20 cycles gets you the same functionality as a 16 bit machine with 128 registers and 3-address instructions. To do something as flexible on z80 requires e.g. "ld hl,($nnnn); ld bc ($nnnn); add hl,bc; ld ($nnnn),hl" which is 16+20+11+16 = 63 cycles and 11 bytes of code, vs 6502 20 cycles and 13 bytes of code. Sure, if you can keep all your data in registers on the z80 it can fly, and it does have more registers than the 6502, but it doesn't have *enough*, usually. 6502's 256 bytes of Zero Page is enough "registers" for anything you can imagine -- the same as a modern 64 bit machine with 32 registers e.g. RISC-V or ARMv8. I found that typically over a wide range of programs a z80 needs three times more clock cycles than a 6502. z80s generally ran at 4 MHz, while 6502s were either 1 MHz (a little slower than a z80) or 2 MHz (significantly faster than a z80).
@@BruceHoult No, it does not mean that it comes from memory, that was one of the main points iirc. Firstly, no one uses IX/IY for this purpose, other than perhaps a naive beginner in a school assignment. Again, even with zero page addressing only, the typical 1 MHz 6502/10 still needs 20 cycles = 20us for a 16 bit addition or subtraction. During those 20 clocks, the typical 4 MHz Z80 had 80 cycles to spend. Floating point routines for the Z80 could store up to three 32-bit mantissas in HL'HL, DE'DE and BC'BC. With exponents in A and A' and/or B,C. This was very useful for repeated shifts and adds, such as in mul and div. So these could therefore be performed with almost no memory data traffic at all. So very unlike the 6502 in this regard. The Z80 EXX-instruction was needed ONCE per 32-bit operation (via carry) and took the same time as *ONE* clock cycle on the 6502! The Z80 sure needed a different crystal than the 6502, but, again, it could cram more power out of the same memory ICs than the 6502 could, especially in 16/32-bit calculations. Memory speed was the main (economical) limiting factor in the 1980s. Even an old "4 MHz" rated Z80 chip could actually be run at 6, 8 or 10 MHz, when coupled to the faster memory ICs that became common in the latter part of the 1980s. The clock speed difference didn't even need to be large. The ABC 800 (based on a 3 MHz Z80) had a significantly faster BASIC than the pretty optimized BASIC for the BBC Micro (based on a 2 MHz 6502). The latter was in itself known to be quicker than say M$ BASIC for 6502. en.wikipedia.org/wiki/Rugg/Feldman_benchmarks
@@herrbonk3635 Thanks for the link! Unfortunately it seems you didn't read it, or look at the table of results: "They conclude that the 6502 is the highest performing of the CPUs, agreeing with comments Gates had made in his letter." Just as you apparently didn't read or didn't understand my comment. I will repeat and rephrase: there are micro-benchmarks (individual loops or functions with few variables) where the z80 is faster because it can keep everything in registers, but this doesn't translate to being faster on entire programs, and certainly not by the margins you claim.
Chuck was absolutely right about processors needing to be cheaper and microcomputers needing to integrate BASIC. An entire generation of software developers got their start on cheap home computers using BASIC. I doubt that I would even have had a 25 year long successful career as a developer if I didn't have a lifelong interest in programming sparked by having access to a Commodore 64 at home back in the 80s. I knew plenty of kids that grew up with consoles only and had absolutely no interest in learning how they worked or how to make them do things. I feel sorry for the current generation of kids who are growing up with smart phones and tablets, which like consoles, have their programming interfaces hidden and not easily accessible.
I know, I had to simplify to not add yet another tangent to the video. Since it is a variation of 6502 it still works within the narrative of the video
My first computer was an OSI computer with a 6502 chip. Learned to program in machine language, assembler and Basic. Things were so transparent and simple.
I always wondered why youtube stopped showing me your videos but did not expect you to completely go another direction. Miss the old lowspec videos, wish you would have kept some like the apex video atleast. Wishing you the best man I still remember when you guest appeared on the wan show
I remember finishing the Commodore 64 Programmer's Reference and thinking, "That's IT? That's what make these amazing games work? Just these simple instructions and their various addressing modes. Now I know a lot more about how the magic works, it still seems amazing.
"$250 for a processor? Das too much". It is weird to hear this, I know this isnt accounting for inflation but $250 today is like the norm for a decent cpu.
CPUs that cost $250 nowadays accomplish so much more than those that cost $20. It’s where they’re used that matters. The chips people were looking for here were for calculators and systems which did not need massive instruction sets as mentioned in the video.
Este tipo de videos no tienen nada que envidiarle a los documentales de los canales de ciencia de la TV, muy completos y me gusta el hecho de como los vas enlazando con los demas videos que has hecho hasta el momento, Excelente!!!
Just noticed something... for your future documentaries try to avoid using these animation screens with All White backgrounds... These things hurt in the eyes to look at when you're watching the video in a dark room (which most people these days are doing...) Dark Mode is the way...
@@LowSpecGamer WHITE MODE IS DEATH, it's worse than death in fact, it's painful 🤣🤣🤣🤣🤣 just kidding... I'm sure you guys will figure it out soon \o/ cheers
This channel has come so far. I remember when the content was just altering game config files to bring game graphics below their minimum settings and spec, and these documentaries are just as valuable imo.
You remember last year. lol just busting your balls man
Yes.
I miss it a bit. I am glad that he has moved on to creating content he enjoys again, but I wish somebody would pick up the mantle.
@@MyScorpion42 Problem is more and more games not allowing editing of the game files, single player games mostly excluded, but even then they continue to make it more difficult, that and most of the methods are kinda same-same anyway since a lot of games are built on existing engines and all you need to do is find the right files. There's only so many different ways of explaining what the same options do and once people know what they do they can just edit it themselves.
Wait I remember those videos. Are they deleted and gone for good?
I bought the first 6502 product, the KIM-1, directly from MOS, and built up a system around it. Later, I bought a Commodore PET and did a lot on that. Then I got a job at Atari designing games for their machine. I still have my KIM-1, my PET, and my Atari 800. I knew the 6502 instruction set, in machine code, by heart.
My first 6502 was an Ohio Scientific single board computer kit. The graphics weren't great, but good enough that I could write fun games in BASIC. My next computers were Commodores and they had much better graphics systems. Rhe graphic sprites and sound system on the C64 were great.
I had the VIM-1 which was a improved KIM-1 they said if you like KIM you will love VIM I had to program it with hexadecimal numbers (just like the KIM)
Admittedly, there were only 150+ opcodes to know.
you have a long life, wow! nice
@@disgruntledtoons you must be fun at parties lol
I met Chuck Peddle briefly at an event in the fall of 2019 before he passed away. Even at 82, he was still excited about new projects and eagerly shared his experience with us undergrads. A truly gifted and talented engineer. I really wish I could have spoken with him longer
I am jealous. I was very sad to hear of his passing back then. I am glad you got to meet him.
One exception to accuracy. I don’t think any individual was sued. I only saw one lawsuit and none was served to me personally. All our names were mentioned in the lawsuit. I did have the feeling that if things got tough I could be on my own. The suit was filed in Philadelphia in the fall of 1975. It should be on file. I only read the first page. “Motorola took seven man years to do the layout. MOS Technology took one man year,” therefore we stole it.
As a matter of record I returned to Motorola in 1977 and worked there till 2010. There is a story there.
Thanks for sharing.
So basically: You didn’t take as long to make the chip as us, therefore you stole from us.
How does one win a lawsuit on that?
@@engineerbot That was page one in the lawsuit. There were many more pages I never read. MOS technology ran out of money. As I remember it the first lawyer bill was $400k. They could not pay a second one like that so they had to settle. Someone else will have to post the terms of the settlement. The possession of damaging material by one of the former Motorola employees, even tho not used or even seen by others, would have made further litigation extremely expensive. Motorola could pay it. MOS Technology could not. I was not privy to these discussions.
@@harryb1251 that's interesting.
@@harryb1251 That's the basic model of operations for large businesses. Make it impossible for your smaller competitor to actually fight it in court even if you're in the wrong because you have an entire office floor for your lawyers on retainer and the small business just doesn't have the liquidity to fight you.
Ah the memories! Taught myself 6502 assembly language as a teenager in the early 80's and wrote my own games cos I couldn't afford to buy them. Never did manage to stop my Pacman ghosts getting stuck in corners.....
You were one up on me. I couldn't stop my pacman eating the frickin maze !!!! ;-)
Should have prototyped the logic in something like BASIC and then converted it. I couldn't even afford an assembler so I wrote my own BASIC program that converted z80 Assembler instructions (ZX81) into machine instructions that were POKEd directly into memory. I did intend to go a step further and create a BASIC > assembler > machine instructions compiler but I discovered drink and girls and that was the end of that! :)
For me again it was the 6502, writing demo's on the C64 in the late 80's. Not even using an assembler, just punching hex into memory using a monitor.... lol I managed to learn every hex code for every instruction and could just sit their typing away, hex after hex after hex for hours on end. Was 14 or 15 at the time.
Ahh those were the days. I was also writing 6502 assembly in the 80's and x86 by the end of the 80's.
@@JoannaHammond. Me too. Punching hex into a keypad - I couldn't afford the full keyboard for my 6502 computer until much later. Codes such as A9 (LDA) are burned into my brain.
I probably still have the 6502 manuals somewhere. One thing I recall that was not mentioned was that the 6502 was designed to be made on a memory production line rather than requiring a more complex chip production line that other micros required, which allowed them to be made more cheaply and leverage exiting production capacity.
I laughed so hard when the Steve Jobs negotiation part came out. That part of the video really capture his voice perfectly. Jobs biography book described how angry he can be when negotiating with someone he dislike. Jack was certainly the type of person that jobs dislike.
I was working from his characterization on that same biography so I am happy it came through
@@LowSpecGamer Tramiel (BTW drop the i, it sounds like Tra-MEL per the man himself) was likewise well known for being fiery AF, that negotiation would have been legendary. He was a notorious micromanager and the friction between him and the chairman of the board Irving Gould was not great even though his performance had made Commodore immensely profitable.
It's honestly weird looking back at the 70s and 80s in computing and realising just how much larger than life all these people were in the industry, so many hard nosed characters dictating the way the industry would evolve.
Im 100 like
Jack was the kind of person most people would dislike, I bet. There were many jerks in that generation of tech CEOs (counting Jobs), but he took the biscuit.
@@ButterfatFarms - *_I don't think he cared what he was selling in the end. It eventually turned out to be consumers electronics, calculators, watches, computers and a few other odd niches._*
When I was in my teens -- the mid-1960s -- my father gave me a small "portable" B&W TV.
The brand was "Commodore"!
After a year or two, the tuner developed a malfunction. This was quite common with the "electro-mechanical" tuners of the day. There were even _stores_ that sold nothing other than tuner parts (for the TV repair trade) -- I remember visiting one in the Bronx, near Yankee Stadium, to buy parts for my mother's Zenith TV (also B&W, as were most televisions of that, to get parts for the _Commodore_ TV, I had to visit the sole supplier, _Commodore,_ which was located at that time deep in the heart of Brooklyn.
So, I got onto the subway and began my _long_ travel.
When I arrived, I found a place that was like a big warehouse/garage type operation. Definitely _not_ an ounce of "retail" to be found. I bought my parts and was on my way.
A few years later, the "pocket calculator" was born. Nifty, but expensive -- something like $400 in 1965 dollars (or whatever year it was in the mid-60s, I can[t remember).
But then an inexpensive brand was born, and my father got one for me. Yup, "Commodore" (I have to wonder if the fact that they were located in Brooklyn, and he worked at the Brooklyn Navy Yard, had anything to do with his purchases).
Ah, memories...
PS: I had the 2nd or 3rd TRS-80 sold in my region. Went in for warranty service nearly 20 times, five main boards, and still ran like a one-legged blind man on a pogo stick.
PPS: The one shown in the photo in the video is the Model 3, which came out a few years after the TRS-80 (retro-named the "Model 1").
Excellent retrospective of the exact moment in time that I became a consumer-electronics enthusiast.
Jack Tramiel's appearance was a particularly strong flashback moment. A man who's shrewd business acumen bordered on evil, but who understood consumers and made possible products that revolutionized home technology (all the while being myopic about the future of the industry). The man nearly took down Texas Instruments (most assuredly killed their home consumer division), and then bought Atari to price-undercut Apple, IBM and his own former Commodore. The man had huge Cajonas!!
It was indeed more than the birth of an industry, it was the birth of our electronic zeitgeist.
Thank you, that was 30 minutes well spent.
He was a colorful character. I respect his shrewdness if anything
I don't know what I'm more impressed with, the video quality and storytelling, or that Motorola has been around since the 60's.
Motorola has been around since 1928 originally started as Galvin Manufacturing Corporation.
In 1930 they sold their first Motorola branded product a Car Radio named Motorola a combination of the terms Motor and 'ola' from the then at the time popular Victrola radio name. And well the rest is history.
@@cybercat1531 after receiving a $500 investment to start Motorola the founders allowed the investors to drive around the block one more time in the car in which the new Motorola radio had been mounted for demonstration! The car caught fire and burned to the ground but the investors had already coughed up the check!
The name is a clue, it's a characteristically old-timey American name lol.
You think that's crazy? Nintendo was founded in 1889
@@xSaintxSmithx You think that's crazy? Nokia was founded in 1865
And one of my childhood heroes, Bil Herd, brought both low spec processors together. 65xx and Z80 working together in one machine, the Commodore 128.
Too bad the Z80 half was useless by the time the Commodore 128 came out.
And they wouldn't have bothered except they found the Z80 cartridge for the 64 wouldn't work on the 128, and that incompatibility was considered a bug that they had to fix
Another great video. Without Chuck Peddle there would have been no Commodore 64, no Atari 2600 and that's right, no Apple II! The 6502 was a great, great invention.
It would be much longer until computers would hit the consumer market. Computers would of continued to be big business machines. Its nuts to think about. It was the perfect invention at the perfect point in history.
It wouldn't have stopped Woz. He'd have built something around the 8080 instead
Don't forget that the Nintendo Entertainment System and Super Nintendo used variants of the 6502.
MOS and Zilog were the real heroes which turned computers into mainstream devices.
And unmentioned here, MOS second sourced the Z80 for a time. My first computer was a MOS corp Z80 (hand built by the way).
@@scottfranco1962 I doubt that. MOS were probably not able to put the 8500 (fast) transistors of the Z80 design on a single chip for a reasonable price. Maybe you are confusing MOS with Mostek. Mostek and Synertek were andvanced firms that built the first Z80 chips before startup Zilog had its first plant ready. They continued as second sources for years. The European second source for Z80 was SGS-Thomson (now ST). After a few years (late 1970s) there were also clones from NEC, Sharp, Toshiba, Rohm, GoldStar/LG, Hitachi, and others.
@@herrbonk3635 Correct, it was Mostek.
Don't forget Acorn. Without them we're probably still marveling over flip-phones.
@@shaunnichols1743 Never heard of them before. looked it up. RISC and ARM are something i also never knew before. Thanks
If you're going for a new approach for your videos, let me tell you something. You're absolutely going in the right direction and I love it!
I personally don’t like this direction,what’s that all that sarcasm funny generic talk and all those drawings,it annoys me,as if he tries to make funny out of those incedible stories to force you to take it with a grain of salt,am sorry but if i want to have some fun i will watch a comedy but if i want to watch a history about the past whether it’s politics,economy or about video games,i don’t wanna any genericness in it because i consider that BS.
I never get tired of hearing "cal-coo-lader". I still have an Apple ][ and an Apple //e that works! 6502 was so loved by all kinds of lowspec machines - I remember my Vic20 and C64 fondly, as well. My computer class had an army of Commodore PETs. Very fun video - I've owned a number of PCs with this processor but I never knew its story!! Thank you.
10:38 that small note under the pricing is so good. The insane confidence with the transparency really shows they were proud of this.
What does it say? I can't see it.
What does it say? My screen is bad so I can't read it.
It says "If you get clocks for less than $5, buy the mos6501 and give your purchasing agent a bonus."
I really liked the 6502 instruction set. The zero-page indexing was awesome!
It is awesome that 6502 accepts its 8bit nature and has a ton of instructions which work with 8bit immediates. No weird register pairs. Instead 16 bit values are addressed as memory, just like strings. Clean.
I like that little Endian works with the way immediates are read and the way pointers in the zero page are read. Yeah, Intel had carry look ahead, but 8 but ADC saves a few transistors. I think that MOS invested a lot of transistors into cycle efficiency. Unlike RCA 1802 or Intel, a lot of stuff could be done. We have like 4 busses and a huge PLA.
Only later the MOS fab produced better and better chips, and soon their clock frequency wasn’t the limit anymore. Rather in the C64 the complex system bus limited speed.
Programming an Apple II in assembly language was exactly where I started. Really loved all the addressing modes and powerful system calls. It wasn't until years later that I wrote my first lines of code in a high-level language...a thermodynamic model of an air-standard Brayton cycle.
Isn’t there this indexed zero page indirect addressing mode, which no other CPU has? It seems to be used for Demo effects only. It is like if on a 68k you could use the 3 lsb of a data register to select one off the 8 address registers to use in a reg-mem instruction. This is insane. I guess 6502 cannot move addresses between address registers like a 68k. I Wonder how you teach a C compiler to use IDs in Y instead of pointers? And use zeroPage+X for lists. Global optimization? Seems like C is the wrong language for 6502. Similarly, why is it so difficult to put pointers to the stack frame on the call stack? Ah, no addressing mode for this. 6502 cannot even peek SP+signed immediate8 .
I'm convinced the 6502 is one of mankind's greatest inventions, and this is the best content I've ever seen on its history. It's such a great story- a team of brilliant underdogs leaving the billion dollar Motorola and joining up with MOS to create a revolutionary processor, accessible to the masses. Also, I love seeing all the programmers leaving comments about what a joy the chip is to program against- I think its greatest strength is how succinct the instruction set is. RISC is beautiful to me.
Absolutely brilliant content, sir.
Not sure, if there was no 6502 everbody and his kitchen sink would have used the z80 instead!
Thank you this - I never looked too deeply into the origin of these chips. I remember as a teen learning to program super-simple assembly programs on my Commodore 64, which used the 6510, and then marveling about 10 years later when I was a junior technician, poking around the guts of a broken old DIN keyboard and realized just 10 years later the chip running the keyboard was a 6502! Going from being the central CPU to just powering the keyboard of an Intel computer is quite the fall from grace. 🙂
yes and no -- the whole point of these cores was they were cheap enough to be used anywhere. Even while IBM PC's were being powered bu the 8088, keyboards were already being powered by the simple-but-not-THAT-simple 8085.
There were actually smaller alternatives to the 8-bit boys but nobody remembers them now. Basically calculator chips with minor modifications. Slow mask-programmed 4-bit behemoths found in a lot of instrumentation until they were replaced by application specific chips.
These videos may not pull as many views as the Low Spec ones but they're absolutely top notch content! Can't wait for the next ones!
Bought at first a Spectrum Z80 at the time. It feeled so cheap like their tiny rubber keys. Went back and swapped it for a VIC20. Later a commodre64, got in love with the old 6502 and learned to how to program it in assembler. Great times. Doing stuff where the "computer" wasn't designed for. Stepped later over to the 8086.
My first computer was a 6502-based Superboard II from OSI.
It ended up 100% overclocked at 2MHz, and I installed a heatsink and a fan to keep it cool enough to run reliably. I converted a B&W TV into a monitor, and built power supplies, and added an expansion board to make up 64KB of RAM in 128 half-Kilobyte 2114 memory chips - and they were not cheap!
I resurrected a dead RX02 8" floppy disk drive from a PDP11 with an RS232, and squeezed 960KB of storage onto each IBM Floppy - they cost $28 each!
As a member of the team from Motorola to MOS Technology my compliments on the presentation. You know more than I do about the “big picture” but your telling of the things I do know about is accurate. Thank you
I am going from username alone but are your Harry Bawcom? I am super glad to hear you enjoyed the video. Hopefully you noticed the piece our artist did at 7:28 with the original Motorola team that should have you on it (and we would love to send you a print of it, if interested).
Also, let me know if you are ever available for interviews. I would to know more about these events or what you did afterwards! I am available at alex@lowspecgamer.tv
My father was a mechanical engineer who got the first family computer in 1985, a kaypro 2. 2x 5 1/4 floppy drives and no hdd. You had to boot from disk and it could either run cp/m or msdos. Gaming with ascii graphics on a 5"(? Ish?) Monochrome screen. Ah the days of 'ladders'. You can still find it online as a self contained .jar file. When I was in school for electronics one of my instructors was on the Intel design team that broke the Ghz barrier. When asked how they did it he just said 'P. F. M.' Because of that instructor we also had access to a small corner of Intel's educational materials and I was able to download a transistor schematic for the 4004. I remember some of these wild west days from the early 80's from my early youth. Of course someone younger than 10 as I was at the time had no way of comprehending the gravity of what was happening around me, I just accepted adjusted and moved on. You're making me feel old here 😁, playing games on my ti85 in high school trig.....
Great content this is worth chronicling as almost every living person today is affected by decisions made over 40 years ago in back rooms and garages used as office space.
We had a lugable Kaypro 2. I loved playing text adventures on it.
@@hicknopunk like I said above, you can find ladders in a .jar file. Did you ever play a game that was a galactic economy Sim? Now THAT was a fun game. Wish I could remember the name of it and find a copy in a jar as well. Good old kaypro. I remember one time my dad took it on a business trip and a coworker of his that had the same machine set it up to allow me and my dad to instant message each other, in like 1985. Take that instant messenger.
@@jonathonschott the games I remember most were infocom games like Zork, Planetfall, Hitchhikers Guide, etc. I also had arcade games made up of all text characters like centipede, pac man, some racing game.
I was a kid using it, so I mostly remember playing text adventures at night before my dad had to take the computer back to work on monday
I wish processors were still in that margin, with 1 dollar per core. 8 core processor? 10 dollars
My father was a chemical engineer and in 1986 brought home an IBM XT 286, 2MB ram, 20MB HDD. It was $6,000 that his company, Dupont, paid for. I remember playing flight sim and making pictures by printing out characters on the dot matrix printer. He would log into work remotely with paschal and was a cobal programmer.
20:44 the Apple was little more than a 6502, RAM, ROM and a keyboard. They even made the 6502 responsible for video output. Also, you glossed over the fact that Wozniak could *not* get his design to work. Chick Peddle naively showed them their mistakes. And thus, we are stuck with Apple today.
Today's PCs are just a lot of processor cores, RAM, ROM, and a keyboard. Even GPUs became just more processor cores. Wozniak was right all along: hardware's job is to run software.
@@ischmidt Try to us your smartphone to control a 3d printer, try to use your smartphone to write a real program, and you'll see the difference between a real system with expandable IO and the disposable tinkertoys without it. IO is so easy to overlook but so important.
@@tsm688Smartphones can do both of those things, and they can do them comfortably when you plug in a USB-C hub to add a keyboard, mouse, and display. The entire point of the Arduino and Raspberry Pi ecosystems is precisely that a lot of I/O oriented stuff doesn't need a PC.
@@ischmidt on a platform where programming languages are literally banned? The best and nearly only way to program for phone, is to not use phone to do it...
I love the little lapel pins you've got on some of the people in these illustrations, it really helps make it easier to distinguish who each person is
Used so many machines with the 6502 it just was EVERYWHERE, and to some extent modern versions still use it.
I programmed the 6502/6510 in assembler. But prior to that there was the Cosmac 1802 and that found its way into avionics and other systems.
My very first computer build was the "Microtan 65 " kit from Tangerine Computers in 1980.
You had to solder cbip sockets,and everything else, into the PCB and insert the chips into the sockets.
It used the 6502 processor. It was pure magic when I powered it on and discovered that it worked.
That was my first computer, too. Bought as a kit in 1979.
I am now building a replica of it using the 65c02 and modern memory chips (32kb in one chip is pure luxury, compared with the 2 chips needed for 1kb we had originally).
I started working in that business in 1978. I programmed a Commodore Vic20 (in Basic) to run an electronically controlled cow milker. Rockwell was a second source for the 6502. I programmed a Rockwell AIM65 to run a system to verify that VHS tapes were undamaged. Fotoshop was a company that had kiosks that accepted film for developing, and returned prints in a few days. They were planning to accept orders for rental tapes and fill them in a few days. It was fun to watch 8 specialized VHS testing decks all being controlled by one 6502. Perhaps a video could be done on Colecovision. It was the first reverse engineering of the Atari system, and it was successful for a while I was the software person working on getting the reverse engineered Atari hardware working, but then others did the rest of the project.
I will never understand how these videos haven't popped off yet, especially with the watch time probably being insanely high with people watching the whole thing. Gotta happen eventually
I still got lots to improve. It takes time to build a new brand. You can help me by sharing it!
@@LowSpecGamer what could you possibly improve other then your pronunciation (no offense intended, but that's literally the only thing I can think of that could be better, I'm genuinely asking what needs to improve.) These videos are some of my favorite content on the platform, your style is just so great. To be honest, you found and nailed your own unique form of presentation.
@@MrGamelover23 If you actually mean his accent when criticising his pronouciation I'd not have it changed in any way shape or form. He is perfectly comprehensible and on top of that very distinguishable in tone and accent! It adds character!
I'd rather have profound substance in content and production value.
Since years I'm still here and excited over every video he releases. Other creator who are obsessed over perfect pronunciation and tone (aka like to here themselves talk) but seriously lacking in substance of their content (like e.g. coreteks) quickly lost my interest.
@@glockmanish hey don't get me wrong, this guy's videos are amazing, but I still need subtitles for some parts. This guy makes some of the best content for platform's ever had. I was just asking what could possibly be improved since he said he has a lot to improve?
@@LowSpecGamer hey, I apologize if my earlier comment was insulting or offensive or hurtful in any way shape or form. It was not my intention to be rude or hurtful and I hope that you weren't hurt.
Man was this a fun watch! Jack really is a stubborn business man that really loves calculators, and basically damage many business opportunities because of it.
I think he was just not a “visionary” but a good “businessman” that really kept an eye on what actually made his company money. Many companies over the years have gone bust because they lose sight of what their core business is and Jack needed definitive economic proof that computers were a worthy venture before committing.
@@LowSpecGamer and 3 different companies wanting one isn't proof?
"Business is war" Jack Tramiel
@Alfred Wedmore 😄
@@LowSpecGamer In the end his style of doing business also was the downfall of Jack! Nobody who ever had to deal with him businesswise wanted to do it a second time, just look at the Amiga guys how desperate they were not getting into Tramiels clutches and were saved last minute by Commodore from that fate. Also just look how Epyx was screwed over by Tramiel on the Lynx and basically went bankrupt because of it! In the end Atari under Tramiel stood alone and no one wanted to deal with them anymore.
I love a good video on microprocessor and digital design history and this one is exceptionally well presented. Seriously nice job.
Id love to see a video about the history of Unix and unix like operating systems in this format, keep this content coming bro its rly good
Your new Videos are GREAT. You went form someone who uploadet something semi interessting every couple of vids to in my playlist next to oversimplified, vsauce and in a Nutshell.
You managed to turn your channel into something that is timeless, great job my man.
I'm glad to have watched this video today. I came here from the LowSpecLore playlist link on the "End of Low Spec?" video that I still had an open tab of for some reason. This kind of history lesson is great; I'll keep watching.
Loved the cheeky use of the Mac font Chicago when you mentioned CES Chicago. It’s the little things. 👌
I was really, really hoping someone would catch that. I am glad it was you.
You put people first in your stories about electronics. That's marvellous!
Great video! I was wondering how this amazing CPU came to be, especially as I was watching Ben Eater's series about making a breadboard-based computer around it.
Some days I miss your old type of content(read: low spec game setup guides), but these videos are great! Totally amazing videos. Earns my like, FWIW!
Incredible deep and insightful documentation, with a great narrative. I am astonished!
In 1977 I abandoned my plan to be an electronic engineer to become a computer engineer, at 10 it was easy to pivot ;)
Low spec has never left us. Man, this each and every upload is so great, many thanks my friend.
If anything I feel like it’s not only not left us but is only getting much more common. Now that processors are this powerful in general even the low spec ones blow stuff from the not so distance past out of the water
I started with Radio Shack's Color Computer in 1984, ordered with 16Kilobytes of memory, but delivered with 32K! Thought I was in heaven! Can't wait to see your take on the 6809!
I used the 6502 in the early 80s, and loved programming it (in assembler). It had exactly the right instructions to do just what I wanted. Completely fuss free and clean.
objective oriented programming is a fucking lie...
hopefully in the next few years we see a trend in indie game developers of coming back to Functional Programming (all functions inside a single file of code, code easier to read, etc)
@@FeelingShred That's a bit random and absolutely nothing to do with my original comment.
@@mandolinic haha Look, another C++ advocate LOL 🤣🤣🤣
@@FeelingShred C++? Yeuk.
@@FeelingShred it is good to have a data structure of some sort... but OOP mostly means "37 different warring designer's data structures in a mangled glob that barely functions". Just give people the data, in a simple flat structure, and they can do anything in any language
you are a master historian I'm very glad and fortunate to be your humble subscriber. many thanks for you and to the team. thank you all.
Commodore and its people were so central to early consumer computing yet so little history reflects this. Thank you for doing your part in helping to educate. How well did they know what consumers wanted? They were the first to one million units and have the world record for most units sold of a given model. RIP Chuck Peddle. Please do a video on how Commodore's Amiga was the first mass market computer with a preemptive multitasking OS. This is also crucially central to modern computing and yet also very much overlooked.
Commodore truly does not get enough credit. It is a shame they did not survive through today like Apple, Microsoft or Intel
The Amiga was not a Commodore computer. It broke with all the fine traditions: no instantly ready BASIC in ROM, no PETSCII, no backwards compatibility. Not surprising because it was not made by Commodore but was bought in.
They should have made an own 16-bit computer backwards compatible with the C64 instead - that could have been an IBM PC (clone) killer.
@@NuntiusLegis It wouldn't have mattered.. The PC didn't win because it was the best, or the most fully-featured. It won because it was the *default.* It didn't matter how good Commodore made the Amiga or a hypothetical C256 or the like, because only they were making it. Just like Apple was the only ones making Macintoshes. *Everyone else was making IBM Clones.* So programmers made more IBM-Compatible software. And in the end, all the fancy hardware in the world can't change the fact that *software* drives hardware sales.
@@watchm4kerWith the C64, Commodore had achieved the world market leadership for micro/home/personal computers, outselling IBM PC clones, Macs, and everything else by far. The C64 was the default.
IBM PC clones won the race in the end, because they were continuously developed further in a backwards-compatible way. Had Commodore done the same in time, our current PCs would be descendants of the C64.
@@NuntiusLegis No. They wouldn't. Because companies couldn't make C64 compatibles without running afoul of Commodore. Nobody else could make a C64. Nobody else could make an Apple II. Nobody else could make an Atari 400. Or an Adam. Or an Amstrad. Or an Acorn. But they *could* make a computer that was just as good, if not better, than IBM, and let someone at home use the same software they used at work.
Commodore, Apple, Atari, and Acorn were not competing against another company. They were competing against an entire industry, and one they could not compete *within.* (See what happened later with Commodore)
If you want a counterargument, though? *The MSX.* That was a computer system designed specifically for licensed manufacture by multiple firms. Sony, Toshiba, Panasonic, and more could make MSX computers, all compatible, all interoperable. Had it been pushed harder in the US, and had Zilog been able to keep pace with Intel's hardware, it might have become the standard for home computers, worldwide.
The only company that could stand up to the IBM PC was Apple, and that took a truly Herculean effort to achieve, carving out a niche as the graphics computer of choice for anyone working in art, design, or publishing. Even then, they tried to grow the market by attempting to license out hardware manufacture, but it was too late for anyone to really care. Amiga found a smaller niche as the first low-cost video graphics machine, and it held on amazingly well.
Of course the company that really got the last laugh over them *all*... is Acorn.
20:51 Until this moment I was never sure that the first product of Apple Computer was actually called the "Apple 1" rather than just the "Apple" , meaning there was foresight by Jobs as to how progressively introduced models are named.
I can only find resources saying it picked up the one in the name some short time after the initial launch. Looks like in some official context it didn’t always have the one as most people would expect
the 6502 is fun to learn and a great retro chipset to get into as a budding computing hobbyist, next to the z80. Incredible art and incredible video!
Thanks for the history lesson. I can recall my first computer, it as the brand new addition to Commodore, the 128 with a new 80 collumn green screen monitor. Of course most of the time I ran it in the 64 mode Go 64! I actually made some money programing in Basic. I had been in an accident and was left disabled from the police department, whille the powers that be were debating my future, I picked up the wonderful manual that came with the 128 and learned Basic. Having been the chief of a small department, I knew all the local businssmen and when they learned I could write basic programs they were at my door requesting programs for their little C=64 machines they had at home to write billing programs, and help run their stores, some owned multiple businsses and needed someone who could program sprad sheets and such. So I made enough money to help keep my children in college while workmans comp decided my future. Now I am about at the end of my days, looking to doctors to extend my life a bit by batteling the cancer that has eaten my right kidney, I look to UA-cam to keep me positive as I face the future. Will I soon be with my wonderful wife of 51 years who lost the battle with cancer 2 years ago, and my two sisters who faced the same fate back in 2019. Or can I move on and watch as my grandsons and great grandson takle the world of the future. Time will tell, they tell me I should know before Christmas....
About the only maxed-out 128 application was some high flying business BBS software. Trying to sell CPM, in 1985!? They really did not read their market for that one.
Legends state that Jack Tramiel bought Atari from WB with him recieving the Atari corp, $50, and a ham sandwich from the deal.
Sounds about right
@@LowSpecGamer Yes, Atari was still reeling from the crash of '84, and had already missed the boat on what would become the NES. They were DONE, and their parent company wanted them gone.
Another problem they faced was RISC's heavy reliant on memory bandwidth since there were simpler but a lot more instructions, they had to wait for someone to create faster memories...
Yes! There are ton of small technical stories, like the process they had to go through to create the masks, that I just could not get into for lack of time
I don’t get this. I thought that RISC has a lot of registers in order not to access memory for data. Large registers and barrel shift to pack data.
RISC replaces the microcode ROM with cache RAM for code.
RISC tries to avoid wait states, but always had allowed for them in case of a cache miss. MIPS was an experiment to see how we can reduce wait states as low as possible. Just consider DRAM and the address multiplexing. It is kinda double data rate. When MIPS was designed, a load or code fetch was still single cycle. When it came to market in 1987, fast page memory introduced variable latency.
@@ArneChristianRosenfeldt Large registers can only help eliminating load/store operations by few percents not completely, also "RISC replaces the microcode ROM with cache RAM for code." wouldn't that exacerbate the problem, since microcode ROM is faster(at that time) and directly wired to ALUs and can do multiple operations in parallel?
@@niks660097 I don’t get the few percent thing. I may be biased. I try to write assembler for AtariJaguar. Load Store does not block there and two instructions are loaded from RAM per cycle. So as long as I have reg-reg in between, von Neumann goes brrrrr. And a lot of code snippets I saw and my own ideas have a lot of reg-reg instructions. JRISC is cheap and only has one instruction format ( 6Operarstion:5src:5dst ). So there is a ROM which translates the Operation to the control lines. It only has 64 entries because there are only a few instructions and each is single cycle.
So and that was a lie. JRISC has multicycle instructions also. RISCV has different instruction formats which slow down the decoding a bit. Still fits in a single cycle for a low enough clock rate.
I don’t understand what 6502 does with the instruction opcode before the PLA. I thought it is so smart to emphasise decoding speed. So the opcode is latched and then using large transistors with lots of fan out blasted over the 128 transistors of the PLA for maximum parallel processing. For extra speed this is even a balanced signal almost like ECL. But why the same apply the same urgency for the following cycles of the instruction?
I’m hyped for AMD’s Pheonix. It should replace low-mid level graphics cards with an APU that can play just about anything at 1080p 60.
1080 60 is low end. Mid level is at least double that. Even the 2060, a 3 year old budget card, runs games at 1080 100+
@@TheLongDon well top end pheonix should be around a 2060 apparently. I don’t think a 2060 could play anything you throw at it at 1080 60 tho.
I hyphenated low-mid not meaning low through mid but in the sense that it would replace lower mid range cards like the 2060 which is not a low end card imo. An mx450 or 1660 would be a low end card. Mid range cards would be 60&70 series imo. In retrospect, i may have been a bit too hype saying low-mid. It will only replace lower end cards.
@@TheLongDon What parallel universe are you from that 1080 60 is low end? I want to go to that universe without the chip shortage.
@@TheLongDon depends on game, E sports yes, AAA it's well within mid-level
Great rundown on the history of the 6502 - exactly what I was looking for!
"the display was mostly broken chip"
Use to do that all the time when we where making display units back when i worked in electronics. If its a display unit, its most likely broken.
Maybe this videos don't get a lot of views yet but with this quality it's just a matter of time. Love the new direction!
10:37 "Note: if you get clocks for less than $5, buy the mcs6501 and give your purchasing agent a bonus."
FRICKING GENIUS MARKETING CAMPAIGN
I was a database applications developer until I retired. I rarely worked on any low-level programs (mainly for file conversions), but I quickly became interested in hardware. Unfortunately, I didn’t have the time or energy to learn about hardware - until I retired. I have been having a lot of fun learning about the history of computers and other electronic hardware. Videos like yours are perfect for guys like me!
I had a trainer board that used the 6502, and it had 4K ram!
I stored the programs on a cassette tape recorder, and I had a black and green monitor connected, after first using an old RTTY machine!
6502 needs fast memory and originally could not accept another bus master on the 16 bit address bus and only had limited current output. So you better pair it with fast = slow RAM and add bridges for all bus pins.
I wonder how the bridge can be fast enough to tell the SRAM that write enable is not meant for it? Address comes a bit before enable, so there is that. I still Wonder if there should be a shadow copy of SRAM in the DRAM extension to correct misguided writes? Kinda write through cache. Write only happen at half the clock rate, or do they? Push program counter is faster, but doesn’t leave SRAM. DRAM reads need to prolong phase 1 of the CPU.
I never saw your earlier stuff but I love this stuff. This is all before my time but it’s fascinating understanding the origin story of so much of what we take for granted. Good videos. Well done.
Oh my god this was one of the most entertaining videos I've seen! Props to the artist(s), the artwork really made it 100x times better. Your content keeps amazing me! :)
The number of personal computers and video game consoles that used the 6502 (or a chip using the same instruction set) shows how it was the right chip at the right time.
This was such a good documentary, I like how everything was connected
Awesome show, many thanks for for amazing effort! Interestingly Acorn, the makers of the 6502 based BBC Micro took some design cues from the 6502 and discussion on risc at universities to design the fore runners to Arm chips, something Bill Mensch (Western Design Center) one of the designers that worked on the 6502, mentioned he wasn't too happy about.
I knew parts of the ARM story but I had NO IDEA they directly took ideas from the 6502. That is brilliant. I am totally going to add that to my eventual ARM video. Thank you for telling me.
I think I've seen a video with Sophie Wilson say that she was inspired by the 6502 when designing the ARM.
I GOT to make that video eventually
6502 sets flag on load. ARM does too? But is totally the wrong thing for RISC and load store. Even flags are not so great of an idea. Now JRISC got all those variants ADD, ADC, ADDtransparent to denote if flags are read or written. Power even has 8 flag registers. Or is this okay? After all CPUs have float regs and vector regs. Vec8 could write all 8 flag registers.
@@ArneChristianRosenfeldt All RISC becomes CISC once you discover enough needs
One significant point is that the 6502 was NOT bargin basement enough for Atari 2600. The 6507 was used instead, which is like a severely crippled 6502 with less address lines and one interrupt line.
Great video, insane quality, but I still miss those old tweaking classic Low Spec Gamer videos!
Will be another high quality experience
Just found your channel. I watched your videos on the Z80 and 6502. Very entertaining and informative stuff. Keep up the great work. I look forward to viewing more of your content.
What a great chip. I learned programming on a Super Board with 8K RAM. The ideal instruction set to start with.
Ooo, me too but 4k (to start with)
@@christopherlawley1842 I put 16k RAM on mine, basically by soldering an extra 8k on top of the first 8k of ram, worked fine; even with overclocking it x2 :)
Good video. Basically that period of time (1970s), was the Big Bang in terms microprocessors. Prior to that computing was huge mainframes that would take up a large room or an entire floor of office building. Then comes along the Microprocessor that shrinks much of mainframe functionality into a single chip... Steve Wozniak pitched the Apple 1 idea to his employer (HP), and the idea was so foreign to them that they could not see any future in microprocessor technology. To be fair, it was so earlier only a rare few understood it potential. aka, in the early 1900s who would want a horseless carriage (car) when you can just have a horse.
A couple of little details. If I recall, the 6501 or 6502 had one error after layout, a rotate that didn't work properly sticks in my head but that was so long ago. Also, I seem to recall the Atari 2600 used the 6507, a version of the 6502 with a shorter (13 bit) address bus in a 28 pin package (A lower, low spec processors). When larger ROMs became available, they had to resort to bank switching to address them. I used to program in assembly on the Atari 800 (which I still think was a wonderful computer). The one thing I didn't like on the 6502 was the short index registers (8 bit) but at least there were two of them compared to the single one on the 6800. Coming from the Intel world, I was never fond of the bus interface on the 6800 and 6502, I liked the 8080 and Z80 better with MRDn and MWRn instead of the phases. Just what I was used to. To this day, I still find it amazing that a 6502 clocked at less than 2MHz (half of color bust 3.58MHz) could produce such stunning and fluid games on the Atari 800 (granted the simple graphics hardware of the day helped) but there was some really fast code written for them. I had the chance to visit MOS Technology in the late 1970s and it was something going onto that hallowed ground, and meeting some of the people who were designing the next generation. Unfortunately, it is now the closest Super Fund site to my home (an hour away). Even though I work largely with ARM microcontrollers these days, I still miss the old days with these great old chips. I made a lot of embedded systems with the Intel 8085 and loved it. However, once I wrote code for the 8086 and 80196, it was hard to go back to these simpler instruction sets. Great video and pretty accurate and well researched. The drawings are awesome.
This channel has come so far... I just discovered it and looked at your 1st video and then viewed your newest video and then compared the two to confirm you have indeed came far
My first tinkering with micros was with the 8080. I did a bit of exploring the 6502 though. I'm now past 70, giving an indication of how long ago these were a thing.
You do an amazing job of delivering a relatively dry story on paper in a compelling way! The building of tension and focus on the individuals involved is great.
WOW that was a good watch. I don't know why, but I love how you've depicted calculators as sleazy computers.
My wife bought me an Atari 400, and I spent HOURS programming 6502 machine language into Strings that BASIC could call. When I went off to BCIT ( BC Institute of Technology ) to try to learn enough to get a job, at graduation I came across an ad for someone who could deal with 65816 code. I found a manual and discovered that it was a beefed up 6502, I applied, got the job ( porting a home banking system to a console machine ), and spent ( following the usual detours ) 25 years working on home banking software.
Those were the days.
It's incredible how this is the same story told from different points of view
loved programing in assembly language the 6502, very easy instruction set to learn.
I bought an MOS Technology 6502 as soon as they announced it. It was $25 and came with two excellent manuals, one hardware and one software. I had just built an 8008 machine. The 8008 was $45 and the 2102 rams were $5 each. That was a lot of money in 1975. My 6502 machine had a front panel with the normal arry of switches so I could fat-finger in a loader. The real advantage over the 6800 was not just price, but the indirect addressing modes in the 6502. It was much more advanced. I was shocked when Motorola came out with their 68000. Why would anyone come out with a register rich machine when the future belonged to compilers (which would never use all those registers) and multi-processing which required fast context switching. It was an interesting time.
Registers are called global variables by some. Similar to the zero page. So with a bad enough coding style you use them up quickly.
C and Pascal have structure and a compiler can check spans without any calls and then check for non-overlapping lifetimes of variables ( first and last access ). This can be expanded to parameters and return values.
Even in 1993 AtariJaguar came with scratchPad memory which forces you to load small, but mostly complete programs into 4k of SRAM ( like on PET ). And you load data into your 64 registers, which are not that global anymore. Sad, to use a compiler only for those snippets. You have to link them all using a custom tool chain. Ah, and no subroutines allowed. I mean prohibitive slow. All macros and loops.
@@ArneChristianRosenfeldt tons of registers make programs also faster, you do not have to performan constant on off stack operations for storing away data. There is nothing worse than having only a small number of registers, well yes there is something worse, having a small number of registers, and half of them dedicated to just one single task to skimp on internal wiring logic!
@@werpu12 6502 has the decimal logic wired to the Akkumulator, but otherwise it has a register file just like z80 . ARM, MIPS, JRISC, and SNES have special registers for multiply and accumulate. So basically, just as 6502 they have an accumulator. I always thought that maybe they keep carries at every fourth position like Z80 ALU? Also MAC already reads two source registers per cycle. You want two read ports on the file? I may want to add one bidirectional port to also allow two register barrel shift. And division is just slow, so indeed it uses hidden registers. No need to expose those in the ISA, if you have a scoreboard as JRISC does. MIPS does not. MIPS blocks if memory access takes longer than a cycle (plus addressing mode, 5 stage pipeline for memory vs 3 for reg-reg, so actually two cycles? MUL also fits in three cycles ( no addressing mode )). MIPS delegates multi cycle stuff to a coprocessor.
Yeah, int, float, and vectors are not that different. For vector add, just block the carry. Same for exponent and mantissa. Modern, deep pipelines can deal with instructions of different execution time. So 64bit pointers, double floats, and MMX all in the same registers!
So you don’t like the flag register? RISCV is for you! I think that the forth port to the file could be used to set or get a carry from to any register for ADC. Power has 6 flag registers. I just want them aligned to the file.
Stack pointer is a special purpose register as is instruction pointer, but these are loved.
These documentaries are so great! Subscribed!! Of course, my first computer was based on this CPU, it was the Commodore 64.
Wow the story telling was amazing! Looking forward to the next episode!
the new format is a great blend of storytelling, art and history. very unique channel and i love watching you! commenting for The Algorithm!
The funny thing is that the 6502 wasn’t even that low spec. It depended on faster memory due to having fewer register than its competitors (6800, Z80, 8080), but it could do twice as many memory cycles vs. the Z80 at the same frequency, so assuming your memory could keep up, it was way more cost effective despite being only marginally slower.
Pretty "low spec" with just a 8-bit stack pointer though. And memory couldn't keep up. Because the 6502 did not use the available memory speed very efficiently; only a fraction of a cycle was allowed for memory access, due to the transistor level design. A 6502 at 1MHz needed a 300ns memory, which is less than 1/3 of a clock cycle. In contrast, the Z80 allowed two full clock cycles for memory to respond. So using the same speed memory, the Z80 could be clocked around six times as fast as the 6502. This meant that ordinary 8-bit operations as well as normal jumps, calls, returns, etc. could be 2-3 times as quick on the Z80, using the exact same speed (and therefore cost) RAM and ROM.
But the big difference was in 16-bit and 32-bit arithmetics (such as for floating point). The Z80 could do a 16-bit addition in 11 cycles, while the 6502 needed at least 20 (much longer!) cycles for the same thing. Being able to be clocked faster, less cycles for 16-bit add/sub, and several internal 16-bit registers meant that the Z80 could be about 12 times as fast as the 6502 in practical "number crunching" tasks. Again using the same speed, (cost) memories.
@Jimbo Bimbo But Elite was first written for the BBC Micro - a 6502 machine
@@herrbonk3635 The practical speed difference is nowhere near what you indicate. z80 11 cycles for a 16 bit add is only for adding bc, de, or the stack pointer to hl (or adding hl to itself). Doing the same adds to ix or iy is 15 cycles. And you have to get the data into the right registers somehow to start with. With as few registers as both the z80 and 6502 have, that usually means the data comes from and goes to RAM. The 20 cycles for the 6502 is to add an arbitrary pair of zero page locations (there are 256 of them) to another arbitrary pair of zero page locations, and store the result in possibly yet a 3rd arbitrary pair of zero page locations. So your 20 cycles gets you the same functionality as a 16 bit machine with 128 registers and 3-address instructions. To do something as flexible on z80 requires e.g. "ld hl,($nnnn); ld bc ($nnnn); add hl,bc; ld ($nnnn),hl" which is 16+20+11+16 = 63 cycles and 11 bytes of code, vs 6502 20 cycles and 13 bytes of code. Sure, if you can keep all your data in registers on the z80 it can fly, and it does have more registers than the 6502, but it doesn't have *enough*, usually. 6502's 256 bytes of Zero Page is enough "registers" for anything you can imagine -- the same as a modern 64 bit machine with 32 registers e.g. RISC-V or ARMv8. I found that typically over a wide range of programs a z80 needs three times more clock cycles than a 6502. z80s generally ran at 4 MHz, while 6502s were either 1 MHz (a little slower than a z80) or 2 MHz (significantly faster than a z80).
@@BruceHoult No, it does not mean that it comes from memory, that was one of the main points iirc. Firstly, no one uses IX/IY for this purpose, other than perhaps a naive beginner in a school assignment.
Again, even with zero page addressing only, the typical 1 MHz 6502/10 still needs 20 cycles = 20us for a 16 bit addition or subtraction. During those 20 clocks, the typical 4 MHz Z80 had 80 cycles to spend.
Floating point routines for the Z80 could store up to three 32-bit mantissas in HL'HL, DE'DE and BC'BC. With exponents in A and A' and/or B,C. This was very useful for repeated shifts and adds, such as in mul and div. So these could therefore be performed with almost no memory data traffic at all. So very unlike the 6502 in this regard.
The Z80 EXX-instruction was needed ONCE per 32-bit operation (via carry) and took the same time as *ONE* clock cycle on the 6502!
The Z80 sure needed a different crystal than the 6502, but, again, it could cram more power out of the same memory ICs than the 6502 could, especially in 16/32-bit calculations. Memory speed was the main (economical) limiting factor in the 1980s. Even an old "4 MHz" rated Z80 chip could actually be run at 6, 8 or 10 MHz, when coupled to the faster memory ICs that became common in the latter part of the 1980s.
The clock speed difference didn't even need to be large. The ABC 800 (based on a 3 MHz Z80) had a significantly faster BASIC than the pretty optimized BASIC for the BBC Micro (based on a 2 MHz 6502). The latter was in itself known to be quicker than say M$ BASIC for 6502. en.wikipedia.org/wiki/Rugg/Feldman_benchmarks
@@herrbonk3635 Thanks for the link! Unfortunately it seems you didn't read it, or look at the table of results: "They conclude that the 6502 is the highest performing of the CPUs, agreeing with comments Gates had made in his letter." Just as you apparently didn't read or didn't understand my comment. I will repeat and rephrase: there are micro-benchmarks (individual loops or functions with few variables) where the z80 is faster because it can keep everything in registers, but this doesn't translate to being faster on entire programs, and certainly not by the margins you claim.
Can't wait for when all this new quality videos blows up, seriously good work
This didn't feel like half an hour, amazing content
Your videos were actually so Intereting that they made me sign up Curiosity/Nebule to watch more. Good work!
6502 and its older brother the 65816 are two of the most iconic 8/16 bit microprossesors out there
Ricoh in Japan also based its processors used in the NES and Super NES on the 6502.
Chuck was absolutely right about processors needing to be cheaper and microcomputers needing to integrate BASIC.
An entire generation of software developers got their start on cheap home computers using BASIC.
I doubt that I would even have had a 25 year long successful career as a developer if I didn't have a lifelong interest in programming sparked by having access to a Commodore 64 at home back in the 80s.
I knew plenty of kids that grew up with consoles only and had absolutely no interest in learning how they worked or how to make them do things.
I feel sorry for the current generation of kids who are growing up with smart phones and tablets, which like consoles, have their programming interfaces hidden and not easily accessible.
The Atari 2600 actually used the 6507, which is basically an even cheaper 6502 in a 28-pin package with a reduced memory address space.
I know, I had to simplify to not add yet another tangent to the video. Since it is a variation of 6502 it still works within the narrative of the video
I came back to this video yet again just for that "SPACE" yell. The animation of the Sputnik slowly going by is just hilarious 😂
The 6502 reigned supreme for more than a decade - it was a victim of its own success and traded market domination for a footnote in wikipedia
Being victim of your own success is not a new story. Have you heard of Eastman-Kodak?
My first computer was an OSI computer with a 6502 chip. Learned to program in machine language, assembler and Basic. Things were so transparent and simple.
Amazing documentary and the story was told really well. Thank you.
I always wondered why youtube stopped showing me your videos but did not expect you to completely go another direction.
Miss the old lowspec videos, wish you would have kept some like the apex video atleast.
Wishing you the best man I still remember when you guest appeared on the wan show
I see LowSpecGamer uploads
I click.
I remember finishing the Commodore 64 Programmer's Reference and thinking, "That's IT? That's what make these amazing games work? Just these simple instructions and their various addressing modes. Now I know a lot more about how the magic works, it still seems amazing.
"$250 for a processor? Das too much".
It is weird to hear this, I know this isnt accounting for inflation but $250 today is like the norm for a decent cpu.
It’s almost like we lost perspective of what used to be mainstream
a *personal computer's* CPU. not a CPU you could afford to put in a mass produced embedded system like a calculator or security camera or something
$250 in 1975 is almost $1400 in todays dollars, which puts the $25 6502 cpu at about $140.
CPUs that cost $250 nowadays accomplish so much more than those that cost $20. It’s where they’re used that matters. The chips people were looking for here were for calculators and systems which did not need massive instruction sets as mentioned in the video.
You can get an entire Raspberry π 4 motherboard for a fraction of that.
man! the quality of production is going up and up every upload!
keep it up 👍🏻❤️
Este tipo de videos no tienen nada que envidiarle a los documentales de los canales de ciencia de la TV, muy completos y me gusta el hecho de como los vas enlazando con los demas videos que has hecho hasta el momento, Excelente!!!
Our boy is making documentaries now, may your channel receive all the blessings for all the tech tips you gave us over the years 🙏🙏🙏🙏👍👍💪💪🙋♂
Just noticed something... for your future documentaries try to avoid using these animation screens with All White backgrounds... These things hurt in the eyes to look at when you're watching the video in a dark room (which most people these days are doing...) Dark Mode is the way...
Not sure how we can make black and white comic animations any other way?
@@LowSpecGamer WHITE MODE IS DEATH, it's worse than death in fact, it's painful 🤣🤣🤣🤣🤣 just kidding... I'm sure you guys will figure it out soon \o/ cheers
for a split second i thought you were making a processor i was like wtf?
Exactly what I thought too for a split second! lol!
Is it me or has the UA-cam recommendation algorithm improved enormously recently? Fascinating stuff!