Some notes from the editor's desk. I'll add more to this as time goes on. Please read or check before commenting:

1. Whether Apple should've invested more directly in WDC is a topic that I didn't address because it's too speculative (and would've stopped the pace cold). Theoretically Apple could have bankrolled an improved die before WDC migrated to Sanyo. They could've made an investment like they did with Acorn and VLSI. But there's too little on the record to discuss it without a lot of speculation. I'm guessing one reason they didn't around the ROM 03 timeframe is the direction they went with ARM. In the 87/88 timeframe 68020 products were out, and the 68030 was on the way.

2. In case it wasn't obvious from the text, Apple was clearly pushing the Mac as its machine of the future. The GS and the Apple II were victims of political machinations at Apple, especially later in its life. But that doesn't mean other problems didn't exist. I didn't want to get in the weeds about what a successor to the IIGS would've looked like, because very little would be on the record. A lot of Apple II people did a lot of great work during the 86-89 era, and did it under less than great managerial conditions.

3. On the subject of the SNES' usage of the 65816 core, I've since found decapped electron scans of both the 5A22 and 5A122 (the 1-CHIP SNES that combined the CPU along with the PPU and other things). The original 5A22 has a CPU core that very closely matches the early die shots, so I don't think Ricoh modified the actual core at all (unlike the NES). I'm not a silicon designer; this is just based on my looking at it. The 5A122, however, has a newer design for the CPU core that more closely matches the die shots I've seen of the post-Sanyo redesign. Given that the 1-CHIP and the SA-1 were in design around the same time, I think it's a reasonable bet that the SA-1 uses the same core that's found in the 1-CHIP, and therefore doesn't have the REP/SEP flaws. There are other timing considerations for the SNES (its use of multiple clock speeds, fastROM/slowROM, the pseudostatic WorkRAM speed) that would've kept it from exceeding 4 MHz back in the day without making the launch console considerably more expensive. But I think I'm comfortable saying that I now believe Ricoh didn't make any changes to the core to account for REP/SEP. The stable 3.56 MHz in its fastest mode likely comes from better yields at Ricoh.

4. Any IIGS faster than 4 MHz would've needed a cache (a la accelerators) due to the stringent memory requirements of the '816. Even putting aside that early chips (either due to REP/SEP or process issues) appeared to have problems closer to 4 MHz, they would've needed faster DRAM than what shipped with the GS. The GS probably had a healthy margin, but RAM was still expensive. A cache might've been the most cost-effective solution, but this was still pretty new at the time.

5. The history of Zip and the accelerator companies is an interesting but separate subject.

6. Left on the cutting room floor is Project Avatar. During the 90s Rebecca Heineman proposed a new GS-like system based on the 65816. It's similar in spirit to many of today's "modern retro computers" like the Foenix and Commander X16, except it would've been made in the 90s.
If even one of your other 10 videos is one-tenth as good as this one, I am subscribing. Excellent job. I am going to watch another couple and decide. Update: subscribed.
Very nice and well researched video! When I got my first IIGS a few years ago, and realized it was much slower than I was hoping for, I created the AppleSqueezer GS: a modern-day IIGS accelerator, which uses the 14MHz 65C816 + FPGA. Surprisingly (or perhaps not?) it's now a very popular accelerator / memory expansion / HDMI output used by many IIGS enthusiasts. I didn't know many of the details behind the early limitations of the 65816, and it was really interesting to learn more about this. Thank you!!
I know of the AppleSqueezer; alas, I don't own one. They're surprisingly hard to get. ;) I kid, because I love. You've done a fabulous job on that board. If you don't mind a technical question-I know you use a full-fat 14MHz '816; how did you solve the DRAM speed problem on your board? Does the FPGA act like a cache controller for what amounts to a very large cache for the system's existing memory? I assume the modern 256MB of RAM on the board is more than capable enough to satisfy the 25-30ns-ish time that the 14MHz chips seem to want.
I had a talk once with an ex-Apple employee who worked on the IIgs ROM. He said that they experimented with 4 MHz and 7 MHz (and even higher clock speeds) using engineering samples. The system did work at those speeds, but at more than 4 MHz, faster DRAM was needed, which would have increased costs. The 65816 was still unstable at higher speeds and, more importantly, not available in volume for the foreseeable future. The early versions of the 65816 did have some severe bugs that became apparent at higher clock speeds; Applied Engineering had to work around those bugs with their GAL logic to achieve 7 and 8 MHz. I suspect that early revisions of the FPI did have 4 MHz and 7 MHz options, but those were removed in the final revisions. In the end they went with the absolutely safe option of slightly under 3 MHz and may have hoped to update the IIgs with a higher CPU clock later on with a new revision. Sadly that never materialized.
You've done a great summary of all the points I brought up with an added bonus of relaying them from a IIGS dev. I'd really like to see a detailed breakdown of the FPI chip's structure; like you I figure it's reasonable that it had a capability to switch to faster speeds that was dummied out of the final chip.
@@userlandia I think you have done some very good research and I just wanted to reaffirm your conclusion. I wish I could get the contact info of the engineer I had that conversation with, but somehow I have no history in my (rarely used) FB account anymore.
@@userlandia On another note, I am quite baffled that the IIGS only had one graphics page for SHR graphics. The Video Overlay Card, which uses the exact same chipset as the IIGS, has two pages for the frame buffer and can do 640x400i with 4 colors. I am pretty sure that Apple management did get involved in this case and wanted to prevent the IIGS from having better graphics (if only on paper) than the Macintosh 🙂
Indeed, I appreciate you offering your insight. Perhaps that conversation was in pre-Messenger days, or on a public post for a profile that no longer exists... alas. Re: the SHR graphics, I can believe that there might've been market positioning (as the euphemisms like to say). The overlay card came out a few years later, though, and I've never used one, so I can't speak to its technical capabilities or any hardware tweaks made for it. The VGC chip on the card is a -C variant (vs the -2 on my ROM 01 and the -4 I've seen on ROM 03). I'd be curious as to the differences between these chips.
Before the internet, I heard the IIgs was supposed to be faster, but the big factor was Apple pushing the Macintosh. That was already evident with the Mac taking on color and ADB. I can tell you that once the internet came around, rumors and facts got added, mixed, and spilled all over the place, making a mess of simple facts. Some things I'd thought were true turned out not to be so, like how much the Macintosh II was marked up: way more than what I'd heard throughout the years.
@@userlandia I still have this system in its boxes somehow, and I got a few items signed by Steve Wozniak that I put in there with it. I also have an Apple IIGS factory video from when Steve Wozniak was still working there; it seems rare and isn't online.
It's always a victory for the public when somebody actually goes and does the research behind computing 'urban legends.' You went and uncovered the business, engineering, and manufacturing processes behind why the IIGS was the way that it was. In this, you have made a contribution to the history of computing. With the popularity of retrocomputing, there are a lot of 'what-ifs' or 'might-have-beens,' and thanks to you, we're now further along in understanding the IIGS. It makes you wonder 'what could have been' for the Apple II or Commodore 64 successor computer. Except that you've shown the 'card table' 65816 design just wasn't going to meet the needs of those companies. If a 4 MHz chip had been available in 1985, it would have affected not only the Apple IIGS, but likely the Commodore 65. I guess since none of the proto-ARM or other blue-sky designs would have been known to / accepted by Apple, they would have had to get a faster-clocked 8-bit processor [6502-derivative], if possible, and mate it to a 16-bit graphics card. Like the TurboGrafx-16 or late Amstrads. Could an 8/16-bit Apple or Commodore computer have been successful, if released early enough? Other than that, the 68000-based computers became the real Apple and Commodore 'successors,' even if they weren't backwards compatible. Maybe a C64/Apple II on a card for the 68000s was the answer. It's sad that we don't see what the IIGS could really do. Even stock, but also with accelerators. There is a lot of software that pushes the C64, ZX, and to some degree the Apple II to its limits. Modern programs designed with the decades of experience over the 80s. But this doesn't exist much for the IIGS. A few places to go from here: Could Commodore have released a worthy 8-bit successor to the C64 while demand existed? Your inclusion of the Foenix was also fascinating. How much does a system like that need to go beyond 1980s limitations in order to have the performance that it does? Anyway, well done; it's better to have one video looking at the facts than 80 that speculate on the IIGS' limitations.
> How much does a system like that need to go beyond 1980s limitations in order to have the performance that it does? good point. I see how the IIgs board has an old side and a new side and how much effort is needed to maintain old quirks. So, given how the IBM compatible PC is piecemeal upgrades over the decades, I wonder how much is still holding the PC back? When people say "The Playstation 5 is just a windows PC" they're wrong because consoles come from a clean sheet design, but HOW wrong, exactly?
I worked on the //GS development team from the early days as a test engineer. The reality is the Mega // chip couldn't support higher speed processors. The cost jump for the higher end '816 was not worth the risk. So the Mega // silicon was limited to 3.5 MHz and then 2.8 MHz.
Can you square this with the fact that the Mega II runs on the 1 MHz side of the bus? Every hardware reference I've seen says the CPU must slow down to 1 MHz any time a Mega II access cycle needs to happen. That's also why it has its own set of Slow RAM. So the Mega II side shouldn't be dependent on how fast the CPU runs because the CPU always slowed down to 1 MHz on cycles when it needs to access that chip. The Mega II operates on the phi0 clock cycle, not the phi2 of the CPU. Do you mean the FPI chip? Any thoughts you could provide (especially related to comments relayed by john_ace earlier in the comments) would be interesting.
@@userlandia I should have made that more clear. The Mega // is limited to 1.024 MHz bus transactions, so running a CPU faster than about 3.5 MHz has almost no benefit, because all the system I/O is moderated by the Mega //.
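For what it's worth, the diminishing returns here are easy to ballpark. Below is a rough Python sketch of how cycles pinned to the Mega II's ~1.024 MHz domain cap the benefit of a faster CPU; the slow-cycle fractions are my own illustrative assumptions, not measured IIGS workloads:

```python
# Rough model: a fraction of bus cycles must run at the Mega II's
# ~1.024 MHz, the rest at full CPU speed. Effective throughput is the
# time-weighted harmonic mean of the two clock domains.
# The slow-cycle fractions are illustrative guesses, not measurements.
MEGA_II_MHZ = 1.024

def effective_mhz(cpu_mhz: float, slow_fraction: float) -> float:
    time_per_cycle = slow_fraction / MEGA_II_MHZ + (1 - slow_fraction) / cpu_mhz
    return 1.0 / time_per_cycle

for cpu in (2.8, 3.5, 7.0):
    for slow in (0.1, 0.3):
        print(f"{cpu:>4} MHz CPU, {slow:.0%} slow cycles -> "
              f"{effective_mhz(cpu, slow):.2f} MHz effective")
```

With 30% of cycles stuck on the 1 MHz side, even a hypothetical 7 MHz part only nets about 2.5 MHz effective, which lines up with the "almost no benefit" point above.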
It's interesting how much effort went into raw backwards compatibility. The Rosetta idea allows for much easier and cleaner hardware upgrades; it's a shame they did not go the 68000 + software Rosetta way back then.
"Should" must be the most commonly used word in all of computing. Hardware, software, everything. They were using it back then and we're still using it now. It's so... freeing.
@NuntiusLegis The CMD SuperCPU was left on the cutting room floor, but I'm well aware of it since it was produced by Creative Micro Designs. They were located in East Longmeadow, Massachusetts, and 20-ish years ago I used to work down the street from their old offices. They were closed by then but I still found it to be a fun coincidence when I drove by on my commute. As a western Massachusetts native I always have to tip my cap to fellow members of the 413. That said, I didn't mention it because I've never actually used one. The SuperCPU uses the 14MHz Sanyo chips overclocked to 20MHz, and I don't know enough about its caching architecture (let alone SuperRAM) to speak to it on a technical level. And this thing was long enough that it was cut for time (like Project Avatar). The life of CMD is worthy of its own story, with a lot of sore points for Commodore enthusiasts and its ownership alike.
10,000 BC “This new wheel thing you designed, it’ll stay on that cart thing and not fall off killing anyone, right?” It should… But given the budget you gave me, the unrealistic development timeline and the lack of any testing you demanded, I can’t guarantee anything.
Grew up with the IIgs and it was an utterly phenomenal machine, kept upgrading it for ten years until I graduated high school. I never felt like the machine was that slow, though I always felt that the pokiest thing was the drive IO. Solved that issue with a SCSI hard drive later on.
When I was a teenager with an Amiga in the 1990s, I always noted that on video game boxes, the Atari ST, Apple IIGS, and Amiga screenshots were always very similar. It was only much later that I learned that, while the Apple IIGS could certainly generate beautiful images, its speed with moving sprites around and animations really hobbled it.
The IIGS lacked a blitter and depended entirely on CPU grunt. There's no double-buffering for the video, either, which also limits options for animation. It doesn't mean you couldn't do it; just that you had to do a lot of clever programming (or lean on accelerators). The Wolfenstein 3D port that gets shown around a lot is a marvel of programming by Rebecca Heineman and Eric Shepherd, but it really needs an accelerator to be playable. It was also heavily delayed, but that story's been well covered by Steven Weyhrich. Had the IIGS gotten faster CPUs in its lifetime it probably would've been less of an issue, because IBM compatible PCs largely solved the problem in the same ways (more raw CPU grunt and clever programming) until accelerated graphics became more common.
@@userlandia Right... brute force with a faster CPU probably wouldn't have made up for the entire difference made by Amiga's custom co-processors, but it would have helped. I should look for some footage of a IIGS using one of those souped up accelerator cards. I think all I've seen footage of is a stock IIGS.
Yeah, the Amiga's custom processors were really special. They gave it a five year head start on the rest of the industry. Consider the SNES again: it can do some really amazing stuff with a 65C816 as its CPU, but it's not a stock '816, and Nintendo's PPUs did so much of the heavy lifting. And, again, expansion processors too. But it wasn't in households until 1990/1991, well after the Amiga. The IIGS was limited by needing to support the older Apple II video modes, but given the timeframe under which the VGC was developed I think they did all right.
Probably one of the best videos on the IIGS. Definitely helps to dismiss those other rumors/theories raised by other channels/creators that gained traction years ago.
You know what you have to do, then: make your own new version of the Apple IIGS, just with faster clock speeds on the 65816 CPU, which can go up to 15 MHz, and make it a nice nostalgia computer for anyone to buy and enjoy. Or you could make a working Pip-Boy if you're too lazy to make a full-on better Apple IIGS 🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣
There were definitely design errors in the original Apple II designs related to timing. The 6502's Phase0 pin was the clock input. This was inverted in the 6502 and sent out on Phase1. Woz inverted that again for bus timing (primarily the read/not-write strobe), which was wrong. The correct signal to coordinate bus timing was Phase2, which was in phase with Phase0 but delayed enough for the chip to stabilize its bus. Or so memory tells me. How do I know this? I had the honor of designing the computer portion of the hardware and writing all the firmware for a well received handheld consumer product back in the 80's. To confirm my understanding of the 6502, I looked at the Apple II schematics, saw Phase1 in use, and beat my head against the wall for a while tracking down timing errors. Live and learn.
Yep. And also the 6502 clock doesn't need to be symmetric, nor should it be. The bus access part (PHI2 high) can be longer than the internal state setup part (PHI2 low). A modern 65C02 can do over 12 MHz that way with fast SRAM. Back in the day it took a selected 6502, slight overvolting, and ECL RAM with level translation. And a 70W supply for the 32 2kB ECL SRAM chips 😅
While the 65816 did have yield problems and REP/SEP problems, the key reason why 8 MHz and later 14 MHz parts went unused was not the CPU, but the DRAM. In the 80's and 90's, DRAM was about half of a computer's cost, so the Woz GS had only 128K fast DRAM and 128K slow DRAM in order to meet the $999 price. But that meant it needed upgradable memory, which meant a memory expansion slot. The memory expansion card needed buffers to fan out the bus to the up-to-4MB of DRAM chips. The FPI/CYA decoded the address bus during the 140ns of PH0 low, then drove the RAS/CAS lines through the buffers to the memory card. It is no accident that the DRAM response time from the memory card required 210ns of PH0 high, 50% longer than the PH0 low time. A faster 65816 would have required a faster FPI/CYA, faster DRAM, lost compatibility with existing GS memory expansion cards, and ultimately needed a lot of external cache circuitry like on the TransWarp and Zip GS.
Thanks for more DRAM timing context. When I was writing the section about the issues with faster chips needing faster, more expensive DRAM (or SRAM cache) I didn't want to get too far into the weeds for pacing issues, and I felt my on-screen footnote addressed it adequately, but the phi0 cycle time is important additional context that I should have put in said footnote. I did mention the FPI's cycle times earlier but didn't sidebar into those because I felt the tech page I posted was good enough for those who wanted to pause. "How Woz would have built his 8MHz IIGS in 1985 while affording a decent amount of memory is a rather inconvenient question." was my way of summarizing a bunch of technical minutiae like this.
@@userlandia It's amazing to me how few people understand the enormous effect that RAM price, speed, and technology (SRAM vs DRAM) had on all of the computers and game consoles in the 1980s and early 1990s. CPU manufacturers were boosting up the clock speed of their chips far more rapidly than RAM manufacturers were able to (affordably) increase the speed and density of memory. People who complain about Apple limiting the clock speed of the IIGS might wish to take a look at the cost/design compromises that had to be made for NEC to use a 7.16MHz 6502-variant in the PC Engine console in 1987.
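To put numbers on this thread, here's a back-of-the-envelope sketch in Python using the figures given above (140 ns of PH0 low for decode, 210 ns of PH0 high for the DRAM response). The 40/60 split for hypothetical faster clocks and the ~120-150 ns figure for commodity DRAM of the era are my assumptions, not anything from the video:

```python
# IIGS bus cycle from the figures above: 140 ns PH0-low (FPI/CYA address
# decode) + 210 ns PH0-high (DRAM response through the card buffers).
PH0_LOW_NS, PH0_HIGH_NS = 140, 210

cycle_ns = PH0_LOW_NS + PH0_HIGH_NS
print(f"{cycle_ns} ns/cycle -> {1000 / cycle_ns:.2f} MHz")  # ~2.86 MHz

# What a faster '816 would demand, keeping the same 40/60 split
# (an assumption; the real FPI would need redesigning anyway):
for target_mhz in (4, 7, 8):
    cyc = 1000 / target_mhz
    print(f"{target_mhz} MHz -> {cyc:.0f} ns cycle, ~{cyc * 0.6:.0f} ns DRAM window")
# Commodity DRAM of the era was commonly ~120-150 ns access, which is why
# anything much past 4 MHz points at faster DRAM or an SRAM cache.
```

The 350 ns total lands almost exactly on the IIGS's famous ~2.8 MHz, and 4 MHz already squeezes the DRAM window down to the edge of what shipped in the machine.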
The 8 bit external data bus of the 816 was a limiting factor, and I find it strange that they didn't use full width. It would have doubled the throughput...
@@userlandiathat would’ve also put the ][GS in the same pickle that the Macintosh was in. While it may have been technically possible to produce a machine with faster clock and RAM, it would t have made much business sense due to the costs and sourcing and that would’ve cratered the Apple ][ revenues. Always compromises.
@@johnbrandwood6568 lol, I think you could've reduced the video down to this one statement. Although I am absolutely fascinated with the detail in it. I owned a ][GS at this time and I remember having to save up multiple paychecks from my various jobs at hardware (Inacomp) and software (Babbage's) shops for months to afford the memory expansion card and the chips that populated it. Yes, you had to buy them separately and hand-install the chips on the board. In the end, the extra RAM wasn't of much use, since hardly any software took advantage of it. Except I believe AppleWorks was snappier with it than without, since it didn't have to hit the drives to page the app if you loaded the files into a RAM disk.
Don't forget ARM, m88K, Intel (Star Trek), their Aquarius project... and the Macintosh Application Environment for UNIX ran on SPARC and PA-RISC too. Any others out there?
@thejeremyholloway Those wouldn't have been in the five year period that @tcscomment was talking about (I assume 1988 through 1993). Those predated that era, but we'll let them join the party anyway. Oh, and the Hobbit too.
Speaking of megahertz: I've managed to overclock a W65C816 to 39 MHz. And Plasmo, over on the 6502 forum, has managed to cross the 40 MHz mark, clocking one stably at 41 MHz.
Heyy, there's me and my IIGS at 48:40 at VCFMW! Playing Puttin' On the Ritz, and the only way it could do that in that program was by accelerating the snot out of it and having a RAM disk to read from, because that file is about 12MB in size and the CPU just can't move the data from the SCSI drive fast enough.
The quality of content on this channel is insane! Thank you so much! As someone who was born after the Apple IIgs launched, reliving the experience through videos like these is incredible!
But things like 12:50 _"The IIGS, like many computers of its era, derived its base clock from a crystal oscillator"_ So unlike computers of today, or wtf? :D
CPU and system clock derivation was a lot cruder back then; most systems often relied on a single base clock for everything in the system (usually linked to video timing). There was a separate crystal for RTC in the IIGS but every other clock-based circuit relied on that single oscillator. Nowadays we have multiple clocks, there's various PLLs, a circuit's crystal might be embedded in a chip, et cetera... Obviously we still use oscillators but the architecture is very different. Odds are you know all that already and are just needling me for not taking a five minute detour on how clocks are generated yesterday vs. today, and I accept it for the spirit of needling that it is. ;)
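To sketch the "one crystal, many clocks" idea with the commonly cited IIGS numbers (the divider values below are simplified for illustration; the real FPI chain does cycle stretching and Mega II handoffs on top of this):

```python
# Single-crystal clock derivation, IIGS-style: everything cascades from
# one master oscillator locked to the NTSC color subcarrier.
# Divider values are the commonly cited ones, simplified for illustration.
COLORBURST_MHZ = 3.579545          # NTSC color subcarrier

master = COLORBURST_MHZ * 8        # ~28.636 MHz master oscillator
fast_side = master / 10            # ~2.864 MHz "fast" CPU clock
slow_side = master / 28            # ~1.023 MHz Apple II / Mega II side

print(f"master {master:.3f} MHz -> fast {fast_side:.3f} MHz, slow {slow_side:.4f} MHz")
# Swap the crystal and *all* of these move together, video timing included,
# which is why the crystal-doubling idea floated elsewhere in these
# comments wouldn't work on its own.
```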
Fantastic video! Great research and production quality. You've earned a sub from me for sure! I worked one summer and used the money to buy an Apple IIGS. It's still in my closet complete with a Transwarp GS, RAMFast SCSI and two hard drives (one of them even still works). I was thinking I was the king of the hill when I maxed out my OctaRAM board with 8MB of RAM! I did most of my programming homework using the ORCA compilers. Ahh the memories....
I love it when people come out with their stories about their tricked out GSes. You should bust it out of the closet and set it up in a place of honor! Or at least make sure the PRAM battery is removed. :)
@@userlandia I'd love to give it a place of honor in my back room but, alas, I don't have the space right now. It would be really fun to try and get it on the network too. I understand there are some products I've missed over the past few years that would be really fun to have. Thanks again for the video. I don't think I've ever seen a NEW video about the Apple IIGS.
The importance of EDA tools cannot be overstated. It all started going downhill when the engineer proudly designed the masks by hand instead of relying on EDA. A rudimentary STA pass would have detected the critical path and prevented this mess.
Another important piece of evidence for the end of the Apple II line is that Applesoft was the result of Apple using Microsoft's BASIC, licensed for 8 years and then renewed for another 8. In the early 1990s, Apple and MS were at odds, and there was no chance of another license extension. The looming 1993 end-of-license explains Apple's lack of hardware upgrades in the 90s.
To be frank, someone could just white-room reimplement that BASIC in a couple of months, and probably end up with a faster BASIC. MS was a bit of a scourge on BASIC performance. It was not great. But it did maintain compatibility across platforms, at least for the core statement set.
At the 9:10 mark, the same reference you are quoting states Jobs didn't want the Apple II to encroach on the Macintosh. So even if he was not at Apple when the IIGS launched, his philosophy still permeated. If Apple had been really serious about the IIGS, they would have done more to advance it.
Good thing that's not the only source I'm quoting, and not the only perspective I discuss about the political problems around the IIGS, including Jobs' thoughts. It's obvious that Jobs was focused on the Macintosh, and I say as much literally two minutes later. But that didn't mean he or others sabotaged the CPU speed, as the legend goes. I'm not here to defend Jobs so much as to say "Hey, this is very complex and it can't be boiled down into a pithy quip about Jobs not liking the Apple II." There has to be room for nuance. Part of the point of this is that you can't take a single source (like the tell-all books) or even a first-hand source (like Bill Mensch) as gospel, especially if you can't see their primary sources and vet them.

That quote I highlighted was from March 84, about the timeline of the IIx's cancellation. At the same time Apple was releasing the IIc, and Jobs was a big driver of that project. By the time the IIGS proper got going in late 84, Jobs was embroiled in the Macintosh Office and would be out the door shortly after. Most of the decisions around the CPU speed would've taken place after he was stripped of power in spring 85. That's just based on the timeline of when the El Grando boards were created and when the custom chips went out.

But even if Apple wanted to push the IIGS harder, they literally couldn't at the time for technical reasons, which I've exhaustively detailed. And by the time some semblance of faster speed was available, it didn't make business sense. Even if they, say, dumped a lot of money into WDC, would Mensch and company still have stuck to hand layout? His goals may not have aligned with Apple's goals.

Obviously there's all sorts of what-ifs going on here, but architecturally the Apple II is a brilliant piece of work for its time that was going to be replaced at some point, either by the Macintosh or by some other machine. There's a whole story about business management of computer platforms that isn't exclusive to Apple, but it's beyond the scope of this video; suffice to say that the Apple II wasn't the only victim of bad management and customer neglect. The IBM PC compatible was only able to carry on like it did to the modern day because Intel did the hard work with the 386. But that's a long, extensive topic that can't be addressed in a YouTube comment.
One of the best and most well balanced looks at the development and short life of the IIGS. You articulated well what I have been arguing to a wall in a few IIGS and Macintosh groups.
He crashed a Bonanza? I've heard they call that the Doctor Killer, because a good number of rich doctors will buy one and crash it, since it can be tough to fly and they don't take the time to learn it well. That's the model of plane that Buddy Holly, Ritchie Valens, and the Big Bopper were on.
Yup. An inexperienced pilot in a very powerful and sensitive aircraft is a recipe for disaster. There were a lot of factors in the crash: Woz was a freshly minted, inexperienced pilot, had no instrument rating, lacked the endorsements necessary for that type, definitely misjudged the amount of runway/airspeed he needed to take off, probably got his weight and balance wrong, etc... He's lucky he didn't face more severe legal penalties. It was probably a contributing factor to Sky Park closing 18 months later, but small airports like that were closing in similar fashion during that time period.
If you want to dive into a deeper rabbit hole allegedly related to the 65816, try researching its connections to the failed Atari/Synertek 6510 CPU project from the late 1970s. No, not the later MOS/CSG 6510 used in the C64. I’m talking about the Synertek 6510… a 6502 with 16-Bit extensions.
I actually did some research on the Synertek chip back when I was writing the Apple IIe Computers of Significant History episode, but I couldn't find a way to organically work it in, so it got cut. It's another absolutely fascinating case of "what if."
The problem with the 65816 is that it's too closely coupled to the 6502. It should have been a true evolution, with a full 16-bit data bus and a true 20- or 24-bit address bus, in a package similar to the 68K.
Wonderful video, and some of my assets were used (and credited - thanks) within. I must disagree only with the Amiga 1200 analogy; it was far more of a leap over previous Amigas in its target-user slot than the Mark Twain was over the ROM 3 GS (32- vs. 16-bit system bus, 68020 vs. 68000, and a new graphics chipset (if a fairly modest improvement -- they did offer "Productivity" modes, which were higher-res, non-interlaced modes for "serious work")).
Hey, thank YOU for posting your photos under a permissive license. Credit isn't just good, it's the right and just thing to do. I'm a photographer too, I know how it goes. Also, I agree that the A1200 was a greater technical leap than the ROM 03, but it wasn't ENOUGH of a leap to stave off PCs, which were making bigger improvements at a faster rate. Especially considering that Commodore was starving R&D and it wasn't as good as it could have been. My point was that the A1200 wasn't enough to stop the Amiga's decline, much like the Mark Twain, even with a speed boost, wouldn't have been enough to turn the IIGS' tide. Now, if the A1200 had an 030 and a full AAA chipset... Although I do think that if the Mark Twain (or the ROM 03) had a speed boost, it could have wound up like the A1200 in another way: it might've split the market for software. But I can't say for sure that's what would've happened. Thanks for commenting and for your work!
The amount of research and quality production in this video is exceptional. You for sure answered that nagging question we all had for years as to whether the gimped clock speed was on purpose for other than technical reasons. Thanks and keep up the good work.
This channel is amazing; I would never have found you in podcasts had you not moved to video. Such fun nostalgia from old times. Now I wish there were weekly videos. I'll just have to not binge them all at once.
Good to see that the algorithm sometimes does get it right. PC history rabbit holes don't get proper deep dives; typically you just get stuck in Wikipedia loops, which leave you disappointed. So I am happy to see proper deep dives like these sometimes make it to my suggested links. Instantly subscribed.
Very interesting! Just wanted to say I LOVE the additional (visual) footnotes in the video, like in every good book. :-D And I also love the long documentary format. This is not a random video of the week; it's for the archives.
Acorn, a British computer maker, went to WDC to buy the 65816 CPU. They were not impressed at all, so much so that they got permission to make their own CPU, and they went 32-bit right away with the Acorn RISC Machine CPU, better known as ARM. But another fun Acorn fact: they made machines which could take co-processors. Their main machine, the BBC Model B (and the later Master system), could use 80286, 68000, and Z80 co-processors, and even got an ARM CPU board; that is where the first example fabbed for them was tested. By accident it basically ran on the electricity leaking in from the oscillator and test equipment; they forgot to power it on, but it was so efficient it still ran. In the book iWoz, Woz mentions the IIx briefly but shares no info whatsoever, then disses the IIgs. It was fun seeing that Apple was trying to make a system like this; sadly it failed, but it's still cool.
As the video and accompanying blog post point out, Acorn did actually use the 65816 in the Communicator, which was aimed at certain kinds of business applications. That choice seems to have been largely determined by constraints imposed by existing projects and the financial state of Acorn at the time. The part used appears to have come from GTE, judging from machines that are still around. The Communicator used the ULA from the Electron, and it is in various respects a follow-up of earlier efforts to augment the Electron with an expansion for use in communications applications. Adopting the 65816 presumably allowed the team responsible to evolve an existing approach, although they did bring in the original architect of the MOS to do a 65816 version. However, the division making the Communicator was pretty much on the margins at Acorn and was eventually eliminated in cost-saving measures, seeing founder Chris Curry's departure.
Another bit about the Communicator that I didn't mention in the text: it used 2 MHz '816s, which, given its development and release timeframe, lines up with Dave Haynie's difficulties sourcing 4 MHz chips and GTE's plentiful supply of 2 MHz chips in 1985.
@@NuntiusLegis John Romero interviewed Nasir Gebelli for a few hours (www.youtube.com/@appletimewarppodcast1264), and Nasir said something like he had hardly touched the 16-bit instructions when he was coding Secret of Mana. He just took his NES skills and went on to the SNES.
Based on this video and the research needed to create this video, I will never blame Steve for the Crippling of the Apple IIgs again. Thanks for setting us all straight on this.
Awesome deep-dive into the 2.8 MHz myth - thank you for finally clearing this up! And agreed that the IIgs is still a great computer, especially with all the new peripherals available for it such as the AppleSqueezer accelerator, Uthernet II card, LiteSound stereo card, A2FPGA Multicard, and a bunch of HD emulators.
Thank you instead for what I totally did not expect at the end, the ARM/Acorn excursion. Loved seeing that Communicator. Those Arcs and even that Master Compact.
Very nicely done. The take-away is that it suited Apple NOT to ensure the IIgs was more successful than the Mac. The 8-bit Guy has done a great thought experiment about what would have happened if Apple had pursued continuous improvement of the Apple II rather than the early failure that was the Mac. Apple nearly sank themselves with those terrible computers!
Not... really? The 65816 was a bit of a dead-end as a desktop processor. Even if higher speeds were possible in the quantities Apple worked in, the need for synchronous RAM would have crippled the GS's price at higher speeds. The 8-bit-guy claiming the lower cycle counts of the 6502 and 65816 were 'better' than the 68000 papers over this problem, since now it's trivial to get DRAM or SRAM at speeds and quantities unimaginable in the 1980s. But the 68000 could handle RAM at a range of speeds decoupled from the processor clock, something that we just take for granted these days.
Glad to be of service, but honestly my memory of that time is very limited because of the long gap. Unfortunately, I don't have archives which would help. A canonical authority for that time period (1989-1992) would be Dave Lyons who likely has either knowledge or archives.
What a fantastic video. I have a passing knowledge of the Acorn / BBC side of this (over in England the Apple II wasn't really a big thing) so I learned a lot. Well researched and professionally created, I don't understand how you aren't a 100k+ subscriber account.
Great video! I was a little uneasy about checking my own posts about the Apple II series, in case I'd helped perpetuate the megahertz myth. Thankfully I had not.
I loved the Apple II line, I even saved up and bought a Laser 128 (//c clone), which was a mistake at that time in my life. The IIgs was really neat. But while the II line was great for hackers, it wasn’t strong as a consumer device. I hated the pre-NeXT Macs, but they did have their place, they just weren’t targeted at me.
Great video, pretty convincing research and argumentation. It delivers pretty conclusive answers to questions I, as a former 6502 guru-wiz, had about the 65816, because I just never did real work with that CPU. Regarding the 68000 user/supervisor mode: that privilege separation was kinda pointless without an MMU to go along with it. Let me skip the details, but the 68000 was basically broken for use with MMUs and required some expensive kludgery to bolt one on. Plus there wasn't really a standard MMU. The 68020/68851 combo was in theory viable but expensive, and the 68851 was pretty broken. Plus some vendors such as Sun were using their proprietary MMUs with the '020. Things started becoming sane only with the 68030 and its built-in MMU, but for all but the most hardcore 68k vendors that was already the end of the 68k architecture. What I credit the 68k architecture with is that it was hands down way more elegant and programmable than x86. At the same time, writing optimized code for it was more demanding than for the 6502, due to the large number of ways many things could be expressed in assembly code and the complexity of execution times, something that only got more complex with the 68020. Ironically, RISCs were meant to make code generation easier for compilers, but with a slight help from tools they also made it easier for programmers. The orthogonality of the instruction set did also help, unlike the 68000, where things were just a slight bit different between the address and data registers. Re: Dave Haynie - some people never change. It's been a long time since I met him, but he didn't change much between then and the video.
Thanks for your insight! Your point about user/supervisor separation not being helpful without an MMU is well taken, which is one reason why the Mac didn't use it (Andy Hertzfeld talked about this in some places). I guess the point I was making is that it's an example of the 68k architects thinking of future needs, and how its audience/goals differed from something like the 65816.
@@userlandia Absolutely! And I agree about the 68k architecture looking ahead. I think the function code lines are one thing which shows they were thinking ahead. And the 68010 fixed the worst sins. And if you look at the instruction encoding (note, I have not fully researched this), there would have been space for a 64-bit variant.
Magnificent video, an immaculate documentary. It must have taken a dog's age to do all that research. Well done indeed. The Apple iiGS was a sort of dream machine for me. We had Apple IIes in school which I enjoyed. When the GS appeared, it seemed like a miracle. Alas I never did get my hands on one, then or now, but thanks to brilliant productions like this I can imagine a world where I have my very own IIGS. Thanks!
That was my first new Apple computer. I bought it with a sound card and a dot-matrix printer with a scanner snap-on. It cost me an arm and a leg, plus trading in my Apple IIe. But I loved it.
I made a conscious choice to forgo the Mac and get the Apple ][gs simply for the color and the Ensoniq sound chip alone when I was a kid. The amount of money spent on this system was enormous for that time. When I got home, unpacked it all, set up and booted... and waited... waited... waited... and the desktop finally came up, I realized two things. One, I had to purchase a hard drive system for this beast and could not rely on the 3.5" floppies to boot off of and run programs on this puppy. Second, and something I had no control over, was that the CPU was a dud out the door!! Sure, it ran games like Conan (Apple ][ users know what I am talking about) at lightning speed, because the game had no timing system built in and relied simply on the known fact of the ]['s clock speed for game timing, but everything else, including boot, was monotonous. When two hardware developers came out with clock-multiplier add-in boards I was ecstatic. The Applied Engineering TransWarp board was hot on my list, and of course the hard drive and this board meant much more money!!! In the end, it was worth the money for both upgrades for my BBS.
Interesting... IIRC, Acorn was also considering the 65816 for the successor to the well known BBC Micro, but was frustrated by the delays, the price, and the chips being "a bit crap"... So they thought, "What the hell, we'll make our own CPU"... That CPU was the Acorn RISC Machine: the ARM1... BTW, the original BBC Micro didn't have to slow its CPU down for RAM access; Acorn sourced 4 MHz RAM for the 2 MHz 6502-based BBC Micro, so the video and CPU could access the RAM on alternate clock cycles.
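A quick sketch of the interleaving arithmetic from that last point, using the numbers straight from the comment:

```python
# BBC Micro-style interleaving: RAM that completes an access in half a CPU
# cycle lets the video hardware and the CPU alternate without stealing
# cycles from each other.
CPU_MHZ = 2.0

cpu_cycle_ns = 1000 / CPU_MHZ      # 500 ns per 6502 cycle
ram_slot_ns = cpu_cycle_ns / 2     # one slot for the CPU, one for video
print(f"{cpu_cycle_ns:.0f} ns CPU cycle -> two {ram_slot_ns:.0f} ns RAM slots")
print(f"RAM must be rated ~{1000 / ram_slot_ns:.0f} MHz (i.e. 4 MHz parts)")
```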
It would be interesting if backwards compatibility became such an important feature that they kept extending the architecture to 32 and 64 bits. Perhaps in an alternate universe there are Gen Alpha kids breaking out a 5.25 floppy to play Number Munchers in compatibility mode on a modern computer.
WDC announced but never made a 65832. Based on the preliminary data sheet it would've been even more of a PITA to program than the 65816, and generally too little too late.
@@ischmidt yeah the 65C832 was a ridiculous design with no benefit. It had no new instructions, just yet another processor status bit to double register sizes. So you could add/subtract 32-bit numbers, but still no multiply/divide. And everything squeezed through that 8-bit data bus, so those 32-bit operations would take extra cycles. The thing wouldn't have provided any practical benefit, which is why they found no customers for it.
@@tim1724 Yup. Mitsubishi's 7700 series microcontrollers were a more reasonable 65816 extension: 16-bit data bus, multiply and divide, and no emulation mode (but still M and X mode bits).
You'd have to be crazy to hate the Apple IIGS? Dude, you know that there were much, much better home computers before this, right? The Amiga, for a start, in 1985. Apple didn't even have an operating system; it had a simple program loader, the simplest software a computer could have. The Amiga had a full, pre-emptive multitasking operating system like Unix. It took Apple another 16 years to come up with that, and it was a Unix clone from the old NeXT OS. And here you are trying to make a case for a IIGS in 1991. Having said that, I appreciate your history road here! And as an old computer club newsletter editor (and president) from the 80's, I appreciate your inclusion of old newsletters! Those are near and dear to my heart. 99% of people never know what went into creating those 40 years ago! Nice Dave Haynie references.
That line is actually a reference to one of my favorite Dexter's Laboratory bits. "Not Justice Fruit Pies, the delicious treat you'd have to be crazy to hate!" Look up Justice Fruit Pies on YouTube sometime for a laugh. The Amiga might have been better in many ways, but that doesn't mean we have to hate things! There has come a time in my life where I have learned to love all computers. ;) The IIGS did have a full operating system, GS/OS, and while it lacked any kind of multitasking it certainly had a toolbox-based development setup and system management that would be recognized as an actual operating system. Which is pretty impressive for what it is, and one reason I find this machine endlessly fascinating even though I didn't own one back in the day. A faster IIGS in 1991 would've been a decent enough machine (certainly would've blown my Commodore 64 away), but I think I made the point that it wouldn't have reversed the system's fortunes.
Excellent work! Also makes me wonder now about my assumptions about decisions made by the other great 1980 computer builders, Commodore, Sinclair, etc, where previously I'd assumed bad management and costs were the fundamental reasons behind some of the failures to produce better machines.
I've found the answer is almost always "a little from column A, a little from column B." Although in the case of Commodore I'd lean more towards the column B of "Management." :)
For people who don't have access to magazines (I collect old tech magazines): the MacWorld of April 1989 shows that the Mac II was already old, and we already had PCs running 32-bit microprocessors at 33 MHz. The Mac IIx was running a 68030 at 16 MHz (15.667) with a 68882 math coprocessor and 6 NuBus slots.
Amazing video. I believe the evidence you cited supports the rationale underpinning your argument. Nicely done! On the subject of "should have", I have a rather different take on things, but of course this is all with the benefit of hindsight. I'd love to have seen Apple acquire the Amiga division from Commodore. There were a number of what I think are interesting architecture designs inherent in the Amiga which possibly would have made for a much more powerful computer, and much closer to modern-day computing (that is, color, native multimedia integration, etc.) than what Apple had going for it with the Macintosh, and as a consequence would have left the 8088/80x86 platform in the dust.
That's my first computer in 1988. I added RAM to it, but I was not careful, and I broke it with static electricity. I was dumb for not buying a hard drive for it, and I had to swap floppy disks all the time.
Everyone likes to say the 65816 is fast per clock compared to the 68000, but what happens when you want to dereference a 24-bit pointer??? That's really slow on the 65816, same as dereferencing a 16-bit pointer on a 6502. The 65xx architecture was great at the time, but it was more or less specialized for microcontroller-type applications where you were using a few fixed addresses in a small memory space. Once you have more memory, try to do relocatable code, work with a bunch of structs, etc., 65xx really falls flat.
I didn't want to get too far into the weeds on the architectural differences between the two (especially because I am not a wizard at programming either of them, and for pacing reasons), but yes, that's one of the cons I know of for the '816. The analogy I always think of is that the '816 is like a nimble sports coupe versus the 68000's sport sedan.
I still think it's sad that the charm of the 6502 computers got cut off by most of the 16-Bit machines with zero backwards compatibility. As a C64 person, I would have loved a compatible 65816 progression, and envy the Apple II persons for the IIGS. I would have told Commodore to shut up and take my money without caring a second about some 68000 machine with slightly better benchmarks.
@@Henrik_Holst I am not an expert, but the 6502 can address the first 256 bytes of RAM (zeropage) very fast with a single address byte, which is considered as the equivalent of 256 8-bit registers or 128 16-bit registers. If I see it correctly, the 65816 can have 65536 "zeropages" (called direct pages when relocated from the start of memory).
@@NuntiusLegis yes you save one cycle over other parts of RAM, aka LDA from the zeropage is 3 cycles vs 4 cycles for LDA from other parts of RAM, register to register is however only 2 cycles so zeropage is faster than other RAM but not as fast as the registers.
The fun thing about the 65C816 is that its architecture and RAM speed issues could have been largely mitigated by going to a wider memory bus as ARM did, and by adding (especially) the zero page memory on chip, along with at least a page of the stack. And there was the one reserved WDC instruction, along with the COP instruction, that could have kick-started a wider, forward-looking instruction set and overall architecture. The reality is that with zero page addressing from the start, it was very much more of a RISC machine which, had all that zero page been on-chip, would have screamed for operations done with that address mode, being much like 256 bytes of CPU registers. And with adding a whole page of static RAM for in-page code execution, as well as the other things, it isn't hard to see a logical path for the family to have kept growing, much like ARM has. After all, as node sizes kept getting smaller, the base 6502/65C816 compatibility mode would have become an insignificant portion of the die space used: it was, from the start, a CPU meant to use minimal circuitry/die space for its functionality, and that could have served it very well. It's not like any of the other CPU architectures weren't forced into the more expensive ways of achieving performance anyway: that wasn't something that had a realistic chance of not happening.
Based on research about the 65832, it was still going to use an 8-bit external data bus. I think Mensch prioritized socket/plug-in compatibility a bit too much with the '816 (and 65802) versus a logical compatibility. I think those reasons can be defended, but they clearly prioritized a here-and-now viewpoint over a future viewpoint. It's easy for me, as someone who's not in the semiconductor business, to look back in hindsight and criticize those decisions. Whether it would've been more successful with a more aggressive redesign is yet another what-if. ARM is a more elegant re-implementation of a lot of these ideas without as much of the 6502's technical baggage. If the 832 had been a real clean sheet with a more discrete emulation mode (a la real mode / protected mode), I wonder how it could've fared in the marketplace...
My school, when I started kindergarten in 1993, still had Apple IIGSes in the computer labs. In 1994 they upgraded to Macintoshes. Then in 1997 they went to Windows PCs.
I read that in 1980 Jobs himself approached one of their second source manufacturers of the 6502 to design a 16 bit version of the chip. The manufacturer wanted Apple to foot the entire bill, which Jobs declined. Given the timing, it sounded very much like Jobs initially wanted the Lisa to run on a 16-bit 6502. Imagine how different things might have turned out if the Lisa had been based around an ARM-like chip and been at least partially Apple // compatible.
I don't have the materials on hand right now, but the Lisa did switch from some different arch in its pre-GUI days to 68k. I don't remember what that arch was, exactly.
Great video, I've subscribed. Thank you for covering the 65C816's part in the birth of the ARM processor. If the 65C816 hadn't been so disappointing, Acorn might not have developed the ARM architecture, and then where would we be?
Wow, a really good documentary; I learned a lot! Darn, I can even see an F256K shown in there (I have one, of course, with a 65816). It's shown with the RPG demo I coded for VCF. Ha ha, I never thought I would see it shown anywhere... nice!
The legend of Apple intentionally nerfing the IIgs always bothered me, on the surface it feels like the kind of story that's more popular than it is true, and it never appeared with anything compelling backing it up. A surface-level read on a story like that isn't going to be reliable of course, but generally the information I'd found seemed to support the idea that going faster just wasn't an especially viable option when the IIgs was being developed, or even for a few years after it was released. Thanks for the deep dive.
Them considering 5.5 MHz at one point is easy to figure out when you look for clock crystals and find that there are crystals rated for double the speed of the one used in the IIGS. Makes you wonder if adding faster RAM to the main board and replacing the crystal with the doubled-speed version would produce results.
You can't just replace the crystal because too many devices in the system have their clock cascaded off of it, including the video generation. The system wouldn't work properly.
Comment on the article directly from the lead of the project (Hillman) you might find interesting: "A very long and mostly accurate article. The Brooklyn and Golden Gate were Apple /// projects which were meant to bridge the 8 & 16 bit world Those projects were killed because there was no interest in an Apple ///. The work morphed into the IIx, which became the IIGS. The clock speed was always a multiple of the color burst frequency (3.58 MHz) from the original Apple II."
Never heard of this myth before this video, but consider me educated :) Now I want to hear that 32-voice music chip given a workout. The few Apple IIGS games I've tried haven't really hinted at such a chip being under the hood.
I believe the way it's set up in the IIGS is that it's capable of 15 stereo voices using those 32 oscillators, but the system shipped with mono sound only. To get full stereo you needed a stereo expansion board (which mine has, it plugs in to a header on the logic board). The Mark Twain would've had stereo output according to the reports I've read about it. There's also the 64KB sound RAM which limited it a bit as well. But you can use SynthLab or modtrackers to let it stretch its legs. Bumbershoot Software has written an excellent writeup on the DOC. Highly recommended reading if you want to learn more about the technical bits.
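As a side note on the voice math: the oscillator-pairing scheme below is how it's commonly described for the IIGS sound toolset, so treat this sketch as an approximation rather than a hardware reference:

```python
# DOC 5503 voice arithmetic as commonly described for the IIGS: the 32
# oscillators are paired into generators (for seamless waveform swaps),
# with one generator reserved for system timing, leaving 15 usable voices.
OSCILLATORS = 32
OSC_PER_GENERATOR = 2      # paired oscillators per generator
RESERVED_GENERATORS = 1    # held back by the toolset

voices = OSCILLATORS // OSC_PER_GENERATOR - RESERVED_GENERATORS
print(f"{OSCILLATORS} oscillators -> {voices} usable voices")  # 15
```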
I remember learning the Super NES used an improved version of the same 65816 CPU. The Sega Genesis used the 68000. I also remember my 7th Grade IT class in 1993 where the teacher's pet got the IIgs with the color monitor while the rest of us were stuck with green-screen IIe machines. That sprunt didn't even put the then-beast to proper use while I was making the IIe my bitch.
Some notes from the editor's desk. I'll add more to this as time goes on. Please read or check before commenting:
1. Whether Apple should've invested more directly in WDC is a topic that I didn't address because it's too speculative (and would've stopped the pace cold). Theoretically Apple could have bankrolled an improved die before WDC migrated to Sanyo. They could've made an investment like they did with Acorn and VLSI. But there's too little on the record to discuss it without a lot of speculation. I'm guessing one reason they didn't around the ROM 03 timeframe is because of the direction they went with ARM. In the 87/88 timeframe 68020 products were out, and the 68030 was on the way.
2. In case it wasn't obvious from the text, Apple was obviously pushing the Mac as its machine of the future. The GS and the Apple II were victims of political machinations at Apple, especially later in its life. But that doesn't mean other problems didn't exist. I didn't want to get in the weeds about what a successor to the IIGS would've looked like, because very little would be on the record.
A lot of Apple II people did a lot of great work during the 86-89 era, and did it under less than great managerial conditions.
3. On the subject of the SNES' usage of the 65816 core, I've since found decapped electron scans of both the 5A22 and 5A122 (the 1-CHIP SNES that combined the CPU along with PPU and other things). The original 5A22 has a CPU core that very closely matches the early die shots, so I don't think Ricoh modified the actual core at all (unlike the NES). I'm not a silicon designer; this is just based on my looking at it.
The 5A122, however, has a newer design for the CPU core that more closely matches the die shots I've seen of the post-Sanyo redesign. Given that the 1-CHIP and the SA-1 were in design around the same time, I think it's a reasonable bet that the SA-1 uses the same core that's found in the 1-CHIP, and therefore doesn't have the REP/SEP flaws.
There's other timing considerations for the SNES (its use of multiple clock speeds, fastROM / slowROM, the pseudosattic WorkRAM speed) that would've kept it from exceeding 4MHz back in the day without making the launch console considerably more expensive. But I think I'm comfortable saying that I now believe Ricoh didn't make any changes to the core to account for REP/SEP. The stable 3.56 MHz in its fastest mode likely comes from better yields at Ricoh.
4. Any IIGS faster than 4 MHz would've needed a cache (a la accelerators) due to the stringent memory requirements of the '816. Even putting aside that early chips (either due to REP/SEP or process issues) appeared to have problems closer to 4 MHz, they would've needed faster DRAM than what shipped with the GS. The GS probably had a healthy margin, but RAM was still expensive. A cache might've been the most cost effective solution, but this was still pretty new at the time.
5. The history of Zip and the accelerator companies is an interesting, but separate subject.
6. Left on the cutting room floor is Project Avatar. During the 90s Rebecca Heineman proposed a new GS-like system based on the 65816. It's similar in spirit to many of today's "modern retro computer" like the Foenix and Commander X16, except it would've been made in the 90s.
If one of 10 videos is one-tenth as good as this one, I am subscribing. Excellent job. I am going to see another couple and decide. Update: subscribed.
Very nice and well researched video! When I got my first IIGS a few years ago, and realized it was much slower than I was hoping for, I created the AppleSqueezer GS: a modern-day IIGS accelerator, which uses the 14MHz 65C816 + FPGA. Surprisingly (or perhaps not?) it's now a very popular accelerator / memory expansion / HDMI output used by many IIGS enthusiasts. I didn't know many of the details behind the early limitations of the 65816, and it was really interesting to learn more about this. Thank you!!
I know of the AppleSqueezer; alas, I don't own one. They're surprisingly hard to get. ;) I kid, because I love. You've done a fabulous job on that board.
If you don't mind a technical question: I know you use a full-fat 14MHz '816; how did you solve the DRAM speed problem on your board? Does the FPGA act like a cache controller for what amounts to a very large cache for the system's existing memory? I assume the modern 256MB of RAM on the board is more than capable of satisfying the 25-30ns-ish access times that the 14MHz chips seem to want.
Of course at that point you may as well drop the 65816 and run everything even faster in the FPGA :) But keeping MOS alive is good too.
I had a talk once with an ex-Apple employee who worked on the IIGS ROM. He said that they experimented with 4MHz and 7MHz (and even higher clock speeds) using engineering samples. The system did work at those speeds, but above 4MHz, faster DRAM was needed, which would have increased costs. The 65816 was still unstable at higher speeds and, more importantly, not available in volume for the foreseeable future. The early versions of the 65816 did have some severe bugs that became apparent at higher clock speeds. Applied Engineering had to work around those bugs with their GAL logic to achieve 7 and 8MHz. I suspect that early revisions of the FPI did have 4MHz and 7MHz options, but those were removed in the final revisions. In the end they went with the absolutely safe option of slightly under 3MHz and may have hoped to update the IIGS with a higher CPU clock later on in a new revision. Sadly that never materialized.
You've done a great summary of all the points I brought up with an added bonus of relaying them from a IIGS dev. I'd really like to see a detailed breakdown of the FPI chip's structure; like you I figure it's reasonable that it had a capability to switch to faster speeds that was dummied out of the final chip.
@userlandia I think you have done some very good research and I just wanted to reaffirm your conclusion. I wish I could get the contact info of the engineer I had that conversation with, but somehow I have no history in my (rarely used) FB account anymore.
@userlandia On another note, I am quite baffled that the IIGS only had one graphics page for SHR graphics. The video overlay card, which uses the exact same chipset as the IIGS, has two pages for the frame buffer and can do 640x400i with 4 colors. I am pretty sure that Apple management got involved in this case and wanted to prevent the IIGS from having better graphics (if only on paper) than the Macintosh 🙂
Indeed, I appreciate you offering your insight. Perhaps that conversation was in pre-Messenger days, or on a public post for a profile that no longer exists... alas.
Re: the SHR graphics, I can believe that there might've been market positioning (as the euphemisms like to say). The overlay card came out a few years later, though. I've never used one, so I can't speak to its technical capabilities or any hardware tweaks made for it. The VGC chip on the card is a -C variant (vs the -2 on my ROM 01 and the -4 I've seen on ROM 03). I'd be curious about the differences between these chips.
Before the internet, I'd heard the IIGS was supposed to be faster, but that the big reason was Apple pushing the Macintosh, which already seemed plausible from its use of color and ADB. I can tell you that once the internet came around, rumors and facts got added, mixed, and spilled all over the place, making a mess of simple facts. Some things I'd thought were true turned out not to be, like how much the Macintosh II was marked up: way more than what I'd heard throughout the years.
Archaeological rabbit holes like this are very much appreciated.
_Tack_ for the super thanks!
@userlandia I somehow have this system in its boxes, and I got a few items signed by Steve Wozniak that I put in there with it. I also have an Apple IIGS factory video from when Steve Wozniak was still working there; it seems rare, and it isn't online.
It's always a victory for the public when somebody actually goes and does the research behind computing 'urban legends.'
You went and uncovered the business, engineering, and manufacturing processes behind why the IIGS was the way that it was. In this, you have made a contribution to the history of computing.
With the popularity of retrocomputing, there are a lot of 'what-ifs' or 'might-have-beens,' and thanks to you, we're now further along in understanding the IIGS. It makes you wonder 'what could have been' for an Apple II or Commodore 64 successor computer. Except that you've shown that the 'card table' 65816 design just wasn't going to meet the needs of those companies.
If a 4 MHz chip had been available in 1985, it would have affected not only the Apple IIGS, but likely the Commodore 65.
I guess since none of the proto-ARM or other blue-sky designs would have been known to / accepted by Apple, they would have had to get a faster-clocked 8-bit processor [6502-derivative], if possible, and mate it to a 16-bit graphics chip, like the TurboGrafx-16 or late Amstrads did. Could an 8/16-bit Apple or Commodore computer have been successful, if released early enough?
Other than that, the 68000-based computers became the real Apple and Commodore 'successors,' even if they weren't backwards compatible. Maybe a C64/Apple II on a card for the 68000s was the answer.
It's sad that we don't get to see what the IIGS could really do, even stock, but also with accelerators. There is a lot of software that pushes the C64, ZX, and to some degree the Apple II to their limits: modern programs designed with decades of experience beyond the 80s. But this doesn't exist much for the IIGS.
A few places to go from here: Could Commodore have released a worthy 8-bit successor to the C64 while demand existed? Your inclusion of the Foenix was also fascinating. How much does a system like that need to go beyond 1980s limitations in order to have the performance that it does?
Anyway, well done. It's better to have one video looking at the facts than 80 that speculate on the IIGS' limitations.
> How much does a system like that need to go beyond 1980s limitations in order to have the performance that it does?
Good point. I see how the IIgs board has an old side and a new side, and how much effort is needed to maintain old quirks. So, given how the IBM-compatible PC is piecemeal upgrades over the decades, I wonder how much is still holding the PC back?
When people say "The Playstation 5 is just a windows PC" they're wrong because consoles come from a clean sheet design, but HOW wrong, exactly?
Not a topic I have much interest in, but you told the whole story so well that I watched the entire video. Great work.
I worked on the //GS development team from the early days as a test engineer. The reality is the Mega // chip couldn't support higher-speed processors. The cost jump for the higher-end '816 was not worth the risk. So the Mega // silicon was limited to 3.5MHz and then 2.8MHz.
Can you square this with the fact that the Mega II runs on the 1 MHz side of the bus? Every hardware reference I've seen says the CPU must slow down to 1 MHz any time a Mega II access cycle needs to happen. That's also why it has its own set of Slow RAM. So the Mega II side shouldn't be dependent on how fast the CPU runs because the CPU always slowed down to 1 MHz on cycles when it needs to access that chip. The Mega II operates on the phi0 clock cycle, not the phi2 of the CPU.
Do you mean the FPI chip? Any thoughts you could provide (especially related to comments relayed by john_ace earlier in the comments) would be interesting.
@userlandia I should have made that more clear. The Mega // is limited to 1.024MHz bus transactions, so running a CPU faster than about 3.5 MHz has almost no benefit, because all the system I/O is moderated by the Mega //.
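To make the diminishing-returns point concrete, here's an editor's back-of-the-envelope sketch. Only the 1.024 MHz figure comes from the comment above; the 20% I/O fraction is an illustrative assumption, not a measured IIGS workload.

```python
# Effective speed when a fraction of accesses is pinned to the
# Mega II's 1.024 MHz side; io_fraction is an assumed workload mix.
SLOW_MHZ = 1.024  # Mega II / legacy I/O side

def effective_mhz(fast_mhz: float, io_fraction: float) -> float:
    """Harmonic-mean speed when io_fraction of accesses run at SLOW_MHZ."""
    time_per_access = io_fraction / SLOW_MHZ + (1 - io_fraction) / fast_mhz
    return 1 / time_per_access

for fast in (2.8, 3.5, 7.0, 14.0):
    print(f"{fast:5.1f} MHz CPU -> {effective_mhz(fast, 0.2):.2f} MHz effective")
```

Even a 14 MHz CPU nets under 4 MHz effective with a fifth of its accesses stuck at 1.024 MHz, which is the point being made here.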
It's interesting how much effort went into raw backwards compatibility. The Rosetta idea allows for much easier and cleaner hardware upgrades; it's a shame they didn't go the 68000 + software Rosetta route back then.
"Should" must be the most commonly used word in all of computing. Hardware, software, everything. They were using it back then and we're still using it now. It's so... freeing.
I work in IT and always say *should*
Should not the C64 accelerator SuperCPU running a 65816 at 20 MHz have gotten an honorable mention?
@NuntiusLegis The CMD SuperCPU was left on the cutting room floor, but I'm well aware of it since it was produced by Creative Micro Designs. They were located in East Longmeadow, Massachusetts, and 20-ish years ago I used to work down the street from their old offices. They were closed by then but I still found it to be a fun coincidence when I drove by on my commute. As a western Massachusetts native I always have to tip my cap to fellow members of the 413.
That said, I didn't mention it because I've never actually used one. The SuperCPU uses the 14MHz Sanyo chips overclocked to 20MHz, and I don't know enough about its caching architecture (let alone SuperRAM) to speak to it on a technical level. And this thing was long enough that it was cut for time (like Project Avatar). The life of CMD is worthy of its own story, with a lot of sore points for Commodore enthusiasts and its ownership alike.
10,000 BC
“This new wheel thing you designed, it’ll stay on that cart thing and not fall off killing anyone, right?”
It should… But given the budget you gave me, the unrealistic development timeline and the lack of any testing you demanded, I can’t guarantee anything.
@c1ph3rpunk Show us this... "the wheel!"
Grew up with the IIgs and it was an utterly phenomenal machine, kept upgrading it for ten years until I graduated high school.
I never felt like the machine was that slow, though I always felt that the pokiest thing was the drive IO. Solved that issue with a SCSI hard drive later on.
When I was a teenager with an Amiga in the 1990s, I always noted that on video game boxes, the Atari ST, Apple IIGS, and Amiga screenshots were always very similar. It was only much later that I learned that, while the Apple IIGS could certainly generate beautiful images, its speed with moving sprites around and animations really hobbled it.
The IIGS lacked a blitter and depended entirely on CPU grunt. There's no double-buffering for the video, either, which also limits options for animation. It doesn't mean you couldn't do it; just that you had to do a lot of clever programming (or lean on accelerators). The Wolfenstein 3D port that gets shown around a lot is a marvel of programming by Rebecca Heineman and Eric Shepherd, but it really needs an accelerator to be playable. It was also heavily delayed, but that story's been well covered by Steven Weyhrich.
Had the IIGS gotten faster CPUs in its lifetime it probably would've been less of an issue, because IBM Compatible PCs largely solved the problem in the same ways (more raw CPU grunt and clever programming) until accelerated graphics became more common.
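For a rough sense of what leaning on CPU grunt alone means, here's an editor's sketch; the cycles-per-byte figure is an assumed cost for an unrolled 65816 copy loop, not a measurement.

```python
# Rough cost of redrawing the IIGS super-hi-res screen with CPU alone.
CPU_HZ = 2_800_000   # stock IIGS fast side, ~2.8 MHz
SHR_BYTES = 32_000   # 320x200 at 4 bits per pixel
CYCLES_PER_BYTE = 8  # assumed cost of an unrolled copy loop

fps = CPU_HZ / (SHR_BYTES * CYCLES_PER_BYTE)
print(f"~{fps:.0f} full-screen redraws/sec with nothing left for game logic")
```

Even under these generous assumptions you get roughly 11 full redraws a second before the game does any other work, which is why clever partial updates (or an accelerator) were essential.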
@userlandia Right... brute force with a faster CPU probably wouldn't have made up for the entire difference made by the Amiga's custom co-processors, but it would have helped. I should look for some footage of a IIGS using one of those souped-up accelerator cards. I think all I've seen footage of is a stock IIGS.
Yeah, the Amiga's custom processors were really special. They gave it a five-year head start on the rest of the industry. Consider the SNES again: it can do some really amazing stuff with a 65C816 as its CPU, but it's not a stock '816, and Nintendo's PPUs did so much of the heavy lifting. And, again, expansion processors too. But it wasn't in households until 1990/1991, well after the Amiga. The IIGS was limited by needing to support the older Apple II video modes, but given the timeframe under which the VGC was developed, I think they did all right.
Probably one of the best videos on the IIGS. Definitely helps to dismiss those other rumors/theories raised by other channels/creators that gained traction years ago.
The IIGS. For a while, about fifteen or so years back, I was all over this thing; a full-on rabbit hole. Thank you for covering my favorite what-if.
You know what you have to do, then: make your own new version of the Apple IIGS, just with faster clock speeds on the 65816 CPU (which can go up to 14 MHz), and sell it as a nice nostalgia computer for anyone to buy and enjoy. Or you could make a working Pip-Boy if you're too lazy to make a full-on better Apple IIGS 🤣
There were definitely design errors in the original Apple II designs related to timing.
The 6502's Phase0 pin was the clock input. This was inverted inside the 6502 and sent out on Phase1. Woz inverted that again for bus timing (primarily the read/not-write strobe), which was wrong. The correct signal to coordinate bus timing was Phase2, which was in phase with Phase0 but delayed enough for the chip to stabilize its bus. Or so memory tells me.
How do I know this? I had the honor of designing the computer portion of the hardware and writing all the firmware for a well received handheld consumer product back in the 80's.
To confirm my understanding of the 6502, I looked at the Apple II schematics, saw Phase1 in use, and beat my head against the wall for a while tracking down timing errors.
Live and learn.
Yep. And also, the 6502 clock doesn't need to be symmetric, nor should it be. The bus access part (PHI2 high) can be longer than the internal state setup part (PHI2 low). A modern 65C02 can do over 12MHz that way with fast SRAM. Back in the day it took a selected 6502, slight overvolting, and ECL RAM with level translation. And a 70W supply for the 32 2kB ECL SRAM chips 😅
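A small editor's sketch of that asymmetric-clock arithmetic; the nanosecond figures are illustrative assumptions, not datasheet values.

```python
# Asymmetric 65C02 clocking: size each phase to what it needs rather
# than setting both halves to the slower requirement.
T_BUS_NS   = 55  # PHI2 high: memory access window (assumed SRAM speed)
T_SETUP_NS = 25  # PHI2 low: internal state setup (assumed)

symmetric_mhz  = 1e3 / (2 * max(T_BUS_NS, T_SETUP_NS))  # equal halves
asymmetric_mhz = 1e3 / (T_BUS_NS + T_SETUP_NS)          # sized to need

print(f"symmetric:  {symmetric_mhz:.1f} MHz")   # ~9.1 MHz
print(f"asymmetric: {asymmetric_mhz:.1f} MHz")  # ~12.5 MHz
```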
While the 65816 did have yield problems and REP/SEP problems, the key reason the 8MHz and later 14MHz parts went unused was not the CPU but the DRAM. In the '80s and '90s, DRAM made up about half of a computer's cost, so the Woz GS had only 128K of fast DRAM and 128K of slow DRAM in order to meet the $999 price. But that meant it needed upgradable memory, which meant a memory expansion slot. The memory expansion card needed buffers to fan out the bus to the up-to-4MB of DRAM chips. The FPI/CYA decoded the address bus during the 140ns of PH0 low, then drove the RAS/CAS lines through the buffers to the memory card. It is no accident that the DRAM response time from the memory card required 210ns of PH0 high, 50% longer than the PH0 low time. A faster 65816 would have required a faster FPI/CYA, faster DRAM, lost compatibility with existing GS memory expansion cards, and ultimately needed a lot of external cache circuitry like on the Transwarp and Zip GS.
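Here's an editor's sketch of that timing budget, assuming the 140/210 ns phase split scales proportionally with clock speed (a simplification; a real redesign could rebalance the phases):

```python
# The FPI decodes the address during PH0 low; DRAM must respond
# during PH0 high. Scaling the stock 140/210 ns split shows the squeeze.
PH0_LOW_NS, PH0_HIGH_NS = 140, 210   # figures from the comment above
CYCLE_NS = PH0_LOW_NS + PH0_HIGH_NS  # 350 ns -> ~2.86 MHz

for mhz in (2.86, 4.0, 7.0, 8.0):
    scale = (1e3 / mhz) / CYCLE_NS   # shrink both phases proportionally
    print(f"{mhz:4.2f} MHz: decode {PH0_LOW_NS * scale:5.1f} ns, "
          f"DRAM window {PH0_HIGH_NS * scale:5.1f} ns")
```

At 7-8 MHz the DRAM window collapses to well under 100 ns, which affordable DRAM of the era couldn't meet without a cache in between.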
Thanks for more DRAM timing context. When I was writing the section about the issues with faster chips needing faster, more expensive DRAM (or SRAM cache), I didn't want to get too far into the weeds for pacing reasons, and I felt my on-screen footnote addressed it adequately, but the phi0 cycle time is important additional context that I should have put in said footnote. I did mention the FPI's cycle times earlier but didn't sidebar into those, because I felt the tech page I posted was good enough for those who wanted to pause.
"How Woz would have built his 8MHz IIGS in 1985 while affording a decent amount of memory is a rather inconvenient question." was my way of summarizing a bunch of technical minutiae like this.
@@userlandia It's amazing to me how few people understand the enormous effect that RAM price, speed, and technology (SRAM vs DRAM) had on all of the computers and game consoles in the 1980s and early 1990s.
CPU manufacturers were boosting the clock speed of their chips far more rapidly than RAM manufacturers were able to (affordably) increase the speed and density of memory.
People who complain about Apple limiting the clock speed of the IIGS might wish to take a look at the cost/design compromises that had to be made for NEC to use a 7.16MHz 6502-variant in the PC Engine console in 1987.
The 8-bit external data bus of the '816 was a limiting factor, and I find it strange that they didn't go full width. It would have doubled the throughput...
@userlandia That would've also put the ][GS in the same pickle that the Macintosh was in. While it may have been technically possible to produce a machine with a faster clock and RAM, it wouldn't have made much business sense due to the costs and sourcing, and that would've cratered the Apple ][ revenues. Always compromises.
@johnbrandwood6568 lol, I think you could've reduced the video down to this one statement, although I am absolutely fascinated with the detail in it. I owned a ][GS at this time, and I remember having to save up multiple paychecks from my various jobs at hardware (Inacomp) and software (Babbages) shops for months to afford the memory expansion card and the chips that populated it. Yes, you had to buy them separately and hand-install the chips on the board. In the end, the extra RAM wasn't of much use, since hardly any software took advantage of it. Except I believe AppleWorks was snappier with it than without, since it didn't have to hit the drives to page the app if you loaded the files into a RAM disk.
Apple in the early '90s was wild; in the span of 5 years they worked with the 6502, 65C816, 68k, and eventually PowerPC.
Don't forget ARM, m88K, Intel (Star Trek), their Aquarius project... and the Macintosh Application Environment for UNIX ran on SPARC and PA-RISC too. Any others out there?
@userlandia Do any of the Z80 cards for the Apple II count? Not sure if Apple ever played around with the Z8000, the NS 320xx, or the Inmos T800s.
@thejeremyholloway Those wouldn't have been in the five year period that @tcscomment was talking about (I assume 1988 through 1993). Those predated that era, but we'll let them join the party anyway. Oh, and the Hobbit too.
They also had a run with Intel x86 CPUs in those dual-system Macs in the early '90s. What was it, the Performa 630 with an i486DX2?
Weren’t the Apple II Z80 cards produced by Microsoft?
Speaking of megahertz: I've managed to overclock a W65C816 to 39MHz. And Plasmo, over on the 6502 forum, has managed to cross the 40MHz mark, clocking one stably at 41MHz.
Heyy, there's me and my IIGS at 48:40 at VCFMW! Playing Puttin' On the Ritz, and the only way it could do that in that program was by accelerating the snot out of it and having a RAM disk to read from, because that file is about 12MB in size and the CPU just can't move the data from the SCSI drive fast enough.
As someone who only glances into the retro computing landscape, I appreciate the debunking myth videos and articles!
Actual investigative journalism. Well done. Thanks for closing that wound.
It's incredible to me that you don't have more subscribers - another high quality video, you do a great job presenting a very long and complex story!
Thanks for taking the time to get to the bottom of the story. Great video.
The quality of content on this channel is insane! Thank you so much! As someone who was born after the Apple IIgs launched, reliving the experience through videos like these is incredible!
But things like 12:50 _"The IIGS, like many computers of its era, derived its base clock from a crystal oscillator"_
So unlike computers of today, or wtf? :D
CPU and system clock derivation was a lot cruder back then; most systems relied on a single base clock for everything (usually linked to video timing). There was a separate crystal for the RTC in the IIGS, but every other clock-based circuit relied on that single oscillator. Nowadays we have multiple clocks, there are various PLLs, a circuit's crystal might be embedded in a chip, et cetera... Obviously we still use oscillators, but the architecture is very different. Odds are you know all that already and are just needling me for not taking a five-minute detour on how clocks were generated yesterday vs. today, and I accept it in the spirit of needling that it is. ;)
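An editor's sketch of that single-master-clock scheme: the divider chain below is simplified (real hardware derives these in stages), but the ratios to the 14.31818 MHz NTSC reference hold.

```python
# Everything in an NTSC-era machine cascades off one reference clock.
MASTER_MHZ = 14.31818  # 4x the NTSC colorburst frequency

clocks = {
    "CPU fast side":        MASTER_MHZ / 5,   # ~2.864 MHz
    "NTSC colorburst":      MASTER_MHZ / 4,   # ~3.580 MHz
    "Mega II / 1 MHz side": MASTER_MHZ / 14,  # ~1.023 MHz
}
for name, mhz in clocks.items():
    print(f"{name:>20}: {mhz:.4f} MHz")
```

This is also why you can't simply swap in a faster crystal, as comes up later in the thread: move the master and every derived clock, video included, moves with it.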
Fantastic video! Great research and production quality. You've earned a sub from me for sure!
I worked one summer and used the money to buy an Apple IIGS. It's still in my closet complete with a Transwarp GS, RAMFast SCSI and two hard drives (one of them even still works).
I was thinking I was the king of the hill when I maxed out my OctaRAM board with 8MB of RAM!
I did most of my programming homework using the ORCA compilers. Ahh the memories....
I love it when people come out with their stories about their tricked out GSes. You should bust it out of the closet and set it up in a place of honor! Or at least make sure the PRAM battery is removed. :)
@userlandia I'd love to give it a place of honor in my back room but, alas, I don't have the space right now. It would be really fun to try and get it on the network too. I understand there are some products I've missed over the past few years that would be really fun to have.
Thanks again for the video. I don't think I've ever seen a NEW video about the Apple IIGS.
Incredible compilation and story about the Apple IIGS and the 65C816 🤩
The importance of EDA tools cannot be overstated. It all started going downhill when the engineer proudly designed the masks by hand instead of relying on EDA. Even a rudimentary STA (static timing analysis) would have caught the critical path and prevented this mess.
Another important piece of evidence for the end of the Apple II line is that Applesoft was the result of Apple using Microsoft's BASIC, licensed for 8 years and then renewed for another 8. In the early 1990s, Apple and MS were at odds, and there was no chance of another license extension. The looming 1993 end-of-license explains Apple's lack of hardware upgrades in the 90s.
To be frank, someone could just white-room reimplement that BASIC in a couple of months, and probably end up with a faster BASIC. MS was a bit of a scourge on BASIC performance. It was not great. But it did maintain compatibility across platforms, at least for the core statement set.
Thanks. As a UK based computer abuser I can't claim to have had any experience with an Apple II so this in-depth look at the II GS is appreciated.
Holy crap! Why does this guy only have 6K subscribers?! What a video 🙌
Tell your friends! ;)
How you don’t have 100k+ subs is beyond me.
At least I can say I was here for the early days. :)
You can help by telling your friends! And I appreciate every subscriber, whether they were the first or the newest.
At the 9:10 mark, the same reference you are quoting states Jobs didn't want the Apple II to encroach on the Macintosh. So even if he was not at Apple when the IIGS launched, his philosophy still permeated. If Apple had been really serious about the IIGS, they would have done more to advance it.
Good thing that's not the only source I'm quoting, and not the only perspective I discuss about the political problems around the IIGS, including Jobs' thoughts. It's obvious that Jobs was focused on the Macintosh, and I say as much literally two minutes later. But that didn't mean he or others sabotaged the CPU speed, as the legend goes. I'm not here to defend Jobs so much as to say "Hey, this is very complex and it can't be boiled down into a pithy quip about Jobs not liking the Apple II." There has to be room for nuance.
Part of the point of this is that you can't take a single source (like the tell-all books) or even a first-hand source (like Bill Mensch) as gospel, especially if you can't see their primary sources and vet them. That quote I highlighted was in March 84, about the timeline of the IIx's cancellation. At the same time Apple was releasing the IIc and Jobs was a big driver of that project. By the time the IIGS proper got going in late 84, Jobs was embroiled in the Macintosh Office and would be out the door shortly after. Most of the decisions around the CPU speed would've taken place after he was stripped of power in spring 85. That's just based on the timeline of when the El Grando boards were created and when the custom chips went out.
But even if Apple wanted to push the IIGS harder, they literally couldn't at the time for technical reasons, which I've exhaustively detailed. And by the time some semblance of faster speed was available, it didn't make business sense. Even if they, say, dumped a lot of money into WDC, would Mensch and company still have stuck to hand layout? His goals may not have aligned with Apple's goals. Obviously there's all sorts of what-ifs going on here, but architecturally the Apple II is a brilliant piece of work for its time that was going to be replaced at some point-either by the Macintosh or by some other machine.
There's a whole story about business management with regards to computer platforms that isn't exclusive to Apple that's beyond the scope of this video, but suffice to say that the Apple II wasn't the only victim of bad management and customer neglect. The IBM PC compatible was only able to carry on like it did to the modern day because Intel did the hard work with the 386. But that's a long, extensive topic that can't be addressed in a youtube comment.
One of the best and most well-balanced looks at the development and short life of the IIGS. You articulated well what I have been arguing to a wall in a few IIGS and Macintosh groups.
He crashed a Bonanza? I've heard they call that the Doctor Killer, because a good number of rich doctors will buy one and crash it; it can be tough to fly and they don't take the time to learn it well. That's the model of plane that Buddy Holly, Ritchie Valens, and the Big Bopper were on.
Yup. An inexperienced pilot in a very powerful and sensitive aircraft is a recipe for disaster. There were a lot of factors in the crash: Woz was a freshly minted, inexperienced pilot, had no IFR rating, lacked the endorsements necessary for that type, definitely misjudged the runway/airspeed he needed to take off, probably got his weight and balance wrong, etc... He's lucky he didn't face more severe legal penalties. It was probably a contributing factor to Sky Park closing 18 months later, but small airports like that were closing in similar fashion during that time period.
If you want to dive into a deeper rabbit hole allegedly related to the 65816, try researching its connections to the failed Atari/Synertek 6510 CPU project from the late 1970s. No, not the later MOS/CSG 6510 used in the C64. I’m talking about the Synertek 6510… a 6502 with 16-Bit extensions.
I actually did some research on the Synertek chip back when I was writing the Apple IIe episode of Computers of Significant History, but I couldn't find a way to work it in organically, so it got cut. It's another absolutely fascinating case of "what if."
Fantastic vid and great research.
The problem with the 65816 is that it's too closely coupled to the 6502. It should have been a true evolution, with a full 16-bit data bus and a true 20- or 24-bit address bus, in a package similar to the 68K.
Wonderful video, and some of my assets were used (and credited, thanks) within. I must disagree only with the Amiga 1200 analogy; it was far more of a leap over previous Amigas in its target-user slot than the Mark Twain was over the ROM 3 GS (32- vs. 16-bit system bus, 68020 vs. 68000, and a new graphics chipset, if a fairly modest improvement; they did offer "Productivity" modes, which were higher-res, non-interlaced modes for "serious work").
Hey, thank YOU for posting your photos under a permissive license. Credit isn't just good, it's the right and just thing to do. I'm a photographer too, I know how it goes.
Also, I agree that the A1200 was a greater technical leap than the ROM 03, but it wasn't ENOUGH of a leap to stave off PCs, which were making bigger improvements at a faster rate, especially considering that Commodore was starving R&D and it wasn't as good as it could have been. My point was that the A1200 wasn't enough to stop the Amiga's decline, much like the Mark Twain, even with a speed boost, wouldn't have been enough to turn the IIGS' tide. Now, if the A1200 had an 030 and a full AAA chipset...
Although I do think that if the Mark Twain (or the ROM 03) had a speed boost, it could have wound up like the A1200 in another way: it might've split the market for software. But I can't say for sure that's what would've happened.
Thanks for commenting and for your work!
The amount of research and quality production in this video is exceptional. You answered for sure that nagging question we all had for years: whether the gimped clock speed was deliberate for reasons other than technical ones. Thanks and keep up the good work.
Thanks for the super thanks!
Really nice work Dan. You had to go deep into archives for some of those bits of info! Really impressive!
Amazing work! So glad I ran across this in my recommended.
This channel is amazing; I would never have found you in podcasts had you not moved to video. Such fun nostalgia from old times. Now I wish there were weekly videos. I'll just have to not binge them all at once.
It amuses me that when Woz went back to finish his computing degree, he probably did it on the computer he designed :)
Good to see that the algorithm sometimes does get it right. PC history rabbit holes don't get proper deep dives; instead you typically get stuck in Wikipedia loops that leave you disappointed. So I am happy to see proper deep dives like these sometimes make it into my suggested links. Instantly subscribed.
Very interesting! Just wanted to say I LOVE the additional (visual) footnotes in the video, like in every good book. :-D And I also love the long documentary format. This is not a random video of the week; it's for the archives.
Acorn, a British computer maker, went to WDC to buy the 65816 CPU. They were not impressed at all, so much so that they decided to make their own CPU, and they went 32-bit right away with the Acorn RISC Machine CPU, better known as ARM. But another fun Acorn fact: they made machines which could take co-processors. Their main machines, the BBC Model B and later the Master, could use 80286, 68000, and Z80 co-processors, and even got an ARM CPU board; that is where the first example fabbed for them was tested. By accident, it basically ran on the electricity that came from the oscillator and test equipment. They forgot to power it on, but it was so efficient that it still ran on what was basically static electricity.
In the book iWoz, Woz mentions the IIx briefly but shares no info whatsoever, then disses the IIGS. It was fun seeing that Apple was trying to make a system like this; sadly it failed, but it's still cool.
I am still impressed with the 65816 because it is beautifully backwards compatible.
As the video and accompanying blog post point out, Acorn did actually use the 65816 in the Communicator, which was aimed at certain kinds of business applications. That choice seems to have been largely determined by constraints imposed by existing projects and the financial state of Acorn at the time. The part used appears to have come from GTE, judging from machines that are still around.
The Communicator used the ULA from the Electron, and it is in various respects a follow-up of earlier efforts to augment the Electron with an expansion for use in communications applications. Adopting the 65816 presumably allowed the team responsible to evolve an existing approach, although they did bring in the original architect of the MOS to do a 65816 version. However, the division making the Communicator was pretty much on the margins at Acorn and was eventually eliminated in cost-saving measures, seeing founder Chris Curry's departure.
Another bit about the Communicator that I didn't mention in the text: it used 2 MHz '816s, which, given its development and release timeframe, lines up with Dave Haynie's difficulties sourcing 4 MHz chips and GTE's plentiful supply of 2 MHz chips in 1985.
@NuntiusLegis John Romero interviewed Nasir Gebelli for a few hours (www.youtube.com/@appletimewarppodcast1264), and Nasir said something like he had hardly touched the 16-bit instructions when he was coding Secret of Mana. He just took his NES skills and went on to the SNES.
Such a great, well produced documentary. Thanks for creating it.
Based on this video and the research needed to create this video, I will never blame Steve for the Crippling of the Apple IIgs again. Thanks for setting us all straight on this.
Wonderful video. Of course, it was an insta-click for me as a IIGS enthusiast (my first computer!).
Awesome deep-dive into the 2.8 MHz myth - thank you for finally clearing this up! And agreed that the IIgs is still a great computer, especially with all the new peripherals available for it such as the AppleSqueezer accelerator, Uthernet II card, LiteSound stereo card, A2FPGA Multicard, and a bunch of HD emulators.
Greatly underrated channel. Awesome documentary research and storytelling!
Great job digging up all this information!
Just found your channel a few days ago and finally completed this video. Wow! What a great video essay.
Thank you instead for what I totally did not expect at the end, the ARM/Acorn excursion. Loved seeing that Communicator. Those Arcs and even that Master Compact.
Insanely great work! Well done.
Very nicely done. The take-away is that it suited Apple NOT to ensure the IIgs was more successful than the Mac. The 8-bit Guy has done a great thought experiment about what would have happened if Apple had pursued continuous improvement of the Apple II rather than the early failure that was the Mac. Apple nearly sank themselves with those terrible computers!
Not... really? The 65816 was a bit of a dead-end as a desktop processor. Even if higher speeds were possible in the quantities Apple worked in, the need for synchronous RAM would have crippled the GS's price at higher speeds. The 8-bit-guy claiming the lower cycle counts of the 6502 and 65816 were 'better' than the 68000 papers over this problem, since now it's trivial to get DRAM or SRAM at speeds and quantities unimaginable in the 1980s. But the 68000 could handle RAM at a range of speeds decoupled from the processor clock, something that we just take for granted these days.
First video where I've seen myself quoted... from 34 years ago. :) - andy
Your posts were one of many clues to the puzzle, Andy. If you have any more insight, I'd love to hear it. :)
Glad to be of service, but honestly my memory of that time is very limited because of the long gap. Unfortunately, I don't have archives which would help. A canonical authority for that time period (1989-1992) would be Dave Lyons who likely has either knowledge or archives.
The clock speed considerations are much the same with the ZX Spectrum, ZX81/80, and other computers from the era.
The 80's were a fantastic time of innovation and step-change improvement in computing. Thank-you for a fascinating deep dive!
What a great documentary! Super interesting. Thanks!
Good that this got into my recommended, great channel
Impressive narration!
Excellent piece. Genuinely interesting.
That was a good piece of retro hw journalism. Great job and keep it up!
What a fantastic video. I have a passing knowledge of the Acorn / BBC side of this (over in England the Apple II wasn't really a big thing) so I learned a lot.
Well researched and professionally created, I don't understand how you aren't a 100k+ subscriber account.
1:21 I've been racking my brain for the name of this game for weeks! What a happy coincidence to see it here! Thanks!
Solving mysteries, one clip at a time. 😎
Great video! lots of in depth analysis!
Terrific video! Lots of great IIGS information!
Great video! I was a little uneasy about checking my own posts about the Apple II series, in case I'd helped perpetuate the megahertz myth. Thankfully I had not.
I loved the Apple II line, I even saved up and bought a Laser 128 (//c clone), which was a mistake at that time in my life. The IIgs was really neat. But while the II line was great for hackers, it wasn’t strong as a consumer device. I hated the pre-NeXT Macs, but they did have their place, they just weren’t targeted at me.
Great video; pretty convincing research and argumentation. It delivers pretty conclusive answers to questions I had about the 65816 as a former 6502 guru-wiz, because I just never did real work with that CPU.
Regarding the 68000 user/supervisor mode: that privilege separation was kinda pointless without an MMU to go along with it. Let me skip the details, but the 68000 was basically broken for use with MMUs and required some expensive kludgery to bolt one on. Plus there wasn't really a standard MMU. The 68020/68851 combo was in theory viable but expensive, and the 68851 was pretty broken. Plus some vendors, such as Sun, were using their proprietary MMUs with the '020. Things started becoming sane only with the 68030 and its built-in MMU, but for all but the most hardcore 68k vendors that was already the end of the 68k architecture. What I credit the 68k architecture with is that it was hands down more elegant and programmable than x86. At the same time, writing optimized code for it was more demanding than for the 6502, due to the many ways things could be expressed in assembly code and the complexity of execution times, something that only got more complex with the 68020. Ironically, RISCs were meant to make code generation easier for compilers, but with a little help from tools they also made it easier for programmers. The orthogonality of the instruction set did also help, unlike on the 68000, where things were just a slight bit different between address and data registers.
Re: Dave Haynie - some people never change. It's a long time since I met him, but he didn't change much between then and the video.
Thanks for your insight! Your point about user/supervisor separation not being helpful without an MMU is well taken, which is one reason why the Mac didn't use it (Andy Hertzfeld talked about this in some places). I guess the point I was making is that it's an example of the 68k architects thinking of future needs, and how its audience/goals differed from something like the 65816.
@userlandia Absolutely!
And I agree about the 68k architecture looking ahead. I think the function code lines are one thing which shows they were thinking ahead. And the 68010 fixed the worst sins. And if you look at the instruction encoding (note, I have not fully researched this), there would have been space for a 64-bit variant.
Thank you for making such a great video. I always believed that myth about the IIgs had to be nonsense, but your research proves it.
Magnificent video, an immaculate documentary. It must have taken a dog's age to do all that research. Well done indeed.
The Apple iiGS was a sort of dream machine for me. We had Apple IIes in school which I enjoyed. When the GS appeared, it seemed like a miracle. Alas I never did get my hands on one, then or now, but thanks to brilliant productions like this I can imagine a world where I have my very own IIGS. Thanks!
Thanks for the Super Thanks!
That was my first new Apple computer. I bought it with a sound card and a dot-matrix printer with a snap-on scanner.
Did cost me an arm and a leg plus trading in my Apple IIe. But I loved it.
Made quite a case for the "other side" here! A lot of stuff to think about!
I made a conscious choice to forgo the Mac and get the Apple ][GS simply for the color and Ensoniq sound chip alone when I was a kid. The amount of money spent on this system was enormous for that time. When I got home, unpacked it all, set it up, and booted... and waited... waited... waited... and the desktop finally came up, I realized two things. One, I had to purchase a hard drive system for this beast and could not rely on the 3.5" floppies to boot off of and run programs on this puppy. Second, and something I had no control over, the CPU was a dud out the door!! Sure, it ran games like Conan (Apple ][ users know what I am talking about) at lightning speed, because the game had no timing system built in and relied simply on the known fact of the ]['s clock speed for game timing, but everything else, including boot, was monotonous. When two hardware developers came out with clock-multiplier add-in boards I was ecstatic. The Applied Engineering TransWarp board was hot on my list, and of course the hard drive and this board meant much more money!!! In the end, it was worth the money for both upgrades for my BBS.
Excellent research! Thank you!
G'day from Sydney. Nice work. Subscribed.
Cheers, mate!
Interesting...
IIRC, Acorn were also considering the 65816 for the successor to the well-known BBC Micro, but were frustrated by the delays, the price, and the chips being "a bit crap"... So they thought, "What the hell, we'll make our own CPU"...
That CPU was the Acorn RISC Machine - the ARM1...
BTW, the original BBC Micro didn't have to slow its CPU down for RAM access. Acorn sourced 4 MHz RAM for the 2 MHz 6502-based BBC Micro, so the video and CPU could access the RAM on alternate clock cycles.
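An editor's sketch of that interleaving arithmetic, with nominal (assumed) figures:

```python
# If RAM answers twice per CPU cycle, video fetches hide in the
# half-cycles the CPU isn't using, so the CPU never has to wait.
CPU_MHZ, RAM_MHZ = 2.0, 4.0
cpu_cycle_ns = 1e3 / CPU_MHZ  # 500 ns per CPU cycle
ram_slot_ns  = 1e3 / RAM_MHZ  # 250 ns per RAM access

slots = int(cpu_cycle_ns // ram_slot_ns)
print(f"{slots} RAM slots per CPU cycle: one for the CPU, one for video")
```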
It would be interesting if backwards compatibility became such an important feature that they kept extending the architecture to 32 and 64 bits. Perhaps in an alternate universe there are Gen Alpha kids breaking out a 5.25 floppy to play Number Munchers in compatibility mode on a modern computer.
WDC announced but never made a 65832. Based on the preliminary data sheet it would've been even more of a PITA to program than the 65816, and generally too little too late.
@ischmidt I'm sure it would have been an even bigger nightmare.
@ischmidt Yeah, the 65C832 was a ridiculous design with no benefit. It had no new instructions, just yet another processor status bit to double register sizes. So you could add/subtract 32-bit numbers, but still no multiply/divide. And everything squeezed through that 8-bit data bus, so those 32-bit operations would take extra cycles. The thing wouldn't have provided any practical benefit, which is why they found no customers for it.
@tim1724 Yup. Mitsubishi's 7700 series microcontrollers were a more reasonable 65816 extension: 16-bit data bus, multiply and divide, and no emulation mode (but still M and X mode bits).
You'd have to be crazy to hate the Apple IIGS? Dude, you know that there were much, much better home computers before this, right? The Amiga, for a start, in 1985. Apple didn't even have an operating system; it had a simple program loader, the simplest software a computer could have. The Amiga had a full, pre-emptive multitasking operating system, like Unix. It took Apple another 16 years to come up with that, and it was a Unix clone from the old NeXT OS. And here you are trying to make a case for a IIGS in 1991. Having said that, I appreciate your history road here! And as an old computer club newsletter editor (and president) from the '80s, I appreciate your inclusion of old newsletters! Those are near and dear to my heart. 99% of people never know what went into creating those 40 years ago! Nice Dave Haynie references.
That line is actually a reference to one of my favorite Dexter's Laboratory bits: "Not Justice Fruit Pies, the delicious treat you'd have to be crazy to hate!" Look up Justice Fruit Pies on YouTube sometime for a laugh.
The Amiga might have been better in many ways, but that doesn't mean we have to hate things! There has come a time in my life where I have learned to love all computers. ;)
The IIGS did have a full operating system, GS/OS, and while it lacked any kind of multitasking it certainly had a toolbox-based development setup and system management that would be recognized as an actual operating system. Which is pretty impressive for what it is, and one reason I find this machine endlessly fascinating even though I didn't own one back in the day.
A faster IIGS in 1991 would've been a decent enough machine (certainly would've blown my Commodore 64 away), but I think I made the point that it wouldn't have reversed the system's fortunes.
Awesome video! Now I want to go home and play with my IIgs ;)
Good to see the good old VAX 11/780 being referenced
Excellent work! It also makes me wonder about my assumptions regarding decisions made by the other great 1980s computer builders, Commodore, Sinclair, etc., where previously I'd assumed bad management and costs were the fundamental reasons behind some of the failures to produce better machines.
I've found the answer is almost always "a little from column A, a little from column B." Although in the case of Commodore I'd lean more towards the column B of "Management." :)
For people who don't have access to magazines (I collect old tech magazines): the Macworld of April 1989 shows that the Mac II was already old, and there were already PCs running 32-bit microprocessors at 33MHz. The Mac IIx was running a 68030 at 16MHz (15.667) with a 68882 math coprocessor and 6 NuBus slots.
Amazing video. I believe the evidence you cited supports the rationale underpinning your argument. Nicely done!
On the subject of "should have", I have a rather different take on things, but of course this is all with the benefit of hindsight.
I'd love to have seen Apple acquire the Amiga division from Commodore. There were a number of what I think are interesting architecture designs inherent in the Amiga which possibly would have made for a much more powerful computer, and much closer to modern-day computing (that is, color, native multimedia integration, etc.) than what Apple had going for it with the Macintosh, and as a consequence would have left the 8088/80x86 platform in the dust.
That was my first computer, in 1988. I added RAM to it, but I was not careful, and I broke it with static electricity. I was dumb for not buying a hard drive for it, and I had to swap floppy disks all the time.
Everyone likes to say the 65816 is fast per clock compared to the 68000, but what happens when you want to dereference a 24-bit pointer? That's really slow on the 65816, same as dereferencing a 16-bit pointer on a 6502. The 65xx architecture was great for its time, but it was more or less specialized for microcontroller-type applications where you use a few fixed addresses in a small memory space. Once you have more memory and try to do relocatable code, work with lots of structs, etc., 65xx really falls flat.
I didn't want to get too far into the weeds on the architectural differences between the two (especially because I'm not a wizard at programming either of them, and for pacing reasons), but yes, that's one of the cons I know of for the '816. The analogy I always think of is that the '816 is a nimble sports coupe versus the 68000's sport sedan.
I still think it's sad that the charm of the 6502 computers got cut off by most of the 16-Bit machines with zero backwards compatibility. As a C64 person, I would have loved a compatible 65816 progression, and envy the Apple II persons for the IIGS. I would have told Commodore to shut up and take my money without caring a second about some 68000 machine with slightly better benchmarks.
The 65816 also has far fewer registers than the 68000, which creates further problems for the "faster per clock" claim.
@Henrik_Holst I am not an expert, but the 6502 can address the first 256 bytes of RAM (the zero page) very quickly with a single address byte, which is considered the equivalent of 256 8-bit registers or 128 16-bit registers. If I see it correctly, the 65816 can have 65536 "zero pages" (called direct pages when relocated from the start of memory).
@NuntiusLegis Yes, you save one cycle over other parts of RAM: LDA from the zero page is 3 cycles vs 4 cycles for LDA from elsewhere. Register-to-register, however, is only 2 cycles, so the zero page is faster than other RAM but not as fast as the registers.
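Putting the commonly cited 6502 cycle counts from this thread side by side, as a small editor's sketch at a nominal 1 MHz clock:

```python
# Registers beat zero page, which beats absolute addressing.
CLOCK_HZ = 1_000_000  # nominal Apple II-class clock
costs = {
    "register-to-register (e.g. TAX)": 2,
    "LDA zero page":                   3,
    "LDA absolute":                    4,
}
for op, cycles in costs.items():
    print(f"{op:<33} {cycles} cycles = {cycles / CLOCK_HZ * 1e6:.0f} us")
```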
The fun thing about the 65C816 is that its architecture and RAM speed issues could have been largely mitigated by going to a wider memory bus, as ARM did, and by putting (especially) the zero page memory on chip, along with at least a page of the stack. And there was the one reserved WDC instruction, along with the COP instruction, that could have bootstrapped a wider, forward-looking instruction set and overall architecture.
The reality is that, with zero page addressing from the start, it was very much more of a RISC machine; had all that zero page been on-chip, it would have screamed for operations done with that addressing mode, being much like 256 bytes of CPU registers. And with a whole page of static RAM added for in-page code execution, as well as the other things, it isn't hard to see a logical path for the family to have kept growing, much like ARM has. After all, as node sizes kept getting smaller, the base 6502/65C816 compatibility mode would have become an insignificant portion of the die space used: it was, from the start, a CPU meant to use minimal circuitry and die space for its functionality, and that could have served it very well. It's not like the other CPU architectures weren't forced into more expensive ways of achieving performance anyway; that was going to happen regardless.
Based on research about the 65832, it was still going to use an 8-bit external data bus. I think Mensch prioritized socket/plug-in compatibility a bit too much with the '816 (and 65802) versus logical compatibility. Those choices can be defended, but they clearly prioritized a here-and-now viewpoint over a future viewpoint. It's easy for me, as someone who's not in the semiconductor business, to look back in hindsight and criticize those decisions. Whether it would've been more successful with a more aggressive redesign is yet another what-if.
ARM is a more elegant re-implementation of a lot of these ideas without as much of the 6502's technical baggage. If the '832 had been a real clean sheet with a more distinct emulation mode (a la real mode / protected mode), I wonder how it would've fared in the marketplace...
My school, when I started kindergarten in 1993, still had Apple IIGSes in the computer labs. In 1994 they upgraded to Macintoshes, then in 1997 went to Windows PCs.
Thank you for this fantastic piece!!!
Nice Sailor Jupiter rep on your phone!
I read that in 1980 Jobs himself approached one of their second source manufacturers of the 6502 to design a 16 bit version of the chip. The manufacturer wanted Apple to foot the entire bill, which Jobs declined.
Given the timing, it sounded very much like Jobs initially wanted the Lisa to run on a 16-bit 6502. Imagine how different things might have turned out if the Lisa had been based around an ARM-like chip and been at least partially Apple // compatible.
I don't have the materials on hand right now, but the Lisa did switch from some different arch in its pre-GUI days to 68k. I don't remember what that arch was, exactly.
This is a great video !!
Thank You.
Great video, I've subscribed. Thank you for covering the 65C816's part in the birth of the ARM processor. If the 65C816 hadn't been so disappointing, Acorn might not have developed the ARM architecture, and then where would we be?
65C864 perhaps :)
Wow, a really good documentary; I learned a lot! Darn, I can even see an F256K shown in there (I have one, of course, with a 65816). It's shown with the RPG demo I coded for VCF. Ha ha, I never thought I would see it shown anywhere... nice!
I did show it in my VCF 2023 video as well (which is where I first saw an F256K)!
The legend of Apple intentionally nerfing the IIgs always bothered me, on the surface it feels like the kind of story that's more popular than it is true, and it never appeared with anything compelling backing it up. A surface-level read on a story like that isn't going to be reliable of course, but generally the information I'd found seemed to support the idea that going faster just wasn't an especially viable option when the IIgs was being developed, or even for a few years after it was released. Thanks for the deep dive.
Them considering 5.5 MHz at one point is easy to figure out when you look at clock crystals and find there are crystals rated for double the speed of the one used in the IIGS. It makes you wonder if adding faster RAM to the main board and replacing the crystal with the double-speed version would produce results.
You can't just replace the crystal, because too many devices in the system have their clocks cascaded off of it, including the video generation. The system wouldn't work properly.
Comment on the article directly from the lead of the project (Hillman) you might find interesting: "A very long and mostly accurate article. The Brooklyn and Golden Gate were Apple /// projects which were meant to bridge the 8 & 16 bit world Those projects were killed because there was no interest in an Apple ///. The work morphed into the IIx, which became the IIGS. The clock speed was always a multiple of the color burst frequency (3.58 MHz) from the original Apple II."
Due to your username I gather you have some connection to Mr. Hillman. If you do, let him know he has my sincerest appreciation.
Never heard of this myth before this video, but consider me educated :)
Now I want to hear that 32-voice music chip given a workout. The few Apple IIGS games I've tried haven't really hinted at such a chip being under the hood.
I believe the way it's set up in the IIGS is that it's capable of 15 stereo voices using those 32 oscillators, but the system shipped with mono sound only. To get full stereo you needed a stereo expansion board (which mine has; it plugs into a header on the logic board). The Mark Twain would've had stereo output, according to the reports I've read about it. There's also the 64KB of sound RAM, which limited it a bit as well. But you can use SynthLab or mod trackers to let it stretch its legs.
Bumbershoot Software has written an excellent writeup on the DOC. Highly recommended reading if you want to learn more about the technical bits.
The IIgs version of Silpheed plays the PC Roland MT-32 version of the soundtrack from the same .MID files. It's quite well done.
Check out the IIGS demo called kernkompetenz.
I remember learning that the Super NES used an improved version of the same 65816 CPU, while the Sega Genesis used the 68000. I also remember my 7th-grade IT class in 1993, where the teacher's pet got the IIGS with the color monitor while the rest of us were stuck with green-screen IIe machines. That kid didn't even put the then-beast to proper use while I was making the IIe my bitch.
Fascinating and super-well-researched video. Congrats on a great documentary, and I hope it spreads across the interwebs.