I don't understand the random Maverick clips.
Yes, I probably should have made that a bit clearer.
Clip 1 - Rules of engagement - no FPGAs
Clip 2 - Fights on: me vs NTSC
Clip 3 - Hidden signal (colour was already there)
Clip 4 - NTSC myths destroyed
I didn't either.
One of the best descriptions of the color TV signal I've seen. No mumbo-jumbo, just info. Even if I had an understanding before, I salute this!
Thanks for the feedback. Personally, this is one of my favorite videos on this channel. Unfortunately, it didn't quite get the number of views the VGA from an EPROM video got.
Your video marks the FIRST time I have actually _understood_ how NTSC color even works...
I've been messing with computers most of my life. 33 years ago, I fixed a TRS-80 CoCo 2, and 5 years before that, I got a Commodore 128, which I played with, but never understood at that young age. NTSC has always just gone over my head, and now, with a single video, I _COMPLETELY GRASP_ how those old machines did the trick, and how those old TVs interpreted it! _THANK YOU_ for this incredibly _WELL DONE_ visual explanation!
You're welcome. Thanks for the feedback. Responses like this are why I make these videos!
Can only agree with the previous comments. The pedagogy of explaining the complete video signal is wonderful - many, many thanks!
Impressive work!
Thanks. Glad you got some value from it!
I have been a practicing broadcast video design engineer since 1982, starting with analog and moving to the current 12 Gb/s 4K digital systems. And of course, like most engineers of the day, I owned an Apple 2. It still amazes me how Steve Wozniak made that work with so little hardware. This same idea was also being applied to test signal generators of the time. Prior to that you had a TTL chip jungle generating the pattern and feeding an analog NTSC encoder - a lot of points to induce drift. By synthesizing the NTSC (or PAL) signals as pure data in EPROMs, the only point of drift was the video output level. Of course, the broadcast implementations of this idea were much more complex in order to adhere to broadcast video standards. Steve was truly a genius. Too bad he didn't remain with Apple.
I did a lot of work in the early 1990s with composite SDI digital video interfacing. Most younger broadcast engineers think SDI was always a component video topology. Not true! Component SDI was first, but there was also a composite SDI version of NTSC and PAL, with 14 MHz and 17 MHz clock rates respectively. Just take the NTSC or PAL analog signal, digitize it at 4fsc, and use the first and last few codes for timing data (video codes were from 16 to 235 in 8-bit). This was basically gone by the late 1990s, as all the new compressed digital broadcast standards were component based.
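A rough sketch of the 4fsc arithmetic that comment describes (illustrative only; the subcarrier frequencies are the standard NTSC/PAL values, but the level-to-code mapping below is a simplification, not a quote of any SMPTE standard):

```python
# Illustrative sketch of 4fsc sampling: composite SDI digitized the analog
# signal at four times the colour subcarrier. The code mapping is simplified.
NTSC_FSC_HZ = 3_579_545        # NTSC colour subcarrier
PAL_FSC_HZ = 4_433_618.75      # PAL colour subcarrier


def sample_rate_4fsc(fsc_hz: float) -> float:
    """Sample rate at four times the subcarrier frequency."""
    return 4 * fsc_hz


def to_8bit_code(level: float) -> int:
    """Map a normalised video level (0.0..1.0) into codes 16..235, leaving the
    extreme codes free for timing data, as the comment describes."""
    level = min(max(level, 0.0), 1.0)
    return 16 + round(level * (235 - 16))


print(f"NTSC 4fsc sample rate: {sample_rate_4fsc(NTSC_FSC_HZ) / 1e6:.3f} MHz")  # ~14.318
print(f"PAL  4fsc sample rate: {sample_rate_4fsc(PAL_FSC_HZ) / 1e6:.3f} MHz")   # ~17.734
print(to_8bit_code(0.5))                                                        # mid-grey code
```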
Excellent. I think this NTSC colour artifact trick is what Steve Wozniak used to make the Apple 2 work, at least from a technical perspective. I've mainly worked in the area of computer-generated imagery, but the generation of both NTSC and PAL is very interesting.
Oh mighty algorithm, we the humble viewers of this channel request that you place this video in the suggested videos for computer enthusiasts.
Haha, yes, it gets a bit that way - I need all the prayers I can get. Last video, a Hackaday article triggered the algorithm. Although the article accounted for only 7.1% of the views in the end, it triggered YT to distribute the video more broadly.
Woz always claimed his use of GCR(5,3) in the Apple II floppy controller was "my most incredible experience at Apple and the finest job I did", but I reckon his bit-banged NTSC from a circuit that simultaneously refreshed the DRAMs is definitely up there. It really is the engineering equivalent of being in the zone in a fighter jet.
Yep, I agree! I think it was getting colour, any colour, that differentiated the Apple 2 from its 1977 competitors!
As usual, great content & well explained Matt. Thanks
Glad you like it. I couldn't resist with the Top Gun clips.
Awesome explanation and great aiding visuals. Much appreciated!
Glad you got some value from it.
I've been trying to learn TV colour encoding recently... and this is one of the best explanations I've seen... I think I might finally "get" the colour burst and the phase shift now. :)
Excellent. It's great when the penny drops! If you're up for it, have a look at the Turing6502 videos, at least the first two.
I've watched a few videos recently about people making electronic music with analogue feedback... quite fascinating how the overview schematic of this process looks so very similar, with the address bus in what is basically a feedback loop.
Yep, it's a very common circuit, the finite state machine. You see them everywhere in computer architecture if you look hard.
If only my old LCD TV would properly display the signal of my Apple II europlus like your Hisense does ....
Yeah, I think there is quite a bit of variability in how well artifact colour is displayed. Fortunately I think I got lucky with my display.
Look up the RGBtoHDMI project - it's a scan converter for displaying old video standards on standard LCD monitors over HDMI/DVI with pixel-perfect scaling.
Nice video. Of course, in Australia, we had to add a PAL colour card to the mix. PAL is a bit more of a challenge when compared to NTSC.
Thanks. Yes, true, PAL adds another layer of complexity. Even the Apple 2 is more complex in normal high-res mode, where the data stream is shifted by one 14 MHz clock based on the most significant bit, but I wanted to get the basic idea across.
Or even weirder - SECAM. Believe it or not, some countries in Africa still broadcast in SECAM, although they're all moving to DVB. It's a real challenge to bit-bang the alternating FM-modulated red and blue colour-difference information between lines. I managed to get PAL working on ARM but never got round to making SECAM work.
Really incredible content as always 🎉 incredibly helpful and you’ve helped me learn so many core concepts
I'm glad they are of value to you, more to come!
Great stuff as always.
Great explanation. Thanks.
That is an interesting trick to generate color from a digital signal. I was wondering, since a variable cap is hard to get, what is the combined capacitance of C2 and C3 in the tank circuit? A value of around 73 pF would tune the tank to the color burst frequency, but I am unsure if that is optimal.
I could only find a 33 µH inductor, so I think my ideal capacitance is ~60 pF. I think the frequency will largely be fixed by the colour burst signal itself, so the dominant effect of the variable cap is to adjust the phase (hue).
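For reference, a quick sketch of the resonance arithmetic behind those numbers, using the textbook LC tank formula C = 1/((2πf)²L). The 33 µH value comes from the reply above; the 27 µH line is only there to show which inductance would make the 73 pF figure resonant, and is not a claim about the actual schematic:

```python
# Minimal sketch of the LC tank arithmetic: C = 1 / ((2*pi*f)^2 * L).
# The 3.579545 MHz subcarrier and the 33 uH inductor come from the thread;
# 27 uH is shown only to illustrate where a ~73 pF figure would come from.
import math

F_SC_HZ = 3_579_545.0  # NTSC colour burst / subcarrier frequency


def tank_capacitance(l_henry: float, f_hz: float = F_SC_HZ) -> float:
    """Capacitance that resonates with the given inductance at f_hz."""
    return 1.0 / ((2 * math.pi * f_hz) ** 2 * l_henry)


print(f"33 uH -> {tank_capacitance(33e-6) * 1e12:.1f} pF")  # ~59.9 pF
print(f"27 uH -> {tank_capacitance(27e-6) * 1e12:.1f} pF")  # ~73.2 pF
```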
Great video, really well explained! It would be interesting to see a comparison with CGA composite NTSC output.
CGA programs using 640x200 artifact color work _exactly_ the same way. The built-in RGBI-to-composite conversion is a bit more complicated. I think that uses both edges of the clock (for effectively 28 MHz) and might use a multibit DAC, but the principle is similar.
Thanks for the feedback. CGA composite uses the same strategy.
Amazing 😮.
Dr. Regan uses semicircles to plot sinusoids. x_x
Haha yes busted. But if you can spot that, you probably already know the details.
This series is the best that I've found on YT. My son and I are building a Z80 homebrew machine, and we'll be using these videos as our definitive guide for the graphics. 🤩 We'd like to achieve a VGA output of 320x224 with 8-bit color - ambient brightness lifted, but then using the remaining 2 bits of each byte for brightness (I've imagined this 8-bit color palette as the 'perfect palette' for 25 years but lacked the technical know-how to implement it, and I'm stumped that I've never seen it implemented). 🙃 I'll keep trying!! 😀 Thank you for your amazing content. 👏 Also... are your schematics and code hosted anywhere?
Excellent, glad you liked it. I have the schematic for the EPROM board at github.com/Turing6502/SAP6502. It requires a clock, and outputs G0-G19 are connected to address lines A0-A19; the other outputs are shown in the video. There's a rough sketch of that feedback idea below. Good luck with the build.
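For anyone trying to picture that wiring, here is a minimal, hypothetical sketch of the idea in Python: because the EPROM's outputs drive its own address lines, each ROM word holds the next address (the next state) plus whatever should appear on the remaining output pins. The ROM contents and field layout below are invented for illustration and are not the actual SAP6502 data.

```python
# Hypothetical sketch of a ROM-based state machine: the ROM word at the
# current address supplies both the next address (fed back to the address
# lines, as G0-G19 -> A0-A19 on the real board) and an output value for
# this clock cycle. The contents below are made up purely to show the idea.

def step(rom, state):
    """One clock tick: look up the current state, emit its output, and
    move to the next state stored in the ROM word."""
    next_state, output = rom[state]
    return next_state, output


# Tiny illustrative "ROM": a four-state loop emitting an arbitrary pattern.
rom = {
    0: (1, "sync"),
    1: (2, "black"),
    2: (3, "white"),
    3: (0, "black"),
}

state = 0
for _ in range(8):          # clock the "EPROM" eight times
    state, out = step(rom, state)
    print(state, out)
```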
@@DrMattRegan This is awesome..thank you!!
Have you done PAL for us Europeans and the rest of the world?
Good to know there's interest out there for PAL. I've had a PAL version on the back-burner for a while, so I might advance it up the schedule.
TV senders and receivers use sharp low-pass, band-pass, and high-pass filters to implement these sub-channels. You cannot bit-bang these. See how the VIC-II has separate luminance and chrominance outputs fed into a high-quality filter? Audio has yet another filter. TVs even use surface acoustic wave devices. Maybe use an SVGA video DAC at 50 MHz and some Fourier filter?
I think the filters and analog electronics are more significant on the receiving side than for the sender of a simple computer-generated display. The frequencies are really controlled by the dot clock, and you can send out a nasty old square wave and it gets interpreted correctly by the filters in the receiver. Of course, I haven't tried to do this with higher-fidelity YCbCr yet, so I may change my tune then.
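To make that concrete, here's a small sketch of my own (not taken from the video) of why a bit-banged square wave decodes to a colour: demodulating a 1-bit pattern clocked at four times the subcarrier against quadrature references recovers a hue angle, and shifting the pattern by one clock rotates that hue by 90 degrees, which is essentially the mechanism behind the Apple II's artifact colours.

```python
# Sketch: demodulate a 1-bit pattern clocked at 4x the NTSC subcarrier
# (roughly the Apple II's 14.318 MHz) against sine/cosine references locked
# to the colour burst. Only the pattern's fundamental near 3.58 MHz survives
# the averaging, so even a crude square wave yields a well-defined hue.
import math


def hue_of(pattern):
    """Hue angle (degrees) of a repeating 1-bit pattern, 4 samples per cycle."""
    n = len(pattern)
    i_sum = q_sum = 0.0
    for k, bit in enumerate(pattern):
        level = 1.0 if bit else -1.0               # treat the bit stream as +/-1
        i_sum += level * math.cos(2 * math.pi * k / 4) / n
        q_sum += level * math.sin(2 * math.pi * k / 4) / n
    return math.degrees(math.atan2(q_sum, i_sum)) % 360


base = [1, 1, 0, 0] * 8                            # square wave at the subcarrier
for shift in range(4):                             # delay by one "14 MHz" clock each time
    rotated = base[shift:] + base[:shift]
    print(f"shift {shift}: hue ~ {hue_of(rotated):.0f} deg")
# Each one-clock shift rotates the recovered hue by 90 degrees.
```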
@@DrMattRegan I thought that radio channels and cross-talk were accepted theory. When sending without a hard filter at the borders of my band, I get a visit from the FCC.
And no, the information is interpreted wrongly. The classic example was an anchorman with a patterned jacket. Back in the day, anchormen did not wear color, but checkerboard or thin stripes were allowed. With good equipment in the studio and a high-quality RCA TV set, the jacket stayed black and white. With a cheap private local broadcaster and the second TV in the kitchen, the jacket gained colors.
The C64 came out years after the Apple ][. Engineers used this time to read up on the topic. Additionally, Commodore added traces to the PCB to pass both signals without the band-pass. My brain still explodes when I try to imagine what happens when the color signal touches DC, but this is not real; it cannot hurt me. Even in S-video, the chrominance only gains about 1 MHz on both sides.
I actually like that chrominance has no DC part. The baseband luminance should use the borders to compensate DC to zero. Or so. Ah, analog signals are such a mess. Digitally, I would use the color clock only: QAM modulation. One channel is luminance, the other is chrominance and audio. This time, time-division multiplexing is applied (after digital error correction). Digital technology can delay the signal with ease. Already the tiny RAM of the Apple ][ would be enough to use the borders for information. Of course, for QAM I still need the color burst. I am actually a bit confused about how modern terrestrial TV stabilizes the phase. I think QAM could synthesize it out of the signal (deviation). So it is important not to stop the signal (in the borders). You could even place audio in the top and bottom borders.
"I thought that radio channels and cross-talk are an accepted theory. " I trust that you've watched the video, but it's about composite NTSC, not RF transmitted NTSC. In particular how the Apple II generates artifact colour. Sure the Apple II goes above the 1.5 MHz limit for Q and 0.5 MHz limit for I, but i don't really think the FCC are going to come running for anyone using an Apple II connected via a composite cable to a monitor.
@@DrMattRegan Radio channels were invented first. Those same engineers later squeezed in the color channel using their knowledge. Sometimes school can take a shortcut, but regarding color TV, the historical way seems to lead to wisdom. So many people praise composite over RF, while I cannot see a difference on my C16. RF does not manipulate the composite signal information.
Woz was pragmatic to get something out the door. Then Apple moved away from composite pretty fast, like every home computer, instead of wasting money on a system invented to have many terrestrial TV channels.
Never would Woz advise others to repeat his stop gap.
Nice vid, I enjoyed it. But can I point out your misshapen sine waves at 11:42? :)
Yes, it's actually very difficult to generate a sine wave in PowerPoint. I figured that anyone who spots the shape difference probably already knows enough about sine waves.
I have heard the description of NTSC and its inner workings many times, and it still gives me a headache every time. Given that it is backwards compatible with monochrome video, it's an engineering achievement. But compared to VGA it feels like such a hack. Although I am coming at it from the digital perspective and not an analog one. It might make more sense in an analog system.
I wouldn't try to get more than 16 colours from NTSC myself. VGA is much easier to use, but it was very clever of Woz to generate colour this way - probably one of the important factors that bootstrapped the company. The Apple 2 is even more complex: it uses the top bit of each byte to shift the data stream by one 14 MHz clock cycle, which is how he gets 4 colours plus black and white.
Jensen Huang at NVIDIA would pound it into the architecture team that the most efficient architecture wins and gets all the money.
NTSC stands for Never Twice Same Color.
What about 3x R-2R networks, one for each color?
That works for VGA, but NTSC encodes colour on the 3.579 MHz subcarrier.
Why not generate the VBlank stuff in software and save the ROM space?
The plan is to use it to make a ZX Spectrum and an Apple 2, so software control over raster isn't an option.
@@DrMattRegan Does this mean you will be taking your video experiments further in the coming weeks or months??
What is the emulator used here?
Hi John, I've used AppleWin, which is available on GitHub.
I'm pretty sure NTSC stands for Never The Same Color
I'm going to pretend I understood some of that
Try watching the VGA from an EPROM video. If you want a more in-depth and slower-paced explanation, try the Apple 2 wire-by-wire series.
@@DrMattRegan I most certainly will! Expect me to come back here to change my comment at some point
1st !
So what?
NTSC stands for Never The Same Colour