RE: chroma dots @6:27. Your speculation is correct. The chroma signal is encoded by "wiggling" (modulating) the Y signal within that range. I find it helps to understand this by looking at the signal from the perspective of the TV doing the decoding.

A black and white TV has two radio demodulators: one locks onto the sound sub-carrier to demodulate the sound (which is FM encoded), and one locks onto the video sub-carrier to demodulate the video signal (aka Luminance, or Y), which is actually AM encoded. These two radio demodulators are essentially independent, apart from the fact that they are tuned to two signals right next to each other on the dial. Early TVs are actually really simple, electronically. All they do is take the output of the video demodulator (which is a nice voltage between 0 V and 1 V), detect the horizontal and vertical sync pulses (which are used to lock the frequency/phase of the two flyback transformers controlling the CRT's vertical and horizontal scanning), and then send the remaining signal directly to the CRT's electron gun to control the brightness of the electron beam at that position. 0.33 V represents black, 1 V represents white, while 0 V represents a sync pulse.

The most logical way of adding color to such a system would be to add a 3rd radio demodulator, one which picked up a 3rd sub-carrier and decoded a chroma signal. But this would make the TV signal take up more bandwidth, and the FCC had already allocated 6 MHz channels. Additionally, you would have to replace all the black and white video equipment in the recording studios and transmitters to carry and transmit this extra signal. So instead, the two demodulators are left untouched and the chroma signal is actually modulated on top of the Y (aka Luminance, black & white) signal to create a combined Luminance/Chrominance signal, which replaces the Y signal and decodes fine as a Y signal on old black and white TVs.
Color TVs actually have to demodulate the combined Y/C (Luminance/Chrominance) signal a second time to extract the chroma I and Q signals. The chroma signal is encoded in the high frequencies of this combined Y/C signal. A black line on the TV would have a Y/C signal with a constant 0.33 V across the entire line. A white line would have a Y/C signal of a constant 1 V across the entire line. For a solid colored line (say bright green), the Y/C signal will fluctuate between 0.93 V and 1.07 V at a rate of 3.58 MHz. The phase difference between those fluctuations and the 3.58 MHz color burst signal at the start of the line encodes the hue of the color, with 225° representing green (178° for yellow, 100° for red, 0° for blue). The height of the fluctuations represents the saturation of the color (±0.07 V represents 100% saturation, while ±0 V would be 0% saturation, or grayscale). The average height of the Y/C signal of course represents the luminance.

For complex lines with multiple colors, the fluctuations in Y/C won't be a constant 3.58 MHz, as the signal will speed up and slow down rapidly to change into the correct phase for the color at each location on the screen. To decode this complex signal, color TVs first have to split the Y/C signal by frequency. Early color TVs would have used a notch filter, with a range of frequencies around 3.58 MHz (say 2.8 MHz to 4.1 MHz) being extracted as the chrominance, while the rest (say 0 to 2.8 MHz, and then 4.1 MHz to ~5.5 MHz) is interpreted as luminance. Frequency in a luminance signal is the rate of change of brightness across the line. A solid color across the line would have a frequency of 0 Hz. An image with vertical b&w bars across the screen, each about 1/10th of the screen wide, would have a frequency of 0.2 MHz. With vertical bars 1/100th of the screen wide, the frequency would be 1.9 MHz.
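The encoding described above can be sketched in a few lines of Python. This is a toy model, not broadcast-accurate (real NTSC uses quadrature modulation of separate I and Q components, and exact hue angles depend on conventions); the voltage figures simply follow the comment above:

```python
import math

F_SC = 3_579_545.0  # NTSC colour subcarrier frequency, Hz

def composite_sample(t, luma, saturation, hue_deg):
    """Toy composite Y/C voltage at time t seconds into the line:
    a DC luminance level plus a subcarrier sine whose phase carries
    the hue and whose amplitude carries the saturation."""
    chroma = saturation * math.sin(2 * math.pi * F_SC * t + math.radians(hue_deg))
    return luma + chroma

# A bright-green line per the figures above: 1 V luma, +/-0.07 V chroma at 225 degrees.
samples = [composite_sample(n / (8 * F_SC), luma=1.0, saturation=0.07, hue_deg=225)
           for n in range(32)]
print(round(min(samples), 3), round(max(samples), 3))  # 0.93 1.07
```

A black-and-white set just sees this as luminance wobbling around 1 V; a colour set compares the wobble's phase against the colour burst to recover the hue.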
If you had vertical bars which were about 1/188th of the screen wide, a color TV would actually interpret them as color information and show a solid color. (A number of 8-bit computers like the Apple II actually took advantage of this to create color.) But as the width of the vertical bars got thinner and the frequency increased over 4 MHz, they would become visible as black and white bars again. Modern TVs use comb filters that use the information from the previous and following lines to extract a much better luminance signal that preserves most detail even around 3.58 MHz. See this document for more details: www.intersil.com/content/dam/Intersil/documents/an96/an9644.pdf

So what are chroma dots? They are simply the actual chroma information which has been modulated right in the middle of the luminance signal. B&W TVs that were manufactured after NTSC was standardized are meant to use the same notch filter that color TVs use, and simply discard the information at that frequency.
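The bar-frequency arithmetic above can be reproduced with a quick sketch, assuming an NTSC active line of roughly 52.6 µs and counting each bar as one cycle of the luminance signal (which is the counting convention that matches the figures quoted):

```python
ACTIVE_LINE_US = 52.6  # approximate visible portion of an NTSC scan line, in microseconds

def bar_frequency_mhz(bars_per_line):
    # cycles per microsecond is the same number as MHz
    return bars_per_line / ACTIVE_LINE_US

print(round(bar_frequency_mhz(10), 2))    # ~0.19 MHz: wide bars, plainly luminance
print(round(bar_frequency_mhz(100), 2))   # ~1.9 MHz: still luminance
print(round(bar_frequency_mhz(188), 2))   # ~3.57 MHz: lands on the colour subcarrier
```

At 188 bars per line the luminance pattern sits right on 3.58 MHz, which is exactly the Apple II artifact-colour trick: the notch filter in the TV can't tell it apart from genuine chroma.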
This also shows why it was so important, and used to be required, that broadcast studios would KILL the chroma carrier for B&W programs, restoring the full-resolution image without color-signal modulation. It also allowed color TVs to shut down their chroma circuitry and pass the unfiltered, full luminance signal, giving those programs the full resolution of the original pre-color standard.
phirenz, your explanation is great except for the use of "modulate" to describe how the chroma carrier is combined with the luma. It is not modulated onto (multiplied by) the luma signal, but simply added to the luma signal.
THANK YOU SO MUCH. I've had trouble for a long time understanding how color is encoded into a composite signal. I knew it had something to do with phase-shifted frequencies, but not how it actually worked, and I didn't know what the color burst actually did or why the chroma dots actually happen. I couldn't really get a good understanding from Technology Connections' video, but you have explained it amazingly; I finally understand.
It’s truly amazing! I don’t understand EVERY single aspect of the technical explanation but do get 90%. I remember watching COLOR TV at my Great Aunt and Uncle’s house (they were very well off) and being amazed by it! Watching the Wonderful World of Color (the original name of the Walt Disney show) and very early episodes of Bonanza, remembering my Great Uncle or Cousin getting down on the floor in front of the set to adjust the color, etc., and being spellbound at 4 years of age sticks with me. My Parents got Color the weekend after President Kennedy was shot, as our TV (b&w 21 inch) had blown up with smoke and theatrics around 4pm the day he was shot. Dad put their bedroom b&w on top of the main set, called a few repair men who couldn’t come, then left for some store that sold Color, as they had planned on that anyway. Sadly he bought a Philco which staggered along with countless repairs until the ’70s! Our next TV was the 23 inch b&w my Aunt gave me about ’74, and although it was a late-’50s Stereo/TV combo it was a Zenith. It was relegated to my room about ’77, I married and took it with me in ’78, and it was an amazing Stereo too and worked until late ’82. So there is my nostalgic look back!
My grandma bought us a used Admiral color set in 1962. Disney's World of Color and Mitch Miller! We were poor, too, but she used her deceased husband's veteran checks to get the set. It inspired my career in electronics. I became a TV repairman, and later an automation tech. Color TV sort of guided my career.
@@davidlogansr8007 you wanna know a secret... I work amongst, with, for and around technical design engineers, making sure all is in order and accounted for along any possible line..... we can truly consider ourselves professionals in our highly technical field (cables).... if we understand 90% of what we do I call that a brilliant day of exceptional fortitude... usually it's 75% at best, the rest is pure luck, improvisation and an endless learning curve
I enjoy your presentations, the fact that I already know the material gives me a greater appreciation of how well you are able to simplify the subject matter and cram so much into a short video. I also know when you make a mistake, but I didn't detect any in this video. I was eleven years old in 1953 and the introduction of compatible color was a big deal to me, even though my family couldn't afford a color TV set. From 1953 to 1956 I saw many of the first compatible color telecasts by going down to local department stores, which usually had at least one color TV set on display. I would sit, or stand there for hours, just to see TV in color. The salesmen stopped trying to run me off when they discovered that I could answer technical questions for customers and I kept the hue and saturation controls adjusted as soon as faces started to drift to green or purple. Nobody could adjust those early color sets for natural looking flesh tones better than I could. I'm glad you talked about chroma dots, because I always knew when a show was being broadcast in color (even when I was at home), because I could see the chroma dot crawl on our old B&W TV screen as plain as day. I quickly learned to recognize the color of items on a B&W screen by the dot pattern, and I got so good at it that I could almost imagine I was seeing the picture in color on a B&W screen. I could do this, because in the early days of live color broadcasting, people on TV commented on the color of things in the scene, so people at home with B&W sets would know what they were missing. It may have been a marketing ploy, intended to spur the sale of color sets, but people on TV were always commenting on the color of clothing and items on the set during the early days of live color TV, especially on variety shows. Thanks to this feedback, it didn't take long for me to associate the primary colors with specific dot patterns. 
The color bars TV stations used to broadcast along with the test pattern in the early days also helped me associate colors with chroma dot patterns. The dot crawl was also unique to each color, so it was helpful in guessing color on a B&W TV. It became a game for me to announce the color of an object on a B&W screen out loud before anyone on the TV show said what the color was. My friends and family could never figure out how I did it, and I won a few bets with the trick before people became convinced that I could really do it. I remember one night when a bright red convertible sports car drove on to the set during a Perry Como color broadcast, and I knew instantly that the car was red from the chroma dots, before Frank Gallop (the announcer) mentioned that it was red. Red was the easiest color to spot on a B&W TV screen, because the chroma dot pattern really stood out and it had a crawl that almost seemed to flicker. I could spot red from across the room, but for other colors I had to be close to the screen. During the first color broadcast of the Original Amateur Hour, Ted Mack mentioned that his fountain pen leaked just before the show began and there wasn't time to change his jacket. Then, he added, that this was his first show in color, so the lucky people with a color set could see that the ink stain was blue, while the rest of us only saw a black spot. They did a close-up of the ink spot while he was talking, so it was easy to see. By the time Ted Mack mentioned that the stain was blue, I already knew it was blue from the distinctive chroma dot pattern. Anyway, keep up the good work, and I hope this ramble down chroma dot memory-lane wasn't a bore. I never thought I'd be talking about this for the first time after more than 60 years.
@@MattMcIrvin In NZ, that was only true of SANYO colour TVs... the only ones with a tint control here, IIRC ... not needed for PAL-D, but they still had 'em :-) Shops used to make their window display sets really blue so they looked Moar Colour!!1!! :-)
Another fascinating video. You're quite right that we started B&W PAL broadcasts in the UK in 1967. In fact, if you ever watch early episodes of a series like Doctor Who, which began as 405-line B&W, and then shifted to 625-line B&W in 1969 (that's roughly when BBC One shifted to 625-line) you can really see the difference in quality. With regard to making telerecordings of shows made on video tape, one of the main reasons for doing this was that the BBC were fairly active in selling their programs on to foreign markets quite early on. This was usually done on a hierarchical basis - rather than go to the expense of making dozens of multiple film copies, the BBC would simply make a handful. These would then be sent to the first foreign broadcaster who had the right to show them, that broadcaster would send them on to the next, and so on. And because of the myriad of broadcasting standards that existed at the time (some markets would be on 405-line, some on 625. Some on PAL, some on SECAM (the French system) and others on NTSC - and then add colour into the mix by the early 70s), the simplest and most broadcast-standard-friendly method was to use B&W film copies. Of course, there are three problems with this method. Firstly, you can't guarantee that the last broadcaster which had the film copies actually sent them on to the next in the chain. Indeed, the BBC have managed to recover programmes long thought lost for good by tracing telerecordings that were either never sent on to the next broadcaster, or in fact returned to their commercial sales department, BBC Worldwide (formerly BBC Commercial Enterprises). Before the BBC established a proper archive, the broadcast arm would often make telerecordings of videotaped programmes for Commercial Enterprises, then wipe and reuse the tapes to make something else.
Commercial Enterprises would sell the telerecordings to overseas broadcasters, and after a few years of a programme being available for sale, assumed its commercial viability had come to an end. No further sales would be made, and they would burn the telerecordings, as they were under the impression that the broadcast wing did indeed have a proper archive. Many thousands of hours of vintage television were lost in this way. Often, the only reason there are copies of vintage programmes available, or we only have B&W film copies of programmes originally shot in colour, is because of film telerecordings. The second problem is that censorship standards would differ from country to country. And what might be acceptable to UK audiences might not be for Australian or New Zealand censors, who would often remove offending material by simply cutting the offending sequence from the master telerecording, rather than go to the trouble of making their own copy. And when this was sent on to the next broadcaster, that sequence was missing. Sometimes, these trimmed sequences have been recovered and are the only surviving example of the programme at all. The third problem with the telerecording method is that if the programme was originally shot on video, the telerecording loses the smooth look of video; as instead of being made up of 50 fields per second, copying that programme onto celluloid blends those fields into 25 frames per second (if the camera was modified to run at that speed, or 24 frames if not). And that blending of fields into frames can lead to visible errors in vision mixing between different cameras from the original multi-camera shoot (almost all videotaped programmes in the UK were shot multi-camera, with a vision mixer switching shots between different cameras).
Fortunately, this was rectified when one particular fan of vintage television noticed that a repeat broadcast of a programme that only existed as a telerecording, which he had taped on his domestic VCR, reverted to its naturally smooth motion on screen as he fast-forwarded through it. Realising that the VCR was in effect "blending" the individual frames back into fields, he contacted some friends who worked in broadcasting, but who also specialised in restoration of old TV programmes. Armed with this information, they were able to devise a process where old telerecordings of programmes shot on video can be made to look like video again. With regard to how long it took to establish PAL as standard due to the challenging topography of Europe, in fact the BBC was still broadcasting B&W 405-line television to some parts of the UK as recently as the 1980s, due to those places being unable to get a good UHF signal for 625-line PAL - but they could manage a VHF signal for 405-line. Since the introduction of satellite and digital terrestrial broadcasting, this is less of an issue. But (for example) at my house, I used to receive a fairly poor analogue terrestrial TV signal, as I don't have direct line of sight to a transmitter, and must rely on the signal bouncing back down to me. When they switched to digital terrestrial broadcasting (using the DVB standard) the picture quality improved massively. Then, the strength of that signal was reduced, and I can no longer receive digital terrestrial signals. I can get digital satellite though, so all my television comes via that. With regard to showing movies on PAL systems and then running at a slightly higher speed - yes, this is true. In fact, when James Cameron's Titanic was broadcast on BBC One some years ago, there were complaints from some members of the public who assumed that the BBC had cut out material, due to the slightly shorter running time.
In fact, the film had been shown unedited - but due to the extreme length, the slightly higher speed of 25fps meant a noticeable difference (at least, for petty-minded fans of epic disaster movies). The slightly higher running speed of movies also means a slightly higher audio pitch, which most PAL broadcasters compensate for by lowering the pitch by the same amount. With regard to the colour-recovery method you described - the software was actually written in a modern Windows version of BBC BASIC - a version of BASIC that dates back to Acorn Computers' BBC computer series of the early 1980s, introduced with the BBC's Computer Literacy Programme - a scheme they introduced to make computers available in schools and to teach children. Almost all UK schools had at least one BBC Model B Microcomputer at the time, and the version of BASIC included on BBC Computers was a particularly good one. There is another method of colour recovery that was used by that same team of vintage programme restoration specialists I mentioned earlier. They were all fans of Doctor Who. Many of the early colour episodes of Doctor Who had been sold abroad as B&W telerecordings to those countries that didn't have colour in the early 70s; but sales to the United States and Canada tended to be colour, as that was obviously the preferred format for North American broadcasters. And then, in typical BBC fashion, the original PAL master tapes were often wiped. When this occurred, the BBC would (in later years) obtain back their NTSC-conversion masters, and reconvert those for PAL. However, one particular Doctor Who story only existed as relatively high-quality B&W telerecordings, and a domestic recording of a North American transmission on a home format, which wasn't suitable for a proper VHS release, let alone a repeat broadcast on BBC TV.
So, in order to make a good quality colour version, the restoration team took the telerecordings for their picture quality, and matched that with the colour signal from the NTSC home recording - combining the two to get, in effect, a colour telerecording. This was the mid-nineties, so the differences in screen geometry were easily corrected by bending the overlaid colour signal at the edges of the screen to match the B&W telerecording. As time has gone by, the same team have built a rather sophisticated reverse standards conversion machine, which has vastly improved the quality of programmes that were shot on PAL, converted to NTSC, then converted back to PAL years later. These used to be pretty terrible, but now are almost indistinguishable from other PAL programmes of the time. And, by incorporating the frames-to-fields conversion, they have managed to make some very high quality restorations of vintage programmes that would otherwise be in a very poor state. Combine that with the standard repairs for hairs in the gate, scratches, speckle, etc., and the Restoration Team (as they are known) often put as much work into a recovered programme as you'd get with a major Hollywood studio rescuing a movie shot on nitrate stock.
ZygmaExperiment There is no such thing as PAL 625 B&W, since PAL is just a colour coding system. You are referring to CCIR 625 (which is B&W) and in use in countries in Europe since 1950, long before PAL or NTSC.
The Claws Of Axos has had two releases: the first used the NTSC masters reverse-standards converted (which turns out to be more difficult with the early conversions due to the crude set-up of the original method), and the re-released version used the B&W film copies to provide the luminance and the NTSC masters' chrominance to get a better overall picture. Inferno also had the same re-release.
Greetings! I've been looking through past comments (I do read them all, you know) and I've seen a common suggestion for more info graphics and less talking head. I really do appreciate this sort of feedback and I'm doing my best to address it. However, for this video, there's not a lot to show since it is really more of a string of factoids. I decided not to go into SECAM for this video--I just delved a little into the PAL vs. NTSC fight we seem to still be going on about even though analog television broadcasts aren't happening anymore... Thanks for watching, everyone!
Technology Connections I do like me some info graphics, but you’ve always struck a good balance in my opinion. I REALLY liked your practical examples when you were showing the result of moving or disabling the yoke with the CRT tube, way more than an info graphic as well.
I agree with soupisgdfood. One of my favorite TV series of all time is The Secret Life of Machines (which explained the inner workings of common household and office appliances, in case you've never seen it) and a big part of that is how it perfectly balanced exposition, informational graphics, and practical demonstrations. Your channel is the closest I've seen to replicating that formula.
Technology Connections Hi, what about calls for a multi-part deep dive into Teletext/Ceefax/etc.? This is the digital interactive television technology we invented in the '70s. If I make the request many times, does that count?
You could have addressed one shortcoming of PAL that is often forgotten. High-frequency image components could "bleed" into the part reserved for the sound. The effect was horrible. When there was text inserted in an image, like for instance a sports table with results, the TV set would make a loud noise that would overwhelm the narrator's voice. SECAM would not do that. (I grew up in the border region between France and Germany, so we had multi-standard TV sets. My parents bought a Philips X26K221 in 1972; it could display German PAL, French SECAM and even the very odd and uniquely French high-res B&W 819-line standard.)
You missed the SECAM French colour TV. Some explanation would be interesting, because this system was used in the old USSR, Egypt, France and other countries of French culture. When colour TV was going to be implemented in Colombia, we had representatives from the three standards, NTSC, PAL and SECAM. Each one gave very interesting demonstrations, all arguing that their system was the best. Finally our Government decided on NTSC, which had many detractors, but it was the best choice: all the studio equipment and TV sets followed American standards, and for a country so near the US, broadcasting many shows from the States, it was the obvious pick. All this happened in 1978, and the first official colour TV broadcast was on December 1, 1979 at 6:30 pm; President Turbay started with a speech, then an American movie.
SECAM was used not only in the USSR but also in most (if not all) countries of the Eastern Bloc. I remember the time in the nineties when our television switched to PAL and old color TVs had to be "retuned" to still accept color. Although it was called "retuning", I think they just replaced decoder boards, at least for the most popular TV models.
Brazil adopted a modified version of PAL in 1973, which was unique to the country. It combines the NTSC 525-line 30 frames-per-second System M with the PAL color encoding system. The "PAL-M" system was compatible with monochrome NTSC and not compatible at all with European PAL. So people had to use decoders for their TVs or VCRs to watch American tapes. Argentina, Paraguay and Uruguay adopted another variation, "PAL-N". Other South American countries went with NTSC.
@@pd209458 I get that people were glad to be rid of communism, but did everything have to go to the extent of getting rid of the television standard!? Was the standard that lousy, or was it more important to harmonize with Western Europe? I mean, surely they must have seen that digital television was coming down the track, so why not retain SECAM until then? I don't know if your country had indigenous television manufacturing - if so, I guess it was good for business, and I suppose most people change their TV about once every 10 years anyway - but I'm guessing that your average Joe Soap / John Doe / Eastern Bloc equivalent must have felt jerked around at being put through two such big changes in a decade and a half or so.
@@barryholt9564 I think quality-wise SECAM was comparable with PAL. There were some advantages and disadvantages for both as it usually is for such standards. I was 9 when the change happened so obviously I had no personal opinion back then, but looking retrospectively I think the main issue was compatibility with imported TV sets, VCRs, home computers and game consoles. We had domestic TV production and some Video Cassette Players on a license from GoldStar but none of the latter. And, sadly, at that time people didn't care much about domestic industry.
A few assorted remarks...
1. The terms PAL and NTSC do not describe frame rate or line count. They are colour systems only, and any of the two (or three, with SECAM) can theoretically be used with any frame rate and scan line count. For historical reasons, NTSC is mostly used with 525/30, and PAL is mostly used with 625/25, but that doesn't mean that is mandatory, or that the terms define resolution. There is one country that uses/used PAL with 525/30, namely Brazil. PAL colour on top of a 525/30 signal was also used as a hack for bridging incompatibilities in multistandard VCRs, and apparently a reverse hack, i.e. NTSC colour on top of 625/25, also existed. The 625/25 family of signals has been around in Europe since 1948 and was broadcast in black and white for nearly 20 years. At that stage the term "PAL" did not exist, and colour television was far beyond the horizon. The fact that the introduction of PAL colour did not require a slight shift of the frame rate as in NTSC was just luck because the maths turned out differently; it was not planned for. The UK is an exception in this regard, because as opposed to continental Europe they opted to retain their pre-WW2 405 line standard after 1945; they could have introduced colour television (with any of SECAM, PAL or NTSC) based on that standard; there was no necessity to switch to 625 for colour. (And that was seriously considered: they did tests with all three colour systems on 405.) The UK switched to 625 eventually mainly in order to alleviate incompatibility issues with the rest of Europe. So with the 1962 introduction of 625 lines, the UK was a latecomer. So if you based your research mainly on UK sources, you might be getting a slightly distorted view of the events. In continental Europe, 625/25 in black & white had been the standard for nearly 15 years before then.
2. I think the major factor why Europe introduced colour tv nearly 15 years after the US was economics.
Europe was still building up from the war and didn't have the resources to invest in such luxuries, while in the US consumerism was already in full swing.
3. I'm in Germany; in the 1970s and 80s in some areas you could receive terrestrial television in NTSC from AFN, the station serving the US military. You needed a multi-standard set, which were rare and expensive then, so not many people did. From my memory, the hue problems of NTSC were quite apparent and we had to use the tint dial quite a lot to correct those green and purple faces :).
Thanks for your comment about the UK. I wasn't understanding all that stuff about incompatibility because we had an old black & white set that we used for a few weeks (maybe months?) in the early '90s. This was after the presumably early '80s, possibly '70s color TV broke. I grew up in the Netherlands.
France and the UK had old and "exotic" systems, so they had to choose an incompatible new one. Other European countries used the B/W 625/50 system, so they switched to 625/50 PAL or SECAM and all B/W sets kept working.
Xaver Lustig True with the frame rate and line count. However, I believe at the time they were working within the technological constraints of the era. Everything had to match the AC frequency while still allowing enough analogue bandwidth to transmit all relevant image, signal and audio data. The bandwidth is pretty irrelevant today with transmission of TV via a digital signal. Much more data for less bandwidth. The digital era has also made the NTSC, PAL and SECAM transmission standards obsolete... so all this is really academic now.. Lol.
+Brett Ison Yes it's mostly obsolete now, except if you still play analogue tapes or insist on only buying new equipment if it has analogue out so you can run your old TVs with it (as I insist on doing :). The irony is that even the cheapest modern flat-screen televisions come with a super multi-standard tuner for old analogue signals, even though that is hardly needed today. 30 or 40 years ago it would have been a dream.
Cannot believe how on point this guy's production quality and editing are, the way he emulates the NTSC colour shifting. Even though PAL is 10 Hz slower, I am now very thankful I grew up in a PAL region.
Strictly speaking, "PAL" is just the colour encoding and can be used with other line/refresh rates, it just didn't happen much in practice. So you *could* have it both ways: Brazil used PAL at 525-line/30-fps (i.e. otherwise the same as NTSC) for "PAL-M", though I don't think anyone else did(?)
The thing with Gonzales Camarena, and so many people mentioning him in the comments, is that in Mexican elementary schools we are taught that HE invented color TV. Even one of the first Mexican TV channels, Channel 5, made his initials part of its call sign, XHGC.
@@RodolfoAmbriz He definitely did. But it's incorrect to call him "THE inventor of the color television" instead of "ONE of the contributors who helped color television become a reality". Also, considering that his inventions were based on the experiments Logie did 20 years BEFORE Camarena, the value of his contribution is not as high as Logie's, and therefore it's totally incorrect to call him the inventor of color television.
Brits always get told Baird invented television, which he did, but the Baird system was extremely low-resolution and impractical. The later US-invented electron gun system was the first TV people would actually want to watch. I wonder if Aussies get told their country invented aeroplanes? There was a claim somebody flew in 1900, I recall.
A few points: the NTSC (National Television System Committee) in 1951 demonstrated PAF (Phase Alternate Field) and PAL (Phase Alternate Line). Unfortunately, the 64µs glass delay line needed to do the line averaging that cancels the hue error in PAL had yet to be invented. It was not available until about 1960, which led to the PAL standard. The BBC was planning to adopt NTSC as late as 1966 but at the last moment switched to PAL. The BBC launched 625 lines in 1964, which would eventually supersede the 405-line system dating back to 1936. The 8 MHz channel adopted in the UK in 1964, as opposed to the 6 MHz channel adopted in the US in July 1936, is the main reason for US television resolution appearing somewhat less. PAL, because of the line averaging, reduces vertical color resolution, and the alternating phase encoding didn't allow for as efficient chroma-luma interleaving, which meant that fine detail in the PAL picture was more prone to 'crosscolor', a flickering rainbow effect. This was seen on presenters' shirts and ties and was fun to watch in PAL. This and other problems with PAL led it to be referred to as "Problems Are Lurking". Alas, if the BBC had pursued 625-line NTSC, it would have provided the best of both worlds: superior resolution and color.
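The line-averaging trick that the delay line enables can be illustrated with a toy model. Treating the chroma as a complex phasor U + jV, PAL inverts the sign of V on alternate lines; a constant phase (hue) error then shows up with opposite sign on adjacent lines, so averaging two lines cancels it, at the cost of slight desaturation. This is a simplified sketch, not a full PAL decoder:

```python
import cmath
import math

def transmit(u, v, v_sign, phase_err_rad):
    """Chroma as a phasor; PAL flips the sign of the V component on alternate lines."""
    sent = complex(u, v_sign * v)
    return sent * cmath.exp(1j * phase_err_rad)  # channel introduces a hue (phase) error

def receive(rx, v_sign):
    """Undo the alternating V inversion at the receiver."""
    return complex(rx.real, v_sign * rx.imag)

u, v = 0.3, 0.4                       # some arbitrary colour
err = math.radians(10)                # a 10-degree differential phase error
line_a = receive(transmit(u, v, +1, err), +1)
line_b = receive(transmit(u, v, -1, err), -1)
avg = (line_a + line_b) / 2           # what the delay line + adder produces

print(round(math.degrees(cmath.phase(avg)), 2))   # hue angle: unchanged by the error
print(round(math.degrees(math.atan2(v, u)), 2))   # same angle as the original colour
print(round(abs(avg) / abs(complex(u, v)), 3))    # magnitude scaled by cos(err): slight desaturation
```

On the two lines the error rotates the phasor in opposite directions after the V un-inversion, so the average lands back on the correct hue, which is exactly why PAL sets didn't need a tint control.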
In 1980s Sydney, Clive Robertson, a late night news presenter amongst other things, used to specialise in wearing ties that he *knew* would wreak havoc with the video signal - he took pleasure in giving the tech directors a hard time :-) . For this and other reasons, his news show was fun to watch.
As a German I say, NTSC in the '50s was a great performance by American engineers! The heart of NTSC is quadrature amplitude modulation of the two color signals; you need multipliers, free-running oscillators and so on. These days that's no problem, but in the '50s there were only very poor and expensive vacuum tubes.
In an NTSC or PAL signal (also called composite video) the color signal, based on a subcarrier, is mixed in with the luminance signal. It is very difficult to separate the composite signal into Y (luminance) and C (color) without aliasing. Simple filtering using passive electronic components like capacitors and coils will give aliasing (the crawling dots) on a black-and-white TV. Broadcast signals were internally based on component video, thus no crawling dots in tape editing. Component is Y, R-Y and B-Y signals, making the video processing much easier. Yes, it requires three cables and three identical amplifier circuits to transmit a component video signal. However, because of black-and-white TV, the colors had to be transmitted as luminance and chrominance mixed together: the composite signal. The subcarrier would then be used to set the hue and decoding correctly on colour TVs. In broadcasting, the tape recorders or synchronizers used a comb filter to split the received NTSC/PAL into luminance and chrominance for later editing. The comb filter came in the early '90s and was very expensive, and it did not make sense to implement one in monochrome TVs. Most people will not notice the crawling dots, mainly on cyan colours, and they were accepted as a trade-off for having backward compatibility with B&W TV when NTSC/PAL made it possible to transmit colour in a single, relatively low-bandwidth channel.
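The comb-filter idea above can be sketched in a few lines of Python. This is my own toy model, not a real broadcast circuit: in NTSC the subcarrier completes 227.5 cycles per line, so its phase inverts on successive lines; if we assume identical luma on two adjacent lines, summing them cancels the chroma and differencing them cancels the luma.

```python
import math

# Toy 1-D comb filter: the subcarrier phase flips 180 degrees on each
# successive scanline, while the luma is assumed identical line-to-line.
SAMPLES = 8            # samples per toy scanline
LUMA = 0.5             # flat luminance level
CHROMA_AMP = 0.2       # chroma (subcarrier) amplitude

def composite_line(line_no):
    sign = 1 if line_no % 2 == 0 else -1     # per-line subcarrier phase flip
    return [LUMA + sign * CHROMA_AMP * math.sin(math.pi * i / 2)
            for i in range(SAMPLES)]

prev, cur = composite_line(0), composite_line(1)
luma   = [(a + b) / 2 for a, b in zip(prev, cur)]   # chroma cancels
chroma = [(a - b) / 2 for a, b in zip(prev, cur)]   # luma cancels
```

The simple notch filter trades resolution for cost; the comb filter exploits the line-to-line phase flip instead, which is why it separates Y and C so much more cleanly.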
A more informed analysis than many, well done. Two points. PAL did not get everything right, e.g. Hanover bars, which they fixed by repeating the previous line's color info. PAL was also much more prone to chroma noise on long-distance, high-power transmissions. This, together with the Hanover bars fix, was built into SECAM, which was a much better system than PAL. Of course there is always a downside: post-production in NTSC is easiest and SECAM is hardest. As a result, most SECAM production was made in PAL and transcoded for transmission. The one error in your piece is where you talk about tapes being wiped and B&W films being made. A telecine is a device which scans a frame of film and converts the image and the accompanying sound to a video and audio signal for transmission or recording on VT. Telecines do not record anything. The devices used to make these film copies (mainly used for export sales to developing markets) were called telerecorders and made telerecordings. These were basically film cameras with an optical sound head pointed at a TV monitor. So you should correct that. Telerecordings were very useful in the early days of television, when there was no real standards conversion; with careful line-up it was possible to use a telerecorder to overcome this. Subsequently analog, then digital, converters recomposed the pictures with vastly superior results. Oh, one last thing: you confused the color standard with the CCIR broadcasting standard. The line/field frequency is the result of the broadcast system, not the color standard. So in the USA, M stands for 525/60, the black level, the peak white standard and the placement of sync data in the flyback area. Japan has NTSC-J, which had 525/60 but differed in other areas including frequencies. PAL is often 625/50, but South America had PAL-M, using PAL color with the US line/field rate, and Argentina had PAL-N, which had a further mixture of the two. So we all have one HD standard of 1080i/720p now, right?
Nope, we have ATSC, DVB-T, DTMB and ISDB (2 flavors), as well as differing line/field rates, compression codecs, frequencies, muxes, etc., etc. And 4K and 8K drift further apart, still lacking full standards. Look forward to you doing one on UHD. Thanks
About 25 vs 30: this has nothing to do with quality, but everything to do with mains hum. Europe has 50Hz/230V mains and PAL has a field rate of 50Hz to keep the mains hum outside the visible portion of the signal. America has 60Hz/110V mains and NTSC has a field rate that's close enough to do the same thing.
"Mains hum out of the visible portion of the signal" isn't quite correct. Rather, using the same field rate as the grid frequency ensures that if interference distorts the picture in a TV set, the distortion is stationary and doesn't run through the picture. Stationary distortions are much easier to tolerate than moving ones. To make this work, the field rate was synchronized with the grid frequency: the TV set would hang on the same electricity grid as the TV station. If you received a signal from abroad, from a country whose electricity grid wasn't connected to yours, it wouldn't work. By 1953, when color TV was introduced, TV manufacturing was already advanced enough that interference was no longer an issue. So the developers of color TV could create a system where the field rate and grid frequency were no longer identical and the field rate was no longer synchronized with the grid. This created problems in older B&W sets that suffered from interference, so color TV was not 100% backward compatible.
MrBlc Duh, hence why i said “it hasn’t been 110V for a long, long time”, so clearly I know it used to be the standard. But in that case, the OP would have needed to say that it was 50Hz/220V, since that nominal voltage has since changed, too. My point was simply to remind people that it’s 120V now, and has been for the better part of a century. It’s illogically irritating to me that people still call it 110V so long after that ceased to be correct.
The U.K. switch to colour was also a switch of resolution. The original 405-line system was extremely early electronic TV and, a bit like NTSC, had made design choices that were less than optimal in hindsight. 405-line (Marconi) TV was also tested with NTSC colour, but it was never pursued as the outcome wasn't very good and better technology was around the corner. Meanwhile, in France they had adopted 819-line B&W television (HD in the '50s!), which was tested with SECAM colour and proved to be far too high-bandwidth to be practical with multichannel television. Several European countries had already standardised on 625-line black & white and that was the EBU (European Broadcasting Union) preferred format, so the oddball and older standards like 819 and 405 were dropped in favour of 625 by the mid 1950s. Colour TV in Europe was then only added to 'modern' 625-line systems. So there were plenty of TV stations that began as 625-line services, like RTE in Ireland, and broadcast in B&W for some time. Also, B&W TV sets remained cheaper, so even after the launch of colour they remained a budget option into the 1970s. British broadcasters also used only UHF for 625/PAL, so the previous Marconi 405-line TV was carried on VHF only. That was not the case elsewhere: e.g. PAL-I was broadcast on VHF in Ireland, and colour TV on VHF was quite normal in most of Europe, Australia, etc. for many years.
Interesting to know that the French tested colour with the 819-line system. Indeed, it was HD in the 1950s. Also, due to the high bandwidth (14 MHz channels!), in different areas they had to use both audio ~11 MHz ABOVE the video carrier and audio ~11 MHz BELOW the video carrier in what was otherwise the same frequency allocation, to get extra "channels". In (I think) Belgium, they had a compromise system that squeezed 819-line signals into 7MHz channels.
Boy, you bring back my memories of color TV. I used to replace a lot of color picture tubes back in the day!!! The old round ones, the triad or triangle gun, the inline 3-gun, and the Sony one, which was the easiest to align. I could tell you stories. Sonys were great color TVs, but the tuners were a pain in the neck, especially the UHF tuners. I had to rebuild many a tuner in my time. Electronic tuners fixed that issue. I still have 3 working NTSC TVs in my house, two Sonys and 1 Zenith, and the one Sony is about 40 years old and still going strong!!!! And yes, I was a TV repairman!!!
I first saw a colour TV in a department store in the UK just before Christmas 1966. By late 1966 BBC2 was regularly broadcasting in PAL colour, though at the time the channel only showed 'trade test' films in colour. These allowed the retail trade to become familiar with the technology and enabled consumers to see colour TVs and to buy them before the official launch of colour on 1st July 1967. By spring 1967 several of the scheduled BBC2 programmes were also being broadcast in colour on an unofficial basis well ahead of the July launch date. But yes, it was indeed coverage of Wimbledon that marked the first official scheduled broadcast in PAL.
The funny thing is I've got an effect I've been working on for usage in "retro style" games that, to a reasonable extent, sort of simulates all this. In fact it literally takes an RGB image, converts to YIQ space, and then it produces a pure black-and-white signal by taking Y and modulating it with I/Q using sine and cosine carriers - then later uses those same sine/cosine carriers to reconstruct the YIQ from that pure black and white signal and convert back to RGB. And yes, if you view that signal output in pure black and white (as I've done for debugging purposes), it looks *precisely* like that weird dotted black and white pattern you refer to, although in mine it also tends to manifest a diagonal "stripe" pattern as well (due to some other quirks of my implementation).
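The round trip described above can be sketched numerically. This is my own simplified model of the same idea, not the commenter's actual shader: encode one flat-colored scanline as a single "black and white" composite signal using sine/cosine carriers, then recover Y, I and Q by remodulating with the same carriers and low-pass filtering (here just averaging over a whole carrier period).

```python
import math

# FCC RGB -> YIQ conversion (NTSC primaries)
def rgb_to_yiq(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

N = 64                                  # samples in one carrier period
y, i_, q = rgb_to_yiq(0.2, 0.6, 0.9)    # a flat bluish color

# Modulate: composite(t) = Y + I*cos(wt) + Q*sin(wt)  -- a pure B&W signal
comp = [y + i_ * math.cos(2 * math.pi * t / N) + q * math.sin(2 * math.pi * t / N)
        for t in range(N)]

# Demodulate: multiply by the same carriers and average (a crude low-pass)
y_rec = sum(comp) / N
i_rec = sum(c * math.cos(2 * math.pi * t / N) for t, c in enumerate(comp)) * 2 / N
q_rec = sum(c * math.sin(2 * math.pi * t / N) for t, c in enumerate(comp)) * 2 / N
```

Because sine and cosine are orthogonal over a full period, both I and Q survive the round trip through the single composite channel, which is the same trick the real quadrature modulation pulls off.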
You do a remarkably good job of explaining a difficult subject. I was in the TV industry for 40 years and am still learning things about the NTSC (and competing) color system, including the difficult math. Luckily I only had to make it work, not derive the equations! BTW, while I agree that the technical stuff would benefit from more illustrations, yours is one of the better "talking heads," IMHO.
So much information in my head I want to tell you, but so little time to do it… The reason why PAL was not adopted everywhere (else) is that it isn't the superior system and it is a lot more expensive! The idea of PAL is that part of the color information is inverted every second line. Any error will, e.g., show up positive in the odd lines and negative in the even lines; the average error should be zero - in theory! By mixing the color information of two lines and taking the average, you get half the color resolution but eliminate the broadcast errors caused by interference while the signal is traveling to your aerial. In reality, the saturation of the colors degrades with the error. So with PAL, a phase shift will still give you correct colors, but they are less intense. It just makes the problem less noticeable, not vanish. Another problem is the cost. How can you mix the color information of a line with the information of the previous line? The previous line is a matter of the past, long gone! Digital video storage was impossible in the early PAL colour TV era, and when it became possible, the circuit board in a studio broadcast machine which does it digitally sold for over $4000 (mid '80s). Even today those boards sell for more than $300 (used)! So a PAL TV contained an ultrasonic delay line: an ultrasound transducer injects the color signal into a crystal, where it is picked up by an ultrasound receiver 63.942 µs later. This method was replaced in the late 1980s by sophisticated chips but was still around until 1995. This crystal is a real jewel, so while NTSC was nicknamed "Never The Same Color", PAL was known as "Pay Additional Luxury". The problem is that if the video runs a bit faster or slower, the delayed color line can't match up with the current one. There are manufacturing tolerances as well, so it is impossible to get a perfect match. This also decreases the saturation of the result. But there is a simple remedy: crank up the saturation.
So while the picture still looks good, the color resolution is decreased. You lose half of the color information in the PAL system and then you lose more due to tolerances. But this is still OK, since the colour resolution of the human eye is only about 1/3 of its brightness resolution. So PAL, which is actually really bad for the colors, is still OK for the human eye. SECAM tries to fix this by sending only one of the two color components on the carrier, alternating line by line. You still need the delay line, which never quite matches up, but as a result you get no loss in saturation and you only lose half of the color information. Since the human eye can't notice that the colors don't quite line up with the brightness pattern, it also works well for the human eye - just with more accurate colors than PAL. Since the late 1980s, NTSC became superior to all the other formats! They just add a line of test patterns above the visible part of the screen. The micro-computerized TV knows how this line is supposed to look, can detect all the errors, and then drives simple filter circuits to compensate for them. So modern NTSC TVs have much more color information and can eliminate errors just as well as (or better than) classic PAL or SECAM. PAL+ also does the same nowadays, so NTSC and PAL break even here. SECAM, on the other hand, is missing half of the color resolution and there is no way to restore it. So the formerly superior SECAM is now the worst system, while PAL and NTSC share the same higher quality. Fun fact: the video encryption system "Nagravision" also scrambles the PAL+ line. When PCs became powerful enough (>400MHz), by identifying where the PAL+ line was, they could rule out 90% of all the different ways the picture could be scrambled. By simple trial and error they could figure out how the picture was scrambled and decode most European pay-TV channels in real time. It didn't take long until all EU pay-TV stations went all-digital.
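The "phase error becomes a saturation loss" behaviour of PAL's line averaging can be shown with complex numbers. This is my own toy model, treating the U/V chroma pair as a phasor: a differential phase error rotates both lines the same way, but because V is inverted on alternate lines, averaging turns the hue error into a small drop in magnitude.

```python
import cmath
import math

U, V = 0.3, 0.4                       # transmitted chroma components
err = math.radians(10)                # phase error picked up in transmission

line_a = complex(U,  V) * cmath.exp(1j * err)   # normal line, rotated by error
line_b = complex(U, -V) * cmath.exp(1j * err)   # V-inverted line, same rotation

# The receiver re-inverts V on line_b (a complex conjugate here), then
# averages the two lines: the error rotations cancel, leaving only a
# saturation loss of cos(err).
avg = (line_a + line_b.conjugate()) / 2
```

After averaging, the hue (phase of the phasor) matches the transmitted one exactly, while the magnitude is scaled by cos(10°) ≈ 0.985 - exactly the "correct colors but less intense" effect described above.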
Here in Sweden, one of our public TV stations at the time (one of only two that were allowed to air by our government) did a pretty well-known April Fools joke when color TV was introduced. They did an informative broadcast telling people watching that they could convert their black-and-white television set into a color one by stretching a pantyhose over the TV. Yes, people actually fell for it.
And here is why the delta shadow mask is not used for "normal living room" TVs: while the delta mask gives you good resolution and the electron guns are nicely packed tight in the CRT neck, it also swallows well over 80% of the beams. This is why a mid-sized monochrome TV can work with an acceleration voltage of 8kV or less, while a color TV the same size needs at least 21kV: you need much more energy in the beam to get adequate brightness. A Trinitron not only allows maximum resolution, it eats only about 40% of the beam energy (1/3 in theory), so you get more than twice the brightness out of your beams. But there are two major problems: 1) the aperture grille is made of thin wires which have to be really taut, which adds a tremendous force to the screen. The glass of the screen has to be a lot thicker to handle the forces. Just put a 14" Trinitron and a 17" delta monitor on a scale; the smaller Trinitron is a lot heavier due to all its extra glass. 2) The electron guns need to sit in a line next to each other, making the neck of the tube a lot wider, which in turn makes the deflection coils a lot larger and requires a lot more power for deflection. Distance is your enemy when working with magnetic fields; the field weakens with the cube of distance! And since the gun beams sit at different distances from the deflection coils, you need an extra-homogeneous field, which in turn requires even larger coils forming a "Helmholtz pair". A compromise is the slotted mask. Like the Trinitron, it eats less beam power (somewhat more than 50%, if I recall correctly) but doesn't add forces to the screen like the Trinitron does. So living-room-sized TVs are much cheaper and brighter using a slotted mask, while small TVs can still use a delta mask for better colors. Computer monitors have to have a delta tube to display an adequate resolution - or be Trinitrons.
This type of mask was the one in the early TV sets; it was called a delta tube. The later one was inline. The electron guns in the delta tube were arranged in a triangle; in the inline tube, in a line. The inline tubes had much better brightness, and the convergence of the colors was much better.
I know that YouTube is great for research, but I have literally never come across a YouTube playlist with this much information and depth on any subject, even down to practical illustrations. My goodness. I was looking for a 10-minute history of the invention of TV and now I feel like I have a degree in Television. Thank you so much for this content. This could actually be a publishable book.
3:13 Yes, all of you from PAL regions have actually been watching our movies in the wrong pitch. If you've searched for clips from your favorite movies and heard them with a lower pitch, you're actually hearing them in the correct pitch.
You could probably narrow that down to just UK, since every country had their own channels and movies were usually dubbed into local language. At most it would have been music or sound effects that were affected by the pitch shift.
I think today they use an electronic pitch shifter to fix this, shifting the frequencies down a couple of percent. With such a small change, it's unlikely to generate unwanted noise artefacts.
Mandolinic that solves the pitch but not the runtime problem. If you synchronize an NTSC and PAL version of a movie, I think you'll find that they will still fall out of sync when you leave them running.
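The numbers behind both the pitch and runtime discrepancies are easy to check. This is my own back-of-envelope arithmetic, not figures from the video: a 24 fps film played at PAL's 25 fps finishes about 4% sooner, and without correction the audio rises by roughly 0.7 of a semitone.

```python
import math

# PAL speed-up: 24 fps film material played at 25 fps
film_minutes = 120
pal_minutes = film_minutes * 24 / 25          # runtime shrinks to 115.2 min

# Uncorrected pitch shift, in semitones (12 semitones per octave)
pitch_shift = 12 * math.log2(25 / 24)         # ~0.71 semitone sharp
```

So even with a pitch shifter restoring the original frequencies, the two versions drift apart by 4.8 minutes over a two-hour film, which is the runtime problem that remains.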
Years ago digging about in my Nan's loft I found a wheel with red & green acetate panels in it attached to an ancient electric motor. I asked my Nan what it was & she told me it was my Grandad's experiment trying to turn black & white TV into colour. This suddenly makes sense! He'd died by this time so he wasn't around to ask but he used to build his own radios & I can remember him making microphones with me as a kid from scratch. He must've been a cool dude back in the day! Thank you for explaining this 2 colour method. That wheel diagram was exactly what I found in the loft that day!
Regarding chroma dots (and I hope I don't repeat anything you've already said, or bore anyone needlessly)... the dots you're seeing when viewing a colour signal on a high resolution B&W monitor is the chroma signal itself. ...I was going to go into the details of how the chroma signal is generated, but the comment got incredibly long and probably unintelligible. Suffice to say, the chroma signal is made up of a modulated sine wave. Such a signal has peaks and valleys in magnitude when you look at it up close. The overall height of these peaks and valleys (when you 'zoom out' and look at the amplitude of the signal) represents the saturation of the signal. If a B&W television doesn't filter out the chroma signal, it will draw the tiny peaks and valleys as dots on the screen. Highly saturated colours have a stronger carrier amplitude, and have a higher contrast between the peaks and valleys, making them more noticeable. The effect is magnified by gamma correction, because the peaks of the sine wave get stretched, making the dots even more visible.
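The peaks-and-valleys picture above can be made concrete with a tiny illustration of my own (not anything from the video): a chroma carrier riding on a flat luminance level pushes successive scanline samples above and below that level, and a B&W set that doesn't filter the carrier draws those excursions as bright and dark dots.

```python
import math

LUMA, SAT = 0.5, 0.25        # flat gray line plus a saturated color's carrier
samples = [LUMA + SAT * math.sin(math.pi * x / 2) for x in range(12)]

# Render samples above the luminance level as '#' (bright dot), the rest '.'
# (the 1e-9 guard just absorbs floating-point noise at the zero crossings)
dots = ''.join('#' if s > LUMA + 1e-9 else '.' for s in samples)
```

A higher SAT value means bigger excursions and higher-contrast dots, matching the observation that highly saturated colours make the pattern more noticeable.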
I just found your channel a few days ago and am absolutely LOVING your content! The way you incorporate video effects to co-narrate each point really illustrates each point very well, and your analysis is both thorough and insightful (IMHO). Keep up the great work!!
erikig In the Netherlands the waiting was more a political than a technical issue. The government had spent a lot of money on B&W equipment and just didn't want to invest in color. The PAL system was invented in 1962 by Walter Bruch; the first PAL broadcast in most of Europe was not until 1967.
It wasn't affordable for most people until around then anyway. From what I've read, colour TV sales only overtook B&W ones in 1972 in the US and 1975 in the UK.
Also worth mentioning regarding the interleaving of Y and C signals is that when viewed on a spectrum analyzer, the luminance signal appears as a series of frequency peaks with gaps between them, such that when the chrominance signal is well designed, it will fit in those gaps with little overlap.
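The gaps line up because the NTSC subcarrier was chosen as an odd multiple of half the line rate: luma energy clusters at integer multiples of the ~15.734 kHz line frequency, and the chroma sidebands land halfway between those peaks. The snippet below just checks that standard arithmetic:

```python
# NTSC-color parameters: the line rate was defined as 4.5 MHz / 286,
# and the subcarrier as 455/2 times the line rate.
line_rate = 4_500_000 / 286          # ~15734.27 Hz horizontal line frequency
subcarrier = 455 / 2 * line_rate     # ~3.579545 MHz color subcarrier

# The subcarrier sits exactly between the 227th and 228th luma harmonics:
harmonic_offset = subcarrier / line_rate   # 227.5
```

That half-cycle offset is also why the subcarrier phase flips from line to line, which a spectrum analyzer shows as the chroma energy interleaving neatly into the luma's comb of harmonics.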
Hey, I love your videos. Please keep adding subtitles to the videos - I'm learning English and they help a lot in understanding the video completely, because it has so many different, specific words, so the subtitles really help!
I am from the UK and just about old enough to remember the crossover between 405-line and 625-line transmissions. For a while people would have dual-standard TVs capable of receiving both 405-line transmissions on VHF and 625-line transmissions on UHF. This led to a messy mix of antennas on the roofs of houses. Back then, renting your TV set was very common in the UK, rather than owning your own, so the changeover did not really affect people too much; the rental company simply swapped your set when the time came. And remembering just how often we had to call out the TV repairman back in those days, I think renting was probably a good idea.
I love this channel so much. I'm always excited to see a new upload. As an engineer, this is like eating a perfectly nice steak, but for my brain, and it's a youtube video, and I'm bad at analogies.
I remember well the dual standard televisions I helped install in my youth. They were capable of both 405 & 625 lines with complicated switching between the two systems. I remember assisting the installation of a Philips 26" dual standard colour TV that was huge! It was mostly valve (vacuum tube) based and soaked up over 500 watts from the mains! Watching a black & white programme on the 405 line system on the 26" screen was like viewing it through a Venetian blind! It took a couple of hours for the TV guy to set up the convergence, purity, barrel/pincushion distortion, settings, etc. after he had degaussed the shadowmask tube with a mains powered coil. We were amazed when colour sets could be imported from Japan, plugged in and work without ANY setting up at all!
Phase alternation was actually tried by the NTSC. At the time, however, no 1-line video delay technology was available, so the cancellation of phase errors would have depended on the eye doing the cancellation. This is not perfect, and could result in visible flicker or horizontal line patterns in the picture. It was decided that a reliable improvement could not be obtained, and the technique was abandoned. When phase alternation was adopted for PAL, economical acoustic delay lines for use in receivers were available. It was found that poorly operating receivers could produce visible horizontal line patterns, which were nicknamed "Hannover bars" by some, after the city where PAL was invented. These same delay lines were used in top-line NTSC receivers to make "comb" filters that reduce the interference between chroma and luma. Comb filtering was much more difficult in PAL receivers and generally not attempted, due to one phase of the subcarrier not alternating between lines while the other phase did.
A version of the color wheel system (at the recording end) was used to send color video back from the Apollo Moon missions, because a three-tube color video camera was just too finicky and bulky to send, but there were usable monochrome cameras that could be fitted with a mechanical color wheel. The interleaved signal was converted to NTSC on Earth using a whole lot of old-school analog video wizardry. I think the earlier missions only had the color-wheel camera inside the spacecraft, but later ones actually took them out on the lunar surface to send back live color TV. I saw a webpage somewhere that argued that the format was basically identical in some sense to Col-R-Tel.
Yes. The Apollo 11 mission used only B&W, running at 320 lines and 10 fps, and up-converted on Earth. Colour wheel was used subsequently. Wikipedia has a great article on this; "Apollo TV Camera".
When I was in London in 2002, one of the channels ran "The Matrix" and it was a WEIRD experience for me. Having seen the movie a whole bunch of times, I could instantly tell that something was strange about the audio. Everything was a little too quick. I was certain of it, as my friends and I would do Agent Smith impressions all the time, and the exact timing of his dialogue is key. I don't know if they still do this, but it was very interesting to see. Um, hear.
I love learning about this kind of stuff... the geeky questions I pondered as a kid before I had free access to the internet. You're well spoken - thanks for putting in the effort.
Colour recovery from chroma crawl was theorized by James Insell after spotting colour breakthrough on UK broadcasts of telerecordings of Dr Who programmes, and the idea of what was happening was developed further with Steve Roberts of the Dr Who restoration team. Richard T. Russell then wrote the software that actually put the theory into practice. I cannot get my head around the maths, but my understanding is that at present such colour recovery is believed to be possible only on PAL source material. I believe this is due to some aspect of the phase of the colour burst being impossible to recover from NTSC signals; I would not rule it out, however. There are a lot of recordings on which colour recovery is not possible. The UK archives are full of anomalies, including programmes shot on colour equipment that were only ever available in monochrome due to industrial action at the time of production, when staff refused to operate with the saturation dials turned up. As a result, there are a few programmes with really faint colour, because the dials were not turned all the way down, or because their minimum setting was not actually zero! I should also point out that when home computers arrived, the Amiga was the only machine that exploited the NTSC and PAL systems to its advantage. Both varieties of the machine slightly underclocked their CPUs and other custom chips, to 7.16 MHz and 7.09 MHz respectively, so that they would operate at a speed that (through some maths) complemented some aspect of the video system. Having the CPU and custom hardware at these particular rates reduced timing overheads and opened up opportunities for exploiting the hardware further. You could generate a lot of effects and work with a lot of video applications in the professional domain that would otherwise have been impossible without a very expensive high-end system.
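The "some maths" behind those Amiga clock rates is commonly cited as a simple relationship to the colour subcarrier frequencies. I'm taking the multipliers as given here (the exact divider chains varied by model), so treat this as a sketch of the relationship rather than a hardware reference:

```python
# Colour subcarrier frequencies for the two systems
ntsc_burst = 3.579545e6         # NTSC subcarrier, Hz
pal_burst = 4.43361875e6        # PAL subcarrier, Hz

# Commonly cited Amiga master-clock relationships
ntsc_amiga_clock = 2 * ntsc_burst     # ~7.159 MHz ("7.16 MHz")
pal_amiga_clock = 1.6 * pal_burst     # ~7.094 MHz ("7.09 MHz")
```

Deriving the system clock from the subcarrier keeps the chipset phase-locked to the video signal, which is what made genlocking and cheap video effects practical on the Amiga.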
Quite often, programmes produced during the colour strike were rendered B&W by the simple expedient of actually removing the colour tubes from the cameras.
In every attempt I have made to learn about color analog TV, I've noticed sources abbreviate different highly technical stuff. I appreciate your willingness to speculate given the lack of definitive sources. Knowing the risk that I may have previously learned others' "errors", I believe (but am not certain) that you have not diagrammed the TV signal correctly at 6:25. May I posit: 1. You portray sync, then back porch, then color burst. I previously learned that the back porch comes AFTER active video and the front porch comes BEFORE active video, also carrying the ~3.58 MHz (NTSC) color burst. 2. You portray the front porch following the active video; I suggest that this is the back porch. 3. The front porch is the zero-volt sync that provides what's really known as the horizontal retrace (the raster sweep moves back horizontally and two lines down [remember the interlaced scan] for the next sweep). The color burst modulates the front porch to not more than two volts so as not to confuse B&W TVs. The active video provides 2V to 5V for black-to-white video. I'm still pretty sure I am missing a smaller voltage change between these two points (front porch and active video). Each line of active video comes with more front porches until the bottom of the field, followed by a back porch (really called the vertical retrace) to return to the top of the screen. Bonus: the back porch carries 2 characters (16 bits) of B&W dots for closed captioning, again at a voltage not high enough to confuse non-CC TVs. Thank you for reading my two cents, and feel free to reply and attack. Color analog TV was the single most complicated piece of circuitry in anyone's home and depended on the widest variety of other technologies. Information on how it worked for an audience of non-engineers must necessarily be fraught with not just errors, but summary data, guesses, and speculation. To be sure, I highly respect the host; he jives with, and taught me more than, much of what I learned elsewhere.
Whew!
I told you Camarena was more of a myth than a fact. It actually turned out worse than what I had already found out about him. Great video. I'm glad it hasn't gotten any hate from Mexican viewers who still believe the myth. BTW, we in Mexico almost had the bicolor system as standard. Camarena was on the board of Telesistema Mexicano, the foundation of today's Televisa, after the merger that created TSM with his own station XHGC channel 5 (guess what the GC stands for in the call sign). An electronics associate was ready to deliver the TV sets to department stores, but Camarena died in a car accident. This was a major setback for Camarena's system, and soon after, the government opted for the NTSC standard, probably in an act of spite against the privately owned TSM, as the government wanted to have their own TV network by forcing the networks into bankruptcy and expropriating them, which in the end happened with Channel 13. If Camarena and Televisa had succeeded, they would have become an even worse monopoly, as they would have provided not only the programming but the sets as well.
Not quite a myth, either. Color television was made by many people, making different systems and contributions. Camarena knew how to "sell" the idea in the US, and that's what he did. That's why he is so famous - that, and for integrating the new colour TV infrastructure in Mexico.
Something that may fit in here or interest you: fairly cheap colour LCD backlit displays, where the LCD areas for a colour are open while the RGB LED backlight is showing the appropriate colour. Since the backlight can be rapidly switched, you create a colour image from a simple RGB source and an LCD panel without complex pixel mapping - just by managing the timing and persistence of vision.
Interesting! I wasn't aware of that colour restoration from old video programmes that were recorded off TV screens to 16mm. Good thing the BBC kept these film reels in their vault to be scanned in HD years later. I wonder how much could be achieved if they were scanned at 4K? Maybe the results would be a much more precise restoration, with more detail of the dots preserved. The fact that even regular Full HD scans have brought such good results speaks for the film stock they used.
Mmh. It's really all a matter of taste when it comes to film. PAL introduces a timing discrepancy; 3:2 pulldown introduces a visual discrepancy. Based on your own diagram, 3 of 5 frames are correct, 2 out of 5 are awkward blended frames, and one out of 4 of the original film frames never gets displayed cleanly. Some people have a higher sensitivity to visual artefacts, some to temporal ones. Eventually this was solved more thoroughly, but that took a long time...
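The 3-of-5 / 2-of-5 counts above fall straight out of the pulldown pattern. Here is a minimal model of my own, with fields shown as letters: four 24 fps film frames become ten 60 Hz fields, i.e. five 30 fps frames, alternating 2 and 3 fields per film frame.

```python
def pulldown(frames):
    """Expand film frames into fields (3:2 cadence), then pair fields
    into the interlaced frames a viewer actually sees."""
    fields = []
    for n, frame in enumerate(frames):
        fields += [frame] * (2 if n % 2 == 0 else 3)
    return [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]

video = pulldown(['A', 'B', 'C', 'D'])
# -> [('A','A'), ('B','B'), ('B','C'), ('C','D'), ('D','D')]
```

Three displayed frames are clean, two mix fields from different film frames, and frame 'C' never appears on its own - matching the counts in the comment.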
Yes! Those early red, green, blue dots are what I used to see when I put my eye(s) against the screen of our 1960's color TV when I was a kid. Thanks for showing them, I was beginning to wonder if my recollections were inaccurate.
PAL wasn't that bad for watching TV programmes or films; cinema was only 24 pictures per second, for example. So 25 fps was fine for video, and in exchange you had better resolution. The problem was with video games: we played mostly video games from NTSC regions (Japan and America), but they came with the lower NTSC resolution and also the lower PAL fps (and frame rate is important in video games).
Strictly speaking, "PAL" was just the colour encoding system and didn't specify the frame rate. Apparently Brazil used (and still(?) uses) a variant called PAL-M which used PAL colour encoding but with 525 lines at 30fps (i.e. otherwise the same as NTSC). Ironically, "PAL" was so often used as a synonym for 625-line/25-fps that you sometimes saw it in cases where PAL encoding itself wasn't even used (e.g. a PS1 using a component colour connection). As far as "regular" 625-line/25-fps "PAL"(!) goes, it should be made clear for those who weren't there at the time that the problem with the frame rate and older computers and consoles wasn't primarily that the screen refresh was lower and more flickery. The bigger problem was that, since many games back then were both timed and synchronised to the frame refresh (e.g. using the inter-frame "vertical blank interrupt" for certain calculations and updates), NTSC games ran 16-17% slower on PAL systems, even if the CPU in the console/computer itself ran at a similar speed. (This was the case with the Atari 800, for example.)
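The 16-17% figure above falls straight out of the field rates. A minimal back-of-the-envelope sketch (nominal rates only; real NTSC is closer to 59.94 Hz):

```python
# Why a frame-locked NTSC game runs slower on a PAL machine: game logic
# advances once per vertical blank, so wall-clock speed tracks the field
# rate, not the CPU clock.

NTSC_FIELDS_PER_SECOND = 60
PAL_FIELDS_PER_SECOND = 50

# Fraction of intended speed the game actually runs at on a PAL display:
speed_ratio = PAL_FIELDS_PER_SECOND / NTSC_FIELDS_PER_SECOND
slowdown_percent = (1 - speed_ratio) * 100

print(f"runs at {speed_ratio:.1%} of intended speed ({slowdown_percent:.1f}% slower)")
# → runs at 83.3% of intended speed (16.7% slower)
```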
Excellent video, can't wait for the next one. Also, fun fact about SECAM: here in Poland we have used this standard until the end of the 90s, and later switched to PAL - I always wondered why, and never got the answer.
Eastern Bloc countries used SECAM because it made it harder for people to watch TV from the democratic West - the only Western country which used SECAM instead of PAL was France, and that was too far away from Eastern Europe for its programmes to be picked up. All the Western countries next to Warsaw Pact ones used PAL, so when the Warsaw Pact chose SECAM it made tuning in to naughty Western TV programmes harder. en.wikipedia.org/wiki/SECAM#/media/File:PAL-NTSC-SECAM.svg en.wikipedia.org/wiki/SECAM#The_spread_of_SECAM "The adoption of SECAM in Eastern Europe has been attributed to Cold War political machinations. According to this explanation, East German political authorities were well aware of West German television's popularity and adopted SECAM rather than the PAL encoding used in West Germany. This did not hinder mutual reception in black & white, because the underlying TV standards remained essentially the same in both parts of Germany. However, East Germans responded by buying PAL decoders for their SECAM sets. Eventually, the government in East Berlin stopped paying attention to so-called "Republikflucht via Fernsehen", or "defection via television". Later East German-produced TV sets even included a dual standard PAL/SECAM decoder." After Communism collapsed, people probably changed to PAL because PAL sets were more widely available. Though if you were going to stop people watching Western TV, it would have been better to have a completely different, incompatible standard; SECAM TVs can still see a PAL broadcast in mono. But then the USSR and Warsaw Pact were never much good at innovation and manufacturing. Whatever the reason, all the Warsaw Pact countries migrated from SECAM to PAL after the Berlin Wall fell, communism ended and the Russians left. en.wikipedia.org/wiki/SECAM#Migration_from_SECAM_to_PAL
>After Communism collapsed people probably changed to PAL because PAL sets were more widely available. I already knew that the whole Eastern Bloc used SECAM; I just wondered why we changed from SECAM to PAL. That is exactly the explanation I wanted, and it sounds reasonable enough to be true.
Bero256, why was it easier? SECAM and PAL are kind of similar; the only things that differ between them are the sound being placed in another part of the signal and the colour being coded a different way. Frame rate and number of lines - the two things that could actually make a difference when editing video - are the same in both SECAM and PAL.
In the 60s you had to buy a special BBC2 aerial (antenna) to receive that channel, so you could see which houses had it by looking for the extra aerial on the chimney. After 625 lines became standard across all channels, the term "BBC2 aerial" faded into history. TV engineers I worked with back in the day quipped that PAL meant "Picture Always Lovely", usually followed by some derogatory NTSC backronym.
When NTSC color broadcasts began, many viewers experienced tint issues. The tint issues were solved for people in Europe by introducing PAL instead of NTSC. In PAL, the same disturbance of the signal that causes NTSC colors to go wrong causes only a slight decolorization of the image. In other words, in case of bad reception, with PAL you see a picture that's a mixture of the correct color picture and a black-and-white picture. That's far more acceptable than the tint errors of NTSC, since we were used to watching black and white before NTSC and PAL. So, at that moment, PAL was better than NTSC. It has remained so ever since. End of story. No! A game changer happened. Several redundant game changers happened. The first one is cable TV. It's not just that cable TV has better reception - it should have, though sometimes you can get a bad signal from the cable provider. The color errors of NTSC happen because of multipathing. The reception of an OTA signal can be as bad as it gets, but if there's no multipathing going on, the colors are correct with NTSC. All too often, though, there is multipathing OTA. Cable TV and satellite TV cannot suffer multipathing. So when watching TV over cable or satellite, NTSC always produces the correct color, whether the reception is good or bad, and PAL has no de-saturation issues. Another game changer is the plus package. There's PALplus and NTSCplus. They're downward compatible with PAL or NTSC respectively, the way NTSC and PAL are downward compatible with b & w. With the plus package, the color remains stable even when multipathing is going on. These changes caused the problems of PAL and NTSC to go away, and NTSC is no longer worse in the way it once was. Now we can look at more minute details of both systems and find that PAL has disadvantages. This www.hawestv.com/mtv_2color/mtv_2colorP3.htm website states that PAL has a lower "gamut" than NTSC. In NTSC, Q is broadcast with less bandwidth and less signal strength than I.
Q can be broadcast with less because we don't see the difference. In PAL, U and V use the same bandwidth; in consequence, I and Q use the same bandwidth. It can't be lower-sideband modulation but must be double-sideband modulation. This means I must be broadcast with less detail, a difference which can be seen. The pattern of dots caused on a b & w screen, as shown at 5:35, was more intense in PAL than in NTSC. For one thing because Q was transmitted with lower amplitude; on the other hand because etc. In the end, NTSC was the better system. At first, we waited for so long to introduce colour TV here in Europe, just to have a better system. Then our glory didn't last long.
My mom had one of the first TVs in Tacoma, WA. Later, when color TV came out, her family members would fight over the tint, and one (an uncle?) would just turn it back from color to B&W. I never understood this, but your video explains it all - the dependence on phase alone, the inconsistencies of vacuum tubes, and variations in Earth's magnetic field caused problems that my youth never exposed me to.
One of the various issues with NTSC, PAL and SECAM is the ability to maintain quality over long cable transmission (esp. in the days before satellites)... ironically ..apparently.. NTSC was more stable and easier to manage in such conditions ..perhaps because it could be rectified more easily. SECAM was notoriously difficult to use in the studio (chroma key, for instance, is impossible), so even in France programmes were created in PAL and transmitted in SECAM. As for frame rate.... many UK productions were filmed at 25fps for obvious reasons. Of course, in the UK during the 50s the BBC experimented with 405-line NTSC colour ... but in the end the decision was made to hold off colour until 625-line came on stream. I suspect because 405-line NTSC displayed on sets of the day looked as good as 625-line, in an era when CRTs had relatively low resolution. In the valve (tube) era, stability of transmission chains was a big issue for the BBC and I suspect that was also a major deciding point ..but also I get the feeling the Beeb simply did not want colour at the time. ITV was pushing hard! Advertisers wanted colour. As for NTSC vs PAL.... forget it.... side by side, NTSC-M vs PAL-G is all about resolution and stability... and colour accuracy... and PAL-G wins both, but flicker is a more subjective issue... I have never been bothered by 50Hz flicker... I have lived with it all my life... these days of course.... 525-line NTSC at its best is good, but somehow reds and oranges never look right to me ... start an NTSC DVD and the WB logo looks orange, not gold... purely subjective...
Regarding the first paragraph... NTSC's hue fluctuation issues were only ever an issue in over-the-air broadcasts. Once put into a wire, it was rock-solid. Regarding the last... I suspect the color issue was due to the display not really being set up for NTSC image reproduction, as I've never had those issues on my American television. If your display is configured for PAL's different color gamut and gamma curve, NTSC sources will look wrong. And those adjustments aren't controls that were ever readily accessible to the end user.
SECAM used a narrow-band FM carrier for the color. SECAM also only transmitted one color component per line, which required a line buffer (delay line memory) to store one of the color components while the other was being transmitted. NTSC/PAL didn't need this since they used QAM for the color components. QAM is a bit tricky to understand, but basically, if you look at a sine wave, there are spots where the wave crosses the zero state that don't carry any information. QAM takes another AM wave on the same frequency, shifted 90° in phase, so it fits inside those gaps and you can have two signals in the same space. With FM, since the gaps are always moving around, you can't do that, so to get both color components with FM you need the line buffer. This also cuts the vertical resolution of the color signal in half, but since the horizontal resolution was already crap to begin with, that's not a big issue, though it is still worse. SECAM could theoretically have the best color reproduction, but it was terribly complicated and was vulnerable to the FM cliff effect on the color signal.
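The quadrature trick described above can be demonstrated numerically. A toy sketch, not real NTSC levels or frequencies - the carrier is scaled down and the two "chroma" values are held constant so the recovered numbers are easy to check:

```python
import math

# QAM: two signals share one carrier frequency by riding on a cosine and
# a sine 90 degrees apart. Multiplying the combined signal by each
# reference again and averaging (a crude low-pass) recovers each
# component independently, because cos*sin averages to zero.

fs = 1_000_000                  # sample rate, Hz
fc = 50_000                     # toy carrier (stands in for 3.58 MHz)
n = 10_000                      # 10 ms of samples = 500 whole carrier cycles

i_in, q_in = 0.4, -0.2          # constant "chroma" values to encode

i_acc = q_acc = 0.0
for k in range(n):
    theta = 2 * math.pi * fc * k / fs
    sample = i_in * math.cos(theta) + q_in * math.sin(theta)  # QAM signal
    i_acc += sample * math.cos(theta)    # multiply by in-phase reference
    q_acc += sample * math.sin(theta)    # multiply by quadrature reference

i_out = 2 * i_acc / n           # averaging removes the 2*fc terms
q_out = 2 * q_acc / n
print(round(i_out, 6), round(q_out, 6))  # → 0.4 -0.2
```

Both values come back cleanly because the two references are orthogonal over whole cycles - the same reason the TV's synchronous demodulator can separate I and Q.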
kargaroc386 Didn't SECAM also encode video signals in YDbDr format, which has luma, along with two separate chroma subcarriers that ride along the luma channel in a composite signal? I know it was also the only analog composite video standard to do this along with both vertical and horizontal subsampling thanks to having two delay lines instead of just one like PAL. I just wanted to verify.
No. Both PAL and SECAM TV sets have 2 delay lines. One is for correcting luma/chroma delay and is used in NTSC too (NTSC uses another I/Q delay line). The second is for chroma. In both systems it is used for adding adjacent lines together - in PAL to correct phase error, in SECAM to join the red and blue chroma components.
@@xsc1000 This answer is a bit confusing. All color receivers require a short luminance delay line of one microsecond or so because there is a delay of the chroma due to the narrow bandwidth of the chroma circuits. The other delay line that is discussed is a much longer one-line delay that is used for other purposes: in NTSC, as a comb filter; in PAL, as a phase error canceller; in SECAM, to make both chroma components available simultaneously.
Great video. However, as has likely been pointed out, the frame rates between the US and Europe - 60 fields per second for NTSC compared to 50 for PAL - are down to our European mains frequency of 50Hz compared to yours at 60Hz, with each system running two fields per frame in good old analogue. That's the reason.
lol, resolution over framerate. And now everyone wants framerate over resolution. I know they're not the same thing (audio/video vs video games), but still.
Besides the difference between video and games: back then, resolution was the weak point. Having 25-30 fps is fine for TV and movies, and was plenty for the games of that era. But 525/625 lines is pretty bad, especially with analog signals and CRT displays. Today, resolution has improved to the point where it's usually not an issue.
4K rocks; anything more than that is pointless unless you have a really big screen up close. As far as frame rates go, there is a slight case for 120/144Hz over 60; anything more than that, though, is even more pointless.
They're completely different because when watching TV you don't need to worry about how quickly you see a response to your input. Seeing a faster response to pressing a controller button makes the controls feel more precise, and that's the primary reason gamers prefer higher frame rates.
If I remember right, the worst thing about SECAM is that they didn't invert the luminance signal. In NTSC, a strong signal means "dark", but in SECAM it means "bright". RCA figured that snow in the bright areas (where the signal would be weaker) would be less annoying than snow in dark areas.
SECAM had two color carriers and was capable of both vertical and horizontal subsampling, making it the first color, analog, composite video, television standard to do this. I think SECAM looked the best.
+Scott Larson SECAM is a colour system; it does not define the underlying b&w signal. It's true that France uses a system where amplitude and brightness are reversed relative to all other systems, but in Eastern Europe it is (or was) used with systems D/K and B/G, which are the same as in some PAL countries.
The SECAM specification certainly does define the underlying monochrome signal. All color systems rely on a specific monochrome system otherwise the color system won't work. That's why Brazil doesn't use PAL. It uses a variant called PAL-M because PAL as used in the rest of the world wouldn't work with the NTSC-based monochrome system.
+Scott Larson Sorry you're in error. Any colour system can be adapted to any monochrome standard. Not all possible combination exist in real life, but they are theoretically possible. SECAM has been used with ITU standards L, B, G, D and K, PAL with B, G, D, K, I and M. The latter is the combination used in Brazil, and that is absolutely a PAL standard. And this is also why it's called PAL-M, it means PAL on top of ITU M. Using the same naming system, the standard for UK is PAL-I, most of Western Europe PAL-B/G, France SECAM-L, Eastern Europe SECAM-D/K or SECAM-B/G, mainland China PAL-D/K. That is what those combinations mean. It does not mean they're "not really PAL". And only system L as used in France is the one where luminance is inverted, all the SECAM variants used in Eastern Europe aren't. If there was a country which used NTSC on top of one of the 625 line signals, it would be e.g. NTSC-B/G, or in the UK NTSC-I (I think the BBC seriously considered this option). The BBC also tested all of the three colour systems with their 405 line standard A. If they had gone that route, it would be NTSC-A, SECAM-A or PAL-A, whichever it would have been. en.wikipedia.org/wiki/Broadcast_television_systems#ITU_standards Note that list defines as one of its parameters the frequency of the chroma subcarrier, but it doesn't specify how it is to be modulated.
I was wrong! I thought that, with my background and hobbyist research, this channel could not tell me anything new about analogue television. But reconstructing color out of monochrome artifacts? I know these artifacts from first-hand experience but would not have thought it possible to derive the color from them. That is ... fantastic!
No I didn't and it is weird seeing myself appearing on a much more popular channel. I still prefer the idea of speeding up by 1 fps though, no resolution was lost and no frames had to be blended leaving the picture quality unaffected but I see that there are mixed opinions regarding sound.
Resolution isn't really "lost" in a 3:2 pulldown. The fields stay onscreen for either 1 full frame or 1.5 full frames' worth of time. There is always 1 full frame visible for each film frame within a 2-field period; some frames just stay longer for a third field. The only time resolution is lost is if you have captured a framerate (distinct points in time) greater than half the field rate. Blending of frames still happens with 50i; it just happens on *every* frame, since the top and bottom fields represent different points in time. Field 1b to field 2a is a blend, even if the transition between field 1a and field 1b is not.
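The field arithmetic in this thread can be laid out explicitly. A small sketch of the 3:2 cadence (the letters are just labels for four consecutive 24 fps film frames):

```python
# 3:2 pulldown: four 24 fps film frames become ten 60 Hz fields,
# i.e. five interlaced frame periods.

film_frames = ["A", "B", "C", "D"]
cadence = [3, 2, 3, 2]              # fields each film frame is held for

fields = []
for frame, held in zip(film_frames, cadence):
    fields.extend([frame] * held)   # A A A B B C C C D D

# Pair consecutive fields into the frame periods a viewer sees.
periods = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]
print(periods)  # → ['AA', 'AB', 'BC', 'CC', 'DD']
```

The mixed pairs AB and BC are the blended frames an earlier comment counts as "2 out of 5", and they are where the tearing on fast pans shows up.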
The complaint is that for fast action scenes (or fast horizontal pans), the 3:2 pulldown's effect of displaying a frame where the two fields come from different source frames can produce some horizontal image-tearing, and jagged edges on high contrast mostly-vertical edges. I've noticed this on commercial VHS tapes and on DVDs. I've never noticed it on analog TV broadcasts or VHS recordings from TV broadcasts. Probably because the broadcast itself is a bit fuzzy to begin with. But I still think it's better than speeding up the movie. My ears are very sensitive to a soundtrack that's sped up by even 4%. Everything will just feel "wrong" to me, even if I'm not sure why.
The framerate difference between PAL and NTSC is also dictated by the frequency of the AC current. Artificial light is pulsating at 50Hz in Europe, if you record at 30fps the image will flicker.
Way to call out the PAL fans. Europe was WAY behind with their color broadcast standards. It's like comparing a mid 60's Mercedes to a mid 50's Chevrolet. A lot gets improved in a decade. For mid 50's technology that was also backward compatible color NTSC was pretty amazing. And, as you pointed out, the few shortcomings were addressed by the time PAL started to take off.
If you think Europe was behind, you should have seen Australia. We didn't start broadcasting in PAL until 1974. On the other hand, we did handily beat the US for sheer speed of adoption once we did get colour TV going.
I know literally nothing about the history of the tech in this series of videos. Thanks for making such interesting content, and thanks for not being a channel that over-edits, instead trusting in your viewers' attention spans. I wish Linus would watch some of your content and have a word with his editing team.
It is true that some TVs could produce an interference pattern caused by the difference between the chroma carrier and sound carrier frequencies, but this was much coarser than the chroma dot pattern. The fine 'chroma dots' were simply caused by the colour subcarrier being superimposed upon the luminance signal. The colour system used the suppressed-carrier AM technique, which is why the dots disappeared in non-coloured areas. The reference frequency of the colour subcarrier was not a whole multiple of the frame rate, which meant that the exact phasing of the colour subcarrier was different on consecutive frames. This meant that the chroma dots would move, and hence be less noticeable by averaging out over several frames. It also meant that false colour artifacts, produced by components of the luminance signal being wrongly decoded as colour, would appear different on consecutive frames and produce a flickering effect.
8 Bits SECAM, when used properly, was effectively PAL but with basically perfect colour. Unfortunately, the way colour was handled also made SECAM a gargantuan pain in the ass to work with, and any kind of editing or VFX work required decoding the SECAM colour to a component format and then re-encoding to SECAM afterwards. Since almost all video editing equipment used composite formats, SECAM programmes with lots of editing or VFX work suffer from artefacts similar to PAL's. Eventually, many programmes just started being produced in PAL outright and converted to SECAM afterwards, which meant the composite colour artefacts were actually worse than plain PAL's, since the signal had to be converted first.
I own a Trinitron from I believe the late 80s. If there's anything I can do to help you out with your Trinitron video, let me know. Also I made a video response for another one of your videos. Still in editing, but when it's done I think you'll like it.
WONDERFUL!!! This video explained to me just how they colorize films. Now I know the colors are not really too far off from what they actually were. Thanks!
The method described in this video only applies to films taken of color TV programs via telecine. It wouldn't work for footage that never existed in color, or for black-and-white prints of films shot in color. In those cases, other methods would have to be used.
When you get into digital television (and probably radio too) they're just methods to transmit an MPEG Transport Stream. All the mess with frame rates, etc is left to the codecs. I gather most televisions can handle any common frame rate anyway. The only problem now is updating codecs e.g MPEG-2 video to MPEG-4 AVC (a.k.a "h.264").
NTSC vs PAL is the classic first-to-market, worst-to-market scenario, for exactly the reasons you described: the later technology gets to learn from the earlier mistakes. We experienced the same issue when moving to digital TV. The UK pushed really hard with the Freeview system. The result was lower-power transmissions (so as not to interfere with the analogue signals on nearby frequencies), so you needed bigger aerials to even receive it, and MPEG-2 encoding (hardware decoders for anything better were not available/too expensive to be putting in cheap set-top boxes), so the video quality was not much better than the original analogue. The rest of Europe started their transmissions years later, so their equipment supported MPEG-4 encoding and higher bandwidth. The result is that European standard-definition digital transmissions are better quality, and almost as good as the later-introduced high-definition digital TV in the UK, which required everyone to buy new boxes yet again.
Thanks for that. A couple of asides. 1) Over here in the UK, we are told that the alternating-phase burst was thought up in our Post Office Research Labs, which also gave us the analog computer, Enigma cracking etc etc. The math, drawings and stuff were given (not sold) to Telefunken on government orders as part of a rehabilitation program for (West) Germany. The Post Office department involved went into a fugue, especially since some years earlier they had been ordered to destroy all their computers (the first!) on orders originating in Langley, VA to stop the Roooshiuns fum geddnum. Although the unit kept working, pioneering scads of technologies and applications, they never regained their previous ascendancy. Increasingly, politics intruded into engineering considerations. Total f***up ensued. 2) Thanks for the wee tip of the hat to John Logie Baird. However, my understanding is that after his mechanical system was rejected, with its demonstrated two-color capacity, Baird threw himself into electronic television so hard he almost came out the other side. His hiring of Philo T Farnsworth wasn't the brain-skimming exercise US historians usually depict. Intellectually, they were partners. 3) One of the most curious aspects of British history is the insistence that radar was "invented" by Robert Watson Watt. Over a year before RWW did his first field trial, John Logie Baird had been assigned the first of a succession of aircraft for a still-classified series of experiments. It has to be either radar or a death ray, and JLB was a committed pacifist. 4) Before his death, John Logie Baird had conceived, designed and built an electronic three-color system, approved for introduction as soon as the war was over. He had also demonstrated a full-color optical 3D system. Publicity photos of this system in action may or may not be simulated. If the latter, I can't tell. If you want to look into this, please plague New Scientist magazine with your questions.
They set my feet on this trail.
I actually prefer NTSC. Watching PAL TV always bothered me just a little bit whenever I would visit Europe. Although the picture was a little bit sharper due to the higher resolution, the flicker just messed up my eyes. I can't really see the flicker at all on an NTSC screen unless I move my eyes around really fast or something. But on PAL it's just impossible to ignore the constant flicker and it always bugged my eyes out trying to watch it. I don't know, just a personal opinion but I much prefer NTSC even with the lower resolution and color issues. At least according to my eyes, the reduced flicker was well worth it. In any case, none of this is an issue at all on LCD/Plasma flat screen HDTVs. It's all moot now.
But the flickering has nothing to do with PAL/NTSC. It depends on the framerate, and the framerate depends on the mains frequency. So 50Hz in Europe and 60Hz in the US was chosen in the 30s-40s for B/W TVs.
I grew up with PAL; to me it's just the better colour system. American NTSC always looked bad until recently. I never had an issue with the flickering, even with a CRT; it's not supposed to affect you if you are a decent distance away from the TV screen.
I am 80 years old. I can remember the black & white days. I have also worked on an 800-line hi-res system (black & white only, used for a military briefing audience). I can also remember Popular Science mags saying the future would be a flat TV that fit on a wall. If WWII had not happened, who knows - but because of WWII, electronics at HF, VHF and UHF frequencies got cheaper, and so TV after the war was cheaper. Our first TV in 1948 was a round-tube set.
The thing you've completely missed with the 4% PAL speed up is that unless you had already watched the same movie/tv show in NTSC you would never notice, unlike the NTSC 3:2 pull down, which was noticeable with every panning shot.
It depends on the individual (and the television system to which one is accustomed - NTSC/ATSC, in my case). I barely notice judder on panning shots (and when I do, it's only mildly irritating). Conversely, I find a ~4.17% increase in the pitch of people's voices highly noticeable/annoying, even if I'm unfamiliar with the production and its cast. (I realize that others have the opposite experience.)
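The audible size of that speedup is easy to put a number on. A quick back-of-the-envelope, assuming no pitch correction is applied:

```python
import math

# Running 24 fps film at 25 fps speeds everything up by 25/24, raising
# audio pitch by the same ratio unless it is corrected.
speedup = 25 / 24
percent_faster = (speedup - 1) * 100        # the ~4.17% figure quoted above
semitones = 12 * math.log2(speedup)         # pitch shift in equal-tempered semitones

print(f"{percent_faster:.2f}% faster, {semitones:.2f} semitones higher")
# → 4.17% faster, 0.71 semitones higher
```

Roughly seven-tenths of a semitone is well within the range many listeners can notice on familiar voices, which matches the experiences reported in this thread.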
I personally hate both situations. The motion is far more pleasant with PAL, but the SPEEDUP is horrid! But I will say that I can live with the speed up and find it much less objectionable than the jerky motion from 24 to 30 FPS conversions. I have always lived in the USA and grew up with NTSC and always hated the motion since I first noticed in when I was little watching Star Trek and Lost in Space. I had not noticed it much before that. As a result, I have actually bought quite a few TV series in PAL versions due to the better video motion!
Either flaw is correctable, of course. For 24p material to which pulldown was applied, inverse telecine results in an exact reconstruction. (Many modern TV monitors can detect the cadence and perform this function on the fly.) Speedup can be reversed through software, including a DirectShow filter called ReClock. In the case of a DVD, I would assign preference to a 576i version for its higher resolution (provided that it was sourced from a master of that resolution or greater). Blu-ray largely eliminates the issue, as 24p material is typically released as such worldwide.
Europe did introduce color TV 10-15 years after the US, but still, in the first years, color TV in the US was so rare that we can conclude NTSC's color problems were the result of premature standardization. The system could still have been improved before standardization without much impact on the acceptance of color TV.
@Frank Silvers you are right in general, but you are missing context - I was responding to the idea that PAL was more robust because Europe introduced color TV 10-15 years after the US, and, even more important, I said that the US problem was premature standardisation. Your comment should be addressed to the nice guy who made the video, because he was the one making the comparison to Europe. If we want to be fair and exact, SECAM (also used in parts of Europe) predates PAL and solves NTSC's issues; PAL adapted SECAM's solution but within the NTSC signal structure.
Please do a follow up to this covering PAL-M used in Brazil that was 525 lines with PAL Colour and Subcarrier as well as FRENCH and RUSSIAN SECAM. FRENCH (France, not overseas holdings) used the lowest energy levels for SYNC and a reverse Luma polarity. Failing to cover these weird cases would really miss out on some interesting technology tweaks. ++ COVER NICAM STEREO and A2 STEREO ... noting that the US NTSC system for stereo for telly sucked ...
eyreland SECAM utilized the base 625/50/2:1 standard with two FM chroma subcarriers that rode alongside the luma signal in 'Y'Db'Dr fashion. SECAM utilized two delay lines to contain the additional FM chroma subcarriers on top of the base luma signal, and utilized vertical subsampling in addition to horizontal subsampling, with each of the individual two chroma components getting their own respective sample on every other horizontal line. I believe this made chroma crosstalk impossible despite SECAM being composite.
Of course there was chroma/luma crosstalk in SECAM. But because of FM, it looked different than in PAL or NTSC. There were 2 delay lines - one for luma, as in PAL or NTSC. The second was for vertical subsampling.
The framerate depends on the powergrid. In the US you have 60 Hz, in Europe we have 50 Hz. Movies are recorded at 24 fps, very close to PAL's 25 fps, so 30 fps is not really "better", just more convenient. In most digital video cameras, the "artistic" 24 fps option only exists for NTSC.
I read here a few complicated explanations of 'colour dot crawl' on a PAL or NTSC broadcast (they both do it). A good way to think of it is that when you modulate a carrier with information, you are left with the carrier and sidebands. Those sidebands run either side of the carrier from effectively DC to infinity, but they sit at multiples of the modulating waveform - they look like the lines in a diffraction grating. This is the crux of how the backward compatibility worked: it allowed the cleverly chosen colour sidebands to slot, like the teeth of a comb, into the luminance signal. A B&W receiver would just not see the colour, and this is why the NTSC colour(color) subcarrier frequency was chosen to be not 3.58 MHz but 3.579545 MHz ±0.0003%, so it ALMOST perfectly slots in. It doesn't quite - hence dot crawl.
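The comb interleaving works because the subcarrier was locked to the line rate. The standard NTSC numbers can be checked directly:

```python
# The NTSC colour subcarrier is 455/2 times the horizontal line rate: an
# odd multiple of half the line frequency, so chroma energy falls exactly
# between the luminance sidebands, and its phase flips from line to line
# (which is why the residual dots crawl rather than sit still).

f_sc = 315e6 / 88               # 3,579,545.45... Hz, the exact NTSC value
f_line = 2 * f_sc / 455         # ~15,734.27 Hz horizontal line rate

half_line_multiple = f_sc / (f_line / 2)
print(round(half_line_multiple))  # → 455, an odd integer
```

The 315/88 MHz form is the exact defined value; 3.579545 MHz is the usual rounded quote.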
@@ellenorbjornsdottir1166 sure thing. PAL color system gave us an edge in picture quality, however we suffered in the VCR era when imports from Japan and US had to be converted in order to work in color.
In Argentina we used PAL-M too! I was searching the comments and wondering what was the difference with regular PAL since he didn't mention PAL-M. Now I got my answer, thank you.
@@rogeriorogerio1007 Yes you are right, I was just doing some googling and it was PAL-N, not M. Apparently it wasn't compatible. Apparently PAL-M had a higher refresh rate while PAL-N was more similar to the European standard but with more scan lines. Anyway, I was wondering what were the differences and now I know.
You don't need to artificially manipulate the Y signal since it relates to color brightness. The dot pattern is indeed caused by the chroma signal. You can figure that out using a bar pattern generator and an oscilloscope: the dots will be more visible in the bars where the chroma signal has more amplitude.
RE: chroma dots @6:27.
Your speculation is correct. The chroma signal is encoded by "wiggling" (modulating) the Y signal within that range.
I find it helps to understand this by looking at the signal from the perspective of the TV doing the decoding.
A black and white TV has two radio demodulators, one that locks onto the sound sub-carrier to demodulate the sound (which is FM encoded) and one to lock onto the video sub-carrier and demodulate the video signal (aka, Luminance or Y), which is actually AM encoded. These two radio demodulators are essentially independent, apart from the fact they are tuned to two signals that are right next to each other on the dial.
Early TVs are actually really simple, electronically. All they do is take the output of the video demodulator (which is a nice voltage between 0 V and 1 V), detect the horizontal and vertical sync pulses (which are used to lock the frequency/phase of the two flyback transformers controlling the CRT's vertical and horizontal scanning) and then send the remaining signal directly to the CRT's electron gun to control the brightness of the electron beam at that position. 0.33 V represents black, 1 V represents white, while 0 V represents a sync pulse.
The most logical way of adding color to such a system would be to add a 3rd radio demodulator, one which picked up a 3rd sub-carrier and decoded a chroma signal. But this would make the TV signal take up more bandwidth, and the FCC had already allocated 6 MHz channels. Additionally, you would have to replace all the black and white video equipment in the recording studios and transmitters to carry and transmit this extra signal.
So instead, the two demodulators are left untouched and the chroma signal is actually modulated on top of the Y (aka Luminance, Black & White) signal to create a combined Luminance/Chrominance signal (which replaces the Y signal and decodes fine as a Y signal on old Black and White TVs). Color TVs actually have to demodulate the combined Y/C (Luminance/Chrominance) signal a second time to extract the chroma I and Q signals.
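That second demodulation is a quadrature (synchronous) detection: the chroma band is multiplied by two carriers 90° apart, phase-locked to the color burst, then low-pass filtered. A toy numerical sketch of the idea (my own example, not any real decoder's code; real NTSC also defines the I and Q axes at specific angles relative to the burst, which I'm glossing over here):

```python
import numpy as np

F_SC = 3.579545e6            # color subcarrier (Hz)
FS = 4 * F_SC                # sample rate: exactly 4 samples per chroma cycle

def extract_iq(chroma, t):
    """Multiply by two burst-locked carriers 90 deg apart, then low-pass
    (here just a mean over whole cycles) to recover an I/Q-style pair."""
    i = 2 * np.mean(chroma * np.cos(2 * np.pi * F_SC * t))
    q = 2 * np.mean(chroma * np.sin(2 * np.pi * F_SC * t))
    return i, q

# A 0.07 V chroma wiggle at 225 deg (the bright-green example below)
t = np.arange(1024) / FS     # exactly 256 subcarrier cycles
chroma = 0.07 * np.sin(2 * np.pi * F_SC * t + np.radians(225))
i, q = extract_iq(chroma, t)

amplitude = np.hypot(i, q)                 # recovers saturation (~0.07 V)
hue = np.degrees(np.arctan2(i, q)) % 360   # recovers hue (~225 deg)
```

Averaging over whole subcarrier cycles makes the double-frequency terms cancel exactly, which is why the crude mean works as the "low-pass filter" in this sketch.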
The chroma signal is encoded in the high frequencies of this combined Y/C signal. A black line on the TV would have a Y/C signal with a constant 0.33 V across the entire line. A white line would have a Y/C signal of a constant 1 V across the entire line. For a solid colored line (say bright green) the Y/C signal will fluctuate between 0.93 V and 1.07 V at a rate of 3.58 MHz. The phase difference between those fluctuations and the 3.58 MHz color burst signal at the start of the line encodes the hue of the color, with 225° representing green (178° for yellow, 100° for red, 0° for blue). The height of the fluctuations represents the saturation of the color (±0.07 V represents 100% saturation, and ±0 V would be 0% saturation, or grayscale). The average height of the Y/C signal of course represents the luminance.
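Those voltages can be sanity-checked with a toy encoder. This sketch (my own, assuming a ~52.6 µs active line) builds the Y/C waveform for a solid-color line by simply adding a burst-relative sine onto the luma:

```python
import numpy as np

F_SC = 3.579545e6        # chroma subcarrier (Hz)
ACTIVE_LINE = 52.6e-6    # assumed visible duration of one scanline (s)

def solid_color_line(luma_v, saturation, hue_deg, samples=4096):
    """Y/C voltage for a solid-color line: a 3.58 MHz sine whose
    amplitude is the saturation (0.07 V = 100%) and whose phase
    relative to the color burst is the hue, added onto the luma."""
    t = np.linspace(0.0, ACTIVE_LINE, samples, endpoint=False)
    chroma = 0.07 * saturation * np.sin(2 * np.pi * F_SC * t + np.radians(hue_deg))
    return luma_v + chroma

# Bright green per the figures above: Y = 1.0 V, 100% saturation, hue 225 deg
line = solid_color_line(1.0, 1.0, 225)
print(round(line.min(), 3), round(line.max(), 3))   # swings ~0.93 .. ~1.07 V
```

Setting saturation to 0 collapses the line back to a flat 1.0 V, i.e. plain white on a B&W set, which is exactly the backward-compatibility property described above.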
For complex lines with multiple colors, the fluctuations in Y/C won't be a constant 3.58 MHz; they will speed up and slow down rapidly to shift into the correct phase for the color at each location on the screen.
To decode this complex signal, Color TVs first have to split the Y/C signal by frequency. Early Color TVs would have used a notch filter, with a range of frequencies around 3.58 MHz (say 2.8 MHz to 4.1 MHz) being extracted as the chrominance, while the rest (say 0 to 2.8 MHz and then 4.1 MHz to ~5.5 MHz) is interpreted as luminance.
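A crude way to sketch that split in code, with an FFT brick-wall standing in for the TV's analog notch filter (the band edges here are the same guesses as above, not standardized values):

```python
import numpy as np

def notch_split(yc, sample_rate, lo=2.8e6, hi=4.1e6):
    """Split a composite line into luma and chroma by frequency:
    everything in lo..hi is taken as chroma, the rest stays luma.
    Real sets used analog filters; a brick-wall FFT is just a sketch."""
    spectrum = np.fft.rfft(yc)
    freqs = np.fft.rfftfreq(len(yc), d=1.0 / sample_rate)
    band = (freqs >= lo) & (freqs <= hi)
    chroma = np.fft.irfft(np.where(band, spectrum, 0.0), n=len(yc))
    return yc - chroma, chroma

# Demo: flat 0.5 V gray plus a 0.07 V chroma wiggle at 3.58 MHz
fs = 4 * 3.579545e6
t = np.arange(1024) / fs
yc = 0.5 + 0.07 * np.sin(2 * np.pi * 3.579545e6 * t)
luma, chroma = notch_split(yc, fs)   # luma ~0.5 V flat, chroma ~the wiggle
```

The cost of the notch approach is visible in the function itself: any genuine luminance detail that happens to fall in the lo..hi band is thrown into the chroma output, which is exactly the cross-color problem comb filters were later built to fix.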
Frequency in a luminance signal is the rate of change of brightness across the line. A solid color across the line would have a frequency of 0 Hz. An image with vertical B&W bars across the screen, each about 1/10th of the screen wide, would have a frequency of 0.2 MHz. With vertical bars 1/100th of the screen wide, the frequency would be 1.9 MHz. If you had vertical bars which were about 1/188th of the screen wide, a color TV would actually interpret them as color information and show a solid color. (A number of 8-bit computers like the Apple II actually took advantage of this to create color.) But as the width of the vertical bars got thinner and the frequency increased over 4 MHz, they would become visible as black and white bars again.
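Those bar figures follow directly from the line timing. Assuming a ~52.6 µs active line and counting each bar as one cycle (as the post above does):

```python
ACTIVE_LINE_US = 52.6   # assumed visible time of one scanline, in microseconds

def bar_frequency_mhz(bars_per_line):
    # One bar per cycle of the luminance signal, swept out in one line:
    # MHz = cycles / microseconds.
    return bars_per_line / ACTIVE_LINE_US

for bars in (10, 100, 188):
    print(f"{bars:3d} bars -> {bar_frequency_mhz(bars):.2f} MHz")
```

This reproduces the ~0.2 MHz, ~1.9 MHz and ~3.58 MHz figures above, and shows why 188 bars is the magic number: that is the bar count whose luminance frequency lands right on the color subcarrier.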
Modern TVs use comb filters that use the information from previous and following lines to extract a much better luminance signal that preserves most detail even around 3.58 MHz. See this document for more details: www.intersil.com/content/dam/Intersil/documents/an96/an9644.pdf
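The comb trick works because NTSC's half-cycle-per-line subcarrier offset puts the chroma wiggle ~180° out of phase on adjacent lines, while the luma usually changes little from line to line. A minimal one-line-delay comb, sketched as toy Python (my own illustration, not taken from the linked app note):

```python
import numpy as np

def comb_split(line_a, line_b):
    """1-line comb filter sketch: sum adjacent lines to cancel the
    phase-inverted chroma, difference them to cancel the similar luma."""
    luma = (line_a + line_b) / 2.0
    chroma = (line_a - line_b) / 2.0
    return luma, chroma

# Demo: same 0.5 V gray on both lines, chroma inverted on the second
fs = 4 * 3.579545e6
t = np.arange(1024) / fs
wiggle = 0.07 * np.sin(2 * np.pi * 3.579545e6 * t)
luma, chroma = comb_split(0.5 + wiggle, 0.5 - wiggle)
```

Unlike the notch filter, this recovers luminance detail at 3.58 MHz as long as adjacent lines are similar; where they differ sharply (fine vertical detail), real comb filters still leak, which is why later sets added 2D/3D adaptive combs.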
So what are chroma dots? They are simply the actual chroma information, which has been modulated right in the middle of the luminance signal. B&W TVs that were manufactured after NTSC was standardized are meant to use the same notch filter that color TVs use, and simply discard the information at those frequencies.
This is a terrific explanation, I like the way you included the actual video levels in volts.
This also shows why it was so important, and used to be required, that broadcast studios would KILL the chroma carrier for B&W programs: it restored the full-resolution image, free of color signal modulation, for the B&W programs. It also allowed the color TVs to shut down the chroma circuitry and pass the unfiltered, full luminance signal, giving the full resolution of the original pre-color standard.
phirenz, your explanation is great except for the use of "modulate" to describe how the chroma carrier is combined with the luma. It is not modulated onto (multiplied by) the luma signal, but simply added to the luma signal.
THANK YOU SO MUCH. I've had trouble for a long time understanding how color is encoded into a composite signal. I knew it had something to do with phase-shifted frequencies, but not how it actually worked, and I didn't know what the color burst actually did or why the chroma dots actually happen. I couldn't really get a good understanding from Technology Connections' video, but you have explained it amazingly. I finally understand.
Now I understand the color moiré when a presenter on TV had checkered-pattern clothes.
the amount of work that went into giving us color tv while keeping B&W compatibility is mind blowing and I never gave it a second thought.
It's truly amazing! I don't understand EVERY single aspect of the technical explanation, but I do get 90%. I remember watching COLOR TV at my Great Aunt and Uncle's house (they were very well off) and being amazed at it! Watching the Wonderful World of Color (the original name of the Walt Disney show) and very early episodes of Bonanza, remembering my Great Uncle or Cousin getting down on the floor in front of the set to adjust the color, etc., and being spellbound at 4 years of age sticks with me. My parents got color the weekend after President Kennedy was shot, as our TV (B&W, 21 inch) had blown up with smoke and theatrics around 4 pm on the day he was shot. Dad put their bedroom B&W on top of the main set, then called a few repairmen who couldn't come, and left for some store that sold color sets, as they had planned on that anyway. Sadly he bought a Philco, which staggered along with countless repairs until the '70s! Our next TV was my 23-inch B&W my Aunt gave me about '74, and although it was a late-'50s stereo TV combo, it was a Zenith and worked until '82, having been relegated to my room about '77; I married and took it with me in '78. It was an amazing stereo too and worked until late '82. So there is my nostalgic look back!
Maybe without the B&W compatibility I could have watched colour TV growing up. Snooker on B&W is miserable.
My grandma bought us a used Admiral color set in 1962. Disney's World of Color and Mitch Miller! We were poor, too, but she used her deceased husband's veteran checks to get the set. It inspired my career in electronics. I became a TV repairman, and later an automation tech. Color TV sort of guided my career.
@@davidlogansr8007 You wanna know a secret... I work amongst, with, for and around technical design engineers, making sure all is in order and accounted for along any possible line..... we can truly consider ourselves professionals in our highly technical field (cables).... if we understand 90% of what we do, I call that a brilliant day of exceptional fortitude... usually it's 75% at best; the rest is pure luck, improvisation and an endless learning curve.
@@daishi5571 "The yellow is on the side cushion, and for those of you watching in black and white, it's just behind the blue."
I enjoy your presentations, the fact that I already know the material gives me a greater appreciation of how well you are able to simplify the subject matter and cram so much into a short video. I also know when you make a mistake, but I didn't detect any in this video. I was eleven years old in 1953 and the introduction of compatible color was a big deal to me, even though my family couldn't afford a color TV set. From 1953 to 1956 I saw many of the first compatible color telecasts by going down to local department stores, which usually had at least one color TV set on display. I would sit, or stand there for hours, just to see TV in color. The salesmen stopped trying to run me off when they discovered that I could answer technical questions for customers and I kept the hue and saturation controls adjusted as soon as faces started to drift to green or purple. Nobody could adjust those early color sets for natural looking flesh tones better than I could. I'm glad you talked about chroma dots, because I always knew when a show was being broadcast in color (even when I was at home), because I could see the chroma dot crawl on our old B&W TV screen as plain as day. I quickly learned to recognize the color of items on a B&W screen by the dot pattern, and I got so good at it that I could almost imagine I was seeing the picture in color on a B&W screen. I could do this, because in the early days of live color broadcasting, people on TV commented on the color of things in the scene, so people at home with B&W sets would know what they were missing. It may have been a marketing ploy, intended to spur the sale of color sets, but people on TV were always commenting on the color of clothing and items on the set during the early days of live color TV, especially on variety shows. Thanks to this feedback, it didn't take long for me to associate the primary colors with specific dot patterns. 
The color bars TV stations used to broadcast along with the test pattern in the early days also helped me associate colors with chroma dot patterns. The dot crawl was also unique to each color, so it was helpful in guessing color on a B&W TV. It became a game for me to announce the color of an object on a B&W screen out loud before anyone on the TV show said what the color was. My friends and family could never figure out how I did it, and I won a few bets with the trick before people became convinced that I could really do it. I remember one night when a bright red convertible sports car drove on to the set during a Perry Como color broadcast, and I knew instantly that the car was red from the chroma dots , before Frank Gallop (the announcer) mentioned that it was red. Red was the easiest color to spot on a B&W TV screen, because the chroma dot pattern really stood out and it had a crawl that almost seemed to flicker. I could spot red from across the room, but for other colors I had to be close to the screen. During the first color broadcast of the Original Amateur Hour, Ted Mack mentioned that his fountain pen leaked just before the show began and there wasn't time to change his jacket. Then, he added, that this was his first show in color, so the lucky people with a color set could see that the ink stain was blue, while the rest of us only saw a black spot. They did a close-up of the ink spot while he was talking, so it was easy to see. By the time Ted Mack mentioned that the stain was blue, I already knew it was blue from the distinctive chroma dot pattern. Anyway, keep up the good work, and I hope this ramble down chroma dot memory-lane wasn't a bore. I never thought I'd be talking about this for the first time after more than 60 years.
Nice use of tint during the explanation!
Agree. Such sweet little things help tremendously to make a commonly dry subject more interesting and easier to understand for most people. ^^
Those green and purple skin tones are a vivid memory of my childhood...
Bruh I thought it was the weed
Yes, but it was overdone. It went on for too long and got distracting.
@@MattMcIrvin In NZ, that was only true of SANYO colour TV's... the only ones with a tint control here, IIRC ... not needed for PAL-D, but they still had 'em :-)
Shops used to make their window display sets really blue so they looked Moar Colour!!1!!
:-)
Another fascinating video. You're quite right that we started B&W PAL broadcasts in the UK in 1967. In fact, if you ever watch early episodes of a series like Doctor Who, which began as 405-line B&W, and then shifted to 625-line B&W in 1969 (that's roughly when BBC One shifted to 625-line) you can really see the difference in quality.
With regard to making telerecordings of shows made on video tape, one of the main reasons for doing this was that the BBC were fairly active in selling their programs on to foreign markets quite early on. This was usually done on a hierarchical basis - rather than go to the expense of making dozens of multiple film copies, the BBC would simply make a handful. These would then be sent to the first foreign broadcaster who had the right to show them, that broadcaster would send them into the next, and so on. And because of the myriad of broadcasting standards that existed at the time (some markets would be on 405-line, some on 625. Some on PAL, some on Secam (the French system) and others on NTSC - and then add colour into the mix by the early 70s), then the simplest and most broadcast-standard friendly method was to use B&W film copies.
Of course, there are three problems with this method. Firstly, you can't guarantee that the last broadcaster which had the film copies actually sent them onto the next in the chain. Indeed, the BBC have managed to recover programmes long thought lost for good by tracing telerecordings that were either never sent on to the next broadcaster, or in fact returned to their commercial sales department, BBC Worldwide (formerly BBC Commercial Enterprises). Before the BBC established a proper archive, the broadcast arm would often make telerecordings of videotaped programmes for Commercial Enterprises, then wipe and reuse the tapes to make something else. Commercial Enterprises would sell the telerecordings to overseas broadcasters, and after a few years of a programme being available for sale, assumed its commercial viability had come to an end. No further sales would be made, and they would burn the telerecordings, as they were under the impression that the broadcast wing did indeed have a proper archive. Many thousands of hours of vintage television was lost in this way. Often, the only reason there are copies of vintage programmes available, or we only have B&W film copies of programmes originally shot in colour is because of film telerecordings.
The second problem, is that censorship standards would differ from country to country. And what might be acceptable to UK audiences might not be for Australian or New Zealand censors, who would often remove offending material by simply cutting the offending sequence from the master telerecording, rather than go to the trouble of making their own copy. And when this was sent into the next broadcaster, that sequence was missing. Sometimes, these trimmed sequences have been recovered and are the only surviving example of the programme at all.
The third problem with the telerecording method is that if the programme was originally shot on video, the telerecording loses the smooth look of video; instead of being made up of 50 fields per second, copying that programme onto celluloid blends those fields into 25 frames per second (if the camera was modified to run at that speed, or 24 frames if not). And that blending of fields into frames can lead to visible errors in vision mixing between different cameras from the original multi-camera shoot (almost all videotaped programmes in the UK were shot multi-camera, with a vision mixer switching shots between different cameras). Fortunately, this was rectified when one particular fan of vintage television noticed that a repeat broadcast of a programme that only existed as a telerecording, which he had taped on his domestic VCR, reverted to its naturally smooth motion on screen as he fast-forwarded through it. Realising that the VCR was in effect "blending" the individual frames back into fields, he contacted some friends who worked in broadcasting, but who also specialised in restoration of old TV programmes. Armed with this information, they were able to devise a process where old telerecordings of programmes shot on video can be made to look like video again.
With regard to how long it took to establish PAL as standard due to the challenging topography of Europe, in fact the BBC was still broadcasting B&W 425 line television to some parts of the UK as recently as the 1980s, due to those places being unable to get a good UHF signal for 625-line PAL - but they could manage a VHF signal for 425 line. Since the introduction of satellite and digital terrestrial broadcasting, this is less of an issue. But (for example) at my house, I used to receive a fairly poor analogue terrestrial TV signal, as I don't have direct line of sight to a transmitter, and must rely on the signal bouncing back down to me from the ionosphere. When they switched to digital terrestrial broadcasting (using the DVB standard), the picture quality improved massively. Then the strength of that signal was reduced, and I can no longer receive digital terrestrial signals. I can get digital satellite though, so all my television comes via that.
With regard to showing movies on PAL systems and then running them at a slightly higher speed - yes, this is true. In fact, when James Cameron's Titanic was broadcast on BBC One some years ago, there were complaints from some members of the public who assumed that the BBC had cut out material, due to the slightly shorter running time. In fact, the film had been shown unedited - but due to the extreme length, the slightly higher speed of 25 fps meant a noticeable difference (at least, for petty-minded fans of epic disaster movies). The slightly higher running speed of movies also means a slightly higher audio pitch, which most PAL broadcasters compensate for by lowering the pitch by the same amount.
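The arithmetic behind both complaints is small. Using a hypothetical three-hour film (not Titanic's actual runtime):

```python
import math

speedup = 25 / 24                      # PAL plays 24 fps film ~4.17% fast
runtime_24 = 180.0                     # hypothetical 3-hour film, in minutes
runtime_25 = runtime_24 / speedup      # 172.8 min: ~7 minutes "missing"
pitch = 12 * math.log2(speedup)        # ~0.71 of a semitone sharper

print(round(runtime_25, 1), round(pitch, 2))   # -> 172.8 0.71
```

So a long film visibly loses several minutes of runtime, and the audio rises by about two-thirds of a semitone, which is why broadcasters apply the compensating pitch shift mentioned above.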
With regard to the colour-recovery method you described - the software was actually written in a modern Windows-version of BBC Basic - a version of Basic that dates back to the Acorn Computer's BBC computer series of the early 1980s, introduced with the BBC's Computer Literacy Programme - a scheme they introduced to make computers available in schools and used to teach children. Almost all UK schools had at least one BBC Model B Microcomputer at the time, and the version of BASIC included on BBC Computers was a particularly good one.
There is another method of colour recovery that was used by that same team of vintage programme restoration specialists I mentioned earlier. They were all fans of Doctor Who. Many of the early colour episodes of Doctor Who had been sold abroad as B&W telerecordings to those countries that didn't have colour in the early 70s; but sales to the United States and Canada tended to be colour, as that was obviously the preferred format for North American broadcasters. And then, in typical BBC fashion, the original PAL master tapes were often wiped. When this occurred, the BBC would (in later years) obtain back their NTSC-conversion masters, and reconvert those for PAL. However, one particular Doctor Who story only existed as relatively high-quality B&W telerecordings, and a domestic recording of a North American transmission on a home format, which wasn't suitable for a proper VHS release, let alone a repeat broadcast on BBC TV. So, in order to make a good quality colour version, the restoration team took the telerecordings for their picture quality, and matched that with the colour signal from the NTSC home recording - combining the two to get, in effect, a colour telerecording. This was the mid-nineties, so the differences in screen geometry were easily corrected by bending the overlaid colour signal at the edges of the screen to match the B&W telerecording. As time has gone by, the same team have built a rather sophisticated reverse standards conversion machine, which has vastly improved the quality of programmes that were shot on PAL, converted to NTSC, then converted back to PAL years later. These used to be pretty terrible, but now are almost indistinguishable from other PAL programmes of the time. And, by incorporating the frames-to-fields conversion, they have managed to make some very high quality restorations of vintage programmes that would otherwise be in a very poor state.
Combine that with the standard repairs for hairs in the gate, scratches, speckle, etc., and the Restoration Team (as they are known) often put as much work into a recovered programme as you'd get with a major Hollywood studio rescuing a movie shot on nitrate stock.
ZygmaExperiment There is no such thing as PAL 625 B&W, since PAL is just a colour coding system. You are referring to CCIR 625 (which is B&W) and in use in countries in Europe since 1950, long before PAL or NTSC.
Yes, that's true enough. But it isn't a technical essay. It's a You Tube comment.
The Claws Of Axos has had two releases: the first used the NTSC masters reverse-standards-converted (which turns out to be more difficult with the early conversions due to the crude set-up of the original method), and the re-released version used the B&W film copies to provide the luminance and the NTSC masters' chrominance to get a better overall picture.
Inferno also had the same re-release.
The original UK system was 405 lines (not 425 lines). I know I'm being pedantic about your very fine and informative post.
Greetings!
I've been looking through past comments (I do read them all, you know) and I've seen a common suggestion for more info graphics and less talking head. I really do appreciate this sort of feedback and I'm doing my best to address it. However, for this video, there's not a lot to show since it is really more of a string of factoids.
I decided not to go into SECAM for this video--I just delved a little into the PAL vs. NTSC fight we seem to still be going on about even though analog television broadcasts aren't happening anymore...
Thanks for watching, everyone!
Technology Connections I do like me some info graphics, but you’ve always struck a good balance in my opinion.
I REALLY liked your practical examples when you were showing the result of moving or disabling the yoke with the CRT tube, way more than an info graphic as well.
I agree with soupisgdfood. One of my favorite TV series of all time is The Secret Life of Machines (which explained the inner workings of common household and office appliances, in case you've never seen it), and a big part of that is how it perfectly balanced exposition, informational graphics, and practical demonstrations. Your channel is the closest I've seen to replicating that formula.
I really dislike infographics - I have to keep my eyes on the video, rather than just my ears.
Technology Connections
Hi, what about calls for a multi-part deep dive into Teletext/Ceefax/etc.? This is the digital interactive television technology we invented in the '70s.
If I make the request many time, does that count?
You could have addressed one shortcoming of PAL that is often forgotten: high-frequency image components could "bleed" into the part reserved for the sound. The effect was horrible. When there was text inserted in an image, for instance a sports table with results, the TV set would make a loud noise that would overwhelm the narrator's voice. SECAM would not do that. (I grew up in the border region between France and Germany, so we had multi-standard TV sets. My parents bought a Philips X26K221 in 1972; it could display German PAL, French SECAM and even the very odd and uniquely French high-res B&W 819-line standard.)
You missed the SECAM French colour TV system. Some explanation would be interesting, because this system was used in the old USSR, Egypt, France and other countries in the French cultural sphere. When colour TV was going to be implemented in Colombia, we had representatives from the three standards, NTSC, PAL and SECAM. Each one gave very interesting demonstrations, all arguing that their system was the best. Finally our Government decided on NTSC, which had many detractors, but it was the best system for us: all the studios and all TV sets followed American standards, and for a country so near the US, also broadcasting many shows from the States, it was the obvious choice. All this happened in 1978, and the first official colour TV broadcast was on December 1, 1979 at 6:30 pm; President Turbay started with a speech, then an American movie.
SECAM was used not only in the USSR but also in most (if not all) countries of the Eastern Bloc. I remember the time in the nineties when our television switched to PAL and old color TVs had to be "retuned" to still accept color. Although it was called "retuning", I think they just replaced decoder boards, at least for the most popular TV models.
Brazil adopted a modified version of PAL in 1973, which was unique to the country. It combines the NTSC 525-line 30 frames-per-second System M with the PAL color encoding system. The "PAL-M" system was compatible with monochrome NTSC and not compatible at all with European PAL. So people had to use decoders for their TVs or VCRs to watch American tapes. Argentina, Paraguay and Uruguay adopted another variation, "PAL-N". Other South American countries went with NTSC.
@@pd209458 We had to call our tv guy to come by and insert this pal board into our soviet tv )
@@pd209458 I get that people were glad to be rid of communism, but did everything have to go to the extent of getting rid of the television standard!? Was the standard that lousy, or was it more important to harmonize with Western Europe? I mean, surely they must have seen that digital television was coming down the track, so why not retain SECAM until then? I don't know if your country had indigenous television manufacturing; if so, I guess it was good for business, and I suppose most people change their TV about once every 10 years anyway, but I'm guessing that your average Joe Soap / John Doe / Eastern Bloc equivalent must have felt jerked around at being put through two such big changes in a decade and a half or so.
@@barryholt9564 I think quality-wise SECAM was comparable with PAL. There were some advantages and disadvantages for both as it usually is for such standards.
I was 9 when the change happened so obviously I had no personal opinion back then, but looking retrospectively I think the main issue was compatibility with imported TV sets, VCRs, home computers and game consoles. We had domestic TV production and some Video Cassette Players on a license from GoldStar but none of the latter. And, sadly, at that time people didn't care much about domestic industry.
I liked old tv tint control as a kid. Fun to decalibrate for funky colors.
A few assorted remarks ..
1. The terms PAL and NTSC do not describe frame rate or line count. They are colour systems only, and any of the two (or three, with SECAM) can theoretically be used with any frame rate and scan line count. For historical reasons, NTSC is mostly used with 525/30, and PAL is mostly used with 625/25, but that doesn't mean it is mandatory, or that the terms define resolution. There is one country that uses/used PAL with 525/30, namely Brazil. PAL colour on top of a 525/30 signal was also used as a hack for bridging incompatibilities in multistandard VCRs, and apparently a reverse hack, i.e. NTSC colour on top of a 625/25 signal, also existed.
The 625/25 family of signals has been around in Europe since 1948 and was broadcast in black and white for nearly 20 years. At that stage the term "PAL" did not exist, and colour television was far beyond the horizon. The fact that the introduction of PAL colour did not require a slight shift of the frame rate, as in NTSC, was just luck because the maths turned out differently; it was not planned for.
The UK is an exception in this regard, because as opposed to continental Europe they opted to retain their pre-WW2 405-line standard after 1945; they could have introduced colour television (with any of SECAM, PAL or NTSC) based on that standard; there was no necessity to switch to 625 for colour. (And that was seriously considered; they did tests with all three colour systems on 405.) The UK switched to 625 eventually mainly in order to alleviate incompatibility issues with the rest of Europe. So with the 1964 introduction of 625 lines, the UK was a latecomer. So if you based your research mainly on UK sources, you might be getting a slightly distorted view of the events. In continental Europe, 625/25 in black & white had been the standard for nearly 15 years by then.
2. I think the major factor why Europe introduced colour tv nearly 15 years after the US was economics. Europe was still building up from the war and didn't have the resources to invest into such luxuries, while in the US consumerism was already in full swing.
3. I'm in Germany, in the 1970s and 80s in some areas you could receive terrestrial television in NTSC from AFN, the station serving the US military. You needed a multi standard set which were rare and expensive then, so not many people did. From my memory, the hue problems of NTSC were quite apparent and we had to use the tint dial quite a lot to correct those green and purple faces :).
Thanks for your comment about the UK. I wasn't understanding all that stuff about incompatibility because we had an old black & white set that we used for a few weeks (maybe months?) in the early '90s. This was after the presumably early '80s, possibly '70s color TV broke. I grew up in the Netherlands.
France and the UK had old, "exotic" systems, so they had to choose an incompatible new one. Other European countries used the B/W 625/50 system, so they switched to 625/50 PAL or SECAM and all B/W sets kept working.
Xaver Lustig True about the frame rate and line count. However, I believe at the time they were working within the technological constraints of the era. Everything had to match the AC frequency while still allowing enough analogue bandwidth to transmit all the relevant image, signal and audio data. The bandwidth is pretty irrelevant today with transmission of TV via a digital signal: much more data for less bandwidth. The digital era has also made the NTSC, PAL and SECAM transmission standards obsolete... so all this debate is really moot.. Lol.
+Brett Ison Yes, it's mostly obsolete now, except if you still play analogue tapes or insist on only buying new equipment if it has analogue out so you can run your old TVs with it (as I insist on doing :). The irony is that even the cheapest modern flat screen televisions come with a super multi-standard tuner for old analogue signals, even though that is hardly needed today. 30 or 40 years ago it would have been a dream.
Xaver Lustig Oh I bet! Then you'd really be king when it came to importing tapes of US shows!
Cannot believe how on point this guy's production quality and editing is, the way he emulates the NTSC colour shifting.
Even though PAL is 10 Hz slower, I am now very thankful I grew up in a PAL region.
Strictly speaking, "PAL" is just the colour encoding and can be used with other line/refresh rates, it just didn't happen much in practice.
So you *could* have it both ways: Brazil used PAL at 525-line/30-fps (i.e. otherwise the same as NTSC) for "PAL-M", though I don't think anyone else did(?)
@NotATube Nobody else did and I could argue that our color system was the best one hahahaha. We just did very late, 1972.
The thing with Gonzales Camarena, and so many people mentioning him in the comments, is that in Mexican elementary schools we are taught that HE invented color TV. Even one of the first Mexican TV channels, Channel 5, made his name part of the name code, XHGC.
But he really made contributions to color television.
That "name code" is actually known as a call sign.
@@RodolfoAmbriz He definitely did. But it's incorrect to call him "THE inventor of the color television" instead of "ONE of the contributors who helped make color television a reality". Also, considering that his inventions were based on the experiments Logie did 20 years BEFORE Camarena, the value of his contribution is not as high as Logie's, and therefore it's totally incorrect to call him the inventor of color television.
Brits always get told Baird invented television, which he did, but the Baird system was extremely low-resolution and impractical. The later US-invented electron gun system was the first TV people would actually want to watch.
I wonder if Aussies get told their country invented aeroplanes? There was a claim somebody flew in 1900, I recall.
And the actual system that would become NTSC (and later PAL) was originally conceived by a French guy in 1938.
A few points: the NTSC (National Television System Committee) in 1951 demonstrated PAF (Phase Alternate Field) and PAL (Phase Alternate Line). Unfortunately, the 64 µs glass delay line needed to facilitate the line averaging that cancels the hue error in PAL had yet to be invented. It was not available until about 1960, which led to the PAL standard.
The BBC was planning to adopt NTSC as late as 1966 but at the last moment switched to PAL.
The BBC launched 625 lines in 1964, which would eventually supersede the 405 line system dating back to 1936. The 8 MHz channel adopted in the UK in 1964, as opposed to the North American 6 MHz channel adopted in July 1936, is the main reason US television resolution appears somewhat lower. PAL, because of the line averaging, reduces vertical colour resolution, and the alternating phase encoding didn't allow for as efficient chroma-luma interleaving, which meant that fine detail in the PAL picture was more prone to 'crosscolor', a flickering rainbow effect. This was seen on presenters' shirts and ties and was fun to watch in PAL. This and other problems with PAL led it to be referred to as "Problems Are Lurking".
Alas, if the BBC had pursued 625 line NTSC, it would have provided the best of both worlds: superior resolution and color.
In 1980s Sydney, Clive Robertson, a late night news presenter amongst other things, used to specialise in wearing ties that he *knew* would wreak havoc with the video signal - he took pleasure in giving the tech directors a hard time :-) . For this and other reasons, his news show was fun to watch.
As a German I say, NTSC in the '50s was a great performance by American engineers! The heart of NTSC is quadrature amplitude modulation of the two colour signals; you need multipliers, free-running oscillators and so on. These days that's no problem, but in the '50s there were only very poor and expensive vacuum tubes.
In an NTSC or PAL signal (also called composite video) the colour signal, based on a subcarrier, is modulated in with the luminance signal. It is very difficult to separate the composite signal into Y (luminance) and C (colour) without aliasing. Simple filtering using passive electronic components like capacitors and coils will give the aliasing (the crawling dots) on a black-and-white TV. Broadcast signals were internally based on component video, thus no crawling dots in tape editing. Component is Y, R-Y and B-Y signals, making the video processing much easier. Yes, it requires three cables and three identical amplifier circuits to transmit a component video signal. However, because of black-and-white TV the colours have to be transmitted as luminance and chrominance mixed together - the composite signal. The subcarrier would then be used to set the hue and decoding correctly on colour TVs. In broadcasting, the tape recorders or synchronizers used a comb filter to split the received NTSC/PAL into luminance and chrominance for later editing. The comb filter came in the early '90s and was very expensive, and it did not make sense to implement a comb filter on monochrome TVs. Most people will not know about the crawling dots, mainly on cyan colours, and they were accepted as a trade-off for having backward compatibility with B/W TV when NTSC/PAL made it possible to transmit colour in a single, relatively low-rate bandwidth.
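The comb filter idea mentioned above can be sketched in a few lines of code. This is only an illustration (not a broadcast-accurate model, and all sample values are made up): in NTSC the subcarrier is an odd multiple of half the line rate, so its phase flips 180° between successive scan lines; adding two adjacent lines then cancels the chroma (leaving luma), and subtracting cancels the luma (leaving chroma).

```python
# Toy one-line-delay comb filter. Line contents are synthetic.
import math

SUBCARRIER_CYCLES_PER_LINE = 227.5  # NTSC: odd multiple of half the line rate

def make_line(luma, saturation, hue_rad, n_samples=16, line_number=0):
    """Synthesize one scan line of composite video: luma + modulated chroma."""
    samples = []
    for i in range(n_samples):
        phase = 2 * math.pi * SUBCARRIER_CYCLES_PER_LINE * i / n_samples
        # The half-cycle left over per line flips chroma phase on the next line.
        phase += math.pi * line_number
        samples.append(luma + saturation * math.sin(phase + hue_rad))
    return samples

line0 = make_line(0.6, 0.1, 0.8, line_number=0)
line1 = make_line(0.6, 0.1, 0.8, line_number=1)

luma   = [(a + b) / 2 for a, b in zip(line0, line1)]  # chroma cancels
chroma = [(a - b) / 2 for a, b in zip(line0, line1)]  # luma cancels
```

A real comb filter trades vertical colour resolution for this clean separation, which is why the simpler (and cheaper) notch filter dominated for so long.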
A more informed analysis than many, well done. Two points. PAL did not get everything right, e.g. Hanover Bars, which they fixed by repeating the previous line's colour info. PAL was also much more prone to chroma noise on long distance, high power transmissions. This, together with the Hanover Bars fix, was built into SECAM, which was a much better system than PAL. Of course there is always a downside. Post production in NTSC is easiest and SECAM is hardest. As a result most SECAM production was made in PAL and transcoded for transmission.
The one error in your piece is where you talk about tapes being wiped and B/W films being made. A telecine is a device which scans a frame of film and converts the image and the accompanying sound to a video and audio signal for transmission or recording on VT. Telecines do not record anything. The devices used to make these film copies (mainly used for export sales to developing markets) were called telerecorders and made telerecordings. These were basically film cameras with an optical sound head pointed at a TV monitor. So you should correct that. Telerecordings were very useful in the early days of television when there was no real standards conversion, and with careful line-up it was possible to use a telerecorder to overcome this. Subsequently analog and then digital converters recomposed the pictures with vastly superior results. Oh, one last thing: you confused the colour standard with the CCIR broadcasting standard. The line/field frequency is the result of the broadcast system, not the colour standard. So in the USA, M stands for 525/60, the black level, the peak white standard and the placement of sync data in the flyback area. Japan has NTSC-J, which has 525/60 but differs in other areas, including frequencies. PAL is often 625/50, but in South America PAL-M used PAL colour with the US line/field rate, and Argentina had PAL-N, which had a further mixture of the two.
So we all have one HD standard of 1080i/720p now, right? Nope, we have ATSC, DVB-T, DTMB and ISDB (2 flavors), as well as the line/field rates, the compression codecs, the frequencies, mux, etc. etc. And 4K and 8K drift further apart, still lacking full standards. Look forward to you doing one on UHD. Thanks
about 25vs30: this has nothing to do with quality, but all to do with mains-hum.
Europe has 50Hz/230V mains and PAL has a field-rate of 50Hz to keep the mains-hum outside the visible portion of the signal.
America has 60Hz/110V mains and NTSC has a field-rate that's close enough to do the same thing.
NTSC games ran better too
That’s 60Hz/120V for North America, not 110V. (It hasn’t been 110 for a long, long time.)
"Mains-hum out of the visible portion of the signal" isn't absolutely correct. Instead, using the same field rate as the mains frequency ensures that if interference distorts the picture in a TV set, the distortion is stationary and doesn't run through the picture. Stationary distortions are much more bearable than moving distortions.
To make this work, the field rate was synchronized with the mains frequency. The TV set would hang on the same electricity grid as the TV station. If you received a signal from abroad, from another country whose electricity grid wasn't connected to yours, it wouldn't work.
By 1953, when color TV was introduced, TV manufacturing was already advanced enough that interference was no longer an issue. So the developers of color TV could create a system where the field rate and mains frequency were no longer identical and the field rate was no longer synchronized with the grid. This created problems in older b&w sets that suffered from interference. So color TV was not 100% downward compatible.
@@tookitogo it was when standard was defined, so more accurate correction would be *had
MrBlc Duh, hence why i said “it hasn’t been 110V for a long, long time”, so clearly I know it used to be the standard. But in that case, the OP would have needed to say that it was 50Hz/220V, since that nominal voltage has since changed, too. My point was simply to remind people that it’s 120V now, and has been for the better part of a century. It’s illogically irritating to me that people still call it 110V so long after that ceased to be correct.
The U.K. switch to colour was also a switch of resolution. The original 405 line system was extremely early electronic TV and, a bit like NTSC, had made design choices that were less than optimal in hindsight. 405 Line (Marconi) tv was also tested with NTSC colour, but it was never pursued as the outcome wasn’t very good and better technology was around the corner.
Meanwhile, in France they had adopted 819 line B&W television (HD in the 50s!), which was tested with SECAM colour and proved to be far too high bandwidth to be practical with multichannel television.
Several European countries had already standardised on 625 line black & white, and that was the EBU (European Broadcasting Union) preferred format, so the oddball and older standards like 819 and 405 were dropped in favour of 625 by the mid 1950s.
Colour TV in Europe was then only added to ‘modern’ 625 line systems.
So there were plenty of TV stations that began as 625 line services, like RTE in Ireland and broadcast in b&w for some time.
Also B&W TV sets remained cheaper, so even after the launch of colour, they remained a budget option into the 1970s.
British broadcasters also used only UHF for 625/PAL. So the previous Marconi 405 line TV was carried on VHF only. That was not the case elsewhere, e.g. PAL-I was broadcast on VHF in Ireland, and colour TV on VHF was quite normal in most of Europe, Australia, etc. for many years.
Interesting to know that the French tested colour with the 819 line system. Indeed, it was HD in the 1950s. Also, due to the high bandwidth (14 MHz channels!), in different areas they had to use both audio ~11 MHz above the video carrier and audio ~11 MHz BELOW the video carrier in what was otherwise the same frequency allocation, to get extra "channels".
In (I think) Belgium, they had a compromise system that squeezed 819 line signals into 7MHz channels.
Boy, you bring back my memories of color TV. I used to replace a lot of color picture tubes back in the day!!! The old round ones, the triad or triangle gun, the inline 3-gun, and the Sony one, which was the easiest to align. I could tell you stories. Sonys were great color TVs, but the tuners were a pain in the neck, especially the UHF tuners. I had to rebuild many a tuner in my time. Electronic tuners fixed that issue. I still have 3 working NTSC TVs in my house, two Sonys and 1 Zenith, and the one Sony is about 40 years old and still going strong!!!! And yes, I was a TV repairman!!!
I first saw a colour TV in a department store in the UK just before Christmas 1966. By late 1966 BBC2 was regularly broadcasting in PAL colour, though at the time the channel only showed 'trade test' films in colour. These allowed the retail trade to become familiar with the technology and enabled consumers to see colour TVs and to buy them before the official launch of colour on 1st July 1967. By spring 1967 several of the scheduled BBC2 programmes were also being broadcast in colour on an unofficial basis well ahead of the July launch date. But yes, it was indeed coverage of Wimbledon that marked the first official scheduled broadcast in PAL.
NTSC = Never Twice the Same Colour, PAL = Peace At Last, and SECAM = System Essentially Contrary to the American Method.
The funny thing is I've got an effect I've been working on for usage in "retro style" games that, to a reasonable extent, sort of simulates all this. In fact it literally takes an RGB image, converts to YIQ space, and then it produces a pure black-and-white signal by taking Y and modulating it with I/Q using sine and cosine carriers - then later uses those same sine/cosine carriers to reconstruct the YIQ from that pure black and white signal and convert back to RGB.
And yes, if you view that signal output in pure black and white (as I've done for debugging purposes), it looks *precisely* like that weird dotted black and white pattern you refer to, although in mine it also tends to manifest a diagonal "stripe" pattern as well (due to some other quirks of my implementation).
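A minimal sketch of that kind of quadrature round-trip is below. To be clear, this is not the commenter's actual effect code, just an illustration of the principle they describe: I and Q ride on cosine/sine carriers added to Y, and the decoder multiplies by the same carriers and averages over whole carrier cycles (a crude low-pass) to get them back. The carrier frequency and pixel values are arbitrary.

```python
# Toy quadrature modulation round-trip for a single pixel's Y/I/Q.
import math

def encode(y, i, q, t):
    """Composite 'black and white' sample at time t (t in carrier cycles)."""
    w = 2 * math.pi * t
    return y + i * math.cos(w) + q * math.sin(w)

def decode(samples_per_cycle, cycles, y, i, q):
    """Demodulate by product detection plus averaging over whole cycles."""
    n = samples_per_cycle * cycles
    acc_y = acc_i = acc_q = 0.0
    for k in range(n):
        t = k / samples_per_cycle
        s = encode(y, i, q, t)
        w = 2 * math.pi * t
        acc_y += s                    # carriers average to zero
        acc_i += s * math.cos(w)      # cos*cos averages to 1/2
        acc_q += s * math.sin(w)      # sin*sin averages to 1/2
    return acc_y / n, 2 * acc_i / n, 2 * acc_q / n

y2, i2, q2 = decode(samples_per_cycle=100, cycles=4, y=0.5, i=0.2, q=-0.1)
```

Over an exact number of carrier cycles the cross terms cancel, so the original Y, I and Q come back; a real decoder replaces the averaging with proper low-pass filtering.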
PLEASE tell me this is done and where I can get it
You do a remarkably good job of explaining a difficult subject. I was in the TV industry for 40 years and am still learning things about the NTSC (and competing) color system, including the difficult math. Luckily I only had to make it work, not derive the equations!
BTW, while I agree that the technical stuff would benefit from more illustrations, yours is one of the better "talking heads," IMHO.
So much information in my head I want to tell you but so little time to do it…
The reason why PAL was not adopted everywhere (else) is that it isn’t the superior system and it is a lot more expensive!
The idea of PAL is that the color information is inverted every second line. Any error will e.g. show up positive in the odd lines and negative in the even lines. The average error should be zero - in theory!
By mixing the color information of two lines and taking the average, you get half the color resolution but eliminate the broadcast errors caused by interference while the signal is traveling to your aerial.
In reality, the saturation of the colors degrade with the error. So with PAL, a phase shift will still give you correct colors but they are less intense. It just makes the problem less noticeable, not vanish.
Another problem is the costs. How can you mix the color information of a line with the information of a previous line? The previous line is a matter of the past, long gone!
Digital video signal storage was impossible in the early PAL colour TV era. And when it became possible, the circuit board in a studio broadcast machine which does it digitally was sold for over $4000 (mid 80s). Even today those boards are sold for more than $300 (used)!
So a PAL TV contained an ultrasonic delay line. An ultrasound transducer injects the colour signal into a crystal, and it is then picked up by an ultrasound receiver 63.942 µs later. This method was replaced by sophisticated chips in the late 1980s but was still around until 1995. This crystal is a real jewel, so while NTSC was nicknamed "Never The Same Color", PAL was known as "Pay Additional Luxury".
The problem is that if the video runs a bit faster or slower, the delayed color video line can’t match with the current one. Also there are manufacturing tolerances as well so it is impossible to get a good match. This also decreases the saturation of the result. But there is a simple remedy, crank up the saturation. So while the picture still looks good, the color resolution is decreased.
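The averaging trick, and the saturation loss it causes, can be sketched with complex phasors (all values here are made up for illustration): chroma is a vector whose angle is the hue and whose length is the saturation; a transmission phase error hits the alternated lines with opposite sign, so averaging the two received vectors restores the hue but shrinks the length by the cosine of the error.

```python
# PAL-style two-line averaging of a chroma phasor with a phase error.
import cmath, math

hue = math.radians(120)        # illustrative colour
saturation = 1.0
error = math.radians(20)       # illustrative transmission phase error

line_a = cmath.rect(saturation, hue + error)   # error appears as +e ...
line_b = cmath.rect(saturation, hue - error)   # ... and as -e on the next line
avg = (line_a + line_b) / 2

recovered_hue = cmath.phase(avg)               # back to the correct hue
recovered_sat = abs(avg)                       # saturation * cos(error)
```

This is exactly the "correct colours but less intense" behaviour described above: the hue error vanishes, but the saturation is reduced by cos(error).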
You lose half of the colour information in the PAL system, and then you lose more information due to tolerances. But this is still OK, since the colour resolution of the human eye is only about 1/3 of its brightness resolution. So PAL, which is actually really bad for the colours, is still OK for the human eye.
SECAM tries to fix this by sending just one of the two colour signals on the carrier, alternating each line. You still need the delay line, which never really matches up, but as a result you get no losses in saturation and you only use half of the colour information. Since the human eye can't notice that the colours won't quite match the brightness pattern, it also works well for the human eye - just with more accurate colours than PAL.
Since the late 1980s, NTSC became superior to all the other formats! They just add a line of test patterns above the visible part of the screen. The micro-computerized TV knows how this line has to look and can detect all errors, then drive simple filter circuits to compensate for those errors. So modern NTSC TVs have much more colour information and can eliminate errors just as well as (or better than) classic PAL or SECAM.
PAL+ also does the same nowadays, so NTSC and PAL break even here. SECAM, on the other hand, misses half of the colour resolution and there is no way to restore it. So the formerly superior SECAM is now the worst system, while PAL and NTSC share the same higher quality.
Fun fact: the video encryption system "Nagravision" also scrambles the PAL+ line. When PCs became powerful enough (>400 MHz), by identifying where the PAL+ line was they could rule out 90% of all the different ways the picture could be scrambled. By simple trial and error they could figure out how the picture was scrambled and decode most European pay-TV channels in real time. It didn't take long until all EU pay-TV stations went all-digital.
Both formats are good
Here in Sweden, one of our public TV stations at the time (and one of the only two that was allowed to air by our government) did a pretty well known April Fools joke when color TV was introduced. They did an informative broadcast that told people watching that they could convert their black and white television set to a color one if they stretched a pantyhose over the TV. Yes, people actually fell for it.
And here is why the Delta Shadow Mask is not used for “normal living room” TVs:
While the Delta mask gives you a good resolution and the electron guns are nicely packed tight in the CRT neck, it also swallows way over 80% of the beams. This is why a mid-sized monochrome TV can work with an acceleration voltage of 8kV or less, a color TV the same size needs at least 21kV. You need much more energy inside the beam to have an adequate brightness of the picture.
A Trinitron will not only allow a maximum of resolution, it eats only about 40% of the beam energy (1/3 in theory), you get more than twice the brightness out of your beams. But there are two major problems:
1) the shadow mask is made out of thin wires which have to be really taut, which adds a tremendous force to the screen. The glass of the screen has to be a lot thicker to handle the forces. Just put a 14” Trinitron and a 17” delta monitor on a scale; the smaller Trinitron is a lot heavier due to all its extra glass.
2) The electron guns need to sit in a line next to each other, making the neck of the tube a lot wider, which in turn makes the deflection coils a lot larger and requires a lot more power for deflection. Distance is your enemy when working with magnetic fields; the field weakens with the cube of distance! And since the gun beams have different distances to the deflection coils, you need an extra-homogeneous field, which in turn requires even larger coils to create a "Helmholtz pair".
A compromise is the slotted mask. Like the trinitron, it eats up less beam power (somewhat more than 50% if I recall correctly) but doesn’t add forces to the screen like the trinitron.
So living room sized TVs are much cheaper and brighter using a slotted mask while small TVs still can use a delta mask for better colors. Computer monitors have to have a delta tube to be able to display an adequate resolution - or need to be trinitron.
This type of mask was the one in the early TV sets. It was called the delta tube. The later one was in-line. The electron guns in the delta tube were arranged like a triangle, in the second type in a line. The in-line tubes had much better brightness, and the convergence of the colors was much better.
I know that YouTube is great for research but I literally have never come across a YouTube playlist with this much information and depth into any subject. Even up to practical illustrations. My goodness. I was looking for a 10 min history on the invention of TV and now I feel like I have a degree in Television. Thank you so much for this content. This can actually be a publishable book.
3:13 Yes, all of you from PAL regions have actually been watching our movies in the wrong pitch. If you've searched for clips from your favorite movies and heard them with a lower pitch, you're actually hearing them in the correct pitch.
This explained a lot, never knew it affected so much
You could probably narrow that down to just UK, since every country had their own channels and movies were usually dubbed into local language. At most it would have been music or sound effects that were affected by the pitch shift.
I think today they use an electronic pitch shifter to fix this, shifting the frequencies down a couple of percent. With such a small change, it's unlikely to generate unwanted noise artefacts.
Plus the rest of the English speaking world, the Netherlands and much of Scandinavia.
Mandolinic that solves the pitch but not the runtime problem. If you synchronize an NTSC and PAL version of a movie, I think you'll find that they will still fall out of sync when you leave them running.
Years ago digging about in my Nan's loft I found a wheel with red & green acetate panels in it attached to an ancient electric motor. I asked my Nan what it was & she told me it was my Grandad's experiment trying to turn black & white TV into colour. This suddenly makes sense!
He'd died by this time so he wasn't around to ask but he used to build his own radios & I can remember him making microphones with me as a kid from scratch.
He must've been a cool dude back in the day! Thank you for explaining this 2 colour method. That wheel diagram was exactly what I found in the loft that day!
Regarding chroma dots (and I hope I don't repeat anything you've already said, or bore anyone needlessly)... the dots you're seeing when viewing a colour signal on a high resolution B&W monitor is the chroma signal itself.
...I was going to go into the details of how the chroma signal is generated, but the comment got incredibly long and probably unintelligible. Suffice to say, the chroma signal is made up of a modulated sine wave. Such a signal has peaks and valleys in magnitude when you look at it up close. The overall height of these peaks and valleys (when you 'zoom out' and look at the amplitude of the signal) represents the saturation of the signal. If a B&W television doesn't filter out the chroma signal, it will draw the tiny peaks and valleys as dots on the screen. Highly saturated colours have a stronger carrier amplitude, and have a higher contrast between the peaks and valleys, making them more noticeable. The effect is magnified by gamma correction, because the peaks of the sine wave get stretched, making the dots even more visible.
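The two visibility effects described above (stronger saturation and gamma both making the dots more obvious) can be shown with a toy calculation. This is purely illustrative, with made-up luma/saturation values: a B&W set draws the composite signal directly, so pixels alternate above and below the luma level, and the CRT's power-law response stretches the bright peaks more than the dark valleys.

```python
# How visible are the chroma dots? Peak-to-valley brightness on a B&W CRT.
import math

def dot_contrast(luma, saturation, gamma=2.2, n=64):
    """Brightness swing after the CRT's power-law (gamma) response."""
    bright = []
    for k in range(n):
        s = luma + saturation * math.sin(2 * math.pi * k / n)
        s = min(max(s, 0.0), 1.0)      # clip to the valid signal range
        bright.append(s ** gamma)      # CRT transfer function
    return max(bright) - min(bright)

low  = dot_contrast(0.5, 0.05)   # weakly saturated colour: faint dots
high = dot_contrast(0.5, 0.30)   # strongly saturated colour: obvious dots
```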
I just found your channel a few days ago and am absolutely LOVING your content! The way you incorporate video effects to co-narrate each point really illustrates each point very well, and your analysis is both thorough and insightful (IMHO). Keep up the great work!!
I didn’t realize Europe/UK waited 10-15yrs to get color tv 📺
erikig In the Netherlands the waiting was more a political than technical issue.
The government had spent a lot of money into B&W equipment.
They just didn’t want to invest in color.
The PAL system was invented in 1962 by Walter Bruch. The first PAL broadcast in most of Europe was not until 1967.
Don't worry, Australia didn't get tv (B&W) until 1956, then colour television came 19 years later in 1975.
@@batoff01 NZ beat ya by 2 years ...
;-)
@@blitzwing1 I think you misspelled 'Lenin room' there :-)
It wasn't affordable for most people until around then anyway. From what I've read, colour TV sales only overtook B&W ones in 1972 in the US and 1975 in the UK.
Also worth mentioning regarding the interleaving of Y and C signals is that when viewed on a spectrum analyzer, the luminance signal appears as a series of frequency peaks with gaps between them, such that when the chrominance signal is well designed, it will fit in those gaps with little overlap.
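The interleaving works because of a deliberate choice of numbers, which a few lines of arithmetic make visible (NTSC figures, slightly rounded): luma energy clusters at multiples of the line rate, and the subcarrier is an odd multiple (455) of half the line rate, so it lands exactly halfway between two luma clusters.

```python
# NTSC frequency interleaving: the subcarrier sits in the gaps between
# the luminance spectrum's line-rate harmonics.
line_rate = 15734.2657                 # Hz, NTSC horizontal frequency (approx.)
subcarrier = 455 / 2 * line_rate       # ~3.579545 MHz

nearest_harmonic = round(subcarrier / line_rate) * line_rate
offset = abs(subcarrier - nearest_harmonic)
# offset equals half the line rate: dead centre between two luma peaks
```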
Hey, I love your videos, and please keep adding subtitles to the videos. I'm learning English and it helps a lot to understand the video completely, because it has so many different words, specific words, so the subtitles help me understand!
Same
I am from the UK and just about old enough to remember the crossover between 405 line and 625 line transmissions. For a while people would have dual standard TVs capable of receiving both 405 line transmissions on VHF and 625 line transmissions on UHF. This led to a messy mix of antennas on the roofs of houses. Back then renting your TV set was very common in the UK, rather than owning your own set, so the changeover did not really affect people too much; the rental company simply swapped your set when the time came. And remembering just how often we had to call out the TV repair man back in those days, I think renting was probably a good idea.
I love this channel so much. I'm always excited to see a new upload. As an engineer, this is like eating a perfectly nice steak, but for my brain, and it's a youtube video, and I'm bad at analogies.
I remember well the dual standard televisions I helped install in my youth. They were capable of both 405 & 625 lines with complicated switching between the two systems. I remember assisting the installation of a Philips 26" dual standard colour TV that was huge! It was mostly valve (vacuum tube) based and soaked up over 500 watts from the mains! Watching a black & white programme on the 405 line system on the 26" screen was like viewing it through a Venetian blind! It took a couple of hours for the TV guy to set up the convergence, purity, barrel/pincushion distortion, settings, etc. after he had degaussed the shadowmask tube with a mains powered coil.
We were amazed when colour sets could be imported from Japan, plugged in and work without ANY setting up at all!
Phase alternation was actually tried by the NTSC. At the time, however, no 1-line video delay technology was available, so the cancellation of phase errors would depend on the eye doing the cancellation. This is not perfect, and could result in visible flicker or horizontal line patterns in the picture. It was decided that a reliable improvement could not be obtained and the technique was abandoned. When phase alternation was adopted for PAL, economical acoustic delay lines for use in receivers were available. It was found that poorly operating receivers could produce visible horizontal line patterns, which were nicknamed "Hannover bars" by some, based on the city where PAL was invented. These same delay lines were used in top-line NTSC receivers to make "comb" filters to reduce the interference between chroma and luma. Comb filtering was much more difficult in PAL receivers and generally not attempted, due to one phase of the subcarrier not alternating between lines while the other phase did.
Imagine if they actually kept that
Everyone would be using NTSC, PAL wouldn't even exist, and things would be less confusing in general.
A version of the color wheel system (at the recording end) was used to send color video back from the Apollo Moon missions, because a three-tube color video camera was just too finicky and bulky to send, but there were usable monochrome cameras that could be fitted with a mechanical color wheel. The interleaved signal was converted to NTSC on Earth using a whole lot of old-school analog video wizardry. I think the earlier missions only had the color-wheel camera inside the spacecraft, but later ones actually took them out on the lunar surface to send back live color TV. I saw a webpage somewhere that argued that the format was basically identical in some sense to Col-R-Tel.
Yes. The Apollo 11 mission used only B&W, running at 320 lines and 10 fps, and up-converted on Earth. Colour wheel was used subsequently. Wikipedia has a great article on this; "Apollo TV Camera".
When I was in London in 2002, one of the channels ran "The Matrix" and it was a WEIRD experience for me. Having seen the movie a whole bunch of times, I could instantly pick up that something was strange about the audio. Everything was a little too quick. I was certain of it, as my friends and I would do Agent Smith impressions all the time and the exact timing of his dialog is key. I don't know if they still do this, but it was very interesting to see. Um, hear.
Happens in Australia quite a bit, too (PAL 50Hz, 25 fps).
i love learning about this kind of stuff... the geeky questions i pondered as a kid before i had free access to the internet. you're well spoken and thanks for putting in the effort
Colour recovery from chroma crawl was theorized by James Insell after spotting colour breakthrough on UK broadcasts of telerecordings of Dr Who programmes, and further developed an idea of what was happening with Steve Roberts of the Dr Who restoration team. Richard T. Russell then wrote the software that actually put the theory into practice. I can not get my head around the maths, but my understanding is that at present it is believed that such colour recovery is only possible on PAL source material. I believe this is due to some aspect of the phase of the colour burst being impossible to recover on NTSC signals. I would not rule it out however.
There are a lot of recordings that colour recovery is not possible on. The UK archives are full of anomalies, including programmes that shot on colour equipment that were only ever available in monochrome due to industrial action during the time of production that meant staff refused to operate with the saturation dials turned up. As a result, there are a few programmes with really faint colour as the dials were not turned all the way down, or because their minimum setting was not actually zero!
I should also point out that when home computers arrived, the Amiga was the only machine that exploited the NTSC and PAL systems to its advantage. Both varieties of the machine clocked their CPUs and other custom chips at 7.16 MHz and 7.09 MHz respectively, so that they would operate at a speed that (through some maths) would complement some aspect of the video system. Having the CPU and custom hardware set at these particular rates reduced timing overheads and opened up opportunities for exploiting the hardware further. You could generate a lot of effects and work with a lot of video applications in the professional domain that would otherwise have been impossible without a very expensive high end system.
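The "some maths" here appears to be a simple ratio to the colour subcarrier. As a hedged sketch (these are the commonly cited Amiga figures, so treat the exact values as approximate): the NTSC machine's clock is twice the NTSC subcarrier, and the PAL machine's is 8/5 of the PAL subcarrier.

```python
# Amiga system clocks as ratios of the colour subcarriers (approx. Hz).
ntsc_subcarrier = 3_579_545          # NTSC: ~3.579545 MHz
pal_subcarrier  = 4_433_619          # PAL: ~4.43361875 MHz, rounded down

ntsc_cpu = 2 * ntsc_subcarrier       # 7_159_090 Hz, the "7.16 MHz" figure
pal_cpu  = pal_subcarrier * 8 // 5   # 7_093_790 Hz, the "7.09 MHz" figure
```

Deriving the whole machine's timing from the subcarrier is what made genlocking and cheap video overlay practical on the Amiga.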
Quite often, programmes produced during the colour strike were rendered B&W by the simple expedient of actually removing the colour tubes from the cameras.
In every attempt I have made to learn about color analog TV, I noticed sources abbreviate different highly technical stuff. I appreciate your willingness to speculate given the lack of definitive sources. Knowing the risk that I have previously learned others' "errors", I believe (but am not certain) that you have not diagrammed the TV signal correctly at 6:25. May I posit:
1. You portray sync, then back porch, then color burst. I previously learned that the back porch comes AFTER active video and the front porch comes BEFORE active video, also carrying the 3.57~MHz (NTSC) color burst.
2. You portray the front porch following the active video; I suggest that this is the back porch.
3. The front porch is the zero-volt sync that provides what's really known as the horizontal retrace (the raster sweep moves back horizontally and 2 lines down [remember the interlaced scan] for the next raster/sweep). The color burst modulates the front porch to not more than two volts so as not to confuse B&W TVs. The active video provides 2V to 5V for black-to-white video. I'm still pretty sure that I am missing a smaller voltage change between these two points (front porch & active video). Each line of active video still comes with more front porches until the bottom of the field, which provides a back porch (really called the vertical retrace) to return to the top of the screen.
Bonus: the back porch carries 2 characters (16 bits) of B&W dots for closed captioning, again with the voltage not high enough to confuse non-CC TVs. Thank you for reading my two cents and feel free to reply & attack. Color analog TV was the single most complicated circuitry in anyone's home and depended on the widest variety of other technologies. Information on how it worked for an audience of non-engineers must necessarily be fraught with not just errors, but summary data, guesses, & speculation. To be sure, I highly respect the host and he jibes with and taught me more than much else I learned elsewhere.
Whew!
Seriously, the best explanation for me. I was always wondering about PAL and NTSC frame rates.
Now, I can calmly go and explain myself. LOL
This is really an underrated channel. This guy could teach in college.
I never learned about YUV or YCbCr encoding and why it exists until I encountered this channel. Turns out, it's extremely useful
I told you Camarena was more of a myth than a fact. It actually turned out worse than what I already had found out about him. Great video. I'm glad it hasn't got any hate from Mexican viewers that still believe the myth.
BTW, we in Mexico almost had the bicolor system as standard. Camarena was on the board of Telesistema Mexicano, the foundation of today's Televisa, after the merger that created TSM with his own station XHGC channel 5 (guess what the GC stands for in the call sign). An electronics associate was ready to deliver the TV sets to department stores, but Camarena died in a car accident. This was a major setback for Camarena's system, and soon after, the government opted for the NTSC standard, probably in an act of spite against the privately owned TSM, as the government wanted its own TV network and tried to force the networks into bankruptcy so it could expropriate them, which in the end happened with Channel 13. If Camarena and Televisa had succeeded, they would have become an even worse monopoly, as they would have provided not only the programming but the sets as well.
Thank goodness!
Not a myth either. Color television was made by many people, making different systems and contributions. Camarena knew how to "sell" this idea in the US, and that's what he did. That's why he is so famous, along with integrating the new colour TV infrastructure in Mexico.
I'm here because they're everywhere saying they invented color TV, and I looked it up and it's not even a little bit true
@@RodolfoAmbriz not
@@RodolfoAmbriz he is not famous
Something that may fit in here or interest you: colour LCD backlit monitors for fairly cheap displays. The LCD areas for a colour are open when the RGB LED backlight is the appropriate colour, and since the backlight can be switched rapidly, you create a colour image from a simple RGB source and LCD display without complex pixel mapping, just by managing the timing and persistence of vision.
Interesting! I wasn't aware of that colour restoration of old video programs that were recorded off TV screens to 16mm. Good thing the BBC kept these film reels in their vault to be scanned in HD years later. I wonder how much could be achieved if these were scanned at 4K? Maybe the results would be a much more precise restoration, with more details of the dots preserved. The fact that even regular Full HD scans have brought such good results speaks well of the film stock they used.
That 'whew' at the end was for you and me both! That was a LOT of info in 12 minutes!
mmh. Really all a matter of taste when it comes to film.
PAL introduces a timing discrepancy; 3:2 pulldown introduces a visual discrepancy. Based on your own diagram, 3 of 5 frames are correct, 2 out of 5 are awkward blended frames, and one out of 4 of the original frames never gets displayed cleanly.
Some people have higher sensitivity to visual artefacts, some more to temporal.
Eventually this was solved more thoroughly, but that took a long time...
One of the best channels about retro technology. Possibly the best!
You're publishing more often! Cool! You do quality work on this channel
I agree with you ;)
Yea loving his work! Keep it up!
Yes! Those early red, green, blue dots are what I used to see when I put my eye(s) against the screen of our 1960's color TV when I was a kid. Thanks for showing them, I was beginning to wonder if my recollections were inaccurate.
PAL wasn't that bad for watching TV programs or films; cinema was only 24 pictures per second, for example. So 25 fps was fine for video, and in exchange you had better resolution. The problem was with video games: we played mostly video games from NTSC regions (Japan and America), but they came with the lower NTSC resolution and also the lower PAL fps (and frame rate is important in video games).
Strictly speaking, "PAL" was just the colour encoding system and didn't specify the frame rate. Apparently Brazil used (and still(?) uses) a variant called PAL-M which used PAL colour encoding but with 525-line, 30fps (i.e. otherwise the same as NTSC).
Ironically, "PAL" was so often used as a synonym for 625-line/25-fps, you sometimes saw it in cases where PAL itself wasn't even used (e.g. a PS1 using a component colour connection).
As far as "regular" 625-line/25-fps "PAL"(!) goes, it should be made clear for those who weren't there at the time that the problem with the frame rate and older computers and consoles wasn't primarily that the screen refresh was lower and more flickery.
The bigger problem was that, since many games back then were both timed and synchronised to the frame refresh (e.g. using the inter-frame "vertical blank interrupt" for certain calculations and updates), NTSC games ran 16-17% slower on PAL systems, even if the CPU in the console/computer itself ran at a similar speed. (This was the case with the Atari 800, for example.)
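The 16-17% figure follows directly from the field rates: a game that advances its logic once per vertical blank runs at whatever rate the display refreshes. A quick sketch of the arithmetic (nothing console-specific assumed):

```python
# A game stepping its logic once per vertical blank runs at the display's
# field rate, so the same code runs slower on a 50 Hz PAL display than on
# a 60 Hz NTSC one.
ntsc_fields_per_sec = 60
pal_fields_per_sec = 50

slowdown_pct = (1 - pal_fields_per_sec / ntsc_fields_per_sec) * 100
print(f"PAL runs {slowdown_pct:.1f}% slower")  # PAL runs 16.7% slower
```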
I understand almost precisely nothing of this but I love listening to it!
Excellent video, can't wait for the next one.
Also, fun fact about SECAM: here in Poland we have used this standard until the end of the 90s, and later switched to PAL - I always wondered why, and never got the answer.
Reds SECAM was used by the Soviet Bloc (and France).
Eastern Bloc countries used SECAM because it made it harder for people to watch TV from the democratic West. The only Western country which used SECAM instead of PAL was France, and that was too far away from Eastern Europe for watching French programmes to be possible. All the Western countries next to Warsaw Pact ones used PAL, so when the Warsaw Pact chose SECAM, it made tuning in to naughty Western TV programmes harder.
en.wikipedia.org/wiki/SECAM#/media/File:PAL-NTSC-SECAM.svg
en.wikipedia.org/wiki/SECAM#The_spread_of_SECAM
"The adoption of SECAM in Eastern Europe has been attributed to Cold War political machinations. According to this explanation, East German political authorities were well aware of West German television's popularity and adopted SECAM rather than the PAL encoding used in West Germany. This did not hinder mutual reception in black & white, because the underlying TV standards remained essentially the same in both parts of Germany. However, East Germans responded by buying PAL decoders for their SECAM sets. Eventually, the government in East Berlin stopped paying attention to so-called "Republikflucht via Fernsehen", or "defection via television". Later East German-produced TV sets even included a dual standard PAL/SECAM decoder."
After Communism collapsed people probably changed to PAL because PAL sets were more widely available.
Though if you were going to stop people watching Western TV, it would have been better to have a completely different, incompatible standard; SECAM TVs can still show a PAL broadcast in monochrome. But then the USSR and Warsaw Pact were never much good at innovation and manufacturing.
Whatever the reason, all the Warsaw Pact countries migrated from SECAM to PAL after the Berlin Wall fell, communism ended and the Russians left
en.wikipedia.org/wiki/SECAM#Migration_from_SECAM_to_PAL
5Rounds Rapid
That would explain French politics...
>After Communism collapsed people probably changed to PAL because PAL sets were more widely available.
I already knew about the fact that the whole Eastern Block used SECAM, just wondered why we changed from SECAM to PAL. That is just the explanation I wanted, and it sounds reasonable enough to be true.
Bero256, why was it easier? SECAM and PAL are kind of similar; the only things that differ between them are the sound being placed in a different part of the signal and the colour being coded in a different way. Frame rate and number of lines, the two things that could actually make a difference in editing video, are the same in both SECAM and PAL.
In the '60s you had to buy a special BBC2 aerial (antenna) to receive that channel, so you could see which houses had it by looking for the extra aerial on the chimney. After 625 lines became standard across all channels, the term "BBC2 aerial" faded into history. TV engineers I worked with back in the day quipped that PAL meant "Picture Always Lovely", usually followed by some derogatory NTSC backronym.
When NTSC color broadcasts began, many viewers experienced tint issues. The tint issues were solved for people in Europe by introducing PAL instead of NTSC. In PAL, the same disturbance of the signal that causes NTSC colors to go wrong causes only a slight decolorization of the image. In other words, in case of bad reception, with PAL you see a picture that's a mixture of the correct color picture and a black-and-white picture. That's far more acceptable than the tint errors of NTSC, since we were used to watching black and white before NTSC and PAL. So, at that moment, PAL was better than NTSC.
It has remained so ever since. End of story.
No! A game changer happened. Several redundant game changers happened.
The first one is cable TV. It's not just that cable TV has better reception, or should have; sometimes you can get a bad signal from the cable provider. The color errors of NTSC happen because of multipathing. The reception of an OTA signal can be as bad as it gets, but if there's no multipathing going on, the colors are correct with NTSC. All too often, though, there is multipathing OTA. Cable TV and satellite TV cannot suffer multipathing. Whether the reception is good or bad, NTSC always produces the correct color, and PAL has no desaturation issues, when watching TV over cable or satellite.
Another game changer is the "plus" package. There's PALplus and NTSCplus. They're downward compatible with PAL or NTSC respectively, the way NTSC and PAL are downward compatible with b&w. With the plus package, the color remains stable even when multipathing is going on.
These changes made the problems of PAL and NTSC go away, and NTSC is no longer worse in the way it once was. Now we can look at more minute details of both systems and find that PAL has disadvantages. This www.hawestv.com/mtv_2color/mtv_2colorP3.htm website states that PAL has a lower gamut than NTSC.
In NTSC, Q is broadcast with less bandwidth and less signal strength than I. Q can be broadcast with less because we don't see the difference. In PAL, U and V use the same bandwidth; in consequence, I and Q use the same bandwidth. It can't be in lower-sideband modulation but must be in double-sideband modulation. This means I must be broadcast with less detail, a difference which can be seen.
The pattern of dots on a b&w screen, as shown at 5:35, was more intense in PAL than in NTSC. For one thing, because Q was transmitted with lower amplitude in NTSC. On the other hand, because etc
In the end, NTSC was the better system. At first we waited so long to introduce colour TV here in Europe, just to have a better system. Then our glory didn't last long.
My mom had one of the first TVs in Tacoma, WA. Later, when color TV came out, her family members would fight over the tint, and one (an uncle?) would just turn it back from color to B&W. I never understood this, but your video explains it all - the dependence on phase alone, the inconsistencies of vacuum tubes, and variations in Earth's magnetic field caused problems that my youth never exposed me to.
One of the various issues with NTSC, PAL and SECAM is the ability to maintain quality over long cable transmission (esp. in the days before satellites)... ironically, apparently, NTSC was more stable and easier to manage in such conditions, perhaps because it could be rectified more easily. SECAM was notoriously difficult to use in the studio (chroma key, for instance, is impossible), so even in France programs were created in PAL and transmitted in SECAM. As for frame rate, many UK productions were filmed at 25fps for obvious reasons.
Of course, in the UK during the '50s the BBC experimented with 405-line NTSC colour... but in the end the decision was made to hold off colour until 625-line came on stream. I suspect because 405-line NTSC displayed on sets of the day looked as good as 625-line in an era when CRTs had relatively lower resolution.
In the valve (tube) era stability of transmission chains was a big issue for the BBC and I suspect that was also a major deciding point ..but also I get the feeling the Beeb simply did not want colour at the time. ITV was pushing hard! Advertisers wanted colour.
As for NTSC vs PAL... forget it. Side by side, NTSC-M vs PAL-G is all about resolution, stability and colour accuracy, and PAL-G wins on those, but flicker is a more subjective issue. I have never been worried by 50Hz flicker; I have lived with it all my life. These days, of course, 525-line NTSC at its best is good, but somehow reds and oranges never look right to me: start an NTSC DVD and the WB logo looks orange, not gold. Purely subjective...
Regarding the first paragraph... NTSC's hue fluctuation issues were only ever an issue in over-the-air broadcasts. Once put into a wire, it was rock-solid.
Regarding the last... I suspect the color issue was due to the display not really being set up for NTSC image reproduction, as I've never had those issues on my American television. If your display is configured for PAL's different color gamut and gamma curve, NTSC sources will look wrong. And those adjustments aren't controls that were ever readily accessible to the end user.
Why do i love this stuff so much
What about the SECAM system which was used in France and Eastern Europe. What was the difference between NTSC, PAL and SECAM?
SECAM used a narrow band FM carrier for the color. SECAM also only transmitted one color component per line, which required a line buffer (delay line memory) to store one of the color components while the other was being transmitted. NTSC/PAL didn't do this since it used QAM for the color components.
QAM is a bit tricky to understand, but basically, if you look at a sine wave, there are spots where the wave crosses zero that carry no information at that instant. QAM takes another AM wave and spaces it 90° out of phase, so its peaks fall in those gaps and you can have two signals in the same space.
With FM, since the gaps are always moving around, you can't do that, so to get both color components with FM you need the line buffer. This also cuts the vertical resolution of the color signal in half, but since the horizontal resolution was already poor to begin with, that's not a big issue, though it is still worse.
SECAM could theoretically have the best color reproduction, but it was terribly complicated and was vulnerable to the FM cliff effect for the color signal.
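The quadrature trick described above can be sketched numerically: two independent values ride one carrier, one on a cosine and one on a sine, and each is recovered by multiplying by the matching reference and averaging over whole cycles. (A toy illustration, not a broadcast-accurate decoder; the sample counts are arbitrary.)

```python
import math

SPC = 100                 # samples per subcarrier cycle (e.g. a ~3.58 MHz carrier)
N = SPC * 50              # exactly 50 whole cycles, so the averages come out clean

i_val, q_val = 0.6, -0.3  # two independent chroma values to carry at once

cos_ref = [math.cos(2 * math.pi * k / SPC) for k in range(N)]
sin_ref = [math.sin(2 * math.pi * k / SPC) for k in range(N)]

# QAM: one value on the cosine carrier, the other on the sine (90 degrees apart)
s = [i_val * c + q_val * q for c, q in zip(cos_ref, sin_ref)]

# Synchronous detection: multiply by the matching reference, average, scale by 2
i_rec = 2 * sum(a * c for a, c in zip(s, cos_ref)) / N
q_rec = 2 * sum(a * q for a, q in zip(s, sin_ref)) / N
print(round(i_rec, 6), round(q_rec, 6))  # 0.6 -0.3
```

Because the two references are orthogonal over whole cycles, each detector sees only its own component; that orthogonality is what FM lacks, hence SECAM's line-sequential approach.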
kargaroc386 Didn't SECAM also encode video signals in YDbDr format, which has luma, along with two separate chroma subcarriers that ride along the luma channel in a composite signal? I know it was also the only analog composite video standard to do this along with both vertical and horizontal subsampling thanks to having two delay lines instead of just one like PAL. I just wanted to verify.
No. Both PAL and SECAM TV sets have two delay lines. One corrects luma/chroma delay and is used in NTSC too (NTSC uses another I/Q delay line). The second is for chroma: in both systems it is used for adding successive lines together, in PAL to correct phase error, in SECAM to join the red and blue chroma components.
@@xsc1000 This answer is a bit confusing. All color receivers require a short luminance delay line of one microsecond or so because there is a delay of the chroma due to the narrow bandwidth of the chroma circuits. The other delay line that is discussed is a much longer one-line delay that is used for other purposes: in NTSC, as a comb filter; in PAL, as a phase error canceller; in SECAM, to make both chroma components available simultaneously.
Your videos are a great starting point for more detailed research :) you present all the basics in a very intelligible format. Good stuff.
Great video. However, as has likely been pointed out, the frame-rate difference between the US and Europe (60 fields/s NTSC compared to 50 fields/s PAL) is down to our European mains frequency of 50 Hz compared to yours at 60 Hz, with each frame consisting of two interlaced fields in good old analogue. That's the reason.
A brilliant addendum to the previous videos! My head’s spinning; I like that in a video
watching this on a good CRT monitor. :D
I loved how thorough you were with this whole explanation. I’m very much looking forward to your upcoming video on the Trinitron tubes!
lol, resolution over framerate. And now everyone wants framerate over resolution.
I know they're not the same thing (audio/video vs video games), but still.
Besides the difference between video and games, back then the resolution was the weak point. Having 25-30 fps is fine for TV and movies, and was plenty for the games of that era. But 525/625 lines is pretty bad, especially with analog signals and CRT displays. Today, resolution has improved to the point where it's usually not an issue.
4K rocks; anything more than that is pointless unless you have a really big screen up close. As for frame rates, there is a slight benefit to 120/144Hz over 60; anything more than that, though, is even more pointless.
They're completely different because when watching TV you don't need to worry about how quickly you see a response to your input. Seeing a faster response to pressing a controller button makes the controls feel more precise, and that's the primary reason gamers prefer higher frame rates.
I'm glad this was cleared up and explained better than I expected.
And in the former Soviet Union we call SECAM the worst color system, and we say SECAM was a mistake... especially when recording to VHS in MESECAM.
If I remember right, the worst thing about SECAM is that they didn't invert the luminance signal. In NTSC, a strong signal means "dark", but in SECAM it means "bright". RCA figured that snow in the bright areas (where the signal would be weaker) would be less annoying than snow in dark areas.
SECAM had two color carriers and was capable of both vertical and horizontal subsampling, making it the first color, analog, composite video, television standard to do this. I think SECAM looked the best.
+Scott Larson SECAM is a colour system; it does not define the underlying b&w signal. It's true that France uses a system where amplitude and brightness are reversed relative to all other systems, but in Eastern Europe it is (or was) used with systems D/K and B/G, which are the same as in some PAL countries.
The SECAM specification certainly does define the underlying monochrome signal. All color systems rely on a specific monochrome system otherwise the color system won't work. That's why Brazil doesn't use PAL. It uses a variant called PAL-M because PAL as used in the rest of the world wouldn't work with the NTSC-based monochrome system.
+Scott Larson Sorry you're in error. Any colour system can be adapted to any monochrome standard. Not all possible combination exist in real life, but they are theoretically possible. SECAM has been used with ITU standards L, B, G, D and K, PAL with B, G, D, K, I and M. The latter is the combination used in Brazil, and that is absolutely a PAL standard. And this is also why it's called PAL-M, it means PAL on top of ITU M. Using the same naming system, the standard for UK is PAL-I, most of Western Europe PAL-B/G, France SECAM-L, Eastern Europe SECAM-D/K or SECAM-B/G, mainland China PAL-D/K. That is what those combinations mean. It does not mean they're "not really PAL". And only system L as used in France is the one where luminance is inverted, all the SECAM variants used in Eastern Europe aren't.
If there was a country which used NTSC on top of one of the 625 line signals, it would be e.g. NTSC-B/G, or in the UK NTSC-I (I think the BBC seriously considered this option). The BBC also tested all of the three colour systems with their 405 line standard A. If they had gone that route, it would be NTSC-A, SECAM-A or PAL-A, whichever it would have been.
en.wikipedia.org/wiki/Broadcast_television_systems#ITU_standards
Note that list defines as one of its parameters the frequency of the chroma subcarrier, but it doesn't specify how it is to be modulated.
I was wrong! I thought, with my background and hobbyist research, this channel could not tell me something new about analogue television. But reconstructing color out of monochrome artifacts? I know these artifacts from first-hand experience but would never have thought it possible to derive the color from them. That is... fantastic!
Lol, I actually appeared in someone else's video. Achievement unlocked.
Achievement unlocked, indeed! I guess you didn't notice it the first time :)
No I didn't and it is weird seeing myself appearing on a much more popular channel. I still prefer the idea of speeding up by 1 fps though, no resolution was lost and no frames had to be blended leaving the picture quality unaffected but I see that there are mixed opinions regarding sound.
Heh, looks like I unlocked the same achievement :P
Resolution isn't really "lost" in a 3:2 pulldown. The fields stay onscreen for either one full frame or 1.5 frames' worth of time. There is always one full frame visible for each film frame over a two-field period; some frames just stay longer, for a third field. The only time resolution is lost is if you have captured a frame rate (distinct points in time) that exceeds half the field rate. Blending of frames still happens with 50i; it just happens *every* frame if both top and bottom fields represent the same point in time. Field 1b to field 2a is a blend, even if the transition between field 1a and field 1b is not.
The complaint is that for fast action scenes (or fast horizontal pans), the 3:2 pulldown's effect of displaying a frame where the two fields come from different source frames can produce some horizontal image-tearing, and jagged edges on high contrast mostly-vertical edges.
I've noticed this on commercial VHS tapes and on DVDs. I've never noticed it on analog TV broadcasts or VHS recordings from TV broadcasts. Probably because the broadcast itself is a bit fuzzy to begin with.
But I still think it's better than speeding up the movie. My ears are very sensitive to a soundtrack that's sped up by even 4%. Everything will just feel "wrong" to me, even if I'm not sure why.
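The pulldown cadence being argued about in this thread can be written out directly. This sketch maps four 24 fps film frames onto ten 60i fields and then pairs the fields into displayed frames, showing which displayed frames mix two different film frames (illustrative only; real decks interleave top/bottom fields):

```python
def pulldown_32(film_frames):
    """Map 24 fps film frames to 60i fields: alternately 2 then 3 fields each."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

fields = pulldown_32(list("ABCD"))
print(fields)   # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']

# Pair consecutive fields into displayed interlaced frames:
frames = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
print(frames)   # [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
```

Two of the five displayed frames, ('B', 'C') and ('C', 'D'), combine fields from different film frames, which is the source of the tearing on pans mentioned above.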
The frame-rate difference between PAL and NTSC is also dictated by the frequency of the AC mains. Artificial light pulsates in step with the 50 Hz mains in Europe; if you record at 30fps, the image will flicker.
Way to call out the PAL fans. Europe was WAY behind with their color broadcast standards. It's like comparing a mid 60's Mercedes to a mid 50's Chevrolet. A lot gets improved in a decade. For mid 50's technology that was also backward compatible color NTSC was pretty amazing. And, as you pointed out, the few shortcomings were addressed by the time PAL started to take off.
Sir David Attenborough talks about his early efforts to promote the new broadcast standard.
ua-cam.com/video/22uUS8vq48c/v-deo.html
If you think Europe was behind, you should have seen Australia. We didn't start broadcasting in PAL until 1974. On the other hand, we did handily beat the US for sheer speed of adoption once we did get colour TV going.
Not quite right, Daro. BBC2 began 625-line *b&w* broadcasts in 1964, colour beginning in 1967. BBC1 and ITV began colour broadcasts in 1969.
Shared Knowledge somewhere else on UA-cam, there is a video of the Bert Kaempfert orchestra from the first color tv broadcast in Europe (1968)
Love your videos you make the most complex things simpler and easier to understand
1:37 ah yes, the broad cats.
😂😂
I know literally nothing about the history of the tech in this series of videos. Thanks for making such interesting content, and thanks for not being a channel that over-edits; instead you trust your viewers' attention span. I wish Linus would watch some of your content and have a word with his editing team.
Subbed.
UA-cam recommended this video again, and I watched it.
It is true that some TVs could produce an interference pattern caused by the difference between the chroma carrier and sound carrier frequencies, but this was much coarser than the chroma dot pattern. The fine 'chroma dots' were simply caused by the colour subcarrier being superimposed upon the luminance signal. The colour system used the suppressed-carrier AM technique, which is why the dots disappeared in uncoloured areas. The reference frequency of the colour subcarrier was not a whole multiple of the frame rate, which meant that the exact phasing of the colour subcarrier was different on consecutive frames. This meant that the chroma dots would move, and hence be less noticeable, by averaging out over several frames. It also meant that false colour artifacts, produced by components of the luminance signal being wrongly decoded as colour, would appear different on consecutive frames and produce a flickering effect.
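The "not a whole multiple of the frame rate" point above can be checked with exact arithmetic using the standard NTSC numbers: the subcarrier completes a half-integer number of cycles per frame, so its phase flips 180° on consecutive frames, which is exactly why the dot pattern crawls and averages out. (A quick check, nothing assumed beyond the published NTSC constants:)

```python
from fractions import Fraction

fsc = Fraction(315_000_000, 88)   # NTSC colour subcarrier = 315/88 MHz
line_rate = fsc * 2 / 455         # subcarrier is 455/2 times the line rate
frame_rate = line_rate / 525      # 525 lines per frame (~29.97 Hz)

cycles_per_frame = fsc / frame_rate
print(cycles_per_frame)           # 238875/2 -> half a cycle left over per frame
```

Using `Fraction` keeps the ratio exact: 238875/2 cycles per frame means the subcarrier phase inverts each frame and only repeats every two frames (four fields).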
Why no one talks about *SECAM* !!
It's pretty much PAL without the need of the license from Telefunken.
rust actually, SECAM is technically very different and a lot more complex. FWIW, the color reproduction was also terrible compared to PAL.
8 Bits SECAM, when used properly, was effectively PAL but with basically perfect colour. Unfortunately, the way colour was handled also made SECAM a gargantuan pain to work with: any kind of editing or VFX work required decoding the SECAM colour to a component format and then re-encoding to SECAM afterwards. Almost all video editing equipment used composite formats, which meant that SECAM programs with lots of editing or VFX work suffer from artefacts similar to PAL's. Eventually, many programs just started being produced in PAL and converted to SECAM afterwards, which meant the composite colour artefacts were actually worse than PAL's, since the signal had to be converted first.
We don't like secam.
Good job on the explanation so far. This series of videos may be the best on the subject on the whole Internet. Keep up the good work :)
Can you make a video about SECAM?
You must be the most knowledgeable person on the planet in this field.
I own a Trinitron from I believe the late 80s. If there's anything I can do to help you out with your Trinitron video, let me know. Also I made a video response for another one of your videos. Still in editing, but when it's done I think you'll like it.
"wrap-up -- some extra info", video exactly 1 minute shorter than the video it's follows. Love that!
Why, after all these years, doesn't UA-cam have an option for 576 resolution?
WONDERFUL!!! This video explained to me just how they colorize films. Now I know the colors are not really too far off from what they actually were. Thanks!
The method described in this video only applies to films taken of color TV programs via telecine. It wouldn't work for footage that never existed in color, or for black-and-white prints of films shot in color. In those cases, other methods would have to be used.
What about a video about DVB vs ATSC ;) xD
The rabbit hole would become an underground labyrinth to surpass carpenter ants...
Lol, that's probably true.
What about the Japanese ISDB-T?
When you get into digital television (and probably radio too) they're just methods to transmit an MPEG Transport Stream. All the mess with frame rates, etc is left to the codecs. I gather most televisions can handle any common frame rate anyway. The only problem now is updating codecs e.g MPEG-2 video to MPEG-4 AVC (a.k.a "h.264").
And now we have to update to the newest HEVC "H.265" and open source rival AV1 codecs for 4K video :-P
NTSC vs PAL is the classic first to market worst to market scenario for exactly the reasons you described. The later technology gets to learn from the earlier mistakes.
We experienced the same issue when moving to digital TV. The UK pushed really hard with the Freeview system. The result was lower-power transmissions (so as not to interfere with the analogue signals on nearby frequencies), so you needed bigger aerials to even receive it, and MPEG-2 encoding (hardware decoders for anything better were not available or were too expensive to put in cheap set-top boxes), so the video quality was not much better than the original analogue. The rest of Europe started their transmissions years later, so their equipment supported MPEG-4 encoding and higher bandwidth. The result is that European standard-definition digital transmissions are better quality, and almost as good as the later-introduced high-definition digital TV in the UK, which required everyone to buy new boxes again.
A switch from 405 lines to 625 in the UK wasn't an issue, as most people rented their TVs rather than buying them outright.
Thanks for that. A couple of asides. 1) Over here in the UK, we are told that the alternating-phase burst was thought up in our Post Office Research Labs, which also gave us the analog computer, Enigma cracking etc etc. The math, drawings and stuff were given (not sold) to Telefunken on government orders as part of a rehabilitation program for (West) Germany. The Post Office department involved went into a fugue, especially since some years earlier they had been ordered to destroy all their computers (the first!) on orders originating in Langley, VA to stop the Roooshiuns fum geddnum. Although the unit kept working, pioneering scads of technologies and applications, they never regained their previous ascendancy. Increasingly, politics intruded into engineering considerations. Total f***up ensued.
2) Thanks for the wee tip of the hat to John Logie Baird. However, my understanding is that after his mechanical system was rejected, with its demonstrated two-color capacity, Baird threw himself into electronic television so hard he almost came out the other side. His hiring of Philo T Farnsworth wasn't the brain-skimming exercise US historians usually depict. Intellectually, they were partners.
3) One of the most curious aspects of British history is the insistence that radar was "invented" by Robert Watson-Watt. Over a year before RWW did his first field trial, John Logie Baird had been assigned the first of a succession of aircraft for a still-classified series of experiments. It had to be either radar or a death ray, and JLB was a committed pacifist.
4) Before his death, John Logie Baird had conceived, designed and built an electronic three color system, approved for introduction as soon as the war was over. He had also demonstrated a full color optical 3D system. Publicity photos of this system in action may or may not be simulated. If the latter, I can't tell.
If you want to look into this, please plague New Scientist magazine with your questions. They set my feet on this trail.
I actually prefer NTSC. Watching PAL TV always bothered me just a little bit whenever I would visit Europe. Although the picture was a little bit sharper due to the higher resolution, the flicker just messed up my eyes. I can't really see the flicker at all on an NTSC screen unless I move my eyes around really fast or something. But on PAL it's just impossible to ignore the constant flicker and it always bugged my eyes out trying to watch it. I don't know, just a personal opinion but I much prefer NTSC even with the lower resolution and color issues. At least according to my eyes, the reduced flicker was well worth it. In any case, none of this is an issue at all on LCD/Plasma flat screen HDTVs. It's all moot now.
But the flickering has nothing to do with PAL/NTSC. It depends on the frame rate, and the frame rate depends on the mains frequency. So 50 Hz in Europe and 60 Hz in the US were chosen in the '30s and '40s for B/W TVs.
I grew up with PAL; to me it's just a better colour system. American NTSC always looked bad until recently. I never had an issue with the flickering, even with a CRT; it's not supposed to affect you if you are a decent distance away from the TV screen.
I grew up with PAL (50 Hz) too, but it took me years to get used to that flicker, and even at that point it wasn't pleasant.
I am 80 years old. I can remember the black & white days. I have also worked on an 800-line hi-res system (black & white only, used for briefing a military audience). I can also remember Popular Science mags saying the future would be a flat TV that fit on a wall. If WWII had not happened, who knows, but because of WWII electronics in the HF, VHF and UHF frequencies, TV after the war was cheaper. Our first TV in 1948 was a round-tube set.
The thing you've completely missed with the 4% PAL speed-up is that unless you had already watched the same movie/TV show in NTSC you would never notice it, unlike the NTSC 3:2 pulldown, which was noticeable on every panning shot.
It depends on the individual (and the television system to which one is accustomed - NTSC/ATSC, in my case).
I barely notice judder on panning shots (and when I do, it's only mildly irritating). Conversely, I find a ~4.17% increase in the pitch of people's voices highly noticeable/annoying, even if I'm unfamiliar with the production and its cast. (I realize that others have the opposite experience.)
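The ~4.17% figure falls straight out of the frame-rate arithmetic; a quick Python sketch (assuming 24 fps film simply played back at PAL's 25 fps):

```python
import math

# 24 fps film run at 25 fps: everything, including audio pitch, scales by 25/24
speedup = 25 / 24 - 1                 # fractional speed/pitch increase
semitones = 12 * math.log2(25 / 24)   # same shift expressed in semitones

print(f"{speedup:.4%}")               # ~4.1667%
print(f"{semitones:.2f} semitones")   # ~0.71 of a semitone
```

So the PAL speed-up raises every voice by about two-thirds of a semitone, which is small but quite audible to some listeners.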
I personally hate both situations. The motion is far more pleasant with PAL, but the SPEEDUP is horrid! But I will say that I can live with the speed-up and find it much less objectionable than the jerky motion from 24 to 30 fps conversions. I have always lived in the USA and grew up with NTSC, and I've hated the motion ever since I first noticed it when I was little, watching Star Trek and Lost in Space. I had not noticed it much before that.
As a result, I have actually bought quite a few TV series in PAL versions due to the better video motion!
Either flaw is correctable, of course. For 24p material to which pulldown was applied, inverse telecine results in an exact reconstruction. (Many modern TV monitors can detect the cadence and perform this function on the fly.) Speedup can be reversed through software, including a DirectShow filter called ReClock.
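To see why pulldown is exactly invertible, here is a minimal Python sketch of the 2:3 cadence (my own toy model: frames are just labels, and real inverse telecine works on interlaced fields rather than whole frames):

```python
def pulldown_23(frames):
    """Map film frames to fields: alternate frames contribute 2 or 3 fields."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

def inverse_telecine(fields):
    """Undo the cadence by stepping 2, 3, 2, 3... through the field list."""
    frames, i, step = [], 0, 0
    while i < len(fields):
        frames.append(fields[i])
        i += 2 if step % 2 == 0 else 3
        step += 1
    return frames

fields = pulldown_23(list("ABCD"))    # A A B B B C C D D D
assert inverse_telecine(fields) == list("ABCD")
```

Because pulldown only duplicates material, dropping the duplicates loses nothing, which is why inverse telecine can be an exact reconstruction.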
In the case of a DVD, I would assign preference to a 576i version for its higher resolution (provided that it was sourced from a master of that resolution or greater). Blu-ray largely eliminates the issue, as 24p material is typically released as such worldwide.
Europe did introduce color TV 10-15 years after the US, but still, in the first years, color TV in the US was so rare that we can conclude NTSC's color problems were the result of premature standardization. The system could still have been improved before standardization without much impact on the acceptance of color TV.
@Frank Silvers you are right in general, but you are missing context - I was responding to the idea that PAL was more robust because Europe introduced color TV 10-15 years after the US, and even more importantly, I said that the US problem was premature standardisation. Your comment should be addressed to this nice guy who made the video, because he was talking about the comparison to Europe.
If we want to be fair and exact, SECAM (also used in parts of Europe) predates PAL and solves NTSC's issues; PAL adapted SECAM's solution but within the NTSC signal structure.
@Frank Silvers we are comparing standards by simply stating their origin. Both European standards are younger and both solve NTSC's issues.
Please do a follow-up to this covering PAL-M, used in Brazil, which was 525 lines with PAL colour and subcarrier, as well as French and Russian SECAM. French SECAM (France, not overseas holdings) used the lowest energy levels for sync and a reversed luma polarity.
Failing to cover these weird cases would really miss out on some interesting technology tweaks.
++
COVER NICAM STEREO and A2 STEREO ... noting that the US NTSC system for stereo for telly sucked ...
eyreland SECAM utilized the base 625/50/2:1 standard with two FM chroma subcarriers that rode alongside the luma signal in Y′DbDr fashion. SECAM utilized two delay lines to handle the additional FM chroma subcarriers on top of the base luma signal, and utilized vertical subsampling in addition to horizontal subsampling, with each of the two chroma components getting its own sample on every other horizontal line. I believe this made chroma crosstalk impossible despite SECAM being composite.
Of course there was chroma/luma crosstalk in SECAM. But because of FM, it looked different than in PAL or NTSC. There were two delay lines: one for luma, as in PAL or NTSC; the second handled the vertical subsampling.
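To make the line-sequential chroma idea concrete, here is a toy Python sketch (my own invention, modelling only the one-line chroma delay, not the FM modulation): each scan line carries one of the Db/Dr components, and the receiver's one-line delay supplies the other.

```python
def transmit(db_lines, dr_lines):
    """Line-sequential chroma: even lines carry Db, odd lines carry Dr."""
    sent = []
    for i in range(len(db_lines)):
        sent.append(("Db", db_lines[i]) if i % 2 == 0 else ("Dr", dr_lines[i]))
    return sent

def receive(sent):
    """Pair each line's component with the delayed one from the line above."""
    decoded, prev = [], None
    for kind, value in sent:
        if prev is not None:
            decoded.append({kind: value, prev[0]: prev[1]})
        prev = (kind, value)
    return decoded

lines = transmit([10, 11, 12, 13], [20, 21, 22, 23])
# lines == [("Db", 10), ("Dr", 21), ("Db", 12), ("Dr", 23)]
```

The vertical chroma resolution is halved, but every displayed line ends up with both colour-difference components.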
The framerate depends on the power grid. In the US you have 60 Hz, in Europe we have 50 Hz. Movies are recorded at 24 fps, very close to PAL's 25 fps, so 30 fps is not really "better", just more convenient. In most digital video cameras, the "artistic" 24 fps option only exists for NTSC.
Last time I was this early color TVs didn't exist.
I read here a few complicated explanations re 'colour dot crawl' on a PAL or NTSC broadcast (they both do it). A good way to think of it is that when you modulate a carrier with the information, you are left with the carrier and sidebands. Those sidebands run either side of the carrier, from effectively DC to infinity, but they sit at multiples of the modulating waveform; they look like the lines in a diffraction grating. This is the crux of how the backward compatibility worked: the cleverly chosen colour sidebands slot, like a comb, into the gaps in the luminance signal. The B&W receiver would just not see the colour, and this is why the NTSC colour (color) subcarrier frequency is chosen to be not 3.58 MHz but 3.579545 MHz ±0.0003%, so it ALMOST perfectly slots in. It doesn't quite, hence dot crawl.
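The exact subcarrier value falls out of the standard NTSC relationships; a small Python check (using the well-known 4.5 MHz sound intercarrier and the 286/455 ratios from the NTSC standard):

```python
# NTSC color subcarrier derivation: the line rate was shifted slightly so the
# chroma subcarrier is an odd multiple (455) of half the line frequency, making
# its sidebands interleave with the luma harmonics like comb teeth.
f_sound = 4.5e6            # sound intercarrier, Hz
f_line = f_sound / 286     # color line rate: ~15734.27 Hz (vs 15750 Hz for B&W)
f_sc = (455 / 2) * f_line  # chroma subcarrier: ~3.579545 MHz

print(round(f_line, 2), round(f_sc, 2))
```

Because 455 is odd, the subcarrier lands exactly midway between two luma harmonics, which is what lets the two "combs" interleave.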
Common misconception repeated here. PAL does not mean 50 Hz at all. Ever heard of PAL-M? A flicker-free 60 Hz M system with the superior PAL color.
You're Brazilian.
@ellenorbjornsdottir1166 sure thing. The PAL color system gave us an edge in picture quality; however, we suffered in the VCR era, when imports from Japan and the US had to be converted in order to work in color.
In Argentina we used PAL-M too!
I was searching the comments and wondering what was the difference with regular PAL since he didn't mention PAL-M.
Now I got my answer, thank you.
@salemsaberhagen8926 hi, I believe Argentina had PAL-N, which is 50 Hz
@rogeriorogerio1007 Yes, you are right. I was just doing some googling and it was PAL-N, not M. Apparently it wasn't compatible.
Apparently PAL-M had a higher refresh rate while PAL-N was more similar to the European standard but with more scan lines.
Anyway, I was wondering what were the differences and now I know.
You don't need to artificially manipulate the Y signal since it relates to color brightness.
The dot pattern is indeed caused by the chroma signal. You can figure that out using a bar pattern generator and an oscilloscope: the dots will be more visible in the bars where the chroma signal has more amplitude.
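That relationship is easy to model; a toy Python sketch (my own simplification, treating the composite line as luma plus a sine-wave subcarrier):

```python
import math

def composite_sample(y, chroma_amp, phase, t, f_sc=3.579545e6):
    """Composite = luma + subcarrier; the 'dots' are this sine riding on Y,
    and their peak-to-peak size scales with chroma amplitude (saturation)."""
    return y + chroma_amp * math.sin(2 * math.pi * f_sc * t + phase)

# A saturated bar wiggles more around its luma level than a pastel one:
t = 1 / (4 * 3.579545e6)  # a quarter of a subcarrier cycle, at the sine peak
print(composite_sample(0.5, 0.07, 0.0, t))  # ~0.57
print(composite_sample(0.5, 0.02, 0.0, t))  # ~0.52
```

On a scope this is exactly what you see: the high-frequency ripple (and hence the dot pattern on screen) is largest in the most saturated bars.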