What I find the most fascinating is how many extra features engineers were able to cram into the analog standards. Adding in color signals, closed captions, secondary audio programs and programming data into the same bandwidth while still maintaining compatibility with older television receivers.
An analog signal can transmit a mind boggling exponentially larger amount of data than a digital one. The hard part is encoding, decoding, and dealing with decay/loss. We just very recently figured out how to transmit data with another band of light in fiber optics.
The Japanese were right about choosing a 2:1 ratio. Closer to full widescreen instead of the academy 16:9 compromise that was settled for. However, I really do like the fact that TV and computer displays are all finally using the same resolutions. Very convenient.
The first Plasma TV I ever got apparently had a physical resolution of 1024x768, but it used rectangular pixels and gave the illusion of being a 720p display until you hooked up a computer to the VGA input. I looked at the physical pixels and checked, but it turned out the resolution it was feeding to the computer was absolutely right, the pixels were just physically rectangular and stretched a 4:3 ratio out to 16:9. The reason this worked at the time was because DVDs usually achieved widescreen by being anamorphic... that is, they actually delivered a "squished" 4:3 picture that, when spread out to 16:9, looked correct. So the TV was designed to accommodate this without a lot of horizontal scaling, by just having the same aspect ratio as the DVD while having a wider screen through the odd-shaped pixels. I will say, out of all the HDTVs I ever watched, I thought that one looked the best with 4:3 content stretched to fill the screen, and I never knew why for years. But once I knew how it worked, it was obvious... because it really is a native 4:3 display, they just managed to make it wider than it is tall. The ultimate test... which worked, was ripping a Blu-Ray and downscaling it to 1024x768, using the anamorphic setting usually intended for DVDs to see if it looked better on this TV. It did... looked better than anything coming in on the HDMI port, or anything from component video. It was kind of weird, because I realized the majority of people using that TV would never see the best picture it could produce, because everything was either too high a resolution and had to be scaled down using a bad scaler, or too low a resolution and had to be scaled up using a bad scaler. But if you feed it an anamorphic widescreen DVD, or better yet a customized 1024x768 signal over the VGA port that reproduces the anamorphic effect? It's perfect. In fact, I now wonder if some of the places that sold those TVs did just that, used a VGA port and a custom signal to show the TV at it's best, using content like that I created for it by hand...
We had a Thomson Plasma with I think was 800x600 widescreen. When it broke in like 2008, I bought a 1080p Panasonic and after disabling Overscan, it was pixel perfect for my HTPC and had an insane good picture quality compared to the LCDs. It still hangs in what's now my mothers room. No one ever wants to move this heavy beast. Still have the first Plasma TV staying around. 40" Pioneer 4:3 with 640:480 from around 97. Don't know what to do with it. Needed the wall space for other things.
@@okaro6595 Yes there does and very frustratingly its very hard to find true 1280x720p panels. There basically always either 1366x768 or 1024x768... Really really annoying, so when you look at a native 1920x1080p media on a 1080p screen, most the reason it looks better next to a 720p tv is just because its native and your 720p screen panel is not. Very very few true 720p displays out there, only some HD CRT and TV will produce a decent quality image or else you need to use a little 4:3 vga crt screen in letterbox... So annoying no half decent ips panels out there, they could just provide an option on a 1366x768 TV to crop the image in a little to a dot to dot scale, but good luck trying to find a TV with such a simple choice of picture option.
The only thing that kinda pisses me off, that 720p TVs are NEVER actually 1280x720. They are almost always actually 1366x768, simply because it was easier/cheaper for manufacturers to convert existing processes to make TVs, instead of making new ones.
Yeah, that sucked. The only two ways of getting real 720p back in the day were either finding one of the super-rare, native 720p monitors or using a 720p wall projector with DVI. But there simply were no native 720p TVs. So we were robbed of the true, 1:1 picture clarity on X360 & PS3.
@@ThePreciseClimber I have a Retrotink 4k now, and you can integer scale 720p 3x to 4k, allowing you to see the true 1:1 picture quality of PS3/360 games. It looks amazing, to see the actual pixel perfect graphics. I also like to use the LCD grid effect to create a faux 720p Display, and it looks even better although it darkens the image quite a bit.
I used one as a hand-me-down for a while as a monitor while my computer still had VGA and yeah those displays look noticeably better when you feed them their actual native resolution.
@@udenszirnis1644 USB only APPEARS sensible until you look closer. Sure, the physical connection is the same, but you have USB 2, USB 3.0, USB 3.1, USB 3.2 (actually 3.0), USB 3.2 (actually 3.1), USB 3.2 (this one is genuinely 3.2), whether or not it supports thunderbolt, different levels of power delivery, etc.
Not only the France but also Monaco, Belgium , Luxembourg, Algeria ... the 819 line was a 737i @ 14Mhz in France @ 7Mhz in Belgium and Luxembourg ... 625 lines developped by CCCP was adopted with the color (PAL / SECAM) and 819 lines has been abandonned between 1972 and 1983 ...
@@SterkeYerke5555 1948-1984 ( 1985 including Monaco ). Beating 1000 lines was done in 1941 in lab by René Barthélémy. Let's remember him for this accomplishment.
This is a fantastic video! I'm an engineer who works at one of the companies that makes a lot of the broadcast equipment, and I was taught all of this during my internship. You did a great job of going through the history of video in an interesting and not drawn-out manner. I will be sharing this video with my coworkers and I'm sure many others. Keep up the good work!
@@TheAussieRepairGuy It was probably part of the original script before he changed it. The English UK subtitles are grammatically correct with punctuation and everything, which you never get with auto captions
The best reference for this topic from the U.S. perspective is the book "Defining Vision" by Joel Brinkley. It documents the entire agonizing political (and sometimes technical) story behind the seemingly endless battle for high definition television in the U.S. There were around ten proposed resolutions from various organizations at various times for various reasons and the computer industry had little input. The first proposals were analog or hybrid digital/analog systems until one company developed a fully digital system that, while wasn't still good enough, was clearly superior to all of the other proposed systems.
I must say that resolutions like 1920 x 1200 have been used for a long time among professional monitors, now you can still find laptops with high screens and a vertical resolution of 1200 pixels.
I have a 1920 x 1200 secondary screen, it used to be my main screen and it was soooo nice, especially for games and productivity, u could have the task bar at the bottom and program bar at the top while the game didnt feel squished at all, also for youtube, it makes it able to not have the progress bars of videos as overlay on video, it just stays under the video screen which is amazing when taking screenshots of paused video
I remember when 'big' Tvs were anything much bigger than like 35", lord help your mover's backs if you went bigscreen, doubly so for that brief period of HD tube TVs.
@@baronvonslambertlmfao, my Dad is 78 and has hearing loss in the high range from shooting at the gun range (was a cop), and recently his battery backup for his pc needed the battery replaced, I go in his house and it's a constant squeal from this thing, he couldn't have cared less 😂.
Honestly, pre-digital television broadcasting just feels like a sort of arcane magic to me, the more I know about it, the less sense it seems to make. It feels miraculous that any of it worked. Don't even get me started on CRTs.
Really? To me it seem's very ... primitive for the lack of better word. You can easily understand everything what happens in an analogue broadcast. Digital broadcast is the real magic. You most likely understand it only on a very high abstraction layer. Ie analogue video encoding is something you can explain in a few minutes. Digital video encoding is an art where only handfull of people know how a single frame is encoded from start to finish. You go one layer up and puting I/P/B frames and ordering them is one of the hundreds of problems and there are ten more layers to the digital broadcasting.
Peter, this is incredibly well made, detailed and informative. You should be very proud and this should be required watching for all interested in all things video.
The fact that they did not immediately realize the need for square pixels insane. PCs got square pixels already in 1987 with the VGA. Square pixels make things so much simpler.
I reckon working with a *generated* image instead of a *captured* image might have had something to do with it - it's hard enough figuring out where to put down the next black pixel when drawing a circle with square pixels - if they're not square, it's even worse. Of course, this was immediately subverted by things like the ZX Spectrum video memory structure, where the order was line 0, then 8 lines below, then another 8 lines below - for ONE THIRD of the screen; then continue with line 1, line 9, line 17 etc; then proceed to the second third and repeat. YOU try drawing a circle onto that...
Standard resolution on a CRT TV for basic programming doesn't require the same pin sharp focus of reading fine text on a monitor. Generally any image looks great on a CRT when reading fine text isn't a requirement
Yeah, if you have rectangular pixels, the software designers will often still treat them as if they are square, resulting in a stretched look. Most NES games did have that issue.
Great video. Couple things I wanted to add. SMPTE is general accepted as “simp-tee” for short. Save yourself a mouthful next time. One other cool note is that if you multiply the width by height by framerate from NTSC land(720x480x30) you get the same number as doing the same equation in PAL resolutions. (720x576x25). Both yield 10,368,000. Not sure why this is. Seeing as how both standards have the same bandwidth I’m guessing that’s just the most they could get out of it.
It's definitely not a coincidence that the line numbers multiplied by the frame rate equals the same number. Or really the same but with the total number of scanlines (525 vs 625). The resulting number is the line rate. It's (nearly) the same because the 625 line standard is based on the 525 line one.
Very interesting! A brief search of online dictionaries only mentions this sense being used in "computing" not other fields. I found a blog I'm sure I can't link, which offers the below: "The word has been used in engineering since the nineteenth century. The word ‘bug’ actually is short for Bugbear. (sometimes found as Bugaboo). It’s meaning is much closer to ‘Gremlin’, where the people who worked on engineering prototypes often grew to suspect that the problems were due to malicious spooks. I sometimes even still hear it said that some software is cursed with malicious spirits. The ‘Bug’ or ‘Bogey’ part of the word is traceable back to the fifteenth century in the meaning of ‘Hobgoblin’, devil or ghost. In East Anglia particularly, the word Bugbear’, first recorded in the sixteenth century, is still used in referring to problems with machinery."
Channel named "Our own devices" _just_ made a video about telegraphy (morse code), where he explains the term! "Bug" was a nickname for an inexperienced telegraph operator.
My Dad bought a Sony HD CRT in 1999 for around $4k (I'm not exactly sure how much). He was so proud of it but within 5 years flat-screens started coming out and it made our insanely expensive and insanely heavy HD TV already look outdated. He kept it until the late 2000s though.
and now those monsters are worth money to the right people... they're such amazing tvs and i wish i could have one. some people selling them around me but would need to reeeeeealy be ready to put it in one place and never move it again
being any ind of early adopter when it came to HD and home theater tech around the turn of the century was insanely expensive and not remotely cost effective. You'd pay huge amounts for the latest tech and then it would be surpassed by something newer long before any kind of reasonable end of life. A $4K HD TV wasn't even all that expensive back then. You could spend more than twice that on a plasma display that didn't even do 720p and which suffered horrendous burn-in if you ever left anything on the screen for a hile - including just station logos and the like. I paid about $3500 for a rear projection 720p DLP TV back when that was the best bang for your buck in HD TVs but it was basically worthless within a couple of years.
@@SterkeYerke5555 The HD CRT still looks better than any LCD, though Plasma could give it a run depending on the content and plasma TV. Still worth more than both, most people don't remember they have a plasma TV and sell it for LCD prices.
@@uponeric36 I guess it's a matter of taste, but I don't think many people would be inclined to agree with you. Viewing angles and motion clarity will surely be better than any lcd, as well as resolution scaling, but when it comes to brightness, colour volume or sharpness, I don't think any crt can match modern lcd's. Input lag is bound to be better on a modern lcd than an HD crt as well (though obv not as good as earlier crt's), and decent minileds will beat the crt for contrast too (as well as most non-Kuro plasmas). Even a modern non-miniled VA panel might beat it for native contrast, though it'll be even more compromised on viewing angles. As much as I dislike lcd's in general, 40+ years of development is starting to pay off.
must’ve been insanely weird seeing it for the first time 😅 I’m pretty young and going from a 60Hz monitor to a 120Hz monitor felt absolutely amazing… though probably nothing like seeing a colored TV for the first time 😮
It was NOT just the lack of RAM that held pixel counts low in the early days of PCs. Being able to move those pixels around the screen requires CPU cycles, so the more pixels you have to move, the slower your game or application will be able to draw. And listening to you refer to the IBM PC as a professional machine versus the Amiga was hilarious. First off, the PC was able to use an RF interface just like a Vic-20. Second, NO ONE bought a new Amiga 1000 to play games or to use it with a TV. It cost the equivalent of $3500! That's a steep price for a game machine!
Between the 8 Bit Guy and Nostalgia Nerd, I learned more about CRT technology in the last week than I learned in the first 50 years of my life. Outstanding!
This is the best explanation I've come across as to why those specific numbers were chosen. Here I was thinking that it was the closest approximation of 1 megapixel (720) and 2 megapixels (1080) at 16:9.
I've never understood the concept of the megapixel. It's an absolutely useless and unintuitive way to measure resolution, which has thankfully largely died out by now.
@@__christopher__ Which would have been fine if it hadn't been for the fact that it _replaced_ the useful information in the marketing. I wouldn't have minded so much if they had still given the resolution in a way that makes sense (x*y pixels), but they didn't.
Even IBM computers took a while to use square pixels, 320x200 was a common CGA/EGA/VGA resolution and wasn't letterboxed, so pixels were "tall" compared to 320x240.
@@mal2ksc Well... for VGA it was to overlap with MCGA 256 color modes. If a game wanted to show 256 colors, it could be programmed for VGA 320x240 (supported by all VGA cards) or 320x200 (VGA and MCGA), and yes, MCGA had less VRAM so LucasArts games have choppy scrolling even on VGA at 320x200 and games like the Lion King ran at 60fps at 320x240.
talk to a broadcast engineer or video mastering expert and most of them will tell you that the active width of a 13.5 mhz 4:3 image is probably closer to either 702, 704, 709, or 716 pixels rather than the full 720, and that those additional columns are essentially just padding and overscan to account for any horizontal offsets that might happen as a result of analog/digital conversion errors. backing this up is the fact that the dvd spec allows widths of both 352 and 704 (and not 360), and that the earlier and best-supported versions of the ATSC digital broadcast standards don't even bother to support a 720-wide mode, only 704
@@gamecubeplayer debatable depending on who you talk to, but kinda yeah. that said, the TOTAL length of a line, including active, blanking, and sync area comes out to 858 for ntsc or 864 for pal (when sampled at 13.5 mhz).
6:18 Looks like C from afar, but closer-up, it looks like JavaScript. Immediate edit: The video was _so good,_ I thought all 20 of these minutes were *less than 5.* I *_was_* watching at 1x speed! Thank you, Mr. Nostalgia Nerd!
I'm a bit shocked. I've never seen this subject fully understood in a YT video before. They always get something way wrong, or gloss over important details. I expected to be typing pedantic corrections here today, but no. Instead, I learned a few more details myself. You have a new subscriber.
This was a great insight and the connection to the early 80’s computers is fascinating, especially given I’ve very recently been watching your histories of Sinclair, Acorn, Dragon and Commodore and they’re amazing! I love this place!
I watched a similar show, showing the advances in TV Japan. They showcased portable handheld TV's that were so far above what we could get in the US too.
Here's an interesting idea for a video. Why not look at the evolution of the picture standards in relation too colour & contrast. from rec.601, rec.709 to rec.2020 HDR. for instance why was the original colour gamut, gamma curves & 100nits of brightness decided upon? what was the point of the minor upgrades of rec.709? Why did it take so long for HDR to become a thing when LCD's have been exceeding aspects of the SDR specs since the 90's. And why did they decide on the rec.2020 colour gamut if it still can't handle all visible colours, officially it's so they don't require conversion chips to handle imaginary colours but seeing as all digital video requires computer chips anyway that explanation always baffled me.
Minor correction: BT.2020 is UHDTV, which is an SDR standard. BT.2100 (HDR-TV) uses the same primaries though. As for how they decided on those primaries: Look up the paper "UHDTV Image Format for Better Visual Experience"
@@Dogelition Can't access that paper without being an institutional member or purchasing for far too much money. isn't available through any other sources either that I can find.
I remember I had a 1600x1200 screen, and my one absolute firm rule was "When I change screen, it needs to be greater or equal to the HEIGHT I already have.", which pushed me to 1920x1200 in a 16:10 screen. I actually had 3 screens of this resolution, before the market saw them vanish with 1920x1080 being the only option, at which point I pushed up to 2560x1440. Unfortunately, I wanted to keep my 24" size, but I had to choose between a 1920x1080 24" screen, or a 2560x1440 27" screen, so I went bigger. Also interestingly, I remember a few years earlier when I got my 3rd 1920x1200 screen, that I almost bought a 2560x1600 screen, still 16:10.
I remember getting a Dell 3008 in about 2017 (which was also 2560x1600) and keeping the firm rule that I wouldn't upgrade until 4K oled monitors were available. Didn't make it. The 3008 died in the summer of 2022, at which point only the very first oled monitors had hit the market. They would've been a downgrade height-wise, being a 34" 3440x1440 screen, so I had to "settle" on a 4K 144Hz IPS screen instead. Can't say I'm unhappy, but I'm still a bit bummed I didn't make it far enough with the 3008.
@@bobingabout They are in Europe at least. The Acer Vero B247W, Samsung F24T, Dell P2425 and the Iiyama Prolite XUB series are all fairly modern and somewhat widely available 1920x1200 monitors that don't break the bank. The Dell even goes up to 100 Hz, which nowadays I'd say isn't only preferable for gaming, but even feels better when just navigating Windows in general (or any other OS ;).
In my personal computing history, for the purposes of mostly using a computer for software development, I went from 320x256->768x288->1600x1200->2x1600x1200->1920x1080->3840x2160 Started in 8bit Acorns, then 32bit Acorns, then onto PCs with dual monitors before falling back to FHD once I had to negotiate space with my wife and two large high resolution CRTs took up too much of it!
The video doesn’t skip this “era”. It mentions Rec. 601, and the DVD is based on Rec. 601. Please also note that the “NTSC” DVD doesn’t contain an actual NTSC signal, so this is a confusing misnomer. Rec. 601 is a digital video standard from the early 1980s. When the DVD was introduced at the end of the 1990s, everybody was already talking about HDTV. So the DVD was pathetic from the beginning.
NTSC was an analogue broadcast standard and a standard for driving analogue CRTs. DVD is a standard for optical discs. They have nothing to do with each other. Especially as the video signal was often displayed on digital displays.
I love 1080p. It's the best of most if not all worlds. It's very clear, sharp, not too far that you can't see the pixels and for what I do when converting shows and movies to my Xbox 360. I need to see the individual pixels even in a 100% window size so I can tell between what is the original and what happened to it. Of course I'm mainly referring to Virtual Dub but overall 1080p is a perfect and controllable resolution. I just wish UA-cam wouldn't treat is so harshly. Have you ever noticed that videos that get the VP9 treatment ruins the 1080p option whilst 1440p removes 90% of the artifacts? What's up with that? AV1 had better fix this.
I had no idea 1080i went back to before I was born. It felt new and fancy during my teens in the early 2000. I can remember being stunned by my little white MacBook being capable of 1080p 444 content playback in ~2007.
similar age to me, i remember working at an electronics store in 2001 at 17 years old and seeing one of the early HD plasma screens hanging on a wall. mind blown for sure.
@@leexgx interlaced is kinda weird. Gran Turismo 4 on the PS2 could do 1080i and I usually went with 480p because it didn't make the road jitter while turning
At 02:05: "Reducing the scanning frequency to 29.970 frames per second, with the remaining bandwith used to carry the color signal" - uhm... no, not at all. I don't know where to start correcting this, as it's so wrong and such a big misconception. In short: the frame rate was reduced to 29.97 Hz in order to reduce interference between the newly introduced color carrier and the existing sound carrier frequency by shifting everything in a manner so that the color carrier and the audio carrier are different by a non-integer multiple of the line frequency. This reduction of the line frequency and this overall framerate (or, rather, field rate) has nothing, really *nothing* to do to accomodate the "higher bandwith" of a color television signal. In fact, the overall bandwith of the NTSC color signal is not at all higher than that of a black-and-white signal. The spectrum of the color sub-carrier lies well (and completely) inside the spectrum of the luminance signal. This is actually why we have things like cross-color artifacts and dotcrawl, and why S-Video (separate Y/C signals) was even invented. Sorry for the elaborate correction, but I just couldn't let this stand as such. Otherwise, as always, great video! Keep it up!
@jublywubly Actually 1080, or reported at 1080. In America, everything will say it's 1080, even if it's actually 1088. The last 8 lines just get cut off the bottom and not displayed.
I really wished we had settled on a 2:1 aspect ratio. That would have been a much better fit for films shot in 'scope. Narrower content (1.85) could have been pillar boxed similarly to how we pillar box old NTSC/standard def. and 4:3 film content today.
I have a Mitsubishi XL5U projector that can project at 720p (native resolution of 1024x768) that I’ve had since I was 8. 14 years later, I now have an Acer projector that’s a native of 1080p. 720p to me as a kid was imax quality comparing the 2 now 🤣🤣🤣
projectors were and still are a strange thing as far as resolutions are concerned. you can typically feed them with a much higher signal than they will output, but they wont complain about it and run just fine..
Then there's the CRT that really doesn't care about resolution. As long as the signal is right it'll try to output it. Make windows look very tiny with a huge resolution. Don't think anybody even cared about native resolution until LCD screens came along.
@@davidmcgill1000 However some CRT monitors could get physically destroyed when feeding them unsupported video modes (basically the problem was exceeding the maximum supported horizontal or vertical frequency).
The original 1936 405-line broadcasts from the BBC (until 1985) were declared the first "high definition" regular broadcasts in the world - far from what we class as HD now but a leap ahead to previous television technology beforehand.
CGA, EGA, and Hercules would like a word with you about computer monitors not being interlaced... 🤣 [Edited: My bad, I got mixed up between interlaced _memory_ and interlaced _video._ ]
Those were not interlaced at least not the MDA. Interlacing is rare on computers as it makes horizontal lines jump up and down. I did have a SVGA that could either do 800x600x56 or 1024x768x96i. The latter was awful on normal Widows content but very nice on images. They did use interleaving at the memory level but this did not show to the monitor.
@@okaro6595 Oh, there very much WERE real interlaced modes, displayed as such. A special edition of the S3 Virge 3D video card came with LCD shutter glasses, which got their switching signal directly from the bottom two lines of the picture, as sampled by a dongle on the VGA output - the final two lines were supposed to be 1/4 white and 3/4 white respectively, each identifying which half-picture is on the screen at the time, blacking the LCD for the other eye...
@@AttilaAsztalos my friends dad had those shutter glasses, they did 3d content as well - I think he had Fallout 3 or New Vegas and a couple racing games (Need For Speed etc), sht *blew my mind* bitd 😂
Wow that was fun, Story, I was working at the BBC studios in Milton Keynes UK in the late early 90's. In comes this huge Sony windscreen monitor a Sony 3830. The props guys where not happy as the resolution on this set was incredible showing the smallest defect in the scenery which was held together with gaff tape and foam.
It probably was cheaper to manufacture than 1920x1080 (until economies of scale changed that). It had no loss of vertical resolution over the 1024x768 of earlier computer monitors. And existing 1024x768 computer content mapped perfectly to the middle of it, with no need for scaling. _EDIT: QuestionBlockGaming said it better in their own comment._
As an additional step, that super-odd resolution of 1366x768 that was common on so many laptop panels for so long was actually a stopgap resolution, that was meant to be compatible with programs that required 1024x768 while also delivering a widescreen resolution for watching 1280x720 media and navigating websites meant for widescreen displays. I had more than a few programs (meant for work!) that would outright crash if the resolution wasn't at least 1024x768, and the 1366x768 resolution was a great alternative!
@@colinstu it being on a laptop of the era might've looked bad but it was still better than having a straight 1280x720p panel. But yeah all those 1366x768 panels on televisions were GNARLY
@@colinstu I think it's to do with divisibility by powers of two. 1366 isn't divisible by 8, but 1360 and 1368 are (170*8 and 171*8, respectively), and when it comes to lists of funny resolutions, you usually see one or both of them.
@@Roxor128 Multiples of 16, actually. That's a common cell size of LCDs, so if you needed any resolution that's not a multiple of 16x16 you'd to have special partial cells on one (or worse, two) sides. You get the same issue with 1080 displays, where there's a half cell on the top or bottom. On some displays, you can even see those unused 8 pixels when you compare how the top and bottom edge looks. But there, manufacturers have no choice---leaving out the last row (i.e. delivering 1920x1072) is no option. But leaving out a column that'd be 10/16 unused from a no-real-standard-resolution display...easy.
@@НААТMaybe that's how it's viewed, but I'd still argue 1080p is the sweet spot. Newer, higher-resolution stuff still looks good on it, and at the same time, it doesn't make older, lower-resolution stuff look quite as bad as 4k does. That's what _I'd_ call a gold standard.
@@НААТlol, no 4k isn't lol. The majority of content, video and gaming is still 1080p. If you only look at the numbers from Steam for example, that very clearly shows that most people still game at 1080p. The same goes for UA-cam videos and things like Netflix etc. In fact, for most video stream services, you need a more expensive membership to enjoy 4k. Yes, I am aware that most televisions are being sold as 4k, doesn't mean what people are watching is 4k. If you know a little bit about the resolution of our eyes, you'll quickly see that the improvement goes down very quickly. Except for things like video post editing, there are no practical benefits anymore above 4k. I think it's even banned in some countries if I am not mistaken? It just eats power.
18:39 Uuh...WHAT 720p screens? :P Native 720p displays were exceedingly rare. 99.9% of "HD Ready" TVs and monitors were actually 1366x768p. Which actually prevented us from seeing all the 720p games on X360 & PS3 in their true, 1:1 sharpness.
I remember writing programs for the PC in my youth and having to choose between 320x200 with 256 colors, or 640x480 with only 16 colors. Back then, it was a point of great frustration, but now I look back on those limitations with a feeling of nostalgia. Feels like we're just wasting pixels these days, with 4K/8K resolutions, and the only relevant trade-off is the framerate/bandwidth.
Yeah, I remember creating pixel art on my computer with 16 colors, but only 4 could be used. And I don't remember the resolution I had back then, but it was definitely low.
I think it's interesting that the terminology used to talk about resolutions has always been the number of horizontal lines, like 480p, 720p, 1080p, even 1440p, until 4k came into the world and started using the number of vertical lines (3840x2160 is 4k but we don't call it 2160p). interesting to think that that terminology was a holdover from counting scanlines.
Back in 2008, a buddy of mine that worked in TV post-production was telling me about film transfers being done in 4K, which was twice the horizontal and twice the vertical resolution on 1080i and being progressive scan (he may have mentioned the fps, but I don't recall). I asked him why it was referring to the horizontal rather than the vertical resolution and he said because film aspect ratios vary so it's better to think in terms of the commonality - the horizontal. I also asked why it isn't 4096, to be true 4K and he said to keep the resolution an integer multiple of 720p and 1080i for easy down converting.
It's not interesting. 2160p is double of 1080p, of course that's a big no-no for marketing departments. We must call it '4K' so people know it's 4 times more better than their crammy obsolete 1080p monitors. Buy now!
@@rager1969 Those 2k and 4k used in movie production are actually 2048 and 4096 pixels wide. The TV/PC world stole those names and used them for 1x and 2x 1080p, respectively. Movies go by horizontal resolution because that's a fixed size---the width of the film strip. The height of the picture depends on how many perforations high the camera exposed and on ow much of that the director blocked off. So a 2-1 aspect ratio movie would be 4096x2048, while a 4-3 movie would be 4096x3072. In video world, those two would be 3840x2160 with 240 pixels of black bars at the top and bottom and 3840x2160 with 960 pixels of black bars at the sides
I remember watching a 20/20 report on HDTV circa 1990 and the whole thing was them mostly saying "you can't see the difference on your tv at home, but trust us it looks much better"
Funny that they changed counting the vertical (720p, 1080p) to counting horizontal (4k). Counting by the latter scheme, 1080 should be called 2k instead.
They didn't really change; they just adopted the movie resolution names and used them for the closest video resolution. A "real" 4k frame is 4 thousand pixels wide, not those 3840 computers and TVs use. Move resolutions have always been about the width. The width of a film strip is constant, but how you subdivide the infinite length of the strip is up to you. Although it makes very much sense to use multiples of the perforation and mask off areas at the top and bottom you don't need. Quite the opposite of the fixed number of lines and fuzzy horizontal resolution of TV/video signals. So when digitizing film, the only thing standardised is the horizontal resolution. The number of lines is how many you get after cutting off the black areas and can be different from movie to movie. A notable exception is IMAX, where they run a 70mm-wide film strip horizontally, so it becomes 70mm high and the width becomes the variable.
@@HenryLoenwind With analog video the main parameter has always been the number of scanlines. _Movie_ film was measured by width but that's totally different. And it's not just IMAX using horizontal film, standard still cameras did, too.
In France, the movie director Abel Gance invented super wide format called Polyvision in 1927. It was basically 3 pictures side by side of a 4:3 aspect ratio each. But it needed 3 movie projectors. This was not easy to use everywhere. Seeing this, the astronomer Pr. Henri Chrétien tought about a way to do wide screen with a single projector, compressing the image by anamorphosis. He called his system Hypergonar. Those breakthoughs were ingored until 30 years later people in the USA started to watch more and more television and don't go to movie theatre anymore. Movie studios wanted more exciting technologies to bring back audience in theatres. Cinerama is a rip off Gance's system, also using 3 projectors; and the Fox studio bought Hypergonar from Chrétien and rebranded it under the name Cinemascope. The Cinemascope was a sucess. This is why you have wide screen now. And who invented 4:3 in first place? It was William Dickson who wanted a 1,5:1 aspect ratio, but due to technical limitations of technology and dimentions of the 35 mm film gauge decided to make the best out of this and went for 4:3 instead. All the formats are more or less directly related to this.
When early HD systems were sold there was commercial terms, HD ready for 720 and Full HD for 1080. They are not used anymore. In broadcast field we use "HD 1280×720" or "HD 1440×1080" or "HD 1920×1080".
@@eDoc2020 And even movies and broadcast TV will paraphrase their closed captions sometimes, to accommodate slow readers in scenes where people are talking a lot.
High Definition consist of various resolutions. You have to remember when the standard was created, high resolution fixed pixel displays weren't in use. CRT ruled the day, and could display numerous input signals at their native resolution. So the focus was on the source resolution, not the display resolution. When it came to displays, it got even more complicated. A display for sale in the US could legally be listed as High Definition with a native resolution as low a 1024x768 if the pixels were rectangular in shape. For source material it started at 1280x720 progressive. Young people seem to have forgotten ATSC HD broadcasts were limited to 1080i for nearly 20 years. Up until recently, no 1080p OTA broadcasts existed. compression codecs and bitrate play just as big a role in the quality of a high definition image as the output resolution. Thats why UA-cam videos at 1080p often look soft compared with a 1080p BD disc.
16:9 with 1920:1080 resolution was never adopted by "PC". It was forced to use as cheaper alternative. Manufacturers of PC monitors just used cheaper/the same panels of "tv" screens.
19:26 That chart is way off. Instead of Desktop, it should perhaps say Desktop / Laptop as cheap laptops are absolutely affecting those results (e.g 1366 x 768 - not many on a Desktop PC uses a monitor of that resolution). And I find third place '1536x864' as bizarre. In 2023? Really?
Did anything ever even use 1536x864? As far as I know the 4:3 version 1152x864 exists only because on budget CRT monitors you could get 75Hz at that resolution, being between 1024x768@85Hz and 1280x960@60Hz
1536×864 is due to incorrectly measuring resolutions. 1920×1080 with 120dpi (125%) is a common configuration, but browsers divide the width and height by 1.25 and report it as 1536×864 so that scalable content adapts to that size.
I love this channel feels very much like it came from the eras its so often describing. Its like the favors you taste in the wine come from the soil it was grown in.
When 625 line PAL was originally marketed they also sold it as "High Definition" which to fair it was compared to the old 405 line system... which was also marketed as "High Definition"
I suspect the answer is gematria and the moon. Screens are 1080 and 2160p because those are the average radius and diameter of the moon at its equator, respectively. Hence, 1080 and 2160p. 2160 is the sum of a cube's angles, 90x4=360 per face x 6 faces = 2160. Drop the zero, you have 216, the 6th cube number. 6x6x6=216. The most prominent aspect ratio is 16:9. This can be explained with (pythagorean / digital root) gematria. The word "you" = 25 + 15 + 21, as y is the 25th letter, o the 15th, and u the 21st. The sum being 41, the 13th prime, where 13 is the 6th prime and also 6+1+6, a "number of the beast" and also part of Earth's orbital velocity. Which is 66,616 miles/hour. 6 is the first perfect number, the 3rd triangular number, and so on. In pythagorean gematria (compared to "ordinal") a digital root is taken before adding the numbers together. So Y=25, 2+5=7, Y=7. O=15=1+5=6, U=21=2+1=3. Simple digit sum. So You=7+6+3=16. There's the 16 in 16:9. i is the 9th letter. There's the 9. So the screen aspect ratio 16:9 is the ratio of you:i. "Money" is likewise, the word is an encoded reference to the eye of Horus, ie the moon. Mo[o]ney[e]. You also have mon-E, or one e. One energy, one 5, where 5 is the senses, sensory reality, and witht he 5 pointed star it's the top point, the hidden "aetheric" mover, the spirit. The other 4 are the elements, or the visible, the seen. Fire, water, air, earth. The pentagram also encodes the music scale in its unfolded ratios, it has infinite recursion, and so on. Screen sizes and resolutions are references to the moon. Which is chased away by the sun, it reflects the sun, it eclipses the sun periodically. Where the masculine and feminine, the sun, and moon, the beast divided, become one, revealing the corona, ie the white ring. There was an important eclipse over much of Europe in November (11) of 1331. Fold in your ring finger, you have 13. Flip your hand around, 31. 1331. For the older 4:3 screens it's the same. 16 is the 4th square. 9 is the 3rd square. 4x4 and 3x3. ie 16:9. The magic square of the sun is 111 and it's a 6x6 grid. 1080, drop the zeroes, 18. A lucky number. Also 3x6, or 6x3. 6+6+6. Which is the number that connects the sun, the moon, and the Earth. It's the number of "The World". That's the beast. Also in Hebrew gematria the word for "a man" = 216, 6x6x6. It's the number of a man. "six hundred threescore and six" gives 313 and 133, which is a whole 'nother rabbit hole. 1:45 441 lines. 441 is the 21st square. 21x21. 21 is the 6th triangular number. 66 again. Or U is the 21st letter, So U times U. You x you, you alone, or alone with others, in front of the TV. Not that I was ever all that social. 2+1=3. So 33 also. 625 x 576 for PAL. 625 is the 25th square. 25 is the 5th square. 5^4. 576 is the 24th square. 2+4=6. 66. X is also the 24th letter. So XX. 625 x 576 = 360,000. 600th square, 600x600. Or drop the zeroes, 36. The 6th square. 6x6 either way.
Actually, I had long wondered why they chose 720x576 instead of 768x576 for (PAL) DV video (the latter would have resulted in square pixels for 4:3 aspect ratio). Now it makes sense. Thank you.
PAL was a color standard and has nothing to do with the number of lines. The 625 line system existed before anyone even thought of PAL. In the UK the black and white standard was 405 lines though they adopted the 625 line system in anticipation of the color in the 60s.
2:00 The scan rate of NTSC colour was reduced slightly to make it circuits easier to build with the hardware of the time; it did not change the bandwidth. The bandwidth needed for colour information was taken out of the bandwidth available for horizontal resolution, leaving colour systems with a maximum resolution of about 160 lines across. (Sort of. The interactions between colour information and luminance information are rather more complex than that, so the resolution actually depends on what colours are being displayed, but if you wanted to display alternating black and white vertical lines on a colour display, you can't do more than about 160 before the TV decides that it should be display colour information.) 7:00 Higher vertical resolutions were introduced long before VGA; even IBM had introduced 350-line EGA by 1984. But it was the Japanese who kicked this off with microcomputers (probably due to wanting to be able to display kanji reasonably well) few years earlier (around 1981) with 400 line displays in systems such as the NEC PC-8801 and Fujitsu FM-77. (And of course the Japanese were heavily involved in later HD standards, since they both used almost exactly the same NTSC system as North America and had a very strong industry building TVs and exporting them worldwide.) BTW, in North America we usually pronounce "SMPTE" not letter-by-letter, but as "simptee."
@@Khloya69 I am certainly not in that country too. I knew the U.S. educational system was bad, but I had no idea it was so bad that you guys don't even know that there are other countries than the U.S. in North America. And when I use "colour," it's not a "britishism." It's our standard spelling in Canada. (I wouldn't expect you to know about that, though, if you don't even know that Canada is in North America. Now you've got _two_ things to TIL about!)
@@Curt_Sampson I knew Canada is in North America, but i thought it statistically unlikely someone would be commenting from a country with a smaller population than California. Also, I do not respect British English.
Yes. 720p (1280x720) is part of the HD standard. Many people confuse this with the 720x480 resolution of DVDs. It's an unfortunate coincidence that 720 features in both specs.
It is HD but came later. 720p was born out of NBC & Zenith's 787.5/60p experiments which began in the early 1980's. NBC/Zenith claimed that a progressive scan image would be better & simpler than an interlaced (then 1035i) image and experimentally obtained that 787.5 would be the best mix of static resolution, motion resolution, and bandwidth; however, there was little hardware available unlike what the Japanese had. Regardless, NBC ended up using the 787.5 system to fight against 1035i being accepted as the sole American standard in the early 80's. This standard was later revived when digital HD transmission was considered in the USA & modified to 750/60p (720p) since it could use the same pixel clock (74.25MHz) as 1080i.
@@ReelyInteresting1080i is actually slightly higher resolution than 720p but if you use mpeg interframe compression then it doesn't really matter because you can use the same bitrate
Something I don't know about a boring standard preferential? Going right back to the 40s? Brilliant! Sign me up! Don't know if you noticed this yourself, but the rest of the recommendations on my homepage is a goddamn global dumpster fire on the brink of collapse. Pity it only goes for 20 min. I may even watch it twice. So, please sir. Continue. I'm more than invested .
René Barthélémy invented HD in 1941, which later became analog HD standard in France in 1948 known as « 819 lignes ». It was black and white, used 12 MHz bandwidth and Acamedy aspect ratio. It was adpoted by neighbourgh French speaking countries. The last one to use this system was Monaco who discontinued this system in 1985.
To be fair HD is 720P, FHD is 1080p, 4k is UHD and then we kinda gave up as 8k and 16k have no names that I've heard.
720p is HD ready
NHK calls their 8K broadcasts “Super Hi-Vision”, which those also have 22.2 surround sound.
They are:
720p / HD ready
1080p / Full HD / FHD / HD ready 1080p
4K / 4K UHD / UHD-1
8K / 8K UHD / UHD-2 / Super Hi-Vision
4K = Ultra High Def, 8K = OctopiDef, 16K = Hexadefcimal
Correct
What I find the most fascinating is how many extra features engineers were able to cram into the analog standards. Adding in color signals, closed captions, secondary audio programs and programming data into the same bandwidth while still maintaining compatibility with older television receivers.
Old school engineers were wizards. Kind of amazing to see what they pulled off with so little.
An analog signal can transmit a mind boggling exponentially larger amount of data than a digital one. The hard part is encoding, decoding, and dealing with decay/loss.
We just very recently figured out how to transmit data with another band of light in fiber optics.
Making color signals still work on B&W TVs was genius.
And "hiding" a 16:9 picture in the existing 4:3 signal using its full resolution, for wide screen TV's (PALplus).
@@faming1144 pal also had teletext
The Japanese were right about choosing a 2:1 ratio. Closer to full widescreen instead of the academy 16:9 compromise that was settled for. However, I really do like the fact that TV and computer displays are all finally using the same resolutions. Very convenient.
The first Plasma TV I ever got apparently had a physical resolution of 1024x768, but it used rectangular pixels and gave the illusion of being a 720p display until you hooked up a computer to the VGA input. I looked at the physical pixels and checked, but it turned out the resolution it was feeding to the computer was absolutely right, the pixels were just physically rectangular and stretched a 4:3 ratio out to 16:9. The reason this worked at the time was because DVDs usually achieved widescreen by being anamorphic... that is, they actually delivered a "squished" 4:3 picture that, when spread out to 16:9, looked correct. So the TV was designed to accommodate this without a lot of horizontal scaling, by just having the same aspect ratio as the DVD while having a wider screen through the odd-shaped pixels. I will say, out of all the HDTVs I ever watched, I thought that one looked the best with 4:3 content stretched to fill the screen, and I never knew why for years. But once I knew how it worked, it was obvious... because it really is a native 4:3 display, they just managed to make it wider than it is tall. The ultimate test... which worked, was ripping a Blu-Ray and downscaling it to 1024x768, using the anamorphic setting usually intended for DVDs to see if it looked better on this TV. It did... looked better than anything coming in on the HDMI port, or anything from component video. It was kind of weird, because I realized the majority of people using that TV would never see the best picture it could produce, because everything was either too high a resolution and had to be scaled down using a bad scaler, or too low a resolution and had to be scaled up using a bad scaler. But if you feed it an anamorphic widescreen DVD, or better yet a customized 1024x768 signal over the VGA port that reproduces the anamorphic effect? It's perfect. In fact, I now wonder if some of the places that sold those TVs did just that, used a VGA port and a custom signal to show the TV at it's best, using content like that I created for it by hand...
720p is a signal type, not a panel resolution. There does not need to be any direct relation between the signal and the panel resolution.
I had that same Panasonic.
We had a Thomson Plasma with I think was 800x600 widescreen.
When it broke in like 2008, I bought a 1080p Panasonic and after disabling Overscan, it was pixel perfect for my HTPC and had an insane good picture quality compared to the LCDs.
It still hangs in what's now my mothers room. No one ever wants to move this heavy beast.
Still have the first Plasma TV staying around. 40" Pioneer 4:3 with 640:480 from around 97.
Don't know what to do with it. Needed the wall space for other things.
@@okaro6595 YES THERE DOES!!!
@@okaro6595 Yes there does and very frustratingly its very hard to find true 1280x720p panels. There basically always either 1366x768 or 1024x768...
Really really annoying, so when you look at a native 1920x1080p media on a 1080p screen, most the reason it looks better next to a 720p tv is just because its native and your 720p screen panel is not. Very very few true 720p displays out there, only some HD CRT and TV will produce a decent quality image or else you need to use a little 4:3 vga crt screen in letterbox...
So annoying no half decent ips panels out there, they could just provide an option on a 1366x768 TV to crop the image in a little to a dot to dot scale, but good luck trying to find a TV with such a simple choice of picture option.
The only thing that kinda pisses me off, that 720p TVs are NEVER actually 1280x720. They are almost always actually 1366x768, simply because it was easier/cheaper for manufacturers to convert existing processes to make TVs, instead of making new ones.
Yeah, that sucked. The only two ways of getting real 720p back in the day were either finding one of the super-rare, native 720p monitors or using a 720p wall projector with DVI.
But there simply were no native 720p TVs. So we were robbed of the true, 1:1 picture clarity on X360 & PS3.
@@ThePreciseClimber I have a Retrotink 4k now, and you can integer scale 720p 3x to 4k, allowing you to see the true 1:1 picture quality of PS3/360 games.
It looks amazing, to see the actual pixel perfect graphics. I also like to use the LCD grid effect to create a faux 720p Display, and it looks even better although it darkens the image quite a bit.
I used one as a hand-me-down for a while as a monitor while my computer still had VGA and yeah those displays look noticeably better when you feed them their actual native resolution.
1366x768 isn't even real 16:9.
Multiply 768 by 16 and then divide by 9, and you get 1365.333
The same problem is with 480p on UA-cam.
@@kon_radar xga is all sorts of weird and always was lol. I think it wanted to be 5:4 and later 16:10 in it's heart
Popular Science magazines had articles about HD TV in the 80's and would say in just a few years. 25 years later!
So glad it only took 70 years to standardize displays, let's pray it doesn't take USB as long.
USB is a data transfer protocol, quite different from display tech.
USB has been the standard for decades, I have bo doubt it’ll keep title even with USB C gaining prominence
@@Stabbyhara There's different versions, though. And why the need for all those different plugs on the peripheral end, I don't know.
@@Foebane72EU just signed Type C as The Standart, required for all devices. Even Apple devices finally got sensible connections.
@@udenszirnis1644 USB only APPEARS sensible until you look closer. Sure, the physical connection is the same, but you have USB 2, USB 3.0, USB 3.1, USB 3.2 (actually 3.0), USB 3.2 (actually 3.1), USB 3.2 (this one is genuinely 3.2), whether or not it supports thunderbolt, different levels of power delivery, etc.
France had 819 line HD TV broadcasts from 1950 until the mid 80's.
Not only the France but also Monaco, Belgium , Luxembourg, Algeria ... the 819 line was a 737i @ 14Mhz in France @ 7Mhz in Belgium and Luxembourg ...
625 lines developped by CCCP was adopted with the color (PAL / SECAM) and 819 lines has been abandonned between 1972 and 1983 ...
@@zichittufrederic7489 Visited France in High School in mid 70s, was astonished at how much sharper their TVs were.
@@jamesheartney9546 are you sure it wasn't just the 625-line/SECAM signal but a true 819 line TV
@@zichittufrederic7489 737i? That's basically HD territory as far back as the 1950s 🤯
@@SterkeYerke5555 1948-1984 ( 1985 including Monaco ).
Beating 1000 lines was done in 1941 in lab by René Barthélémy. Let's remember him for this accomplishment.
This is a fantastic video! I'm an engineer who works at one of the companies that makes a lot of the broadcast equipment, and I was taught all of this during my internship. You did a great job of going through the history of video in an interesting and not drawn-out manner. I will be sharing this video with my coworkers and I'm sure many others. Keep up the good work!
Same reason as they now call everything that has nothing to do with AI, AI...
Marketing..
You're AI
OMG someone else noticed lol. I am always bitching about the fact that nothing out there is sentient yet thankfully.
@@nbrown5907AI =/= AGI. Much stuff is AI. Much is barely machine learning. Artificial general intelligence is wayyyyyyy off.
i saw sunglasses once with "4k vision" lenses, yeah.... theres alot BS around indeed
Ever heard of "AI-powered deodorant"? Yeah, its that bad nowadays. Marketing are dumb asmf, and those that fall for it are even worse.
2:16 - "would have their own stuff going on" - check the subtitles on that lol...
LOL
Hahahahahaha
GUESSING, some captioners add a swear to prevent YT from listing a vid as 'for kids, educational' and shutting off comments and Miniplayer.
@@Reepicheep-1 it might just be auto captions not hearing things right.
happens a bit in my videos.
@@TheAussieRepairGuy It was probably part of the original script before he changed it. The English UK subtitles are grammatically correct with punctuation and everything, which you never get with auto captions
The best reference for this topic from the U.S. perspective is the book "Defining Vision" by Joel Brinkley. It documents the entire agonizing political (and sometimes technical) story behind the seemingly endless battle for high definition television in the U.S. There were around ten proposed resolutions from various organizations at various times for various reasons and the computer industry had little input. The first proposals were analog or hybrid digital/analog systems until one company developed a fully digital system that, while wasn't still good enough, was clearly superior to all of the other proposed systems.
I must say that resolutions like 1920 x 1200 have been used for a long time among professional monitors, now you can still find laptops with high screens and a vertical resolution of 1200 pixels.
1920x1200 is far superior for actually doing work IMHO. 1920x1080 is way too short.
I have a “2k” gaming laptop and it’s actually pretty nice
I have a 1920 x 1200 secondary screen, it used to be my main screen and it was soooo nice, especially for games and productivity, u could have the task bar at the bottom and program bar at the top while the game didnt feel squished at all, also for youtube, it makes it able to not have the progress bars of videos as overlay on video, it just stays under the video screen which is amazing when taking screenshots of paused video
I'm also team 1200p, been so for over 20 years. 16:10 is the superior aspect ratio.
1200p rules, can vouch for it. 16:10 and similar are goated
I always found it weird / interesting that 1080p was chosen when 720p was the faux-standard for HD for a lot of the early standard's definition.
Don't forget 480pPSHD
@@everythingponyI never heard of this.
I've never seen a 1080p television signal. They are 480p, 720p, 1080i, 4K. 1080p seems to be only the PC/gaming standard.
@@bartek053031080p tvs are pretty Common here in Europe. Atleast the Netherlands. My tvs always have been 1080p previously
@bartek05303 digital broadcast is progressive
That Sony TV would've made my brain explode in 1988. I didn't even have a colour TV till like 1985!
what a dumbass
I remember when 'big' Tvs were anything much bigger than like 35", lord help your mover's backs if you went bigscreen, doubly so for that brief period of HD tube TVs.
@@baronvonslambertlmfao, my Dad is 78 and has hearing loss in the high range from shooting at the gun range (was a cop), and recently his battery backup for his pc needed the battery replaced, I go in his house and it's a constant squeal from this thing, he couldn't have cared less 😂.
@@baronvonslambertMost adults can't actually hear the high pitched squeal, so that's why your mother couldn't hear it. Protect your hearing!
@@baronvonslambert 1998? My family had a B&W TV in the kitchen until 2010. That's when we shuffled the TVs around and that B&W became a spare.
Honestly, pre-digital television broadcasting just feels like a sort of arcane magic to me, the more I know about it, the less sense it seems to make. It feels miraculous that any of it worked.
Don't even get me started on CRTs.
Really? To me it seem's very ... primitive for the lack of better word. You can easily understand everything what happens in an analogue broadcast.
Digital broadcast is the real magic. You most likely understand it only on a very high abstraction layer.
Ie analogue video encoding is something you can explain in a few minutes. Digital video encoding is an art where only handfull of people know how a single frame is encoded from start to finish. You go one layer up and puting I/P/B frames and ordering them is one of the hundreds of problems and there are ten more layers to the digital broadcasting.
CRTs were a commercialised particle accelerator, weren't they?
Peter, this is incredibly well made, detailed and informative. You should be very proud and this should be required watching for all interested in all things video.
The fact that they did not immediately realize the need for square pixels insane. PCs got square pixels already in 1987 with the VGA. Square pixels make things so much simpler.
I reckon working with a *generated* image instead of a *captured* image might have had something to do with it - it's hard enough figuring out where to put down the next black pixel when drawing a circle with square pixels - if they're not square, it's even worse. Of course, this was immediately subverted by things like the ZX Spectrum video memory structure, where the order was line 0, then 8 lines below, then another 8 lines below - for ONE THIRD of the screen; then continue with line 1, line 9, line 17 etc; then proceed to the second third and repeat. YOU try drawing a circle onto that...
Standard resolution on a CRT TV for basic programming doesn't require the same pin sharp focus of reading fine text on a monitor. Generally any image looks great on a CRT when reading fine text isn't a requirement
Yeah, if you have rectangular pixels, the software designers will often still treat them as if they are square, resulting in a stretched look. Most NES games did have that issue.
1987? Macs had square pixels in 1984. And they were 72dpi so 1 pt in software was 1pt onscreen.
Life isn’t made of squares, so it makes sense to me that it took a digital overthrow of analog signals to force the change.
Great video. Couple things I wanted to add. SMPTE is general accepted as “simp-tee” for short. Save yourself a mouthful next time. One other cool note is that if you multiply the width by height by framerate from NTSC land(720x480x30) you get the same number as doing the same equation in PAL resolutions. (720x576x25). Both yield 10,368,000. Not sure why this is. Seeing as how both standards have the same bandwidth I’m guessing that’s just the most they could get out of it.
It's definitely not a coincidence that the line numbers multiplied by the frame rate equals the same number. Or really the same but with the total number of scanlines (525 vs 625). The resulting number is the line rate. It's (nearly) the same because the 625 line standard is based on the 525 line one.
1:10 Interesting seeing an article from the 1930’s talking about fixing “bugs”. Never knew the term was used prior to computers.
Very interesting! A brief search of online dictionaries only mentions this sense being used in "computing" not other fields.
I found a blog I'm sure I can't link, which offers the below:
"The word has been used in engineering since the nineteenth century.
The word ‘bug’ actually is short for Bugbear. (sometimes found as Bugaboo). It’s meaning is much closer to ‘Gremlin’, where the people who worked on engineering prototypes often grew to suspect that the problems were due to malicious spooks. I sometimes even still hear it said that some software is cursed with malicious spirits. The ‘Bug’ or ‘Bogey’ part of the word is traceable back to the fifteenth century in the meaning of ‘Hobgoblin’, devil or ghost. In East Anglia particularly, the word Bugbear’, first recorded in the sixteenth century, is still used in referring to problems with machinery."
Channel named "Our own devices" _just_ made a video about telegraphy (morse code), where he explains the term!
"Bug" was a nickname for an inexperienced telegraph operator.
My Dad bought a Sony HD CRT in 1999 for around $4k (I'm not exactly sure how much). He was so proud of it but within 5 years flat-screens started coming out and it made our insanely expensive and insanely heavy HD TV already look outdated. He kept it until the late 2000s though.
and now those monsters are worth money to the right people... they're such amazing tvs and i wish i could have one. some people selling them around me but would need to reeeeeealy be ready to put it in one place and never move it again
That HD CRT probably looked better than most plasmas or lcd's until the late 2000s as well, so I guess he made the right choice in the end.
being any ind of early adopter when it came to HD and home theater tech around the turn of the century was insanely expensive and not remotely cost effective. You'd pay huge amounts for the latest tech and then it would be surpassed by something newer long before any kind of reasonable end of life. A $4K HD TV wasn't even all that expensive back then. You could spend more than twice that on a plasma display that didn't even do 720p and which suffered horrendous burn-in if you ever left anything on the screen for a hile - including just station logos and the like. I paid about $3500 for a rear projection 720p DLP TV back when that was the best bang for your buck in HD TVs but it was basically worthless within a couple of years.
@@SterkeYerke5555 The HD CRT still looks better than any LCD, though Plasma could give it a run depending on the content and plasma TV. Still worth more than both, most people don't remember they have a plasma TV and sell it for LCD prices.
@@uponeric36 I guess it's a matter of taste, but I don't think many people would be inclined to agree with you. Viewing angles and motion clarity will surely be better than any lcd, as well as resolution scaling, but when it comes to brightness, colour volume or sharpness, I don't think any crt can match modern lcd's. Input lag is bound to be better on a modern lcd than an HD crt as well (though obv not as good as earlier crt's), and decent minileds will beat the crt for contrast too (as well as most non-Kuro plasmas). Even a modern non-miniled VA panel might beat it for native contrast, though it'll be even more compromised on viewing angles. As much as I dislike lcd's in general, 40+ years of development is starting to pay off.
I have just realised that I have watched this whole video without getting bored, that is no mean feat. I do look forward to these.
as a person with ADHD, i too really enjoy a video that never loses my attention to something else.
I'm old, so going from a B&W to a colour TV in 1976 was earth shattering to a kid.
must’ve been insanely weird seeing it for the first time 😅
I’m pretty young and going from a 60Hz monitor to a 120Hz monitor felt absolutely amazing… though probably nothing like seeing a colored TV for the first time 😮
My dad still talks about seeing Star Trek in color at a (rich) friend's house for the first time and being absolutely blown away by it...
color*
Is it true that back then even real life was black and white ?
@KeroseneKerosine We started with 60 Hz. Atari, NES, Genesis/Megadrive, SNES, all at 60 Hz.
the transition from analogue to digital and consolidation of all the different analogue techniques was quite beautiful and well designed
It was NOT just the lack of RAM that held pixel counts low in the early days of PCs. Being able to move those pixels around the screen requires CPU cycles, so the more pixels you have to move, the slower your game or application will be able to draw.
And listening to you refer to the IBM PC as a professional machine versus the Amiga was hilarious. First off, the PC was able to use an RF interface just like a Vic-20. Second, NO ONE bought a new Amiga 1000 to play games or to use it with a TV. It cost the equivalent of $3500! That's a steep price for a game machine!
Between the 8 Bit Guy and Nostalgia Nerd, I learned more about CRT technology in the last week than I learned in the first 50 years of my life. Outstanding!
Let's make a 1280x720 resolution!!
...
Ok 1366x768 it is!
"hd ready"
lol
This is the best explanation I've come across as to why those specific numbers were chosen. Here I was thinking that it was the closest approximation of 1 megapixel (720) and 2 megapixels (1080) at 16:9.
I've never understood the concept of the megapixel. It's an absolutely useless and unintuitive way to measure resolution, which has thankfully largely died out by now.
@@Wishbone1977 It was a marketing tool for digital cameras. It wasn't designed to be useful, it was designed to sound impressive.
@@__christopher__ Which would have been fine if it hadn't been for the fact that it _replaced_ the useful information in the marketing. I wouldn't have minded so much if they had still given the resolution in a way that makes sense (x*y pixels), but they didn't.
@@Wishbone1977 yeah, it was to sell cameras.. that's pretty much it
love your videos for years, cheers from switzwrland
Even IBM computers took a while to use square pixels, 320x200 was a common CGA/EGA/VGA resolution and wasn't letterboxed, so pixels were "tall" compared to 320x240.
Fun fact: both of those can be integer resized to 1600*1200. Multiply by 5 horizontally and by 5 or 6 vertically.
I hate maths
@@unbearifiedbear1885
Those are some of the easiest calculations ever.
@@mal2ksc Well... for VGA it was to overlap with MCGA 256 color modes. If a game wanted to show 256 colors, it could be programmed for VGA 320x240 (supported by all VGA cards) or 320x200 (VGA and MCGA), and yes, MCGA had less VRAM so LucasArts games have choppy scrolling even on VGA at 320x200 and games like the Lion King ran at 60fps at 320x240.
talk to a broadcast engineer or video mastering expert and most of them will tell you that the active width of a 13.5 mhz 4:3 image is probably closer to either 702, 704, 709, or 716 pixels rather than the full 720, and that those additional columns are essentially just padding and overscan to account for any horizontal offsets that might happen as a result of analog/digital conversion errors. backing this up is the fact that the dvd spec allows widths of both 352 and 704 (and not 360), and that the earlier and best-supported versions of the ATSC digital broadcast standards don't even bother to support a 720-wide mode, only 704
isn't 720x480 just 704x480 with blanking interval black bars?
@@gamecubeplayer debatable depending on who you talk to, but kinda yeah. that said, the TOTAL length of a line, including active, blanking, and sync area comes out to 858 for ntsc or 864 for pal (when sampled at 13.5 mhz).
6:18 Looks like C from afar, but closer-up, it looks like JavaScript.
Immediate edit: The video was _so good,_ I thought all 20 of these minutes were *less than 5.*
I *_was_* watching at 1x speed!
Thank you, Mr. Nostalgia Nerd!
Red Dwarf Quotes! 05:50
Oh hell yeah :D
I'm a bit shocked. I've never seen this subject fully understood in a YT video before. They always get something way wrong, or gloss over important details. I expected to be typing pedantic corrections here today, but no. Instead, I learned a few more details myself. You have a new subscriber.
ITs always nice to see your content videos..true nostalgia...Greetings from Portugal
Portugal, caralho!
This was a great insight and the connection to the early 80’s computers is fascinating, especially given I’ve very recently been watching your histories of Sinclair, Acorn, Dragon and Commodore and they’re amazing! I love this place!
i remember watching a documentary in the 90s about how everyone would enjoy HD digital TV and it was fascinating. Now we are at 4K heading into 8K.
I watched a similar show, showing the advances in TV Japan.
They showcased portable handheld TV's that were so far above what we could get in the US too.
To be fair, that was at least 25 years ago
ARE WE?
The best screen in my place, is a 27inch 1440p display... Everyone else is fine with their over sized 1080p tv's. 4K isn't even considered.
@@justinpatterson5291 Hmm, pretty close to 24" 1920x1200 in MY book ;-)
Here's an interesting idea for a video. Why not look at the evolution of the picture standards in relation too colour & contrast.
from rec.601, rec.709 to rec.2020 HDR. for instance why was the original colour gamut, gamma curves & 100nits of brightness decided upon?
what was the point of the minor upgrades of rec.709? Why did it take so long for HDR to become a thing when LCD's have been exceeding aspects of the SDR specs since the 90's. And why did they decide on the rec.2020 colour gamut if it still can't handle all visible colours, officially it's so they don't require conversion chips to handle imaginary colours but seeing as all digital video requires computer chips anyway that explanation always baffled me.
Minor correction: BT.2020 is UHDTV, which is an SDR standard. BT.2100 (HDR-TV) uses the same primaries though.
As for how they decided on those primaries: Look up the paper "UHDTV Image Format for Better Visual Experience"
@@Dogelition Can't access that paper without being an institutional member or purchasing for far too much money.
isn't available through any other sources either that I can find.
8:26 is the sample used in Panda Style by LAOS, on Hospital Records. Didn’t expect that one to hit me in the face😮😅
I really appreciate the subtitles. Also, great video!
I remember I had a 1600x1200 screen, and my one absolute firm rule was "When I change screen, it needs to be greater or equal to the HEIGHT I already have.", which pushed me to 1920x1200 in a 16:10 screen. I actually had 3 screens of this resolution, before the market saw them vanish with 1920x1080 being the only option, at which point I pushed up to 2560x1440.
Unfortunately, I wanted to keep my 24" size, but I had to choose between a 1920x1080 24" screen, or a 2560x1440 27" screen, so I went bigger.
Also interestingly, I remember a few years earlier when I got my 3rd 1920x1200 screen, that I almost bought a 2560x1600 screen, still 16:10.
Yeah the early 2000s was touch for those of us with higher end monitors.. 1080p was a downgrade, and 1440p wasn't yet available
I remember getting a Dell 3008 in about 2017 (which was also 2560x1600) and keeping the firm rule that I wouldn't upgrade until 4K oled monitors were available. Didn't make it. The 3008 died in the summer of 2022, at which point only the very first oled monitors had hit the market. They would've been a downgrade height-wise, being a 34" 3440x1440 screen, so I had to "settle" on a 4K 144Hz IPS screen instead. Can't say I'm unhappy, but I'm still a bit bummed I didn't make it far enough with the 3008.
You can still buy 1920x1200 monitors. It's my favorite also.
@@dizzywow They're very rare, and I often find that they're only for sale in America, and aren't available in the UK.
@@bobingabout They are in Europe at least. The Acer Vero B247W, Samsung F24T, Dell P2425 and the Iiyama Prolite XUB series are all fairly modern and somewhat widely available 1920x1200 monitors that don't break the bank. The Dell even goes up to 100 Hz, which nowadays I'd say isn't only preferable for gaming, but even feels better when just navigating Windows in general (or any other OS ;).
In my personal computing history, for the purposes of mostly using a computer for software development, I went from 320x256->768x288->1600x1200->2x1600x1200->1920x1080->3840x2160
Started in 8bit Acorns, then 32bit Acorns, then onto PCs with dual monitors before falling back to FHD once I had to negotiate space with my wife and two large high resolution CRTs took up too much of it!
Fun fact: 4k (2160p) is a direct multiple of both 720 and 1080 resolutions.
In fact, it's the smallest direct multiple of both.
fun fact: 5120x2880 is the smallest direct multiple of both 480 & 576
CDs, Laserdisc, First colour TV, ect. I had no idea they did this compliance of standards so early. Great video!
*etc
NTSC standard DVD resolution was 720 x 480i. Just thought I mention it, since the video skipped this era.
The video doesn’t skip this “era”. It mentions Rec. 601, and the DVD is based on Rec. 601. Please also note that the “NTSC” DVD doesn’t contain an actual NTSC signal, so this is a confusing misnomer.
Rec. 601 is a digital video standard from the early 1980s. When the DVD was introduced at the end of the 1990s, everybody was already talking about HDTV. So the DVD was pathetic from the beginning.
NTSC was an analogue broadcast standard and a standard for driving analogue CRTs.
DVD is a standard for optical discs.
They have nothing to do with each other.
Especially as the video signal was often displayed on digital displays.
I love 1080p. It's the best of most if not all worlds. It's very clear, sharp, not too far that you can't see the pixels and for what I do when converting shows and movies to my Xbox 360. I need to see the individual pixels even in a 100% window size so I can tell between what is the original and what happened to it. Of course I'm mainly referring to Virtual Dub but overall 1080p is a perfect and controllable resolution. I just wish UA-cam wouldn't treat is so harshly. Have you ever noticed that videos that get the VP9 treatment ruins the 1080p option whilst 1440p removes 90% of the artifacts? What's up with that? AV1 had better fix this.
If only Conan knew this:
In the year 2000! In the year 2000!
TV aspect ratios and resolutions will become more standardized
Well, it's a long time until the year 2000 factorial. :-)
So glad you are still uploading, can't wait for the next video!
I had no idea 1080i went back to before I was born. It felt new and fancy during my teens in the early 2000.
I can remember being stunned by my little white MacBook being capable of 1080p 444 content playback in ~2007.
similar age to me, i remember working at an electronics store in 2001 at 17 years old and seeing one of the early HD plasma screens hanging on a wall. mind blown for sure.
1080i was anoying to watch (sky used it for a very long time) I just set everything to 720p until sky had newer boxes that could actually use 1080p
@@leexgx interlaced is kinda weird. Gran Turismo 4 on the PS2 could do 1080i and I usually went with 480p because it didn't make the road jitter while turning
@@smiththers2I miss plasma tvs. Sure, they doubled as space heaters but they had such nice picture quality. OLED is close but still too expensive
@@NLynchOEcake 1080i on GT4 was just 480p upscaled to 1080i for HD tv's that didn't support 480p. There wasn't much point in using it otherwise.
At 02:05: "Reducing the scanning frequency to 29.970 frames per second, with the remaining bandwith used to carry the color signal" - uhm... no, not at all.
I don't know where to start correcting this, as it's so wrong and such a big misconception. In short: the frame rate was reduced to 29.97 Hz in order to reduce interference between the newly introduced color carrier and the existing sound carrier frequency by shifting everything in a manner so that the color carrier and the audio carrier are different by a non-integer multiple of the line frequency.
This reduction of the line frequency and this overall framerate (or, rather, field rate) has nothing, really *nothing* to do to accomodate the "higher bandwith" of a color television signal. In fact, the overall bandwith of the NTSC color signal is not at all higher than that of a black-and-white signal. The spectrum of the color sub-carrier lies well (and completely) inside the spectrum of the luminance signal. This is actually why we have things like cross-color artifacts and dotcrawl, and why S-Video (separate Y/C signals) was even invented. Sorry for the elaborate correction, but I just couldn't let this stand as such.
Otherwise, as always, great video! Keep it up!
FYI - Most broadcast 1080p is actually encoded at 1920 x 1088 - because of maths or something...
If you enable the DLSS console in your game. Sometimes it reported 1920 X 1088 reder resolution for some reason as well
Mod16 encoding
It's because 1088 is divisible by 16 and most videos are enconded in 16x16 px blocks
Not in Australia. Our FHD broadcasts are 1920 x 1080.
@jublywubly Actually 1080, or reported at 1080. In America, everything will say it's 1080, even if it's actually 1088. The last 8 lines just get cut off the bottom and not displayed.
Wow, you've knocked it out of the park with the footage - you've always had the dialog down pat though.
still remember the time where 1080 was new. all we could focus on were the objects around the subject, not the subject itself 😂
i remember seeing blades of grass on a football game field. mind was def blown at that!
I remember being able to see the light in the beads of sweat on a WWE wrestler's body the first time I used a 1080p TV.
@@CyanRooper I remember thinking "I'm not sure I'm going to like the new era of movies where we can see the actor's pockmarks and stubble"
I really wished we had settled on a 2:1 aspect ratio. That would have been a much better fit for films shot in 'scope. Narrower content (1.85) could have been pillar boxed similarly to how we pillar box old NTSC/standard def. and 4:3 film content today.
Nothing is more "I'm pretending to be a Gamer" than at 20:00 where the game is literally playing before she is sitting down!😆😂
Thanks for featuring my HDM-3830 monitor in your video :-)
I have a Mitsubishi XL5U projector that can project at 720p (native resolution of 1024x768) that I’ve had since I was 8.
14 years later, I now have an Acer projector that’s a native of 1080p.
720p to me as a kid was imax quality comparing the 2 now 🤣🤣🤣
projectors were and still are a strange thing as far as resolutions are concerned. you can typically feed them with a much higher signal than they will output, but they wont complain about it and run just fine..
@@smiththers2 yet you try and feed a monitor with a higher res, then it will occasionally complain about the input not being supported
Then there's the CRT that really doesn't care about resolution. As long as the signal is right it'll try to output it. Make windows look very tiny with a huge resolution. Don't think anybody even cared about native resolution until LCD screens came along.
@@davidmcgill1000 However some CRT monitors could get physically destroyed when feeding them unsupported video modes (basically the problem was exceeding the maximum supported horizontal or vertical frequency).
The original 1936 405-line broadcasts from the BBC (until 1985) were declared the first "high definition" regular broadcasts in the world - far from what we class as HD now but a leap ahead to previous television technology beforehand.
CGA, EGA, and Hercules would like a word with you about computer monitors not being interlaced... 🤣
[Edited: My bad, I got mixed up between interlaced _memory_ and interlaced _video._ ]
Also about square pixels. ♥
Those were not interlaced at least not the MDA. Interlacing is rare on computers as it makes horizontal lines jump up and down. I did have a SVGA that could either do 800x600x56 or 1024x768x96i. The latter was awful on normal Widows content but very nice on images.
They did use interleaving at the memory level but this did not show to the monitor.
@@okaro6595 Oh, there very much WERE real interlaced modes, displayed as such. A special edition of the S3 Virge 3D video card came with LCD shutter glasses, which got their switching signal directly from the bottom two lines of the picture, as sampled by a dongle on the VGA output - the final two lines were supposed to be 1/4 white and 3/4 white respectively, each identifying which half-picture is on the screen at the time, blacking the LCD for the other eye...
@@okaro6595amigas like a word with u
@@AttilaAsztalos my friends dad had those shutter glasses, they did 3d content as well - I think he had Fallout 3 or New Vegas and a couple racing games (Need For Speed etc), sht *blew my mind* bitd 😂
Wow that was fun, Story, I was working at the BBC studios in Milton Keynes UK in the late early 90's. In comes this huge Sony windscreen monitor a Sony 3830. The props guys where not happy as the resolution on this set was incredible showing the smallest defect in the scenery which was held together with gaff tape and foam.
Hope the arcade is going well and the book is flying off the shelves. I assume that’s where you’ve been?
Full on history lessons is what I come here for, so thank you!
I still remember when 1080p became mainstream and it was simply just awesome.
The 5:4 aspect ratio looks pleasing to me
Why was 1366x768 / 1360x768 so common there during those 720p days? What a painful 5yrs or so.
It probably was cheaper to manufacture than 1920x1080 (until economies of scale changed that). It had no loss of vertical resolution over the 1024x768 of earlier computer monitors. And existing 1024x768 computer content mapped perfectly to the middle of it, with no need for scaling.
_EDIT: QuestionBlockGaming said it better in their own comment._
As an additional step, that super-odd resolution of 1366x768 that was common on so many laptop panels for so long was actually a stopgap resolution, that was meant to be compatible with programs that required 1024x768 while also delivering a widescreen resolution for watching 1280x720 media and navigating websites meant for widescreen displays. I had more than a few programs (meant for work!) that would outright crash if the resolution wasn't at least 1024x768, and the 1366x768 resolution was a great alternative!
too bad it was SO common in TVs too. Dreadful era. That and they'd stick that on 15" laptops too, those pixels were so huge and chunky and barftastic.
@@colinstu it being on a laptop of the era might've looked bad but it was still better than having a straight 1280x720p panel. But yeah all those 1366x768 panels on televisions were GNARLY
@@QuestionBlockGaming also any idea on 1360x768? that seemed to come up a lot too. Seriously why 6 less.
@@colinstu I think it's to do with divisibility by powers of two. 1366 isn't divisible by 8, but 1360 and 1368 are (170*8 and 171*8, respectively), and when it comes to lists of funny resolutions, you usually see one or both of them.
@@Roxor128 Multiples of 16, actually. That's a common cell size of LCDs, so if you needed any resolution that's not a multiple of 16x16 you'd to have special partial cells on one (or worse, two) sides. You get the same issue with 1080 displays, where there's a half cell on the top or bottom. On some displays, you can even see those unused 8 pixels when you compare how the top and bottom edge looks. But there, manufacturers have no choice---leaving out the last row (i.e. delivering 1920x1072) is no option. But leaving out a column that'd be 10/16 unused from a no-real-standard-resolution display...easy.
1080p is still sweet spot.
No it isn't lol. 4k is the golden standard now
@@НААТ on pc gaming 1080p is the most popular resolution.
@@とふこ for pc 1440p will be the norm pretty soon. Those monitors aren't that expensive anymore. I own a 3440x1440p one myself
@@НААТMaybe that's how it's viewed, but I'd still argue 1080p is the sweet spot. Newer, higher-resolution stuff still looks good on it, and at the same time, it doesn't make older, lower-resolution stuff look quite as bad as 4k does. That's what _I'd_ call a gold standard.
@@НААТlol, no 4k isn't lol.
The majority of content, video and gaming is still 1080p.
If you only look at the numbers from Steam for example, that very clearly shows that most people still game at 1080p.
The same goes for UA-cam videos and things like Netflix etc.
In fact, for most video stream services, you need a more expensive membership to enjoy 4k.
Yes, I am aware that most televisions are being sold as 4k, doesn't mean what people are watching is 4k.
If you know a little bit about the resolution of our eyes, you'll quickly see that the improvement goes down very quickly.
Except for things like video post editing, there are no practical benefits anymore above 4k. I think it's even banned in some countries if I am not mistaken? It just eats power.
18:39 Uuh...WHAT 720p screens? :P
Native 720p displays were exceedingly rare. 99.9% of "HD Ready" TVs and monitors were actually 1366x768p.
Which actually prevented us from seeing all the 720p games on X360 & PS3 in their true, 1:1 sharpness.
Don't forget the Panasonic plasmas with a 1024x768 16:9 display. Just to make sure absolutely nothing looks perfect!
Xbox 360 supported 1366x768 resolution though
I remember writing programs for the PC in my youth and having to choose between 320x200 with 256 colors, or 640x480 with only 16 colors. Back then, it was a point of great frustration, but now I look back on those limitations with a feeling of nostalgia. Feels like we're just wasting pixels these days, with 4K/8K resolutions, and the only relevant trade-off is the framerate/bandwidth.
Yeah, I remember creating pixel art on my computer with 16 colors, but only 4 could be used. And I don't remember the resolution I had back then, but it was definitely low.
Great video! I've always wondered how the resolutions were decided upon. Most enlightening!
1440p is love, 1440p is life.
1440p has almost no reason to exist because it's not an integer upscale of 1080i/p
But it looks gorgeous
@@gamecubeplayer It does for gaming and phones, so the majority of the ways we use displays..
Okay, I feel old when I watch this and remember all of it happening. Excellent ! Thanks for the memories!
I think it's interesting that the terminology used to talk about resolutions has always been the number of horizontal lines, like 480p, 720p, 1080p, even 1440p, until 4k came into the world and started using the number of vertical lines (3840x2160 is 4k but we don't call it 2160p). interesting to think that that terminology was a holdover from counting scanlines.
Back in 2008, a buddy of mine that worked in TV post-production was telling me about film transfers being done in 4K, which was twice the horizontal and twice the vertical resolution on 1080i and being progressive scan (he may have mentioned the fps, but I don't recall). I asked him why it was referring to the horizontal rather than the vertical resolution and he said because film aspect ratios vary so it's better to think in terms of the commonality - the horizontal. I also asked why it isn't 4096, to be true 4K and he said to keep the resolution an integer multiple of 720p and 1080i for easy down converting.
Sometimes it is called 2160p
Really the exact name depends on the context because 4k could mean a bunch of different resolutions
It's not interesting. 2160p is double of 1080p, of course that's a big no-no for marketing departments. We must call it '4K' so people know it's 4 times more better than their crammy obsolete 1080p monitors. Buy now!
@@rager1969 Those 2k and 4k used in movie production are actually 2048 and 4096 pixels wide. The TV/PC world stole those names and used them for 1x and 2x 1080p, respectively.
Movies go by horizontal resolution because that's a fixed size---the width of the film strip. The height of the picture depends on how many perforations high the camera exposed and on ow much of that the director blocked off. So a 2-1 aspect ratio movie would be 4096x2048, while a 4-3 movie would be 4096x3072. In video world, those two would be 3840x2160 with 240 pixels of black bars at the top and bottom and 3840x2160 with 960 pixels of black bars at the sides
I remember watching a 20/20 report on HDTV circa 1990 and the whole thing was them mostly saying "you can't see the difference on your tv at home, but trust us it looks much better"
**cleans their glasses even though they know the bloom on those light sources was a video effect that reached right into their brain**
🌀All these Circles ⚫equal a line 👽All these ⚫Circles equal a line 👽All these Circles ⚫equal a line 🐈⬛
i was blinking my eyes like mad because i had just woken up....
Your explainers are sooo goood!!! Many many thanks for the work you do! Thanks a ton! 🙏
Funny that they changed counting the vertical (720p, 1080p) to counting horizontal (4k). Counting by the latter scheme, 1080 should be called 2k instead.
The also went the disk-manufacturer route and started lying about the values, too. What they call "4k" is only 3.84k.
They didn't really change; they just adopted the movie resolution names and used them for the closest video resolution. A "real" 4k frame is 4 thousand pixels wide, not those 3840 computers and TVs use.
Move resolutions have always been about the width. The width of a film strip is constant, but how you subdivide the infinite length of the strip is up to you. Although it makes very much sense to use multiples of the perforation and mask off areas at the top and bottom you don't need. Quite the opposite of the fixed number of lines and fuzzy horizontal resolution of TV/video signals. So when digitizing film, the only thing standardised is the horizontal resolution. The number of lines is how many you get after cutting off the black areas and can be different from movie to movie.
A notable exception is IMAX, where they run a 70mm-wide film strip horizontally, so it becomes 70mm high and the width becomes the variable.
@@HenryLoenwind With analog video the main parameter has always been the number of scanlines. _Movie_ film was measured by width but that's totally different. And it's not just IMAX using horizontal film, standard still cameras did, too.
HD; UHD1 and UHD2 are used for television. 2K, 4K and 8K are used for cinema, and they have a 17:9 aspect ratio at maximum resolution.
In France, the movie director Abel Gance invented super wide format called Polyvision in 1927. It was basically 3 pictures side by side of a 4:3 aspect ratio each. But it needed 3 movie projectors. This was not easy to use everywhere. Seeing this, the astronomer Pr. Henri Chrétien tought about a way to do wide screen with a single projector, compressing the image by anamorphosis. He called his system Hypergonar.
Those breakthoughs were ingored until 30 years later people in the USA started to watch more and more television and don't go to movie theatre anymore. Movie studios wanted more exciting technologies to bring back audience in theatres. Cinerama is a rip off Gance's system, also using 3 projectors; and the Fox studio bought Hypergonar from Chrétien and rebranded it under the name Cinemascope. The Cinemascope was a sucess.
This is why you have wide screen now.
And who invented 4:3 in first place? It was William Dickson who wanted a 1,5:1 aspect ratio, but due to technical limitations of technology and dimentions of the 35 mm film gauge decided to make the best out of this and went for 4:3 instead. All the formats are more or less directly related to this.
HD is 720p and fullHD is 1080
720p is less than the normal 1280x1024 pixels we were used to on CRT.
Calling this "HD" always was a bit of a stretch.
When early HD systems were sold there was commercial terms, HD ready for 720 and Full HD for 1080. They are not used anymore.
In broadcast field we use "HD 1280×720" or "HD 1440×1080" or "HD 1920×1080".
I literally ask myself this all the time and am so happy to have had this video come up in my feed. Amazingly explained. Insta sub!
Why are the captions so different from what you're actually saying?
some are auto generated by youtube.. some are added by him i think, especially the FML one lol
I suspect the caption text comes directly from a script but then he's not reading the script word-for-word.
@@eDoc2020 And even movies and broadcast TV will paraphrase their closed captions sometimes, to accommodate slow readers in scenes where people are talking a lot.
High Definition consist of various resolutions. You have to remember when the standard was created, high resolution fixed pixel displays weren't in use. CRT ruled the day, and could display numerous input signals at their native resolution. So the focus was on the source resolution, not the display resolution. When it came to displays, it got even more complicated. A display for sale in the US could legally be listed as High Definition with a native resolution as low a 1024x768 if the pixels were rectangular in shape. For source material it started at 1280x720 progressive. Young people seem to have forgotten ATSC HD broadcasts were limited to 1080i for nearly 20 years. Up until recently, no 1080p OTA broadcasts existed. compression codecs and bitrate play just as big a role in the quality of a high definition image as the output resolution. Thats why UA-cam videos at 1080p often look soft compared with a 1080p BD disc.
HD isn’t 1080p, 1080p is HD.
16:9 with 1920:1080 resolution was never adopted by "PC". It was forced to use as cheaper alternative. Manufacturers of PC monitors just used cheaper/the same panels of "tv" screens.
most tvs seem to be bgr instead of what most of monitors use, rgb, are these from the same panel?
19:26 That chart is way off. Instead of Desktop, it should perhaps say Desktop / Laptop as cheap laptops are absolutely affecting those results (e.g 1366 x 768 - not many on a Desktop PC uses a monitor of that resolution). And I find third place '1536x864' as bizarre. In 2023? Really?
Did anything ever even use 1536x864? As far as I know the 4:3 version 1152x864 exists only because on budget CRT monitors you could get 75Hz at that resolution, being between 1024x768@85Hz and 1280x960@60Hz
@@Pasi123 I am not familiar with the resolution. iPhone maybe? Though the chart is labelled "Desktop screen resolutions", so I am unsure.
And now that I think about it, what even happened to 1440p? Surely there were more 1440p screens than fucking 1366x768 screens in 2023?
1536×864 is due to incorrectly measuring resolutions. 1920×1080 with 120dpi (125%) is a common configuration, but browsers divide the width and height by 1.25 and report it as 1536×864 so that scalable content adapts to that size.
i've used 1366 x 768 until few years ago, now it remains as my secondary monitor
I love this channel feels very much like it came from the eras its so often describing. Its like the favors you taste in the wine come from the soil it was grown in.
When 625 line PAL was originally marketed they also sold it as "High Definition" which to fair it was compared to the old 405 line system... which was also marketed as "High Definition"
Book ordered! Thanks for everything you do :D
I suspect the answer is gematria and the moon. Screens are 1080 and 2160p because those are the average radius and diameter of the moon at its equator, respectively. Hence, 1080 and 2160p. 2160 is the sum of a cube's angles, 90x4=360 per face x 6 faces = 2160. Drop the zero, you have 216, the 6th cube number. 6x6x6=216. The most prominent aspect ratio is 16:9. This can be explained with (pythagorean / digital root) gematria. The word "you" = 25 + 15 + 21, as y is the 25th letter, o the 15th, and u the 21st. The sum being 41, the 13th prime, where 13 is the 6th prime and also 6+1+6, a "number of the beast" and also part of Earth's orbital velocity. Which is 66,616 miles/hour. 6 is the first perfect number, the 3rd triangular number, and so on.
In pythagorean gematria (compared to "ordinal") a digital root is taken before adding the numbers together. So Y=25, 2+5=7, Y=7. O=15=1+5=6, U=21=2+1=3. Simple digit sum. So You=7+6+3=16. There's the 16 in 16:9. i is the 9th letter. There's the 9. So the screen aspect ratio 16:9 is the ratio of you:i. "Money" is likewise, the word is an encoded reference to the eye of Horus, ie the moon. Mo[o]ney[e]. You also have mon-E, or one e. One energy, one 5, where 5 is the senses, sensory reality, and witht he 5 pointed star it's the top point, the hidden "aetheric" mover, the spirit. The other 4 are the elements, or the visible, the seen. Fire, water, air, earth. The pentagram also encodes the music scale in its unfolded ratios, it has infinite recursion, and so on.
Screen sizes and resolutions are references to the moon. Which is chased away by the sun, it reflects the sun, it eclipses the sun periodically. Where the masculine and feminine, the sun, and moon, the beast divided, become one, revealing the corona, ie the white ring. There was an important eclipse over much of Europe in November (11) of 1331. Fold in your ring finger, you have 13. Flip your hand around, 31. 1331.
For the older 4:3 screens it's the same. 16 is the 4th square. 9 is the 3rd square. 4x4 and 3x3. ie 16:9. The magic square of the sun is 111 and it's a 6x6 grid. 1080, drop the zeroes, 18. A lucky number. Also 3x6, or 6x3. 6+6+6. Which is the number that connects the sun, the moon, and the Earth. It's the number of "The World". That's the beast. Also in Hebrew gematria the word for "a man" = 216, 6x6x6. It's the number of a man. "six hundred threescore and six" gives 313 and 133, which is a whole 'nother rabbit hole.
1:45 441 lines. 441 is the 21st square. 21x21. 21 is the 6th triangular number. 66 again. Or U is the 21st letter, So U times U. You x you, you alone, or alone with others, in front of the TV. Not that I was ever all that social. 2+1=3. So 33 also.
625 x 576 for PAL. 625 is the 25th square. 25 is the 5th square. 5^4. 576 is the 24th square. 2+4=6. 66. X is also the 24th letter. So XX. 625 x 576 = 360,000. 600th square, 600x600. Or drop the zeroes, 36. The 6th square. 6x6 either way.
Bro what are you yapping about
@@lovrito2008 I forgot to mention that 1331 is the 11th cube. 11x11x11.
Conspiracy theories be like:
@@lovrito2008 The other thing is the 1920 part. "eclipse" = 192, "second" = 192. Both Agrippa's key.
Actually, I had long wondered why they chose 720x576 instead of 768x576 for (PAL) DV video (the latter would have resulted in square pixels for 4:3 aspect ratio). Now it makes sense. Thank you.
0:00 Awww, what a gorgeous kitty! 😻
PAL was a color standard and has nothing to do with the number of lines. The 625 line system existed before anyone even thought of PAL. In the UK the black and white standard was 405 lines though they adopted the 625 line system in anticipation of the color in the 60s.
SD/480p 640x480,
HD/720p 1280x720,
FHD/1080p 1920x1080,
QHD/2k/1440p 2560x1440,
UHD/4k/2160p 3840x2160,
FUHD/8k/4320p 7680x4320
Doesn't make sense to call QHD 2K as it is 2560 wide (far from ~ 2000). If 2160p is 4K (3840 wide) , then 1080p is 2K (1920 wide).
1440p is 2.5K.
@@stalefurset9444 it's because 1440p is 4x the resolution of HD at 720p. The 2k comes from the 2560 horizontal pixels while UHD/4k has 3840.
480p is actually 800*480. I know it from Nintendo Wii.
@@nowknow Nah, 1920 is 2k. 2560 is 2.5k.
2:00 The scan rate of NTSC colour was reduced slightly to make it circuits easier to build with the hardware of the time; it did not change the bandwidth. The bandwidth needed for colour information was taken out of the bandwidth available for horizontal resolution, leaving colour systems with a maximum resolution of about 160 lines across. (Sort of. The interactions between colour information and luminance information are rather more complex than that, so the resolution actually depends on what colours are being displayed, but if you wanted to display alternating black and white vertical lines on a colour display, you can't do more than about 160 before the TV decides that it should be display colour information.)
7:00 Higher vertical resolutions were introduced long before VGA; even IBM had introduced 350-line EGA by 1984. But it was the Japanese who kicked this off with microcomputers (probably due to wanting to be able to display kanji reasonably well) few years earlier (around 1981) with 400 line displays in systems such as the NEC PC-8801 and Fujitsu FM-77. (And of course the Japanese were heavily involved in later HD standards, since they both used almost exactly the same NTSC system as North America and had a very strong industry building TVs and exporting them worldwide.)
BTW, in North America we usually pronounce "SMPTE" not letter-by-letter, but as "simptee."
Color*
@@Khloya69 There's only one country in the world that misspells "colour" as "color," so I guess I know where you're from.
@@Curt_Sampson based on the bottom of the message, you are in that country too. So don’t use britishisms.
@@Khloya69 I am certainly not in that country too. I knew the U.S. educational system was bad, but I had no idea it was so bad that you guys don't even know that there are other countries than the U.S. in North America.
And when I use "colour," it's not a "britishism." It's our standard spelling in Canada. (I wouldn't expect you to know about that, though, if you don't even know that Canada is in North America. Now you've got _two_ things to TIL about!)
@@Curt_Sampson I knew Canada is in North America, but i thought it statistically unlikely someone would be commenting from a country with a smaller population than California. Also, I do not respect British English.
I had a bunch of CRT TV's I'm glad I moved to 4K QD OLED'S thank God for new technology 💯
Thank you for taking the time to explain even further why 1080p is HD 😂
wasnt 720 hd?
Yes, I remember this. Then 1080p was Full HD
Yes. 720p (1280x720) is part of the HD standard. Many people confuse this with the 720x480 resolution of DVDs. It's an unfortunate coincidence that 720 features in both specs.
@@joesterling4299isn't 720x480 just 704x480 with blanking interval black bars?
It is HD but came later. 720p was born out of NBC & Zenith's 787.5/60p experiments which began in the early 1980's. NBC/Zenith claimed that a progressive scan image would be better & simpler than an interlaced (then 1035i) image and experimentally obtained that 787.5 would be the best mix of static resolution, motion resolution, and bandwidth; however, there was little hardware available unlike what the Japanese had. Regardless, NBC ended up using the 787.5 system to fight against 1035i being accepted as the sole American standard in the early 80's.
This standard was later revived when digital HD transmission was considered in the USA & modified to 750/60p (720p) since it could use the same pixel clock (74.25MHz) as 1080i.
@@ReelyInteresting1080i is actually slightly higher resolution than 720p but if you use mpeg interframe compression then it doesn't really matter because you can use the same bitrate
Something I don't know about a boring standard preferential? Going right back to the 40s? Brilliant! Sign me up! Don't know if you noticed this yourself, but the rest of the recommendations on my homepage is a goddamn global dumpster fire on the brink of collapse. Pity it only goes for 20 min. I may even watch it twice. So, please sir. Continue. I'm more than invested .
Me: clicks video
Video: loads in at 720p
Me: ha.
René Barthélémy invented HD in 1941, which later became analog HD standard in France in 1948 known as « 819 lignes ». It was black and white, used 12 MHz bandwidth and Acamedy aspect ratio. It was adpoted by neighbourgh French speaking countries. The last one to use this system was Monaco who discontinued this system in 1985.