UC i always thought the p in 1080p stands for "pixels"
+OhLookIt'sAnUnexpectedChromebook Good lord, I'm a 54 Y.O. fossil from the C.R.T. age, and even I never thought that. 1,080 pixels is about 0.001 megapixels! LOL
I didn't. I searched for what it means literally the first time I saw it.
+Erik István Fejes - Same. You'd think that in the age of services like Google and Wikipedia, people wouldn't assume things as often, but sadly we're only human.
japzone One day, my friend. But for now, we can only hope.
I feel better about my DVDs now that I know they're 345,600p!
When he said "cmon guys, it's 2016" I realised this is year 2020
EDIT: It’s 2024 now and I am still getting likes on this comment….crazy imo
Can we stream 1080p60 yet?
@@monsterhunterdude5448 yes, some networks are even beginning to experiment with broadcasting in 4k, for example, ABC broadcasts 1 college football game a week in 4k.
@@rosalynconrad3623 *nice*
bra same was thrown for a loop as well.
Why did this video show up in 2020??? Lol
Lol I thought p stood for pixel
yeah ikr
Stooooped it means pee
Me too xD
me too lol
Same
"the problem is bandwith"
>video stops to try and buffer for 2 minutes
yeah you're god damn right
Congrats on 420 likes.
God is god we are all humans
@@suzanabazi5056 please learn english
@@nutzboi what's wrong
@@nutzboi about as many pixels that are on your screen
Linus: ‘come on guys its 2016’
Me: *laughs in 2021*
i scrolled so much for this comment
haha now we have | 1080p 360 fps | 1440p 240 fps | 4k 120fps | 8k 60 fps |
@@ji0k394 Don't forget 144p 30fps.
Ah, I found my "random video recommendation" brothers in the comments. Sup fellas.
Yeah, and my satellite provider's STREAMING service is still in 1080i...
Dat sunburn.
lol I thought Im only one that noticed
I can't tell if he's sunburnt or if it's just their unnatural colour correction
It's sunburn, CC isn't that bad and the rest of him is relatively normal color, as even seen in the thumbnail
Yup. He got sunburned a month or two ago. They shoot these in batches for someone else to edit and render. At least that's my understanding.
can u explain more easy
YouTube Recommended: here
Me: *watches*
Linus: "Its 2016"
Me: *visible confusion*
When he said that, I suddenly pause the video and go to comment section
@@izzulhaqmahardika8892 me too haha
3 most unpredictable things:
1. Your Girlfriend
2. Girls mood
3. YouTube Recommendation
@@izzulhaqmahardika8892 same
same right now
I can't wait until the day interlacing finally dies
Why?
+TheOlian04 its ugly
+ÉsxatosFox with a good deinterlacing algorithm you won't see it -_-
+TheOlian04 no matter what, 1080p will always be better.
+Niznuts123 Better than what? Will 1080p be better than 1440i or 2160i? You don't seem to understand why Interlaced is used. It is to save bandwidth without losing too much detail. It's a compromise based on available bandwidth.
Sure, Progressive is better than Interlaced. But Interlaced wasn't introduced to be better. It was introduced because hardware wasn't fast enough to transmit progressive frames.
_"Oh come on, Linus. This is 2016."_
Simpler times. 😔
F
Hello 👋,
From 2024 😂
2020 wasn't that complex too
*_2016:_* 720p 60fps
*_2020_* 1440p 240fps
2020: Coronavirus
For two or three weeks now, 720p is no longer considered "HD" on YouTube. Time flies..
@@Jackrodder :( i wish it still was
*2030: 8K 360FPS*
2021 :16k 1000fps
If only Linus uploaded in 1080p60...
+Jt pkmn No new stuff because we're all used to old stuff.
+Tb0n3 also cuz my internet is shit and I don't feel like figuring out how to limit to regular 30 frames
all you have to do is drop down to 480p or lower, and YouTube will always play videos at 30fps.
+Commodorefan64 480p sucks, can't see Linus's perfect face in high quality
You would have to sacrifice 4K for some extra frames
As someone who works with this I am happy to see that Linus didn't mess any facts up. Stamp of approval! Important to point out that 1080p24 (any movie and most scripted TV) can fit into 1080/60i signal (what is commonly known as Telecine) and be decoded by most TVs "worth their salt" as the original 1080p24 picture with very little loss in quality. On a codec level, encoding interlaced is typically less compressible so a true P encoded signal would still be of benefit, but it's hardly the end of the world.
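Since telecine comes up here, a minimal Python sketch of the 2:3 pulldown cadence the comment describes (frame labels and function name are purely illustrative, not anything from the video): every 4 film frames become 10 fields, turning 24 frames/s into 60 fields/s, and because whole fields are simply repeated, the original 24p frames can be rebuilt with essentially no loss.

```python
# A minimal sketch of 2:3 pulldown ("telecine"); only tracks which source frame
# feeds each interlaced field.

def telecine_23(frames):
    """Map 24p frames to a 60i field sequence using the 2:3:2:3 cadence."""
    cadence = [2, 3, 2, 3]              # fields contributed by each frame in a group of 4
    fields = []
    for i, frame in enumerate(frames):
        for _ in range(cadence[i % 4]):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# 4 film frames -> 10 fields: A,A | B,B,B | C,C | D,D,D.
# A TV (or an inverse-telecine filter) can drop the repeats and recover 24p.
print(telecine_23(["A", "B", "C", "D"]))
```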
In 2010 we got a 50 inch 1080p TV and we were really excited to open it just to find out most channels are ONLY in 576i
People in the UK: buying a 4K TV and watching SD. Why tho
And here I am in 2021, having bought a 50” 4K TV just to find out that most channels are only in 1080p
@@AYouTubeChannelwithNoName but 1080p on a 50" 4ktv looks fine tho
i see 1080p on a 65" 4ktv and it looks fine
@@c0wqu3u31at3r gaming
@@c0wqu3u31at3r SD is being deactivated in the UK, everyone is fully switching over to either 1080i, 1080p or 4K60 (usually reserved for sport)
You know you're watching a Canadian channel when the first Sport that Linus could think of was Ice Hockey :D
0:35 That picture of Bernie Sanders was taken in my highschool gym. 😂😂
Wait wut?
whaaat?
Woaa
Zyaire Fullwiley wow
Haha
That P stands for Pascal.
GTX 1080P for 4K.
Mike Brown *nods in approval*
Agreed!
lol
What about the Titan XP version of windows 95 Kappa
Better naming scheme than nvidia
Linus: "it's 2016"
Damn has it really been that long?
Deinterlacing is a type of anti-G SYNC
Anti g sync???? Lol
wut
@@jawknee4088 gsync basically forces every frame to go in its own spot with a smooth transition from frame to frame to eliminate frame tearing. (Poor) deinterlacing takes two different fields and forces them together, essentially making the worst screen tear effect possible, combing. Gsync uses extra resources to keep every frame apart, and deinterlacing forces fields together in order to save resources, so they are doing essentially the opposite.
Does the same apply for free sync
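To illustrate the combing described a couple of comments up, here's a rough Python/NumPy sketch of naive "weave" deinterlacing: it stitches together two fields captured at different moments, so anything that moved between them turns into comb teeth. The toy image and names are made up for illustration.

```python
# A rough sketch (using NumPy) of why naive "weave" deinterlacing combs:
# the two fields it stitches together were captured at different moments.
import numpy as np

def make_frame(bar_x):
    """6x8 test image: a 2-pixel-wide bright bar starting at column bar_x."""
    img = np.zeros((6, 8), dtype=int)
    img[:, bar_x:bar_x + 2] = 1
    return img

earlier, later = make_frame(2), make_frame(4)   # the bar moves right between fields

weaved = np.empty_like(earlier)
weaved[0::2] = earlier[0::2]    # even lines come from the earlier field
weaved[1::2] = later[1::2]      # odd lines come from the later field

print(weaved)   # alternating rows disagree about where the bar is: that's combing
```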
Mentioned a politician without starting a flame war? Well done.
In 2023 the BBC's live broadcast service is still in 1080i (some channels 4K60)
Why did films skip 2560x1440?
Technically they didn't. Most films are/were shot in that resolution, then upscaled/downscaled to fit a 1080p or 4K TV. Also, they skipped it due to it being an odd number.
That's called 2.5K and is available on some devices like GoPros and drones.
+Elliot Kaufman it's called 2K and isn't used because 4K is a thing. 4K is a perfect multiple of 1080p while 2K is not, so 1080p can be "perfectly" upscaled to 4K. Plus it's a bigger number than 2K, easier to market, etc etc.
lol no its 1440p.
+scootbmx01 well, 1440p, QHD, WQHD, 2K & 2.5K are all valid as the standard is less well defined having only taken off in niche markets.
Well, interlacing was actually also a way to achieve smoother motion (very visible on rolling credits, for example), and the half fields are not from "slightly different frames" - they are straight up from the preceding and following frames, so to speak, so deinterlacing isn't complicated algorithmically at all (as long as we're not talking about standards conversions, where things get really messy, with cadence, frame rate, and scan order).
I don't understand why 60 half frames would save bandwidth as opposed to 30 full frames. :O
60i is actually better than 30p in that it has smoother motion for fast moving video. 60p would be better than 60i, but requires twice the bandwidth to broadcast.
TheShonenJumper Because it refreshes faster (since it doesn't have full frames to render).
None of the replies to this comment makes sense. So no, it doesn't save bandwidth, but it gives you a smooth image at the same bandwidth as the jittery one, with the one exception of quality loss. But who really cares? It's 2018 and we are fighting to get the first 144k monitor, then 240k, and so on to 1080k
For the same bandwidth you get a smoother motion.
Xistence Studios do you not know the difference between framerate and resolution
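To put rough numbers on this thread, here's a quick back-of-the-envelope comparison of raw pixel rates (compression and blanking ignored, so treat it as a sketch rather than real broadcast bitrates):

```python
# Raw pixel throughput for the formats discussed in this thread.
formats = {
    "1080p60": 1920 * 1080 * 60,         # full frames, 60 per second
    "1080i60": 1920 * (1080 // 2) * 60,  # 60 half-height fields per second
    "1080p30": 1920 * 1080 * 30,
    "720p60":  1280 * 720 * 60,
}
for name, rate in formats.items():
    print(f"{name}: {rate / 1e6:.1f} Mpixels/s")

# 1080i60 and 1080p30 move the same ~62 Mpixels/s (half of 1080p60), which is
# the "saving": you still get 60 motion updates a second for 30p-sized data.
# 720p60 comes in nearby at ~55 Mpixels/s.
```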
When he said, "Not progressive like car insurance or Bernie Sanders.", that cracked me up.
YouTube decided to show this 4 years later, and I’m not complaining
Linus, I think you are about 10 years late with this one...
How?
It's something I've wondered about for a long time.
got news for YOU too. 5K and 8K TVs are in the works already too... so you're a little behind too.
"Samsung And LG Are Touting 8K TVs, But Sharp Is Actually Selling One For A Whopping $16,000"
but they are out there already
***** Oh yes, all about 4k, when the majority still uses 1080p. Smart one.
4K today is what 1080p was 10yrs ago.
I was watching this and learning only to hear him say "it's 2016!" Having my life flash before my eyes
Annoys me how even in 2019 the BBC STILL delivers shows in 480p.
BBC is European so it probably doesn’t deliver in 480p and I think that BBC is broadcasting in 1080p50 (in UK)
They broadcast in 576i or 1080i. Progressive is not used on UK broadcasts.
In my 17 years of using youtube, this is the first video that does not have a single dislike. Congratulations!!!! great video
Me an intellectual: p = pixels i = inch
Americans can't even measure screens in pixels xD
bro your tv is 1080 inches
Bruh
@MrOrzech bruh it’s a joke, (ima assume you’re British because them people are full of themselves) bruh British people can’t even notice a joke clown.
@@epicbread1310 Meh, that is barely half of Frank’s TV
The interlacing of analog video was not only convenient back then; today it's EXTREMELY convenient that such a standard was in use for video recording, because we can recover 50 or 60 fps video by de-interlacing that old material: add digital processing to the extracted fields and you get the best quality AND THE BEST FRAME RATE, just like that.
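As a sketch of the recovery described above, here's the basic "bob" idea in Python/NumPy: each field becomes its own output frame, so 50i/60i material comes back as 50/60 fps. The naive line-doubling below is only illustrative; real deinterlacers (yadif, QTGMC, etc.) interpolate the missing lines far more cleverly, and the function name is made up.

```python
import numpy as np

def bob(fields, first_field_even=True):
    """fields: list of (H/2, W) arrays in display order -> list of (H, W) frames."""
    frames = []
    for n, field in enumerate(fields):
        h, w = field.shape
        frame = np.zeros((h * 2, w), dtype=field.dtype)
        start = 0 if (n % 2 == 0) == first_field_even else 1
        frame[start::2] = field        # the lines this field actually captured
        frame[1 - start::2] = field    # crude fill: duplicate the neighbouring line
        frames.append(frame)
    return frames                      # one frame per field: double the frame rate
```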
heh, right when he said 720p my video quality dropped to 360p
thanos is coming
scuttle06 he’s coming, HE’S COMING! *HE’S COMINGGG OH MAN THAT WAS A BIG LOAD*
Setting playback to 1.25x makes this as fast as he presents these days. 2023
1080p generally produces a sharper image than 1080i, and often with less motion blur on an LCD TV. For this reason, I find 1080p better for gaming.
Jeeeeez thank you. I can’t with all this talking. Yes or No. 😩 ... 1080P it is 👍🏿
@@hraculezboufaunt9726 1080p is better than 1080i, end of story
Tbh i dont get settings for 1080i only 1080p
"But come on, Linus, this is 2016; Why do we have to sacrifice anything?"
We have to sacrifice thing to gaben.
because it is not 2020
Why did I think the title was:
*“1080 vs 1080ti”*
Lol same
Same lmao
Pretty good, although some facts are wrong. Basically, TV is recorded at 60 fields per second (NTSC) or 50 (PAL). Each field stores only half the lines (half the vertical resolution) to save space/bandwidth. This means that when we blend two fields into one frame, we are actually merging two pictures taken at different points in time (we are reducing temporal resolution). This is why any movement will cause combing (those horizontal lines). The result is 30 merged pictures per second instead of 60. For this reason 720p@60 is better than 1080i@60. 1080i@60 is better for fast moving scenes than 1080p@30. However, one big reason for choosing 720p@60 over 1080i@60 for sports is because of so many straight horizontal lines that would flicker on TVs with poor quality deinterlacing.
In the clip he shows some of the lines for interlaced; that can't be right? It should be every other line, I think.
Actually, you never see combing in normal viewing environments. I have a Philips LCD widescreen TV from 2008 that is able to do 720p@60 and 1080i@60 and I don't see ANY! combing at all with the latter
When I bought my first LCD back in like 2007 I spent HOURS and HOURS searching for 1080i vs 1080p. In the end I settled on an off brand 32" LCD that was 1080i for $599 and that was CHEAP for that time period. Man how times have changed, now you can buy a 32" 1080p for next to nothing.
Just asking for a reference. How much does it cost though? I'm from India so just wanna know
quality has gone to crap, that's why these TVs are getting cheaper and cheaper
Wow man. This is probably the most high quality engineering tutorial video. I was searching for interlace scanning mechanism and stumbled upon this video. Typically, engineering tutorials on youtube are of poor audio/video quality.
It's also somewhat wrong, as it gives the impression that the consecutive half-frames form a full-frame. That's not the case. A 60i signal is a 60fps video with every other line skipped, alternating between even and odd. The missing lines are completely discarded. There is never a full-frame in an interlaced signal, each half-frame is created at a different point in time.
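A tiny Python sketch of that point (the names here are made up for illustration): each field in the stream gets its own timestamp, and the lines it skips are never stored at all.

```python
def interlaced_capture(scene_at, seconds=1.0, field_rate=60):
    """scene_at(t) -> full picture as a list of rows; returns timestamped fields."""
    fields = []
    for n in range(int(seconds * field_rate)):
        t = n / field_rate                  # every field is sampled at its own instant
        full = scene_at(t)
        start = n % 2                       # even lines, then odd lines, alternating
        fields.append((t, full[start::2]))  # the other half of the lines is discarded
    return fields

# Demo: a 4-line "scene" whose content encodes the capture time.
stream = interlaced_capture(lambda t: [f"line{r}@{t:.3f}" for r in range(4)])
print(stream[0])   # (0.0, ['line0@0.000', 'line2@0.000'])      -- even lines
print(stream[1])   # (~0.0167, ['line1@0.017', 'line3@0.017'])  -- odd lines, later instant
```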
3:43
I say " Come on Linus THIS IS 2020 WHY DO WE HAVE TO SACRIFICE ANYTHING"
3:44
come on linus this is 2016
me looking up when this video upload.
2:46 The Aristocrats! by Gilbert Gottfried !!!
Hey man! Are you gay ?
@@DarkSharingan21 yes
@@KotTheSillyOne based and gaypilled
Yet another reason to join/continue the PC master race, thnx Linus
I can't believe 2016 is already 12 years ago.
Makes 2016 video on tech that's been around and explained over 10+ years ago. Way to go.
lol.. who still watches cable anyway? i cut the cord back in 2007
I love the way he said "come on linus its 2016 why do we have to sacrifice anything!" Like its 2021 now and we have sacrificed a lot over this virus...
AMD Asynchronous Compute as fast as possible
👍 cuz no thumbs on mobile
And send that to NVIDIA, they need to step their Async game up!
+Tiago Luz They don't need to right now, it'll be a huge waste of money because of the very low amount of games that use async anyways.
yes
Asynchronous Compute is NOT an AMD thing.
and for the trolls that talk about nvidia and Async:
Maxwell can already perform async compute, but it highly depends on the developer to utilize it.
Async compute is there to utilize the GPU a bit better - well, NVIDIA has always done that by writing their own drivers for big games and making sure they run as efficiently as possible.
Thank you Linus. This channel is an easy way for me to catch up on the latest tech. I work in computer repair so this helps a lot.
What I've noticed is that most cable companies in the US almost always support progressive scanning. DirecTV didn't, however, from what I remember; they mainly used interlaced, and you had to pay for a progressive signal. Spectrum lets you choose what you want your resolution to be, whether it be 480i, 480p, 720i, 720p, 1080i, or 1080p. But for my antenna that I'm using, most channels use 480i and almost never drop in signal. Some local channels like channel 9 (ABC) use 720p, while channel 36 (NBC) and channel 3 (CBS) use 1080i. I've noticed that sometimes because of this, lag can be terrible if it's raining or even if it's windy. Meanwhile channels that are using a lower resolution or a progressive signal are more clear and reliable, even during a heavy storm. This is why I'm glad my antenna doesn't need its own power supply, because I can just plug my TV into my UPS and watch TV if the power is out.
Nice to see you on here again Linus.
2020: 8k over antennas
Me: excuse me WTF?
why tho? just use internet
@@MenacingPerson it'll be for the olympics i think
@@enricodelascio4330 that makes no sense still.
@@MenacingPerson true
@@MenacingPerson TVs do have 4 and 8k channels, and they're not using internet connections
Shiny happy electronics👌😂 Great video Linus!
Could you please film and edit in 50 or 60 fps. YouTube supports it now and it was really apparent in this video with Linus's many fast hand gestures.
They sacrificed 60 FPS for 4K video; in a few years affordable 4K60 cameras will be common, because today they are too expensive.
nanopulga098
But this video doesn't have a 4k option
They are in USA so I don’t think that they’ll shoot at 50p
@Lucas Silva 60fps also doesn’t look good unless it’s displayed on 60Hz or 120Hz xdd
25/24 FPS is used in films and tv and is sufficient.
Bourgeoisie is a noun. If you're trying to use it as an adjective, you want Bourgeois.
Every time my mind wanders and I think of a technical question, I always end up here with you, Linus. Cheers for being you
Subscribed.
Linus: "Its 2016"
Me: whaaaaat
"it's 2016" 😭
I understand the 480i and 576i signals, as they were born so long ago that there really weren't many alternatives, but the 1080i standard should never have been created. Deinterlacing is never the same as an originally progressive video, and it is also bad for editing and playing videos.
By the time HD resolutions arrived, if for some reason there was still any bandwidth issue preventing 1080p60 from being used (which probably wasn't the case - they just wanted to fit lots more channels onto satellite links with low bandwidth), 720p60 should have been chosen as the default instead.
…and 720p60 is what they used, and probably still use for over 50% of broadcasts anyway. HAH, TRUE that, broski.
Interlaced is bad for playing videos, really? I've got a bunch of family footage from the 2000s that was filmed on different cameras, etc - the interlaced videos, when deinterlaced, look like they're going at 60 fps; they blow the competition out of the water. If I could choose between double the perceived fps at the cost of not being able to get good still frames when pausing, I know what I'd pick.
@@andreivaughn1468 interlacing is bad when all our displays are progressive nowadays. If possible, everything should be at least 60 progressive frames. I also have lots of VHS footage from the 90's and 2000's. Indeed, when properly deinterlaced they look good, and it was needed back then as there was no practical way of recording 480p60. When high definition standards were conceived, there were 60p options, so 1080i shouldn't exist. 720p60 is better than 1080i for editing, recording, playing...
@@arthurand1006 I've used progressive displays all my life... I am not that old, but what I am saying is...clearly there was an era where 60 progressive fps was not feasible, and in that era, the "faux" 60 fps that could be gained by using interlacing looks better than standard 30 progressive fps. To my eyes anyways!
I'm not saying 60 interlaced frames is better than 60 progressive frames, I am just saying 60 interlaced frames seems to take up the same storage space as 30 progressive ones AND looks better.
You know, I thought my eyes were fucking with me when I was watching 1080i videos compared to 1080p on my computer. I legit thought my eyes were broken and I'm the only one who could see those lines during movement. I hadn't seen this kind of video until 2 years ago. This cleared up everything and I no longer think my eyes are fucked up. Thanks man.
Bravo Linus.
I am currently working with video Quality Control and have to deal with those issues daily.
Keep up the good work!
Anyone notice his T-shirt "Holy balls"
You know which way to think, right
He is talking about 720p and 1080i
And I am watching this video in 1080p🥲
why do old consoles like nes/snes/genesis/ps1 all look better on interlaced crt tvs vs newer progressive screens (small lcd tvs 16"-21")?
Because that's what they were made for
Because most TVs have poor quality deinterlacers.
because they don't scale appropriately to hd resolutions?
+Brett Kessner consoles output at 240p, a whole different standard that cheaply made TV scalers either can't handle well or completely misinterpret as 480i
240 still can't pixel-match on a 1080p tv; 1080/240 = 4.5, so there's no whole-pixel scale factor
Some of you seem very shook that videos from the past exist and it's lovely.
Man we have come a long ways in 4 years.
"or Bernie Sanders" xD I laughed but also cried a little bit.....rip. Heros never die, they only re-spawn.
Bernie's not a hero, he's a legend.
***** ayy like daddy trumpo
what's a chafee
Maniac of Doom well it's a tank, and also a failed democratic candidate for president.
I see
I always thought the P in 1080p stood for 1080 pixels
And what about IPTV? Is there interlacing there too?
i would like to know too
I think not - we have cable and also IPTV from the same provider. Cable channels are mostly delivered in 576i50 and 1080i50, but the same channels via IPTV are delivered mostly in 576p25 and 720p50.
IPTV is delayed and rubbish, tends to be progressive
I feel like these videos have become a lot... calmer
"Come on Linus, this is 2016!!"
Watching in 2022, wish I could go back then once again
Wait so p doesn't mean pixel?
no it means ur mom
@@are3287 thx ♡
My Internet is hell slow
Hem Shah better now ?
It took 1 year for this comment to upload lol
Hem Shah my download speed is 2kb/s or 2gb file for a year.
Better now
Use netcut
so what about the TV channels that are broadcasting 4K (beIN Sports in the Middle East)?
yeah, they offer a high-end, expensive receiver! Euro 2016 & World Cup 2018.
but my questions are:
1- does 4k have an interlaced signal ? or is it only progressive ?
2- if they can broadcast 4k, then i think we can easily have a 1080p 60 fps broadcasted as well, am i right ?
Linus is talking about North America mainly, where the infrastructure for this stuff has been there for years and is too expensive to change.
Though you are correct the 4k footage you are watching nowadays is in progressive and they do easily also broadcast 1080p.
4K is the resolution; it doesn't necessarily denote whether it's interlaced or progressive. You can have an interlaced 4K-resolution image.
Broadcasts have tended to be progressive for 4k, ie 2160p.
your video brought back my engineering days; electronics engineering rocks 😎
I take exception to your use of the word "frame" in regards to 1080i. With 1080i, each FRAME is composed of 2 FIELDS displayed one after the other, referred to as upper and lower fields (odd lines or even lines).
"we won't get 1080p any time soon"
four years later - 8K with 16 times as many pixels
Not through cable providers. You took that quote so out of context and it makes him look like an idiot.
@@jaalan7896 That was not my point. It's just so crazy how fast technology evolves.
@@petterhaugenes4744 there is still no 1080p broadcasts, they are 1080i. In the UK all broadcasts are interlaced aside from UHD.
3:43 wait wait wait... HOLD ON let me check
That line made me check when this was uploaded😂
movement moment
"Come on Linus, this is 2016!"
wat
U are 3 yrs late bro
@@adityaisgreat21 u are 1 yrs late bro
@Qwert u are 1 month late bro
@@jadoo782002able u are 1 day late bro
wow, just four years ago they were only using standard HD, not even full HD, for sports, and now we're using 8K for the Olympics; that's 36 times as many pixels as 720p.
AND my TV experience is solidly spoiled. Thanks LINUS.
damn, when u say its 2016 im almost cry =(
Me: laughs in 2019
Me: Laughs in 2019+1
Me: laughs in 2020
@@Hensteve Thats illegal
In 2018, we use 2160p mostly.
Ryan The Retro Gamer that's 4K
My PC dies at 720p
In 2019, we use 128k xD
Ryan The Retro Gamer nah, we use 1440p. FUCK 4K
in 2013 they used 720p and in 2019 they use 1080p
2016 Linus was so cute, lol.
I didn't come here for this info but sat through the whole thing. Interesting stuff!
so can you have 2160i and 2160p?
Technically yes but if anybody actually thinks 2160i is a good idea they ought to die.
.
+Revenge woh couldn't agree more with you there
+Revenge woh I second that.
Revenge woh I'm a year late but I agree
2:45 gilbert gottfried? Is that you?
I first saw 1080i on my Xbox 360 display options. Anybody else?
Way to go Linus! Awesome! Simple and Scientific. Keep uploading :)
RIP TunnelBear
2019?
TV MASTER RACE LUL
LUL
4:21. PCMR FTW! (although he actually says PC master raist.)
Everytime I listen to it, it gets worse! :P ahah. We love you linus :P
PC Master wrist
Dear youtube commenters. 1080p, 1440p etc, are digital television formats, not display resolutions. The word you're looking for is "Full HD" or FHD (1920x1080) or QHD (2560x1440)
@TechQuickie: Linus! Great videos man, you are correct on the reason behind not having (had circa 2016) 1080p TV sooner. Subscribed! Vive la YouTube Revolucion! I hope, I really do.
Who watched this in 1080p lol
this must be an American problem, over here sports channels and such use 4K today
Where do you live?
Norway, and yes it might be a satellite problem as most people over here have TV over internet
*****
no
There is 1 sports channel in the UK that is broadcast in 4K
In Australia sports is 720p. Nothing is higher
Xbox One... 1080i hahaha #PCMasterRace.
+13thxenos The XboxOneS only plays 4k video. Games only get upscaled to 4k..
+Legend you're saying games on PC are only upscaled to 4k?
+TheDarkToes no xbox upscales
+Kasian I don't even think the Xbox One S upscales games to 4k. It probably plays 900p, upscales to 1080p, and your monitor or 4k TV upscales from 1080p to 4k, because of the panel and information it is being sent.
PC doesn't upscale to 4K. It's native (as long as you have the GPU for it, that is).
Today I was messing with HandBrake to convert some old video and I saw the deinterlace option; I had no idea what it was. Thanks for this video :)
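For anyone scripting the same job: HandBrake exposes this as its deinterlace filter in the GUI, and roughly the same result can be had from ffmpeg's yadif filter, which the sketch below calls from Python. File names are placeholders and it assumes ffmpeg is installed.

```python
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "old_interlaced.avi",  # placeholder input, e.g. a tape capture
    "-vf", "yadif=1",            # yadif mode 1: one frame per field (50i -> 50p)
    "-c:v", "libx264",           # re-encode as progressive H.264
    "deinterlaced.mp4",          # placeholder output
], check=True)
```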
Linus, you're a National Treasure and a credit to your Nation. I can't learn anything about a PC without you.....it would seem.