OK, here comes a pinned comment about the whole Star Trek thing (and I've edited this to give a little more clarity); I'm getting a lot of mixed information from commenters about what specifically is preventing Voyager (and Deep Space Nine) from getting an HD Re-release. Some people are saying it's just a money thing, and that Paramount couldn't recoup their investment (as learned from their experience doing the TNG re-release). TNG was, after all, edited on tape so what is the substantial difference? Yes. *TNG was originally edited on tape, not film.* I implied in the video that there was a "final cut" on film but this isn't the case. From my research and understanding, as helped along by others, the key difference between TNG and Voyager when it comes to making an HD re-release feasible is that while *some* of the VFX work on TNG was rendered to tape, this was largely things like phaser effects, planetary orbital scenes, and other minutia. Most shots of the Enterprise were done with models / practical effects, and as such they existed on film from the get go, meaning they existed with the full detail film can capture. So yes, the version we saw upon its initial release came from a tape, but unlike Voyager (and DS9), a much greater percentage of the original effects footage existed on film. This meant that to re-release the series in HD, scanning the film negatives took care of a large portion of business. It largely needed to simply be re-edited to match the original cuts and timing. The most complicated part of the process was re-compositing some of the film-based effects. But since they existed on film, it was mainly a matter of re-alignment. Relatively few VFX elements were done exclusively on tape, and so it wasn't a monumental undertaking to recreate them for the HD release. In contrast, for Voyager and (to a lesser extent) DS9, CGI was used _heavily._ And since that was rendered to tape and composited with the live-action scanned from film, the negatives (assuming they still exist) are just the live scenes and nothing more. If a re-release were to be done in HD, a *much* greater amount of work would need to be done. DS9 doesn't have it quite as bad as Voyager, but it would still be more complex than TNG. Some of the original computer models have been found which would help with recreating the VFX scenes, but in many cases the production houses that did this work went out of business. Which leaves many, _many_ scenes where the entire thing needs to be re-done from scratch. And given how expensive this would be, it's likely never to happen. So no, it's not impossible that we'd see DS9 and Voyager get re-released in HD, but very unlikely. The extra complexities required would add tremendous cost to an already expensive endeavor.
Technology Connections Meanwhile, I’m rewatching TNG n Netflix, the compression artifacts when you see the establishing shots of the Enterprise on a star field are atrocious
Consider, running contemporaneous to DS9 was Babylon 5. B5 famously used Amiga computers running Video Toasters (in the first couple seasons, along with SeaQuest DSV), and that stuff looked waaaay better than what Paramount was doing with DS9 and Voyager. There seems to have been a optimization done in the CGI work that boils down to "how much effort do we REALLY need to spend here? It's going on broadcast TV." DS9/Voyager was fine for TV and VHS, but the models would look like traaash if rendered in modern resolutions. You'd need to rebuild every CGI shot for 2x 7 seasons x 26ish episodes per season. Also while DS9 was the tits from S1E1, Voyager needed until like S4 to get moderately passable.
I just posted this reply to another comment, but yes... Here's the TLDR from treknews.net Basically, TOS was all mastered on film, so they could literally just rescan the master print for each episode and be done with it. They did redo VFX with CGI for the remaster because the film compositing done in the '60s wasn't great and led to muddy shots that looked much worse compared to the HD scans. I believe they put both versions on the TOS Blu-Ray release, which is a nice touch. TNG, DS9, and VOY were all shot on film (though I have heard that there's some issues with the film stock used for certain shows and even specific seasons, such as season 2 of TNG), but they were all composited and edited in video. There are no master film prints of finished shows like in TOS. In that respect, the three modern series' are all the same. The difference however is that TNG used very little CGI when originally produced, but DS9 and VOY did. So in all three modern shows a remaster would entail the same scanning of the original raw film footage, re-compositing all the scenes and in-camera effects (assuming the plates are all still usable), and using modern CGI to recreate the VFX from the original video editing. Recreating the CGI from DS9 and VOY is non-trivial because a lot of the original assets are lost, either due to backup failures, deletion to save drive space, and/or closure of the production houses that made them. Even if the assets were available, simply re-rendering them at HD resolution would make them look cartoony at best, and a jumbled pixellated mess at worst. That worked for South Park because it's already a cartoon, but you can still see in their re-rendered early seasons some rasterized assets (like the poof ball on Stan's hat) that are jarringly pixellated. I think DS9 suffers the worst from this because there are some extensive space battles with hundreds of ships, all of which would need to be basically recreated from scratch. That could make remastering one episode of DS9 cost as much as the entire remaster of TNG. As for 4x3 vs 16x9, just leave it like it is. I think in the case of TNG it was mostly filmed in 4x3, so to get 16x9 you crop out a lot of the top and bottom of the frame and can lose important parts of the scene, or try to rebuild the sides of the set in CGI. In some cases they did shoot on wider film and did some pan-and-scan, but then you risk seeing the edge of the set, lights, tripods, wires, and other crap that then also needs to be digitally removed. All of this would add yet another layer of cost, so just accept the black bars, they won't hurt you, trust me.
I got halfway through the video, paused, scrolled down to correct the hell out of you, and was happy to see this comment instead. You have it right now! If you go and watch the special features for the blu-rays (plenty of the stuff can be found on youtube), you get a pretty good breakdown of the hows and whys of it all. Further discussions of the possibility of DS9 getting remastered over the years have had all the inherent problems with it and all the CGI brought to light. Also cases of VFX artists who worked at now defunct studios popping up out of the woodwork saying "hey, I have some of those scenes on a hard drive somewhere!", but it's really a lost cause, there's too much to rebuild. A recent DS9 documentary had some select scenes remastered to show what it could have been like (involving a fanmade CGI recreation of a battle scene). (note: there's very little fan outcry for HD Voyager... Voyager was considered a testbed for using CGI so it's riddled throughout the show from the outset. And while it's a fine enough show and has its fans, it's not lauded with acclaim like TNG and DS9) What's interesting to note is tha they were considering going full CGI from the start of TNG. They even had VFX houses do tests. They mostly look bad but some shots look actually kinda incredible for 1987. If that had happened, TNG would have been a lost cause too! Thankfully they didn't go that route because not only did it not quite look good enough, but also the CGI industry was tiny, unreliable _and_ unstable at the time.
Monty Python even made a joke out of it: "Alright, stop that sketch. It got too silly." "You can't stop it - it's on film!" "That doesn't make any difference to the viewer at home, does it?"
@@dominateeye I am pretty sure that sketch was specifically what taught me the difference between the appearance of video and film. I'd noticed the "soap-opera effect" before, but until that moment, I didn't know what it really was.
As a kid I tried to identify what made soap operas look so weird to me. I never could put my finger on it. I asked other people and they didn't know what I was talking about. I thought it was the lighting. Now I find out it was the frame rate I was reacting to. Thank you for solving a mystery of my childhood.
As a digital native, I can't wrap my head around what people mean by he soap opera effect and why it would be considered a bad thing. TV shows, sitcoms and the like never looked weird to me, but natural, whereas the stuttery or blurred movement in movies sometimes gives me a slight feeling of motion sickness or headache. Maybe It just comes down to being used to something? Cause our eyes don't have a framerate, nor does natural movement, so when it comes to a (virtual) reproduction of it, more should be better...
@@LRM12o8 It may have something to do with movies having a slightly altered look to them that makes them visually appealing, a bit like slightly slowed down motion looks cool, while slightly sped up actions looks almost comical. The higher frame rate of video makes it look starkly "too real" but with the odd generic lighting, daytime TV always looked unsettling to me, a bit like a visual version of the Stepford Wives.
As a video gamer, I got to find about this longer while ago, but it was still somewhat of a weird revelation to me. Not only that but while on PC I always like higher framerates (not that I could get more than 40 on my Pentium 3) - for TV/Film I always preferred or were just conditioned to prefer the framerate of film based media. I am not sure if 24 just hits some kind of sweet spot in style, roughness, smoothness and everything in between. Still, these things would funny enough come back with gaming and being transferred from there back to (Mostly just Japanese) Animation with Guilty Gear Xrd. For those less in the know, this is a fighting game that opted to present look of hand drawn (Japanese) animation from entirely 3D scene character models included. One major thing was purposefully turning off modern interpolation technologies (on the fly generation of maximum amount of frames to an animation as current frames per second allows) and doing every "frame" of the game by hand. Most interestingly, like said, this actually transferred over even to animation industry as some people from that same game development team left the company to find an anime studio that uses 3D and same techniques for animation one recent more known things being "Beastars" TV anime. The framerate conundrum especially for animation and movies etc. is fascinating. Even in Fighting games some games that tried to copy Guilty Gear while skipping steps ended up looking cheaper and "unnaturally smooth" while smoothness is actually the natural state of anything moving. We are just real weird.
@@SumeaBizarro hm, well I'm a gamer myself, but I remember that even before that, as a kid I've already sometimes noticed stuttery cameraflights (mostly in documentaries) and got frustrated with fight scenes that were so blurry that I could hardly tell what they were doing in cinema movies. Ever since I learned about FPS and got used to gaming at min. 50 FPS, I often get a slight feeling of motion sickness from movies. I don't think that there are any averse affects of high framerate for the image quality or the feeling of a movie, but that Hollywood simply managed to convince people otherwise, because they didn't want to upgrade for some reason. I mean there surely must be a reason why TV adapted higher framerate cameras, as soon as more than 24 FPS was possible. And I can even understand that Hollywood didn't upgrade right away, cause at that time storage was probably too expensive. TV stations didn't have to store their video footage, they just broadcasted it live once and that was it. But before storage for HFR movies was feasible, Hollywood discovered the option of hiding lackluster acting through the limits of 24 FPS: just make your kicks and punches faster than the camera can clearly capture them and nobody will notice, if the fight choreography doesn't look convincing in reality, that's my guess. I mean, if you watch old action movies from Bruce Lee or Jackie Chan, they're fight scenes are much slower, and there's no blurriness in the motion. They slowed and perfectly timed their moves to hide the technical limitations of their camera equipment through their acting, not the other way around! But at some point, demand for faster and faster action rose, to the point it became too fast for the cinema cameras. But instead of upgrading the cameras, Hollywood pulled a Jedi mind-trick on people, telling them "You don't want higher framerate and sharp high-speed action scenes", that's what it feels like to me. The reasons Hollywood people bring up against HFR don't make sense to me. Apart from the "soap opera effect", the most notable one I've heard is that it drive storage costs for the movie footage too much. Sure, 48FPS or 72FPS would double or tripple the storage requirement, respecively. But that didn't stop Hollywood from going from 1080p to 4K (which is a good improvement), *quadrupling* the amount of storage space required and now they go from 4K to 8K (for diminishing returns, imo), quadrupling the storage requirements again! Also HDR adds a significant increase, as well. So storage costs clearly can't be the issue! I agree that for animated movies, it's not an issue, they're fine at 24Hz. The issues I have with 24Hz movies don't apply to animated movies, because there's no cameras involved, every frame is a perfectly sharp hand-crafted picture, so there's no blur, unless it was intentionally added and I guess the fact that the drawings, no matter how good, are never photo-realistic prevents my brain from expecting fluid motion and thus getting motion sick (just a theory). I've actually just watched a video about why interlaced 60Hz animations look terrible compared to the original 24Hz animations it makes total sense to me. the tl:dr is that "movement" is carefully timed to give it purpose and expression, and the AI interlacing programs destroy that, making the animations feel bland and lifeless, because Computers can't emulate the imperfection and irregularity of real life. That also means, if an animator was to animate something at 60FPS natively, hand-crafting 60 frames per second, it would take awful lot of time, but it would look just fine. It's not the framerate, but the way it's achieved, that makes the difference in animated movies. This is the video I'm talking about: ua-cam.com/video/_KRb_qV9P4g/v-deo.html
I didn't really get that comment, does he mean you should stop trying to determine the resolution equivalency of imax? like why bother releasing an old imax in 4k when we can do it in 16k in a decade?
@@jonaboy3 I think he was saying you should give up trying to determine the resolution in order to scan it -- because the resolution is just way too high to edit digitally (and maybe to capture to begin with). Editing 8K video is already too much for many prosumer-level computers. 16K would be four times that! Most computers would just choke.
@@GregConquest 8k is alrdy the standard for many pro producers even UA-cam, see linus techtips as an example. So doing even higher res-es(?) on scanning nowadays for those deeeeeep pocket enterprises shouldn't be a problem at all... Whether or not its worth it (due to the fact that why would you produce 16 or 32k video now that ppl wont be able to consume properly/or give you a decent advantage over other production methods) acting as a limlting factor for such endeavor being accomplished at all, is a different matter
@@TheLiasas Pro producers don't use prosumer-level computers. And I was explaining why 16K would be prohibitively expensive. You missed the point on both.
Regarging early tv: I think the most amazing thing was that they would shoot live TV-shows live for the east coast, radio the live signal to the west coast on a non-standard frequency, film the broadcast, develop the film and then broadcast a projection of the film live to the west coast because they had no way of time shifting otherwise. TV Studios would use more film than actual hollywood at the time.
Well in addition to AT&T microwave links, AT&T also used coax cables. Yes there were coax cables running hundreds of miles across the country tapping off in each major city. A single cable could carry hundreds of phone calls or one TV signal. So you can imagine the hourly expense to rent time on the cable or microwave link.
@@andydelle4509 In the old "Bell Telephone" days the phone was the ONLY utility we had that never failed. Electricity,Gas,Water all had occasional outages, But from the 60's to the '80s I don't remember a single time our phone "went out"! it was a seriously over built sturdy system.
”Photography seems like a very trivial thing today but it wasn’t so easy for the vast majority of its history. The goal of this video isn’t to go through the entire history of photography” That’s not very Technology Connections of you “so we’re gonna fast forward to about 1900” Ah, there it is.
Since you mentioned Monty Python, they once did a sketch where the characters suddenly realized the building was "surrounded by film". They kept trying to escape, but every time it cut to an exterior shot, they were on film and retreated back inside in panic.
On a related note: Since you have been making videos for 15 years, I have noticed that your early videos, and many other early UA-cam videos, are currently not available at resolutions greater than 240p, but if they were originally filmed at a higher resolution, could they be re-published at a resolution more like the one at which originally filmed? I also wonder if the practice of filming past content at a higher resolution than it was shown on television was more common in Germany (and maybe other Germanic countries like the Netherlands) than in the English-speaking world given than these people and known for planning ahead.
@@Myrtone If the source video originally was a higher resolution, or a higher resolution version can be made, one could publish a new version of the video in higher resolutions. The issue is that UA-cam won't let you alter an existing video post with a different video. This is to prevent fraud; someone posting a video that goes viral and then changing the video to an ad to capitalize on all the views, but it means that the only way to do a higher resolution is to make a new video post. So people can either delete the original video and post a new version, or leave the old video up and post a new version. The issue with the first one is that you'll break anyone who previously linked to that specific video (think of all the rick-rolls that would be broken if that original video was deleted). You'll also lose all the views associated with that original video, which could affect your visibility. As for leaving the old one up and posting a new version, that's actually what happened with the Wham! Last Christmas video. The original was posted 10 years ago and has 422 Million views, and the new 4K version has 8.7 million views: ua-cam.com/video/E8gmARGvPlI/v-deo.html ua-cam.com/video/bwNV7TAWN3M/v-deo.html Sometimes you'll see people post links in descriptions, or add cards that link to the newer, better version. However, even then the new version may not get many views. I think the Wham! video is getting so many due to it being a perennial Christmas classic song, and people are primed to want to view the video this time of year, even though it's so old. For most older UA-cam videos, even if it's only a decade old, most people want to see something new instead of something they've already watched, but in HD.
Film shoots were incredibly expensive. So when shooting professionally, there was a budget for your stock and how many feet of film was going to be developed. This required more forethought, planning, and actors who could nail their lines in a minimum number of takes. If you were shooting excessive numbers of takes or angles, you probably would go over budget. Today, storage is dirt cheap and getting a viewable image is instant... rather than putting all your hopes and prayers that what you shot was on a good batch of film and nobody screwed up the process along the way. The same holds true, to some degree, with music. It was expensive to record music. It was expensive to shoot film. But here we are... people making videos of what they ate for lunch.
that's what makes Queen's concert in Hungary in 1986 really impressive. the whole concert was shot in 35mm film with 16 different cameras, and it was all thanks to the hungarian government. they hired the best cameramen in the country. and it's why we have the full remastered concert in HD on blu-ray right now
And at least back in the day, there was a infrastructure for it. If a movie had a decent budget, it was pretty common to have a filmlab on set so that the film could be processed within a day and the director could just look at shot the next to determine if they needed to do reshoot it or not. Plus with that budget, it wouldn't have too difficult to order for more rolls quickly. And even on a low budget, there wasn't too much of a trouble to find a filmlab in your local area that could process your movie quickly. Now filmlabs are rare and expensive so you need to put extra money to process your film. And even if a movie has a big budget, the infrastructure is gone so getting more rolls of film is harder, more expensive and chances are that you can't have a filmlab on set.
Until HDSLRs and DOF adapters came along, depth of field was the easiest way to spot that something was shot on 35mm or larger film. If you had a terrible film to video transfer the dynamic range could be horrid.
Today I learned the "soap opera" effect is a real thing. Growing up, I always noticed this... odd fluidity with soap operas. They all flowed so smoothly, and everyone I tried explaining it to didn't understand. I thought I was crazy until this video.
and today we have TV's that by default "upgrade" all inputs to have the soap opera effect. I sometimes find it painful to watch other peoples TV's. Especially when that is turned on AND they are viewing the image in the wrong aspect radio and for some reason dont notice that everyone looks squashed
someone should start a streaming service with a similar premise to what GOG did. "We'll stream your old movies and TV that you've forgotten about and aren't selling, a cut of the revenue going to you, and if you let us borrow your film masters we'll cast it to 4K at no cost to you!"
The point of GOG is that you own the games that you buy, so streaming isn't really similar to GOG. Anyway, scanning, cleaning the scans and redoing audio is a VERY expensive process. For this reason, this idea can't really happen.
The Criterion Channel, it's like the Netflix of older 4k film transfers. They usually try to collaborate with the directors or cinematographers of various films to make sure the quality matches their vision and final cut of the other released formats and theatrical cuts. You can also buy their Blu-Rays from The Criterion Collection, which they jam-pack with special features, commentaries, and other behind the scenes stuff. Other companies worth mentioning too: Arrow Video, Eureka, and BFI.
That Apollo 11 documentary that came out last year had some absolutely AMAZING looking shots from the 60's, and not just stuff like the launches. Much of the footage from "behind the scenes" looks better than anything released today, it is so much more natural looking it's like you're actually there...it actually feels odd at first. It's also a great example that (assuming the proper specs) digital technology can recreate the original analog signal perfectly, both video and audio.
My father also got some digitalized very old pictures in high resolution of the place where I live but like from the 60s, 70s and whatever all in black and white and they have incredible amount of detail that I can zoom a lot on stuff with great quality, I'm guessing the original government photos were also in a big size. It literally took decades for the digital camera to get anywhere close but at least nowadays almost everything even cheap has OK quality greatfully and good stuff has amazing quality.
As a photographer who uses mostly film, I get a kick out of people who are surprised how beautiful and rich film can look. (Not all the time, but if its processed, stored and displayed properly.)
using film in the last decade, for stills photography, is all but redundant. 50megapixel + cameras are easy enough to get and so much more editable. 20mpx is probably more than film can practically match anyway.
@@southerncharity7928 It's not redundant, because it has a different look to it that's almost impossible to recreate digitally. You can get fairly close, but it is still detectable most of the time.
@@southerncharity7928 While modern high-end digital photography is undoubtably superior in terms of quality, there's something about film that just can't be recreated. It's almost like it a 2.5D image with a tiny layer of depth that you just can't get out of digital.
"It looks like it was shot on the tape" I can totally feel that sentence... Though we cant techincally explain the difference... Our eyes do catch it ! Perfectly explained.
While on the surface the sentence is a tautology - floor is obviously made out of floor, it's true - tape has a look to it that film doesn't, and vice versa.
It is a perfect example of how our perception is split into several part. The final processed image mostly removes the difference, since it is not true information that needs to be presented to our brain directly. But the imperfections are recorded and analyzed by our brain too, and if needed, it can tell, this is probably taped or filmed or whatever, but in most case, it is just discarded as irrelevant information.
We CAN technically explain the difference. It's motion blur associated with 25-30 FPS and lack of it in 60 FPS. It's really obvious when once you understand it: ua-cam.com/video/oaaBOu086Xg/v-deo.html
Wow. I’m a (retired) adjunct professor of photography and your explanation of visual recording media was perfection! Your jumps back and forth between film to tape to digital was concise and specific - and coherent and even accessible! Amazing. Brilliant. And… thank you!
Here I am, at 1AM watching a video on stuff that I mostly already know, AND YET how could I resist? This guy could read the Cleveland phonebook and he would have my undivided attention.
You're so right. The way he explains things in technical and layman's terms at the same time make his videos so great and the topics are great. I always learn something
Star trek TNG was mastered on tape. To create the HD version you're talking about, they had to dig out the shots, re-edit it together, and redo the special effects from scratch digitally. My understanding is they are currently in the process of doing just this to voyager now.
Gregory Parsons Not possible with Voyager, apart from maybe the first couple of seasons. Voyager went to using CGI and that was all rendered to tape, not film. So unlike TNG, where the effects were shot to film and then composited onto tape, there are no film elements for Voyager after the first couple of seasons, other than the live action shots. They’d have to recreate all of the CGI for Voyager, which would cost millions of dollars. Not gonna happen.
another commenter pointed out that apparently the remake for Voyager is cancelled due to an unfortunate lack of demand, apparently other star trek remasters didn't reach sales expectations.
@@sunspot42 it could, a few million dollars to make millions. All of the non filmed effects of TNG had to be redone (lasers, matt paintings, compositing) at the cost of millions. They intended to do it for voyager too, but it seems the blurays of TNG didn't sell well enough. 💶
@@lonesnark Let's be clear - virtually all of TNG's effects were filmed, along with the live action. The effects were composited onto video back in the day. In order to produce the HD remaster, they had to scan in all of those film elements - live action and effects shots - and recomposite as needed. Some effects shots would have half a dozen or more film passes, so this was a considerable effort, but virtually all the bits and pieces existed on film. For Voyager that simply isn't the case. After the first couple of seasons the effects were done direct to video via CGI. There are no film elements. All of those effects would have to be re-rendered in HD, and I suspect many of the original CGI elements don't exist in any form - they'd have to be recreated from the ground up. That would cost well in excess of ten million dollars for the entire series. There's no way Paramount is gonna blow that kind of money on Voyager - they'd never recoup their investment. Heck, they didn't make back the money they blew on the far simpler HD remaster of TNG.
@@sunspot42 TNG Effects had to be partially re-done digitally as well since they also were only done on the Video Level. But yeah model shots would be an entire different thing. Also why we probably won't be getting any HD release of Babylon 5 any time soon.
Some episodes of “The Twilight Zone” look very, very different from others, and I think it’s because at some point in production they switched from film to videotape. I’m only guessing that that is the reason, though. Edit: Wikipedia confirms that 6 episodes of season 2 (1960) (The Lateness of The Hour, The Night of The Meek, The Whole Truth, Twenty-Two, Static, and Long Distance Call) were recorded on videotape as a cost-cutting action, but because it didn’t cut costs enough and it was nearly impossible to edit, it was never repeated.
Also old videocon imaging tubes back then had serious problems with contrasting creating a need for different elaborate lighting and makeup techniques to get a decent product. All time consuming and costly. Not really cheeper to produce a vastly inferior less durable product.
@@michaelshultz2540 When the Twilight Zone were done on tape, the video cameras used "image orthicon" pick up tubes which produced dark halos when photographing bright objects such as candles. The vidicon tubes that came later ('70s) eliminated this problem..
@@TheOzthewiz I think those tubes only really started to get traction in the 80s, as many music videos and even live broadcast sports games (such as night games of US baseball) are clearly the older type. In fact I think even some footage from the 90s has a bit of the effect.
One of my favourite examples of film vs tape was Doctor Who. Most if not all serials of classic Doctor Who as mentioned were shot on tape in the studio and film on location, like most shows in the 70s. But with the 3rd Doctor's first serial "Spearhead From Space" because of a strike at the BBC, the entire serial was shot on location with 16mm film and because of that, it was the first classic serial released on Blu-Ray.
I believe in the days when this mattered the BBC considered 16mm equivalent to SD and 35mm equivalent to HD. this was before 4k was a thing, so thanks to them for doing a thin they didn't see a point in doing on paper. the Spearhead from Space blu ray shows you could get an HD image from 16mm
On a side note. A small theater near where I grew up continued using a film projector until about a year ago. I was pretty cool going into that back room to see the projector. That thing was a work of art. I'm more surprised that the major studios still sell film versions of their productions.
We had a little local theater near us that used film until 4 or 5 years ago. I used to go there just for the novelty of watching a movie on actual film. Then they upgraded to digital. And not a great quality projector at that, because I was able to see the pixels sometimes. I ended up just buying my own video projector and now I watch movies at home. No point in paying high prices just to watch someone else do video when I can do it myself. Film had a novelty to it, and there was more cost/effort to produce the physical media & project it. Video I can watch at home, and it's cheaper for me in the long run.
@@chuckheider9938 That's what the one near me did. I haven't been there since they "upgraded" the projector so I can't speak on the quality. They completely renovated the place as well. It was definitely that old school aspect that drew people in, the tickets were always pretty low priced, but the movies didn't come out until about 1 1/2 months after they debuted.
18:53 Ok, I know I'm 4000 comments late to the discussion. BUT, as my dad is a huge 16mm collector (probably has close to 1000 films at this point- and if you ever want to cover the original Kodacolor lenticular color film, let me know!), 720p is not enough for 16mm film, unless you're talking about the EARLY 16mm stuff, pre-1930. (16mm film released in 1925, but you just do not find any film from 1925 for some reason. I don't know why, despite having a camera, possibly two released in 25). 16mm scans should be 1080p minimum, with 2K resolution (i.e. 1440p) being the best "mid-range" option. (Do note that 16mm film scanners for the home are still stupid expensive, with the cheapest being about $5000). But really if you see this, please let me know. I'd love to help out with a video on the first consumer motion-picture format, and I can give you access to SOOOOO MANY resources. (I know you said do the best you can, just wanna clarify. Also, at 19:13, 16mm films degrade A LOT- particularly the earliest kodachrome from 35/36, and almost all ektachrome from the 60s-80s. Turns either red or purple. And this is totally ignoring vinegar syndrome)
@@emeryththeman man, great job breaking it down in depth! Personally I think 4K is optimal for 16mm transfers, I feel like someone who says 720p hasn't actually seen 16mm film in person. That should capture most of the grain and make it actually look as it does from a projector.
Yeah. I agree. I've seen higher quality 16mm and there is even a difference between 2K and 4K scan. Although technicians when doing back of the napkin calculations say it was about 3K - for good quality 16mm film. 35mm is more like 6-8K if not higher (there were very good quality films used sometimes). 70mm is probably 16K range at least. Although I haven't seen personally scans of those (and only 1 or 2 of 35mm) so I don't feel like an expert. I am pretty sure the Wham! video could be released in 8K - but that's a bit too high for UA-cam. Or at least was.
The Walking Dead & The Shield are good examples of how good 16mm film can look. Both of them wanted grittier looking shows and presumably the cost benefits of shooting on 16mm instead of 35mm.
Here in Australia we had an extra issue with the quality of taped shows. Because US programs were NTSC and the Australian TV standard was PAL, the conversion would make US taped shows look mushy. These were usually taped studio shows like All in the Family or all the soap operas. Filmed shows, like say, The Addams Family and Hogan's Heroes looked much better. Australian taped shows, like the studio bits of Division 4 and variety shows like the Paul Hogan show, were also much sharper and cleaner than US taped shows. (Those Australian shows also used film for outdoor shots, and that was noticeable in the same way as your Monty Python example.) I was amazed when I visited the US to see how taped TV shows looked so much sharper when broadcast in their original NTSC formats. Until then I thought American TV was really poorly made, and not degraded in the broadcast format conversion.
Same here in south Africa, for the same reason. However a lot of the home made content was shot on 8mm and 16mm film, because the stock cost was a lot less than 35mm, leading to the film actually being equal to the broadcast video, though you can see the old broadcast tapes ageing with the reruns, they are a lot softer than any telecine conversion, even using the old hopping patch telecines that they used to have. As well a lot of the archives are from home recorders, as the broadcasters would recycle tapes a lot, as it was expensive, and nobody was really archiving anything other than the film stock.
@@rty1955 Of course! Now I understand! That's why my 24megapixel Stills camera images with their lousy 0fps framerate look so mushy compared to the crystal clear images from my 1megapixel, 15fps 10 year old webcam! Thanks.
This is spoofed in the UK TV series "The Day Today" where the news reports from American channel CBN are processed to exaggerate the effects of poor NTSC to PAL conversion (examples clips can be found on UA-cam if you search for "the day today barbara wintergreen").
"And we were all having fun watching these amazing movies at the cinema, enjoying our popcorn, socializing with the neighbours" This line hits different in 2020
@@AzathothsAlarmClock See 2001 on a proper large cinema screen (preferably watch the 4K restoration by Christopher Nolan) and you won't fall asleep. It was never intended to be seen on a small screen, you really need to see it on a huge screen that fills your entire field of vision - when you do that, suddenly it all zips by.
The Prisoner transfer for Bluray was awesome. One of the downsides here, is that even when they did use film, sometimes you didn't notice how out of focus things were. And sometimes they do a better job of correcting for noise than others. I never noticed how out of focus Ghostbusters was, but they made it worse by not properly accounting for digital noise over the top of that. It's why it's often best to scan the same image multiple times as the noise and sampling errors will tend to vary a bit, allowing for a cleaner scan.
@@djsherz That's not quite the reference I was making, but if the writer of that line were hinting at that I would be A) not surprised and B) a little blown away by the subtlety there!
@@TechnologyConnections I believe you may have been referencing the early rationale for television i.e. learning foreign languages. Well, we got PBS as well as the Gong Show so it's all good.
When I got into broadcast TV in 1976, we used 2" quad tape machines to record sporting events and save highlights. Our instant replay/slo-mo machine was a set of polished discs that could record up to 30 seconds. The replays we wanted to save were immediately transferred to the 2" tape machine to free up the disc recorder for another replay. We had to rush to get the transfer done, and often would miss the next hilite if it happened before we got the disc recorder back into record. The entire machine was contained in 3 heavy, metal boxes that stacked one on top of the other. We had one tech assigned just to that machine to keep it in working order throughout the broadcast. It was always "fouling", requiring re-polishing of the discs. It was a fiddly, quirky, and maintenance-intensive device that was as big as a small wardrobe/armoire and weighed hundreds of pounds. It was the Ampex HS-100. I bet you'd like to investigate it! Here's a link: ua-cam.com/video/cW7jvmoLQ7o/v-deo.html
Ah, the days of Big Iron. I volunteered at the local cable TV center (back when community TV production was a thing) in high school in the late 80s and got to play around with honkin' big switchers and cameras on shoots. These were the days when movie production companies would send out packets with 8x10 prints from the movie for use in movie reviews, and you "digitized" them by putting them on a rostrum and pointing a camera at the things.
I really appreciate that Alec is willing to put his gaffs at the end. Lol makes for a little chuckle and shows he isn’t afraid to look anything other than perfect. I’m really starting to like this fellow.
Im glad someone explains why some shows had the "soap opera effect." I always have to explain to people why vhs videos on youtube are sometimes in 60fps.
I don’t know how it’s so hard for people to understand.. me and my sister noticed and figured it out when we were kids. I gotta say tho it still surprises me seeing really high quality old video
Kenneth Branagh's "Murder On The Orient Express" from 2017 was shot on 65mm FIlm. The first time I watched it I almost instantly start scratching my head about the look of it, so I looked it up. It is so clearly not digital and GORGEOUS. I love filming digital but Film just has a look you can't recreate. (I edited to correct, as I had said 70mm when it was actually 65mm. A commenter Michael Parente pointed this out in the comments.)
leave it to Kenneth Branagh to do something that sounds pretentious but looks or is done so good you forgive him pulling a pretentious show (Like his 4hr take on Hamlet, such a good interpretation but FOUR HOURS????)
If you can see a movie shot and projected in 70mm, absolutely do it. I've seen both 2001 and Patton in that format at a local film festival (in a proper old theater that seats 1600 with one freaking enormous screen), and they looked amazing.
No news there. Most high production movies are not filmed digitally. What you were noticing was probably a creative lens choose or maybe they staged the shots to use more of the cameras FOV(less zooming-in in post).
Doctor Who would've been a good example to use since it uses both film and tape. What I always found an amusing artefact of the video half of the production was the way bright lights would swear across the screen when moving.
@@gussyt1761 The next story, not the next episode. Spearhead from Space is a story made up of four episodes. Yes, I know they were shown as compilation edits in the USA, but Spearhead from Space is not a Doctor Who episode, it's a Doctor Who story :)
@@Takeshi357 That is called comet-tailing and was an artefact of the imaging tubes in the cameras. It was greatly reduced with advances in tube camera technology, and then eliminated altogether when cameras switched to using CCDs rather than image tubes.
Been rewatching The Waltons from the 1970's, it was shot on 35mm film and looks gorgeous on my 4k tv without any goofy anamorphic distortion. An amazing amount of detail put into those sets by Lorimar/Warner Bros. Also another way to tell if a tv show was tape or film is an announcer may say in the end credits something like: "All in the Family was videotaped before a live studio audience" or the credits will read to the effect: "Videotaped at Bueno Vista studios Burbank CA" or it may credit Panavision, Kodak & etc. if on film.
My mind was not blown by that Last christmas remaster, because I bought West Side Story, The Wizard of Oz and Robocop on blu-ray. The image quality is so good, they look like they were filmed in the 2000's
You'll occasionally get some color weirdness from old film samples, but it's nothing a good mastering studio can't fix with time and money. I just feel really bad for the actors filming for Technicolor films, because the first couple generations required the set to be dramatically overlit to get adequate exposure. Those lights get really damn hot and make the entire set feel like a large toaster oven.
The Wizard of Oz was even a special case since it was shot on early multistrip Technicolor which was shot on basically on 3 individual strips of film. One for each colour. And that was then layered to a single strip colour print. When they scanned the camera negatives they had to do it once for every colour. And that also helped to eliminate dust and dirt a lot easier since the image restoration software they used compared all 3 pictures of all 3 film-strips for every frame and if there was a speck of dust on only ONE of them but not on the other two, it removed that speck of dust and made it look like the other two.
I wonder if they will ever re-master the Thriller music video. It's very unusual for music videos to be shot on film in the first place, and when they did, it was often a promotional project from a film director who was already in the industry. Motion picture film stock is EXPENSIVE so it's only used for projects which have a very high return on investment.
On a somewhat related subject, Rolling Stone did a reprint of the John Lennon cover photo, taken in 1980. One of my younger co-workers was shocked to learn that was a 1980 photo. The quality was so sharp one could see the pores in his skin, along with the vivid color!
There are high resolution scans of Playboy Centerfolds of the 1980s. The level of detail is absolutely amazing. Those have been shot using Hasselblads back then, if I remember correctly.
@@splashstrike It was almost certainly was at least Medium Format (120/220 roll film). Professional photographers used 35mm for "field" work where portability was important (I.E. photo journalism). Fashion Photography was dominated by Medium Format, Portrait work was also dome at even larger formats such as 4"x 5" (or bigger!) cut sheet film using a large "view camera". P.S. There ARE Medium format digital cameras today. But they cost as much as a decent used car.
This guy just taught me exactly how photography and television works concisely in a way that I understood completely, all in the space of 5 minutes...I actually felt myself getting smarter 😂 you've earned yourself a subscriber my friend.
Re: "16mm only needs around 720p": I've recently looked into scanning 8mm Standard/Super and 16mm and found that even 8mm film can benefit from 4k scans! Incredibly fiddly work to get the best out of these smaller formats, but I was blown away by enterprise scanning solutions and how high-def they look.
The results also depends on the ISO of the film.. Low ISO gives better resolution, but more light is needed. A 25 ISO film need to stops in better lighting than 100 ISO.. Full stops from 400 ISO 400 200 100 50 25 12 6
Very true. I would say that the pseudo-random nature of grain makes it act a bit like dithering in audio, in that the data shared between several frames in the noise actually adds a lot of detail. Therefore, to adequately scan a film, I would assume that all the grain has to be present in its fullest form to record the detail properly. Speculation: To go further with my audio analogy, I would imagine it's sort of like the Nyquist sampling theorem in that you'd need double the linear resolution, and therefore you would need to record the square of the actual resolution the film can display (if the film can do 4K, then you'd need to scan it in 16K (I haven't checked the maths)) in order to record it accurately in the digital world.
Yeah, 16mm can go all the way up to 4K. Especially negative film. But even ECO film still looks amazing if scanned properly in higher resolutions than just 1280x720. And keep in mind 16mm was mostly a 4:3 format. So you'd only really benefit from a 16:9 aspect ratio if it was Super16. I say if you CAN get a higher quality scan of a 16mm reel then DO IT. Don't be be cheap and scan your film properly. You can ALWAYS scale down from 4K afterwards anyway at the very end. And yes, the same goes for Super 8, Regular 8 or DS8. Even better if scanned with open gate and really getting the most picture information out of it.
I daresay processing 8mm over resolution could provide a higher resolution by processing a few frames together. The majority of frames are of the same thing with a slight movement - so a moved pixel can be compared a few times and averaged out to a better quality a bit like time interpolation rather than dimensional interpolation.
First, thank you for not Wham!ing anyone who have made it this far into Whamageddon. Second, Star Trek: TNG did have the same issue mentioned about Voyager. All of the scenes were shot on film, then transferred to tape for editing and special effects. There was a demo Blu-ray released showing off a few of the restored episodes as a proof of concept that the SFX could be redone in HD properly before the full seasons were released. From what I recall, not all of the film for a few of the episodes could be found, so those episodes' scenes had to be upconverted and restored from the video tape master. Third, I always enjoy your videos. Looking forward to the next!
I was going to mention this. All 80s/90s Trek was produced the same way. The reason Voyager and DS9 haven't been remastered in HD is largely the time/cost of the work involved, especially in the case of Voyager where a lot more CGI shots were used, all of which is lost now and would need to be redone from scratch. But it was all shot on film, and hopefully still all exists somewhere in their vault.
@@BlobVanDam Enterprise was shot on film, but scanned in 2K because 2001, so the Blu-ray release was a lot less work for that series. But a few of the VFX shots in the early seasons look way softer than the rest. I guess they had to re-render a change in a rush and did so in SD for those shots. I'm not sure if anything other than S4 was actually broadcast in HD, because 2005, but at least they had the foresight to do the scanning and FX at a higher resolution the whole time. I'm so sad about DS9 and Voyager not getting a remaster, especially since they were framed to be wide-screen safe ahead of time. They would look so good on modern TVs! And it wouldn't have to crop the top and bottom off of the frame either, since the video transfer back in the day already did so on all sides. A wide-screen release would just partially un-crop the left and right.
@@kaitlyn__L I recall the CG shots for Enterprise were all done at 720 and upscaled to the 1080 master. I guess full 1080 CG for a TV series was a bit of a stretch for them in the early 2000s. :D Since the CG was already better than SD, it wasn't worth trying to redo it for the Bluray since you're still gaining from it over what people had seen on TV at the time.
What a spectacular channel. What an excellent way to explain standard definition, film, high def, soap opera effect etc. I am overwhelmed by how such a complex subject was simplified as it was. Thank You for another truly great video.
I first became aware of this watching Twilight Zone reruns in the late 1990's and wondering why some episodes looked good and some looked like they were recorded using my family video camera. Turns out they used film but switched over to tape to see if they could save money. They couldn't and went back to film.
My dude, you research SO HARD it's amazing. Thank you for all of your hard work. Such good content, very wow. But seriously, the amount of effort you put into your videos is mind-bogglingly impressive. Thanks for doing what you do!
In this case that's not right. Those videos are enhanced with the help of AI. There are YT-videos showing the result before and after using this technology and believe me, the original (scanned) film has not a resolution of 4k or even close. And laserdisc - btw - has nothing to do with this at all.
Nah, some of the stuff in the 90s was weird, rare, and VERY EXPENSIVE high-res tape recorders. Basically, things that were the precursor to D-Theater (HD VHS). The most viewed "New York in 199X in HD" is shot on tape, it's at 60i. It also has the kinda distinctive videotape color gamut. Also, while the other ones are all on film, a lot are AI upscaled.
Not in all cases! Techmoan has done an excellent video on D theater (HD VHS Tapes) and he posted one of those HD NYC from the 90’s videos on his second channel. I highly reccomended checking out the explanation!
I remember watches loads of shows as a kid and noticing the difference between things shot inside and out. No adult I knew had an answer, but I suspected it was something like this. What an awesome video. Thanks a lot.
Basically, most music video shot on film even in the eighties because musicians didn't like the cheap soap-opera effect of video, and/or wanted to make their clips cinematic. If the music video did got shot on tape that was because the band or publisher was on a shoestring budget, or it was originally a live performance somewhere repurposed as a music video.
I'd argue 16mm can look good at 1080p, I own the sole original Doctor Who serial shot entirely on film, Spearhead From Space in blu-ray format and while yes it has a definite grain to it it still looks very good at that level of scanning. Its an excellent preservation level transfer.
Another thing about motion film - because the grain is random you can get vastly higher resolution by interpolating multiple frames, so there would even be a point at scanning at FAR higher than 8k
Rank, one of the film scanner manufauctuers, said they estimate standard 35mm can resolve about 6K but the optics required to put that much information onto said film are almost impossible. MaxiVision, a short-lived film projection system, said Geneva movement projectors (most projectors), even with ideal lenses, only resolved about 2K because the film flexed so much going through the gate and that simply using a better movement can bring it up to 4K. Even Super-16mm with a top-quality lens can yeild almost full-HD resolution but without the annoying motion artifacts of most current cameras.
@@stephenbaldassarre2289 those numbers may be true for a single frame, but the grain in each frame is in a different place, so across several frames there would be a point to using MUCH higher resolution
@@mycosys That's true to an extent but you are also limited by the uniform scattering of light through the emulsion itself. Then you really start splitting hairs because only one color layer (blue) can achieve full sharpness. Modern emulsions are so thick in the name of low-grain, you'd be lucky to maintain 50% MTF even at 4K, regardless of frame rate. Don't get me wrong, I love film and really miss shooting on it, but there's a lot of factors involved beyond the photochemical particles themselves.
I'm always impressed how good the 4K version of "2001: A space odyssey" looks. Even how sharp the video on the 1960 IBM tablet computer flatscreen looks
@@joesterling4299 It was a good looking movie, but even the critics said it was sooooo borring. That is until Star Trek TMP came along, which was sooooo sooooo borrring.
This was so cool for me, thank you! When I was young, at some point I started to notice that some shows looked "different" from other shows. I didn't have any understanding of resolution or those concepts, I just always knew that there were some shows (like MASH or Star Trek) that looked one way, and other shows (like The Jeffersons or morning soap operas) that looked another way. I couldn't describe it, there was just something in how they looked that felt different. Over time, from simple process of elimination, I started to work out some patterns, like the fact that movies shown on TV always had that first look, and TV only sitcoms often had that second look. Some shows (like Dr. Who) looked the first way for any outdoor scenes, and the second way for indoor scenes. I don't know how old I was when I figured out that the difference was that that first "look" was things shot on film, and the second "look" was things shot on video. I felt proud of my deduction skills and moved on. But I never actually understood why they looked different, what was that indescribable quality that made one feel different from the other. So watching this video brought me right back, and finished the mental journey I had started... a good number of years ago, but this time with some actual science. Understanding how the random, natural orientation of the silver particles on film lend it a sort of underlying texture that differs from the concrete, geometric pixels of video makes so much sense, and puts words and meaning to something that my younger self just felt instinctively. This took me back, and I really appreciate that. Very glad I found this channel.
If you're old enough to remember the phrase "Film at Eleven" on the 6 o'clock news it was as follows: The news anchor would give as much detailed information about an event happening in the field but the actual film wouldn't be processed until it was back in the studio and then shown during the 11:00 news.
One good example of film vs. tape: The Twilight Zone Season 2, and the handful of episodes CBS shot at Television City on tape to "save money". Forever stuck in 480i, with all the fun side effects the Image Orthicon tubes created.
That's the example I always pull out when I talk about the "feel" of film vs. videotape. Those videotaped episodes almost feel like they're not the same series. It just feels wrong. The recent live stagings of Norman Lear shows on ABC also felt very wrong because those series were originally videotaped. I could go on and on with gripes about framerates. Don't get me wrong, I love videotape (and even tube camera effects), but I also am very sensitive to the "look and feel" of something.
This is the first thing that came to mind the moment he mentioned the soap opera effect! Could never put my finger on what made some of those episodes feel so weird
The TZ tape/film comparison shows another strong difference, which is the tonal rendition. Film has an S-curve of contrast, where mid-tones are most contrasty, and contrast gradually reduces as you go into the highlights or lowlights. The old black and white image orthicon video cameras had no precise tone curve correction, only effects similar to early Xerox copiers. Later image orthicon color cameras combined with CRT (picture tube) displays had essentially the shadow part of an S-curve, but highlights would go straight up to maximum and then clip. Later plumbicon video cameras had much better tone curves, but still were usually pretty crude in handling highlights. Today's digital cameras, both still and cinema, with the post processing that produces the final output, emulate film's S-curve and do it quite well.
Ya know, I’ve seen the SciFi channel broadcast what has to be the original Ampex video recordings of those few taped shows. For decades in syndication they must have only ever shown the Telecine (filmed TV screen) versions of those episodes. But what I saw was definitely the actual videotape because Serling was on the set when he did the intro, and when the camera panned over from the actors to him the ‘soap opera effect’ was obvious. In fact, the SciFi channel seems to show both versions of these episodes, Telecine & videotape, for some reason...
Another reason for using film was that it was easier to produce a good quality PAL version for export. Most of the American shows taped on NTSC equipment looked somewhat blurry when it was converted to PAL.
Yeah thank god for PAL without the need to add "dropped frame" bullcrap. Especially if you shot at 25fps (like many TV productions) it was a lot easier to produce a 50i picture from that filmsource.
KRAFTWERK2K6 It ain’t perfect though, films had to be sped up by 4% to match the 25fps signal whereas NTSC had to create a 5th frame via 3:2 pulldown. Seems more like a “pick your poison” situation imo.
@@charlescampuz5812 Only films shot that were shot in 24fps. TV Productions were often shot on 25fps. And i think even some theatrical features as well as it was fairly simple to convert that to 24fps theatrical prints. So the movies where not sped up on TV and video. And TV shows often were converted from NTSC 29fps sources to PAL which resulted in weird blurr and field ghosting. YOu could always tell when it was a US Series or a show from PAL territories. The pal Pitch problem got much worse in the late 90s and early 2000s when digital format conversion became a thing and everything was badly converted to PAL. Extremely noticeable in the later seasons of "Friends" over here where the opening music suddenly run a little pitched. The problem was enhanced when the dubbing was actually done to these already converted 25fps masters which means on Blu-ray all of that now runs 4% too slow while on TV it runs ok.
@@KRAFTWERK2K6 most movies broadcast or sold in PAL/SECAM regions were sped up to 25fps, saving on any conversion costs. There was a noticeable pitch difference, but only if put side-by-side. Things got weird when they started using time compression in the mid-90s...
film-grain: the director's cut DVD of Star Trek The Motion Picture contained CGI additions/replacements that were put through a grain filter (based on analysis of the grain structure of the original stock) to better blend in with the original material.
That's great to know. A lot of modern TV and movie shows, even though shot digitally with practically zero noise, have grain added to them just to give them a more filmy look, rather than artificial, plasticky shiny-and-new look.
@@silkwesir1444 Apparently there was a UHD (and Blu-ray) release of the Director's Cut in 2022? TIL. I'm definitely going to need to get my hands on that; the DVD is one of the very few DVDs I bought after upgrading to Blu-ray a decade ago.
I think one way to elaborate on "You can tell its video because it looks like video." Dynamic range is a good test - and its something you still have to have an eye for, but video productions usually aren't as vibrant as stuff shot on film because the dynamic range they can capture is smaller. Instead of having perfectly saturated blue sky, it might look more overcast even when it wasn't. Especially true when we're sticking to the "up to the late 90s" era mentioned in the video. It certainly gets murkier with super high end large format digital cinema cameras from Arri/Red since their relatively huge pixels have increased dynamic range too, but for generic betacam SP video cameras vs actual 35 or even 16mm film, the dynamic range is a big tell.
here's an image that compare the dynamic range of digital camera to film camera: techxplore.com/news/2015-10-quantumfilm-based-image-sensors-cameras.html
I agree, for now. I was surprised he didnt say "for now" though. 16k scanning will be a thing one day. It wont be needed for the home TV, but VR one day can benefit
As a film photographer i can say that yeah, IMAX will definitely need at least 2 decades of hardware improvements before becoming obsolete. Even 70mm is easily 8k worthy, if not higher depending on film stock.
The film/video difference was something I always noticed when watching Doctor Who as a kid. Whenever Tom Baker was in the Tardis, it always looked different from when he was trouncing around a rock qua-alien planet.
I'm honestly proud to say that I finally have enough expertise that I knew almost all of the information in a Technology Connections video! lol. Lots of great information here and as a film photographer this is definitely going to be THE video to which I recommend people when asked what the benefits of shooting with film are, or why I like the look, etc. It's the perfect one stop, inclusive video for all the information I usually stumble around when describing it to people.
I've got 10,000 slides that my father took in the 60's to 90's. I was early in the scanning game. Who would need greater resolution than 900x600 pixels? That's already larger than your monitor! The problem is that slide scanners peaked around 2005 when digital photography wiped out film. There are no cheap/good/new slide scanners! Kodachrome has incredible resolution if you have the [very expensive] scanner for it -- and the time to scan slides one at a time.
From what you are saying this seems like a real problem. Hopefully some kind of enthusiast made solution will come out in the future, a kin to floppy to SD card reader for the retro computer community.
One thing about Voyager's post production: I'd guess that post production included CGI scenes and hoo boy would I not envy the guy whose job it'd be to find the original files on some Amiga or SGI's hard drive and attempt to re-render them at higher definitions.
It's worse than that. They typically re-create the shot from scratch. Even if they did have the CGI files, they would be of little use as there were no real standards for that data. Most of the hardware used in the 80s and 90 was SGI, Silicon Graphics Inc, basically defunct as of 2009.
The 2k remaster of The Next Generation was hugely expensive and all the CGI had to be re-done from scratch. If you watch the final result it is absolutely stunning. However, DS9 and Voyager used much more CGI and the original digital assets have since been lost: www.treknews.net/2017/02/02/why-ds9-voyager-not-on-blu-ray-hd/
Andy Delle not so bad actually, it was made in on SGI hardware, but when SGI went under the software was ported to windows and mac, and it’s possible to import from older versions into newer versions. Someone who still had the assets opened some of them up and hit “render” in HD and said it would be a lot easier than the Next Gen HD remaster, (where all the assets were lost.) a few years ago. The images are online.
Haha working in tape since it was invented we never used the term "footage" when referring to tape. We called it simply a recording, segment or clip (for a short piece). Because both film and tape could be physically cut u could call them clips. I used to edit by actually cutting tape
@OP Yeah, probably because video is straining its dynamic range slightly, giving it more contrast at the same settings. Film has more dynamic range, letting you "fit it all into video" or "flatten" the brightness values
In the 1990s, a lot of shows were produced on film because they were aware of the impending rise of HDTV. They knew a higher standard would be coming, they just didn't know the exact specifications. And film was the easiest way to get an HD product. This is why Warners did a lot of their television animation on film until around 1999-2000 (Batman Beyond switched from cel to digital ink and paint at this time). Basically, with film, you could rescan the negative to get a higher quality, but you couldn't do that if you just had a video file. In general, most dramas in the US were shot on film because it gave them a more cinematic look, since tape tended to look really cheap. Tape also wasn't as good at capturing color, and had a lower dynamic range. For sitcoms, there's more of a spread, but it came down to network preferences and the production houses. But still, there were example like Cheers, where they stuck with film even after considering switching to videotape. During the first season, Paramount wanted to consider shooting on video to save money, since film is expensive and the show was dead last in the ratings. But the results were terrible, so they stuck with film. And now, we have Norm in glorious HD. kenlevine.blogspot.com/2016/04/another-thing-about-cheers-you-didnt.html
Unions representing filming crews had a lot to say about it. Video shooting required less qualified workers so they put a lot of pressure on producers to shoot on film.
Seinfeld was shot and mastered on film and there exists HD transfers of it but only for tv rebroadcasts and digital copies. I want a Seinfeld Blu-ray gosh darnit
I am not sure I would recomand the HD versions since while the horizontal screen was expanded, some of the picture on the vertical side was lost because they can't fit that on HD screens fully. The DVDs still preserve what was originally intended even though it is not in HD.
@@1685Violin If they ever release it on Blu, they can offer the original 4:3 versions again. The series was already remastered in HD for the DVDs, if you believe the back of the boxes.
I have tried to explain to people for so long that say HD is better than film. And they just cannot comprehend that the original film was in the best HD it's ever going to achieve
Even if something is shot on film, btw, 8 mm is *_nothing_* like 70 mm film; the latter can be blown up to billboard/cinema screen sizes with literally no apparent loss of quality at normal seeing ranges in most people's eyes (not just the average person).
@@leocomerford Still, in my opinion, 16mm gives a much more professional "low budget" look as opposed to video, at least for movies. Brad Jones calls it "shot on shitteo," though I think that's referring more to movies using home video vhs recorders and not studio video.
I think you are overly harsh on 16mm! 720p isn't nearly even enough to capture Super8 satisfactorily. The exact same phenomenon with the music video can be had with old home movies that were badly scanned. Due to the smaller frame, the grain is of greater significance. At lower resolution scans, the grain interferes with the scan resolution and gives horrid results. People often soften up transfers to reduce the effect and so it looks like the original is of lower quality. Even at 1080, this is still not enough to capture the grain on most stocks. You need to think of transfers in a similar manner to audio, with a Nyquist frequency I feel. A 2k scan of Super8 will ensure you aren't throwing away any of the original. Without including the grain in the scan, the result is like an air-brushed model, it's also why modern digital productions can feel lifeless and less engaging visually. With film, every frame is different obviously. This has been noted and in some cases, special noise is introduced back into the digital image to make it less artificial.
There is also something called CCD noisiness or CCD grain factor for digital cameras. That describes how much is CCD sensor resistable to the grain noise (randomness in input signal). Essentially smaller number means there is less noise in the image. Of course the ISO settings alter the noise amount, however this effect wasn't intentional and it is simply result of amplifying or decreasing the input signal from CCD sensor, which already contains the noise. That's also why so many night photos are so much more grainy and noisy than in daylight. You can reduce the noise in post production, however you still need to leave some noise in order to not make the photo look artificial. Today, the noise can also be simulated in computer graphics, and the end result is so convincing that the rendered image looks like a real photo.
@@CZghost What you're describing isn't specific to CCD sensors, it exists in all digital sensors. And low noise is generally considered highly desirable in a digital camera. The reason photographers don't go crazy with noise reduction isn't that they want to keep the grain, but that heavy noise reduction also kills detail. I'm sure a lot of photographers would gladly throw away any hint of noise if it was possible.
@@23Scadu Photo without noise just doesn't quite look right. You can for sure say that there was some sort of artificial processing done, it looks fake. The noise makes it look real, even it's completely fake. Too much noise kills the details, but very little noise to none also kills the aesthetics. It makes look plastic, artificial, fake. That's also reason why people add noise to photoshopped montages, or at least don't try to make the stocks used super clean of noise. It has to sell it, it has to look real. You can obviously tell it's Photoshop, especially if it's art. But it shouldn't look like montage, it should look like photo, produced with professional camera and maybe with little enhancing effect added to it. Best screen wallpapers are photos. Why? Because there's something that people prefer over artificial stuff. The aesthetic.
There was an ill-fated set of FUNimation DBZ Blu-Rays (the "Level" sets) that re-scanned the 16mm film from Fuji TV, got rid of the gate effect, and lightly cropped any edge degradation. It looked brilliant.
As if... Lol. 8mm looks like 240p. Any wonder since the frame is smaller than your pinky nail. 16mm is ok but definitely ot 720p. Any kid from the 1970s who was watching a documentary in high school science class would be absolutely blown away at 720p. The only thing of the era that comes close are 35mm slides and filmstrips.
A year on and I’ve just got the “learning Ancient Greek and Latin” crack. Apparently the word ‘Television’ caused some offence to some stuffed-shirt classical scholars in the UK at the time.
@@robertjenkins6132 as I alluded, back in the 1920’s at the dawn of Television, the admixing of Greek and Latin to form a new compound word was considered by some to be etymological heresy. Classical Greek and Latin were considered “pure” by some scholars, (a pre-Victorian construct) regardless of the roots of those languages.
@@MorgoUK Funnily enough I know T-Shirts saying "Polyamory is wrong! Never mix greek with latin; it's either Poliphilia or Multiamory." Which of course alludes to stuff like this.
Can we talk about some of the amazing examples of TV shows that were actually filmed on...film? The Sopranos. MAN that show looks magnificent. Not surprised it was done on film. The X-Files. Bit of a mixed batch, but most A-shots were done on film while B-shots were done on tape. That 70's Show. Fuck yes. The E.R. Holy heckins, yes.
Mission Impossible looks fantastic in HD, too good in some scenes. You could see the wires holding up a "drone" in a air ducts in one scene. They wouldn't have been seen on TV.
@@default2826 I once got to the bottom of it, around the time "Dunkirk" came out. As it turned out, an IMAX film potential resolution is somewhere around 18k.
@@nostalium Indeed, although the big problem isn't the resolution... it's the film itself. A full length feature film on IMAX weighs 250 kg (550 pounds).
IMAX is pretty cool, but I'm not sure what he'd really talk about. It works in the same way that 35mm film does, just with a MUCH bigger size of film and it's shot horizontally. I guess he could talk about how the soundtrack is separate from the film print, but I mean, I just talked about it in a UA-cam comment. But if you're still interested, look up a channel called Analog Resurgence, I believe he talked about it. And he's also just a great UA-camr who talks about both movie film and still photographic film.
Nobody in the 80s was routinely saying broadcast TV was fuzzy, at least in the UK. Standard definition (625 lines) on a 14 inch cathode ray tube TV or even a larger one looked okay, it's the huge, hi-res screens people watch TV on today and the tendency to zoom in on a 4:3 picture to make it wide-screen that spoils old TV footage and gives a somewhat false impression of how it appeared to the naked eye in the 80s.
Exactly. I worked on professional VCR's in the 90's and 2000's and a 625-line studio quality image shown on a pro monitor was *incredibly* sharp. Unfortunately some detail always got lost in the transmission chain, mainly due to the way colour information was bandwidth reduced by the PAL system.
Very true, I always get annoyed when stores selling TVs use a standard definition format signal and mirror it to tons of large HD TV's, how can you show the clear image and high definition if you're blowing up 4:3 standard definition? Luckily that kind of thing seemed to be a late 2000's thing and by the mid 2010's most stores who took themselves seriously actually upped their game.
I think more than the basic resolution of the scan, it's the quality of the film scanning equipment, the skills of the person doing the transfer, and the colorist working on it after the scan, that determines the quality of a transfer.
I always remember noticing that shows like "All in the Family" had an immediacy because they were shot on tape, which everyone associated with something news-y, and shows like "Mary Tyler Moore" or "Bob Newhart" were shot on film...the fact that the shows on tape were always (supposedly) shot before a live studio audience, gave the whole thing a cinema verite feel (plus it was cheaper). But reproducing videos at high speed transfer rate gave everything a fuzzy look. I always assumed people knew this stuff automatically, but I forget people who arrived on the planet years after me don't know all the stuff I do from just being around longer. (I'm only halfway through your video but I had to stop and comment. I do this to channels I really like. I just had yours recommended to me yesterday. But you may never see my comments since UA-cam seems to disappear them before the channel creator ever sees them. TLDR I suppose) If you look at the original glass plate photos or daguerrotypes, they have incredibly sharp focus.
Except not in Australia, where imported US tv shows like All in the Family were converted from NTSC to PAL which gave them a mushy look, by far the worst look on Australian TV. Australian taped TV shows had that clarity and immediacy you mention because they were broadcast in PAL without conversion. It was amazing to me when I went to the US and saw how taped shows looked without broadcast conversion. They looked as clear and sharp as Australian taped shows.
Well, you could scan film to the point that you can see the individual molecules, or even atoms, of the film stock. That's pretty much infinite, isn't it?
It’s not really a myth though. The problem is that people confuse what is being said when we talk “resolution” Film doesn’t have one. It just is. It’s pure physics. Digital technologies have a discrete amount of information contained in their captures. A 25MP photograph will only ever have 25 million pixels in it. Full stop, end of. It’s pure math. Meanwhile in the film realm the grains of film can number in the *billions* you really are only limited by the sensitivity of the medium and the size of the negative. So while it’s not correct to say that film has a higher “resolution” than digital, it’s not a myth to say it doesn’t have infinite detail. Theoretically, we can get film grains down to the size of single molecules, such that we would be having individual photons interacting with individual grains. Practically we can only get a few orders of magnitude above that level. That is still a shocking amount of detail.
@@Mostlyharmless1985 Lenses have a resolution limit also, as they are not perfect. Just try and compare a cheap 35mm camera with plastic lens to a SLR with high end lens. Even now manufactures are trying to make "4k" lenses for cinema cameras.
I really appreciate the use of the bloopers roll in your work. Thanks for the work that you do in producing these, they're always informative and expand my knowledge of an area.
Being probably twice his age or more, I knew the answer, and was so impressed he got it right. B@W shows filmed on 35mm are stunningly detailed, esp. Perry Mason, from later 1950s.
The HD (and widescreen) versions of Friends are always interesting because, apart from occasionally spotting something that wasn't meant to be in shot, you can occasionally spot the angle that was clearly not on 35mm film like the rest of the cameras. Not sure if it's 16mm or video (I'd guess the latter).
@Jay It was always understood that you could let stuff like boom mics get into the image, as long as it was in the overscan area that wouldn't be visible on most TVs. There was a "safe area" that was smaller than the full frame. But there were a lot of mistakes anyway.
I am pretty sure Friends was entirely shot on film and that any shots that are upscaled are simply where a camera negative went missing - it was originally edited on video and to do the HD version they had to entirely reconstruct the episodes from the raw footage, and it’s easy to believe some pieces got lost. On the Batman Beyond blu-ray set there’s multiple episodes presented as upscales because they simply couldn’t find the footage
Possible I guess. They had quite a few cameras for Friends and so I just assumed it was the 'just in case' angle. That or where they needed to place the camera required a smaller camera than the normal film cameras.
Universal shot their 1980's TV shows in an open matte ratio with the intent for widescreen exhibition. I have cropped episodes of Airwolf, Magnum P.I. Miami Vice, Knight Rider in the 16:9 ratio and it looks perfect in widescreen. Apparently they were preparing ahead for the HDTV transition when the standards were still being finalized and standardized.
Right? It's interesting to find out electronic doesn't mean digital. But you know what's even trippier? Optical doesn't mean digital either. Laserdiscs were basically giant CDs but they stored analog video, as some sort of higher-quality (but not HD) VHS tapes.
tape HAS been digital throughout the TV industry for many years now. and even in 8mm domestic formats like DV based on the low end pro format. DVCAM. refer also to. compressed digital like BETACAM SX or the uncompressed broadcast format DIGIBETA.
0:46 Indeed, I was alive in the 80s and I can guarantee _everything_ looked all fuzzy all the time, whether on TV or in person. (Thought it might be related to the fact that I was highly myopic.)
Joseph Davies This x1000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
I was asking that question when watching Buck Rogers in the 25th Century on MeTv. It looked like it was filmed with modern gear. Thanks Tech Connections!
@@d3v1lsummoner Its mind blowing growing up in the 80's seeing these programs at 480 and now HD. We need a 4K MTV channel that only plays videos from then.
The first time I watched _Frosty the Snowman_ on an HDTV, I swore it looked like it had been traced frame-by-frame in a digital animation program, it was so sharp and perfect. Not so much as a visible cel shadow.
I think it should be worth mentioning that in the late '80s and early '90s there used to be a middle ground between SD video and film, which is early HD video, most notably with the Sony HDVS system. A great example of that system would be the recent Blu-Ray release of Toto's concert at the 1991 Montreux Jazz Festival (a clip of that could be seen here: ua-cam.com/video/S0QDB9FtnUM/v-deo.html ). Many people thought that it was shot on film, but actually it was early analog HD 1125i. Another example is that New York in 1993 video that Techmoan showed on his video about the DTheater DVHS, it was also shot with Sony HDVS.
And I agree with you, it does not look like film, mainly because of the fluidity of motion (higher frame rate), the trailing lights typical of tube video cameras and the lack of film artifacts. However, there is also a preconception that you can't have HD video before 2000 unless it was shot on film, which you can see in the comment section of the aforementioned video, in which there are a few people who insist it was shot on film when it clearly wasn't.
There's also Genesis on Wembley Stadium that the credits on the video claim it was shot in Sony High Definition tape, which unfortunately has never been released on Blu-Ray and no HD preview of it exists, and a video Techmoan showed of New York in 1993 that is obviously tape since it runs at 60 frames per second.
@@jonessperandio Also, it was interlaced video, isn't that what gives video from that time the soap opera effect? What are those "trailing lights"? The video clip for Last Christmas, though shot on film, seems to be largely devoid of film artifacts.
@@Myrtone Maybe... those trailing lights are a result of something like a burn-in produced by powerful stage lights on old tube camera sensors, it doesn't happen on modern CCD/CMOS sensors. As for the lack of film artifacts on that videoclip, my guess is that either it had a good restoration job or it was filmed on a good film stock... many movies from that period were filmed on trash film stock, so you can really see the artifacts when they're scanned in high resolution.
I'm 20 and I shoot Super 8 and 16mm film. Celluloid film has actually been making a comeback over the last five years or so, especially with the new generation who are discovering it for the first time. Probably more 20-35 years olds shoot film than 35-50 years olds these days.
11:34 - Hold on there! TNG was also edited on tape. This is why they had to re-edit the show when making the HD transfer. So no, Voyager isn't really stuck in SD for all times (even its CGI special effects, IF they preserved original files).
The files were not preserved. All the effects houses (Voyager used half a dozen concurrently at some points) went out of business or were bought, and their archives went in the bin at that time. They also all used different software from each other, almost none of which is compatible with software today. Even when the software still exists, conversion is necessary to update the file type to the new version. Even lossier conversion is necessary from all the defunct file-types, like loading a Word file in LibreOffice and the tables aren't lined up quite right anymore. It's cheaper to just make new models from scratch, and that's still hugely expensive. I do hope CBS changes their minds about remastering the other series down the line, but for now they've stated the TNG remaster didn't sell as well as the TOS remaster, and there's currently no plans to do the rest. Probably because the remasters showed up on Netflix and now All Access, and people waiting until each season is $15 on Blu-ray instead of the original $50 or $60. So to change their minds, they either would have to get enough mileage out of viewer figures of the remaster on All Access, or residual sales of the Blu-rays build up over time or something.
@@charmedx3219 TNG's awesome that it was recorded on film... and the remaster/reedit effect were outstanding. It's lacking in the fact it was NOT shot in widescreen! I've heard from many in the Star Trek community that THIS was the reason for the low Blu-ray sales. You can make it all pretty, but can't make the damn thing widescreen to fit our 82" QLED TV's these days! They were already spending $1 million an episode to produce, spend another few thousand to record it widescreen!
TNG was edited on tape and the effects were rendered to tape, but all of the live action shots were filmed, and virtually all of the effects shots were filmed. When they remastered it, they went back to the film and re-composited the effects shots in the digital realm, which was crazy expensive but resulted in a stunning final product. For Voyager this would be difficult. I think in its first couple of seasons there might be film elements for its effects shots, and the live action was filmed, but in later seasons the effects were done with CGI onto tape. In order to remaster Voyager, you’d have to recreate all of the CGI, which would cost millions.
A point to consider: While the vertical resolution of the NTSC standard is set in stone at 480 visible lines, the horizontal resolution can vary *greatly.* It can be as low as 200 pixels for VHS, as high as 720 pixels for DVDs, and potentially higher. It all depends on the bandwidth of the video recording (and the bandwidth capability of the playback hardware). To put it another way, NTSC video is a single line of analog data, which gets rasterized into a stack 480 segments high (more accurately, two stacks 240 segments high per interlaced frame). That stack height is fixed, but the density of data is not, and as a result, the horizontal resolution is variable.
@@MichaelEricMenk You make the erroneous assumption that pixels are always square. That's true in the digital video world because we all agreed that it is very convenient for those pixels to be square (or somewhat close to it), but in early computer graphics, horizontal resolutions varied wildly depending on the system used. Analog video doesn't have a resolution in pixels to speak of but there is a difference in bandwidth which is effectively the same thing. The vertical resolution can be fixed because that is a relatively low bandwidth (and synchronization would be very difficult otherwise) but the horizontal resolution is pretty much undetermined and depends a lot on the source material and the processing chain used.
@@Stoney3K wrote: "You make the erroneous assumption that pixels are always square." No I did not, you brought this up. Stoney3K wrote: "That's true in digital video world because we all agreed that it is very convenient for those pixels to be square" You have not read the standards I see... NONE of the resolutions in the DVD standard have square pixels, or the MPEG-2 streams used to carry standard TV digitally to the TV-antennas.
@@MichaelEricMenk "A valid horizontal resolution on 480px or 576px DVD is 720, 704 and 352." Granted, those are not *square* pixels but they are a fixed amount only defined by a certain digital video standard (otherwise the DVD format would not be able to compress it). They are always a fixed size. The point is that video only has predefined lines and predefined pixels do not exist, much like an analog sound source does not have a sample rate, only a bandwidth that is defined by the source material and not by the standard of the system that is being used.
Re Imax: I worked at Ontario Place one summer of '73 and one of the CInesphere projectionists gave me a tour of the Imax projector (serial # 001 IIRC). It used 70mm film run horizontally to produce the largest images possible with commercially available movie film. What was especially neat to me as a student of audio electronics was that the normal audio track on the film only carried engineering (timing) signals to synch it with a 24 track tape deck (which also used one audio track for synch), producing not only far better quality than was possible from a conventional film's optical sound track but a multi channel audio that allowed the producers to put the sound wherever they wanted it to be in the theatre.
I like how this channel is squeaky-clean, maybe a "damn" in the bloopers, then you show one youtube comment screenshot and it's all giant F-bombs on the screen.
I would like certain SD shows to get re-released on Blu-ray even though they can't be made "HD" because they tried to cram so many episodes on the DVDs that the quality is objectively worse than they were on TV. It's a real shame companies don't see the benefit in doing this.
Probably still be cheaper to just not cram so many episodes on each disk and use a higher bitrate. Not to mention not having to field the inevitable complaints from people who thought they were going to get HD video because Blu-Ray.
I loved Blu Ray disk up until today.l bought the new Rambo movie, would not play in Blu Ray player because I had to do filmwere update on player.i will eventually do the update,I suggest when buying movies make sure you get DVD as well because DVDs will still play,most of my Blu Rays would not untill I get update. This is not good,this reminds me of a format back in the 90s called divix that was sold at circuit city witch was a flop and forced them out of business. When they went out of business there player's stoped working because there was no more updates,this could happen to Blu Rays and there could be no more updates if they stop making player's because of Internet streaming of movies,that would render Blu Ray DVDs useless, so I'm thanking twice about buying a bunch of Blu Ray disks that may not work in near future,I will never stream as for Blu Ray disks I don't know,but I will keep all my regular DVD's, CD's and a few video tapes because they will still work. I will never give in to Hollywood's greed.
DVD and blu-ray has become a collector's market, so the appeal of having lots of episodes on one disc is not something that will get many people In addition, a few series that are in SD have been released on blu-ray, but are treated as HD files, with a nromal amount of episodes per disc the recent Classic Doctor Who sets are an example of this edit: just realised i slightly misread really a good remaster would be in order for a lot of these, but they don't care enough to do it
I love how you're REALLY leaning in to your sense of humor. That bit about "Whadya mean this sounds dubbed in..." is classic Alec. Your earlier videos were very informative, and I appreciated them for that. Your newer stuff is YOU, and I appreciate them even more. Happy (Last...cough cough) Christmas and New Year. (Good job not mentioning them in the video, BTW.)
@@petercarioscia9189 I think a very good indicator of his personality can be found within the outtakes he includes. I would venture to say that the personality and quirky sense of humour we see is pretty close to the original article, if not slightly amplified for effect, as youtubers are wont to do.
OK, here comes a pinned comment about the whole Star Trek thing (and I've edited this to give a little more clarity);
I'm getting a lot of mixed information from commenters about what specifically is preventing Voyager (and Deep Space Nine) from getting an HD Re-release. Some people are saying it's just a money thing, and that Paramount couldn't recoup their investment (as learned from their experience doing the TNG re-release). TNG was, after all, edited on tape so what is the substantial difference?
Yes. *TNG was originally edited on tape, not film.* I implied in the video that there was a "final cut" on film but this isn't the case. From my research and understanding, as helped along by others, the key difference between TNG and Voyager when it comes to making an HD re-release feasible is that while *some* of the VFX work on TNG was rendered to tape, this was largely things like phaser effects, planetary orbital scenes, and other minutia. Most shots of the Enterprise were done with models / practical effects, and as such they existed on film from the get go, meaning they existed with the full detail film can capture. So yes, the version we saw upon its initial release came from a tape, but unlike Voyager (and DS9), a much greater percentage of the original effects footage existed on film.
This meant that to re-release the series in HD, scanning the film negatives took care of a large portion of business. It largely needed to simply be re-edited to match the original cuts and timing. The most complicated part of the process was re-compositing some of the film-based effects. But since they existed on film, it was mainly a matter of re-alignment. Relatively few VFX elements were done exclusively on tape, and so it wasn't a monumental undertaking to recreate them for the HD release.
In contrast, for Voyager and (to a lesser extent) DS9, CGI was used _heavily._ And since that was rendered to tape and composited with the live-action scanned from film, the negatives (assuming they still exist) are just the live scenes and nothing more. If a re-release were to be done in HD, a *much* greater amount of work would need to be done. DS9 doesn't have it quite as bad as Voyager, but it would still be more complex than TNG. Some of the original computer models have been found which would help with recreating the VFX scenes, but in many cases the production houses that did this work went out of business. Which leaves many, _many_ scenes where the entire thing needs to be re-done from scratch. And given how expensive this would be, it's likely never to happen. So no, it's not impossible that we'd see DS9 and Voyager get re-released in HD, but very unlikely. The extra complexities required would add tremendous cost to an already expensive endeavor.
Technology Connections Meanwhile, I’m rewatching TNG n Netflix, the compression artifacts when you see the establishing shots of the Enterprise on a star field are atrocious
You didn't pin it. Maybe UA-cam unpinned it by itself.
Edit: It's pinned now.
Consider, running contemporaneous to DS9 was Babylon 5. B5 famously used Amiga computers running Video Toasters (in the first couple seasons, along with SeaQuest DSV), and that stuff looked waaaay better than what Paramount was doing with DS9 and Voyager. There seems to have been a optimization done in the CGI work that boils down to "how much effort do we REALLY need to spend here? It's going on broadcast TV." DS9/Voyager was fine for TV and VHS, but the models would look like traaash if rendered in modern resolutions. You'd need to rebuild every CGI shot for 2x 7 seasons x 26ish episodes per season.
Also while DS9 was the tits from S1E1, Voyager needed until like S4 to get moderately passable.
I just posted this reply to another comment, but yes...
Here's the TLDR from treknews.net
Basically, TOS was all mastered on film, so they could literally just rescan the master print for each episode and be done with it. They did redo VFX with CGI for the remaster because the film compositing done in the '60s wasn't great and led to muddy shots that looked much worse compared to the HD scans. I believe they put both versions on the TOS Blu-Ray release, which is a nice touch.
TNG, DS9, and VOY were all shot on film (though I have heard that there's some issues with the film stock used for certain shows and even specific seasons, such as season 2 of TNG), but they were all composited and edited in video. There are no master film prints of finished shows like in TOS. In that respect, the three modern series' are all the same. The difference however is that TNG used very little CGI when originally produced, but DS9 and VOY did.
So in all three modern shows a remaster would entail the same scanning of the original raw film footage, re-compositing all the scenes and in-camera effects (assuming the plates are all still usable), and using modern CGI to recreate the VFX from the original video editing. Recreating the CGI from DS9 and VOY is non-trivial because a lot of the original assets are lost, either due to backup failures, deletion to save drive space, and/or closure of the production houses that made them. Even if the assets were available, simply re-rendering them at HD resolution would make them look cartoony at best, and a jumbled pixellated mess at worst. That worked for South Park because it's already a cartoon, but you can still see in their re-rendered early seasons some rasterized assets (like the poof ball on Stan's hat) that are jarringly pixellated. I think DS9 suffers the worst from this because there are some extensive space battles with hundreds of ships, all of which would need to be basically recreated from scratch. That could make remastering one episode of DS9 cost as much as the entire remaster of TNG.
As for 4x3 vs 16x9, just leave it like it is. I think in the case of TNG it was mostly filmed in 4x3, so to get 16x9 you crop out a lot of the top and bottom of the frame and can lose important parts of the scene, or try to rebuild the sides of the set in CGI. In some cases they did shoot on wider film and did some pan-and-scan, but then you risk seeing the edge of the set, lights, tripods, wires, and other crap that then also needs to be digitally removed. All of this would add yet another layer of cost, so just accept the black bars, they won't hurt you, trust me.
I got halfway through the video, paused, scrolled down to correct the hell out of you, and was happy to see this comment instead. You have it right now!
If you go and watch the special features for the blu-rays (plenty of the stuff can be found on youtube), you get a pretty good breakdown of the hows and whys of it all. Further discussions of the possibility of DS9 getting remastered over the years have had all the inherent problems with it and all the CGI brought to light. Also cases of VFX artists who worked at now defunct studios popping up out of the woodwork saying "hey, I have some of those scenes on a hard drive somewhere!", but it's really a lost cause, there's too much to rebuild. A recent DS9 documentary had some select scenes remastered to show what it could have been like (involving a fanmade CGI recreation of a battle scene).
(note: there's very little fan outcry for HD Voyager... Voyager was considered a testbed for using CGI so it's riddled throughout the show from the outset. And while it's a fine enough show and has its fans, it's not lauded with acclaim like TNG and DS9)
What's interesting to note is tha they were considering going full CGI from the start of TNG. They even had VFX houses do tests. They mostly look bad but some shots look actually kinda incredible for 1987. If that had happened, TNG would have been a lost cause too! Thankfully they didn't go that route because not only did it not quite look good enough, but also the CGI industry was tiny, unreliable _and_ unstable at the time.
Monty Python even made a joke out of it:
"Alright, stop that sketch. It got too silly."
"You can't stop it - it's on film!"
"That doesn't make any difference to the viewer at home, does it?"
I loved that. and how different they looked on film (plus their voices were clearly dubbed in post, another difference from that kind of filming)
R.I.P. Terry Jones :'-(
"My God! This building's entirely surrounded by celluloid"
Bloody hell. That's (almost) exactly what I wanted to post.
"Oh my god, I'm on film!"
What do you mean "you can't stop it - it's on film". Is film unstoppable or what?
Monty Python not only used both as you mention, they did jokes about it. "This building is entirely surrounded by film!"
Oh, THAT'S what that sketch was about, okay.
Came here to mention the Society for putting things on top of other things.
Yeah, the Monty Python guys were very meta.
@@dominateeye I am pretty sure that sketch was specifically what taught me the difference between the appearance of video and film. I'd noticed the "soap-opera effect" before, but until that moment, I didn't know what it really was.
"Good lord, i'm on film. How did that happen?"
As a kid I tried to identify what made soap operas look so weird to me. I never could put my finger on it. I asked other people and they didn't know what I was talking about. I thought it was the lighting. Now I find out it was the frame rate I was reacting to. Thank you for solving a mystery of my childhood.
And studio lighting. I want to say Vox did a video on it. I think it was called something like the sitcom look
As a digital native, I can't wrap my head around what people mean by he soap opera effect and why it would be considered a bad thing. TV shows, sitcoms and the like never looked weird to me, but natural, whereas the stuttery or blurred movement in movies sometimes gives me a slight feeling of motion sickness or headache.
Maybe It just comes down to being used to something?
Cause our eyes don't have a framerate, nor does natural movement, so when it comes to a (virtual) reproduction of it, more should be better...
@@LRM12o8 It may have something to do with movies having a slightly altered look to them that makes them visually appealing, a bit like slightly slowed down motion looks cool, while slightly sped up actions looks almost comical. The higher frame rate of video makes it look starkly "too real" but with the odd generic lighting, daytime TV always looked unsettling to me, a bit like a visual version of the Stepford Wives.
As a video gamer, I got to find about this longer while ago, but it was still somewhat of a weird revelation to me. Not only that but while on PC I always like higher framerates (not that I could get more than 40 on my Pentium 3) - for TV/Film I always preferred or were just conditioned to prefer the framerate of film based media. I am not sure if 24 just hits some kind of sweet spot in style, roughness, smoothness and everything in between.
Still, these things would funny enough come back with gaming and being transferred from there back to (Mostly just Japanese) Animation with Guilty Gear Xrd. For those less in the know, this is a fighting game that opted to present look of hand drawn (Japanese) animation from entirely 3D scene character models included. One major thing was purposefully turning off modern interpolation technologies (on the fly generation of maximum amount of frames to an animation as current frames per second allows) and doing every "frame" of the game by hand. Most interestingly, like said, this actually transferred over even to animation industry as some people from that same game development team left the company to find an anime studio that uses 3D and same techniques for animation one recent more known things being "Beastars" TV anime.
The framerate conundrum especially for animation and movies etc. is fascinating. Even in Fighting games some games that tried to copy Guilty Gear while skipping steps ended up looking cheaper and "unnaturally smooth" while smoothness is actually the natural state of anything moving. We are just real weird.
@@SumeaBizarro hm, well I'm a gamer myself, but I remember that even before that, as a kid I've already sometimes noticed stuttery cameraflights (mostly in documentaries) and got frustrated with fight scenes that were so blurry that I could hardly tell what they were doing in cinema movies.
Ever since I learned about FPS and got used to gaming at min. 50 FPS, I often get a slight feeling of motion sickness from movies.
I don't think that there are any averse affects of high framerate for the image quality or the feeling of a movie, but that Hollywood simply managed to convince people otherwise, because they didn't want to upgrade for some reason. I mean there surely must be a reason why TV adapted higher framerate cameras, as soon as more than 24 FPS was possible. And I can even understand that Hollywood didn't upgrade right away, cause at that time storage was probably too expensive. TV stations didn't have to store their video footage, they just broadcasted it live once and that was it. But before storage for HFR movies was feasible, Hollywood discovered the option of hiding lackluster acting through the limits of 24 FPS: just make your kicks and punches faster than the camera can clearly capture them and nobody will notice, if the fight choreography doesn't look convincing in reality, that's my guess.
I mean, if you watch old action movies from Bruce Lee or Jackie Chan, they're fight scenes are much slower, and there's no blurriness in the motion. They slowed and perfectly timed their moves to hide the technical limitations of their camera equipment through their acting, not the other way around!
But at some point, demand for faster and faster action rose, to the point it became too fast for the cinema cameras. But instead of upgrading the cameras, Hollywood pulled a Jedi mind-trick on people, telling them "You don't want higher framerate and sharp high-speed action scenes", that's what it feels like to me.
The reasons Hollywood people bring up against HFR don't make sense to me. Apart from the "soap opera effect", the most notable one I've heard is that it drive storage costs for the movie footage too much.
Sure, 48FPS or 72FPS would double or tripple the storage requirement, respecively.
But that didn't stop Hollywood from going from 1080p to 4K (which is a good improvement), *quadrupling* the amount of storage space required and now they go from 4K to 8K (for diminishing returns, imo), quadrupling the storage requirements again! Also HDR adds a significant increase, as well.
So storage costs clearly can't be the issue!
I agree that for animated movies, it's not an issue, they're fine at 24Hz. The issues I have with 24Hz movies don't apply to animated movies, because there's no cameras involved, every frame is a perfectly sharp hand-crafted picture, so there's no blur, unless it was intentionally added and I guess the fact that the drawings, no matter how good, are never photo-realistic prevents my brain from expecting fluid motion and thus getting motion sick (just a theory).
I've actually just watched a video about why interlaced 60Hz animations look terrible compared to the original 24Hz animations it makes total sense to me. the tl:dr is that "movement" is carefully timed to give it purpose and expression, and the AI interlacing programs destroy that, making the animations feel bland and lifeless, because Computers can't emulate the imperfection and irregularity of real life.
That also means, if an animator was to animate something at 60FPS natively, hand-crafting 60 frames per second, it would take awful lot of time, but it would look just fine. It's not the framerate, but the way it's achieved, that makes the difference in animated movies.
This is the video I'm talking about: ua-cam.com/video/_KRb_qV9P4g/v-deo.html
"Or IMAX and you should just give up trying" LMAOO I used to be a projectionist for them. Its HUGE.
I didn't really get that comment, does he mean you should stop trying to determine the resolution equivalency of imax? like why bother releasing an old imax in 4k when we can do it in 16k in a decade?
I'm also a former projectionist and I totally lost it at that remark!
@@jonaboy3 I think he was saying you should give up trying to determine the resolution in order to scan it -- because the resolution is just way too high to edit digitally (and maybe to capture to begin with). Editing 8K video is already too much for many prosumer-level computers. 16K would be four times that! Most computers would just choke.
@@GregConquest 8k is alrdy the standard for many pro producers even UA-cam, see linus techtips as an example. So doing even higher res-es(?) on scanning nowadays for those deeeeeep pocket enterprises shouldn't be a problem at all... Whether or not its worth it (due to the fact that why would you produce 16 or 32k video now that ppl wont be able to consume properly/or give you a decent advantage over other production methods) acting as a limlting factor for such endeavor being accomplished at all, is a different matter
@@TheLiasas Pro producers don't use prosumer-level computers. And I was explaining why 16K would be prohibitively expensive. You missed the point on both.
Regarging early tv: I think the most amazing thing was that they would shoot live TV-shows live for the east coast, radio the live signal to the west coast on a non-standard frequency, film the broadcast, develop the film and then broadcast a projection of the film live to the west coast because they had no way of time shifting otherwise. TV Studios would use more film than actual hollywood at the time.
And thank god they did that. Because these recordings are the only thing left that survived early TV times.
Well in addition to AT&T microwave links, AT&T also used coax cables. Yes there were coax cables running hundreds of miles across the country tapping off in each major city. A single cable could carry hundreds of phone calls or one TV signal. So you can imagine the hourly expense to rent time on the cable or microwave link.
@@andydelle4509 You can thank Ma Bell for that one, nobody ever gave then enough credit after the '84 breakup.
@@andydelle4509 True there.
@@andydelle4509 In the old "Bell Telephone" days the phone was the ONLY utility we had that never failed. Electricity,Gas,Water all had occasional outages, But from the 60's to the '80s I don't remember a single time our phone "went out"! it was a seriously over built sturdy system.
”Photography seems like a very trivial thing today but it wasn’t so easy for the vast majority of its history. The goal of this video isn’t to go through the entire history of photography”
That’s not very Technology Connections of you
“so we’re gonna fast forward to about 1900”
Ah, there it is.
I read this as he said it.
Only said what was necessary! No need for all the daguerreotype and wet plate stuff
@Jack Wiegmann I'm sure that's for another video
@@freewilliam93 as did I!
Always watch this channel with subtitles on! You never know what hilarity you'll miss otherwise!
Since you mentioned Monty Python, they once did a sketch where the characters suddenly realized the building was "surrounded by film". They kept trying to escape, but every time it cut to an exterior shot, they were on film and retreated back inside in panic.
Man, I never got that sketch, gotta rewatch it now.
Yep came here to say this.
On a related note:
Since you have been making videos for 15 years, I have noticed that your early videos, and many other early UA-cam videos, are currently not available at resolutions greater than 240p, but if they were originally filmed at a higher resolution, could they be re-published at a resolution more like the one at which originally filmed?
I also wonder if the practice of filming past content at a higher resolution than it was shown on television was more common in Germany (and maybe other Germanic countries like the Netherlands) than in the English-speaking world given than these people and known for planning ahead.
@@Myrtone If the source video originally was a higher resolution, or a higher resolution version can be made, one could publish a new version of the video in higher resolutions. The issue is that UA-cam won't let you alter an existing video post with a different video. This is to prevent fraud; someone posting a video that goes viral and then changing the video to an ad to capitalize on all the views, but it means that the only way to do a higher resolution is to make a new video post.
So people can either delete the original video and post a new version, or leave the old video up and post a new version. The issue with the first one is that you'll break anyone who previously linked to that specific video (think of all the rick-rolls that would be broken if that original video was deleted). You'll also lose all the views associated with that original video, which could affect your visibility. As for leaving the old one up and posting a new version, that's actually what happened with the Wham! Last Christmas video. The original was posted 10 years ago and has 422 Million views, and the new 4K version has 8.7 million views:
ua-cam.com/video/E8gmARGvPlI/v-deo.html
ua-cam.com/video/bwNV7TAWN3M/v-deo.html
Sometimes you'll see people post links in descriptions, or add cards that link to the newer, better version. However, even then the new version may not get many views. I think the Wham! video is getting so many due to it being a perennial Christmas classic song, and people are primed to want to view the video this time of year, even though it's so old. For most older UA-cam videos, even if it's only a decade old, most people want to see something new instead of something they've already watched, but in HD.
@@marsilies In that case, a new video post is indeed what I'm suggesting. I'm sure there are plenty who haven't yet watched Andrew's early videos.
Film shoots were incredibly expensive. So when shooting professionally, there was a budget for your stock and how many feet of film was going to be developed. This required more forethought, planning, and actors who could nail their lines in a minimum number of takes. If you were shooting excessive numbers of takes or angles, you probably would go over budget. Today, storage is dirt cheap and getting a viewable image is instant... rather than putting all your hopes and prayers that what you shot was on a good batch of film and nobody screwed up the process along the way.
The same holds true, to some degree, with music. It was expensive to record music. It was expensive to shoot film. But here we are... people making videos of what they ate for lunch.
"First time listening to...insert something here"
that's what makes Queen's concert in Hungary in 1986 really impressive. the whole concert was shot in 35mm film with 16 different cameras, and it was all thanks to the hungarian government. they hired the best cameramen in the country. and it's why we have the full remastered concert in HD on blu-ray right now
I have yet to post my lunch eating to my channel, but I am always searching for good ideas. 😄
And at least back in the day, there was a infrastructure for it. If a movie had a decent budget, it was pretty common to have a filmlab on set so that the film could be processed within a day and the director could just look at shot the next to determine if they needed to do reshoot it or not. Plus with that budget, it wouldn't have too difficult to order for more rolls quickly. And even on a low budget, there wasn't too much of a trouble to find a filmlab in your local area that could process your movie quickly. Now filmlabs are rare and expensive so you need to put extra money to process your film. And even if a movie has a big budget, the infrastructure is gone so getting more rolls of film is harder, more expensive and chances are that you can't have a filmlab on set.
@@Naltrex Queen live in Montreal 1981 was also shot on film.
Dynamic range is the easiest way to tell between a film source and a tape source.
i so expected this to be covered.
it's what really sets old videos apart. that blown out digital look... ugh.
Up until 10 years ago.
Until HDSLRs and DOF adapters came along, depth of field was the easiest way to spot that something was shot on 35mm or larger film. If you had a terrible film to video transfer the dynamic range could be horrid.
I would like, but..
I'm assuming dynamic range was better on film?
Today I learned the "soap opera" effect is a real thing. Growing up, I always noticed this... odd fluidity with soap operas. They all flowed so smoothly, and everyone I tried explaining it to didn't understand. I thought I was crazy until this video.
It's totally a thing. Many dont see it at 60fps though, but it gets really bad with motion smoothing
Yea looks terrible
and today we have TV's that by default "upgrade" all inputs to have the soap opera effect. I sometimes find it painful to watch other peoples TV's. Especially when that is turned on AND they are viewing the image in the wrong aspect radio and for some reason dont notice that everyone looks squashed
@@dlarge6502 i hate that post-processing so much, it makes me queezy
It always made me feel uneasy
someone should start a streaming service with a similar premise to what GOG did. "We'll stream your old movies and TV that you've forgotten about and aren't selling, a cut of the revenue going to you, and if you let us borrow your film masters we'll cast it to 4K at no cost to you!"
No
I'm just messing with you. I've never even been there before.
FOF: fantastic ol' film
The point of GOG is that you own the games that you buy, so streaming isn't really similar to GOG.
Anyway, scanning, cleaning the scans and redoing audio is a VERY expensive process. For this reason, this idea can't really happen.
The Criterion Channel, it's like the Netflix of older 4k film transfers. They usually try to collaborate with the directors or cinematographers of various films to make sure the quality matches their vision and final cut of the other released formats and theatrical cuts. You can also buy their Blu-Rays from The Criterion Collection, which they jam-pack with special features, commentaries, and other behind the scenes stuff. Other companies worth mentioning too: Arrow Video, Eureka, and BFI.
That Apollo 11 documentary that came out last year had some absolutely AMAZING looking shots from the 60's, and not just stuff like the launches. Much of the footage from "behind the scenes" looks better than anything released today, it is so much more natural looking it's like you're actually there...it actually feels odd at first.
It's also a great example that (assuming the proper specs) digital technology can recreate the original analog signal perfectly, both video and audio.
Yup, it´s the initial sampling that matters. The SloMos from the launch pad of the Saturn V lifting up are gorgeous. I could watch that for hours.
Hey, do you have a link or search term of some kind?
@@anastasiajcrowe Apollo 11 2019 film
Welcome to the magic of 70mm film, the quality is crazy good.
My father also got some digitalized very old pictures in high resolution of the place where I live but like from the 60s, 70s and whatever all in black and white and they have incredible amount of detail that I can zoom a lot on stuff with great quality, I'm guessing the original government photos were also in a big size.
It literally took decades for the digital camera to get anywhere close but at least nowadays almost everything even cheap has OK quality greatfully and good stuff has amazing quality.
As a photographer who uses mostly film, I get a kick out of people who are surprised how beautiful and rich film can look. (Not all the time, but if its processed, stored and displayed properly.)
Yep. People are often mind-blown that my photos are taken on a cameras from the 60's and 70's
using film in the last decade, for stills photography, is all but redundant. 50megapixel + cameras are easy enough to get and so much more editable. 20mpx is probably more than film can practically match anyway.
@@southerncharity7928 It's not redundant, because it has a different look to it that's almost impossible to recreate digitally. You can get fairly close, but it is still detectable most of the time.
@@southerncharity7928 While modern high-end digital photography is undoubtably superior in terms of quality, there's something about film that just can't be recreated. It's almost like it a 2.5D image with a tiny layer of depth that you just can't get out of digital.
@@Moxtrox if there is a perceivable factor it can only be viewed by yourself, in that as soon as you scan it digitally, any advantage is gone
"It looks like it was shot on the tape" I can totally feel that sentence... Though we cant techincally explain the difference... Our eyes do catch it ! Perfectly explained.
While on the surface the sentence is a tautology - floor is obviously made out of floor, it's true - tape has a look to it that film doesn't, and vice versa.
It is a perfect example of how our perception is split into several part. The final processed image mostly removes the difference, since it is not true information that needs to be presented to our brain directly. But the imperfections are recorded and analyzed by our brain too, and if needed, it can tell, this is probably taped or filmed or whatever, but in most case, it is just discarded as irrelevant information.
We CAN technically explain the difference. It's motion blur associated with 25-30 FPS and lack of it in 60 FPS. It's really obvious when once you understand it: ua-cam.com/video/oaaBOu086Xg/v-deo.html
@@MrUserasd tape can record full digital broadcast DigiBeta or it can be crap ..but if it istransmitted on 'analog' tv band it is NOT 60 FPS!
@@bloozee But nevertheless it has less motion blur.
Wow. I’m a (retired) adjunct professor of photography and your explanation of visual recording media was perfection! Your jumps back and forth between film to tape to digital was concise and specific - and coherent and even accessible! Amazing. Brilliant.
And… thank you!
Yeah, as a photographer who went back to film, I’d agree he pretty much nailed it.
Here I am, at 1AM watching a video on stuff that I mostly already know, AND YET how could I resist? This guy could read the Cleveland phonebook and he would have my undivided attention.
You're so right. The way he explains things in technical and layman's terms at the same time make his videos so great and the topics are great.
I always learn something
literally me. It's 3:30am and I already knew most of this, but hearing it presented by him makes me feel more intelligent lol.
that hit me - here i am 1:26 am gotta be at work at 8 am lol
But he lives in the Chicago area.
Ha you think 1am is late? It’s 5:09am for me and I haven’t slept yet!
Star trek TNG was mastered on tape. To create the HD version you're talking about, they had to dig out the shots, re-edit it together, and redo the special effects from scratch digitally. My understanding is they are currently in the process of doing just this to voyager now.
Gregory Parsons Not possible with Voyager, apart from maybe the first couple of seasons. Voyager went to using CGI and that was all rendered to tape, not film. So unlike TNG, where the effects were shot to film and then composited onto tape, there are no film elements for Voyager after the first couple of seasons, other than the live action shots.
They’d have to recreate all of the CGI for Voyager, which would cost millions of dollars. Not gonna happen.
another commenter pointed out that apparently the remake for Voyager is cancelled due to an unfortunate lack of demand, apparently other star trek remasters didn't reach sales expectations.
@@sunspot42 it could, a few million dollars to make millions. All of the non filmed effects of TNG had to be redone (lasers, matt paintings, compositing) at the cost of millions. They intended to do it for voyager too, but it seems the blurays of TNG didn't sell well enough. 💶
@@lonesnark Let's be clear - virtually all of TNG's effects were filmed, along with the live action. The effects were composited onto video back in the day. In order to produce the HD remaster, they had to scan in all of those film elements - live action and effects shots - and recomposite as needed. Some effects shots would have half a dozen or more film passes, so this was a considerable effort, but virtually all the bits and pieces existed on film.
For Voyager that simply isn't the case. After the first couple of seasons the effects were done direct to video via CGI. There are no film elements. All of those effects would have to be re-rendered in HD, and I suspect many of the original CGI elements don't exist in any form - they'd have to be recreated from the ground up. That would cost well in excess of ten million dollars for the entire series. There's no way Paramount is gonna blow that kind of money on Voyager - they'd never recoup their investment. Heck, they didn't make back the money they blew on the far simpler HD remaster of TNG.
@@sunspot42 TNG Effects had to be partially re-done digitally as well since they also were only done on the Video Level. But yeah model shots would be an entire different thing. Also why we probably won't be getting any HD release of Babylon 5 any time soon.
Some episodes of “The Twilight Zone” look very, very different from others, and I think it’s because at some point in production they switched from film to videotape. I’m only guessing that that is the reason, though. Edit: Wikipedia confirms that 6 episodes of season 2 (1960) (The Lateness of The Hour, The Night of The Meek, The Whole Truth, Twenty-Two, Static, and Long Distance Call) were recorded on videotape as a cost-cutting action, but because it didn’t cut costs enough and it was nearly impossible to edit, it was never repeated.
Also different photographer styles were present. Like at some point in the show they wanted to move the camera around more.
Also old videocon imaging tubes back then had serious problems with contrasting creating a need for different elaborate lighting and makeup techniques to get a decent product. All time consuming and costly. Not really cheeper to produce a vastly inferior less durable product.
@@michaelshultz2540 When the Twilight Zone were done on tape, the video cameras used "image orthicon" pick up tubes which produced dark halos when photographing bright objects such as candles. The vidicon tubes that came later ('70s) eliminated this problem..
@@michaelshultz2540 Very good comment there.
@@TheOzthewiz I think those tubes only really started to get traction in the 80s, as many music videos and even live broadcast sports games (such as night games of US baseball) are clearly the older type. In fact I think even some footage from the 90s has a bit of the effect.
One of my favourite examples of film vs tape was Doctor Who.
Most if not all serials of classic Doctor Who as mentioned were shot on tape in the studio and film on location, like most shows in the 70s.
But with the 3rd Doctor's first serial "Spearhead From Space" because of a strike at the BBC, the entire serial was shot on location with 16mm film and because of that, it was the first classic serial released on Blu-Ray.
I believe in the days when this mattered the BBC considered 16mm equivalent to SD and 35mm equivalent to HD. this was before 4k was a thing, so thanks to them for doing a thin they didn't see a point in doing on paper. the Spearhead from Space blu ray shows you could get an HD image from 16mm
On a side note. A small theater near where I grew up continued using a film projector until about a year ago. I was pretty cool going into that back room to see the projector. That thing was a work of art. I'm more surprised that the major studios still sell film versions of their productions.
We had a little local theater near us that used film until 4 or 5 years ago. I used to go there just for the novelty of watching a movie on actual film. Then they upgraded to digital. And not a great quality projector at that, because I was able to see the pixels sometimes. I ended up just buying my own video projector and now I watch movies at home. No point in paying high prices just to watch someone else do video when I can do it myself. Film had a novelty to it, and there was more cost/effort to produce the physical media & project it. Video I can watch at home, and it's cheaper for me in the long run.
@@chuckheider9938 That's what the one near me did. I haven't been there since they "upgraded" the projector so I can't speak on the quality. They completely renovated the place as well. It was definitely that old school aspect that drew people in, the tickets were always pretty low priced, but the movies didn't come out until about 1 1/2 months after they debuted.
18:53 Ok, I know I'm 4000 comments late to the discussion. BUT, as my dad is a huge 16mm collector (probably has close to 1000 films at this point- and if you ever want to cover the original Kodacolor lenticular color film, let me know!), 720p is not enough for 16mm film, unless you're talking about the EARLY 16mm stuff, pre-1930. (16mm film released in 1925, but you just do not find any film from 1925 for some reason. I don't know why, despite having a camera, possibly two released in 25).
16mm scans should be 1080p minimum, with 2K resolution (i.e. 1440p) being the best "mid-range" option. (Do note that 16mm film scanners for the home are still stupid expensive, with the cheapest being about $5000).
But really if you see this, please let me know. I'd love to help out with a video on the first consumer motion-picture format, and I can give you access to SOOOOO MANY resources.
(I know you said do the best you can, just wanna clarify. Also, at 19:13, 16mm films degrade A LOT- particularly the earliest kodachrome from 35/36, and almost all ektachrome from the 60s-80s. Turns either red or purple. And this is totally ignoring vinegar syndrome)
@@emeryththeman man, great job breaking it down in depth! Personally I think 4K is optimal for 16mm transfers, I feel like someone who says 720p hasn't actually seen 16mm film in person. That should capture most of the grain and make it actually look as it does from a projector.
Yeah. I agree. I've seen higher quality 16mm and there is even a difference between 2K and 4K scan. Although technicians when doing back of the napkin calculations say it was about 3K - for good quality 16mm film. 35mm is more like 6-8K if not higher (there were very good quality films used sometimes). 70mm is probably 16K range at least. Although I haven't seen personally scans of those (and only 1 or 2 of 35mm) so I don't feel like an expert.
I am pretty sure the Wham! video could be released in 8K - but that's a bit too high for UA-cam. Or at least was.
You're lucky! And yeah 2K in home scanner are still 5-10K$. You can get your film scanned by professional working for the cinema industry's
The Walking Dead & The Shield are good examples of how good 16mm film can look. Both of them wanted grittier looking shows and presumably the cost benefits of shooting on 16mm instead of 35mm.
This. So much this. 720p is just barely enough to capture Super8, should always be digitized to 1080p.
Here in Australia we had an extra issue with the quality of taped shows. Because US programs were NTSC and the Australian TV standard was PAL, the conversion would make US taped shows look mushy. These were usually taped studio shows like All in the Family or all the soap operas. Filmed shows, like say, The Addams Family and Hogan's Heroes looked much better. Australian taped shows, like the studio bits of Division 4 and variety shows like the Paul Hogan show, were also much sharper and cleaner than US taped shows. (Those Australian shows also used film for outdoor shots, and that was noticeable in the same way as your Monty Python example.) I was amazed when I visited the US to see how taped TV shows looked so much sharper when broadcast in their original NTSC formats. Until then I thought American TV was really poorly made, and not degraded in the broadcast format conversion.
Same here in south Africa, for the same reason. However a lot of the home made content was shot on 8mm and 16mm film, because the stock cost was a lot less than 35mm, leading to the film actually being equal to the broadcast video, though you can see the old broadcast tapes ageing with the reruns, they are a lot softer than any telecine conversion, even using the old hopping patch telecines that they used to have. As well a lot of the archives are from home recorders, as the broadcasters would recycle tapes a lot, as it was expensive, and nobody was really archiving anything other than the film stock.
But it's also true that PAL has 625 lines per frame and NTSC only 525.
@@dbeierl but pal had a 25fps and ntsc was 30fps so pal can be a lot blurrier
@@rty1955 Of course! Now I understand! That's why my 24megapixel Stills camera images with their lousy 0fps framerate look so mushy compared to the crystal clear images from my 1megapixel, 15fps 10 year old webcam! Thanks.
This is spoofed in the UK TV series "The Day Today" where the news reports from American channel CBN are processed to exaggerate the effects of poor NTSC to PAL conversion (examples clips can be found on UA-cam if you search for "the day today barbara wintergreen").
"And we were all having fun watching these amazing movies at the cinema, enjoying our popcorn, socializing with the neighbours"
This line hits different in 2020
I agree, very different.
A much simpler time, really.
Yeah
The Before Times
"And then a little thing happened called 'television'."
Look at the trailer for the 4K version of 2001: A Space Odyssey. It’s from the 1960’s and looks amazing today.
@@AzathothsAlarmClock I read the book when I was 12 and saw the movie when I was about 17.
I did not fall asleep but was not sure what to make of it.
@@AzathothsAlarmClock See 2001 on a proper large cinema screen (preferably watch the 4K restoration by Christopher Nolan) and you won't fall asleep. It was never intended to be seen on a small screen, you really need to see it on a huge screen that fills your entire field of vision - when you do that, suddenly it all zips by.
Well thats no surprise as it was filmed with 70mm celluloid.
The Prisoner transfer for Bluray was awesome. One of the downsides here, is that even when they did use film, sometimes you didn't notice how out of focus things were. And sometimes they do a better job of correcting for noise than others. I never noticed how out of focus Ghostbusters was, but they made it worse by not properly accounting for digital noise over the top of that. It's why it's often best to scan the same image multiple times as the noise and sampling errors will tend to vary a bit, allowing for a cleaner scan.
Yep 70mm looks even better than 35mm when given a proper 4k transfer
“Millions of people will learn Latin and Greek from their televisions”
I see what you did there! Very nice
^ This guy gets it.
@Trey Stephens "Tele", Latin word meaning "far away", and "vision", the greek word meaning "to see"...
@@djsherz That's not quite the reference I was making, but if the writer of that line were hinting at that I would be A) not surprised and B) a little blown away by the subtlety there!
@@TechnologyConnections I believe you may have been referencing the early rationale for television i.e. learning foreign languages. Well, we got PBS as well as the Gong Show so it's all good.
Nonsense. Everyone knows the one true language to finally unite the world that everyone will speak in the future is Esperanto...! /s
When I got into broadcast TV in 1976, we used 2" quad tape machines to record sporting events and save highlights. Our instant replay/slo-mo machine was a set of polished discs that could record up to 30 seconds. The replays we wanted to save were immediately transferred to the 2" tape machine to free up the disc recorder for another replay. We had to rush to get the transfer done, and often would miss the next hilite if it happened before we got the disc recorder back into record. The entire machine was contained in 3 heavy, metal boxes that stacked one on top of the other. We had one tech assigned just to that machine to keep it in working order throughout the broadcast. It was always "fouling", requiring re-polishing of the discs. It was a fiddly, quirky, and maintenance-intensive device that was as big as a small wardrobe/armoire and weighed hundreds of pounds. It was the Ampex HS-100. I bet you'd like to investigate it! Here's a link:
ua-cam.com/video/cW7jvmoLQ7o/v-deo.html
A video disc is mentioned during the 1977 Bathurst 1000 ( a motor race in Australia) by the commentators and always wondered what that was.
Ah, the days of Big Iron. I volunteered at the local cable TV center (back when community TV production was a thing) in high school in the late 80s and got to play around with honkin' big switchers and cameras on shoots. These were the days when movie production companies would send out packets with 8x10 prints from the movie for use in movie reviews, and you "digitized" them by putting them on a rostrum and pointing a camera at the things.
Here is the clip that mentions the videodisc
ua-cam.com/video/8z9zjdtvtL8/v-deo.html
I really appreciate that Alec is willing to put his gaffs at the end. Lol makes for a little chuckle and shows he isn’t afraid to look anything other than perfect. I’m really starting to like this fellow.
Im glad someone explains why some shows had the "soap opera effect." I always have to explain to people why vhs videos on youtube are sometimes in 60fps.
I don’t know how it’s so hard for people to understand.. me and my sister noticed and figured it out when we were kids. I gotta say tho it still surprises me seeing really high quality old video
I also noticed even modern TV cameras are probably 50/60 Hz because moving with camera for example in TV news is not causing blurry like in movies.
Soap operas arent filmed at 60fps theyre 30
@@joshuarichard6827 And in my opinion BETTER than that 24 HD crap that breaks up at the edges when there is any motion.
@@joshuarichard6827 They've been filmed and broadcast in 60fps on TV for decades. Sometimes they get reduced to 30fps when uploaded online though.
Kenneth Branagh's "Murder On The Orient Express" from 2017 was shot on 65mm FIlm. The first time I watched it I almost instantly start scratching my head about the look of it, so I looked it up. It is so clearly not digital and GORGEOUS. I love filming digital but Film just has a look you can't recreate.
(I edited to correct, as I had said 70mm when it was actually 65mm. A commenter Michael Parente pointed this out in the comments.)
leave it to Kenneth Branagh to do something that sounds pretentious but looks or is done so good you forgive him pulling a pretentious show (Like his 4hr take on Hamlet, such a good interpretation but FOUR HOURS????)
If you can see a movie shot and projected in 70mm, absolutely do it. I've seen both 2001 and Patton in that format at a local film festival (in a proper old theater that seats 1600 with one freaking enormous screen), and they looked amazing.
Should I get me some 70mm movie film?
Here is a list of 70mm movies.
en.wikipedia.org/wiki/List_of_70_mm_films
No news there. Most high production movies are not filmed digitally. What you were noticing was probably a creative lens choose or maybe they staged the shots to use more of the cameras FOV(less zooming-in in post).
I remember watching Doctor Who as a kid and wondering why the interior scenes where always so clear while the outdoor scenes were always a bit grainy.
Doctor Who would've been a good example to use since it uses both film and tape. What I always found an amusing artefact of the video half of the production was the way bright lights would swear across the screen when moving.
If you watch Spearhead from Space, that was shot on film and then you look at the next Episode and the quality drops cause they filmed on tape
@@gussyt1761 The next story, not the next episode. Spearhead from Space is a story made up of four episodes. Yes, I know they were shown as compilation edits in the USA, but Spearhead from Space is not a Doctor Who episode, it's a Doctor Who story :)
@@Takeshi357 That is called comet-tailing and was an artefact of the imaging tubes in the cameras. It was greatly reduced with advances in tube camera technology, and then eliminated altogether when cameras switched to using CCDs rather than image tubes.
Of course it has a name.
Been rewatching The Waltons from the 1970's, it was shot on 35mm film and looks gorgeous on my 4k tv without any goofy anamorphic distortion. An amazing amount of detail put into those sets by Lorimar/Warner Bros. Also another way to tell if a tv show was tape or film is an announcer may say in the end credits something like: "All in the Family was videotaped before a live studio audience" or the credits will read to the effect: "Videotaped at Bueno Vista studios Burbank CA" or it may credit Panavision, Kodak & etc. if on film.
This is an awesome presentation, I'm a photographer and this is one of the most informative pieces I've seen. In such a short time, that's amazing.
My mind was not blown by that Last christmas remaster, because I bought West Side Story, The Wizard of Oz and Robocop on blu-ray. The image quality is so good, they look like they were filmed in the 2000's
You'll occasionally get some color weirdness from old film samples, but it's nothing a good mastering studio can't fix with time and money. I just feel really bad for the actors filming for Technicolor films, because the first couple generations required the set to be dramatically overlit to get adequate exposure. Those lights get really damn hot and make the entire set feel like a large toaster oven.
The Wizard of Oz was even a special case since it was shot on early multistrip Technicolor which was shot on basically on 3 individual strips of film. One for each colour. And that was then layered to a single strip colour print. When they scanned the camera negatives they had to do it once for every colour. And that also helped to eliminate dust and dirt a lot easier since the image restoration software they used compared all 3 pictures of all 3 film-strips for every frame and if there was a speck of dust on only ONE of them but not on the other two, it removed that speck of dust and made it look like the other two.
Kémy me prefer watching the original version of a movie not remastered version
or the beatles’ remastered movies...
I wonder if they will ever re-master the Thriller music video. It's very unusual for music videos to be shot on film in the first place, and when they did, it was often a promotional project from a film director who was already in the industry. Motion picture film stock is EXPENSIVE so it's only used for projects which have a very high return on investment.
On a somewhat related subject, Rolling Stone did a reprint of the John Lennon cover photo, taken in 1980. One of my younger co-workers was shocked to learn that was a 1980 photo. The quality was so sharp one could see the pores in his skin, along with the vivid color!
The year 1980 matches the birth of a generation.
It would MORE shocking if it was a POST 1980 photo all things considered.
That probably was shot on medium format and a higher resolution than any digital camera he'd ever used.
There are high resolution scans of Playboy Centerfolds of the 1980s. The level of detail is absolutely amazing. Those have been shot using Hasselblads back then, if I remember correctly.
@@splashstrike It was almost certainly was at least Medium Format (120/220 roll film). Professional photographers used 35mm for "field" work where portability was important (I.E. photo journalism). Fashion Photography was dominated by Medium Format, Portrait work was also dome at even larger formats such as 4"x 5" (or bigger!) cut sheet film using a large "view camera". P.S. There ARE Medium format digital cameras today. But they cost as much as a decent used car.
This guy just taught me exactly how photography and television works concisely in a way that I understood completely, all in the space of 5 minutes...I actually felt myself getting smarter 😂 you've earned yourself a subscriber my friend.
Re: "16mm only needs around 720p": I've recently looked into scanning 8mm Standard/Super and 16mm and found that even 8mm film can benefit from 4k scans! Incredibly fiddly work to get the best out of these smaller formats, but I was blown away by enterprise scanning solutions and how high-def they look.
The results also depends on the ISO of the film..
Low ISO gives better resolution, but more light is needed.
A 25 ISO film need to stops in better lighting than 100 ISO..
Full stops from 400 ISO
400
200
100
50
25
12
6
Very true. I would say that the pseudo-random nature of grain makes it act a bit like dithering in audio, in that the data shared between several frames in the noise actually adds a lot of detail. Therefore, to adequately scan a film, I would assume that all the grain has to be present in its fullest form to record the detail properly.
Speculation:
To go further with my audio analogy, I would imagine it's sort of like the Nyquist sampling theorem in that you'd need double the linear resolution, and therefore you would need to record the square of the actual resolution the film can display (if the film can do 4K, then you'd need to scan it in 16K (I haven't checked the maths)) in order to record it accurately in the digital world.
Yeah, 16mm can go all the way up to 4K. Especially negative film. But even ECO film still looks amazing if scanned properly in higher resolutions than just 1280x720. And keep in mind 16mm was mostly a 4:3 format. So you'd only really benefit from a 16:9 aspect ratio if it was Super16. I say if you CAN get a higher quality scan of a 16mm reel then DO IT. Don't be be cheap and scan your film properly. You can ALWAYS scale down from 4K afterwards anyway at the very end. And yes, the same goes for Super 8, Regular 8 or DS8. Even better if scanned with open gate and really getting the most picture information out of it.
I daresay processing 8mm over resolution could provide a higher resolution by processing a few frames together. The majority of frames are of the same thing with a slight movement - so a moved pixel can be compared a few times and averaged out to a better quality a bit like time interpolation rather than dimensional interpolation.
There's also a lot of early HD shows that were shot on 16mm film to get high resolution with a lower cost.
First, thank you for not Wham!ing anyone who have made it this far into Whamageddon.
Second, Star Trek: TNG did have the same issue mentioned about Voyager. All of the scenes were shot on film, then transferred to tape for editing and special effects. There was a demo Blu-ray released showing off a few of the restored episodes as a proof of concept that the SFX could be redone in HD properly before the full seasons were released. From what I recall, not all of the film for a few of the episodes could be found, so those episodes' scenes had to be upconverted and restored from the video tape master.
Third, I always enjoy your videos. Looking forward to the next!
Not only that, the studio decided to re-create all of the VFX for the BluRay releases from scratch. It was a monumental and quite expensive process.
I was going to mention this. All 80s/90s Trek was produced the same way. The reason Voyager and DS9 haven't been remastered in HD is largely the time/cost of the work involved, especially in the case of Voyager where a lot more CGI shots were used, all of which is lost now and would need to be redone from scratch. But it was all shot on film, and hopefully still all exists somewhere in their vault.
@@Xpndable And just like that, I now understand why the TNG box sets used to cost a proverbial fortune.
@@BlobVanDam Enterprise was shot on film, but scanned in 2K because 2001, so the Blu-ray release was a lot less work for that series. But a few of the VFX shots in the early seasons look way softer than the rest. I guess they had to re-render a change in a rush and did so in SD for those shots. I'm not sure if anything other than S4 was actually broadcast in HD, because 2005, but at least they had the foresight to do the scanning and FX at a higher resolution the whole time.
I'm so sad about DS9 and Voyager not getting a remaster, especially since they were framed to be wide-screen safe ahead of time. They would look so good on modern TVs! And it wouldn't have to crop the top and bottom off of the frame either, since the video transfer back in the day already did so on all sides. A wide-screen release would just partially un-crop the left and right.
@@kaitlyn__L I recall the CG shots for Enterprise were all done at 720 and upscaled to the 1080 master. I guess full 1080 CG for a TV series was a bit of a stretch for them in the early 2000s. :D Since the CG was already better than SD, it wasn't worth trying to redo it for the Bluray since you're still gaining from it over what people had seen on TV at the time.
I thought this was common knowledge, This is why we've had constant remasters of even 60-70 year old movies in 1080p and some in 4k
What a spectacular channel. What an excellent way to explain standard definition, film, high def, soap opera effect etc. I am overwhelmed by how such a complex subject was simplified as it was. Thank You for another truly great video.
I first became aware of this watching Twilight Zone reruns in the late 1990's and wondering why some episodes looked good and some looked like they were recorded using my family video camera. Turns out they used film but switched over to tape to see if they could save money. They couldn't and went back to film.
Not to mention that they also couldn't edit the videotape. The videotape setup severely limited the visual scope of the program.
My dude, you research SO HARD it's amazing. Thank you for all of your hard work. Such good content, very wow.
But seriously, the amount of effort you put into your videos is mind-bogglingly impressive. Thanks for doing what you do!
I bet this is where all these "New York City in 19XX 4k HD" videos are from.
Eh not nessisarely. A LOT of it is laserdisc, which supported HD in the 90’s
In this case that's not right. Those videos are enhanced with the help of AI. There are YT-videos showing the result before and after using this technology and believe me, the original (scanned) film has not a resolution of 4k or even close. And laserdisc - btw - has nothing to do with this at all.
@@thebravegallade731 Also, HD digital tape recorders. Yes, very expensive technology at the time, but it was available.
Nah, some of the stuff in the 90s was weird, rare, and VERY EXPENSIVE high-res tape recorders. Basically, things that were the precursor to D-Theater (HD VHS). The most viewed "New York in 199X in HD" is shot on tape, it's at 60i. It also has the kinda distinctive videotape color gamut.
Also, while the other ones are all on film, a lot are AI upscaled.
Not in all cases! Techmoan has done an excellent video on D theater (HD VHS Tapes) and he posted one of those HD NYC from the 90’s videos on his second channel. I highly reccomended checking out the explanation!
I remember watches loads of shows as a kid and noticing the difference between things shot inside and out. No adult I knew had an answer, but I suspected it was something like this.
What an awesome video. Thanks a lot.
Basically, most music video shot on film even in the eighties because musicians didn't like the cheap soap-opera effect of video, and/or wanted to make their clips cinematic. If the music video did got shot on tape that was because the band or publisher was on a shoestring budget, or it was originally a live performance somewhere repurposed as a music video.
Ayyyy that's me at 0:25! Thanks for adding my comment! :)
The man himself
Lol awesome.
useless ass comment
Ima be honest, that was a pretty shit comment
Give me the link of the original music video
I'd argue 16mm can look good at 1080p, I own the sole original Doctor Who serial shot entirely on film, Spearhead From Space in blu-ray format and while yes it has a definite grain to it it still looks very good at that level of scanning. Its an excellent preservation level transfer.
I mean Fruitvale Station was shot on 16mm
ua-cam.com/video/3Nh9BTMWj9M/v-deo.html Even super 8 with a professional camera can look full HD.
Discovering the existance of an entire playlist about analogue TV at 1AM is not the great news it seems. This is going to be a long night
Another thing about motion film - because the grain is random you can get vastly higher resolution by interpolating multiple frames, so there would even be a point at scanning at FAR higher than 8k
high end film can easily go beyond 8k
Rank, one of the film scanner manufauctuers, said they estimate standard 35mm can resolve about 6K but the optics required to put that much information onto said film are almost impossible. MaxiVision, a short-lived film projection system, said Geneva movement projectors (most projectors), even with ideal lenses, only resolved about 2K because the film flexed so much going through the gate and that simply using a better movement can bring it up to 4K. Even Super-16mm with a top-quality lens can yeild almost full-HD resolution but without the annoying motion artifacts of most current cameras.
@@stephenbaldassarre2289 those numbers may be true for a single frame, but the grain in each frame is in a different place, so across several frames there would be a point to using MUCH higher resolution
@@mycosys That's true to an extent but you are also limited by the uniform scattering of light through the emulsion itself. Then you really start splitting hairs because only one color layer (blue) can achieve full sharpness. Modern emulsions are so thick in the name of low-grain, you'd be lucky to maintain 50% MTF even at 4K, regardless of frame rate. Don't get me wrong, I love film and really miss shooting on it, but there's a lot of factors involved beyond the photochemical particles themselves.
I'm always impressed how good the 4K version of "2001: A space odyssey" looks. Even how sharp the video on the 1960 IBM tablet computer flatscreen looks
It looked amazing in 1968, on a pristine 70mm print and in a good theater.
@@joesterling4299 It was a good looking movie, but even the critics said it was sooooo borring. That is until Star Trek TMP came along, which was sooooo sooooo borrring.
This was so cool for me, thank you! When I was young, at some point I started to notice that some shows looked "different" from other shows. I didn't have any understanding of resolution or those concepts, I just always knew that there were some shows (like MASH or Star Trek) that looked one way, and other shows (like The Jeffersons or morning soap operas) that looked another way. I couldn't describe it, there was just something in how they looked that felt different.
Over time, from simple process of elimination, I started to work out some patterns, like the fact that movies shown on TV always had that first look, and TV only sitcoms often had that second look. Some shows (like Dr. Who) looked the first way for any outdoor scenes, and the second way for indoor scenes. I don't know how old I was when I figured out that the difference was that that first "look" was things shot on film, and the second "look" was things shot on video. I felt proud of my deduction skills and moved on. But I never actually understood why they looked different, what was that indescribable quality that made one feel different from the other.
So watching this video brought me right back, and finished the mental journey I had started... a good number of years ago, but this time with some actual science. Understanding how the random, natural orientation of the silver particles on film lend it a sort of underlying texture that differs from the concrete, geometric pixels of video makes so much sense, and puts words and meaning to something that my younger self just felt instinctively. This took me back, and I really appreciate that. Very glad I found this channel.
If you're old enough to remember the phrase "Film at Eleven" on the 6 o'clock news it was as follows: The news anchor would give as much detailed information about an event happening in the field but the actual film wouldn't be processed until it was back in the studio and then shown during the 11:00 news.
Technology Connections: where you already know how something works, but still want to watch 21 minutes of Alec explaining it to you anyway.
One good example of film vs. tape: The Twilight Zone Season 2, and the handful of episodes CBS shot at Television City on tape to "save money". Forever stuck in 480i, with all the fun side effects the Image Orthicon tubes created.
That's the example I always pull out when I talk about the "feel" of film vs. videotape. Those videotaped episodes almost feel like they're not the same series. It just feels wrong. The recent live stagings of Norman Lear shows on ABC also felt very wrong because those series were originally videotaped. I could go on and on with gripes about framerates.
Don't get me wrong, I love videotape (and even tube camera effects), but I also am very sensitive to the "look and feel" of something.
This is the first thing that came to mind the moment he mentioned the soap opera effect! Could never put my finger on what made some of those episodes feel so weird
The TZ tape/film comparison shows another strong difference, which is the tonal rendition. Film has an S-curve of contrast, where mid-tones are most contrasty, and contrast gradually reduces as you go into the highlights or lowlights. The old black and white image orthicon video cameras had no precise tone curve correction, only effects similar to early Xerox copiers.
Later image orthicon color cameras combined with CRT (picture tube) displays had essentially the shadow part of an S-curve, but highlights would go straight up to maximum and then clip. Later plumbicon video cameras had much better tone curves, but still were usually pretty crude in handling highlights. Today's digital cameras, both still and cinema, with the post processing that produces the final output, emulate film's S-curve and do it quite well.
I see you caught this episode before I could. Really hope I can join your video team at next years MFF. I know how to do a video, really I do!
Ya know, I’ve seen the SciFi channel broadcast what has to be the original Ampex video recordings of those few taped shows.
For decades in syndication they must have only ever shown the Telecine (filmed TV screen) versions of those episodes. But what I saw was definitely the actual videotape because Serling was on the set when he did the intro, and when the camera panned over from the actors to him the ‘soap opera effect’ was obvious.
In fact, the SciFi channel seems to show both versions of these episodes, Telecine & videotape, for some reason...
Another reason for using film was that it was easier to produce a good quality PAL version for export. Most of the American shows taped on NTSC equipment looked somewhat blurry when it was converted to PAL.
Yeah thank god for PAL without the need to add "dropped frame" bullcrap. Especially if you shot at 25fps (like many TV productions) it was a lot easier to produce a 50i picture from that filmsource.
KRAFTWERK2K6 It ain’t perfect though, films had to be sped up by 4% to match the 25fps signal whereas NTSC had to create a 5th frame via 3:2 pulldown. Seems more like a “pick your poison” situation imo.
NTSC - Never The Same Colour
@@charlescampuz5812 Only films shot that were shot in 24fps. TV Productions were often shot on 25fps. And i think even some theatrical features as well as it was fairly simple to convert that to 24fps theatrical prints. So the movies where not sped up on TV and video. And TV shows often were converted from NTSC 29fps sources to PAL which resulted in weird blurr and field ghosting. YOu could always tell when it was a US Series or a show from PAL territories. The pal Pitch problem got much worse in the late 90s and early 2000s when digital format conversion became a thing and everything was badly converted to PAL. Extremely noticeable in the later seasons of "Friends" over here where the opening music suddenly run a little pitched. The problem was enhanced when the dubbing was actually done to these already converted 25fps masters which means on Blu-ray all of that now runs 4% too slow while on TV it runs ok.
@@KRAFTWERK2K6 most movies broadcast or sold in PAL/SECAM regions were sped up to 25fps, saving on any conversion costs. There was a noticeable pitch difference, but only if put side-by-side. Things got weird when they started using time compression in the mid-90s...
19:45 The camera's comedic timing is one of those things that happens occasionally that you could never convincingly argue wasn't staged.
film-grain: the director's cut DVD of Star Trek The Motion Picture contained CGI additions/replacements that were put through a grain filter (based on analysis of the grain structure of the original stock) to better blend in with the original material.
Hello.
And yet Star Trek the Motion Picture still was a boring as ever.
That's great to know. A lot of modern TV and movie shows, even though shot digitally with practically zero noise, have grain added to them just to give them a more filmy look, rather than artificial, plasticky shiny-and-new look.
too bad they did not follow the advice given at the end of the above video, which is why we don't have the Director's Cut in HD (or better)
@@silkwesir1444 Apparently there was a UHD (and Blu-ray) release of the Director's Cut in 2022? TIL. I'm definitely going to need to get my hands on that; the DVD is one of the very few DVDs I bought after upgrading to Blu-ray a decade ago.
I think one way to elaborate on "You can tell its video because it looks like video." Dynamic range is a good test - and its something you still have to have an eye for, but video productions usually aren't as vibrant as stuff shot on film because the dynamic range they can capture is smaller. Instead of having perfectly saturated blue sky, it might look more overcast even when it wasn't. Especially true when we're sticking to the "up to the late 90s" era mentioned in the video.
It certainly gets murkier with super high end large format digital cinema cameras from Arri/Red since their relatively huge pixels have increased dynamic range too, but for generic betacam SP video cameras vs actual 35 or even 16mm film, the dynamic range is a big tell.
here's an image that compare the dynamic range of digital camera to film camera: techxplore.com/news/2015-10-quantumfilm-based-image-sensors-cameras.html
“IMAX and you should just give up trying"
Had me floored there for a minute haha!
I agree, for now. I was surprised he didnt say "for now" though. 16k scanning will be a thing one day. It wont be needed for the home TV, but VR one day can benefit
As a film photographer i can say that yeah, IMAX will definitely need at least 2 decades of hardware improvements before becoming obsolete. Even 70mm is easily 8k worthy, if not higher depending on film stock.
I bought a couple IMAX frames from their gift shop and it's just 70mm film that is a bit taller.
What's the difference between IMAX film and 70 mm?
imax would likely need 32k
every minute would consume a blue ray ?
The film/video difference was something I always noticed when watching Doctor Who as a kid. Whenever Tom Baker was in the Tardis, it always looked different from when he was trouncing around a rock qua-alien planet.
I'm honestly proud to say that I finally have enough expertise that I knew almost all of the information in a Technology Connections video! lol. Lots of great information here and as a film photographer this is definitely going to be THE video to which I recommend people when asked what the benefits of shooting with film are, or why I like the look, etc. It's the perfect one stop, inclusive video for all the information I usually stumble around when describing it to people.
I've got 10,000 slides that my father took in the 60's to 90's. I was early in the scanning game. Who would need greater resolution than 900x600 pixels? That's already larger than your monitor! The problem is that slide scanners peaked around 2005 when digital photography wiped out film. There are no cheap/good/new slide scanners! Kodachrome has incredible resolution if you have the [very expensive] scanner for it -- and the time to scan slides one at a time.
Or money to pay for it to be scanned professionally, and trust in the company doing it not to lose/destroy your family memories.
Or use 1000 dpi on normal paper scanners or one that are dedacited to film. My frampa has one
Long term data storage is so interesting
From what you are saying this seems like a real problem. Hopefully some kind of enthusiast made solution will come out in the future, a kin to floppy to SD card reader for the retro computer community.
@@zsin128 dedicated*
One thing about Voyager's post production: I'd guess that post production included CGI scenes and hoo boy would I not envy the guy whose job it'd be to find the original files on some Amiga or SGI's hard drive and attempt to re-render them at higher definitions.
It's worse than that. They typically re-create the shot from scratch. Even if they did have the CGI files, they would be of little use as there were no real standards for that data. Most of the hardware used in the 80s and 90 was SGI, Silicon Graphics Inc, basically defunct as of 2009.
The 2k remaster of The Next Generation was hugely expensive and all the CGI had to be re-done from scratch. If you watch the final result it is absolutely stunning. However, DS9 and Voyager used much more CGI and the original digital assets have since been lost: www.treknews.net/2017/02/02/why-ds9-voyager-not-on-blu-ray-hd/
Great Joe I saw a video recently of an Amiga that got thrown out by a production company and they found raw footage from Titanic on it.
Pretty sure voyager didn't touch any amigas. This is heavy SGI stuff.
Babylon 5 on the other hand...
Andy Delle not so bad actually, it was made in on SGI hardware, but when SGI went under the software was ported to windows and mac, and it’s possible to import from older versions into newer versions. Someone who still had the assets opened some of them up and hit “render” in HD and said it would be a lot easier than the Next Gen HD remaster, (where all the assets were lost.) a few years ago. The images are online.
I’ve always thought of tape footage as “glossy” while film footage is “flat”.
how about gooey vs prickly?
Haha working in tape since it was invented we never used the term "footage" when referring to tape. We called it simply a recording, segment or clip (for a short piece). Because both film and tape could be physically cut u could call them clips. I used to edit by actually cutting tape
@@contradictorycrow4327 what "sick cuts"?
@OP Yeah, probably because video is straining its dynamic range slightly, giving it more contrast at the same settings. Film has more dynamic range, letting you "fit it all into video" or "flatten" the brightness values
@@silkwesir1444I tried to compare and realized that it's all just gooey prickles and prickly goo.
In the 1990s, a lot of shows were produced on film because they were aware of the impending rise of HDTV. They knew a higher standard would be coming, they just didn't know the exact specifications. And film was the easiest way to get an HD product. This is why Warners did a lot of their television animation on film until around 1999-2000 (Batman Beyond switched from cel to digital ink and paint at this time). Basically, with film, you could rescan the negative to get a higher quality, but you couldn't do that if you just had a video file.
In general, most dramas in the US were shot on film because it gave them a more cinematic look, since tape tended to look really cheap. Tape also wasn't as good at capturing color, and had a lower dynamic range. For sitcoms, there's more of a spread, but it came down to network preferences and the production houses. But still, there were example like Cheers, where they stuck with film even after considering switching to videotape. During the first season, Paramount wanted to consider shooting on video to save money, since film is expensive and the show was dead last in the ratings. But the results were terrible, so they stuck with film. And now, we have Norm in glorious HD.
kenlevine.blogspot.com/2016/04/another-thing-about-cheers-you-didnt.html
Batman Beyond just got a Blu-Ray release!
I noticed it when I watched Seinfeld in later reruns broadcast in HD widescreen.
Unions representing filming crews had a lot to say about it. Video shooting required less qualified workers so they put a lot of pressure on producers to shoot on film.
"Whaddaya mean, it sounds dubbed in?" You are so adorable. :)
Seinfeld was shot and mastered on film and there exists HD transfers of it but only for tv rebroadcasts and digital copies. I want a Seinfeld Blu-ray gosh darnit
I am not sure I would recomand the HD versions since while the horizontal screen was expanded, some of the picture on the vertical side was lost because they can't fit that on HD screens fully.
The DVDs still preserve what was originally intended even though it is not in HD.
@@1685Violin If they ever release it on Blu, they can offer the original 4:3 versions again. The series was already remastered in HD for the DVDs, if you believe the back of the boxes.
I have tried to explain to people for so long that say HD is better than film. And they just cannot comprehend that the original film was in the best HD it's ever going to achieve
Even if something is shot on film, btw, 8 mm is *_nothing_* like 70 mm film; the latter can be blown up to billboard/cinema screen sizes with literally no apparent loss of quality at normal seeing ranges in most people's eyes (not just the average person).
Yeah, 70mm has so much quality that you could name an 8k digital intermediate with it.
[cue "Also Sprach Zarathustra"]
Evil Dead 1 was almost shot on 8mm. Until they realized just how abysmal is looked when blown up for a 35mm projector, and so they went with 16mm.
Yes, apparently a lot of television was shot on or at least mastered to 16mm film, so film does not guarantee 35mm quality.
@@leocomerford Still, in my opinion, 16mm gives a much more professional "low budget" look as opposed to video, at least for movies. Brad Jones calls it "shot on shitteo," though I think that's referring more to movies using home video vhs recorders and not studio video.
So is nobody gonna ask why he has so many pictures of goats?
Goat pictures: because *Lazy Game Reviews* - _LGR Plays - Goat Simulator_
ua-cam.com/video/j4HY_fxwiZo/v-deo.html
*Laughs in Aberforth Dumbledore*
Don't kink shame.
🤣🤣🤣🤣🤣🤣🤣🤣I literally laughed out loud at work reading this
Because goats a cute. I saw some llamas in there too. Also cute.
I think you are overly harsh on 16mm! 720p isn't nearly even enough to capture Super8 satisfactorily. The exact same phenomenon with the music video can be had with old home movies that were badly scanned. Due to the smaller frame, the grain is of greater significance. At lower resolution scans, the grain interferes with the scan resolution and gives horrid results. People often soften up transfers to reduce the effect and so it looks like the original is of lower quality. Even at 1080, this is still not enough to capture the grain on most stocks. You need to think of transfers in a similar manner to audio, with a Nyquist frequency I feel. A 2k scan of Super8 will ensure you aren't throwing away any of the original. Without including the grain in the scan, the result is like an air-brushed model, it's also why modern digital productions can feel lifeless and less engaging visually. With film, every frame is different obviously. This has been noted and in some cases, special noise is introduced back into the digital image to make it less artificial.
There is also something called CCD noisiness or CCD grain factor for digital cameras. That describes how much is CCD sensor resistable to the grain noise (randomness in input signal). Essentially smaller number means there is less noise in the image. Of course the ISO settings alter the noise amount, however this effect wasn't intentional and it is simply result of amplifying or decreasing the input signal from CCD sensor, which already contains the noise. That's also why so many night photos are so much more grainy and noisy than in daylight. You can reduce the noise in post production, however you still need to leave some noise in order to not make the photo look artificial. Today, the noise can also be simulated in computer graphics, and the end result is so convincing that the rendered image looks like a real photo.
@@CZghost What you're describing isn't specific to CCD sensors, it exists in all digital sensors. And low noise is generally considered highly desirable in a digital camera. The reason photographers don't go crazy with noise reduction isn't that they want to keep the grain, but that heavy noise reduction also kills detail. I'm sure a lot of photographers would gladly throw away any hint of noise if it was possible.
@@23Scadu Photo without noise just doesn't quite look right. You can for sure say that there was some sort of artificial processing done, it looks fake. The noise makes it look real, even it's completely fake. Too much noise kills the details, but very little noise to none also kills the aesthetics. It makes look plastic, artificial, fake. That's also reason why people add noise to photoshopped montages, or at least don't try to make the stocks used super clean of noise. It has to sell it, it has to look real. You can obviously tell it's Photoshop, especially if it's art. But it shouldn't look like montage, it should look like photo, produced with professional camera and maybe with little enhancing effect added to it. Best screen wallpapers are photos. Why? Because there's something that people prefer over artificial stuff. The aesthetic.
There was an ill-fated set of FUNimation DBZ Blu-Rays (the "Level" sets) that re-scanned the 16mm film from Fuji TV, got rid of the gate effect, and lightly cropped any edge degradation. It looked brilliant.
As if... Lol. 8mm looks like 240p. Any wonder since the frame is smaller than your pinky nail. 16mm is ok but definitely ot 720p.
Any kid from the 1970s who was watching a documentary in high school science class would be absolutely blown away at 720p. The only thing of the era that comes close are 35mm slides and filmstrips.
A year on and I’ve just got the “learning Ancient Greek and Latin” crack. Apparently the word ‘Television’ caused some offence to some stuffed-shirt classical scholars in the UK at the time.
No comprehend. You're referring to Greco-Roman origins, tele- + vision? Why offence?
@@robertjenkins6132 as I alluded, back in the 1920’s at the dawn of Television, the admixing of Greek and Latin to form a new compound word was considered by some to be etymological heresy. Classical Greek and Latin were considered “pure” by some scholars, (a pre-Victorian construct) regardless of the roots of those languages.
@@MorgoUK Funnily enough I know T-Shirts saying "Polyamory is wrong! Never mix greek with latin; it's either Poliphilia or Multiamory."
Which of course alludes to stuff like this.
@@profezzordarke4362 There's a homosexual joke that's the same way.
It’s just a direct quote pulled from Walt Disney’s Carousel of Progress
Can we talk about some of the amazing examples of TV shows that were actually filmed on...film?
The Sopranos. MAN that show looks magnificent. Not surprised it was done on film.
The X-Files. Bit of a mixed batch, but most A-shots were done on film while B-shots were done on tape.
That 70's Show. Fuck yes.
The E.R. Holy heckins, yes.
Seinfeld and Friends were also shot on film, too.
@@Airlane-rq9yd I notice it constantly on Doctor Who.
Miami Vice and Baywatch were also shot on film.
Mission Impossible looks fantastic in HD, too good in some scenes. You could see the wires holding up a "drone" in a air ducts in one scene. They wouldn't have been seen on TV.
it was mostly stock footage that was sourced from video. That was standard for pretty much every show since the 90s.
18:56 Ok, now you have to do a video on IMAX.
Its history is really interesting and started with a multiprojector moving stage system
@@default2826 I once got to the bottom of it, around the time "Dunkirk" came out. As it turned out, an IMAX film potential resolution is somewhere around 18k.
@@nostalium Indeed, although the big problem isn't the resolution... it's the film itself. A full length feature film on IMAX weighs 250 kg (550 pounds).
@@AlexRMcColl the movie did deal with some heavy topics
IMAX is pretty cool, but I'm not sure what he'd really talk about. It works in the same way that 35mm film does, just with a MUCH bigger size of film and it's shot horizontally. I guess he could talk about how the soundtrack is separate from the film print, but I mean, I just talked about it in a UA-cam comment. But if you're still interested, look up a channel called Analog Resurgence, I believe he talked about it. And he's also just a great UA-camr who talks about both movie film and still photographic film.
Nobody in the 80s was routinely saying broadcast TV was fuzzy, at least in the UK. Standard definition (625 lines) on a 14 inch cathode ray tube TV or even a larger one looked okay, it's the huge, hi-res screens people watch TV on today and the tendency to zoom in on a 4:3 picture to make it wide-screen that spoils old TV footage and gives a somewhat false impression of how it appeared to the naked eye in the 80s.
unfinished television current TVs don't display games right either. Probably the way they make the picture causes the worse quality now.
Exactly. I worked on professional VCR's in the 90's and 2000's and a 625-line studio quality image shown on a pro monitor was *incredibly* sharp. Unfortunately some detail always got lost in the transmission chain, mainly due to the way colour information was bandwidth reduced by the PAL system.
Very true, I always get annoyed when stores selling TVs use a standard definition format signal and mirror it to tons of large HD TV's, how can you show the clear image and high definition if you're blowing up 4:3 standard definition?
Luckily that kind of thing seemed to be a late 2000's thing and by the mid 2010's most stores who took themselves seriously actually upped their game.
I think more than the basic resolution of the scan, it's the quality of the film scanning equipment, the skills of the person doing the transfer, and the colorist working on it after the scan, that determines the quality of a transfer.
I always remember noticing that shows like "All in the Family" had an immediacy because they were shot on tape, which everyone associated with something news-y, and shows like "Mary Tyler Moore" or "Bob Newhart" were shot on film...the fact that the shows on tape were always (supposedly) shot before a live studio audience, gave the whole thing a cinema verite feel (plus it was cheaper). But reproducing videos at high speed transfer rate gave everything a fuzzy look. I always assumed people knew this stuff automatically, but I forget people who arrived on the planet years after me don't know all the stuff I do from just being around longer. (I'm only halfway through your video but I had to stop and comment. I do this to channels I really like. I just had yours recommended to me yesterday. But you may never see my comments since UA-cam seems to disappear them before the channel creator ever sees them. TLDR I suppose)
If you look at the original glass plate photos or daguerrotypes, they have incredibly sharp focus.
Because it was shot on tape, MTM and BN still look nice in repeats. All in the family is looking blurry if it is repeated.
The Jefferson's on metv is terrible recorded on tape
Actually, I meant shot on film.
Except not in Australia, where imported US tv shows like All in the Family were converted from NTSC to PAL which gave them a mushy look, by far the worst look on Australian TV. Australian taped TV shows had that clarity and immediacy you mention because they were broadcast in PAL without conversion. It was amazing to me when I went to the US and saw how taped shows looked without broadcast conversion. They looked as clear and sharp as Australian taped shows.
I don't think I've ever seen AITF not look bad. I'm guessing the original NTSC copies look far better, but the PAL conversions look abysmal.
Thankyou for not furthering the myth that film has ‘infinite detail’ / ‘more resolution than digital’
Great video
Well, you could scan film to the point that you can see the individual molecules, or even atoms, of the film stock. That's pretty much infinite, isn't it?
@@niek024 - Beyond film not giving any more details at that level, if the atomic level is the theoretical maximum, that is, intrinsically, finite.
@@kruks Well, there's always quarks and stuff... But I agree that somewhere down the line there's no longer extra information about the image.
It’s not really a myth though. The problem is that people confuse what is being said when we talk “resolution” Film doesn’t have one. It just is. It’s pure physics. Digital technologies have a discrete amount of information contained in their captures. A 25MP photograph will only ever have 25 million pixels in it. Full stop, end of. It’s pure math.
Meanwhile in the film realm the grains of film can number in the *billions* you really are only limited by the sensitivity of the medium and the size of the negative. So while it’s not correct to say that film has a higher “resolution” than digital, it’s not a myth to say it doesn’t have infinite detail. Theoretically, we can get film grains down to the size of single molecules, such that we would be having individual photons interacting with individual grains. Practically we can only get a few orders of magnitude above that level. That is still a shocking amount of detail.
@@Mostlyharmless1985 Lenses have a resolution limit also, as they are not perfect. Just try and compare a cheap 35mm camera with plastic lens to a SLR with high end lens. Even now manufactures are trying to make "4k" lenses for cinema cameras.
I really appreciate the use of the bloopers roll in your work.
Thanks for the work that you do in producing these, they're always informative and expand my knowledge of an area.
Being probably twice his age or more, I knew the answer, and was so impressed he got it right. B@W shows filmed on 35mm are stunningly detailed, esp. Perry Mason, from later 1950s.
The HD (and widescreen) versions of Friends are always interesting because, apart from occasionally spotting something that wasn't meant to be in shot, you can occasionally spot the angle that was clearly not on 35mm film like the rest of the cameras. Not sure if it's 16mm or video (I'd guess the latter).
@Jay It was always understood that you could let stuff like boom mics get into the image, as long as it was in the overscan area that wouldn't be visible on most TVs. There was a "safe area" that was smaller than the full frame. But there were a lot of mistakes anyway.
I am pretty sure Friends was entirely shot on film and that any shots that are upscaled are simply where a camera negative went missing - it was originally edited on video and to do the HD version they had to entirely reconstruct the episodes from the raw footage, and it’s easy to believe some pieces got lost. On the Batman Beyond blu-ray set there’s multiple episodes presented as upscales because they simply couldn’t find the footage
Possible I guess. They had quite a few cameras for Friends and so I just assumed it was the 'just in case' angle. That or where they needed to place the camera required a smaller camera than the normal film cameras.
Universal shot their 1980's TV shows in an open matte ratio with the intent for widescreen exhibition. I have cropped episodes of Airwolf, Magnum P.I. Miami Vice, Knight Rider in the 16:9 ratio and it looks perfect in widescreen. Apparently they were preparing ahead for the HDTV transition when the standards were still being finalized and standardized.
I’ve been calling tape ‘digital’ for years and years and I’m so happy to finally understand fully the difference between the two
Right? It's interesting to find out electronic doesn't mean digital.
But you know what's even trippier? Optical doesn't mean digital either. Laserdiscs were basically giant CDs but they stored analog video, as some sort of higher-quality (but not HD) VHS tapes.
Is tape not digital?
@@bluesailormercury And add to that some LD's had Digital audio, but the video was still Analog.
tape HAS been digital throughout
the TV industry for many years now. and even in 8mm domestic formats like DV based on the low end pro format. DVCAM. refer also to. compressed digital like BETACAM SX or the uncompressed broadcast format DIGIBETA.
@Scott Luther snd DigiBeta and BetacamSX and DVCAM and miniDV ... tape HAS been digital for years!
19:46 Always perfect....
Camera: No screw you
The comedic timming of that camera was perfect
@@CapPotato388Seriously, it honestly feels staged.
I'm not surprised at the quality of the film, but more surprised someone would take the time to make such a good digital transfer.
0:46 Indeed, I was alive in the 80s and I can guarantee _everything_ looked all fuzzy all the time, whether on TV or in person. (Thought it might be related to the fact that I was highly myopic.)
It was actually the mountains of cocaine.
5:58 that doesnt look creepy at all
Nah its handsome. ^_^ This Technology Connections Dude is awesome! Lmaao. :P Its a little creepy, but hilarious for stoners into this stuff. :P
Wasn't that the thumbnail for "A Tale of Two CD Players"?
@@ArtisticAutisticandAiling U ok bro lol?
Roberto XS Yes it was!
Just the horror-clown-doll grin.
On film in the streets
On tape in the sheets
Kinky ;)
I prefer early-2000s internet video in the sheets: xkcd.com/598/
I dont know how I got here, but I could listen to this guy talk about anything and still be interested.
A lot of younger Star Trek junkies first experienced this when they discovered why Deep Space 9 doesn't have a proper HD release.
The remaster they did for the documentary is gorgeous. I really wish they could do the entire series.
Joseph Davies This x1000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
I was asking that question when watching Buck Rogers in the 25th Century on MeTv. It looked like it was filmed with modern gear. Thanks Tech Connections!
I was watching Perry Mason on MeTV(?) and noticed immediately you could make out the weave in people's clothes. Knew it had to have been filmed.
@@d3v1lsummoner Its mind blowing growing up in the 80's seeing these programs at 480 and now HD. We need a 4K MTV channel that only plays videos from then.
Universal actually did a Blu-ray transfer for Buck Rogers. It looks pretty incredible.
The first time I watched _Frosty the Snowman_ on an HDTV, I swore it looked like it had been traced frame-by-frame in a digital animation program, it was so sharp and perfect. Not so much as a visible cel shadow.
The Jefferson's looks like crap, must be tape
I think it should be worth mentioning that in the late '80s and early '90s there used to be a middle ground between SD video and film, which is early HD video, most notably with the Sony HDVS system. A great example of that system would be the recent Blu-Ray release of Toto's concert at the 1991 Montreux Jazz Festival (a clip of that could be seen here: ua-cam.com/video/S0QDB9FtnUM/v-deo.html ). Many people thought that it was shot on film, but actually it was early analog HD 1125i.
Another example is that New York in 1993 video that Techmoan showed on his video about the DTheater DVHS, it was also shot with Sony HDVS.
Still looks very video, not film.
And I agree with you, it does not look like film, mainly because of the fluidity of motion (higher frame rate), the trailing lights typical of tube video cameras and the lack of film artifacts. However, there is also a preconception that you can't have HD video before 2000 unless it was shot on film, which you can see in the comment section of the aforementioned video, in which there are a few people who insist it was shot on film when it clearly wasn't.
There's also Genesis on Wembley Stadium that the credits on the video claim it was shot in Sony High Definition tape, which unfortunately has never been released on Blu-Ray and no HD preview of it exists, and a video Techmoan showed of New York in 1993 that is obviously tape since it runs at 60 frames per second.
@@jonessperandio Also, it was interlaced video, isn't that what gives video from that time the soap opera effect? What are those "trailing lights"?
The video clip for Last Christmas, though shot on film, seems to be largely devoid of film artifacts.
@@Myrtone Maybe... those trailing lights are a result of something like a burn-in produced by powerful stage lights on old tube camera sensors, it doesn't happen on modern CCD/CMOS sensors.
As for the lack of film artifacts on that videoclip, my guess is that either it had a good restoration job or it was filmed on a good film stock... many movies from that period were filmed on trash film stock, so you can really see the artifacts when they're scanned in high resolution.
Your grasp of this subject is incongruous with your apparent young age. Amazing.
I'm 20 and I shoot Super 8 and 16mm film. Celluloid film has actually been making a comeback over the last five years or so, especially with the new generation who are discovering it for the first time. Probably more 20-35 years olds shoot film than 35-50 years olds these days.
11:34 - Hold on there! TNG was also edited on tape. This is why they had to re-edit the show when making the HD transfer. So no, Voyager isn't really stuck in SD for all times (even its CGI special effects, IF they preserved original files).
Voyager will be stuck in SD because of poor TNG Blu-ray sales making the cost of re editting it not worth it paramount said.
I don't think old CGI effects get any benefit out of HD, if anything it makes them worse
The files were not preserved. All the effects houses (Voyager used half a dozen concurrently at some points) went out of business or were bought, and their archives went in the bin at that time. They also all used different software from each other, almost none of which is compatible with software today. Even when the software still exists, conversion is necessary to update the file type to the new version. Even lossier conversion is necessary from all the defunct file-types, like loading a Word file in LibreOffice and the tables aren't lined up quite right anymore. It's cheaper to just make new models from scratch, and that's still hugely expensive.
I do hope CBS changes their minds about remastering the other series down the line, but for now they've stated the TNG remaster didn't sell as well as the TOS remaster, and there's currently no plans to do the rest. Probably because the remasters showed up on Netflix and now All Access, and people waiting until each season is $15 on Blu-ray instead of the original $50 or $60. So to change their minds, they either would have to get enough mileage out of viewer figures of the remaster on All Access, or residual sales of the Blu-rays build up over time or something.
@@charmedx3219 TNG's awesome that it was recorded on film... and the remaster/reedit effect were outstanding. It's lacking in the fact it was NOT shot in widescreen! I've heard from many in the Star Trek community that THIS was the reason for the low Blu-ray sales. You can make it all pretty, but can't make the damn thing widescreen to fit our 82" QLED TV's these days! They were already spending $1 million an episode to produce, spend another few thousand to record it widescreen!
TNG was edited on tape and the effects were rendered to tape, but all of the live action shots were filmed, and virtually all of the effects shots were filmed. When they remastered it, they went back to the film and re-composited the effects shots in the digital realm, which was crazy expensive but resulted in a stunning final product.
For Voyager this would be difficult. I think in its first couple of seasons there might be film elements for its effects shots, and the live action was filmed, but in later seasons the effects were done with CGI onto tape. In order to remaster Voyager, you’d have to recreate all of the CGI, which would cost millions.
A point to consider: While the vertical resolution of the NTSC standard is set in stone at 480 visible lines, the horizontal resolution can vary *greatly.* It can be as low as 200 pixels for VHS, as high as 720 pixels for DVDs, and potentially higher. It all depends on the bandwidth of the video recording (and the bandwidth capability of the playback hardware).
To put it another way, NTSC video is a single line of analog data, which gets rasterized into a stack 480 segments high (more accurately, two stacks 240 segments high per interlaced frame). That stack height is fixed, but the density of data is not, and as a result, the horizontal resolution is variable.
A valid horizontal resolution on 480px or 576px DVD is 720, 704 and 352.
So DVD supports that lower pixel size you are referring to.
@@MichaelEricMenk You make the erroneous assumption that pixels are always square. That's true in the digital video world because we all agreed that it is very convenient for those pixels to be square (or somewhat close to it), but in early computer graphics, horizontal resolutions varied wildly depending on the system used.
Analog video doesn't have a resolution in pixels to speak of but there is a difference in bandwidth which is effectively the same thing. The vertical resolution can be fixed because that is a relatively low bandwidth (and synchronization would be very difficult otherwise) but the horizontal resolution is pretty much undetermined and depends a lot on the source material and the processing chain used.
@@Stoney3K wrote: "You make the erroneous assumption that pixels are always square."
No I did not, you brought this up.
Stoney3K wrote: "That's true in digital video world because we all agreed that it is very convenient for those pixels to be square"
You have not read the standards I see... NONE of the resolutions in the DVD standard have square pixels, or the MPEG-2 streams used to carry standard TV digitally to the TV-antennas.
@@MichaelEricMenk "A valid horizontal resolution on 480px or 576px DVD is 720, 704 and 352."
Granted, those are not *square* pixels but they are a fixed amount only defined by a certain digital video standard (otherwise the DVD format would not be able to compress it). They are always a fixed size.
The point is that video only has predefined lines and predefined pixels do not exist, much like an analog sound source does not have a sample rate, only a bandwidth that is defined by the source material and not by the standard of the system that is being used.
Re Imax: I worked at Ontario Place one summer of '73 and one of the CInesphere projectionists gave me a tour of the Imax projector (serial # 001 IIRC). It used 70mm film run horizontally to produce the largest images possible with commercially available movie film. What was especially neat to me as a student of audio electronics was that the normal audio track on the film only carried engineering (timing) signals to synch it with a 24 track tape deck (which also used one audio track for synch), producing not only far better quality than was possible from a conventional film's optical sound track but a multi channel audio that allowed the producers to put the sound wherever they wanted it to be in the theatre.
I like how this channel is squeaky-clean, maybe a "damn" in the bloopers, then you show one youtube comment screenshot and it's all giant F-bombs on the screen.
I would like certain SD shows to get re-released on Blu-ray even though they can't be made "HD" because they tried to cram so many episodes on the DVDs that the quality is objectively worse than they were on TV. It's a real shame companies don't see the benefit in doing this.
It's rare but i've seen a few SD blurays, mostly old Anime shows or concert videos.
Probably still be cheaper to just not cram so many episodes on each disk and use a higher bitrate. Not to mention not having to field the inevitable complaints from people who thought they were going to get HD video because Blu-Ray.
I loved Blu Ray disk up until today.l bought the new Rambo movie, would not play in Blu Ray player because I had to do filmwere update on player.i will eventually do the update,I suggest when buying movies make sure you get DVD as well because DVDs will still play,most of my Blu Rays would not untill I get update. This is not good,this reminds me of a format back in the 90s called divix that was sold at circuit city witch was a flop and forced them out of business. When they went out of business there player's stoped working because there was no more updates,this could happen to Blu Rays and there could be no more updates if they stop making player's because of Internet streaming of movies,that would render Blu Ray DVDs useless, so I'm thanking twice about buying a bunch of Blu Ray disks that may not work in near future,I will never stream as for Blu Ray disks I don't know,but I will keep all my regular DVD's, CD's and a few video tapes because they will still work. I will never give in to Hollywood's greed.
I know that Discotek has done some SD on Blu-Ray releases of some anime TV series.
DVD and blu-ray has become a collector's market, so the appeal of having lots of episodes on one disc is not something that will get many people
In addition, a few series that are in SD have been released on blu-ray, but are treated as HD files, with a nromal amount of episodes per disc
the recent Classic Doctor Who sets are an example of this
edit: just realised i slightly misread
really a good remaster would be in order for a lot of these, but they don't care enough to do it
I love your in-depth explanations. Even when you explain something I'm already mostly aware of. Thank you very much for your work
I love how you're REALLY leaning in to your sense of humor. That bit about "Whadya mean this sounds dubbed in..." is classic Alec.
Your earlier videos were very informative, and I appreciated them for that. Your newer stuff is YOU, and I appreciate them even more.
Happy (Last...cough cough) Christmas and New Year.
(Good job not mentioning them in the video, BTW.)
How do you know the newer videos are "HIM" and not just a UA-cam character/caricature of him?
Agreed, he is getting better at the humour, and I hope he keeps it up. He is really making this interesting.
I appreciate both aspects significantly, partly because he acheives a level of pedantry that I can only aspire to. It's interesting and entertaining.
@@petercarioscia9189 I think a very good indicator of his personality can be found within the outtakes he includes. I would venture to say that the personality and quirky sense of humour we see is pretty close to the original article, if not slightly amplified for effect, as youtubers are wont to do.
I hope he doesn't overdo it though. Channels like Half As Interesting have way too much humor and too little info.
I always wondered why outdoor shots looked so different in TV shows when I was a kid, it was always really jarring.