@@gjk2012 It's not. Why would you trust a person whose source is looking at AI-upscaled uploads from pirates? Lol. If you have an Nvidia GPU or an Nvidia Shield, it will do AI upscaling and look amazing. Nvidia is the leader in AI. Why would you think they're bad at the technology they were first to invent? AI upscaling is something Nvidia has been doing since 2018. It's not new.
@@Tential1 Nope. You really think AI upscaling in games is the same as in realistic real-world movies? Lol. Computer games don't look realistic, and DLSS sometimes even creates an oil-paint-like image. So he's definitely right and you have no clue, obviously.
Thanks for your overview of Samsung's AI tech. I'm OK with the use of AI for picture enhancement if its implementation delivers improved image and motion processing. I'm most curious whether the AI algorithm improves starfield scenes, one of the most challenging scenes for mini LED TVs.
We went through all the effort of getting them to include Filmmaker Mode, only for them to "add texture" that wasn't missing. Sounds like the people who put their TV's sharpness on max designed this. Really, I want my TV to show me what the creator wanted to show me, not a robot "with an oh so large data set" hallucinating grain on walls, lmao.
That's not happening. The creator didn't watch a stream of video with 25% of the colour data of the captured scene. We aren't even getting 4:2:2 from Blu-ray, and whatever we are watching on Netflix is maybe 30% of the creator's master.
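The 25% figure roughly matches how chroma subsampling works; a quick sample-count check (ignoring compression entirely):

```python
# Chroma samples carried per 2x2 block of pixels, relative to full 4:4:4 colour.
# 4:2:0 (Blu-ray and most streaming) halves chroma both horizontally and vertically.
chroma_samples_per_2x2 = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2}

for scheme, samples in chroma_samples_per_2x2.items():
    print(f"{scheme}: {samples / chroma_samples_per_2x2['4:4:4']:.0%} of the full chroma data")
# 4:4:4: 100%, 4:2:2: 50%, 4:2:0: 25%
```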
Not sure it's exactly the same; I think it's also for emphasising the main subject to make it look more 3D. Good news about five years of updates. Pretty sure they'll use the same Alpha 11 chip next year, so it will get more AI. I think it needs quite a bit of memory and needs to download model data, so for the best results it would need to scan the movie and download the right models.
Is the C4 a big downgrade without MLA? Do you think next year's C5 in 48 inches will have MLA? Is it true or false that the lack of Dolby Vision support doesn't matter on a Samsung QD-OLED?
The C4 is the right choice for most people. The G4 is super bright, especially in specular highlights, but probably not worth the price difference. The C5 likely won't have MLA in any size unless the G5 makes a significant leap in technology, which I doubt. Dolby Vision maybe won't matter to a lot of people, but it's personally something I wouldn't buy a TV without. Dolby Vision content can be a bit hit or miss, but when it hits, it's stunning.
I'm not too excited by different game modes on the Samsung. Create an ideal image and it will be ideal for all genres. Some genres have a tendency to be more demanding, and whatever benefits them will benefit the others by default. What manufacturers need to do is focus on a quality implementation, not pointless and often substandard gimmicks. TVs do not need more settings. They need to do what they are supposed to do well.
AI can miss me. The latest James Cameron 4K Blu-rays use AI upscaling from 1080p, and they look HORRID. It's also very tough to achieve believable results upscaling video with Topaz Video AI; it just looks bad. Just scan the film at a higher resolution to begin with. I'd rather have an interpolated low-res image than a high-res one where a computer is guessing (usually poorly) what should fill in the blanks. AI upscaling does have its uses, but really only in very small steps.
Sadly though, millions of people out there every day are fooled into thinking it does. So, they blindly just go out and buy Samsung TVs thinking they are getting "the best TV". But, that's the power of good marketing for you! Spend enough millions on crafting a marketing message and you can just about fool all the people all the time, unfortunately.
Yep, it all starts here. Next time you say something like this it'll be "man, I didn't know AI could blow me and cook at the same time!…" Then from there it's surely all downhill, boys.
Yeah, but on low-end trash hardware SoCs with 5 W power consumption. So there is no GPU like an RTX 4080 with the horsepower for real AI. And even real AI isn't the holy grail for movie productions; it doesn't look natural.
@@Tential1 I hope you’re kidding. DLSS uses temporal data and motion vectors to upscale an image. Generative AI is a completely different technology. YOU have no idea what you’re talking about
I heard that Samsung and Qualcomm have made a deal with AMD to work together. Could Samsung be getting a bit of help from AMD by seeing how their FSR works and implementing their own version in the TVs?
I don't know if this will always work well, but I hope we can select an option to stay as close as possible to the original picture quality. Today, with my 15-year-old plasma, I'm still waiting for a huge upgrade at a fair price. I hope I can hold out until CSOT inkjet-printed 75/77-inch RGB OLEDs or NanoLED/QDEL screens are on the market for 1600/1700 euros or less, and I hope CES 2025 will be better than CES 2024. Realistically, one of the CES 2026 models will be my next hardware upgrade, so I'll buy one around Black Friday 2026 or January 2027. Hopefully my plasma lasts until then, because today the price gap between 55-inch QD-OLED and 75-inch mini LED is too much. I think real native 4K will be supported a lot more in the next four years; console games still run at 1440p rather than 2160p today, but 2160p at 60 fps should be possible on the next generation of consoles.
High-end phones already do, and sadly I really don't see it getting close to a DSLR. AI relies on there still being some hints of the missing details left in the image (subtle changes in the colours of the pixels); tiny sensors just don't pick up those details at all, which is why skin usually looks flat and lacking any texture, red hair usually looks brown, and any subtle changes in hair tone in general are lost. Smartphone photos only look good on those tiny screens where we can't see that all the detail is missing. Also remember that as we make tiny sensors more light-sensitive to compensate for their size, we can do the same with bigger sensors, so the gap between the two doesn't really close that much.
When the tech can transform SD into HD, I'm in. Apart from that, native 4K is good enough; it does not need to be any sharper. Tests have shown that people prefer film grain in a film, to give it that "film" look. I would agree.
@@Bushwacked487 I've watched enough reviews putting the G4 as the best so far, even over far more costly sets. The processing LG is using this year, along with all the extra features (144 Hz and the full suite of gaming features) and the detail and brightness of the picture, puts the G4 at the top so far, but wait around for the review.
Yes, I've been commenting this for a while. Upscaling using the full power of a dedicated Nvidia RTX graphics card has to be miles better than the integrated SoCs that TVs use. Plus, they already have experience with DLSS.
@@yadspi The RTX upscaling already exists and can be used with any brand of TV, whereas what is mentioned in this video is just a demo for high end 8K TVs most people will never buy. Also many people have a TV attached to their PC rig and can take advantage of it right now.
I can appreciate the technology, but, at the same time, I'm getting increasingly annoyed by all the AI fakery that seems to be the trend (not only) in visual processing. Whatever happened to simply showing the original unaltered content? I would say that it's fine as long as it can be disabled, except it doesn't really matter, because as soon as such a feature is readily available, it will inevitably become the new "normal" anyway. We can already see the effects of AI making up stuff in other areas, and it's very clear many people are unable to even notice when an AI "enhancement" is mangling images for them under the illusion of added detail or generating horribly unnatural translations or text.
The question we're going to face is which device we let upgrade the picture. The best would be from, say, a streamer like Netflix as an extra premium tier, since they have much more power and time to upgrade media; they could spend three days upgrading a movie with some human input (but they won't, because of the cost). It's like asking which DAC is best. Nvidia and AMD will keep getting better and aren't tied to a TV that will soon be old, i.e. you want your upscaling solution to ultimately be independent of the TV. Still, let these TVs upscale from 1080p to 4K for most content, as they will do a pretty good job. The best at the moment is Topaz AI and a few similar tools: you can upscale a DVD overnight and watch it on the TV the next day. What the future will bring is a system where the AI studies the whole movie and learns every character, so even in scenes with poor data it can draw on excellent data from other parts of the movie and from the web. Same for voices.
I would not pay extra for the AI feature at this time. I already turn off all motion and image enhancing features on my Sony but I'm open to new technology making image enhancement better. I'll wait and see.
@@frankreynolds9930 no, they don't prefer it actually...they just don't know any better. But, that's what they end up with when they just blindly follow the crowd instead of actually doing their homework before making a purchase. Sadly, millions do exactly that.
Having seen reviews of Nvidia DLSS, it is on its 3rd generation, and while it looks very good, there are still ghosting and shimmering problems at times.
I just bought the MZ2000 and it's disappointing. It has almost no new improvements compared to the JZ2000 (2021 model); I even feel it's a bit worse. Is the pinnacle of Panasonic TVs over now that they are no longer assembled in Japan?
It has the MLA panel, which is brighter and more impactful with the right kind of HDR content. It also makes Dolby Vision IQ usable, by finally allowing the noise reduction settings to be turned off in that mode. Otherwise it is probably rather similar to the top model from just two years earlier; e.g. both have near-perfect SDR colour accuracy out of the box without any calibration, and superb shadow detail. I would generally recommend waiting a bit longer before upgrading the TV, because the differences from year to year are usually incremental.
@@redrum_2k The MLA panels have raised blacks. I'd rather have deep blacks than 200 more nits. The black level on the MZ2000 looks like my old Pioneer Kuro, and that's not good. The JZ2000 and HZ2000 are superior, imo. I will return my MZ2000 and buy a good plasma instead until they have improved the technology. Low-resolution content also looks very bad on the MZ2000. It has the best PQ when gaming compared to LG and Sony; that's it.
@@redrum_2k The MLA panel is not an improvement for everyone. Raised blacks are worse than getting a few more nits of brightness. IMO the biggest deal with OLED is the deep blacks, and MLA panel black almost looks like my old Pioneer Kuro plasma black. Yeah, the upgrades are very minimal but the price is not. I'll return the TV and get myself a proper plasma in the meantime; improved QD-OLED is what I'll get next time.
That's just DLSS-esque upscaling, the first version of which became available to the mass market with Nvidia's 20-series graphics cards, released in 2018. Today, Intel's XeSS also supports similar techniques. There's really nothing new if that's all there is to the Samsung version. It's also misleading to just compare two TVs, show that one is using less power, and attribute that to whatever AI features they have. It's possible that the newer TV simply has a more efficient panel, so if you set everything the same you might still see the same energy consumption advantage, or perhaps even higher savings, without any AI shenanigans. Of course, it could really be the AI that was responsible for the improvement, but a demo like that is hardly convincing by itself.
The thing is, there are bandwidth limitations even using DSC (Display Stream Compression). So getting a high-refresh signal, for example a 240 Hz 4K 10-bit 4:4:4 (RGB) HDR signal, to the TV first over the existing port and cable limitations using DSC, and THEN using hardware on the TV to upscale, could be more efficient and give better results than sending a pre-baked (4K upscaled to) 8K signal from the Nvidia GPU at a lower refresh rate. Unless they start putting Nvidia upscaling hardware in gaming monitors/TVs someday, perhaps. So for PC gaming on a gaming TV, putting AI upscaling hardware in the TV sounds like it could be very useful if done right.
@@elvnmagi9048 The main thing I was meaning to say in my first comment was that what Samsung showed in the presentation wasn't anything new. We haven't seen anything from the LG G4 yet, but it contains a newer SoC than the C-series, and the newer chip is purportedly doing similar upscaling/reconstruction tricks to what's outlined here. But the point is that AI image reconstruction for video is "old tech" in 2024. Digital Foundry did an older video showing the Nvidia Shield doing similar upscaling on regular video playback a few years back. You can argue that it has been the domain of Nvidia until recently, but it still isn't anything new, and if anything Samsung's presentation, to me, is a tacit admission that "we were behind, but we have caught up!"
@@lagrangemechanics Yep, I use a 2019 Shield regularly, but that's not the same as having a more modern AI chip in the display itself, due to the bandwidth limitations of ports and cables when PC gaming on a gaming TV. More modern AI upscaling may be faster, cleaner, and recover more detail as the generations progress. Also, in regard to media, the Shield does an OK job up to 4K, but it's a big leap to do 8K fast enough, cleanly enough, and with enough detail gained. For PC gaming, the bandwidth saving from upscaling on the display side is important, especially with the way Nvidia has allotted bandwidth on their ports up until now (8K at 60 Hz max, 7680x2160 at 120 Hz max, limits on multiple 240 Hz 4K screens), though that could change so a single port gets the full HDMI 2.1 bandwidth with the 5000 series, potentially. I hope so, at least.
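To put rough numbers on the bandwidth point in this thread, here's a back-of-the-envelope sketch; it counts active pixels only, ignores blanking, audio and FEC overhead, and takes HDMI 2.1's usable payload as the commonly cited 48 Gbps FRL rate minus its 16b/18b coding overhead:

```python
# Why a 4K 240 Hz 10-bit RGB signal needs DSC on today's links (rough numbers).
width, height = 3840, 2160
refresh_hz = 240
bits_per_pixel = 3 * 10                     # RGB / 4:4:4 at 10 bits per channel

uncompressed_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
hdmi21_payload_gbps = 48 * 16 / 18          # FRL 48 Gbps minus 16b/18b coding overhead

print(f"Uncompressed: ~{uncompressed_gbps:.1f} Gbps")
print(f"HDMI 2.1 payload: ~{hdmi21_payload_gbps:.1f} Gbps")
print(f"Needs roughly {uncompressed_gbps / hdmi21_payload_gbps:.1f}:1 compression (DSC)")
# ~59.7 Gbps vs ~42.7 Gbps -- the signal only fits with DSC engaged.
```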
@@tomblade I don't think so; a very low-res/low-quality source means the AI has less to work with and has to interpret more, leading to more mistakes and weirder results.
@@0M0rty No, it doesn't. You're theory-crafting rather than looking at the actual results. This isn't new tech, dude. Nvidia's AI upscaling has been here, and it's been tested. With low-resolution content it's like magic. Hardware Unboxed has shown that for video games, upscaling is better than native image quality. I know, it's weird. Why do you think Nvidia is worth 2.2 trillion...
So many things in one video I never thought I would hear from a "purist" like Vincent. To comment on just one of them (4:35): if I can't notice or care about upscaling from 2 m away, then why bother anyway? Or, to continue that logic, why not max out the sharpening, deblocking and all the other effects that the "experts" always tell consumers to turn off? Sounds like marketing, marketing, marketing… "AI upscaling" is nothing new…
Exactly. Companies like Sony and Panasonic have already been doing this exact same thing in their processors for YEARS now. No different. They just put a cool new label on it. That's all.
4:33 - I can't, not even with native 4K images on my 55X85L vs 1440p or even 1080p sometimes (from around 2 m, maybe slightly less). So I definitely don't want the wasted energy of a stronger processor doing useless work, especially for 8K. 8K!!! This stuff is not doubling, it's an exponential increase! Useless insanity. Give me better brightness and good dimming zones (or robust OLED, but I'm still worried about that, plus the price), because I can actually see that. (I can see that Sony lied to me about my TV being able to do 4K HDR at 120 Hz VRR. I mean, it does display that signal, but the result looks like 1080p; even the text in Windows is rendered with a sub-4K font renderer despite the set accepting 4K. It's very weird, and damn, I was p...eeved. 4K 60 HDR VRR is perfect, but I didn't buy a new TV + cable just for that... Also, few dimming zones. And I couldn't sideload adblocked YT/Twitch APKs on the otherwise pretty nice discounted mini LED Panasonics we had at the store; in fact they didn't have any proper app store at all. So the Sony it was, at a still quite nice discount.)
Once you have seen how they consistently over-saturate and over-boost all of their colors by default on every model they make you can't unsee it, unfortunately.
So you want a 1080p image to only cover a 25% box in the centre of your TV? An SD image to only be a tiny 5% window? That sounds unreasonable; no one does that. It has to be upscaled to fill the screen area somehow!
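Those percentages check out if you show each resolution pixel-for-pixel on a 4K panel:

```python
# Share of a 4K panel each resolution would cover if shown 1:1 (no scaling at all).
panel_pixels = 3840 * 2160

for name, (w, h) in {"1080p": (1920, 1080),
                     "widescreen SD (854x480)": (854, 480),
                     "DVD NTSC (720x480)": (720, 480)}.items():
    print(f"{name}: {w * h / panel_pixels:.1%} of the screen area")
# 1080p: 25.0%, widescreen SD: ~4.9%, DVD: ~4.2% -- hence the "tiny window".
```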
I'd like to show you a blind test: play you an (advanced) upscaled image for a few days without you knowing, then switch to the "raw, pure 1:1 native image" for a few days, again without you knowing. The findings would be interesting. I personally couldn't care less: if there's something that can enhance those crappy, low-bitrate 1080p videos on YouTube, or simply old videos, that's fantastic news.
You only THINK you want it, because you've never seen a raw, pure, 1:1 native image. Everything low-res you consume on a phone, computer or TV gets algorithmically upscaled to the display resolution, because displaying a raw, pure 1:1 360p or 480p signal on a 4K display is absolutely horrible. The only displays capable of presenting low-res signals well were CRT TVs, since they don't really have "pixels"; instead there's a phosphor coating on a grid that sets the maximum resolution, and an electron gun whose properties allow it to display any resolution under that maximum clearly. Knowing all this: 1) You want upscaling. 2) You want AI upscaling (since it's the best upscaling). 3) I know that you want it because you're a customer, and customers don't know what they want because they don't know anything about the products they're using. 4) Everything will come with AI upscaling in 5 years, you won't be able to turn it off, and even if you manage to turn it off, you will come back to it because it's awesome.
It's interesting technology, but it's being deployed in the wrong place. They should give this technology to the creative industries so that, for example, a filmmaker can use the tool to enhance their image if they want to, and so that when you watch it at home you know they have approved it. I would not ever want to watch a movie with AI added in afterwards. Filmmakers already hate motion smoothing; I can't imagine they are going to be happy hearing that people are watching their movies at home with AI generative upscaling.
Well, remember, these companies HAVE to keep coming up with the "cool new thing" every single year to convince people that they are getting something more than they got last year. They HAVE to do this to (at least try to) convince people to UPGRADE long before their old TV actually breaks or wears out. That's the way the entire tech industry stays in business. There is a gun to these tech companies' collective heads called "year-over-year revenue growth", and they will do whatever they possibly can to make sure their shareholders stay happy, even if what they are telling you each year is about 80% marketing fluff, hype and pure BS.
Call me a purist or boring, but I much prefer standard linear upscaling, without any artificial gap-filling. What if someone were to hack its data set/search algorithms? Couldn't that lead to wildly inaccurate or inappropriate results? No thanks.
My last Samsung TV was bought in 2015 and started out good but was gradually made worse over time as it installed successive updates. I wasn't even using its OS or any apps, just passing an image through HDMI ports! By the end it got so bad that displaying an image with changing shades of grey would make it crash and have to be unplugged from the wall in order for it to display an image again. This experience ruined my trust of the company; I'll never buy another Samsung TV no matter what features it has.
AI video upscaling already exists with Nvidia GPUs, as does AI based SDR to HDR conversion.
Personally I use my PC as a media box with my S95B for exactly that reason, though it's nice to hear I might not have to with future upgrades.
As far as the upscaling goes, it's so-so. It does a fantastic job of clearing up compression artefacts and makes things a bit sharper, but it doesn't hold a candle to slower, professional AI upscaling software like Topaz Labs Video AI (any home theatre geek with a large 1080p Blu-ray and DVD collection owes it to themselves to buy that software).
RTX HDR, however, is a revelation. I honestly can't go back after being able to experience everything in HDR. It's obviously not accurate, but with stuff like YouTube videos and older SDR TV shows, who really cares? I'd love for Vincent to cover it sometime.
It's extremely frustrating to have people act like artificial intelligence is new and not something Nvidia has been doing since 2018. This is why people think Nvidia is a bubble. They think it's all new...
I wonder how Nvidia's AI upscaling compares to Sony and Samsung TVs.
@@aquaneon8012 I guess we'll have to wait and see when these new TVs release.
Though to be honest I feel like it'll be hard to compete with Nvidia.
The fact is that even with my 4070 TI, AI upscaling uses a decent amount of power and about 20-40% GPU usage.
It was unusable with my old 3070 laptop as it created too much heat and sounded like a jet engine.
So if a 3070 mobile (equivalent to a PS5 in terms of speed) was reasonably taxed by this, then it's hard to believe the cheap SoCs in TVs will have enough grunt to run a similar algorithm without it being significantly less sophisticated.
Then again maybe they've come up with a more efficient method than Nvidia (doubtful) or are finally using a decently fast chip (even more doubtful).
So who knows.
The thing is, there are bandwidth limitations even using DSC (Display Stream Compression). So getting a high-refresh signal, for example a 240 Hz 4K 10-bit 4:4:4 (RGB) HDR signal, to the TV first over the existing port and cable limitations using DSC, and THEN using hardware on the TV to upscale, could be more efficient and give better results than sending a pre-baked (4K upscaled to) 8K signal from the Nvidia GPU at a lower refresh rate. Especially with the way Nvidia has allotted bandwidth on their ports up until now (8K at 60 Hz max, 7680x2160 at 120 Hz max, limits on multiple 240 Hz 4K screens), though that could change with the 5000 series, potentially. Unless they start putting Nvidia upscaling hardware in gaming monitors/TVs someday, perhaps. That's from the perspective of using a gaming TV as a multi-purpose PC gaming + desktop display, with some media usage.
I think you are comparing a CGI render to a real-time game, so to speak. You can pre-bake/upscale movies, sure; you can even buy movies that have been upscaled. From what I understand, Topaz works best with tweaked settings per title or even per clip, and though it's pretty fast on a powerful GPU, it would take a while to render/AI-upconvert a big library of videos.
AI/machine learning upscaling is done in real time on varied content (plus PC gaming, apparently), i.e. dynamically delivered, "real-time" content. So... why not have the capabilities of both?
I agree in general, though: for static movies and pre-recorded programming, why not just pre-bake them as upscaled? I'd say the same for streaming services, but they use dynamic bitrates, so real-time AI upscaling could still have huge benefits there. It's "real-time" content like live shows, live streams, news, and gaming that real-time AI upscaling would benefit most, and my viewing habits include a lot of those. Since streaming services' bitrates are lower and even fluctuate dynamically, streams would also benefit. So it's really only pre-recorded local libraries for collectors, already upscaled locally or bought upscaled, that wouldn't derive a benefit.
You'd be counting on Samsung supporting a niche product for more than a year, which Samsung does not have a track record of doing.
Not sure I'd call AI "niche"; it's the way all tech seems to be heading. I'd definitely be worried if Samsung said "it'll become available via an update later" for specific models though, like they did with HDR10+ on my 8-year-old TV, which never materialised.
Absolutely right.
They'll stop supporting this on the 24 models as soon as the 25s come out.
And I bet if someone checked upscaling on release software compared to now, there'd be zero improvement in quality.
The best TV tech channel on YouTube.
I do find AI upscaling an interesting idea. I think it should be available on the TV, but there also should be a way to turn it off in case there are errors.
Would errors happen, though? I bet the TV makes sure the 4 or 16 pixels that replace the 1 or 4 original ones keep the same overall light and color, so you never get very far from the original?
I'm very curious about an "AI upscaling off" button, to compare and to save energy. In the store, 8K demos look so amazing it's hard to look away :-D
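For what it's worth, the "keep the same overall light and colour" idea could be enforced with a constraint like the one below. This is a toy sketch, not a description of how any actual TV works; `constrain_to_source` is a made-up name for the illustration:

```python
import numpy as np

def constrain_to_source(upscaled, source):
    """Shift each 2x2 block of a 2x-upscaled image so its mean matches the
    source pixel it replaced, keeping the output's average light level honest."""
    h, w = source.shape
    out = upscaled.astype(float).copy()
    for y in range(h):
        for x in range(w):
            block = out[2 * y:2 * y + 2, 2 * x:2 * x + 2]
            block += source[y, x] - block.mean()   # in-place shift of this block
    return out

src = np.array([[0.2, 0.8],
                [0.5, 0.1]])
fancy = np.random.rand(4, 4)                        # stand-in for an AI upscaler's output
fixed = constrain_to_source(fancy, src)
print(fixed.reshape(2, 2, 2, 2).mean(axis=(1, 3)))  # block averages equal src again
```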
Honestly, AI upscaling is really interesting. If it works on 480p signals, we could have real-time upscaling of old games and have them look nicer.
Hopefully there will be an Authentic Image mode.
Yeah, hopefully. But Samsung TVs are trash... so...
There are TVs out there that can't even turn off the soap-opera motion smoothing. This is getting scary.
You realize that no consumer level TV is showing you the authentic image right? They all process the signal even with all settings off.
Authentic Image mode as in... A I mode? :)
Play the same content on 3 different TVs (without AI upscaling) and you'll get 3 slightly different images. So how do you define Authentic Image?
I really can't tell the difference in those 8K comparison shots. They should try showing DVD content with non-AI and AI upscaling side by side.
It just looks like they turned the sharpness slider up
Zoomed in, please. I always see comparisons that end up smaller on our screens than the original. With the TV in our room, we'd be looking at the details to appreciate the quality.
I'd be curious to see how it handles high-frequency pattern noise, things like finely knit clothing that can throw off digital sensors and cause aliasing where there'd be none with an analogue recording.
I'm in tech and cyber security.
AI has become a marketing term since Chat GPT went mainstream.
Most "AI" is just machine learning or some other algorithm, but it's unknown to common consumers.
Exactly. That term is being slapped on absolutely everything now, for the simple reason that companies believe that if they DON'T put it on everything they make, consumers will think they are outdated and behind the times.
Like quoting data sizes in GB (gigabytes), powers of 10, instead of powers of 2 like GiB (gibibytes). Pure marketing trash.
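The gap the comment is pointing at, in concrete numbers:

```python
GB = 10**9          # gigabyte, decimal -- what drive makers and marketing use
GiB = 2**30         # gibibyte, binary  -- what most operating systems report

print(GiB / GB)                    # ~1.074, the ~7% gap between the two units
print(10**12 / 2**30)              # a "1 TB" drive shows up as ~931 GiB in the OS
```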
Can't wait for my AI-enhanced yerba mate smoothies.
@aagfnv I saw a toaster boasting AI on here or LinkedIn somewhere. It's getting ridiculous.
I could understand if they stated AI was used in the design process. But everyone is skipping that, that's part of reaching singularity.
Machine learning is one of the subfields of the AI field in computer science.
Waiting for the lg g4 review
Generating detail that doesn't exist isn't improving the quality, it's changing it.
Think of it like having a huge texture library for a video game: an extremely massive library, with intelligent generation built on top of it. Your game had low-detail textures; now it has high-detail textures that are based on the original ones. The original texture and its mapped grid are still intact, but the holes (the missing pixels of detail in the higher-resolution screen's finer grid) are filled intelligently. It's a fill operation, but one based on a huge AI training library.
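Taking the fill analogy literally, the core operation looks something like the toy sketch below. It is not any vendor's actual pipeline; `detail_model` is a placeholder for whatever trained network does the real work:

```python
import numpy as np

def naive_ai_fill(low_res, detail_model):
    """Toy 2x 'fill' upscale: original pixels stay on the coarse grid, and the
    in-between positions are filled by a learned predictor."""
    h, w = low_res.shape
    out = np.zeros((2 * h, 2 * w))
    out[::2, ::2] = low_res                    # original samples are kept intact
    for y in range(h):
        for x in range(w):
            patch = low_res[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]  # local context
            right, down, diag = detail_model(patch)                    # predicted detail
            out[2 * y, 2 * x + 1] = right
            out[2 * y + 1, 2 * x] = down
            out[2 * y + 1, 2 * x + 1] = diag
    return out

dumb_model = lambda patch: (patch.mean(),) * 3   # trivial stand-in for the network
print(naive_ai_fill(np.random.rand(4, 4), dumb_model).shape)   # (4, 4) -> (8, 8)
```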
I guess it was only a matter of time for TV companies to make their own form of DLSS
I can’t imagine many scenarios using this, but one thing that would pique my interest is how retro games would look. The community has long experimented with different upscaling algorithms to find the best look for older systems on modern screens.
Exactly what I was wondering today. Some have shown it, but always in a zoomed-out view (triple facepalm).
Imagine an algorithm that turns 640x480 VGA racing games into 4K with lovely details taken from a dedicated database, cranked up to 120 Hz, of course. Now what to do with the tinny car noises? :)
In the time those 80s games needed to load, a modern AI could scrape the internet for 1440p gameplay and develop a bespoke real-time algorithm that overlays it on top of the actual 640x480 gameplay. No need to upscale, just pick new pixels to substitute, loads of them :)
As the debate continues to rage over James Cameron's decision to use Park Road Post Production and their machine learning algorithm, it dawned on me that one day this technology would trickle down into consumer products, either integrated or standalone, depending on functionality, capability and price point. There are quite a few people who are fans of the final 4K physical media product for Aliens, The Abyss and True Lies. This trend may gradually become the norm for some people. As Vincent pointed out, "Do you care if the upscaled detail was present in the original source or generated by AI?" I would argue that the typical consumer and non-enthusiast of films will not care, just as they don't care whether something is streamed or not in order to get the best audio and video quality. The enthusiast details mean very little to them; all they care about is whether it looks good or not. I still remember a time when enthusiasts were moaning about demonstration panels being set to vivid or high frame rate interpolation to get people's attention. AI upscaling is that on steroids, which sadly may become a more common technology applied to entertainment images.
I only need one preset on a TV. It should be called "No added crap by Samsung offline mode".
That would be great if we also had a setting for streaming services that was "Blu-ray quality".
Unfortunately, a lot of content is highly compressed or standard definition, so needs some help to look decent.
lmao
Build your own TV.
The upscaling aspect of 8K TVs seems to be the most underreported tech that's already in stores. The salesman in my local store seemed quite unaware of it, and was utterly unable to help me to a demo for it.
So many great old films are in 1080p or 720p... how do they come out? Fuzzy documentaries... if it's ANY better, I'm here for it!
Yes, I care. These are all stopgaps / hacks to overcome flaws in display technology or bad source material. The solution is not to fake it, but to fix the problem: better source material and better display tech. I wish all these companies would put their r&d effort into fixing the root of the problem.
While I don't disagree at the theoretical/utopian level, Samsung does not produce the streaming content from Netflix etc., distribute sports, or publish movies, so I don't know what you expect them to do other than improve whatever the subpar content the TV receives. In an ideal world, all movies/shows would be available in 4K HDR on 4K Blu Rays, and sports would be shot in 4K, 120 fps. But in the real world, some people still buy DVDs, most people stream from highly compressed sources, and sports is only available in 720p/60 or 1080i/60. Your complaints should be to those that generate/produce the content in the first place.
I'll get on the phone with Federico Fellini and Walt Disney right away! Do you have quarters?
Great stuff. Thanks a lot for making this video.
If the difference is imperceptible from a 'normal' viewing distance (say 2 m or more), then the end justifies the means with regard to AI enhancement, and it should benefit the viewer. Whether that's the case remains to be seen. Your eloquence, as ever, in delivering the technical detail without pause in a digestible format would garner praise from Mr Spock, so long may you rock 🖖👏
Whatever happened to reproducing the original source?
Streaming services happened.
Generative video upscaling has been a thing on PCs and consoles for the past few years (i.e. adding generated pixels to, say, a 1080p image to upscale it to a 1440p image), and the next iteration is fully generated 'tween' frames to 'boost' frames per second. Having this built into TVs is no great surprise.
The difference is that, if written properly, a game can provide instructions so the console/computer knows what is going to happen in future frames, whereas a TV only receives the information after the fact. However, unlike a game, where the user's responsiveness must be taken into account, limiting the time available for frame generation, if a TV were to delay its output by several frames (i.e. lag behind the input source) by creating a sort of frame buffer, a user wouldn't notice or care. Definitely interesting, to say the least.
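A quick bit of arithmetic on that buffering idea (illustrative numbers only; actual TV pipelines aren't published):

```python
# Added display latency from holding a few future frames before output.
def buffer_delay_ms(buffered_frames, fps):
    return buffered_frames / fps * 1000

for fps in (24, 60):
    print(f"{fps} fps: a 3-frame buffer adds {buffer_delay_ms(3, fps):.0f} ms of lag")
# 24 fps: 125 ms, 60 fps: 50 ms -- fine for films, ruinous for game-mode input lag.
```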
How well does it upscale DVD 480 to 1080 & 4K?
That's what I really want to know. I want to watch my Scarface DVD in 8k. Or maybe I could just buy the 4k version (if it exists) and see if it looks any better in 8k AI upscaling mode. Also, I'll need to afford the TV, which is something I'm working on.
I've downloaded 1080p AI-upscaled rips of TV shows that only ever came out on DVD, and they look pretty good.
@@ditroia2777 Oh cool, that's promising. Was that upscaled in real time, though, or post-processed? If I could get an app for my 8K TV or PC that takes Miami Vice, Knight Rider and Airwolf and makes them 8K, that'd be pretty neat. Let alone music videos, if it can give that depth perception we see on the demo signal in TV stores. The 8K Samsung QLED 900C is just gorgeous with that demo. I want to see what it and the 900D do with 720-1080p media...
@@SilverHolland it was a torrent.
@HDTVTest Do you think Samsung will match or even surpass Sony in the image processing department courtesy of generative AI, from what you've seen at that event? And how does this generative AI upscaling tech stack up against the upscaling of Sony's Cognitive Processor XR?
Cognitive intelligence is another way of saying AI. Like Apple saying Spatial Audio instead of Dolby Atmos to make you think it's something else. Sony always thinks they are the Apple of the TV world. NOT.
LG used to throw around words like AI on everything way before Sony. It never worked properly, but they kept using it for ages.
A great example of this that regular people can see now is the new 4K release of Aliens. James Cameron used the same kind of AI to upscale the film. If you've watched that movie over the years, the amount of clarity they were able to obtain in the new release is kinda crazy. People seem to either love it or hate it. I can see the benefit for sure.
it added extra wrinkles to Ripley in some shots that makes her look like an elderly lady
@@pewburrito Everything but faces it seems to do a very good job on. When faces are static they look great; movement is still hit or miss. It's the best current example, I think, of the path this is heading down. I would say in another year or so, once it gets moving objects down, it'll be almost flawless. It's very uncanny. I think it will be great until people start using it as a replacement instead of an optional enhancement. It's like 48 fps and the "soap opera" effect it creates: when it isn't intended, it sticks out, but when used properly, it doesn't throw you off as much. We have all gotten used to 24 fps and what recordings look like at a basic level; we're getting into enhancements that people outside the enthusiast group will have a hard time accepting easily.
How properly people use it will dictate the level of acceptance.
@@pewburrito give her a break, she was under a lot of stress
So will these AI features only be released on the 2024 S90D & S95D QD-OLED televisions, or on the 2024 QLEDs as well? Would getting a discounted S90C set me further back than forward compared to my current mini LED?
There is AI in TVs, but remember, there are sometimes catches for it to actually work: resolution, chroma, frame rate, which inputs it's available on and which not, along with copyright stuff.
New Sony reference monitor video soon?
So now Samsung reads Text from my screen to send home and build a better profile by also knowing which games I like? Great! Where’s the off switch?
As if you’re nothing more than a consumer.
Thank you for this very interesting and enlightening information on picture quality enhancement. Your time and efforts to share this with us all are much appreciated.
I like upscaling of course and think AI could probably do a good job. Two things, though: most of the time, still images, even compressed ones or 1080p streams, look absolutely fine, even on a large 4K screen at a relatively close viewing distance.
Where things fall apart is when there is movement. Since Hollywood still isn't budging on its 24p fixation, what I need most to enjoy content on my TV is good motion interpolation, and this has never been Samsung's strongest area.
Another little "worry" is that Samsung tends to have its own "vision" of what a good image is supposed to look like, and that tends to be over-sharpened, overly colourful and overly bright.
Hey Vincent where is the G4 review?
I just have to say I LOVE that, for the first time in ages, someone talking about the new buzzword actually calls it what it is: a neural network.
I really dislike how "AI" is used so freely, when no real artificial intelligence exists yet.
Bravo, a step in the right direction. If only the AI could keep motion resolution intact in fast-moving scenes and camera pans, it would be great.
As someone who often uses Topaz AI to upscale old movies/TV shows, especially to remove noise (which it's really good at) to make them "watchable", as well as using DLSS in games on my PC, this sounds very interesting indeed.
I'm sure the first couple of generations of the tech will have their faults, but it'll get better in time like all of the AI upscalers on PC have done. With 90-95% of terrestrial TV channels still being less than HD, this is the kind of stuff that would make me watch those channels again, as I pretty much avoid them now.
Shame I've just ordered a new TV (S90C), so it's going to be another 8 years or so until I upgrade again, but at least it'll be mature and cheaper by then.
I've known about this for a while: some time ago they were talking about video game graphics being generated by the GPU rather than rendered from local asset libraries, which would be more efficient. I believe it works in practice, but I don't think it's in any games yet. I'm generally right, but I could be wrong about the details I've given, so be forgiving.
When I read about this concept I wondered when they would apply it to TVs to take 4K video from 24 fps to 120 fps with AI-generated frames to smooth motion, making it look like it was filmed that way. That would be its best application in a TV. Making low-res content higher-res is nice, but far less important for improving the movie experience, I think. Not seeing the judder or soft images during motion would be a massive step forward.
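Just to picture the arithmetic: 120/24 = 5, so the TV would have to invent four frames between every pair of real ones. A toy sketch below (the function name and the linear blending are placeholders I made up; a real interpolator would use motion estimation or a neural net, not a crossfade):

```python
# Frame-count arithmetic for 24 -> 120 fps, with naive linear blending standing in
# for the motion-compensated/AI-generated frames a real interpolator would produce.
# (Blending alone causes ghosting; it's only here to show where the extra frames go.)
import numpy as np

def interpolate_24_to_120(frames):
    """frames: list of np.ndarray images at 24 fps -> list of images at 120 fps."""
    factor = 120 // 24  # = 5: each source frame is followed by 4 synthetic ones
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for i in range(1, factor):
            t = i / factor
            out.append((1 - t) * a + t * b)  # placeholder for a generated in-between frame
    out.append(frames[-1])
    return out
```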
NVIDIA and Intel use DLSS and XeSS for upscaling their games from a lower resolution to a higher resolution. It not only upscales, it also makes the image more stable with a temporal pass in the upscaling. DLSS also has frame generation, which should technically double your framerate but in practice doesn't, because there is a slight overhead to using it. And Intel will soon be adding ExtraSS, which will work a little differently.
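For anyone wondering what a "temporal pass" roughly means in practice, here's a toy sketch. It is not NVIDIA's actual DLSS pipeline; the function name, the nearest-neighbour warp and the blend factor are my own simplifications for illustration:

```python
# Sketch of temporal upscaling: reuse the previous output frame, warped by motion
# vectors, and blend it with the current (spatially upscaled) frame to stabilise the image.
import numpy as np

def temporal_upscale(low_res, prev_output, motion_vectors, scale=2, blend=0.9):
    """low_res: (h, w) frame; prev_output: (h*scale, w*scale) previous result;
    motion_vectors: (h*scale, w*scale, 2) per-pixel (x, y) offsets in output space."""
    h, w = low_res.shape
    # Naive spatial upscale of the current frame (stand-in for the learned network).
    current = np.kron(low_res, np.ones((scale, scale)))

    # Reproject the previous output along the motion vectors (nearest-neighbour warp).
    ys, xs = np.mgrid[0:h * scale, 0:w * scale]
    src_y = np.clip((ys - motion_vectors[..., 1]).round().astype(int), 0, h * scale - 1)
    src_x = np.clip((xs - motion_vectors[..., 0]).round().astype(int), 0, w * scale - 1)
    history = prev_output[src_y, src_x]

    # Blend history with the current frame; more history = more temporal stability.
    return blend * history + (1 - blend) * current
```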
Pretty extraordinary stuff. You can also see how AI will be applied across all manner of technologies.
Thanks for the in depth information.
I do this quite often with Topaz, but it's a very compute-heavy process, barely doing a few frames per second on a high-end GPU. I can't see how a dinky TV processor can do such a thing in real time...
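Some rough, back-of-the-envelope numbers on that gap (the fps figures are assumptions for illustration, not benchmarks):

```python
# Why real-time upscaling on a TV SoC is a much harder target than an offline
# pass in a desktop tool running at a few fps.
target_fps_offline = 3        # assumed: a heavy offline model on a desktop GPU
target_fps_tv      = 60       # what a TV has to sustain for live content

pixels_4k = 3840 * 2160
print(f"Offline: {pixels_4k * target_fps_offline / 1e6:.0f} MP/s")   # ~25 MP/s
print(f"Real-time TV: {pixels_4k * target_fps_tv / 1e6:.0f} MP/s")   # ~498 MP/s
# Roughly a 20x throughput gap, before accounting for the TV SoC having a fraction
# of a desktop GPU's compute and power budget - which is why TV-side models have
# to be far smaller and simpler than offline ones.
```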
I have been using Nvidia DLSS for PC games for a couple of years now. I think the technology is really awesome. Do not worry about the algorithm generating details that were not supposed to be there in the original material. The algorithm looks at the picture on a micro level, detects the patterns that are there in the original material and scales them up to the target resolution in the most plausible way. Upscaling is always going to generate pixels that were not there in the source material; this technology just scales the image up in the smartest and most plausible way. It is far superior to any other upscaling method. It does not come up with whole elements that are not supposed to be there. But I know TV purists will always turn off features that alter the picture.
The thing is, there are bandwidth limitations even using DSC (display stream compression) - so getting a high hz signal, for example a 240hz 4k 10bit 444(rgb) HDR signal to the tv first over the existing port and cable limitations using DSC, and THEN using hardware on the TV to upscale could be more efficient and get higher performing results than sending a pre-baked (4k upscaled to) 8k signal from the nvidia gpu at lower hz. Unless they started putting nvidia upscaling hardware on gaming monitors/tvs someday perhaps.
So at least for PC gaming at high hz, AI upscaling hardware on a gaming TV could be very useful.
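Rough numbers behind the bandwidth argument (a sketch that ignores blanking and protocol overheads, so only ballpark):

```python
# Illustration of why display-side upscaling can be attractive: an 8K signal at
# high refresh doesn't fit current link budgets, while a 4K signal does (with DSC).
def video_bandwidth_gbps(width, height, fps, bits_per_channel=10, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"4K 240Hz 10-bit RGB: {video_bandwidth_gbps(3840, 2160, 240):.1f} Gbit/s")   # ~59.7
print(f"8K 120Hz 10-bit RGB: {video_bandwidth_gbps(7680, 4320, 120):.1f} Gbit/s")   # ~119.4
# HDMI 2.1 FRL tops out at 48 Gbit/s, so both need DSC - but the 8K stream needs
# roughly twice the compression ratio (or a drop in refresh rate), which is why
# sending 4K over the cable and upscaling on the TV side can come out ahead.
```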
Am happy with my Panasonic mz2000
I would be interested in seeing DVDs upscaled to 8K. That's the only time I'd use AI upscaling, while turning off the soap opera motion smoothing.
It's really bad - there are many, many AI upscales from pirates already.
@@Datenschutz_Datenschutz If it's bad, I won't even bother.
@@gjk2012 It's not. Why would you trust a person whose source is looking at AI upscale uploads from pirates? Lol. If you have an Nvidia GPU or an Nvidia Shield, it will do AI upscaling and look amazing. Nvidia is the leader in AI. Why would you think they're bad at the technology they were literally the first to invent? AI upscaling is something Nvidia has been doing since 2018. It's not new.
@@Tential1 Nope. You really think AI upscaling in games is the same as in real-world, realistic movies? Lol. Computer games don't look realistic, and DLSS sometimes even creates an oil-paint-like image. So he's definitely right and you have no clue, obviously.
Pioneer kind of did this with their Fuga prototype years back? Seems similar in some ways.
Yes, we need more pixels to enjoy this Kubrick.
Will AI upscale to 8K and higher frame rates soon?
The HFR Gemini Man looked great.
Excited to see where this goes in 5 years' time. I'll then be retiring my G3.
This was a great Samsung ad. Thankfully I have no interest in auto AI game detection/image manipulation.
Thanks for your overview of Samsung's AI tech. I am OK with the use of AI for picture enhancement if its implementation shows improved image and motion processing. I am most curious whether the AI algorithm improves starfield scenes, one of the most challenging scenes for mini LED TVs.
We went through all the effort to make them include Filmmaker Mode,
only for them to "add texture" that wasn't missing.
Sounds like people who put their TV's sharpness on max designed this.
Really, I want my TV to show me what the creator wanted to show me, not a robot "with an oh so large data set" hallucinating grain on walls lmao.
That's not happening. The creator didn't watch a stream of video with 25% of the colour data of the captured scene.
We aren't even getting 4:2:2 from Blu-ray; whatever we are watching on Netflix is maybe 30% of the creator's master.
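For anyone wondering where a figure like "25% of the colour data" comes from, here's the sample counting for standard chroma subsampling (a sketch; the function is just illustrative):

```python
# Chroma sample counts per colour plane under common subsampling schemes.
def chroma_samples(width, height, scheme):
    if scheme == "4:4:4":   # full-resolution chroma
        return width * height
    if scheme == "4:2:2":   # chroma halved horizontally
        return (width // 2) * height
    if scheme == "4:2:0":   # chroma halved both horizontally and vertically
        return (width // 2) * (height // 2)

full = chroma_samples(3840, 2160, "4:4:4")
print(chroma_samples(3840, 2160, "4:2:0") / full)  # 0.25 -> 25% of the chroma samples (streaming/Blu-ray)
print(chroma_samples(3840, 2160, "4:2:2") / full)  # 0.5  -> the level we aren't even getting
```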
So basically DLSS. Sounds great.
Doesn’t the LG G4 use AI upscaling?
Not sure it's exactly the same; I think it's also for emphasising the main subject to make it look more 3D. Good news on the 5 years of updates. Pretty sure it will use the same Alpha 11 next year, so it will get more AI. I think it needs quite a bit of memory and has to download model data, so for best results it would need to scan the movie and download the right models.
The upscaling on my Samsung QN95C is staggeringly good. I have yet to see any TV clean it up better.
Is the C4 a big downgrade without MLA? Do you think next year's C5 in 48 inches will have MLA? Is it true or false that the lack of Dolby Vision support doesn't matter on a Samsung QD-OLED?
C4 is the right choice for most people. The G4 is super bright, especially in specular highlights, but probably not worth the price difference.
C5 likely won't have MLA in any size unless the G5 makes a significant leap in technology which I doubt.
Dolby Vision maybe won't matter to a lot of people, but it's personally something I wouldn't buy a TV without. Dolby Vision content can be a bit hit or miss, but when it hits, it's stunning.
I'm not too excited by different game modes on the Samsung. Create an ideal image and it will be ideal for all genres. Some genres have a tendency to be more demanding, and whatever benefits them will benefit others by default.
What manufactures need to do is focus on a quality implementation, not pointless and often substandard gimmicks. Tvs do not need more settings. They need to do what they are supposed to do well.
Samsung is all about bloat and bling, it's what they do.
Anyone know if the AI settings on the LG C9 have any effect whatsoever?
I would like to know too
AI can miss me. The latest James Cameron 4K Blu-rays use AI upscaling from 1080p, and they look HORRID. Also, upscaling video with Topaz Video AI makes it very tough to achieve believable results; it just looks bad. Just scan the film at a higher resolution to begin with. I'd rather have an interpolated low-res image than a high-res one where a computer is guessing (usually poorly) what should fill in the blanks. AI upscaling does have its uses, but really only in very small steps.
Samsung may be the biggest company, but big does NOT necessarily mean better.
Sadly though, millions of people out there every day are fooled into thinking it does. So, they blindly just go out and buy Samsung TVs thinking they are getting "the best TV". But, that's the power of good marketing for you! Spend enough millions on crafting a marketing message and you can just about fool all the people all the time, unfortunately.
@@michaelbeckerman7532 Different priorities. Most don't care. They don't like looking at dark, bland colors even if it's super accurate.
Yep, it all starts here. Next time you say something like this it'll be "man, I didn't know AI could blow me and cook at the same time!"... then from there it's surely all downhill, boys.
Vincent, when will the G4 test come?
Basically DLSS (deep learning super sampling) for TVs?
Yeah, but on low-end trash hardware SoCs with 5 W power consumption. There is no GPU like an RTX 4080 etc. with the horsepower for real AI. And even real AI isn't the holy grail for movie productions - it doesn't look natural.
No. DLSS upscales by using previous frame data, not generative AI.
@@od1sseas663 Lol, you have no idea what you're talking about.
@@Tential1 He is correct though. Generative AI is closer to a lookup table finding the closest matching image.
@@Tential1 I hope you’re kidding. DLSS uses temporal data and motion vectors to upscale an image. Generative AI is a completely different technology. YOU have no idea what you’re talking about
But if you have two TVs playing the same image, using the AI, would the output be different?
I heard that Samsung and Qualcomm have made a deal with AMD to work together. Could Samsung be getting a bit of help from AMD by seeing how their FSR works and implementing their own version in the TVs?
Imagine AI upscaling your CCTV 360p footage to 4K. Saves lots of storage!
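Back-of-the-envelope on the storage claim (the bitrates are assumptions for illustration, not from any camera spec):

```python
# Rough storage comparison behind the "record low-res, AI-upscale on playback" idea.
bitrate_360p_mbps = 1.0    # assumed bitrate for a 360p H.264 stream
bitrate_4k_mbps   = 15.0   # assumed bitrate for a native 4K stream

hours = 24 * 30  # a month of continuous footage
to_gb = lambda mbps: mbps * 3600 * hours / 8 / 1000

print(f"360p for a month: ~{to_gb(bitrate_360p_mbps):.0f} GB")   # ~324 GB
print(f"4K for a month:   ~{to_gb(bitrate_4k_mbps):.0f} GB")     # ~4860 GB
# The catch: upscaling can only sharpen what the 360p stream actually captured -
# it can't recover a licence plate the sensor never resolved in the first place.
```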
I love the AI upscaling in Nvidia Shield. I can't wait to see what the new Shield delivers.
I don't know if this will always work well... but I hope we can then select an option to stay as close as possible to the original picture quality.
Today, with my 15-year-old plasma, I'm still waiting for a huge upgrade at a correct price. I hope I can hold out until CSOT inkjet-printed 75/77-inch RGB OLEDs or NanoLED/QDEL screens are on the market for 1600/1700 euros or less, and I hope CES 2025 will be better than CES 2024. If I'm realistic, one of the CES 2026 models will be my next hardware upgrade, so Black Friday 2026 or January 2027 is when I'll buy one. Hopefully my plasma lasts until then, because today the price gap between 55-inch QD-OLED and 75-inch mini LED is too much. I think real native 4K will be supported a lot more in the next 4 years. Games on consoles still run at 1440p today rather than 2160p; on next-generation game consoles 2160p at 60 fps should be possible.
This is going to be incredible once applied to cameras, too. AI will be able to make a phone camera picture look like a DSLR camera.
High-end phones already do and sadly I really don't see it getting close to DSLR.
AI relies on there being some hints of the missing details still left in the image (subtle changes in the colours of the pixels); tiny sensors just don't pick up those details at all, which is why skin usually looks flat and lacking any texture, red hair usually looks brown, and any subtle changes in hair tone in general are lost.
Smartphone photos only look good on those tiny screens where we can't see all the detail is missing.
Also remember as we make tiny sensors more light sensitive to try to compensate for their size, we can also then do the same with bigger sensors. So inherently the gap between the two doesn't really close that much.
When the tech can transform SD into HD, I'm in. Apart from that, native 4K is good enough; it does not need to be any sharper. Tests have shown that people prefer film grain in a film, to give it that "film" look. I would agree.
G4 vs c4 vs G3 vs C3 vs A95L pleaseee 😅
G4 is the best TV of the year.
G3, C3 and C4 are all old news now.
Top 3 this year are G4, A95L and S90D
@@djbpresents9584 Says the guy who has definitely not seen every flagship TV this year.
@@Bushwacked487 I've watched enough reviews putting the G4 as the best so far, even over far more costly ones. The processing LG is using this year, along with all the extra features (144 Hz and the whole suite of gaming features) plus the detail and brightness in the picture, puts the G4 at the top so far - but wait and see.
Spoiler alert
G4 and A95L are the best
We know this already
Video on Nvidia SDR to HDR when? So much better than this.
Yes, I've been commenting this for a while. Upscaling using the full power of a dedicated Nvidia RTX graphics card has to be miles better than the integrated SoCs the TVs use. Plus they have experience with DLSS already.
Yes, because everyone has a PC with an Nvidia card attached to their TV, so a channel focused on TVs should definitely do a video about that first /s
😂@@yadspi
@@yadspi The RTX upscaling already exists and can be used with any brand of TV, whereas what is mentioned in this video is just a demo for high end 8K TVs most people will never buy. Also many people have a TV attached to their PC rig and can take advantage of it right now.
Keep your diapers on
Will this AI chip be in the new Samsung OLEDs too?
Depends on which CPU they use for Smart TV functions, AMD, Intel, Qualcomm, Rockchip all have new CPUs with NPU cores.
I can appreciate the technology, but, at the same time, I'm getting increasingly annoyed by all the AI fakery that seems to be the trend (not only) in visual processing. Whatever happened to simply showing the original unaltered content? I would say that it's fine as long as it can be disabled, except it doesn't really matter, because as soon as such a feature is readily available, it will inevitably become the new "normal" anyway. We can already see the effects of AI making up stuff in other areas, and it's very clear many people are unable to even notice when an AI "enhancement" is mangling images for them under the illusion of added detail or generating horribly unnatural translations or text.
The question is going to be which device we let upgrade the picture. Best would be from, say, a streamer like Netflix - i.e. an extra premium tier - as they have much more power and time to upgrade media; they could spend three days upgrading a movie with some human input (but they won't, because of costs). It's like asking which DAC is best.
Nvidia and AMD will keep getting better and aren't limited to a soon-to-be-older TV, i.e. you want your upgrade solution to ultimately be independent of the TV. However, let these TVs upscale from 1080p to 4K for most content, as they will do a pretty good job.
Best at the moment is Topaz AI and similar tools: you can upscale a DVD overnight and watch it on the TV the next day.
What the future will bring is a system where the AI studies the whole movie and learns every character, so even in scenes with poor data it has excellent data from other parts of the movie and from the web. Same for voices.
So, this is like Topaz AI built into a TV. Interesting. Should I wait for teams to improve old shows like DS9, or should I just buy this TV? :D
I would not pay extra for the AI feature at this time. I already turn off all motion and image enhancing features on my Sony but I'm open to new technology making image enhancement better. I'll wait and see.
So far AI upscaling has made people look weird, and if that's the case here then I just need to know if it can be turned off.
The QN900C looked better than the QN900D in those images. The Zebra image demonstrates that well.
Contrast dies on the QN900D image.
Sounds good, but Samsung oversharpens the image on their TVs.
And don't forget their laughable over-boosting of all their colors. 2024 now and they STILL can't let go of that. What the hell are they afraid of?
@@michaelbeckerman7532 Maybe because their general audience prefers it. No wonder they are still market leaders.
@@frankreynolds9930 no, they don't prefer it actually...they just don't know any better. But, that's what they end up with when they just blindly follow the crowd instead of actually doing their homework before making a purchase. Sadly, millions do exactly that.
What about the Sony A80L and A90L AI upscaling?
Having seen reviews of Nvidia DLSS, it is on its 3rd generation, and while it looks very good, there are still ghosting and shimmering problems at times.
I just bought the MZ2000 and I'm disappointed. It has almost no new improvements compared to the JZ2000 (2021 model); I even feel it's a bit worse. Is the pinnacle of Panasonic TVs over now that they are no longer assembled in Japan?
It has the MLA panel which is brighter and more impactful with the right kind of HDR content. It also makes Dolby Vision IQ usable, by finally allowing to turn off noise reduction settings in that mode. Otherwise it is probably rather similar to the top model from just 2 years earlier, e.g. both have near perfect SDR color accuracy out of the box without any calibration and superb shadow detail. I would generally recommend to wait a bit longer before upgrading the TV, because the differences from year to year usually are incremental.
@@redrum_2k The MLA panels have raised blacks. I'd rather have deep blacks than 200 more nits. The black level on the MZ2000 looks like my old Pioneer Kuro, and that's not good. The JZ and HZ2000 are superior IMO. I will return my MZ2000 and buy a good plasma instead until they have improved the technology. Low-resolution content also looks very bad on the MZ2000. It has the best PQ for gaming compared to LG and Sony; that's it.
@@redrum_2k The MLA panel is not an improvement for everyone. Raised blacks are worse than getting a few more nits of brightness. IMO the biggest appeal of OLED is the deep blacks, and MLA panel black almost looks like my old Pioneer Kuro plasma black. Yeah, the upgrades are very minimal but the price is not. I'll return the TV and get myself a proper plasma in the meantime. Improved QD-OLED is what I will get next time.
@@superbn0va So the MZ2000 looks like plasma, which is not good, but then you'd rather buy a plasma than the MZ2000. Do you understand your own words?
@@steamstories1279 do you understand the difference in price? Lmao
That's just DLSS-esque upscaling, the first version of which was available to the mass market in the Nvidia 20-series graphics cards, released in 2018. Today, Intel's XeSS also supports similar techniques. There's really nothing new if that's all there is to the Samsung version.
It's also misleading to just compare two TVs, show that one is using less power and attribute that to whatever AI things they have. It's possible that the newer TV just has a more efficient panel, so if you set everything the same, you may still see the same energy consumption advantage - or perhaps even higher savings - without the need for whatever AI shenanigans. Of course it could really be the AI that was responsible for the improvement, but a demo like that is hardly convincing by itself.
The thing is, there are bandwidth limitations even using DSC (display stream compression) - so getting a high hz signal, for example a 240hz 4k 10bit 444(rgb) HDR signal to the tv first over the existing port and cable limitations using DSC, and THEN using hardware on the TV to upscale could be more efficient and get higher performing results than sending a pre-baked (4k upscaled to) 8k signal from the nvidia gpu at lower hz. Unless they started putting nvidia upscaling hardware on gaming monitors/tvs someday perhaps.
So for PC gaming on a gaming TV, putting AI upscaling hardware on the TV sounds like it could be very useful if done right.
@@elvnmagi9048 The main thing I was meaning to say in my first comment was that what Samsung had shown in the presentation wasn't anything new.
We haven't seen anything from the LG G4, but it contains a newer SoC than the C-series, and the newer chip is purportedly doing similar upscaling/reconstruction shenanigans to what's outlined here. But the point is AI image reconstruction for video is "old tech" in 2024. Digital Foundry did an older video showing the Nvidia Shield doing similar upscaling on regular video playback a few years back.
You can argue that it has been the domain of Nvidia until recently, but it still isn't anything new and if anything Samsung's presentation, to me, is a tacit admission "we were behind, but we have caught up!".
@@lagrangemechanics Yep, I use a 2019 Shield regularly, but that's not the same as having a more modern AI chip in the display itself, due to the bandwidth limitations of ports and cables when it comes to PC gaming on a gaming TV. More modern AI upscaling may be faster, cleaner, and recover more detail as the generations progress.
Also, in regard to media, the Shield does an OK job up to 4K, but it's a big leap to do 8K fast enough, cleanly enough, and with enough detail gained.
For PC gaming, the bandwidth savings by upscaling on the display side is important, especially with the way nvidia has allotted bandwidth on their ports up until now, (60hz 8k max, 7680x2160 at 120hz max, limits on multiple 240hz 4k screens) - though that could change to where a single port could get full hdmi 2.1 bandwidth with the 5000 series potentially, I hope so at least.
First thing I do on any TV is turn off all fake AI stuff. I want an accurate image, please.
AI doesn't make it more "authentic"?
@@guadalupe8589 lol
I agree, but for very poor quality material it could be useful in future TVs.
@@tomblade I don't think so, very low res/quality source means the AI has less to work with and has to interpret more, leading to more mistakes and weirder results.
@@0M0rty No, it doesn't. You're theory-crafting rather than looking at the actual results. This isn't new tech, dude. Nvidia's AI upscaling has been here, and it's been tested. With low-resolution content it's like magic. Hardware Unboxed has shown that for video games upscaling is better than native image quality. I know, it's weird. Why do you think Nvidia is worth 2.2 trillion...
I don't want any AI messing with my movies, is there a way to disable the AI function?
You say that, for now....
@@guadalupe8589 I want un-adulterated image quality, as soon as AI is introduced it becomes fake.
Don't buy Samsung.
@@Datenschutz_Datenschutz I will never buy a Samsung TV - Always LG
You can turn it off on LGs.
Some of these features seem useful but I don't need or want most of them.
Thanks!
So many things in one video I never thought I would hear from a "purist" like Vincent. To comment on just one of them (4:35): if I can't notice or don't care about upscaling from 2m away, then why bother anyway? Or, to continue that logic, might as well max out the sharpening, deblocking and all the other effects that the "experts" always tell consumers to turn off? Sounds like marketing, marketing, marketing... "AI upscaling" is nothing new...
Exactly. Companies like Sony and Panasonic have already been doing this exact same thing in their processors for YEARS now. No different. They just put a cool new label on it. That's all.
I would think there would be lag and slow response if this is on.
DLSS 3/FSR 3/XeSS for TVs?
4:33 - I can't - not even with native 4K images on my 55X85L vs 1440p or even 1080p sometimes. (around 2m, maybe slightly less?)
So I definitely don't want the wasted energy of a stronger processor doing useless work.
Especially 8K. 8K!!!
This stuff is not doubling, it's an exponential increase! Useless insanity.
Give me better brightness and good dimming zones (or robust OLED, but I'm still worried, plus price), because I can actually see that.
(I can see that Sony lied to me about my TV being able to do 4K HDR at 120Hz VRR - I mean, it does display that signal and the rest works, but it looks like 1080p. Even the text in Windows is displayed with a sub-4K font renderer despite it putting out 4K - it's very weird, and damn, I was p...eeved.
4K 60 HDR VRR is perfect, but I didn't buy a new TV + cable for that...
Also low dimming zones, but I can't sideload adblocked YT/Twitch APKs on the otherwise pretty nice discounted Mini LED Panasonics we had at the store. In fact they didn't have any proper app store at all. So the Sony for a still quite nice discount.)
Awesome, I can't wait for every character on The Simpsons to have 7 1/2 fingers on each hand now!
Uh... where is the AW review and the new TVs?! You posted unboxing videos many days ago.
Samsung could be half price and I still wouldn't ever buy one.
Once you have seen how they consistently over-saturate and over-boost all of their colors by default on every model they make you can't unsee it, unfortunately.
Not to mention they don't have Dolby Vision.
@@michaelbeckerman7532 Samsung colours are better than both LG and Sony; Sony looks too dim and natural, and LG is just sh8t.
No AI in my image; I want a raw, pure, 1:1 native image.
The only true answer.
So you want a 1080p image to only cover a 25% box in the centre of your TV? An SD image to only be a tiny 5% window?
Sounds unreasonable. No one does that. It has to be upscaled to fill the screen area somehow!
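The pixel counting behind that point, for anyone curious (a sketch; the 5% figure assumes 576p SD, which is my assumption rather than anything stated above):

```python
# Fraction of a 4K panel that a lower-res source would cover if shown 1:1 without scaling.
def coverage(src_w, src_h, dst_w=3840, dst_h=2160):
    return (src_w * src_h) / (dst_w * dst_h)

print(f"1080p on a 4K panel: {coverage(1920, 1080):.0%}")    # 25% of the pixels
print(f"576p SD on a 4K panel: {coverage(720, 576):.1%}")    # ~5% of the pixels
# Shown unscaled, those sources would literally occupy boxes that small - so some
# form of upscaling always happens; the only question is how smart it is.
```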
I'd like to show you a blind test. Play you an (advanced) upscaled image without you knowing, for few days, then switch to "raw, pure 1:1 native image" for few days without you knowing. The findings would be interesting.
I personally couldn't care less. If there's something that can enhance those crappy 1080p videos with poor bitrate on UA-cam or simply old videos, that's fantastic news.
You only THINK you want it because you've never seen a raw, pure, 1:1 native image. Everything low-res you consume on a phone, computer or TV gets algorithmically upscaled to the display resolution, because displaying a raw, pure 1:1 360p or 480p signal on a 4K display is absolutely horrible.
The only displays capable of presenting low-res signals well were CRT TVs, since there aren't really "pixels", but instead a phosphor coating on a grid that sets the maximum resolution, and an electron gun whose properties allow it to display any resolution under that maximum clearly.
Knowing all this:
1) You want upscaling.
2) You want AI upscaling (since it's the best upscaling).
3) I know that you want it because you're a customer, and customers don't know what they want because they don't know anything about the products they're using.
4) Everything will come with AI upscaling in 5 years, you won't be able to turn it off, and even if you manage to turn it off, you will come back to it because it's awesome.
@@simonchasnovsky1835 Is it ACTUAL AI or a sophisticated algorithm?
I’ll pass. I want true to source, not fakery. That’s also why I hate “filters” on pics.
All images on any device are a recreation of reality. In a sense, it's all "fakery". FYI, upscaling happens on any device you watch video on.
Me too, I prefer to view the raw binary file structure. 😂
AI calibration would be the best thing they could offer; alas, they don't care.
How much I care is relative to how close I am to the screen and how big the screen is.
It's interesting technology, but it's being deployed in the wrong place. They should give this technology to the creative industries so that, for example, a filmmaker can use the tool to enhance their image if they want to, and so that when you watch it at home you know that they have approved it. I would not ever want to watch a movie with AI added in afterwards. Filmmakers already hate motion smoothing; I can't imagine they are going to be happy hearing people are watching their movies at home with AI generative upscaling.
Well, remember, these companies HAVE to keep coming up with the "cool new thing" every single year to convince people that they are getting something more this year than they got last year. They HAVE to do this to (at least try to) convince people to UPGRADE long before their old TV actually breaks or wears out. That's the way the entire tech industry stays in business. There is a gun to the collective heads of these tech companies called "year-over-year revenue growth", and they will do whatever they possibly can to make sure those shareholders of theirs stay happy - even if what they are telling you each year is about 80% marketing fluff, hype and pure BS.
G4 vs A95L vs S90D is the battle that matters. The S90D is the better, sharper-looking TV compared to the S95D, which is ridiculous!
Why is the S90D sharper?
@@Jeroenneman The S95D's anti-glare panel causes a pretty noticeable difference in contrast.
And the S90D doesn't have the One Connect box, which is also a big win over the S95D.
Thank you, have a good day.
Call me a purist or boring, but I much prefer standard linear upscaling without any artificial gap-filling. What if someone were to hack its data set/search algorithms? Couldn't that lead to wildly inaccurate or inappropriate results? No thanks.
My last Samsung TV was bought in 2015 and started out good but was gradually made worse over time as it installed successive updates. I wasn't even using its OS or any apps, just passing an image through HDMI ports! By the end it got so bad that displaying an image with changing shades of grey would make it crash and have to be unplugged from the wall in order for it to display an image again. This experience ruined my trust of the company; I'll never buy another Samsung TV no matter what features it has.
I just need an off switch in the settings.