I actually had no idea about the EDIDs forcing 1080p/4K and the forced upscale into 4K. That is really interesting and really highlights the limits and flaws of a first-gen product. I definitely want a newer version of this panel in the next few years, but for now, it's a nice novelty and a look into what the future holds.
The EDID on my ASUS TUF VG27AQ was actually a source of problems for me when trying to use DSR. It has a built-in 4K downscaling mode, but because of the bandwidth involved, it's capped at 60Hz. But I didn't want "4K" on the display; I wanted 1440p at 144Hz on the display, with the game running at 4K or above and downsampled by my Nvidia GPU to 1440p. Ended up having to edit the EDID using a program or something and haven't had an issue since. Now I can use 4x DSR for 1440p@144Hz just fine.
I feel like digging out a 1080p 32-inch monitor to compare and judge the scaling would have been a good idea here. Let's see how close to native 1080p it really is.
@@leeroyjenkins0 exactly, the equals sign doesn't make sense. This is another instance of the "editor isn't technical" problem, but then everyone who reviewed the video missed it as well. To us it stands out as obviously wrong.
@bighammer3464 Adding longer chains of unit notation makes things scan and read less clearly, particularly for laymen. It's often better communication to stick with something like that. That said, feel free to continue smugly fellating yourself over your expertise in the YT comments. Nothing is stopping you.
No. It's definitely 1080p at 240Hz at 2.08ms in terms of latency based on its bandwidth. What he is showing is the apples-to-apples comparison of running both 4K and 1080p at a fixed 240Hz. This is how long it takes the controller to complete a refresh at 240fps from the time it receives the input. I imagine if you ran 4K and 1080p both at 480Hz, this ratio of 4.17:2.08 would widen significantly on the 4K side.
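For anyone checking the math being debated here: the per-refresh figures are just 1000 ms divided by the refresh rate. A quick sketch:

```python
# Refresh interval in milliseconds: the time between panel refreshes.
# Note this is only the refresh interval, not total input latency.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(f"{frame_time_ms(240):.2f}")  # 4.17
print(f"{frame_time_ms(480):.2f}")  # 2.08
```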
I don't think I understood why they couldn't or didn't integer scale the image. It's just doubling on both axes, and the result would pretty much just be 4K. No real scaling, just doubling up the pixels. I'm confused :( It seems so easy; I can't imagine the logic/code behind it would be difficult.
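For what it's worth, the doubling this comment describes really is trivial; a minimal pure-Python sketch of nearest-neighbor 2x scaling (a frame modeled here as a list of rows of pixel values, which is an illustration, not how panel firmware is actually written):

```python
def integer_scale_2x(frame):
    """Nearest-neighbor 2x upscale: repeat every pixel twice
    horizontally, then repeat every row twice vertically."""
    doubled_rows = [[px for px in row for _ in range(2)] for row in frame]
    return [row[:] for row in doubled_rows for _ in range(2)]

frame = [[1, 2],
         [3, 4]]
print(integer_scale_2x(frame))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```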
What's funny is this used to be supported by old Windows back in the CRT days, back then the assumption was that displays would be capable of different framerates at different resolutions and Windows actually had options for configuring that.
Makes sense, it was designed around CRTs where it was quite specifically how they worked. Kinda interesting how the removal of CRTs from not just practical use but even consideration when developing stuff was so complete that the feature got completely ignored and removed.
Windows still has those options, they're just hidden in some weird attempt to make things "easier". (Just like the refresh rate being a separate dropdown behind another menu, which invariably means people leave their displays on lower refresh rates...) If you go to "Advanced display settings", then select your display's "Display adapter properties", you get a "List all modes" button which... lists all your display's modes (each combination of Resolution + Color Depth + Refresh Rate), and lets you select them.
IIRC I had a ~$200 40-inch TV in 2016 that had a feature like this. It could do 4K60 or 1080p120. If you didn't change the setting on the TV and set the computer to output 1080p120, you'd get these weird artifacts that looked like broadcast TV with a poor signal dropping out; it definitely didn't look as good as when the TV was switched to 1080p120. I didn't do this most of the time, though, because this was a TV, so to switch I had to go deep into the menus with a remote that didn't always have working batteries.
This dual mode feature sounds quite fascinating even with its limitations. Superior motion clarity and color performance seen here makes me excited about the future of display technology.
I own this monitor and I will say it's amazing. Literally perfect for my use case. I get a 4K panel for whenever I play single-player games and crank the settings, then I get a 1080p 480Hz panel for when I waste my life away with my friends in Overwatch. Came from a 1440p 300Hz ASUS monitor to this and personally couldn't be happier
This product is such a head-scratcher to me, because I'd assume that if a higher-bandwidth mode is supported, then so is literally anything beneath it. So why can't you just pick it out in Windows, let the monitor go black for two seconds, and woop, you're all set? As for scaling, I HATE that integer scaling isn't a common thing. I bought a 1440p monitor like six years ago specifically because of FTL: Faster Than Light (a game that runs at only 720p and nothing else). And fullscreening the game still produced a blurry image. Part of me died inside that day. Please let that never be an issue in the future. If I play 1080p games on my 4K TV, I expect the image to not get blurred by the scaling.
Integer scaling is even easier to implement too, so why don’t they do it? One option for you would be to set your display as 720p and then use integer scaling on your PC to output a 1440p signal, that way your monitor doesn’t scale anything and it is all done before the signal is output.
Hey, if you don't want to use exclusive fullscreen mode and GPU-side integer scaling, there is an app called "Magpie" that can do integer scaling without having to change the display mode. There is also "Lossless Scaling", but it used a Windows Magnifier hack that doesn't work anymore as of the latest Windows versions. Magpie works because it actually does a screen capture and then scales in software, which means slightly higher latency, but far better quality.
Integer scaling, black frame insertion, and interpolation are all greatly underutilized technologies... especially important for emulation. I went back to play FTL and it was so small on my 3440x ultrawide it was barely playable. Had to use an integer scaling solution and try to get it looking alright (the "no more blurry scaling" thread on the FTL reddit), as the NVCP solution is... fullscreen-only. DSR sucks as well. CRTs will always be better than LCDs. We really need to get manufacturing of them back, or find a technology that actually meets or surpasses them in all areas.
The button also serves the practical purpose of having a hardware difference between 1080p and 4K mode, which you could use to configure the two modes separately. It would be interesting if there were a "pass-through" switch that did the same thing and just hijacked the EDID message.
9:13 - those aren't TCONs but DP-LVDS converters. TCONs are what convert the pixel data into the electric signals that drive the panel matrix (almost; just to simplify). Rocked the zisworks monitor for a long time and used its higher-refresh/lower-res modes for shooters and rhythm games like osu!
Bro, those monitors are even more expensive in the EU than in the US. A few days ago I ordered an ASUS PG32UCDM, that's the 4K 240Hz QD-OLED, from Amazon and paid €1,713!! (that's $1,857) Yes, this is with tax, but it's still more than in any US state, I think. Or are you from Canada? How expensive is it there with tax included? (Obviously only the price WITH tax matters, because that's the actual price you pay lol)
4k@240 and 1080p@480 dual mode at 32" was my dream monitor, thanks for doing a deep dive on this monitor! Seems like it might be worth waiting one or two more generations for this tech to get a little more polish.
LG snatched defeat from the jaws of victory. How hard would it be to just emit the highest possible refresh rate and resolution combinations in one EDID and have an on-screen option to switch between integer and blurry scaling modes? EDID supports advertising both 1080p480 and 4K240 at the same time, which would allow the OS to switch to the correct mode without requiring any extra button presses on the monitor.
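A rough sanity check of the bandwidth side of this suggestion: at the same bits per pixel, 1080p480 needs exactly half the uncompressed data rate of 4K240, so a link that can carry one can carry the other (30 bpp is an assumed figure here, and real links add blanking overhead on top):

```python
# Rough uncompressed data rate for a video mode: width * height * refresh * bpp.
# 30 bpp assumes 10-bit RGB; blanking intervals and DSC are ignored.
def data_rate_gbps(w, h, hz, bpp=30):
    return w * h * hz * bpp / 1e9

r4k = data_rate_gbps(3840, 2160, 240)    # ~59.7 Gbit/s
r1080 = data_rate_gbps(1920, 1080, 480)  # ~29.9 Gbit/s
print(f"4K240: {r4k:.1f} Gbit/s, 1080p480: {r1080:.1f} Gbit/s")
```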
I mean, having a physical button to frequently switch between the modes is quite handy. I own this display and press it more than once per day, so having to do it in software would be a little annoying and would take extra time. But offering both would be ideal.
@@patrikmedia I think having it in both software and hardware would be the superior option; it gives choices. Bonus points if they make it something you can bind a macro to
If you did go with this monitor, you can just buy an EDID emulator, which will let you have any EDID configuration, usually with at least 10 custom profiles of your choice. What may be a better pairing, however, is an OSSC Pro, which also has an EDID manager but additionally comes with lossless integer scaling, pixel-repetition scaling, subpixel rendering, line multiplication, and many more features, including very high-quality HDR-BFI algorithms for 1080p. You can only do 120Hz BFI for now, unfortunately, but 120Hz HDR-BFI with 4K subpixel rendering should offer pretty solid results and give you nearly lossless IQ and pretty amazing motion resolution/clarity. What the video failed to mention is that 1080p 480Hz offers far, and I mean far, higher dynamic resolution than 4K 240Hz; we are talking 300 TVL vs around 800 TVL of motion resolution, and 120Hz BFI would almost certainly offer 1:1 1080p static:dynamic resolution (equal static and motion resolution @ 1080p).
@@patrikmedia Software could "press the button" for you automatically. I guess you don't press the button randomly but for some specific application or game. That part could obviously be automated, similar to how Nvidia drivers allow automatically applying different GPU settings depending on which application you're running.
This refusal to integer scale or properly scale resolutions in general is why I will always use GPU scaling in my drivers when I can, and use lossless scaling to 2x when I can't.
Integer scaling is the only thing that makes me want an 8K display now. I'll never use 8K, but the ability to switch between 4K & 1440p sounds amazing!
I'll keep my OLED. I grew up with CRTs, even had one in early college back in the day, and the OLED picture looks light-years beyond that of any CRT, and the footprint is a whole lot smaller.
I'm not a gamer, but your explanation of this monitor is pretty cool. I've been working in tech for a good while, and I've found that when we can't figure out what something is for, it's often in anticipation of a future implementation of something. Your explanation was pretty fascinating, I've never really given a lot of attention to monitors, because, as I said, I'm not a gamer. I do photo processing, so the quality is important, but not so much doing the video clarity. Great demo, thanks for the work you put into these!
Just listened to the WAN Show, and I heard Linus mention "maybe I shouldn't have covered this topic." But I highly appreciated this content, as none of the other channels I've seen have gone into detail about this. The only coverage I've seen so far just states "it doesn't look quite as sharp." I've purchased this monitor, and the first thing I noticed was the scaling in the 1080p mode. I would've appreciated it if any of the channels had covered the topic in the detail you've gone into. Here's hoping that LG will still introduce integer scaling for the 1080p mode, as it currently just looks a bit whacky.
First time in a while that I've watched a full episode of something that's entirely not targeted at me, not even remotely. Dayum, people that are into retro gaming for real pay attention to the most minuscule things, to the point where I, a nerd that follows this channel and many more tech channels on a daily basis and knows what they're talking about in most videos, could not even follow the logic of it. Yet it was interesting enough to keep me watching till the end, so kudos to the editing.
I'm also slightly confused by this monitor's "24/27 inch mode" where they add big black bars to make the 1080p output look like it's on a smaller monitor. I understand this, but why didn't they add an option to use a centered section at 1920x1080 so that it would have great PPI, great scaling, and still be ultra high refresh rate. I'm sure there is some technological explanation why this can't be added, but if it could, I would've loved to see it.
@@yoriiroy1720 you sound a little upset. Regardless, I don't see why the option isn't given. Sure, the majority of people wouldn't want to use it, but they also won't want to use the blurry 27" or 24" scaled 1080p modes. For esports, I can see why someone would want to at least try sitting closer to their monitor while achieving high PPI and refresh rate. As small as 16" is, I wouldn't exactly call it unplayable for everyone.
@@salus5319 then just buy a small 1080p monitor. Why buy a large 4K monitor just to play on a tiny 1080p portion of it when you can get a cheaper 1080p monitor and it would be better and larger? If you are only interested in playing Esports games then why go for a 4K monitor?
@@conorstewart2214 you're completely missing the point. Why do you want less options? Aside from that, OLED has its advantages for esports. On top of this perhaps people do more than 1 thing, and enjoy the option of 4k, as well as the esports option of 480hz.
This with 5K would be great! (That makes the low/fast resolution 1440p*.) And while I'm writing down my wish list, let's make it a 16:10 ratio. *With 16:10 it would not be 2560 x 1440 but 2560 x 1600 in the low/fast res and 5120 x 3200 for the high/slow res. It would make for the perfect monitor for productivity and entertainment.
I literally have a 43" Hisense low-end TV from last year that does this. It's 4K 60Hz or 1080p 120Hz. I believe TCL actually held the patent on it, called "Dual Line Gate" technology, for the previous 3 years.
The explanation on the differentiation of 4K and 1080p in dual mode was very enlightening. Looking forward to seeing how LG and other manufacturers improve upon this technology.
The argument about CRTs is completely wrong, or I didn't get what you said. CRTs had pixels because they needed to differentiate red, green, and blue light sources after the conversion from the electron beam to (white) light by the phosphor layer. In fact, even before color CRTs, they were already using aperture grilles to increase the sharpness of the image (the electron beam is Gaussian by nature, hence generating blur; that's what they try to mimic for the retro-game experience). Those apertures had different names like shadow masks, black matrix, etc. Your argument about the possibility of scaling the raster by playing with the scan speed would be relevant only for monochrome CRTs, without grilles, which I've only seen on cheapo oscilloscopes. And even then, you would have limitations, because the electronics back then were not as evolved as what we can have nowadays with a clean quartz oscillator or an FPGA (remember that the first CRTs were using the phase of the mains electricity to generate the oscillation signal). And there are limitations from the space between each beam pass: either increase the gap between beams, thus decreasing brightness, or decrease the gap and increase the blur.
CRT phosphors aren't pixels. The type of phosphor grid used has nothing to do with clarity; the size or density matters a lot more. I own a Samsung SyncMaster 955DF 19-inch pseudo-1440p monitor; it's a dot-mask CRT and it looks sharper than an aperture-grille monitor like the FW900, which has bigger phosphors. Brightness increases with higher Hz, and the electron beams aren't Gaussian. What happens is light diffraction from the glass and wacky things that happen with the phosphor layer. If you point a 300,000fps slow-mo camera at a CRT scanline, it looks like a perfect line.
As I understand it, the electron gun still fires wherever it's told regardless of where the phosphors are. So a CRT can light up any fraction of what appears to be a pixel, which modern displays with actual hard-set pixels cannot do. As an example, a blue electron gun firing right in the gap between two blue phosphors would slightly light up the corners of both. A modern panel would have to approximate the resulting color of an entire pixel and display that, because it can't light up only the corners of a pixel. So this makes every resolution appear to be native on a CRT. I'm pretty sure this is also why things like the RetroTink are very convincing on 4K TVs: we can brute-force this effect by having so much more resolution to work with. For example, by displaying a 240p image on a 4K TV through a RetroTink or similar scaler, each 240p "pixel" is actually made up of many 4K pixels, enough that we can recreate the CRT look fairly convincingly (I've never seen one in person myself). Idk if that made sense, but the channel Technology Connections has a video, "These are not pixels," with a bunch of really good demonstrations and macro shots of different CRTs that explain it way better than I can
@@Pe721 Yes, but the FW900 looks blurrier than my Samsung because it has bigger phosphors. With my Samsung, it's the first time ever that I've seen extremely sharp text on a CRT.
These quiet ringing sounds throughout the video drive me mad. They sound exactly like my doorbell. 6:02, for example, has one, and there are plenty more in the video.
@@daniel-bg5nq You can only hear it at certain volume levels; it might be a frequency that is hard to hear for some. It sounds like a quiet doorbell, exactly like my doorbell, and it drove me mad. I thought somebody was ringing my door.
I do appreciate that display makers are always at work. Every time we think we've hit a wall in capability, they find another path or style of panel to refine the sharpness or reduce the power usage. Wider and wider resolutions, and cheaper. One of the best industries for a consumer.
I can tell you from experience that a 144Hz CRT is faster than a 240Hz OLED. I tried both and it's night and day. Motion clarity is a pixel off using an OLED. You need twice the Hz and fps to match a high-end CRT.
@@Pe721 144 Hz x 2 = 288 Hz ≈ 240 Hz. You'd also need a GPU capable of driving 240fps, otherwise S&H will increase persistence. Yes, there will always be a difference between S&H and pulsed displays, but you'd have to be pretty sensitive to discern that at 240fps during gameplay. For most people that's buttery smooth no matter which display tech.
@@Pe721 Due to the phosphors and the way a CRT works, it pretty much flashes the image at you, similar to the ULMB Linus talks about at 1:30. Did you compare with ULMB enabled or disabled? I don't know if OLED even supports something like ULMB; that's more of an LCD+LED-backlight technology, but it should deliver a similar effect to a CRT.
Did he say "More faster-terr-verr"? @5:43 and "close-lee-err" @7:19 This is more for humor value since the return of LTT going back to their roots of being an informative channel is a welcomed change.
@@ParasAryan I skimmed through it first to see if it was something interesting that I would want to watch, as I haven’t watched Linus in a while, but remember that I liked his videos.
Very interesting in the integer scaling chart that 8K does clean integer scaling for basically every major standard currently in existence. Seems like a pretty obvious marketing point for 8K panel makers (that everything looks crystal clear, VHS to UHD 4K); I'm surprised I've not seen it before.
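The claim checks out for the common 16:9 standards: each divides evenly into 8K UHD (7680x4320) with matching horizontal and vertical factors. A quick check:

```python
# Which common 16:9 resolutions fit 8K UHD (7680x4320) as clean integer multiples?
standards = {"720p": (1280, 720), "1080p": (1920, 1080),
             "1440p": (2560, 1440), "4K": (3840, 2160)}
factors = {name: 7680 // w for name, (w, h) in standards.items()
           if 7680 % w == 0 and 4320 % h == 0 and 7680 // w == 4320 // h}
print(factors)  # {'720p': 6, '1080p': 4, '1440p': 3, '4K': 2}
```

(4:3 sources like VHS would still need pillarboxing; the clean factors above apply to the 16:9 set.)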
Why dual mode??? Just add an integer multiplier into your game. It's that easy: just triplicate 480p to 1440p. Sure, 480p is blocky, but it's 100% just as sharp.
I still don't see why you couldn't just use one EDID and instead change the way things are scaled, like having the algorithm actually scale by integer multiplication instead of blurring everything when upscaling (whenever the resolution is an integer multiple)
I would like to see 480Hz myself, but at 1080p you have the bandwidth for 960Hz or even 1000Hz already, if you just have no scaling option. Just give us the Hz; frame generation is going to quadruple the fps soon enough.
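The headroom claim checks out on raw pixel rate alone: at the pixel rate the panel already sustains for 4K 240Hz, 1080p could in principle refresh at 960Hz (ignoring blanking intervals and controller limits):

```python
# Pixels per second the panel already pushes in 4K 240 Hz mode.
pixel_rate = 3840 * 2160 * 240
# Refresh rate 1080p could reach at that same pixel rate.
max_1080p_hz = pixel_rate // (1920 * 1080)
print(max_1080p_hz)  # 960
```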
The background song of this video, while wearing headphones, kept sounding like someone was knocking on my window or door. I thought I was going crazy because no one was there.... thanks Linus
You can't use Integer Scaling with DSC which is pretty much a requirement for high framerate on Nvidia cards. And there's also no easy way to disable it.
idfk why but for some reason this video is killing my eyes. I just see so much motion blur on that game that it hurts my head, I can't keep watching this but take my like anyways Linus
The switch to 1080p@480 is more of a "compromise mode". With the 1-pixel offset you only need to drive half of the pixels with video data, which makes the 480Hz refresh possible; otherwise the bandwidth in the panel itself is not sufficient to sustain 4K@480 throughput and update every pixel, so they just drop half of it away :D Anyway, in the early days of 4K panels, we had to configure some TCON registers to switch between a 4K30 mode and a 1080p120 mode (the interface bandwidth is the same). There was one catch: the TCON itself applied some filtering. Normally it looked like integer scaling, but when certain thresholds were met, a sharp corner suddenly became smoothed and rounded :D
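If the panel really does drive only half its pixels per refresh in 1080p mode, the arithmetic is internally consistent: half the pixels at 480Hz is exactly the same data rate as all of them at 240Hz. A quick numbers-only check (not a claim about the actual TCON design):

```python
full_4k = 3840 * 2160                  # pixels on the panel
rate_240_full = full_4k * 240          # every pixel refreshed at 240 Hz
rate_480_half = (full_4k // 2) * 480   # only half the pixels, but at 480 Hz
print(rate_240_full == rate_480_half)  # True
```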
Thank you for clearing that up. The whole video I was thinking "I'm pretty sure the 7900 XTX supports 4K 480Hz with DSC over DP 2.1," so I was confused why Linus was saying that the DisplayPort bandwidth was the issue. Just to be clear on how it works: are you saying the monitor is only sending pixel instructions to some of the pixels, and the pixels not getting instructions are just matching or averaging the instructions of the neighboring pixels? I'm looking at the picture at 8:00, but it's hard to tell what I'm seeing when everything's white; I can't tell a pixel from a subpixel, and I don't have a close-up of the pixels in normal mode for comparison.
The bandwidth is only relevant for the interface between the monitor and the GPU. But in this case, by identifying as a 1080p monitor, the GPU is sending a 1080p signal; the processing of which pixels to drive takes place on the monitor, so it is the monitor's processing that needs to double the pixels in two directions. What I'm trying to say is that the argument of driving half the pixels because otherwise it would be a 4K signal at 480Hz is not what is happening here, as the internal processing on the monitor is not affected by DisplayPort or HDMI bandwidth
@@iothomas Ok, I guess my question is: why does this monitor not support 4K 480Hz? When I first heard of the gimmick, I assumed the 1080p mode would be letterboxed in the middle of the screen or something; instead it's doing it full screen at 480Hz. If the monitor upscaled the 1080p signal to a 4K signal using some upscaling algorithm, it would then have the same internal bandwidth as 4K 480Hz. Yes, it has already gotten past the DisplayPort bandwidth restriction, but if that were the real issue they could have just used DP 2.1 UHBR13.5, which supports 4K 480Hz with DSC and is already in use on AMD 7000-series GPUs and on a couple of monitors like the Samsung Odyssey Neo G95NC, which needs it for its ultrawide 240Hz display, and the Gigabyte AORUS FO32U2P, which actually uses UHBR20 (only needed if you have some weird use case that doesn't support DSC). Maybe DP 2.1 adds too much to the cost, but I doubt it's more than $100-200, and then this monitor would stand out way more. It's either some annoying product segmentation and they will soon release a 4K 480Hz monitor, screwing over everyone who bought this one (perhaps waiting for next-gen Nvidia GPUs, since right now only AMD and Intel support 4K 480Hz), or there is some internal component that can't handle the 4K 480Hz signal, and their weird upscaling is the result of a solution that doesn't overload this component (which I'm pretty sure is what @cameramaker was saying; I just didn't fully understand the exact process he was describing).
this is off topic but i came to this video just to thank all ~100 ltt peeps for having the back of the little guys. Hate is everywhere, but the silent majority appreciates what yall do
They still have a grid, and it's not as simple as he says. 4K on a CRT looks kinda blurry for that reason, but with an interlaced signal it's quite crazy. It can only be achieved nowadays if you tunnel your GPU through the Intel iGPU output, which might add input lag.
@@Pe721 If I remember right there are some units with a dense enough phosphor pitch to be sharp at 4k. Either way they adapt very well to different resolutions.
Actually, they do... If you send frequencies that are a bit too far off from standard, you can destroy your yoke or high-voltage transformer. I actually blew up a $30,000 42" CRT monitor by sending it incorrect frequencies a while back. RIP, since good luck finding a yoke for those NEC/Mitsubishi presentation monitors.
@@natsukage3960 seems like a driving problem rather than the actual resolution being fed to it, unless very specific resolutions cause some resonance issue? But that would be specific to the yoke and/or board.
That would be MicroLED, not MiniLED. MiniLED is already available for PC monitors, MicroLED is probably years away as they work on scaling it down to smaller display sizes.
I would guess they still get burn in at some point, anything that's going to be "per pixel" is going to wear out unevenly to some extent. But I'll definitely grab a microled monitor/tv when they're somewhat affordable. Haven't really had any issues with OLED so far though, and it's what's relatively more affordable today.
Never realised 8K displays can do integer scaling for both 1440p (for gaming) and 4K (for media/productivity). I don't personally see any value in 8K native resolution, but this actually gives me a big reason to get an 8K display
I hated the burn in on my crt growing up and there was nothing we could do about it but buy a new crt. We were too poor to replace it so we sucked it up.
I have been using this monitor (LG 32GS95UX-B) for over 2 weeks. It is not a bad device, but the advantages of an OLED (if you are used to an LG C2 or C3, or Apple displays like the Studio Display or Pro Display XDR) do not really come into their own. A glass or glossy finish would simply make the colors pop much more, and fonts would also be much sharper on the LG 32GS95UX-B. I hope LG brings out a glossy version with the next model, at the level of the LG C-series models; that would simply be a DREAM! Maybe next year?
It's funny to me how, not long ago, so many people tried to convince us 60Hz was enough, and later that anything above 120/144Hz wasn't necessary or perceivable.
Let's see if at least Linus accepts the challenge, since no one seems to understand my question: in a competitive online game, if you're lucky, you have 20ms of network latency. 20ms means your opponent's position will be updated at the equivalent of 50fps even if your monitor is capable of managing 1000fps, so where is the advantage?
When are you guys going to talk about those new 3D TVs that are popping up at tech shows all over the world? Samsung is working on one that uses eye tracking instead of glasses. I actually have a feeling that Nintendo's next Switch handheld will use something similar. A 3D Switch could be awesome.
I'm a professional valorant coach who does content sometimes. Been using this monitor as part of a partnership with LG - and for my workflow I absolutely LOVE this display. Absolutely game changing for esports players who do content as well.
EDID is a really frustrating part of the HDMI spec, as it often seems like every vendor has implemented the standard in slightly different ways. It doesn't matter much for the individual, but when dealing with large meeting-room systems, or even just having to give presentations in random spaces every day, it gets painful. Some dedicated conferencing systems still ship with dedicated EDID emulators for a few reasons, but that's what it takes to keep settings from auto-adjusting, or even to keep some PCs from dropping the connection when they do an EDID check and that part of the monitor's hardware has gone to sleep.
I'm surprised LTT has taken this long to cover this. Also, it would be cool for you guys to look into Lossless Scaling. It's a piece of software on Steam that is basically "downloading fps": it offers resolution scaling AND frame generation. I heard about it from a Star Citizen redditor and thought, why not? I can try it and always refund it; it's on sale for the summer sale anyway. And man, it's trippy. It just works. I don't think it works perfectly in every instance, but wow, when it does. One of the use cases is games that are locked at a lower fps, like 60. Lo and behold, I just started AC3 Remastered and it's locked at 63, and it works FLAWLESSLY. A perfect 125fps and no perceivable input lag. It didn't work for The First Descendant, though; it has this weird choppy cursor movement. Same when I tried it for frame generation on YT; the mouse is super choppy
Thanks for catching the 240Hz = 2.08ms that should have read 480Hz = 2.08ms. We're getting a fix and replacement done through YouTube.
The beard makes you look like a human
I didn't catch that, but I'm gonna pretend I did.
Bring the beard back bro
Why is this video still up? You said you would pull videos like this down and possibly replace them with a correct video. Again your display latency part is wrong. Time between refreshes is not the input latency, it is not that simple. You have made numerous errors on this subject for years and still do not listen.
Also, one sample point for each refresh rate? Come on Linus, the graph is obviously BS.
I'm ready for the comparison to the ASUS equivalent monitor. Do they scale correctly?
I clicked because Linus's face was on the left side.
me too
Same. Prefer my Linus as a leftie
I clicked because he was thinking
Fr
I clicked because I wanted to watch the video.
The fact that so many 4K panels absolutely refuse to use integer scaling when they could has been an annoyance of mine for a long time. It seems such an easy win, too.
Admittedly, this is less of a problem with computer monitors than it is with full-size TVs, thanks to the ability to select the video output resolution in one's OS. But it's still _a_ problem, as shown in this video.
this
So annoying to watch 1080p content on a 4K tv and have it be a blurry mess just because.
Yeah, I don't get it either. Integer scaling is literally the easiest method of scaling. Whoever develops the monitor BIOS is going through MORE work to have a WORSE result.
Honestly, every single monitor should integer scale to its native resolution whenever it can (like 480p to 1440p). There's no reason any monitor shouldn't be able to do that.
Exactly why I haven't bothered with 4K much and instead gone with 1440p and probably will for the next few years.
It annoys the heck out of me being "stuck" with 4K if I buy a 4K panel.
I was literally about to comment about how you should be able to get basically native results for 1080p on a 4K monitor. But you're telling me they don't do integer scaling usually? WTF!
That's just obviously what you would do.
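Since the whole thread keeps saying integer scaling is trivial, here's a minimal Python sketch of the idea (purely illustrative, not anything a monitor's scaler actually runs): each source pixel simply becomes an s x s block of identical pixels, so nothing gets blurred.

```python
# Integer ("nearest-neighbor") upscaling sketch: every source pixel is
# duplicated into an s x s block. 1080p -> 2160p is exactly this with s = 2.

def integer_scale(image, s):
    """Scale a 2-D list of pixel values by integer factor s."""
    out = []
    for row in image:
        scaled_row = []
        for px in row:
            scaled_row.extend([px] * s)   # repeat each pixel s times horizontally
        for _ in range(s):                # repeat each row s times vertically
            out.append(list(scaled_row))
    return out

src = [[1, 2],
       [3, 4]]
print(integer_scale(src, 2))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

No filtering, no averaging, no tuning: that is the entire algorithm, which is why commenters find it baffling that firmware instead ships a blurrier interpolating scaler.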
7:43 The half-pixel offsetting is most likely implemented to make the "pixel shift" feature, which protects most consumer OLED monitors from burn-in, work in the 1080p mode.
So tell me why my Note 10 Plus has really bad burn-in after a few years and I am permanently looking at a ghost keyboard
@@ctsxhp326 it might not have the feature and maybe you use it a lot 🤷♂️
@@ctsxhp326 He said monitors, and it doesn't work forever, mate.
@@dalebeefsteak119 it's the same display technology which supposedly has pixel shift for mitigation, and a few years isn't "forever". If my expensive displays can't go two years without developing burn-in, I don't want them, they don't get my money. I'll wait for MicroLED for desktop monitors and TV's at this point. Not much I can do about phones unfortunately since the market is pretty much exclusively OLED now.
Half-pixel shift won't help against burn-in from bright objects bigger than one pixel, as most are.
8:57 the editor used an AI-generated Linus voice to say "from your gpu" cuz he omitted it; the future is now
genius
how tf did u notice this bro
I actually had no idea about the edids forcing 1080p/4k and the forced upscale into 4k. That is really interesting and really highlights the limits and flaws of being the first gen product. I definitely want a newer version of this panel in the next few years, but for now, it's a nice novelty and a look into what the future holds.
The EDID on my asus tuf vg27aq was actually a problem source for me when trying to use DSR. It has a built-in 4k downscaling mode, but because of the bandwidth involved, it's capped at 60hz. But, I didn't want "4k" on the display, I wanted 1440p at 144hz on the display, but the game to run at 4k or above downsampled by my Nvidia gpu to 1440p. Ended up having to edit the EDID using a program or something and haven't had an issue since. Now I can use 4xDSR for 1440p@140hz just fine.
I feel like digging out a 1080p 32-inch monitor to compare and judge the scaling would have been a good idea here. Let's see how close to native 1080p it really is.
9:24 is a bit confusing. "240 Hz = 4.17 ms" and "240 Hz = 2.08 ms". I guess the second one is supposed to be 480 Hz haha
@@leeroyjenkins0 Exactly, the equals signs don't make sense. This is another instance where the editor isn't technical, but then everyone who reviewed the video missed it as well. To us it stands out as obviously wrong.
@bighammer3464 Adding longer chains of unit notation makes things scan and read less clearly, particularly for laymen. It's often better communication to stick with something like that.
That said, continue to smugly fellate yourself over your expertise in the YT comments. Nothing is stopping you.
No. It's definitely 1080p at 240hz at 2.08ms in term of latency based on its bandwidth. What he is showing is the apple to apple comparison of what if we run both 4k and 1080p at a fixed 240hz. This is how fast it takes the controller to complete 240fps from the time it receives the input. I imagine if you run 4k and 1080p both at 480hz, this ratio of 4.17:2.08 will increase significantly on the 4k side.
It's not an equality either. It has the same meaning, but a second isn't a hertz.
@@aliasor835 Try reading
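For anyone following the units argument above: the figure being debated is just the refresh period, 1000 ms divided by the rate. A tiny Python check (note this is only the frame-to-frame interval, not the full input-latency chain):

```python
# The refresh *period* is 1/Hz. Converting to milliseconds shows which
# on-screen label was right: 240 Hz is ~4.17 ms, and 2.08 ms belongs to 480 Hz.

def refresh_period_ms(hz):
    """Time between successive refreshes, in milliseconds."""
    return 1000.0 / hz

for hz in (60, 144, 240, 480):
    print(f"{hz} Hz -> {refresh_period_ms(hz):.2f} ms per refresh")
# 240 Hz -> 4.17 ms, 480 Hz -> 2.08 ms
```

So the correction in the pinned comment checks out; writing it as "240 Hz = 2.08 ms" mixed up the two rates, and (as others note) an equals sign between a frequency and a time is loose notation anyway.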
I don't think I understood why they couldn't or didn't integer scale the image. It's just doubling on both axes, and would pretty much just be 4K. No real scaling, just doubling up the pixels. I'm confused :(
Like, it seems so easy; I can't imagine the logic/code behind it would be difficult.
it literally is by far the simplest scaling but yeah, no idea why.
What's funny is this used to be supported by old Windows back in the CRT days, back then the assumption was that displays would be capable of different framerates at different resolutions and Windows actually had options for configuring that.
Makes sense, it was designed around CRTs where it was quite specifically how they worked. Kinda interesting how the removal of CRTs from not just practical use but even consideration when developing stuff was so complete that the feature got completely ignored and removed.
I've seen this on old LCDs as well, such as my HP. Moving down in resolution opened up 75 Hz compared to the standard 60 Hz.
Windows still has those options, they're just hidden in some weird attempt to make things "easier". (Just like the refresh rate being a separate dropdown behind another menu, which invariably means people leave their displays on lower refresh rates...)
If you go to "Advanced display settings", then select your display's "Display adapter properties", you get a "List all modes" button which... lists all your display's modes (each combination of Resolution + Color Depth + Refresh Rate), and lets you select them.
It's still a thing.
@@Thundertactics Thank you, I remembered digging through every setting in Windows and came across this but forgot how to get there 😂
8:57 What happened here, is AI Linus trying to take over?
Probably overdubbed in post.
Maybe a mix of jump cuts between footage (to fix a mistake) and some missing audio tweaks/enhancements on part of that audio .
Maybe originally said CPU and dubbed a correction.
Most likely a correction for something factually incorrect that was noticed after filming
The same thing they have been doing for a decade, voicing over mistakes with another take
That random Yu-Gi-Oh intro made me smile ... thanks 🤣
IIRC I had a ~$200 40-inch TV in 2016 that had a feature like this. It could do 4K60 or 1080p120. If you didn't change the setting in the TV and set the computer to display 1080p120, you'd get these weird artifacts that looked like broadcast TV with a poor signal dropping out; it definitely didn't look as good as when the TV was switched to 1080p120. I didn't do this most of the time because this was a TV, so to switch I had to go deep into the menus with a remote that didn't always have working batteries.
I think you should have made graphics for the pixel shift; I didn't really understand what the panel zooms were trying to show.
"More faster than you would expect...", 5:44, and "but if you look closelier," 7:18, made me laugh.
I've always liked their "more fasterer" that they pull out sometimes. =)
Linus has been saying that for years, always been a funny quirk of his.
hah, noticed that too. This video is a bit weird with the spelling and the AI voice at 8:58
@@vroxy7946 I have never noticed that he does it frequently.
Painful to hear
This dual mode feature sounds quite fascinating even with its limitations. Superior motion clarity and color performance seen here makes me excited about the future of display technology.
Omg, I feel for the test engineer who did not pick up on this off-by-one error. Huge if true!
I own this monitor and I will say it's amazing. Literally perfect for my use case. I get a 4K panel for whenever I play single-player games and crank the settings, then I get a 1080p 480Hz panel for when I waste my life away with my friends on Overwatch. Came from a 1440p 300Hz ASUS monitor to this and personally couldn't be happier.
This product is such a head-scratcher to me, because I'd assume that if a mode requiring higher bandwidth is supported, then so is literally anything beneath it. So why can't you just pick it out in Windows, let the monitor go black for two seconds, and whoop, you're all set?
As for scaling, I HATE that integer scaling isn't a common thing. I bought a 1440p monitor about six years ago specifically because of FTL: Faster Than Light (a game that runs at only 720p and nothing else). And full-screening the game still produced a blurry image. Part of me died inside that day. Please let that never be an issue in the future. If I play 1080p games on my 4K TV, I expect the image not to get blurred by the scaling.
If it's only for one game at 720p with a fixed Hz, you should try a CRT
Integer scaling is even easier to implement too, so why don’t they do it? One option for you would be to set your display as 720p and then use integer scaling on your PC to output a 1440p signal, that way your monitor doesn’t scale anything and it is all done before the signal is output.
Hey, if you don't want to use exclusive fullscreen mode and GPU-side integer scaling, there is an app called "Magpie" that can do integer scaling without having to change the display mode. There is also "Lossless Scaling", but it used a Windows Magnifier hack that doesn't work anymore as of the latest Windows versions. Magpie works because it actually does a screen capture and then scales in software, which means slightly higher latency, but far better quality.
Integer scaling, black frame insertion and interpolation are all greatly underutilized technologies... especially important for emulation.
I went back to play FTL and it was so small at my 3440x ultrawide it was barely playable. Had to use an integer scaling solution and try and get it looking alright ("no more blurry scaling" thread on ftl reddit) as the NVCP solution is... for fullscreen. DSR sucks as well.
CRTs will always be better than LCDs. We really need to get manufacturing of them back, or find a technology that actually meets or surpasses it in all areas.
You can play it with lossless scaling. I play it on 1080p with nearest neighbour and it looks great
The button also serves the practical purpose of having a hardware difference between 1080p and 4K mode, which you could use to configure the two modes separately. It would be interesting if there were a "pass-through" switch that did the same thing and just hijacked the EDID message.
Lovin' the Yu-Gi-Oh reference at 1:00
9:13 - those aren't TCONs but DP-LVDS converters. TCONs are what convert the pixel data to the electric signals that drive the panel matrix (almost; just to simplify).
Rocked the zisworks monitor for a long time and used its higher-refresh/lower-res modes for shooters and rhythm games like osu!
Can't wait to sell another kidney for my Monitor 😁
Another?
@@onebladeprop Morgues are full of them!
@@treborrrrr🤔
Must be a mountain biker too 💸
bro, those monitors are even more expensive in the EU than in the US. A few days ago I ordered an ASUS PG32UCDM, that's the 4K 240Hz QD-OLED, from Amazon and paid €1713!! (that's $1857)
Yes, this is with tax, but it's still more than I think in any state in the US. Or are you from Canada? How expensive is it there with tax included? (obviously only the price WITH tax matters, because that's the actual price you pay lol)
4k@240 and 1080p@480 dual mode at 32" was my dream monitor, thanks for doing a deep dive on this monitor! Seems like it might be worth waiting one or two more generations for this tech to get a little more polish.
@armandomontanez8511 it doesn't need to get polished bud it's ready !!
Also allows time for GPUs capable of driving those monitors to become more affordable, hopefully.
8:57 "from your GPU" is computer generated isn't it? The audio seems weird
Could be cropped recordings
sounds like a correction. He probably said CPU.
I appreciate the Yu-Gi-Oh reference in the intro. 🤙🏻
LG snatched defeat from the jaws of victory. How hard would it be to just emit the highest possible refresh rate and resolution combinations in one EDID and have an on-screen option to switch between integer and blurry scaling modes? EDID supports advertising both 1080p480 and 4K240 at the same time, which would allow the OS to switch to the correct mode without requiring any extra button presses on the monitor.
I mean, having a physical button to frequently switch between the modes is quite handy. I own this display and press it more than once per day, so having to do it in the software would be a little bit annoying and takes extra time. but offering both would be ideal.
@@patrikmedia I think having it in both software and hardware would be the superior option; it gives choices. Bonus points if they make it something you can bind a macro to.
If you did go with this monitor, you can just buy an EDID emulator, which will allow you to have any EDID configuration, usually with at least 10 custom profiles of your choice. What may be a better pairing, however, is an OSSC Pro, which also has an EDID manager but additionally comes with lossless integer scaling, pixel-repetition scaling, subpixel rendering, line multiplication, and many more features, including very high-quality HDR-BFI algorithms for 1080p. You can only do 120Hz BFI unfortunately, for now at least, but 120Hz HDR-BFI with 4K subpixel rendering should offer pretty solid results and give you nearly lossless IQ and pretty amazing motion resolution/clarity. What the video failed to mention is that 1080p 480Hz offers far, and I mean far, higher dynamic resolution than 4K 240Hz. We are talking 300 TVL vs around 800 TVL of motion resolution, and 120Hz BFI would almost certainly offer 1:1 1080p static:dynamic resolution (equal static and motion resolution @ 1080p).
Never heard of TVL before. Interesting @@Wobble2007
@@patrikmedia Software could "press the button" for you automatically. I guess you don't press the button randomly but for some specific application or game. That part could be obviously automated similar to Nvidia drivers allow automatically applying different GPU settings depending on which application you're running.
this is breaking my brain, why have none of the other youtubers gone into this much detail.. well done LMG and especially the lab team
What do you guys think Linus originally said at 8:58 lol post ur ideas in the replies 😂
I was about to comment that Linus sounded like a robot in that minute
This refusal to integer scale or properly scale resolutions in general is why I will always use GPU scaling in my drivers when I can, and use lossless scaling to 2x when I can't.
Linus gets 2 oled monitors. Meanwhile, I don't have 1 oled monitor
And even if I had one, my ancient GPU wouldn't be able to drive it properly.
@@treborrrrr your GPU can drive?!
Same
Bro I got a 1030 can’t run 1440p
@@alnair228 mine can run 😎
Integer scaling is the only thing that makes me want an 8K display now. I'll never use 8K, but the ability to switch between 4K and 1440p sounds amazing!
Bring back the good old CRTs
There are plenty for sale
I'll keep my OLED. I grew up with CRTs, even had one in early college back in the day, and the OLED picture looks light years beyond that of any CRT, and the footprint is a whole lot smaller.
I'm not a gamer, but your explanation of this monitor is pretty cool. I've been working in tech for a good while, and I've found that when we can't figure out what something is for, it's often in anticipation of a future implementation of something. Your explanation was pretty fascinating; I've never really given a lot of attention to monitors, because, as I said, I'm not a gamer. I do photo processing, so the quality is important, but not so much the motion clarity. Great demo, thanks for the work you put into these!
"more faster-er" and "closely-er" are why i watch LTT
Just listened to the WAN Show, and I heard Linus mention "maybe I shouldn't have covered this topic."
But I highly appreciated the content about this, as none of the other channels I've seen have gone into detail about it.
The only content I've seen so far states "it doesn't look quite as sharp".
I've purchased this monitor, and the first thing I noticed was the scaling in the 1080p mode. I would've appreciated it if any of the channels had covered the topic in the detail you've gone into.
Here's me hoping that LG will still introduce integer scaling for the 1080p mode, as it currently just looks a bit whacky.
7:20 "A pixel is a pixel, you cant say it's only a half!"
*Holds down monitor’s EDID button before turning it on resulting in a half pixel shift*
First time in a while that I've watched a full episode of something that's entirely not targeted at me, not even remotely. Dayum, people who are into retro gaming for real pay attention to the most minuscule things, to the point where I, a nerd who follows this channel and many more tech channels on a daily basis and knows what they're talking about in most videos, could not even follow the logic of it.
Yet, it was interesting enough to keep me watching till the end, so kudos to the editing.
I'm also slightly confused by this monitor's "24/27 inch mode" where they add big black bars to make the 1080p output look like it's on a smaller monitor. I understand this, but why didn't they add an option to use a centered section at 1920x1080 so that it would have great PPI, great scaling, and still be ultra high refresh rate. I'm sure there is some technological explanation why this can't be added, but if it could, I would've loved to see it.
here is your technical explanation: because it would be a fucking 16-inch image. Have fun playing on that.
@@yoriiroy1720 you sound a little upset. regardless, i don't see why the option isn't given. Sure the majority of people wouldn't want to use it, but they also won't want to use the blurry 27" or 24" scaled 1080p modes. For esports, I can see why someone would want to at least try out sitting closer to their monitor while achieving high PPI and refresh rate. As small as 16" is, I wouldn't exactly call it unplayable for everyone.
@@salus5319 then just buy a small 1080p monitor. Why buy a large 4K monitor just to play on a tiny 1080p portion of it when you can get a cheaper 1080p monitor and it would be better and larger? If you are only interested in playing Esports games then why go for a 4K monitor?
@@conorstewart2214 you're completely missing the point. Why do you want less options? Aside from that, OLED has its advantages for esports. On top of this perhaps people do more than 1 thing, and enjoy the option of 4k, as well as the esports option of 480hz.
@@conorstewart2214 Because physical space is limited and some people use their PC's for more than just gaming?
Chef's kiss on the Yu-Gi-Oh pun in the intro
Those question marks really drew my eyes
Nah, they distracted me from Linus being on the left...
very in-depth and informative
I clicked because the question marks were blue
This with 5K would be great!
(That makes the low fast resolution 1440p*)
And when writing down my wish-list. Let's make it a 16:10 ratio.
*with 16:10 it would not be 2560 x 1440 but 2560 x 1600 in low fast res and 5120 x 3200 for high slow res.
It would make for the perfect monitor for productivity and entertainment.
Me viewing this on my phone: "ahh, yes! I can totally see the difference."
Not having integer scaling in displays is an absolutely unacceptable practice for ALL display manufacturers.
I literally have a 43" Hisense low-end TV from last year that does this. It's 4K 60Hz or 1080p 120Hz. I believe TCL actually held a patent on it, called "Dual Line Gate" technology, for the previous 3 years.
interesting, wonder why Hisense didn't push that more, or maybe they did and I didn't notice. This LG monitor looks amazing.
The explanation on the differentiation of 4K and 1080p in dual mode was very enlightening. Looking forward to seeing how LG and other manufacturers improve upon this technology.
The argument on the CRT is completely wrong, or I didn't get what you said. CRTs had pixels because they need to differentiate red, green and blue light sources after the conversion from the electron beam to (white) light by the phosphor layer. In fact, even before color CRTs, they were already using aperture grilles to increase the sharpness of the image (the electron beam is Gaussian by nature, hence generating blur; that's what they try to mimic for the retro-game experience). Those apertures had different names, like shadow masks, black matrix, etc.
Your argument on the possibility of scaling the raster by playing with the scan speed would be relevant only for monochrome CRTs without grilles, which I've only seen on cheapo oscilloscopes. And even then, you would have limitations, because the electronics back then were not as evolved as what we have nowadays with a clean quartz oscillator or an FPGA (remember that the first CRTs used the phase of the electric plug to generate the oscillation signal). And there are limitations from the spacing between beam passes: either increase the gap between the beams, thus decreasing brightness, or decrease the gap and then increase the blur.
CRT phosphors aren't pixels.
The type of phosphor grid used has nothing to do with clarity; the size or density matters a lot more. I own a Samsung SyncMaster 955DF 19-inch pseudo-1440p monitor. It's a dot-mask CRT and it looks sharper than an aperture-grille monitor like the FW900, which has bigger phosphors.
Brightness increases with higher Hz, and the electron beams aren't Gaussian. What happens is light diffraction from the glass and wacky things that happen in the phosphor layer.
If you take a slow-mo 300,000 fps camera to a CRT scanline, it looks like a perfect line.
As I understand the electron gun still fires wherever it’s told regardless of where the phosphors are. So a CRT can light up any fraction of what appears to be a pixel which modern displays with actual hard set pixels cannot do. As an example, a blue electron gun firing right in the gap between two blue phosphors would slightly light up the corners of both. A modern panel would have to approximate the resulting color of an entire pixel and display that because it can’t light up only the corners of a pixel. So this makes every resolution appear to be native on a CRT. I’m pretty sure this is also why things like the retrotink are very convincing on 4k TVs because we can brute force this effect by having so much more resolution to work with. For example by displaying a 240p image on a 4k tv through a retrotink or similar scaler, each 240p “pixel” is actually made up of many 4k pixels, enough that we can recreate the CRT look fairly convincingly (I’ve never seen one in person myself)
Idk if that made sense, but the channel Technology Connections has a video, "These are not pixels", with a bunch of really good demonstrations and macro shots of different CRTs that explain it way better than I can.
@@xguitarist_ Because 4K OLEDs have a stronger pixel density than CRT phosphor density
@@saricubra2867 with the low hsync on your Samsung, it ain't even close to the resolutions/Hz an FW900 can do.
@@Pe721 Yes, but the FW900 looks blurrier than my Samsung because it has bigger phosphors.
With my Samsung, it's the first time ever that I've seen extremely sharp text on a CRT.
Thank you for doing the due diligence on the scaling issues. Every other reviewer just seems to be smitten by its dual refresh rate.
These silent ringing sounds from time to time in the video drive me mad. It sounds exactly like my doorbell. 6:02 for example has one and there is plenty more in the video.
Wtf you on about? "Silent" ringing?
@@daniel-bg5nq You can only hear it on certain volume levels, might be a frequency that is hard to hear for some. Sounds like a silent doorbell, exactly like my doorbell and it drove me mad. Thought somebody is ringing my door.
@@dusteyezz784 hmmm I'm getting older, but that normally results in loss of higher frequency i thought? Weird either way!
I do appreciate that the display makers are always at work. Every time we think we have hit a wall in capability, they find another path or style of panel to refine the sharpness or reduce the power usage. Wider and wider resolutions, and cheaper. One of the best industries for a consumer.
1:29 No it doesn't. Phosphor decay on CRTs and plasmas took 5+ ms, which is their persistence, meaning that a 240Hz OLED with its
I can tell you by experience that a 144Hz CRT is faster than a 240Hz OLED. I tried both and it's night and day. Motion clarity is a pixel off using an OLED. You need twice the Hz and fps to match a high-end CRT.
@@Pe721 144 Hz x 2 = 288 Hz ≈ 240 Hz
you'd also need a GPU capable of driving 240fps otherwise S&H will increase persistence.
Yes there will always be a difference between S&H and pulse displays but you'd have to be pretty sensitive to discern that at 240fps during gameplay. for most people that's buttery smooth no matter which display tech.
@@Pe721 Due to the phosphors and the way a CRT works, it pretty much flashes the image at you, similar to the ULMB Linus talks about at 1:30. Did you compare with ULMB enabled or disabled? I don't know if OLED even supports something like ULMB; that's more of an LCD + LED backlight technology, but it should deliver a similar effect to a CRT.
@@Pe721 You have a CRT that's 144Hz?
Did he say "More faster-terr-verr"? @5:43 and "close-lee-err" @7:19
This is mostly for humor value, since the return of LTT to their roots of being an informative channel is a welcome change.
WE LOVE YOU LINUS
you're first
Congrats buddy
W
We love Linus!!!!!
Hell yeah
Nice videos as always, Linus! I am a long time subscriber from 2021!
You watched the whole video in 30 sec only?
@@ParasAryan😂
@@ParasAryan I skimmed through it first to see if it was something interesting that I would want to watch, as I haven’t watched Linus in a while, but remember that I liked his videos.
@@BloxDude69huge lie
@@epzo Sure buddy
6:02 almost made me pass out 😵💫
You have the best intro ever tbh
Very interesting that, in the integer scaling chart, 8K does integer scaling for basically every major standard currently in existence. Seems like a pretty obvious marketing point for 8K panel makers (that everything looks crystal clear, VHS to UHD 4K); I'm surprised I've not seen it before.
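Quick sanity check of that claim in Python (the resolution list is just my own pick of common standards): 7680x4320 is an exact integer multiple of 720p, 1080p, 1440p, and 4K, while 4:3 sources like 480p still integer-scale cleanly but need pillarboxing.

```python
# How many times does each common resolution fit into an 8K (7680x4320) panel?

def uniform_factor(src_w, src_h, dst_w=7680, dst_h=4320):
    """Largest integer factor that fits inside 8K (boxed if aspect differs)."""
    return min(dst_w // src_w, dst_h // src_h)

standards = {
    "720p":       (1280, 720),
    "1080p":      (1920, 1080),
    "1440p":      (2560, 1440),
    "4K":         (3840, 2160),
    "480p (4:3)": (640, 480),
}
for name, (w, h) in standards.items():
    f = uniform_factor(w, h)
    exact = (w * f == 7680 and h * f == 4320)
    print(f"{name}: {f}x{' (fills the panel exactly)' if exact else ' (needs pillarboxing)'}")
```

720p at 6x, 1080p at 4x, 1440p at 3x, and 4K at 2x all land exactly on the 8K grid, which is what makes 8K such a natural integer-scaling target.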
Why dual mode??? Just add an integer multiplier into your game. It's that easy; just triplicate 480p to 1440p. Sure, 480p is blocky, but it's 100% just as sharp.
I still don't see why you couldn't just use one EDID and instead change the way things are scaled, like the algorithm actually scaling by integer multiplication instead of blurring everything when upscaling things (when the resolution is an integer multiple)
Why not 1440p at 480hz, that should work through one cable?
That makes too much sense.
Maybe because of the uneven scaling from 1440p to 4K 6:46
@user-jm3mi5zj3e I wouldn't mind it if it was natively displayed without scaling. It would be around 21" pretty good for esport titles.
I think the mode is meant for esports, and there are probably more people who play at 1080p high refresh rate than at 1440p at those crazy high fps
Let's see the new one LG is going to come out with: 27" 1440p 480Hz, in a month or two.
I would like to see 480Hz myself, but at 1080p you already have the bandwidth for 960Hz or even 1000Hz.
Just have a no-scaling option.
Just give us the Hz; frame generation is going to quadruple the fps soon enough.
My Acer monitor keeps dying; too bad I don't have a couple grand to spare lol
great vid
OLEDs are so expensive, but 1 or 2 more years and I'll def get one
Damn, The Day After Tomorrow came out 20 years ago?! Thank you Linus for making me feel old.
That’s cray cray
Why you actin so cray cray
The background song of this video, while wearing headphones, kept sounding like someone was knocking on my window or door. I thought I was going crazy because no one was there.... thanks linus
I still have a 1080p 60Hz monitor and no money to upgrade 😕, but it's still cool to see these new monitors
In the same boat
Surely you can get cheap used 1080p 144Hz monitors no?
Their cheap is my expensive, sadly.
You can't use Integer Scaling with DSC which is pretty much a requirement for high framerate on Nvidia cards. And there's also no easy way to disable it.
CRTs were ahead of their time
They weren't ahead of their time; basically everything that came after them sucks and was a technological downgrade.
idfk why but for some reason this video is killing my eyes. I just see so much motion blur on that game that it hurts my head, I can't keep watching this but take my like anyways Linus
The switch to 1080p@480 is more of a "compromise mode". With the 1-pixel offset you only need to drive half of the pixels with video data, which makes the 480 Hz refresh possible. Otherwise the bandwidth in the panel itself is not sufficient to sustain 4K@480 throughput and update every pixel, so they just drop half of it away :D Anyway, in the early days of 4K panels, we had to configure some TCON registers to switch between a 4K30 mode and a 1080p120 mode (the interface bandwidth is the same). There was one catch: the TCON itself applied some filtering. Normally it looked like integer scaling, but when certain thresholds were met, a sharp corner suddenly became smoothed and rounded :D
Thank you for clearing that up. The whole video I was thinking "I'm pretty sure the 7900xtx supports 4K480hz with DSC over DP2.1" so I was confused why Linus was saying that the display port bandwidth was the issue.
Just to be clear on how it works: are you saying the monitor is only sending pixel instructions to some of the pixels, and the pixels not getting instructions are just matching or averaging the instructions of the neighboring pixels? I'm looking at the picture at 8:00, but it's hard to tell what I'm seeing; when everything's white I can't tell a pixel from a subpixel, and I don't have a close-up of the pixels in normal mode for comparison.
The bandwidth is only relevant for the interface between the monitor and the GPU. But in this case, by identifying as a 1080p monitor, the GPU is sending a 1080p signal; the processing of which pixels to drive takes place on the monitor, so it is the monitor's processing that needs to double the pixels in two directions.
What I'm trying to say is that the argument of driving half the pixels, because otherwise it would be a 4K signal at 480Hz, is not what is happening here, as the internal processing on the monitor is not affected by DisplayPort or HDMI bandwidth.
@@iothomas Ok, I guess my question is: Why does this monitor not support 4k480hz? When I first heard of the gimmick I assumed that the 1080p mode would be letterboxed in the middle of the screen or something, instead it's doing it full screen at 480hz. If the monitor upscaled the 1080p signal to a 4k signal using some upscaling algorithm then it would then have the same bandwidth as a 4k480hz. Yes it has already got past the display port bandwidth restriction, but if that was the real issue then they could have just used DP2.1UHBR13.5 because that supports 4k480hz with DSC and is already in use on AMD 7000 series GPUs and on a couple monitors like Samsung Odyssey Neo G95NC which needs it for its ultrawide 240hz display and the Gigabyte AORUS FO32U2P which actually uses UHBR20(only needed if you have some weird use case that does not support DSC.)
Maybe DP2.1 adds too much to the cost, but I doubt it's more than $100-200, and then this monitor would stand out way more. It's either some annoying product segmentation, and they will soon release a 4K 480Hz monitor, screwing over everyone who bought this one (perhaps waiting for next-gen NVIDIA GPUs, since right now only AMD and Intel support 4K 480Hz), or there is some internal component that can't handle the 4K 480Hz signal, and their weird upscaling is the result of a solution that doesn't overload this component (which I'm pretty sure is what @cameramaker was saying; I just didn't fully understand the exact process he was describing).
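To put rough numbers on the bandwidth argument in this thread, here is a back-of-the-envelope sketch (active pixels only, assuming 10-bit RGB, and ignoring blanking overhead and DSC compression):

```python
# Rough uncompressed video bandwidth estimate: width x height x refresh x bits/pixel.
# 30 bits/pixel = 10-bit RGB (an assumption; 8-bit would be 24).

def raw_gbps(w, h, hz, bits_per_pixel=30):
    """Approximate raw bandwidth in Gbit/s for the active pixels of a mode."""
    return w * h * hz * bits_per_pixel / 1e9

modes = {
    "1080p @ 480 Hz": (1920, 1080, 480),
    "4K @ 240 Hz":    (3840, 2160, 240),
    "4K @ 480 Hz":    (3840, 2160, 480),
}
for name, (w, h, hz) in modes.items():
    print(f"{name}: ~{raw_gbps(w, h, hz):.0f} Gbit/s uncompressed")
```

Uncompressed 4K@480 lands near 120 Gbit/s, well past even DP 2.1 UHBR20's roughly 77 Gbit/s of payload, so it is only feasible with DSC, while 1080p@480 (~30 Gbit/s) is comparatively easy; that is consistent with both sides of the argument here, whether the bottleneck is the link or the panel's internal throughput.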
This is off topic, but I came to this video just to thank all ~100 LTT peeps for having the backs of the little guys.
Hate is everywhere, but the silent majority appreciates what y'all do
Every video calls me poor in more and more ways.
Linus: "Spackle and I, you're a professional writer now"
Also Linus: "More Fasterer than you would expect"
0:37 CRT problem solved, they literally don't care what signal you feed them.
They still have a grid and it's not as simple as he says. 4K on a CRT looks kinda blurry for that reason, but with an interlaced signal it's quite crazy. Nowadays it can only be achieved if you tunnel your GPU through the Intel iGPU output, which might add input lag.
@@Pe721 If I remember right there are some units with a dense enough phosphor pitch to be sharp at 4k.
Either way they adapt very well to different resolutions.
Actually, they do... If you send frequencies that are a bit too far off from standard, you can destroy your yoke or high-voltage transformer. I actually blew up a $30,000 42" CRT monitor by sending it incorrect frequencies a while back. RIP, since good luck finding a yoke for those NEC/Mitsubishi presentation monitors.
@@natsukage3960 seems like a driving problem rather than the actual resolution being fed to it, unless very specific resolutions cause some resonance issue? But that would be specific to the yoke and/or board.
I love how you always test Doom, this time Halo. I hope Quake is next. You and I are both old-school, it seems :)
Historically, the term for the day after tomorrow was "Overmorrow".
Which remains as the term in the Dutch and German language to this day.
@@PhyrexJ and in Swedish too. "Övermorgon" and you can even add one "över" for each additional day you want to skip
@@algotnsame in Dutch, add as many "over"s as you want.
In German, we actually say that ("Übermorgen").
And still is in many European languages.
I came into the video ready to talk about the issue highlighted, and I'm glad you focused heavily on this. It's a trade-off consumers must be informed about.
Why do we need these OLED monitors? Where are the MicroLED monitors that are like OLED but do not burn out!!!!
That would be MicroLED, not MiniLED. MiniLED is already available for PC monitors, MicroLED is probably years away as they work on scaling it down to smaller display sizes.
@@Ozzianman Thanks for correcting me, I meant MicroLED. 👍
@@vladbb59rus MiniLED, MicroLED, very easy to mistake them.
I would guess they still get burn in at some point, anything that's going to be "per pixel" is going to wear out unevenly to some extent. But I'll definitely grab a microled monitor/tv when they're somewhat affordable.
Haven't really had any issues with OLED so far though, and it's what's relatively more affordable today.
Never realised 8k displays can do integer scaling for both 1440p (for gaming) and 4k (for media/productivity). I don't personally see any value in 8k native resolution. But this actually gives me a big reason to get an 8k display
Meanwhile my wallet:
I absolutely love your intro linus ❤
They could've easily made it a 25'' 1440p 360Hz OLED, but they chose this abomination.
1440p needs to go away. It's a terrible resolution. We need to be moving towards stuff that properly scales.
@@Aliothale What kind of scaling are you referring to?
There is an upcoming 1440p 480hz 27 inch coming later this year
@@PD-ws4td 27 inch? Yikes.
@@WrexBF 27 inch is aight in terms of ppi
Linus, that Day After Tomorrow reference was great, and we are the old ones now.
Is burn-in still a problem with OLED in 2024?
Yes and no, and when it is still an issue, it's covered by most warranties.
@@QuickChange919 not true
Burn in can be solved by software
If your software moves a few pixels here and there, you're safe. If not, you'll get burn-in.
not in normal use
Yes, warranties covering burn-in is the only saviour these days! But OLED is top-notch! 🔥
A 4K monitor can display 1080p without scaling artifacts, since 3840×2160 is exactly twice 1920×1080 in each direction. 1440p doesn't divide evenly, though: that would be a 1.5× scale factor.
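The divisibility check behind integer scaling is trivial to sketch (a quick illustration with a hypothetical helper; integer scaling requires both axes to divide evenly by the same factor):

```python
# Which common resolutions integer-scale onto a 3840x2160 panel?
def integer_scale(src, dst):
    """Return the integer scale factor if src fits dst exactly, else None."""
    sw, sh = src
    dw, dh = dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

panel = (3840, 2160)
for name, res in {"1080p": (1920, 1080),
                  "1440p": (2560, 1440),
                  "720p": (1280, 720)}.items():
    print(name, "->", integer_scale(res, panel))
# 1080p -> 2, 1440p -> None (3840/2560 = 1.5), 720p -> 3
```

The same check explains the 8K point made elsewhere in the comments: 7680×4320 is 3× 1440p and 2× 4K, so both integer-scale cleanly.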
everyone hates their OLED once it burns
CRTs also can suffer from burn-in.
I hated the burn-in on my CRT growing up, and there was nothing we could do about it but buy a new CRT. We were too poor to replace it, so we sucked it up.
I have been using this monitor (LG 32GS95UX-B) for over 2 weeks, it is not a bad device, but the advantages of an OLED (if you are used to LG c2, c3 or Apple displays like the Studio Display or PRO XDR) do not really come into their own.
Glass or glossy would simply make the colors pop much more and the font would also be much sharper on the LG 32GS95UX-B.
I hope LG brings out one with a glossy coating for the next model at the level of the LG C series models, that would simply be a DREAM! Maybe next year?
It's funny to me how, not long ago, so many tried to convince us 60Hz was enough, and later that anything above 120/144Hz wasn't necessary or perceivable.
Let's see if at least Linus accepts the challenge, since no one seems to understand my question:
In a competitive online game, if you're lucky you can have 20ms network latency. 20ms means your opponent's position updates at a rate of 50 times per second, even if your monitor can manage 1000fps. Where's the advantage?
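The arithmetic behind this question can be sketched out, taking the commenter's premise at face value that opponent state arrives once every 20ms (in practice the server tick rate, not the network latency, sets the update cadence):

```python
# Back-of-the-envelope numbers for the question above.
# Premise (from the comment): opponent state updates once every 20 ms.
update_interval_ms = 20
updates_per_second = 1000 / update_interval_ms   # 50 updates/s

monitor_hz = 1000
frame_time_ms = 1000 / monitor_hz                # 1 ms per frame
frames_per_update = update_interval_ms / frame_time_ms

print(f"{updates_per_second:.0f} opponent updates per second")
print(f"{frames_per_update:.0f} monitor frames drawn per network update")
# 50 opponent updates per second
# 20 monitor frames drawn per network update
```

So under that premise, roughly 20 consecutive frames would show the same (possibly interpolated) opponent position, though local inputs and your own viewpoint still benefit from every frame.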
Love the Doug entrance to the video
When are you guys going to talk about those new 3D TVs that are popping up at tech shows all over the world? Samsung is working on one that uses eye tracking instead of glasses.
I actually have a feeling that Nintendo's next Switch handheld will use something similar. Switch 3D could be awesome.
I'm a professional Valorant coach who does content sometimes. I've been using this monitor as part of a partnership with LG, and for my workflow I absolutely LOVE this display. Absolutely game-changing for esports players who do content as well.
EDID is a really frustrating part of HDMI/DP, as it often seems like every vendor has implemented the standards in slightly different ways. It doesn't matter much for the individual, but when dealing with large meeting-room systems, or even just having to give presentations in random spaces every day, it gets painful. Some dedicated conferencing systems still ship with dedicated EDID emulators for a few reasons, but that's what it takes to keep settings from auto-adjusting, or even to keep some PCs from dropping the connection when they check the EDID and that part of the monitor's hardware has gone to sleep.
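For a sense of what's in the blob vendors keep mangling, the fixed portion of an EDID base block is simple to decode. A minimal sketch; the sample bytes are constructed for illustration, not a real monitor dump:

```python
# Decode the manufacturer ID from the start of a raw EDID base block.
def decode_edid_vendor(edid: bytes) -> str:
    # Every EDID base block begins with the fixed 8-byte header.
    header = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
    assert edid[:8] == header, "bad EDID header"
    # Bytes 8-9: three-letter PNP ID packed 5 bits per letter ('A' == 1).
    word = (edid[8] << 8) | edid[9]
    letters = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
    return "".join(chr(ord("A") + v - 1) for v in letters)

# "GSM" (LG's PNP ID) packs to 0x1E6D; the rest is zero-padded for the sketch.
sample = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00,
                0x1E, 0x6D]) + bytes(118)
print(decode_edid_vendor(sample))  # GSM
```

EDID emulators essentially pin a known-good copy of this block (plus the timing descriptors that follow it) so the source always sees the same capabilities regardless of what the sink reports.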
The way Linus started this vid is literally the same way Doug Demuro intro's every one of his videos 😆
8:58 your overdub game is getting better. I *almost* didn't notice it lol.
Loved the "it's time to D-D-D-DUEL!!!!" message; I often still use it on my unsuspecting co-workers xD
I'm surprised LTT has taken this long to cover this.
Also, it would be cool for you guys to look into Lossless Scaling. It's software on Steam that is basically "downloading fps": it offers resolution scaling AND frame generation. I heard about it from a Star Citizen redditor and thought, why not? I can try it and always return it; it's on sale for the summer sale anyway. And man, it's trippy. It just works. I don't think it works perfectly in every instance, but wow, when it does. One of the use cases is games that are locked at a lower fps, like 60. Lo and behold, I just started AC3 Remastered and it's locked at 63, and it works FLAWLESSLY. A perfect 125fps and no perceivable input lag. It didn't work for The First Descendant though; it has this weird choppy cursor movement. Same when I tried its frame generation on YouTube: the mouse is super choppy.
I bought this, and it feels wonderful being able to have 4K for normal games and general use, and 1080p for Counter-Strike and Valorant, in one monitor.