It's for FPS gamers who probably have their game at the lowest settings and are playing "older" games, not the latest CoD. They might even run at 1080p instead of 1440p to get the 480hz.
I bought this monitor and tested it for about a week. I had my wife and kids set different refresh rates from 240 to 480 in cs2. 240 is easy to pick out, but the difference between 360 and 480 was not. Many times I thought I was at 360 when in fact I was at 480 and vice versa. I have above average response times on human benchmark at about 140 to 150 ms. I honestly thought I would consistently notice the difference, but it wasn't possible between 360 and 480. The panel being WOLED also doesn't have the same color saturation as a QD-OLED. I ended up going with the QD 360 hz for 350 dollars less and was really happy. I think Asus has jacked up the price because of that 480 number, which most people won't be able to take advantage of
I recently upgraded from years of 27 inch IPS monitors to a 49 inch OLED ultrawide. The DAW plugins I use appreciate the wide as hell screen. I love it. Haven't played an fps since unreal and unreal tournament. Just hack and slash for me. 6900 xt and I'm set. Screw going back down... For now?
Didn't even check the banding again after changing to HDMI 2.1. Enable 10-bit color in your driver to reduce the banding, which will be supported with DSC.
Because it makes no sense for these types of videos to be in 60fps. Having them in 1440p makes more sense. Plus, since it's DOUBLE the frames, it would mean a huge increase in file size for their archive servers. Besides, since the monitor is 480hz it literally doesn't matter if the video is in 30 or 60 fps. It still won't be even close. So it makes no difference.
Except whoever sped up the "Linus Tech Tips" intro animation - the original is 25fps (so normally every 5th/6th frame is repeated to fit the 30fps container), but instead of just speeding it up to 30fps without any repeated frames, they seem to have sped it up by some other fractional amount, so you get weird inconsistent repeated frames, making the "Linus Tech Tips" intro animation not really feel any "faster" despite the speed-up.
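For context, here's a tiny sketch of why a fractional speed-up produces inconsistent repeats (my own illustration of the frame math, not the editors' actual pipeline): each output slot grabs the nearest earlier source frame, and a duplicated index in the result means a repeated frame.

```python
# Map each output slot (at dst_fps) to the nearest earlier source frame
# (at src_fps). Duplicate indices in the result are repeated frames.
def frame_map(src_fps: float, dst_fps: float, n_out: int) -> list[int]:
    return [int(k * src_fps / dst_fps) for k in range(n_out)]

# Plain 25fps material in a 30fps container: a clean repeat every 6 slots.
print(frame_map(25, 30, 12))   # [0, 0, 1, 2, 3, 4, 5, 5, 6, 7, 8, 9]

# A slight speed-up (25fps material effectively played as 27fps source)
# gives irregularly spaced repeats instead, which reads as judder.
print(frame_map(27, 30, 12))   # [0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 9]
```

The first pattern repeats a frame at regular intervals; the second bunches its repeats unevenly, which is the "weird inconsistent repeating frames" effect described above.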
10:29 As someone who’s played this game too much, you can use mods or the Lossless Scaling program to get around the capped fps. Makes a huge difference, but disappointing that game studios still do this.
Yeah nvidia has gone crazy. Seems like AMD is doing more and more to close the gap in terms of features. Just not hitting the very top end like nvidia.
I've been AMD for GPUs for a long time as while they might not be as powerful as Nvidia, they just offer more value per greenback spent. Also a bonus if you're a Linux user like myself, then AMD just plain works better out of the box with little fuss, just make sure you're on the latest Linux kernel for best performance, which on Manjaro is a few clicks and your password.
Man that intro was really jarring, mostly because of the pitch increase. I'm so used to UA-cam speeding up video and audio without changing the pitch. Where did that trope come from anyways, some analog thing?
I think it’s really worth mentioning the OLED vrr flickering issue. I don’t know how well this monitor handles it, but every oled so far has had this problem. It really deserves attention. I have a Corsair oled and an Alienware. The Corsair is gsync compatible, but is unusable with it because of flickering. It’s actually a big deal. I haven’t tried as much with the Alienware, but it does seem a bit better (not enough to consider vrr usable though). I don’t see many reviewers mention it, but when these monitors advertise freesync and gsync support, it’s practically false advertising given the flicker. To be clear, I still think OLED is worth it (I have 2 for a reason), I just don’t use gsync.
@@Hathos9 yeah, I saw their stuff on it. As usual, they are fantastic at putting numbers to everything. I would love for more reviewers to acknowledge that and include some of that context as well though. Rtings is where I went when I was figuring out what to buy, but I also watched some video reviews and not one of them mentioned it. Rtings actually only recently started doing that; I think it was within the last 6 months or so.
Yep. Personally I'd choose a larger monitor at older and lower qualities rather than a super high quality but smaller modern one. I play a lot of city builders, colony sims, resource management, and grand strategy, and most of that doesn't benefit from high refresh rates/more FPS much at all, benefits only a bit from better color fidelity and pixel density, but OH BOY does it benefit from a larger screen and higher max resolution. My childhood/middle school favourites that I used to play on an absolute shitbox 2000s laptop, on a modern system with an older but rather large monitor? Good *god* my nice grown up setup is awesome for those games, and while the improved smoothness and graphics quality is definitely all thanks to my nice modern computer hardware, a bigger (even if not necessarily otherwise better) monitor is absolutely one of the best system upgrades you can buy for the types of games I enjoy.
I also ditched my Ultrawide. It has not kept up at all. When it launched in 2015, it was 100Hz UW and 144Hz non-UW. NOW we got 480Hz non-UW and ultrawide doesn't even have a true 200Hz. Also I feel dual screen is superior for productivity. UW only better for certain apps and single player games.
I just got the QD MSI 49" and so far it's pretty amazing, they just need better virtual monitor support. You can have two inputs to the same monitor and do picture in picture, which is janky and not really customizable, but it would be nice to have a fullscreen borderless window in the middle and two open sides so every game isn't stretched to the full 49" and you can't multitask without another monitor. Thankfully I can order taller monitor arms and have my two 27" above the ultrawide, but I'd much rather spend an extra $200 for software that comes with the ultrawide and allows virtual monitor screen splitting natively.
Playing above roughly 110fps isn't worth it. The human eye literally can't even see that fast, and your brain is even slower at processing what you've just seen. In fact, I'd say much more than about 80fps is mostly pointless. It can be worth it to go a bit higher so that your 1% lows are higher, which makes for a more consistent experience, but as far as raw speed is concerned, around 110-150 is the literal limit of human biology.
@@GeneralNickles you're wrong dude, from 100fps it's smooth enough, but that doesn't mean 180 for example doesn't feel more real and "smooth". Our brain doesn't have a limit, you are converting digital to analog
@@GeneralNickles this is not true at all. Those that can’t see the difference don’t know what they’re looking for or how to „feel“ the difference. The brain processing is not the same for everyone either, that’s why some people have very slow reaction times and others very fast. I react to most things with a thousandth of a second(or what most monitors refresh at) and am capable of processing that very same thing I reacted to. Many people, including me, are more than capable of seeing and feeling the difference of 110+fps. I have no idea where you found that „literal limit“ but it is not true at all. This sounds stupid, but seasoned or hardcore gamers or just those with fast brains can and will see the difference. I’m sure most people with ADHD are able to.
@@affe1314 "I'm a super human hardcore gamer and I bought a high refresh monitor because bigger number better. Who cares if it actually makes any difference? I pretend it does because I wasted a shitload of money on it." That's literally what you just said. Your schizophrenic combination of a superiority complex and sunk cost fallacy doesn't change objective facts.
@@GeneralNickles it's really easy to see the difference between 240hz and 360hz in my experience. The human eye doesn't have a limit of 80fps lmao what
at this rate, we will see government regulations soon... because some "gamers" will run games at 1000fps and still think it makes them play better... the 5090 will eat 600w, this is getting ridiculous. everyone is promoting energy saving and going green to save the environment, but meanwhile gamers need more and more power just to satisfy their illusion of compensating for their lack of skill with more fps...
@@captainheat2314 build better houses.. here in germany, we never had or needed air conditioning for our homes.. fix the source problem, don't fight the symptom. we can no longer afford mindsets like yours of "if they are doing it, we do it as well"... you can game very well with a lot less power hungry systems.
1:20 Seems to me you guys have way too much fun to be working, but then again, when the boss is running around in a onesie... maybe I am just a grumpy old man.
When I got the email about that monitor I was like "HOLY CRAP THAT'S A THING?!?!?" "Wait, will I have to sell one or both kidneys to buy it? Both? Nevermind."
If the DSC mode is the same it will be noticeable whatever cable you're using. At 5:44 you can see there's an on/off toggle but not one to change the level of compression. So for now your cable doesn't matter as it will be hugely compressed anyway... not even the rx7000 series would help because they're dp2.1 uhbr 13.5 (54 Gbps) Edit: Spelling
@@NavneetRao nah, sounds like a bit depth thing, might be handled at lower accuracy to decrease response times. gaming monitors in general will simply try to put out images as fast as possible, especially in gaming mode, which means minimal to no image processing on the SoC in the monitor
I’ve been daily driving this monitor since the beginning of 2024. It’s super expensive but maaaan… it’s so nice. I love it, super stressed out about burn in but there’s a few good things in the monitor settings that really help stop burn in. Been almost 10 months and still going great. I love it. Definitely not going to be good bang for the buck. But if you have the cash and really want it.. it’s a fantastic, beautiful experience.
doesn't the RX 7900xtx support DisplayPort 2.1? Though I can't find much information on whether it supports UHBR20 directly? looking further, it looks like a big portion of the 7000 series do support DisplayPort 2.1 edit: apparently DisplayPort has done a USB and DisplayPort 2.1 is what you need to implement to support UHBR20, however you don't need to support UHBR20 to have DisplayPort 2.1, so dumb
@@bod9001a consumer rx7000 GPUs don't support the full dp2.1 bandwidth; the only current GPUs that do are the professional AMD 7000 GPUs. E.g. the 7900xtx does uhbr13.5, which is 54 Gbps, not the full 80 Gbps of dp2.1
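Since this thread keeps juggling raw link rates, here's a quick sketch of raw rate vs. usable payload after line coding (approximate figures from the public specs; FEC and protocol overhead are ignored, so real payloads are slightly lower):

```python
# Raw link rate vs. approximate usable payload after line coding.
# 8b/10b (DP 1.4) burns 20% of the link; 128b/132b (DP 2.1) only ~3%.
CODING_EFFICIENCY = {"8b/10b": 8 / 10, "16b/18b": 16 / 18, "128b/132b": 128 / 132}

LINKS = [
    ("DP 1.4 HBR3",     32.40, "8b/10b"),
    ("HDMI 2.1 FRL",    48.00, "16b/18b"),
    ("DP 2.1 UHBR13.5", 54.00, "128b/132b"),
    ("DP 2.1 UHBR20",   80.00, "128b/132b"),
]

for name, raw_gbps, coding in LINKS:
    payload = raw_gbps * CODING_EFFICIENCY[coding]
    print(f"{name:17s} {raw_gbps:5.1f} Gbps raw -> ~{payload:5.2f} Gbps payload")
```

So a UHBR13.5 port delivers roughly 52 Gbps of actual pixel payload, well short of UHBR20's roughly 78 Gbps, which is why "supports DisplayPort 2.1" on its own says little about bandwidth.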
YALL HAD ME FREAKING OUT WITH THE SPED UP INTRO!!! I literally just put together an audio system from the thrift store to replace my broken logitech computer speakers and this is the first video I clicked to make sure it's working and I was like WTF HOW IS IT HIGHER PITCHED???
lol
I thought I was on 2x speed😭
Lmaoo. And here I was checking if I have put video in 2× and was confused
honestly it kinda sounds like og linus from years ago lmao
I thought youtube was remembering playback speed based on channels now as the last thing I watched by LTT was the wan show and I always watch that sped up.
Sounds like that sonic costume is on a bit tight there
truly underrated comment here
Only the Intro
They're non-functional anyway, I doubt it affects him
@@GodardScientific feature, not bug
bro ate helium
not the helium-induced sonic linus for an intro
edit: so the whole intro is sped up instead?? weird artistic choice editors, but i dig it
@@jefez75 I was so sure I had accidentally sped up the video or something
@@KramerTheCourier me too omg
nah
@@KramerTheCourier I thought the same lol, it is almost 1.5x
They do it from time to time.
"This is a completely different wallpaper so we are unable to check for banding".
FFS take 60 seconds and change the wallpaper and look!
7:55 btw, and yeah that's a fumble
@@egarcia1360based on the different icon grouping, it might be his own system, and he just changed inputs on the monitor.
@@alhdgyszlike he didn't have the wallpaper available to download...
They’ve changed, guys, they’re not rushing out videos anymore. 😂
their test benches tend to use unactivated copies of Windows, which means they can't change the wallpaper
Just a note, I have this monitor: you have to change the Color options in the Nvidia control panel to get rid of that banding. Set it to max RGB Full and 10BPC and you'll see a world of difference. Hope it helps....
That might reduce the max refresh rate
@@prich0382 it does not, I have the monitor.
@@prich0382 no, it does not
You can do 1440p @480hz with RGB and 10 bit SDR
DSC will allow for pretty much anything you can throw at it
It's just a matter of how compressed that image is going to be
Personally I don't like HDR (not sure if it's doable with HDR)
Full RGB or 10 bit needs even more bandwidth. However, I much prefer having full RGB turned on even if I need to lower the refresh rate. I can't stand banding
@@azenyr agreed, however since this monitor has DSC it is not an issue.
Linus: " look at how good this is!"
**Me watching on 2 inch screen on my phone, 30 FPS, 360P**
ain't our fault you're watching on a nokia apparently
@@RealForensicTV the video is still only in 30 fps…
The eternal conundrum of trying to sell good screen quality from another poorer screen.
@@victor555117 it's probably actually in 60, but youtube has started locking higher bitrates and framerates behind paywalls
Come on.. I'm watching it on 0.75" less than 144p.... 6 FPS....
2:00
"To be clear, everything we're seeing here is something we've seen before"
*Linus drops objects*
Yep, definitely seen that before.
It was perfect 😂
Came here looking for this comment 😅
The opening being a bit sped up was throwing me for a loop because the voices sounded so weird in my ears. It was sped up just enough to have an uncanny feeling.
i watch every youtube video on 1.8x speed so i knew they pitch raised him but didn't notice the speed being off
@@Nderak your poor brain must be fried, wth 🤣
@@aaronlandry3947 yeah, I had to check the speed in my YouTube app twice cuz I wasn't sure what was going on. I figured it was at least at 1.25.
I'm thinking it was shot in cinematic 24fps and sped up to 30fps. Fluid motion while speeding things up just enough to feel uncanny.
@@Nderak ah. So you either have the tism or terrible adhd. Or both
The intro being sped up and campy made me appreciate stuff i have grown to just see as background noise. Your intro song is still so great and i love the actual bumper animation. Great channel guys. Love that you keep it fun even though it's a medium sized business now
I preordered the Alienware AW3423DW over 2 years ago. I paid full price and do not regret it in the slightest. It is the one upgrade I have made to my setup that I feel zero buyer's remorse for as I watch prices fall and technology get better. It is absolutely incredible. I can't see any reason that I would ever need to upgrade. If you don't have an OLED, now is the time to get one. Prices have fallen and the technology has matured enough that unless you are shopping in the ultra budget category, nothing else makes sense. I would skimp on other parts and try to fit an OLED in your budget if at all possible. It has made a bigger difference for me than any other PC part ever has.
Agreed. Some games look great in SDR but look awful relative to their HDR counterpart. My monitor was the reason some of my friends upgraded theirs, because true HDR just looks that much better.
I second this sentiment
"If you don't have an OLED now is the time to get one"
I uhh, I wish but, the cheapest OLED in my country is around $1k 😅
@@AXLP_LaZEReD Besides that, I'd say that it's better to wait for a newer iteration/replacement for WOLED. As of now, both technologies (the former and QD-OLED) are too flawed for the price (unless $800-1200 is not a huge amount of money for you). And I'm not even talking about relatively low brightness and burn-in.
If you can wait, you'll certainly be better off doing that. If dropping $1000 is no biggie for you, or you can see yourself upgrading in just a couple of years from now, or you just can't wait/bear the LCD misery any longer... Only then you should really go for an OLED now. Imo.
@@juanblanco7898 what flaws are you referring to?
3:00 I like the fact that LTT is still keeping up with the "owns a display" bit with Plouffe even after like a year or so😂😂
can someone give me the lore on this?
@@williambrown3699 In a video (can’t remember which), Plouffe is doing a monitor review and there’s some talk about him being the display guy at LMG. He says something like “Yeah, I’ve worked with a lot of displays, I even own a display.”
@@CanIHasThisName The video is "Double your FPS for free" at 9:56
well of course, he owns the legendary alienware aw3423dw
LTT was sponsored by LG and given the monitor. Scumbag marketing bullshit.
I clicked on this video to quickly test my speakers as I was having problems with the audio. Everyone's voices being high pitched didn't help...
what are the fucking chances lmfao
Same. I had just reconnected my headset after swapping the batteries and was like, is it dying???
i thought it was my system only lmao this is too funny
Can’t wait for a new monitor sponsor to replace this one in a few months
Then another, and another, and another one.
Hopefully with something less idiotic than this one.
@Garrus-w2h 690hz 5K
@@costafilh0 how is it idiotic
@@costafilh0 and it's idiotic how?
9:50 what about my kitchen sink? 🤔🤌
Linus's comments on the "sync" options around 9:50 is something worth further discussion.
I got the Alienware AW2725DF about a year ago now and I disabled Freesync after just a week of use.
I love the monitor and my computer (5800X3D/6950XT) is powerful enough to drive it at well over 100FPS in any game, and I didn't notice any difference in frame timing/feeling of the game between Freesync on and off to warrant using it.
What I did notice with Freesync on though was EXTREME flickering. I only learned this after purchase: when Freesync is enabled and you experience large fluctuations in FPS (CS2 for example, goes from usually around 250-300FPS occasionally down to 100) the monitor flickers and changes brightness HARD.
Not an issue with Freesync disabled, and it does seem mostly unnecessary at these high refresh rates so it's not much of an issue, but it's something that should be tested on these high refresh rate OLEDs.
You should definitely do a test with the employees to see how many can spot DSC. Pretty much every review of monitors using it says you can't tell. But seeing tech savvy people doing a side by side would be very interesting.
You'd probably need some specifically crafted test images that defeat the compression scheme.
@@shinyhappyrem8728 not for the test i had in mind. That would be about real world usage and if a group of tech savvy people can see the difference.
The compression is probably less visible at lower compression ratios.
@@Loanshark753 100%, like switching from the DP cable to hdmi 2.1 should have switched from 3:1 DSC to 2:1 DSC. A compression ratio of 3 down to a ratio of 2, which is better.
The conclusion felt too short. Is the color banding still obvious with the bit more bandwidth? What was used at the end, 48Gb/s instead of 32Gb/s? Of the required 72Gb/s that's still just a fraction more, so the color banding must still be very obvious, as it was with 32Gb/s.
This video was an advert. Glosses over all the important stuff and talks nonsense like "feels more connected".
Yeah, I was hoping they'd at least switch to the same wallpaper to compare, but they didn't even do that and ignored the issue for the remainder of the video...
it's awful, dsc should never have become mainstream, visually lossless is a LIE
@@johnnypopstar people don't notice they are watching a commercial 🤣
A dedicated video would be very interesting. This is one of the reasons I'm holding off on upgrading my monitor.... together with not trusting cable manufacturers.
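To put rough numbers on the bandwidth debate in this thread (a back-of-envelope sketch; blanking intervals and protocol overhead are ignored, so real requirements and hence real DSC ratios are somewhat higher):

```python
# Uncompressed video bandwidth for 2560x1440 @ 480 Hz, and the DSC
# compression ratio a given link payload would imply.
def raw_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    return width * height * hz * bits_per_channel * 3 / 1e9  # 3 = RGB

need = raw_gbps(2560, 1440, 480, 10)  # 10-bit RGB
print(f"uncompressed: ~{need:.1f} Gbps")

# Approximate usable payloads after line coding (8b/10b and 16b/18b).
for link, payload in [("DP 1.4 (HBR3)", 25.92), ("HDMI 2.1 (FRL)", 42.67)]:
    print(f"{link}: ~{need / payload:.1f}:1 DSC needed")
```

Even before blanking overhead, 10-bit 1440p at 480 Hz is around 53 Gbps, so DP 1.4 needs roughly twice as much compression as HDMI 2.1 to carry it, consistent with the ratio difference discussed above.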
Who else checked the playback speed?
I’ve been running at 1.5 speed for some longer videos yesterday. I swear it was glitched and still on.
I am used to Louis Rossmann. This intro was in slow motion
I was about to check
YouTube's playback speed option doesn't change the pitch of the sound though
Guilty! 👋🏼
Been using this monitor for 2 weeks now and I'm super happy with it. I was doubtful if it would be a significant upgrade from my 1440p 27" 280hz IPS but the OLED and 480hz is insaneeeee.
It’s instant huh? Even at the same Hz as your old one right?
@@GhosTigre Yeah it really feels as if you're watching the gameplay through a real life window. I haven't tried the monitor at 280 hz (old monitor hz) yet though.
speeding up the intro/title screen was honestly a great touch. it didn't go unnoticed.
9:03 casual dragon lore
Noticed that as well
Yeah I did a double take
immediate pause and rewind
Watching at 144p, one fully appreciates the clarity of the monitor. I can almost make out the words!
Linus' voice went back to NCIX tech tips days
I restarted the video twice thinking it's that bug where audio plays faster than video. You got me there nicely.
Absolutely loved the Sonic collab. I just met my favourite Sonic youtuber (mastertrainergaming) at PAX in Melbourne Australia, and this popped up on the way to the airport. What a perfect way to end a weekend with Linus Sonic himself.
am i right that the intro is 1 note higher? (1:17)
Yeah I think they increased the pitch of the entire intro including the voices of the skit at the beginning
Yes it's faster cause it's Sonic
@@Andreeee75 ⚠️✋️sanic
Doppler effect?
I sped the intro up to 2x and it made it even better 😂
Linus upgrading his monitor every three months when someone sponsors it for advertising.
Just like his TVs
And coming up with a load of rubbish to justify it.
@@thedizzlor hmmm smells like jealousy to me.
You should know full well in a sponsored video from Linus that his opinions are always his own, never paid for.
On a side note also - if someone wants to sponsor a video for a fantastic product and sends him one to use in his personal space he would be a fool to say no…
"golly gee indeed! Simply a load of rubbish. 🇬🇧🇬🇧🇬🇧"
It's funny how no review ever mentions the gradient banding problems on WOLED, but Linus instantly spots it. Maybe make a video about that?
yup on my 240hz LG oled, the gradient banding especially on scenes where it turns from grey to black has this blob-like bubble that slowly wraps around where the grey meets the black parts of the scenes. it's a bit distracting but only happens in nighttime/indoor stuff
@@carlestrada that's odd
On LG OLED TVs the banding almost disappears when you activate PC mode. In the standard "game mode" it looks awful for sure.
It's the problem of the image, not the monitor.
@@maybebutwhatever yup it's odd indeed. I rarely see LG OLED TV owners complaining about the color banding in forums, while it seems blatantly obvious on scenes (for my specific 240hz lg oled model) with a lot of gradients, like what linus shows in the vid.
It's real, the banding is real and it's there and never goes away. It's sometimes distracting for me, but it's not a deal breaker for the amazing reduction of input lag compared to a ghosty fast ips.
my LG 240hz 1440p has racked up 845 hours since I bought it this April 2024 and the grey banding has somewhat improved? I haven't run the aggressive pixel cleaning function yet. Mostly the automatic image cleaning every 4 hours since I haven't seen any signs of uneven panel wear yet. I don't watch movies with black bars on it, only 16:9 content in youtube.
I'm sure no 27 inch OLED owner talks about the gradient banding issue because:
1. there are not a lot of 27 inch oled owners out there
2. OLED monitors are still at a steep entry point for a lot of people so it makes more sense for them to look at better alternatives, like an OLED TV at that price point.
3. a lot of people are still scared of the burn-in problems and panel degradation issues that OLED has in general.
4. Even 240hz 1440p is demanding on the hardware. So you'd need an entirely new PC, separate from the cost of the OLED itself, just to enjoy it. Unless you are a competitive gamer or are fine with turning down settings to stay on that 240hz.
@@hh5523tw nah it's the monitor itself. I have the 27gs95qe 240hz OLED from LG and it suffers the same fate of color/grey banding in gradients. It's the one weakness of MLA WOLED and it's never going away. It gets better in normal, bright scenes, but the moment you do get into a darker room scene, or nighttime scene, it's gonna rear its ugly head in like the banding in dragon's dogma's shadows for example.
WHAT AN INTRO!! That was soo sick!
I feel like the pitch shifted linus at the beginning is how my wife hears him whenever I have a normal LTT video on.
Thanks for the great vid :)
Lol, the intro voice was so weird that I had to rewatch it because I had no idea what the video was about.
*The RTX4090 that's required to run a modern game at 480hz 1440p not included.
imagine new tech, requires new tech.
even a 4090 can't do that for most games. Almost all games in fact.
and here i sit, with a smile on my face, perfectly content with 1080p at 60fps max on a 40 inch tv... why? simplest and primary answer is diminishing returns... i can build a pc that will play every game i own at 1080p with decent settings and good fps for 400-450 dollars... switch that to 1440p, you're looking at absolute bare minimum 550-600... and 4k, if you can get some lightly used parts, maybe 1100 dollars minimum
its for FPS gamers that probably have their game at lowest settings and are playing "older" games, not the latest CoD. They might even run at 1080p instead of 1440p to get the 480hz.
@@unholysaint1987 Good for you, but asceticism isn't for everyone. Most of us have standards.
That elephant stability was impressive. I see you editor.
I bought this monitor and tested it for about a week. I had my wife and kids set different refresh rates from 240 to 480 in cs2. The 240 is easy to pick out, but the difference between 360 and 480 was not. Many times I thought I was at 360 when in fact I was at 480 and vice versa. I have above average response times on human benchmark at about 140 to 150 ms. I honestly thought I would consistently notice the difference, but it wasn't possible between 360 and 480. The panel being WOLED also doesn't have the same color saturation as a QD-OLED. I ended up going with the QD 360hz for 350 dollars less and was really happy. I think Asus has jacked up the price because of that 480 number, which most people won't be able to take advantage of
I recently upgraded from years of 27 inch IPS monitors to a 49 inch OLED ultra wide. My DAW plugins I use appreciate the wide as hell screen. I love it. Haven't played an fps since unreal and unreal tournament. Just hack and slash for me. 6900 xt and I'm set. Screw going back down... For now?
wtf did your voice get higher ?
@@ThirdyAntonio the entire video is sped up for some reason
Because he’s Sonic.
someone commented before the intro
He started HRT
I was so Confused
I thought I messed up my audio settings
The banding is solved by enabling Dithering in the color settings
Didn't even check the banding again after changing to HDMI 2.1. Enable 10-bit color in your driver to reduce the banding, which will be supported with DSC.
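For anyone curious why dithering hides banding, here's a minimal Python/NumPy sketch (the level count and block size are made up just to exaggerate the effect): quantizing a smooth gradient to a few levels produces hard steps, while adding about one quantization step of noise before rounding trades those steps for fine grain whose local average still tracks the gradient.

```python
import numpy as np

# Smooth horizontal gradient, values in [0, 1]
grad = np.linspace(0.0, 1.0, 1024)

levels = 16  # deliberately coarse quantization to exaggerate banding

# Plain quantization: hard steps -> visible bands
banded = np.round(grad * (levels - 1)) / (levels - 1)

# Dithered quantization: add ~1 step of noise before rounding, so each
# pixel randomly lands on one of the two nearest levels
rng = np.random.default_rng(0)
noise = (rng.random(grad.shape) - 0.5) / (levels - 1)
dithered = np.round(np.clip(grad + noise, 0, 1) * (levels - 1)) / (levels - 1)

# Compare how well small-block averages follow the original gradient
blocks = grad.reshape(64, -1).mean(axis=1)
err_banded = np.abs(banded.reshape(64, -1).mean(axis=1) - blocks).mean()
err_dither = np.abs(dithered.reshape(64, -1).mean(axis=1) - blocks).mean()
print(np.unique(banded).size)    # 16 -- only 16 distinct values, hence bands
print(err_dither < err_banded)   # True -- dithering tracks the gradient better
```

Both outputs use the same 16 levels; the dithered one just distributes the rounding error so your eye averages it out, which is the same trick GPU-driver dithering uses on a larger scale.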
@7:47 oof, good that you put a warning there. That's awfully seated indeed :p
Filming a 480Hz monitor at 30fps, pain. Why can't LTT change to 60?
Because it makes no sense for these types of videos to be in 60fps. Having them in 1440p makes more sense. Plus, since it's DOUBLE the frames, it would mean a huge increase in file size for their archive servers. Besides, since the monitor is 480hz it literally doesn't matter if the video is in 30 or 60 fps. It still won't be even close. So it makes no difference.
double the data rate. streaming sucks.
Imagine 4k60 with hdr and no compression. One can dream
stop consuming high numbers without even knowing what the fuck they mean ffs
@@TheFinePlayer 4k@60 would be piss easy for their budget. They are already in 4k and going from 30 to 60 does not double file size
no way at 9:02 Linus got killed by a skin worth at the very least 3 times the price of the monitor lmao
"worth". It doesn't even exist, it's just meaningless.
@@Tomazack i almost never dislike comments
@@Tomazack I almost never like comments
@@Tomazack what?
@@Tomazack huh?
Watching a video about a 480hz monitor at 30hz...
At least your photographers and editors know how to make it bearable.
Except whoever sped up the "Linus Tech Tips" intro animation - the original is 25fps (so normally every 5th/6th frame is repeated to fit the 30fps container), but instead of just speeding it up to be 30fps without any repeated frames, they seem to have sped it up by some other fractional amount, so you get weird inconsistent repeated frames, making the "Linus Tech Tips" intro animation not really feel any "faster" despite the speed-up.
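The repeated-frame cadence that comment describes can be sketched with a simple floor mapping (not necessarily the exact method any editing pipeline uses): each output frame at 30fps shows whichever 25fps source frame is current at that moment.

```python
# Fitting a 25 fps animation into a 30 fps container: output frame i shows
# source frame floor(i * 25/30), so one frame in every six is a repeat.
def pulldown(src_fps, dst_fps, n_out):
    return [i * src_fps // dst_fps for i in range(n_out)]

print(pulldown(25, 30, 12))
# [0, 0, 1, 2, 3, 4, 5, 5, 6, 7, 8, 9]  <- frames 0 and 5 shown twice
```

Speed the source up by exactly 30/25 = 1.2x and every output index maps to a unique source frame; speed it up by any other fraction and you get the irregular repeats described above.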
the guy at 9:03 had a fricking dragon lore
Yeah. I had to double take as I was like "wait, was that a dragon lore??" Lol
I had to skip back for that too :D
Hahaha... What is that?
@@arnox4554 one of the rarest and most expensive skins in cs2
One of the most expensive counter strike skins @@arnox4554
They're having fun, I like to see that! Keep it up!
9:57 Only Linus keeps the protective plastic on the GPU fans when testing. 😂
10:29 As someone who’s played this game too much, you can use mods or the Lossless Scaling program to get around the capped fps. Makes a huge difference, but disappointing that game studios still do this.
What game is it?
@@UnknownButlovesFood I timestamped the section in the original comment: it's Sonic Frontiers.
Timberborn at 11 minutes
If 50 series is a dollar more I'm with amd, I wanted Nvidia benefits, but enough is enough
Yeah nvidia has gone crazy. Seems like AMD is doing more and more to close the gap in terms of features. Just not hitting the very top end like nvidia.
@@perkulant4629 top end always has diminishing returns
it's not a dollar more (it's more more)
I've been AMD for GPUs for a long time as while they might not be as powerful as Nvidia, they just offer more value per greenback spent. Also a bonus if you're a Linux user like myself, then AMD just plain works better out of the box with little fuss, just make sure you're on the latest Linux kernel for best performance, which on Manjaro is a few clicks and your password.
Im grabbing the bleach theres a damned hedgehog on my screen
Drop the hypodermic; he doesn’t have Covid. 🫣
Wow, the intro sound. I know everyone has run to the comments to note it, but it's kinda good.
Man that intro was really jarring, mostly because of the pitch increase. I'm so used to YouTube speeding up video and audio without changing the pitch. Where did that trope come from anyways, some analog thing?
I think it’s really worth mentioning the OLED vrr flickering issue. I don’t know how well this monitor handles it, but every oled so far has had this problem. It really deserves attention. I have a Corsair oled and an Alienware. The Corsair is gsync compatible, but is unusable with it because of flickering. It’s actually a big deal. I haven’t tried as much with the Alienware, but it does seem a bit better (not enough to consider vrr usable though). I don’t see many reviewers mention it, but when these monitors advertise freesync and gsync support, it’s practically false advertising given the flicker. To be clear, I still think OLED is worth it (I have 2 for a reason), I just don’t use gsync.
+1
Rtings did a comparison of different OLED monitors with this issue. Asus handles it pretty well. I don't notice it on my PG27AQDM, for example.
@@Hathos9 yeah, I saw their stuff on it. As usual, they are fantastic at putting numbers to everything. I would love for more reviewers to acknowledge that and include some of that context as well though. Rtings is where I went when I was figuring out what to buy, but I also watched some video reviews, and not one of them mentioned it; Rtings actually only recently started doing that. I think it was within the last 6 months or so.
No compromise. Except size. Perfect.
Yep. Personally I'd choose a larger monitor at older and lower qualities rather than a super high quality but smaller modern one. I play a lot of city builders, colony sims, resource management, and grand strategy, and most of that doesn't benefit from high refresh rates/more FPS much at all, benefits only a bit from better color fidelity and pixel density, but OH BOY does it benefit from a larger screen and higher max resolution. My childhood/middle school favourites that I used to play on an absolute shitbox 2000s laptop, on a modern system with an older but rather large monitor? Good *god* my nice grown up setup is awesome for those games, and while the improved smoothness and graphics quality is definitely all thanks to my nice modern computer hardware, a bigger (even if not necessarily otherwise better) monitor is absolutely one of the best system upgrades you can buy for the types of games I enjoy.
what are the odds of getting killed by a dragon lore 9:02
I was killed with DLores on Deathmatches quite often.
The gun? It looks lame
I also ditched my Ultrawide. It has not kept up at all. When it launched in 2015, it was 100Hz UW and 144Hz non-UW. NOW we got 480Hz non-UW and ultrawide doesn't even have a true 200Hz.
Also I feel dual screen is superior for productivity.
UW only better for certain apps and single player games.
I just got the QD MSI 49"; so far it's pretty amazing, they just need better virtual monitor support. You can have two inputs to the same monitor and do picture-in-picture, which is janky and not really customizable, but it would be nice to have a fullscreen borderless window in the middle and two open sides so every game isn't stretched to the full 49" and you can't multitask without another monitor. Thankfully I can order taller monitor arms and have my two 27" above the ultrawide, but I'd much rather spend an extra $200 for software that comes with the ultrawide and allows virtual monitor screen splitting natively.
Best intro on the channel yet like Wowzers.
More than a minute straight of High pitched Linus should be considered a violation of the Geneva Conventions.
He's a canadian, he doesn't violate the Geneva convention, he adds to it.
1:03 Linus has the perfect opportunity to follow up with "whatever you say, Eggman" and didn't do it
0:04 why does his voice sound like a 5 year old kid with a sugar rush 😂😂😂
I've been playing MW2 multiplayer on my Samsung Odyssey OLED G9 lately and I'm in love. So good.
Linus's kids convinced him to be Sonic for Halloween this year 😂
I never knew I needed to see Linus dressed as Sonic and now the image will haunt me for the rest of my days.
Another one for the sonic lore 🌝
Editor note: We didn’t have to raise Linus’ voice pitch in the intro. It’s natural, we actually lower his pitch in normal videos.
The sacrifices for playing above 240fps don't seem worth it.
Playing above roughly 110fps isn't worth it.
The human eye literally can't even see that fast, and your brain is even slower at processing what you've just seen.
In fact, I'd say much more than about 80fps is mostly pointless.
It can be worth it to go a bit higher so that your 1% lows are higher, which makes for a more consistent experience, but as far as raw speed is concerned, around 110-150 is the literal limit of human biology.
@@GeneralNickles you're wrong dude, from 100fps it's smooth enough, but that doesn't mean 180, for example, doesn't feel more real and "smooth"
Our brain doesn't have a limit, you are converting digital to analog
@@GeneralNickles this is not true at all. Those that can’t see the difference don’t know what they’re looking for or how to „feel“ the difference. The brain processing is not the same for everyone either, that’s why some people have very slow reaction times and others very fast. I react to most things with a thousandth of a second(or what most monitors refresh at) and am capable of processing that very same thing I reacted to. Many people, including me, are more than capable of seeing and feeling the difference of 110+fps. I have no idea where you found that „literal limit“ but it is not true at all. This sounds stupid, but seasoned or hardcore gamers or just those with fast brains can and will see the difference. I’m sure most people with ADHD are able to.
@@affe1314 "I'm a super human hardcore gamer and I bought a high refresh monitor because bigger number better. Who cares if it actually makes any difference? I pretend it does because I wasted a shitload of money on it."
That's literally what you just said.
Your schizophrenic combination of a superiority complex and sunk cost fallacy doesn't change objective facts.
@@GeneralNickles it's really easy to see the difference between 240hz and 360hz in my experience. The human eye doesn't have a limit of 80fps lmao what
WHAT DO YOU MEAN, the purple hue looks beautiful
Oh my God, they actually increased the pitch of Linus voice in the first minute. Amazing! :D
Holy shit since when was 480Hz now so widely seen, I'm still processing 240
at this rate, we will see government regulations soon... because some "gamers" will run games at 1000fps and still think it makes them play better... the 5090 will eat 600W, this is getting ridiculous. everyone is promoting energy saving and going green to save the environment, but meanwhile gamers need more and more power just to satisfy their illusion of compensating for their lack of skill with more fps...
@@FalconDS9if people are allowed to use air conditioning which power usage is in the kw's you should be able to game
@@captainheat2314 build better houses... here in Germany, we never had or needed air conditioning for our homes... fix the source problem, don't fight the symptom.
we can no longer afford mindsets like yours, "if they are doing it, we do it as well"... you can game very well with far less power-hungry systems.
finally Linus accepted sonic Jesus
@@gogon8973 Linus Chan
1:20 Seems to me you guys have way too much fun to be working, but then again, when the boss is running around in a onesie... maybe I am just a grumpy old man.
When I got the email about that monitor I was like "HOLY CRAP THAT'S A THING?!?!?" "Wait, will I have to sell one or both kidneys to buy it? Both? Nevermind."
9:03 this dude casually has a dlore
All we need now is a 600 Hz monitor that can do integer pulldown for every common video frame rate.
The Acer Nitro XV240 F6 offers a 600Hz refresh rate at a 1920x1080 resolution
240 does the same, 24, 30, 60, 80 and 120 fit in, what benefit does 600 have besides refresh rate?
@@prich0382 not 50 and 25 though.
@@RappinPicard Where is 25 and 50 used??
@@prich0382 Europe and other former PAL regions
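The integer-pulldown point above is just divisibility: a refresh rate handles a frame rate cleanly when it's an exact multiple, so every frame is held for a whole number of refreshes. A quick sketch (the frame-rate list is my own pick of common video rates):

```python
# Integer pulldown works when refresh_hz % fps == 0: each source frame is
# shown for exactly refresh_hz // fps refreshes, with no judder.
COMMON_FPS = [24, 25, 30, 50, 60, 120]

def clean_rates(refresh_hz):
    return [fps for fps in COMMON_FPS if refresh_hz % fps == 0]

for hz in (240, 480, 600):
    print(hz, clean_rates(hz))
# 240 [24, 30, 60, 120]
# 480 [24, 30, 60, 120]
# 600 [24, 25, 30, 50, 60, 120]
```

Which backs up the thread: 240Hz and 480Hz both miss the PAL rates (25/50), while 600Hz is the lowest of the three that divides evenly by everything listed.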
What is up with the elephant in the room 😅 3:18
Don’t mention it
intro is pure cinema, more of this please
If the DSC mode is the same, it will be noticeable whatever cable you're using. At 5:44 you can see there's an on/off toggle but not one to change the level of compression. So for now your cable doesn't matter, as it will be hugely compressed anyway... not even the RX 7000 series would help because they're DP 2.1 UHBR13.5 (54 Gb/s)
Edit: Spelling
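For anyone who wants the numbers behind this, here's a rough back-of-the-envelope sketch (pixel data only; real links also spend bandwidth on blanking intervals, and the effective payload rates below are approximations, not exact spec figures):

```python
# Raw pixel-data rate in Gbit/s, ignoring blanking and protocol overhead
def raw_gbps(width, height, hz, bits_per_channel, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

need = raw_gbps(2560, 1440, 480, 10)
print(round(need, 1))  # 53.1 -- Gbit/s for 1440p @ 480 Hz, 10-bit RGB

# Approximate effective payload rates per link tier (Gbit/s)
links = {"DP 1.4 HBR3": 25.9, "DP 2.1 UHBR13.5": 52.4, "DP 2.1 UHBR20": 77.6}
for name, capacity in links.items():
    print(name, "uncompressed OK" if capacity >= need else "needs DSC")
```

Even before counting blanking overhead, the ~53 Gbit/s of raw pixel data already exceeds a UHBR13.5 link, which matches the point above: everything short of UHBR20 runs this mode through DSC.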
I love the 'owns a display' title he always gets.
downsizing to an ROG Ally made moving so much easier. Now i can game anywhere i want with the AR Glasses.
10:55 the only ubisoft game that drained half of my life😭😭😭
What is the name of the game? It looks fun.
What game is it
Same question bro, what's the game name?
@@nepppppp8222 what game?
@@nicosteyn2630 Bad OP not giving the name. I think it's Anno 1800
Bro you did me so dirty, I just came from a rave and was like WTF WHY IS THE PITCH SO HIGH, AM I DEAF??? 😭😭😭😭
9:02 - Unexpected Dragon Lore
The color banding problem is an instant deal-breaker for me. Did that ever get resolved?
It might’ve been because of the lack of DP 2.0.
Or an inherent problem with WOLEDs. Enabling dithering should sort it out!
@@NavneetRao nah, sounds like a bit depth thing, might be handled in lower accuracy to decrease response times
gaming monitors in general will simply try to put out images as fast as possible, especially in gaming mode, which means minimal to no image processing on the SoC in the monitor
The "activate Windows" at 1:50
9:02 Bro got killed by a fucking Dragon lore xD
@@N0TP4UL saw that too, that skin is so expensive 😀
i swear i didn't think it was possible for Linus to stoop any lower
Love the intro. Hopefully, you'll use it more
bro called me poor in 5 different languages
is the playback speed sped up by like 10%?
Just for the intro
After that intro, it is illegal for Linus to complain about how much a team member spent on anything.... ever again.
I’ve been daily driving this monitor since the beginning of 2024. It’s super expensive but maaaan… it’s so nice. I love it, super stressed out about burn in but there’s a few good things in the monitor settings that really help stop burn in. Been almost 10 months and still going great. I love it. Definitely not going to be good bang for the buck. But if you have the cash and really want it.. it’s a fantastic, beautiful experience.
Took me exactly 15 seconds to register that he was wearing a Sonic costume...
That intro was crack personified lol
What game is he playing at the end of the video
Anno 1800
@@WickedCitizen I have that game and I was thinking it looks like it but some of it looks very different. 😂😂
It's been ages since I loaded it up
9:21 as a halo fan, this makes me so sad
@@TennisGvy one day it'll be back where it once was 😔
I play Halo a lot and it still seems to be pretty active. I am on Xbox tho
Finally Linus' real voice, none of that fake lowered voice.
They have ForgedAllianceForever installed on that rig?
Big respect. The classics never get old
This is the best intro on YouTube ever. I'm not even kidding.
I'm honestly happy for you. Different strokes for different folks, I guess. This is the kind of thing I dislike and unsub for.
Loved it 🎉
Why isn't there a way to use multiple cables from your gpu to your monitor to double the bandwidth?
That's not how that works
@@MegaLokopo You have to be trolling ......
It's actually a good question because you could do interlacing. Just like the olden days
@@SinisterSlay1 That's not what he's asking in the slightest and you know it lol
@@SinisterSlay1 time to bring 2160i into the mix lol
6:32 still died lmao
"I hAd A sHoTgUn"
imagine this being the first video you watch of LTT and seeing him in a Sonic suit
I love how the editing note from linus was "can you make my voice more annoying?"
doesn't the RX 7900 XTX support DisplayPort 2.1? Though I can't find too much information on whether it supports UHBR20 directly? Looking further, it looks like a big portion of the 7000 series do support DisplayPort 2.1
edit: apparently DisplayPort has done a USB and DisplayPort 2.1 is what you need to implement to support UHBR20, however you don't need to support UHBR20 if you have DisplayPort 2.1, so dumb
I think the issue was that his PC in in the basement, and the cables in the walls were only DP1.4.
@@phuzz00 I was responding to the thing saying no current GPU supports UHBR20?
@@bod9001a consumer rx7000 GPUs don't support the full dp2.1 bandwidth only current GPUs that do are the professional AMD 7000 GPUs. Eg the 7900xtx does uhbr13.5 which is 54Gb/s not the full 80Gb/s of dp2.1
@@KermenTheFrog so is it another case of confusing naming standards for ports is it xD
@@bod9001a pretty much they should have required the full 80 instead of making it optional as to avoid confusion