I feel like most times its a tradeoff between getting the newest tech and also having to tolerate the quirks the newest tech has. As the quirks diminish well its not "newest tech" anymore.
Yeah, I would also say that it depends on the kind of device. For example I bought the first gen Pixel watch because I wanted to experience Googles first smart watch. It is very likely that there will be a better and cheaper one available next year but I am still very happy with it even if it has some quirks. If I buy a new Laptop for work I may not accept first gen issues since I depend on it functioning well as opposed to e.g. my watch. It's just personal preference and even if everyone tells me the device I just bought is horrendous, if it works for me than I don't care what the others say
That's why it's not about just the newest tech, it's about the *best* tech. Obviously it's subjective, but measuring against what you want to use it for and what you care about is always, always more important than whether it was released yesterday, a year ago, 2 years ago, or 5 years ago.
I heard that the new DWF was going to come in a variety of colors, unfortunately they used Pantone colors in their Adobe software mockups to send to the manufacturer.
Is this why Microcenter has 8 open box DW? I just reserved one for $899. Worth it? This is to go with my 4090 and 7900x. I also have the old Alienware 3418DW I bought a vertical stand to stack it on top of the new qdold. Man idk now. $899 is pretty sweet. I also have my pc hooked up to a 55 inch LG C2 for some chill sofa sessions. Tough dude. Help!!! My order will be ready in the morning
I've had an aw3423dw for months, and I don't regret it. It was such an enormous step upwards and has given me enhanced joy while playing games for those months that I wouldn't have if I waited. I am also very happy that technology marches forward for others and myself to keep benefitting from.
I mirror these sentiments exactly. The $200 was worth both the months of enjoyment I've had, and the physical G-sync module. I think Linus got it *very* wrong about g-sync not being worth it outside of a console. There are plenty of games (assassin's creed, for example) where it takes all the little jitters and dips that you get even on a high end graphics card and just... Makes them go away. I'm glad that there's a cheaper model available and this tech is dipping towards mainstream so quickly, but I'm not upset that I bought when I did. It was a *huge* purchase for me, and it'll easily last the decade until micro LED is feasible.
YES. Just consider it the renting fee for previewing six months of the world's greatest piece of equipment. That rounds down to 35$ per month. Would you rather game on a older display or pay extra for the experience? However, the new one is still in some ways improved, you still have the option to sell this one, accept the price difference as just rent, and get the latest one. I have been having similar thoughts about the monitor I got a year ago. It was around 600$ on sale so I got it. Then a few months later, I started having some issues when it would not accept a USB-C connection and requiring a reboot to fix. And then I found OLED displays with similar size and resolution for slightly higher prices. But at least I got a good display for the past year which was way better than bending down and working with my laptop screen. I got a lot of work done more efficiently and enjoyed lots of content. It even gave me a advantage in some games. So I don't regret it.
As someone who doesn’t research things like monitors that often, it would be nice if you had references to show what a “normal” display is capable of to give context to the capabilities of the new one. I don’t know how much better (if at all) .16ms black to white is than a regular LCD for example. That being said, I’ve had my heart set on the Alienware monitor for a LONG time and was planning on ordering one in a couple of days. Glad to see this video come out showing a cheaper model that is just as good. Great video as always!
Watch Monitors Unboxed,Tim is great at explaining this.Here is a video about the C2 to help you understand how incredibly faster OLEDs are : ua-cam.com/video/jRzGvkqSNaI/v-deo.html
monitor reviews are posted all the time, and there are plenty of reviews of this monitor comparing to others. idk how you've avoided all that research other than just being lazy. Linus posts for his viewers and his viewers have watched every one of his videos, they don't need to compare to other monitors because this video is kinda built on top of the others.. as it should be with tech. You can't expect all the information you need to be in one video.
It depends, for example people who buy latest and greatest iphones/laptops/earphones/smart watches etc It doesn't always have to be "I got a 3090 three months ago and now the 4090 is here"
@@colts8146 there's something better right around the corner but for some people the improvement is needed at that point and the upgrade 3-6 months down the line is not always necessary. I bought the pixel 6 pro and the 7 is better and there is no argument there but I still won't buy the 7 or regret my purchase regardless. I wanted an upgrade back then, and the upgrade I got was massive and taking all in consideration, worth it. I have a phone that doesn't give me any problems whatsoever. I find it so weird when people make that comparison when the majority of the people don't even have the money to change equipment every month. For content, they can show it off but realistically... They don't actually upgrade that much and you can confirm that by watching the Intel upgrade series or just any random videos where they talk about their stuff.
@@colts8146 people who buy consoles may think somewhat along those lines lol. People know there is always going to be better stuff in the future ( usually cheaper * cough* Occulus) but as Jose put it in that time and space they are getting what is arguably the best tech of the moment.
the test colors mentioned at 8:25 built in to the dwf are availabe on the dw as well. if you put your computer to sleep so there is nothing on the display then hold the control stick left for ~5 seconds it pops up all grey. After that continue clicking to the left to get red/blue/green/white/black. I'd really love to have those picture in picture options though :( was very surprised to find them missing on such a large display.
I will give this a try. I have zero regrets but I was disappointed it doesn't have PBP like my 2017 LG curved ultrawide. Granted, I used it all of six times probably. But, when you need it, you need it.
I learned a long time ago the best tech equipment to buy is stuff that isn't cutting edge not only due to value but reliability of the item tends to be better.
@@Excelsior_Espio 360hz haha.. I also love tinfoil hat technology! 240hz is max necessary (and even that's a stretch come fight me), anything beyond youre better of tossing money into the fireplace - at least it will give you something in return (heat)
In addition to new features, being patient also lets you avoid QC issues. For instance: in 9 months, new RTX 4090 owners probably won't have to worry about their card catching on fire.
@@Begohan1234 while he's not wrong at all, I still feel like that's a oversight on Nvidia's part just because it's so easy to do and widespread. Like NO-ONE caught this before shipping? You telling me even Phil that works there but you have no idea what his actual job is only that whatever it is he's a fuck up and sucks at it didn't manage to do it?
@@APhamx7 just saying how many people have done it in the wild shows there's something there and I know in my personal experience trying to recreate someone's fuck deals with a lot of factors that you can't or wouldn't know to recreate. Plus the company it self has way more time with these cards before they hit market to have full batches happen as soon as they hit the shelf. Gamer Nexus is completely right and I'm not arguing that at all, but as a huge publicly traded company always there's a unicorn type of stupidity out there.
I have the original monitor talked about here, and it is probably one of the best purchases I've ever made. I use it for office work at least as much as movies and games. It isn't the highest resolution thing in the world, and I wish it had USB C video in, but the colors are so nice and the lack of any glowing black bars is great. It's super easy in the eyes, and I don't regret the purchase at all
Agreed. Also, at least there's a USB-C to DP 1.4 adapter. I got that running from my laptop to the monitor. Works great and activates the G-Sync while also allowing HDR.
I used to purchase everything second-hand through the "local" topic of a hardware forum, and sell my unused gear this way too. I can't imagine how much money I saved this way - and I met some great people too! So if you are on a budget and something like that exists around you with a real community (so that you don't get scammed, unlike other second hand options), go for it!
I literally only buy second-hand unless something new is dirt cheap. I stay well behind the curve anyway so I can get my dream stuff often for a tenth of the price or more, all while supporting giving electronics a second life and reducing e-waste.
@X̶̲̅ H̶̲̅A̶̲̅ sometimes people sell their stuff when the warranty hasn't expired yet. Buying second hand also means that potential QA issues have been dealt with by the original owner. If not, you can choose not to buy it.
I bought the AW3423DW when it came out knowingly taking the early adopter risks and don’t regret my purchase at all. I spent a lot of time reading and watching reviews and luckily got to see it in person at microcenter and compare before buying. This monitor is so incredible for literally everything. Revisiting games with hdr is like playing them again for the first time. I still would’ve bought the Gsync version over the freesync anyway. Either one is 100% worth the money.
Gsync is very nice to have, if you don't have a killer gpu I bet you make nice use of gsync when the fps drop. Freesync premium only goes down to 35 fps.
@@JC-Alan I think 200$ is not much when you are dropping a grand on a monitor, that you will surely use for 3-4+ years! Gsync ultimate is really nice to have. My 2c :P
100% same here. 4090 really gets to flex on this display too with QHD Ultrawide. I'll echo what I've told my buddies about this display; The QD-OLED "DW" is a much bigger upgrade and well worth it compared with a GPU upgrade. Whatever framerate increase you might get, doesn't compare with the perfect contrast, near-instant pixel response and outrageous colors QD-OLED can produce. Period.
One thing I like about gsync is variable overdrive. The module uses a frame predictor as a part of its scalar that predicts when the next frame will arrive and adjusts the overdrive to match, meaning you don't get any ghosting artifacts on LCD displays. For an OLED panel though, this doesn't really matter because the pixel response time is effectively instant. The DWF is a clear winner in my opinion and is definitely going on my wishlist. It's $1900 in Australia right now, hopefully it will come down to ~$1500 on sale.
Being a PC gamer in Australia has been getting so much more expensive lately. With the 4090 at $3200 (AUD for anyone reading), and a top tier display like this being almost 2 grand, it’s getting to be out of reach for so many people. Even just last year things were much better.
@@___DRIP___ yeah I’m in Oz as well, I built my pc before just before covid so I actually got some good deal, but cos of inflation and the weak currency it’s driving tech prices up bonkers to a point where it’s unattainable even for middle to high income earners
I also bought the DW full price and waited 3 months for it to arrive. I don't regret a thing, this monitor is amazing. I only wish more games knew how to take advantage of the display. I end up spending more time than I want to trying to hack HDR support into my games. Although supporting 4K inputs would be a game changer for me. It would enable me to watch HDR TV and movies on my DW using a little streaming box I have. Netflix is the only streaming service with HDR support on PC :(
@@FatalDreidel The blacks are black, what you might experience is the polarization layer or panel coating which makes them look grey when there is light hitting the monitor, in dark rooms it should be as good as any other OLED. Of course if you use the monitor mostly in lit rooms then its a problem.
@@nosleep1870 Did you ever update your previous display? Not sure it was something possible. Just enjoy the display you won't miss the few new settings available from the DW
This is now my 2nd time fully watching this video and I am so damn tempted to buy one of these but knowing there's a lot of brands working on monitors with this exact panel tech makes me still wait.
I'm more worried about burn-in and image retention 3 years down the road, having to battle DELL support for the warranty since I've heard they fight you with tooth and claw to deny it.
@@laggmonstret My ASUS monitor started flickering on 1 corner 2 DAYS BEFORE the warranty expired. 1 day later they agreed it was faulty and I got a brand new one. Overall megacorps are equally bad in their own ways. If my monitor failed literally 25-48 hours later I would've lost $1000 lol. I do wonder if 3 years is an EXACT cutoff point
I almost bought this monitor on launch and also as recently as two weeks ago. I'm glad I've been patiently awaiting the monitor scene to continue to grow. It will only get better and cheaper from here.
2 роки тому+3
I almost bought DW but they proposed a discount for using financing, so why not? No interest and discount sounds good. Well, then they told me to fuck off, so I did exactly that and did not buy it. Later on I bought DWF and that seems to be better choice than I expected.
From what I remember, the Gsync Ultimate module is an FPGA, which is gonna be at least $100 in bulk for the transceivers required for DP 1.4. Cutting that would easily explain the cost savings.
And probably you would need to update the NV module when you reflash the monitor FW, but you can only update the FPGA with new Bitstream via some developer debug interface, because the stuff not supporting anything else.
Weird that Nvidia of all companies wouldn't have an ASIC made when that's their core business. @@lorincmate There's no reason the mainboard couldn't reprogram an FPGA as part of a firmware update. All the FPGAs I've ever worked with supported good old SPI eeproms. There's like 99.5% chance that whichever processor they're using already supports SPI (and if not, can almost certainly bitbang it). If the G-Sync Ultimate module is to blame for the lack of FW updates, it's not a technical limitation, it's Nvidia fuckery.
If you're always several years behind the latest and greatest, then you'll benefit from huge upgrades from what you currently have, just like other people! It'll usually be cheaper though, and often with a lot of bugs worked out and plenty of reviews to sample from. The only downside is that sometimes prices get weirdly high for a product nearing the end of its life cycle, and sometimes support for it may be diminishing too.
I have a Dell Alienware AW3423DW QD-OLED and I have exactly zero buyer's remorse. I've been waiting for this tech for years. Normally you get a new device and it is cool at first but, eventually, the novelty wears off. However, the crazy beautiful colors of this monitor blow me away every single day. Not just good photos and watching movies, but even small things like the color in favicons, emoji, syntax highlighting, or my lock screen. The true blacks are game changing. I love this thing. Quantum dots are the future.
@@thrace_bot1012 It will be cheaper to make a blue micro-LED panel and print quantum dots on it than it will be to manufacture a panel with 3-4 colors of LEDs per pixel. So quantum dots are also the future! In 3-5 years when mico-LED becomes viable for consumer displays, I will consider upgrading. My Alienware QD-OLED will go to my Mom, my 34" curved ultrawide LG LED-backlit LCD from 2017 that my Mom is currently using may go to my niece, and the 24" Planar 1080p touchscreen LCD from 2010 or 2011 that my niece is using may come back to me and get mounted in my kitchen where I can pull up my recipes with dirty fingers or the butt end of a silicone spatula. Ahh, the cycle of life.
@@ba11ard I am using it now, it is as good as the day I got it! Errr, almost. Zero dead pixels, zero burn-in, zero buyers remorse. I use this for both work and play so it is on up to sixteen hours a day, but my computer it set to put the display to sleep when I am not using it. Sometimes I forget to turn caffeine off and the display is on all night with my locks screen background static, I try to be careful but it happens and still zero burn-in. I do panel maintenance regularly, but ignore it when it asks while I am using it. Think like once a day to a few times a week for the short cycle instead of every few hours, and maybe once a month or a few times a year for the long cycle of panel maintenance. The reason I said "almost perfect" is because the purple anti-glare coating has degraded in places. I found out, after some months, that opening cans of carbonated water or beer shoots out tiny droplets. When they land on the monitor, they dissolve this purple coating eventually. You can't tell when it is on because the droplets are too small, but you can see where the droplets were when it is off. I hold my hand over my drinks when I open them now and am careful to wipe any droplets that splash on the monitor off right away. I recommend it, I think you will be happy. The price has come down since I got it, too! Get the non-nVidia variant (i.e. without Gsync) since the Gsync variant cannot accept firmware updates and nVidia cards support Freesync now. That saves a little more money, too.
Thanks for talking about the Source tone mapping in the settings. I was wondering why BF2042 in HDR wasn't working properly and other games were clipping on my AW3423DWF. I turned on that setting plus calibrated with the ''HDR Calibration Tool'' in the windows settings and now everything works fine. This should be the default setting for real. @Linus Tech Tips
@@lucasrem HDR has nothing to do with resolution. There are no consumer monitors that do "higher!" than 10bit. The resolution is clearly stated 3440x1440, not a "quad HD ready monitor". And none of any of the incorrect things you are saying has to do with the clipping issue Silveraga was talking about and has resolved. As for the actual specifics of how HDR is handled on the monitor: "Note that the AW3423DWF has a true 10-bit panel, but it’s limited to 8-bit at 100Hz (120Hz via custom resolution). However, with HDR content, you get GPU dithering (8-bit + 2-bit FRC), which is indistinguishable from native 10-bit. In contrast, the AW3423DW model is limited to 144Hz at 10-bit."
I just got one of these and couldn't agree with Silveraga more, the Source Tone Map is actually a wild improvement in quality, the fact that its not called out repeatedly in this video is almost criminal. This monitor looked AWFUL out of the box, I was convinced something was wrong with it, because things looked washed out, color quality was meh - benchmarks looked worse than they did on my original AW3418DW. I thought maybe it was Windows HDR handling, but when I connected the computer to my TV it worked flawlessly. Sure enough, came back to this video, skipped to the HDR setting, turned the "Source Tone Map" on and it's like an entirely different panel. Dell/Alienware is doing a WILD disservice by not having that on by default.
@curtisbme the good news is.. for games that use exclusive full screen, I can choose the refresh rate just for that game. Some benefit from the extra 31Hz like competitive shooters, others are graphical cinematic eye candy that I'd prefer to use native HDR10.
I bought the s95b for msrp on release. I could now get it for 40% of the original price. I've certainly learned my lesson with early adopting new display technologies :)
@@QueueTeePies from the few things i saw, the c1 is superior in most ways to the c2, as they dropped functions out of it. not 100% but thats what my memory is telling me
The S95b always had too many firmware problems for me to consider buying one. The LG C1 is the best value gaming display on the market, by a wide margin, and unless Samsung pulls a rabbit out of their hat and releases a 4K240hz QD-OLED monitor I'm probably going to upgrade to another LG in a few years.
My story's LITERALLY the exact same as Plouffe's. Huge Oled simp my whole life, immediately went out and copped the AW34. It only puts my LG 27gl850 "to shame" in scenes with a lot of black or darkness, but the vivid colors come close. HEAVILY used since May-June and zero burn in or issues.
I think mine has some stuck pixels, but, that could easily be because I don't know what I'm doing and ran panel refresh a lot thinking that it'd be better than using pixel refresh more than once... turns out I'm a dumbass and that could've shortened the lifespan of the monitor, on the plus side, though, I found a cool open source app to force windows to make task bar, etc true black.
I've always taken the "this is as good as I can afford" approach when buying any parts for my builds. This is the only way to make sure I'm never disappointed. Sorry Plouffe, hope you learned your lesson dude!
@@Exilum I know, but it still sucks paying all that cash and having it's replacement come less than half a year later and show it up for quite a bit less money. He still got a great monitor either way though, well worth the money.
@@dragon2knight Yep. My three questions to evaluate these types of situations is "did I get what I wanted when I bought it?", "am I happy with it?", "does it still serve its purpose as well as before?"
A lot of LTT monitor videos have that RGB colour map. I have no idea how to read it, and my only takeaway is hearing the presenter tell me it's good. Could LTT do a video or a short that teaches us how to read that graph, what everything means and why?
@@lucasrem Wait, DCI-P3 is only 8-bit? I thought DCI-P3 was above sRGB and sRGB is literally 0-255 (2^8=256 colors) for red, green, and blue, hence 8-bit.
The color* map literally shows what is acceptable and what is out of spec. You just asked the equivalent to Linus making a video about how to draw in the lines on a coloring book. Common sense is a personal responsibility..
Go to Monitors Unboxed (2nd Hardware Unboxed) Channel and look for the video “what are response times, overshoot and cumulative deviation” it many of the monitor terminology and for reading graphs 🥰💪👍
@@C4Oc. DCI-P3 is 10 bit, keep in mind there a different ways to view that, because technically monitors have 24bit color or something, dont quote me on that, but thats measuring it the "old" way. But no, DCI-P3 is 10 bit, not 8 bit, all HDR color formats are 10bit AFAIK
I am extremely thankful for all the early adopters who go through the painstaking work of making purchasing much easier for me months or even years later.
I’ve recently started to think about replacing my two 24” monitors with an ultra wide and I’m glad you made a video on this model! The DWF will probably be my first OLED monitor and I’m excited to make the switch soon!
I switched to a 49" ultrawide, and I don't think it is worth going any smaller. I love this thing! Sure, it may only be a VA panel, but 1440p with 120hz makes up for it and is pretty much all I need rn. Plus, it only cost me $750 for this AOC monitor compared to the others which are $1000+
@@smallbutdeadly931 I've been testing out monitors recently as I've never really used ultrawides before, and I'll admit that there's a part of me that feels almost "cheated" when I deal with games that don't render at 32:9. The two big displays that I've been using are the AW3423DW (21:9, the G-Sync one from this video) and the Neo G9 (32:9). For gaming, I decided to try out God of War. I really wanted to try a racing game like Forza Horizon 5, but I only have it through Game Pass Ultimate and I don't care for the Microsoft Store on Windows. (I might bite the bullet and try it out though as racing or sim games are supposedly the bread and butter of these huge displays.) Anyway, I didn't realize that God of War only supports 21:9 and not 32:9, so I was a bit surprised when I switched to the Neo G9 and got black bars. It still looked pretty good, and the blacks -- while not as good as the OLED -- were still quite good. Also, one small problem with the dimming on the Neo G9 is that it seems like they are really aggressive about ignoring highlights, which happens to include the cursor on the desktop. So, if it's on a brighter window, it's very visible, but if the cursor is on a black background? It's really dim. Now, that's not an issue with ultrawides as that would theoretically happen on a 16:9 monitor with similar auto-dimming functionality.
You will literally never go back. I have the DW with the new firmware and it’s the best monitor on the planet hands down. Got it for 1100 bucks too. Better than the DF still imo.
I was also an early adopter of the DW and its been the best thing to have to work and play. So good in fact, I just bought an S95B on sale to get the same tech in my theatre room. I will have to admit, not being able to disable the messages for pixel and panel refresh is dumb and not being able to update the firmware is even dumber, but playing on this monitor for any kind of content has been amazing. HDR off for desktop use. HDR on for games that support it.
@@mauree1618 They don't mean disable the refresh, just the message that pops up reminding you to run it. Most people prefer to just let it refresh when the monitor goes to sleep, the monitor doesn't want you to do that.
@@mauree1618 windows doesnt use the correct HDR tone mapping. There is an auto HDR slider but that is not that great compared to SDR which windows is made in. HDR games are mastered for HDR and when its on, will display the correct lighting and color.
I love to be on the early adopter train but recently I've been more conscious of the cost benefit for me particularly and that has lead me to be very happy with getting previous generation or older tech.
This. If you hold off for the best model announced, you'll be holding off forever. There is always a better model announced but not out yet, and always a better model out in a few months. You just gotta get what you're gonna get and enjoy it for what it is. If you get lucky, you might even land on a generation that makes substantially more progress than the one before or after it.
It's amazing to see more manufacturers produce better QD OLED monitors but it's kinda unfortunate that they all seem to be focusing on making 34" ultrawide displays and not cover some other form factors. A 16:9 4k QD OLED would be amazing but apparatnly that'll still take some time.
@@clownavenger0 1440p is going to be even less likely. They just manufacture single motherglass for 4k and cut it up to pieces with maximum yield. 32" is pretty much the optimal size for most desks for viewing distance/depth.
variable refresh rate is great even from things that are not consoles.... having your hz adjust to frame rate always makes for a smoother experience and removes screen tearing as long as you cap frame rate a little before the max hz
I've found it can help a lot with games where the frame rate swings broadly. Going from 90fps to 160 fps rapidly is very noticeable, even though both are perfectly playable frame rates. I personally tend to just set the max frame rate in that game lower to handle it, but adaptive sync goes a long way to making that feel smooth as is.
I really hope the QD-OLED offerings next year are more prolific. I really want one that supports DP 2.1. I usually sit on a monitor for 5 - 10 years. It'd be nice to do that again.
@@redclaw72666 c9 already has burn in for me and my friend's c1 also has that. Galaxy S3 S5 s8 S10 and S20 also already have burn in, including Lenovo oled laptop. It seems every device I know that is over 4 years old is with burn in. By the way look at used devices for sale, like half of those with oled have burn in Linus already has burn in on his LG and it isn't even one year old yoo
@@redclaw72666 OLEDs will eventually just burn in, 10 years seems like a reasonable end period for one. It’s Organic LEDs, they slowly die as they’re used. You could sit on an older monitors for about 10-15 years of constant use, not gonna be a thing for even the best OLEDs.
As someone who owns the AW2334DW and is currently using it, I stand firmly with the buyer that Linus quoted, no image retention, its soo bright that I often have it at 70 percent brightness anyway and also i got it for £1099 because of an offer DELL UK was doing and now I did it, no regrets and I like NVIDIA ULTIMATE and im happy I have the tech, and even if LIKE EVERYTHING it isnt perfect, I know that the next time I buy a new monitor it will be even better. But damn, I have constantly proud of this monitor, LOVE IT.
I got it for £920 last month and its genuinely incredible, the only fault I can think of is the pop up that comes on every 4 hours to tell you to pixel refresh.
@@demetter7936 You can tell it to do the pixel refresh and not ask again on the popup, after which it will automatically do it if it goes into standby after 4+ hours of being turned on!
I've enjoyed this monitor so much over the past 5 months that I don't really feel all that bad for spending $200 more than I needed to. I think I've even played more games than I normally would just because I like the monitor so much
I actually got the none F version not knowing when this new F version was going to release, I got the MOB02 update and no issues with the bubble wrap like others have had. I also got it for under 1000 on a sale they did so I really cannot complain about it. The monitor is beautiful and anyone who has seen me on it instantly stops and has a quick look. So glad there's a cheaper option available and hopefully it continues like this.
@@oscarreyes4807 it was from Dell themselves. Here in the UK its 1200 normally but they had a discount for 980 I then applied another code they had going on and it worked. It was like a day or 2 on sale around a month back? Oh and free shipping so I got one hell of a deal for a total of like 900 including the tax. Instead of the 1200 it normally cost.
That's kinda how I feel, I didn't get mine under $1k, but did snag it during a lull in scalper activity, and sure, there are things about the DWF that seem good, but, not enough for me to scrap the DW, also glad that the price is lesser for others, especially since people like me who have a very rare medical condition need OLED's due to how much better it is on our eyes to have actual black being used vs "black" of backlit panels.
My only complaint about the monitor is that the screen is almost impossible to clean. You can use a torch and not see anything at all but if off angle.... Very off angle you may see a slight haze on the screen still that will never go away. It's not the bubble wrap thing either. It isn't noticeable in anyway if you use it normally but it's annoying knowing it's there lol. Oh well 3 yr warranty on it for burn in so imma just use it for as long as that happens. Can't wait to see OLED become more common.
Hey Linus, you show these colour accuracy graphs a lot but but I don’t really understand how we get that data, could you perhaps do a video explaining the whole colour accuracy testing process? Thanks, Aaron
He talks about it about halfway through the X-rite i1 Basic Pro 2 review. UA-cam doesn't allow links even to UA-cam so you'll have to search on your own. Also talks about it in the Techquickie video about color calibrating your monitor a little bit. Luke also talks about a bit in a separate Techquickie "Why do Monitors Display Colors Differently?" Though it's more in relation to what subsets of the graph show.
11:06 Is no one gonna talk about that clean hand wave to continue the slideshow on the website on the monitors.... Side Note: Currently bought 2 of the AW3423DWF hopefully to mount one above the other or do a side by side display.
I don't think that's planned to be a video yet given it's like his 4th Oled TV for a monitor in a row. And they've kinda already talked about it in some videos previously and even in this video he mentions having burn in after only a month lol
@@daymianhogue1634 i want to buy an Oled TV to upgrade to, but that is the major fear I have. I don't want to spend $1000+ just to end up with a display that gets burn in which I know Oled's are known for lol. Been keeping an eye on the Samsung Quantam Oled since Linus stated those have less chance of burn in.
I have put thousands of hours on my 55 lg CX OLED and not a single sign of anything wrong with it. It even tells me when it's time to do its own maintenance on the screen.
@@TheCommanderTaco if you're just gonna use it as a TV you're less likely to get stuff burned in. And you can get oleds for well below 1k now for TV. You can get like a 55" LG A1 for like $600ish from best buy on clearance and just a couple weeks ago they had the A2s on sale for like $550. Wouldn't be surprised if the A1 goes on an even bigger clearance blackfriday and/or if the A2 goes back on that sale again on blackfriday. Burn in more so comes from using it as a monitor because like UI elements stay static on screen for too long.
After watching this video, I finally bought the DWF; I have an AMD GPU and waiting for the $200 price drop / freesync was so worth it. I know that buying it is a little ridiculous, but who cares what others think of my purchases. Ever since I built my first computer in 2016, I have ALWAYS wanted an Oled 34" 3440*1440p high refresh rate monitor. I remember a 34" 2560*1080 VA 60hz costing 600$ back in the day and that was too expensive. Now that my dream spec monitor has finally come out I have saved up since the announcement of the DW and pulled the trigger. Can't wait till this Monday to get it. Thanks Linus, a 7 year fan.
@@aayaan1935 it's a fantastic monitor. It's exorbitant but god it's worth every penny. It's gonna be hard to get anything better for quite a while I'm still very happy with the purchase and highly advise you to get it if you've been on the fence!
Linus Media Group should consider getting a dedicated captioning team. The "generic" captions [or whatever service provider they use to caption these videos] tend to miss a lot of the specialty terminology. idk if there's a way to avoid that and specify the terms that appear in the vid, but things like "Blue" ["Plouffe", 0:08] and "cutie" ["QD", 10:17] are weird to see on an otherwise high quality video that hasn't been auto-captioned unless youtube decided to remove the "auto-generated" description for w/e reason from the option and those are actually the consequence of automatic captions
I have the AW3423DW and so far really happy with it, i feel the dwf model is not really an upgrade, but a different version/model that's great for people with AMD cards that don't have any use for the g-sync module.
@Daymian Hogue I don't think it will get that many new features, but it's great to be able to update if you have any bugs, so far I have not experienced any bugs with my model, I have the model with M0B102 firmware, so the second iteration. I believe a few people with the M0B101 version have some annoying bugs.
This is the exact video I was waiting for. I planned on picking up the DW in the next few weeks as I was sure the extra $200 was going to make for a better gaming experience with my 3090. Even without the gsync module it seems the DWF is the better buy as long as freesync plays nice with my GPU (I've never used a freesync panel, nor have I done any gaming in HDR). The black color scheme of the monitor is just giving on the cake at this point.
as stated by linus in this video - if you have a beefy enough GPU, youre almost always gonna be playing with an uncapped framerate for the most minimal response time (higher then monitor refresh rate still equates to more responsive peripheral inputs - even moreso, screen tearing is nigh on indistinguishable at framerates that high); the only reason you'd ever want the DW model is for games you know will run BELOW 60fps to help smoothe out the jarring frametimes/screen tearing - the DWF is definitely a better buy, however, im not sure if going strictly freesync was a great idea, it still requires a bit of a workaround on nvidia cads IIRC from my last freesync monitor (returned it for a proper gsync module'd monitor btw). G-sync, while stupid that it has multiple tiers of 'certification' is still definitely king when it comes to VRR so *realistically* the only drawback of the DWF model would be slightly less VRR friendliness, but again, if you have a beefy enough gpu, there really is no need to even enable gsync until games finally tank your GPU to below 60fps... which will be quite a while lol
@@kendog0013 Have you even used an HFR or VRR display? Because it sounds like you haven't and are just regurgitating information you don't really understand. Any drop in framerate below your monitor's refresh rate introduces judder. It doesn't matter if it's 140FPS on a 144Hz display, it won't look smooth to your eyes. Gsync/freesync stops that, full stop. It's also a much better experience running at 140FPS/144Hz Gsynced than it is 180FPS/144Hz nosynced, because 'uncapped framerate' above your refresh rate suffers from tearing. Also, you can only sync so low until the lack of frames severely detracts from the visual smoothness. As someone used to HFR, I can't even tolerate 60FPS games, because they look as bad as 30FPS did back when we were all using 60Hz monitors. VRR does not help at all in this case. But, this is more a user experience things, because anyone still using 60Hz won't be that bothered, because their eyes simply haven't been trained on better. My comfort zone is at least 90FPS, gsynced of course. Additionally, as someone with a G-sync compatible display (34GK950F-B, not even verified), 'Freesync' based gsync works absolutely fine. In fullscreen (or borderless). Look up MPO's (multiplane overlays) if you want to use VRR in windowed mode. Never, ever, use the windowed mode option in the NVCP; it's horrific. Gsync (VRR) is an absolute requirement for gaming for me. It's an invaluable technology that should've been in spec since LCD technology was developed. Crazy it took some proprietary BS to become adopted decades later, but that seems to be the industry norm now..
@@Gambit8319I've been gaming on PCs since the late 90s, back when screen tearing was often just a thing you dealt with. Some of the worst screen tearing I've ever had was when VRR was new tech and video card drivers didn't yet include max frame rate settings. Going above the refresh rate with VRR enabled can be a horrible experience.
I have 9 months of amazing experiences from my OG QD-OLED that I wouldn’t have had if I waited. I’m happy. And I would have played through lots of games with a sub par experience by using an IPS or VA panel. No regrets. We’ll kind of. I really wanted 2 displayports.
Dammit Linus I literally just purchased the OG model a day ago after going back and forth trying to decide whether to get the F or non-F, and thought the few extra bells and whistles would be worth the relatively small price increase; never saw anything mentioned about the firmware update ability or better color accuracy. At least I didn't pay full price for it (found a 15% discount code for Dells' website) and it still seems like it'll be an excellent monitor judging from friends who already have it, but still a little bummed I didn't wait just a bit longer for this video.
Both versions look incredible! I get that the new one does have a number of benefits, cheaper, and that 165hz vs 175hz is a negligible difference, I just hate that it loses some hz compared to it's predecessor. The DW also having better HDR than the DWF, but 4k HDR is big for new consoles. Ugh it's such a tough choice!
Better wait for further (users) reviews and feedback first. At the moment the DW 10-bit colour is supported up to 144Hz, but would drop down to 8-bit if using 175Hz; whereas accordingly some someone posted on the Overclockers UK forum stating that according to the Dell's rep response, the DWF would only do 10-bit colour up to 100Hz, and above 100Hz it will drop down to 8-bit colour. Not sure if that would be a deal-breaker for you. I just bought the DW because I was able to stack the Black Friday discount plus a 10% off discount code bring the total down to £879. I got the Amex offer of getting £100 credit back for purchase over £750 at Dell, so effectively I got the DW for just £779. The DWF is most likely going to take forever to become available in the UK, and but the time they do become available, most likely they would be at over £1000-£1100 so there's really no point for waiting for me.
I feel you, I'm in the same boat. I'm leaning towards better HDR though. I'm also considering/waiting for other manufacturers to release their 21:9 34" QD monitors, like Samsung Odyssey OLED G8 G85SB or MSI MEG 342C.
@@MarineRX179 so if i were to buy one of these 2, for the better color option combined with higher fps, i should get the DW instead of the DWF? sorry if you've already answered this, just dont know much about high end monitors, ha
I love the fact that at this point, there are so many people at LMG that name collisions are just a normal part of the job. "Just get me Jake!" "Which one?" "Nevermind, get me Nick!" "Which one, dude?" "Oh screw it, get me Alex" "... Are you being serious right now?"
Just ordered one of these today. I’m thinking I should have shelled out the few extra hundred dollars for the GSync variant but I think the dwf will be just fine
11:06 Linus's timing with the Hand gestures there holy. The LTT Store website had the images flying in from the right looking like Linus swiped over a touchscreen xD
i still think NVIDIA should just design an interface and module that the owner of a device can install or upgrade based on what they need kind of like a CM module from PI and the interface could always have more pins if needed
I said "F it" and bought both the DW and DWF. Was always keen on Alienware's 3440x1440 ultrawides, as my older AW3418DW (x2) monitors are great still. Excited to build my new PC and test these new monitors out, and truly experience QD OLED for the first time outside of stores.
I have three ultrawides - 2x AW3418DWs and 1x MSI MAG341CQ... the Alienwares have been incredible, the MSI has been a disappointment. I've been debating going for the 3420 or 3423 lineup to replace the MSI - Dell/Alienware computers may be overpriced and questionable quality, but their peripheral game is 100% on-point.
So far, the 3423DW and DWF are amazing. The only thing I wasn't really prepped for was the QD-OLED pixel refresh cycle it has to run every 4 hours to avoid burn in. Lasts like 3-6mins at most. You can skip this or delay it, and manually do it, but they prompt you on the DWF as you go to turn it off, and the DW does it automatically before putting itself in rest, or off. The picture quality on 400nit true black is my preferred choice on both, and I've lowered the brightness to around 55. But they also have the 1000nit peak brightness for ridiculous range, and amazing visuals. Honestly, no issues with them at all. They are new so maybe I'll update this post every so often and keep feedback if people want to see the monitor's worth, or if they have issues down the road. But for now, highly recommend them for anyone who has the cash to spend
@@thecowboyfromcali what do you mean the refresh cycle has to run every 4 hours??? Like the monitor has to reset or something? Or like if it’s just sitting on desktop screen not doing anything then it has to make something change and warm the colors back up???
@@brianfender6811 The OLED pixel refresh as far as I understand, is to eliminate burn in and make sure your picture is always as crisp as can be. By leaving your monitor on for over 4 hours, you run the risk of certain pixels "sticking" where it will imprint color. So basically the refresh cycle is just to allow the system to do a diagnostic check, and make sure no pixels are showing signs of burn in. It serves the same purpose as turning the monitor off for 4ish minutes so the pixels can "refresh" Hope this is the explanation you were looking for, idk much more in terms of technicality
Thank you Linus! Btw, that Daniel haircut is hilarious, kind of like a topiary bush and Linus now has some bangs to play with. Interesting seeing how Covid is changing our lifestyles and fashion (or lack there of. ) lol
I’m confused, is the Gsync version better because of it’s HDR Tone mapping? Or is source moe lose enough to make the choice clear. Wish there was a real. Review out.
what i learned from this video....is that the gsync DW is better at HDR while gaming since that's the purpose of gsync ultimate qualification. obviously not a garrentee on hdr nit peak brightness at 1000 but it certainly can do hdr while gaming with VRR on.
@@SticksUWP you still can't firmware, still have less HDR modes, still have less calibration settings - means you would be paying less for less but get a cool logo sticker.
While the Samsung version is not out yet, they have shown us what it will look like and it looks amazing! No matter which one you get its the same panel and it has been blowing my mind for the last 5 months. The DWF version for $1100 is a KILLER deal.
I considered the other options coming and picked the Alienware because it won't be a "smart" display. I'm connecting this to a computer, I don't need spyware baked in.
Best monitor I’ve seen yet (aesthetically speaking) has to be the Sony Inzone M9. Thing looks beautiful. Not sure of the full specs but it’s a 4K 144hz HDR monitor.
An upgrade to this monitor from nearly anything else is essentially a better purchase compared to an RTX 4080. An absolutely insane deal in comparison, you get true HDR and you don't take a performance hit to run it. I have the non-F version, and I can confidently say that the monitor provides more value than my 3080 does over the 1080ti I had before it. Essentially, if you have a graphics card that can handle 3440x1440 at framerates and settings you're happy enough with, this monitor will trump any other graphical upgrade you can possibly get. Don't bother touching a 4000 series card until you have one of these things. These things are an absolute STEAL if you care about the top end.
I spent 900$ for one of the first Oled gaming monitors to hit the market, in the size range I wanted (27" to 32") there were only 1 panel readily available at the time, with 3 or so brands each selling there version of it, with basically the exact same specs. A few months ago I looked at the spec for more details, and noticed the price already went down to 750$. It's only 450 nits (more around 480 nits according to calibration results, I'm guessing they just couldn't reliably guarantee 500 nits across all units), and although that does mean the HDR doesn't make a big difference compared to SDR, it's still amazing for movies, especially since I had a 21" LCD before. I had a driver issue however, windows for some unholy reason blocked the automatic driver installation, and didn't notify me at any point. It blocked me out of using HDR and streaming at native resolution for months before I finally noticed while fixing something else.
Agreed that it can suck for early adopters. But thank goodness they exist so companies keep improving their product lines since they see value (read: profit) in it.
G sync module doesn’t make sense anymore but the fact that Nvidia works close with manufacturers to make sure their monitors get the best performance and picture quality is the reason why g-sync ultimate certification is still worth getting. The messed up HDR1000 and EOFT tracking in the DWF is the perfect example of this.
Just got my DW about a week ago after waiting for 4 months (these things are sold out everywhere here) and I don't regret anything, this thing looks absolutely stunning. I spent the whole week re-watching and re-playing my favorites and I absolutely loved the time, I've never seen those shows and games like that. OLED makes my midrange IPS from just a few years ago look like a joke by comparison and I'm excited to see what comes next. If I regret anything about this monitor it's the fact that I didn't buy it earlier. If you're still considering getting one of these, go with either and you won't be disappointed.
Man, I had a Gigabyte Auros 27" 1440p144hz IPS before I got the DW... It's night and day. It makes that monitor look like ancient tech. Using anyone else's gaming setup is going to disappoint me from here on out, all I'll be able to see is the lack of detail compared to this one.
Awesome video and advice. I've ordered the DWF! Cannot wait! Sending back the Samsung Odyssey Neo G7 for it. Was far too curvey and not OLED! Keep up the great work!
Seeing this video for the first time. Just bought the DW. It was $200(CAD) less than the DWF so I regret nothing. I’m not too upset about firmware upgrades. I’ve been using my current LG monitor for 3 years and have never even thought about looking at its firmware. I’m not a power user so I just need something that looks pretty when gaming and watching movies. Very happy with my purchase and won’t be upset about new tech that comes out because I won’t be monitor shopping now for another few years 🤙🏻
Yep. The Samsung CRG9 has a lot of “early adopter woes” since it was one of the first 21:9 monitors. Many bugs and things that just don’t work. They’ll never fix it. They’ve moved on to the Odyssey etc.
Not a fan of Samsung, terrible QA, known for making their own products obsolete earlier then they should be by way of replacement or non-existent software support.
I've got an odyssey, but it's one of the first ones so it's bugs won't be patched because they're focused on the new odyssey refreshes :) Won't be buying samsung monitors anymore, no matter how much you pay some part of the experience will feel cheap.
@@Suilujx Yep i owned the Odyssey G9 for about 2 weeks before i returned it due to QA issues for a refund. Samsung always make cutting edge, high spec but very poorly built / QA'd products. Been burned with Samsung products a few times so now i actively avoid them.
@@SecretOfMonkeyIsland784 samsungs ssd's, vram etc anything memory/storage related is always solid. with monitors they just throw in better specs every year without designin things to actually last
I've got an old dell monitor u27 something. It got some oxifdation issues, but I fixed it. I bought samsung in the meanwhile, and I also chose the best from 600 bucks range. As I fixed dell, side by side samsung has so much worse colors, even the creosshair in games is barely visible. Such a cheap experience with it, I hate samsung as a brand. Bad commutation, loses signal from time to time, the only plus it's big 32 inches, and blacks for movies are better. I switched to old dell, as it's so much better in colors. I would not recommend samsung monitors as well.
I’ve owned mine for a few months now and I can honestly say there is no comparison to lcd monitors for me. Saving a couple hundred bucks by waiting months to see if the price would come down is simply not worth the wait. My enjoyment time is far more valuable.
I am watching this right now, with the DW. I got it and installed it this afternoon and when I saw this I had a sense of dread that I should have gotten the DWF but this thing is beautiful and it was on sale for a steal. It is gorgeous and I am using colour calibration on it for professional reasons anyway.
I'm actually thinking about returning the DWF and get the DW. I'm using 3080 and for the 2 main games I play the FPS does drop to below 60 sometimes (especially in Cyberpunk). So I am very curious if the g-sync module would help here
@@yushengcen4656 rtx 4090 and the DW display with every setting at ultra/psycho (ray tracing max too) and it hovers around 60fps in real time man but saying that i dont notice any tearing
I also bought the AW3423DW and im still absolutely stunned by dark loading screens with small symbols and text floating in the abyss. Im happy the new version is less expensive making the technology more accessable to everyone. You cant complain about more features for less money. Also im not into changing settings that much making me not bother about the firmware at all. Since it was set up i cant stop myself from freaking out every time something is literally floating in the air when its dark and i cant make out the edge of the screen.
3:07, wrong approach, you can compare 60/70 to 165/192.5, because you know, percentages. If im not mistaken, those two are comparable in terms of gain, barely noticable. Also, uncapped framerate for lowest latency? Im pretty sure i've seen guys prove that the best you can do is to cap your framerate with ingame limiters to be a bit below your refresh rate, so for example 144hz with 140 cap. And this also varied between GPUs and displays.
Thought about buying this a couple days ago but ultimately decided that I wanted to see the other brands' QD offerings first (as well as a solid review like this one) before I dropped over $1k.
All QD oled monitors and TVs are gonna be pretty similar as it is, Samsung only makes 1 model of mother glass at the moment for it. So they're all gonna be pretty similar to the point most differences will be on a spec sheet or small written review.
Exactly the same as me, i'm waiting to see if any of the new ones will be 4k and have DP 2.0 and HDMI 2.1. Once one of these has that, i'm diving in. Can't wait for 2023
@@daymianhogue1634 Totally, mostly just hoping the competition results in a good option coming around for sub $1k. If not, the DWF here seems like a solid option.
@@Beokabatuka I dont think sub 1k is coming there's like 4 other QDoleds announced and all of them are as expensive or more expensive than the DW. The Dwf is likely to be the "budget" option for qdoled/oled monitors for atleast a little while. The panels are made by Samsung and even Samsungs announced qdoled is more expensive. So Dell/Alienware must've gotten a very good deal from them, that's gotta be hard to beat.
sorry but theres hardly any competition, this monitor was born because of the lack of competition, the monitor market is literally reached 5 years behind the tv market, to which that is the monitor markets competition, all the fald displays released are bugger all competition as well.. the pg27uq released years ago and is better than the buggy falds they're selling at still high af prices today. sorry but I'm sick of people like you only just entering the monitor community recently and claiming competition
@@StevoHDA You’re an idiot if you think monitor competition doesn’t exist. Anybody who has been playing PC games for the last decade can see that. This isn’t some exclusive knowledge that only you or a select few have…. These monitors have the same display tech that TVs do. QD-OLED is literally cut from the same sheet. So instead of rambling on all that garbage you just spouted, how about some actual examples to prove your point?
11:10 please can Linus lab include a number from something regular like in this case a led panel. this help understand how good it is, As i got no idea of a average led panel time otherwise.. Like for testing the 4080 a graph of the 2060 or one the most used gpu
Been there done that, with every Microsoft product from Lumia 920 (dust in the front camera), to xbox one (with the huge power brick), to surface book ( don’t let me start on this). Still never regretted it, happy to be from the first people trying cool products ☺️
I bought into LG Oled as an early adopter and I'm still quite happy. The tech just gets better and I do look forward to upgrading to a matured QD-Oled panel in a few years but the major features and advantages are still present with the earlier LG models. Personally I really like G-Sync built into the panel. Running a 3080ti at 4k in HDR still doesn't get you 60 frames consistently in all games, especially if your running mods. I wouldn't have an issue with dropping the Extreme version (what is it anyways) but G-Sync is significantly better than Freesync at 4k. With 40-series cards we might actually start seeing these techs disappear, replaced by new DLSS tech on the die rather than tech in the monitor.
The difference between a modern LCD and W-OLED is like the difference between W-OLED and QD-OLED. W-OLED looks incredible, don't get me wrong. The blacks are the biggest advantage. However, LCD largely caught up on colors. Quantum dots bring the OLED blacks with even better colors. LCD and W-OLED have narrow blue spectra but wider and more inaccurate red and green spectra. Quantum dots produce extremely narrow spectra for red, green, and blue making the colors look more accurate and vibrant in a way that probably isn't captured by the percentage of DCI-P3 color space listed on the calibration paper. Plus there are no color filters and essentially zero loss of energy so the panel runs at a lower voltage and temperature, significantly extending lifespan. It is a really big difference. I planned to get a W-OLED, but then held off when Samsung announced QD-OLED and it was worth the wait. You should see one for yourself.
Linustechtips channel in a nutshell: make an outrageous clickbait title and then proceed to confute it and prove it wrong during the course of the video.
I am getting excited lots of big upgrades coming. I got the legendary acer predator xb271hu almost 7 years ago and it still has some of the best contrast on the market. QD OLED is coming soon so I might have a good enough reason to upgrade, lots of new GPUs coming, used market is becoming acceptable again.
I bought the DW not long after it came out and I don't regret it in the slightest. It was the first time in my life I had gotten a job and saved up money and I spent it to buy myself an amazing monitor and build myself the best computer I've ever owned and I don't think anything else will ever be as satisfying.
Ive had this monitor in my wishlist since that video. But in that video you did mention that this was the first of its kind, and that waiting should mean competition and revisions and price drops. Now, im not waiting for that reason, I'm waiting because that monitor is not in my budget and wont be any time soon, but glad to see this happening. Hoping by the time I do want to buy this, we might be another version or two in, and maybe Samsung or someone else will attempt to compete as well. the new version not being white is a huuuuge plus haha.
DWF came in yesterday. It cost twice as much as my last monitor. It looks infinitely better than my last monitor. It’s seriously f#cking stupid how good this looks. My only complaint is that it’s not 38 inches. (That’s what she said.)
4:21 I own an RTX 3080 Ti with a 1440p monitor and I cap my frame rate to 143 FPS for almost everything - if I were to play Fortnite, League, Rocket League, CS;GO etc I would leave uncapped. But even for World of Tanks and Warships, 143 FPS is already a ton, and more than I really need, so I save power and screen tearing from ever happening. The NVCP cap is really good with latency and reflex makes better still. also in Control, RDR2, Metro Enhanced, etc I BARELY ever hit my frame cap, and it is usually WITH DLSS Quality enabled - as most games look better in most ways with DLSS on instead of off. Hell I use DLDSR on WoT and WoWS for better clarity, at 2.25x - which is 4k native.
I feel like most times its a tradeoff between getting the newest tech and also having to tolerate the quirks the newest tech has. As the quirks diminish well its not "newest tech" anymore.
Yeah, I would also say that it depends on the kind of device. For example I bought the first gen Pixel watch because I wanted to experience Googles first smart watch. It is very likely that there will be a better and cheaper one available next year but I am still very happy with it even if it has some quirks. If I buy a new Laptop for work I may not accept first gen issues since I depend on it functioning well as opposed to e.g. my watch. It's just personal preference and even if everyone tells me the device I just bought is horrendous, if it works for me than I don't care what the others say
That's why it's not about just the newest tech, it's about the *best* tech. Obviously it's subjective, but measuring against what you want to use it for and what you care about is always, always more important than whether it was released yesterday, a year ago, 2 years ago, or 5 years ago.
Ask any ryzen owmer
Yea I bought a quest 1 knowing damn well something twice as good would come out a year later. Was worth it for sure
@@johnsherby9130 haha quest 1 owners took it in the ass hard haha
I heard that the new DWF was going to come in a variety of colors, unfortunately they used Pantone colors in their Adobe software mockups to send to the manufacturer.
Good one
LOL.
Common proprietary software L
😂
Is this why Microcenter has 8 open box DW? I just reserved one for $899. Worth it? This is to go with my 4090 and 7900x. I also have the old Alienware 3418DW I bought a vertical stand to stack it on top of the new qdold. Man idk now. $899 is pretty sweet. I also have my pc hooked up to a 55 inch LG C2 for some chill sofa sessions. Tough dude. Help!!! My order will be ready in the morning
11:07
I'm sure this wasn't intentional but the timing of the preview banners on lttstore moving with Linus's hand gesture gets me every time
Scrolled for a while to find this comment
@@Master_Wickers Relatable
I've had an aw3423dw for months, and I don't regret it. It was such an enormous step upwards and has given me enhanced joy while playing games for those months that I wouldn't have if I waited. I am also very happy that technology marches forward for others and myself to keep benefitting from.
I mirror these sentiments exactly. The $200 was worth both the months of enjoyment I've had, and the physical G-sync module.
I think Linus got it *very* wrong about g-sync not being worth it outside of a console. There are plenty of games (assassin's creed, for example) where it takes all the little jitters and dips that you get even on a high end graphics card and just... Makes them go away.
I'm glad that there's a cheaper model available and this tech is dipping towards mainstream so quickly, but I'm not upset that I bought when I did. It was a *huge* purchase for me, and it'll easily last the decade until micro LED is feasible.
Dont have it head to head with a LG C2, you will instantly regret the Alienware.
I'll be waiting for 240hz 4k. Excited!
YES. Just consider it the renting fee for previewing six months of the world's greatest piece of equipment. That rounds down to 35$ per month. Would you rather game on a older display or pay extra for the experience?
However, the new one is still in some ways improved, you still have the option to sell this one, accept the price difference as just rent, and get the latest one.
I have been having similar thoughts about the monitor I got a year ago. It was around 600$ on sale so I got it. Then a few months later, I started having some issues when it would not accept a USB-C connection and requiring a reboot to fix. And then I found OLED displays with similar size and resolution for slightly higher prices. But at least I got a good display for the past year which was way better than bending down and working with my laptop screen. I got a lot of work done more efficiently and enjoyed lots of content. It even gave me a advantage in some games. So I don't regret it.
no burn?
As someone who doesn’t research things like monitors that often, it would be nice if you had references to show what a “normal” display is capable of to give context to the capabilities of the new one. I don’t know how much better (if at all) .16ms black to white is than a regular LCD for example. That being said, I’ve had my heart set on the Alienware monitor for a LONG time and was planning on ordering one in a couple of days. Glad to see this video come out showing a cheaper model that is just as good. Great video as always!
Watch Monitors Unboxed,Tim is great at explaining this.Here is a video about the C2 to help you understand how incredibly faster OLEDs are : ua-cam.com/video/jRzGvkqSNaI/v-deo.html
wait for proper input lag (signal processing time) measurements, DW sits at under 5ms.
This. Upvote this.
monitor reviews are posted all the time, and there are plenty of reviews of this monitor comparing to others. idk how you've avoided all that research other than just being lazy. Linus posts for his viewers and his viewers have watched every one of his videos, they don't need to compare to other monitors because this video is kinda built on top of the others.. as it should be with tech. You can't expect all the information you need to be in one video.
Why heart set on an Alienware monitor in particular if you don’t know the minute differences between them? Genuinely curious
Most likely I never bought anything thinking this is as good as it gets
Making silly statements costs him credibility. He doesn't need to do it. He's very capable, successful, knows what he's doing.
If you do then you dont know how technology and progess works. Theres always something better 6 months to a year around the corner.
It depends, for example people who buy latest and greatest iphones/laptops/earphones/smart watches etc
It doesn't always have to be "I got a 3090 three months ago and now the 4090 is here"
@@colts8146 there's something better right around the corner but for some people the improvement is needed at that point and the upgrade 3-6 months down the line is not always necessary.
I bought the pixel 6 pro and the 7 is better and there is no argument there but I still won't buy the 7 or regret my purchase regardless. I wanted an upgrade back then, and the upgrade I got was massive and taking all in consideration, worth it. I have a phone that doesn't give me any problems whatsoever.
I find it so weird when people make that comparison when the majority of the people don't even have the money to change equipment every month. For content, they can show it off but realistically... They don't actually upgrade that much and you can confirm that by watching the Intel upgrade series or just any random videos where they talk about their stuff.
@@colts8146 people who buy consoles may think somewhat along those lines lol. People know there is always going to be better stuff in the future ( usually cheaper * cough* Occulus) but as Jose put it in that time and space they are getting what is arguably the best tech of the moment.
the test colors mentioned at 8:25 built in to the dwf are availabe on the dw as well. if you put your computer to sleep so there is nothing on the display then hold the control stick left for ~5 seconds it pops up all grey. After that continue clicking to the left to get red/blue/green/white/black. I'd really love to have those picture in picture options though :( was very surprised to find them missing on such a large display.
Which firmware version are you on? this might be new firmware's only.
Also mad respect for the 4080 advertisement at the beginning of the video.
@@NorninaGaming M0B102
I will give this a try. I have zero regrets but I was disappointed it doesn't have PBP like my 2017 LG curved ultrawide. Granted, I used it all of six times probably. But, when you need it, you need it.
@@VesperAegis mad respect? What?
I learned a long time ago the best tech equipment to buy is stuff that isn't cutting edge not only due to value but reliability of the item tends to be better.
Yeah, cutting edge is for extreme enthusiasts who also happen to make alot of money lol
@@lucasjuhas12 what if I just want to play rocket league at 1440p 360hz
Huge part is the fact that people who buy stuff first frequently are the guinea pigs for the product or service.
@@Excelsior_Espio better be willing to pay $$$$$
@@Excelsior_Espio 360hz haha.. I also love tinfoil hat technology!
240hz is max necessary (and even that's a stretch come fight me), anything beyond youre better of tossing money into the fireplace - at least it will give you something in return (heat)
In addition to new features, being patient also lets you avoid QC issues. For instance: in 9 months, new RTX 4090 owners probably won't have to worry about their card catching on fire.
Lol, we'll be lucky to get one at all in 9 months with stock being so low.
@@Begohan1234 while he's not wrong at all, I still feel like that's a oversight on Nvidia's part just because it's so easy to do and widespread. Like NO-ONE caught this before shipping? You telling me even Phil that works there but you have no idea what his actual job is only that whatever it is he's a fuck up and sucks at it didn't manage to do it?
@@shannondidntdoit did you watch the video? How was that “easy” to do if so many people tried to purposely reproduce it.
@@APhamx7 just saying how many people have done it in the wild shows there's something there and I know in my personal experience trying to recreate someone's fuck deals with a lot of factors that you can't or wouldn't know to recreate. Plus the company it self has way more time with these cards before they hit market to have full batches happen as soon as they hit the shelf. Gamer Nexus is completely right and I'm not arguing that at all, but as a huge publicly traded company always there's a unicorn type of stupidity out there.
@@Begohan1234 sure it comes down to user error in the end, but the fact that the connector can melt in the first place is fucking horrible lmao
I have the original monitor talked about here, and it is probably one of the best purchases I've ever made. I use it for office work at least as much as movies and games. It isn't the highest resolution thing in the world, and I wish it had USB C video in, but the colors are so nice and the lack of any glowing black bars is great. It's super easy in the eyes, and I don't regret the purchase at all
Same!! It’s a beast of a monitor
Agreed. Also, at least there's a USB-C to DP 1.4 adapter. I got that running from my laptop to the monitor. Works great and activates the G-Sync while also allowing HDR.
I used to purchase everything second-hand through the "local" topic of a hardware forum, and sell my unused gear this way too. I can't imagine how much money I saved this way - and I met some great people too!
So if you are on a budget and something like that exists around you with a real community (so that you don't get scammed, unlike other second hand options), go for it!
I literally only buy second-hand unless something new is dirt cheap. I stay well behind the curve anyway so I can get my dream stuff often for a tenth of the price or more, all while supporting giving electronics a second life and reducing e-waste.
@@FlyboyHelosim you might even get to play debugged games this way :D
@X̶̲̅ H̶̲̅A̶̲̅ sometimes people sell their stuff when the warranty hasn't expired yet. Buying second hand also means that potential QA issues have been dealt with by the original owner. If not, you can choose not to buy it.
I bought the AW3423DW when it came out knowingly taking the early adopter risks and don’t regret my purchase at all. I spent a lot of time reading and watching reviews and luckily got to see it in person at microcenter and compare before buying. This monitor is so incredible for literally everything. Revisiting games with hdr is like playing them again for the first time. I still would’ve bought the Gsync version over the freesync anyway. Either one is 100% worth the money.
Gsync is very nice to have, if you don't have a killer gpu I bet you make nice use of gsync when the fps drop. Freesync premium only goes down to 35 fps.
Yup I don't regret my purchase but it being cheaper would have been nice 🙂
@@JC-Alan I think 200$ is not much when you are dropping a grand on a monitor, that you will surely use for 3-4+ years! Gsync ultimate is really nice to have. My 2c :P
100% same here. 4090 really gets to flex on this display too with QHD Ultrawide. I'll echo what I've told my buddies about this display; The QD-OLED "DW" is a much bigger upgrade and well worth it compared with a GPU upgrade. Whatever framerate increase you might get, doesn't compare with the perfect contrast, near-instant pixel response and outrageous colors QD-OLED can produce. Period.
What's funny is the AW3423DW is often on sale cheaper than want they're asking for the newer model
One thing I like about gsync is variable overdrive. The module uses a frame predictor as a part of its scalar that predicts when the next frame will arrive and adjusts the overdrive to match, meaning you don't get any ghosting artifacts on LCD displays.
For an OLED panel though, this doesn't really matter because the pixel response time is effectively instant. The DWF is a clear winner in my opinion and is definitely going on my wishlist. It's $1900 in Australia right now, hopefully it will come down to ~$1500 on sale.
Being a PC gamer in Australia has been getting so much more expensive lately. With the 4090 at $3200 (AUD for anyone reading), and a top tier display like this being almost 2 grand, it’s getting to be out of reach for so many people. Even just last year things were much better.
@@___DRIP___ yeah I’m in Oz as well, I built my pc before just before covid so I actually got some good deal, but cos of inflation and the weak currency it’s driving tech prices up bonkers to a point where it’s unattainable even for middle to high income earners
I also bought the DW full price and waited 3 months for it to arrive. I don't regret a thing, this monitor is amazing. I only wish more games knew how to take advantage of the display. I end up spending more time than I want to trying to hack HDR support into my games.
Although supporting 4K inputs would be a game changer for me. It would enable me to watch HDR TV and movies on my DW using a little streaming box I have. Netflix is the only streaming service with HDR support on PC :(
I returned mine because the blacks are gray. My 2017 LG OLED looks way better.
@@FatalDreidel The blacks are black, what you might experience is the polarization layer or panel coating which makes them look grey when there is light hitting the monitor, in dark rooms it should be as good as any other OLED. Of course if you use the monitor mostly in lit rooms then its a problem.
but no potential firmware update down the line, must feel bad .. Thinking of returning my DW for the DWF
Windows 11 auto hdr does a good job
@@nosleep1870 Did you ever update your previous display? Not sure it was something possible. Just enjoy the display you won't miss the few new settings available from the DW
This is now my 2nd time fully watching this video and I am so damn tempted to buy one of these but knowing there's a lot of brands working on monitors with this exact panel tech makes me still wait.
I'm more worried about burn-in and image retention 3 years down the road, having to battle DELL support for the warranty since I've heard they fight you with tooth and claw to deny it.
@@laggmonstret My ASUS monitor started flickering on 1 corner 2 DAYS BEFORE the warranty expired. 1 day later they agreed it was faulty and I got a brand new one. Overall megacorps are equally bad in their own ways. If my monitor failed literally 25-48 hours later I would've lost $1000 lol. I do wonder if 3 years is an EXACT cutoff point
@@BenLA5 Ofc, they must have calculated the margins perfect for that hehehee
Burn in doesn’t seem like an issue nowadays
@@Alex-zi1nb It really isn't, there is a bigger chance of your display's board dying on you than it suffering from burn-in.
I almost bought this monitor on launch and also as recently as two weeks ago. I'm glad I've been patiently awaiting the monitor scene to continue to grow. It will only get better and cheaper from here.
I almost bought DW but they proposed a discount for using financing, so why not? No interest and discount sounds good. Well, then they told me to fuck off, so I did exactly that and did not buy it. Later on I bought DWF and that seems to be better choice than I expected.
Alienware always sell overpriced crap. Its not real OLED, because its thick fat ugly. It doesnt take a genius. Whats with the uneven side bezels?
waiting for (hopefully) a 42" QD OLED next year 🤞
What do you think, should we wait for other companies to start making their oled monitors or jump on the dwf?
Also waiting on a monitor purchase, but I might repaint my walls to mix up the viewing experience until I finally buy one.
From what I remember, the Gsync Ultimate module is an FPGA, which is gonna be at least $100 in bulk for the transceivers required for DP 1.4. Cutting that would easily explain the cost savings.
Around $150/chip at big quantity.
+rams+power delivery.
And probably you would need to update the NV module when you reflash the monitor FW, but you can only update the FPGA with new Bitstream via some developer debug interface, because the stuff not supporting anything else.
and still no one use it
Weird that Nvidia of all companies wouldn't have an ASIC made when that's their core business.
@@lorincmate There's no reason the mainboard couldn't reprogram an FPGA as part of a firmware update. All the FPGAs I've ever worked with supported good old SPI eeproms. There's like 99.5% chance that whichever processor they're using already supports SPI (and if not, can almost certainly bitbang it). If the G-Sync Ultimate module is to blame for the lack of FW updates, it's not a technical limitation, it's Nvidia fuckery.
If you're always several years behind the latest and greatest, then you'll benefit from huge upgrades from what you currently have, just like other people! It'll usually be cheaper though, and often with a lot of bugs worked out and plenty of reviews to sample from. The only downside is that sometimes prices get weirdly high for a product nearing the end of its life cycle, and sometimes support for it may be diminishing too.
I have 2 8 yr old gaming monitors.... i love them but also like i can just buy these on ebay down the road for cheap
I have a Dell Alienware AW3423DW QD-OLED and I have exactly zero buyer's remorse. I've been waiting for this tech for years. Normally you get a new device and it is cool at first but, eventually, the novelty wears off. However, the crazy beautiful colors of this monitor blow me away every single day. Not just good photos and watching movies, but even small things like the color in favicons, emoji, syntax highlighting, or my lock screen. The true blacks are game changing. I love this thing. Quantum dots are the future.
Quantum Dots are the present, Micro LEDs are the future.
@@thrace_bot1012 It will be cheaper to make a blue micro-LED panel and print quantum dots on it than it will be to manufacture a panel with 3-4 colors of LEDs per pixel. So quantum dots are also the future!
In 3-5 years when mico-LED becomes viable for consumer displays, I will consider upgrading. My Alienware QD-OLED will go to my Mom, my 34" curved ultrawide LG LED-backlit LCD from 2017 that my Mom is currently using may go to my niece, and the 24" Planar 1080p touchscreen LCD from 2010 or 2011 that my niece is using may come back to me and get mounted in my kitchen where I can pull up my recipes with dirty fingers or the butt end of a silicone spatula. Ahh, the cycle of life.
How is the monitor 2 years later? I'm looking to buy the DWF model. Does it have any burn in yet?
@@ba11ard I am using it now, it is as good as the day I got it! Errr, almost. Zero dead pixels, zero burn-in, zero buyers remorse. I use this for both work and play so it is on up to sixteen hours a day, but my computer it set to put the display to sleep when I am not using it. Sometimes I forget to turn caffeine off and the display is on all night with my locks screen background static, I try to be careful but it happens and still zero burn-in. I do panel maintenance regularly, but ignore it when it asks while I am using it. Think like once a day to a few times a week for the short cycle instead of every few hours, and maybe once a month or a few times a year for the long cycle of panel maintenance. The reason I said "almost perfect" is because the purple anti-glare coating has degraded in places. I found out, after some months, that opening cans of carbonated water or beer shoots out tiny droplets. When they land on the monitor, they dissolve this purple coating eventually. You can't tell when it is on because the droplets are too small, but you can see where the droplets were when it is off. I hold my hand over my drinks when I open them now and am careful to wipe any droplets that splash on the monitor off right away. I recommend it, I think you will be happy. The price has come down since I got it, too! Get the non-nVidia variant (i.e. without Gsync) since the Gsync variant cannot accept firmware updates and nVidia cards support Freesync now. That saves a little more money, too.
Thanks for talking about the Source tone mapping in the settings. I was wondering why BF2042 in HDR wasn't working properly and other games were clipping on my AW3423DWF. I turned on that setting plus calibrated with the ''HDR Calibration Tool'' in the windows settings and now everything works fine. This should be the default setting for real. @Linus Tech Tips
@@lucasrem HDR has nothing to do with resolution. There are no consumer monitors that do "higher!" than 10bit. The resolution is clearly stated 3440x1440, not a "quad HD ready monitor". And none of any of the incorrect things you are saying has to do with the clipping issue Silveraga was talking about and has resolved.
As for the actual specifics of how HDR is handled on the monitor:
"Note that the AW3423DWF has a true 10-bit panel, but it’s limited to 8-bit at 100Hz (120Hz via custom resolution). However, with HDR content, you get GPU dithering (8-bit + 2-bit FRC), which is indistinguishable from native 10-bit. In contrast, the AW3423DW model is limited to 144Hz at 10-bit."
Ye tru
I just got one of these and couldn't agree with Silveraga more, the Source Tone Map is actually a wild improvement in quality, the fact that its not called out repeatedly in this video is almost criminal. This monitor looked AWFUL out of the box, I was convinced something was wrong with it, because things looked washed out, color quality was meh - benchmarks looked worse than they did on my original AW3418DW. I thought maybe it was Windows HDR handling, but when I connected the computer to my TV it worked flawlessly. Sure enough, came back to this video, skipped to the HDR setting, turned the "Source Tone Map" on and it's like an entirely different panel.
Dell/Alienware is doing a WILD disservice by not having that on by default.
@@mackdoherty I bought a refurb model from dell and that setting was on by default! Love this thing but keeping a 16:10 LCD around for work.
@curtisbme the good news is.. for games that use exclusive full screen, I can choose the refresh rate just for that game. Some benefit from the extra 31Hz like competitive shooters, others are graphical cinematic eye candy that I'd prefer to use native HDR10.
I bought the s95b for msrp on release. I could now get it for 40% of the original price. I've certainly learned my lesson with early adopting new display technologies :)
@@QueueTeePies from the few things i saw, the c1 is superior in most ways to the c2, as they dropped functions out of it. not 100% but thats what my memory is telling me
The S95b always had too many firmware problems for me to consider buying one.
The LG C1 is the best value gaming display on the market, by a wide margin, and unless Samsung pulls a rabbit out of their hat and releases a 4K240hz QD-OLED monitor I'm probably going to upgrade to another LG in a few years.
@@QueueTeePies I bought a 3060 during the GPU shortage and now you can get a 3080 for the same price (~$530)
I know this comment is from 2 years ago but I just purchased this exact monitor brand new off Dell for $560
My story's LITERALLY the exact same as Plouffe's. Huge Oled simp my whole life, immediately went out and copped the AW34. It only puts my LG 27gl850 "to shame" in scenes with a lot of black or darkness, but the vivid colors come close. HEAVILY used since May-June and zero burn in or issues.
I think mine has some stuck pixels, but, that could easily be because I don't know what I'm doing and ran panel refresh a lot thinking that it'd be better than using pixel refresh more than once... turns out I'm a dumbass and that could've shortened the lifespan of the monitor, on the plus side, though, I found a cool open source app to force windows to make task bar, etc true black.
I've always taken the "this is as good as I can afford" approach when buying any parts for my builds. This is the only way to make sure I'm never disappointed. Sorry Plouffe, hope you learned your lesson dude!
Plouffe doesn't regret it, though. It was just a joke.
@@Exilum I know, but it still sucks paying all that cash and having it's replacement come less than half a year later and show it up for quite a bit less money. He still got a great monitor either way though, well worth the money.
@@dragon2knight Indeed. And, he did get to use the monitor for the past 6 months!
@@dragon2knight Yep. My three questions to evaluate these types of situations is "did I get what I wanted when I bought it?", "am I happy with it?", "does it still serve its purpose as well as before?"
This
A lot of LTT monitor videos have that RGB colour map. I have no idea how to read it, and my only takeaway is hearing the presenter tell me it's good. Could LTT do a video or a short that teaches us how to read that graph, what everything means and why?
@@lucasrem Wait, DCI-P3 is only 8-bit? I thought DCI-P3 was above sRGB and sRGB is literally 0-255 (2^8=256 colors) for red, green, and blue, hence 8-bit.
I remember Hardware Unbox made a video explaining how that map worked, you should check them out i think they do good monitor reviews.
The color* map literally shows what is acceptable and what is out of spec. You just asked the equivalent to Linus making a video about how to draw in the lines on a coloring book. Common sense is a personal responsibility..
Go to Monitors Unboxed (2nd Hardware Unboxed) Channel and look for the video “what are response times, overshoot and cumulative deviation” it many of the monitor terminology and for reading graphs 🥰💪👍
@@C4Oc. DCI-P3 is 10 bit, keep in mind there a different ways to view that, because technically monitors have 24bit color or something, dont quote me on that, but thats measuring it the "old" way.
But no, DCI-P3 is 10 bit, not 8 bit, all HDR color formats are 10bit AFAIK
I am extremely thankful for all the early adopters who go through the painstaking work of making purchasing much easier for me months or even years later.
Plouffe saying "I own a display" as if it's some sort of brag is still so funny to me
I'm sure he meant to say something about owning a good display or something
@@KJMcLaws for sure, it just came it out in a funny way
@@curiousfullstacks the best way lol
I’ve recently started to think about replacing my two 24” monitors with an ultra wide and I’m glad you made a video on this model! The DWF will probably be my first OLED monitor and I’m excited to make the switch soon!
Having an ultra wide > two monitors. I switched from two 27" monitors a few years ago and the experience is absolutely worth it.
I switched to a 49" ultrawide, and I don't think it is worth going any smaller. I love this thing!
Sure, it may only be a VA panel, but 1440p with 120hz makes up for it and is pretty much all I need rn. Plus, it only cost me $750 for this AOC monitor compared to the others which are $1000+
@@smallbutdeadly931 I've been testing out monitors recently as I've never really used ultrawides before, and I'll admit that there's a part of me that feels almost "cheated" when I deal with games that don't render at 32:9. The two big displays that I've been using are the AW3423DW (21:9, the G-Sync one from this video) and the Neo G9 (32:9). For gaming, I decided to try out God of War. I really wanted to try a racing game like Forza Horizon 5, but I only have it through Game Pass Ultimate and I don't care for the Microsoft Store on Windows. (I might bite the bullet and try it out though as racing or sim games are supposedly the bread and butter of these huge displays.) Anyway, I didn't realize that God of War only supports 21:9 and not 32:9, so I was a bit surprised when I switched to the Neo G9 and got black bars. It still looked pretty good, and the blacks -- while not as good as the OLED -- were still quite good.
Also, one small problem with the dimming on the Neo G9 is that it seems like they are really aggressive about ignoring highlights, which happens to include the cursor on the desktop. So, if it's on a brighter window, it's very visible, but if the cursor is on a black background? It's really dim. Now, that's not an issue with ultrawides as that would theoretically happen on a 16:9 monitor with similar auto-dimming functionality.
You will literally never go back. I have the DW with the new firmware and it’s the best monitor on the planet hands down. Got it for 1100 bucks too. Better than the DF still imo.
@@enzog1078 Same.
I was also an early adopter of the DW and its been the best thing to have to work and play. So good in fact, I just bought an S95B on sale to get the same tech in my theatre room. I will have to admit, not being able to disable the messages for pixel and panel refresh is dumb and not being able to update the firmware is even dumber, but playing on this monitor for any kind of content has been amazing. HDR off for desktop use. HDR on for games that support it.
Why turn it off for desktop use?
@@mauree1618 They don't mean disable the refresh, just the message that pops up reminding you to run it. Most people prefer to just let it refresh when the monitor goes to sleep, the monitor doesn't want you to do that.
That’s not normal. I never get the message since choosing the option to only do the cleaning algorithm while in standby.
@@JeremyWinter Sorry I wasn't clear, I meant to ask why does Eric turn off HDR for desktop use? I'm new to HDR so idk.
@@mauree1618 windows doesnt use the correct HDR tone mapping. There is an auto HDR slider but that is not that great compared to SDR which windows is made in. HDR games are mastered for HDR and when its on, will display the correct lighting and color.
Did anyone notice at 11:06 Linus used magic to change the store page
holy shit
Hahaha. Brilliant
AYO
Man amazing
I love to be on the early adopter train but recently I've been more conscious of the cost benefit for me particularly and that has lead me to be very happy with getting previous generation or older tech.
So for me personally, as long as my tech holds up for 5-7 years I'm not too phased if a better model comes out
This. If you hold off for the best model announced, you'll be holding off forever. There is always a better model announced but not out yet, and always a better model out in a few months. You just gotta get what you're gonna get and enjoy it for what it is. If you get lucky, you might even land on a generation that makes substantially more progress than the one before or after it.
1:22 I used to see him on RTINGS monitor reviews. Was wondering where he is up to. Good to see him back.
It's amazing to see more manufacturers produce better QD OLED monitors but it's kinda unfortunate that they all seem to be focusing on making 34" ultrawide displays and not cover some other form factors. A 16:9 4k QD OLED would be amazing but apparatnly that'll still take some time.
they already have those.
@@clownavenger0 not in QDOLED in standard monitor sizes. No normal person is gonna use a 65” TV as a monitor
@@genderender well 4k is like kinda silly at 27 or 32 inches. Maybe a 42 or 37 would be nice tho
@@clownavenger0 too bad, you can only currently get 65” or 34” 21:9 QDOLED panels. No other size regardless of need
@@clownavenger0 1440p is going to be even less likely. They just manufacture single motherglass for 4k and cut it up to pieces with maximum yield.
32" is pretty much the optimal size for most desks for viewing distance/depth.
variable refresh rate is great even from things that are not consoles.... having your hz adjust to frame rate always makes for a smoother experience and removes screen tearing as long as you cap frame rate a little before the max hz
I've found it can help a lot with games where the frame rate swings broadly. Going from 90fps to 160 fps rapidly is very noticeable, even though both are perfectly playable frame rates. I personally tend to just set the max frame rate in that game lower to handle it, but adaptive sync goes a long way to making that feel smooth as is.
The DW variant now allows firmware updates. Dec 5th 2023
IF you have a nvidia gpu
Why would you ever buy a non-Nvidia GPU? @@peoplesactionsnottheirword8760
and will the fan be quieter after that fw update?
I really hope the QD-OLED offerings next year are more prolific. I really want one that supports DP 2.1. I usually sit on a monitor for 5 - 10 years. It'd be nice to do that again.
Is pled good choice for 5-10 years when burn in is still a thing
@@maegnificant it is still OLED. And oldest display is still just 6months old to believe you
@@ligametis tell that to the LG c9 or cx owners lol none of them have burn in lol
@@redclaw72666 c9 already has burn in for me and my friend's c1 also has that.
Galaxy S3 S5 s8 S10 and S20 also already have burn in, including Lenovo oled laptop. It seems every device I know that is over 4 years old is with burn in.
By the way look at used devices for sale, like half of those with oled have burn in
Linus already has burn in on his LG and it isn't even one year old yoo
@@redclaw72666 OLEDs will eventually just burn in, 10 years seems like a reasonable end period for one. It’s Organic LEDs, they slowly die as they’re used. You could sit on an older monitors for about 10-15 years of constant use, not gonna be a thing for even the best OLEDs.
As someone who owns the AW2334DW and is currently using it, I stand firmly with the buyer that Linus quoted, no image retention, its soo bright that I often have it at 70 percent brightness anyway and also i got it for £1099 because of an offer DELL UK was doing and now I did it, no regrets and I like NVIDIA ULTIMATE and im happy I have the tech, and even if LIKE EVERYTHING it isnt perfect, I know that the next time I buy a new monitor it will be even better. But damn, I have constantly proud of this monitor, LOVE IT.
I got it for £920 last month and its genuinely incredible, the only fault I can think of is the pop up that comes on every 4 hours to tell you to pixel refresh.
Also an owner, and I love it. No regrets!
@@demetter7936 You can tell it to do the pixel refresh and not ask again on the popup, after which it will automatically do it if it goes into standby after 4+ hours of being turned on!
@@minamoto_hikari I always do it when it asks because I don't want to risk burn in.
I really like my AW QD-OLED, took me 7 days to get mine
Hello
I've enjoyed this monitor so much over the past 5 months that I don't really feel all that bad for spending $200 more than I needed to. I think I've even played more games than I normally would just because I like the monitor so much
I prefer having G-Sync Ultimate, for me that's the best selling point.
I picked up the DWF last saturday for $799 on Dell's website and a day later it was listed as $100 more😎
just got it for 699
@@badgaem 💀
Yep just got for 699 at bestbuy. Great monitor for the money. kinda wild ngl
I actually got the none F version not knowing when this new F version was going to release, I got the MOB02 update and no issues with the bubble wrap like others have had. I also got it for under 1000 on a sale they did so I really cannot complain about it. The monitor is beautiful and anyone who has seen me on it instantly stops and has a quick look. So glad there's a cheaper option available and hopefully it continues like this.
Whered did u manage to find that sale?
@@oscarreyes4807 it was from Dell themselves. Here in the UK its 1200 normally but they had a discount for 980 I then applied another code they had going on and it worked. It was like a day or 2 on sale around a month back? Oh and free shipping so I got one hell of a deal for a total of like 900 including the tax. Instead of the 1200 it normally cost.
Yep, they had a similar sale in the states. I bought mine in September for 1000+tax
That's kinda how I feel, I didn't get mine under $1k, but did snag it during a lull in scalper activity, and sure, there are things about the DWF that seem good, but, not enough for me to scrap the DW, also glad that the price is lesser for others, especially since people like me who have a very rare medical condition need OLED's due to how much better it is on our eyes to have actual black being used vs "black" of backlit panels.
My only complaint about the monitor is that the screen is almost impossible to clean. You can use a torch and not see anything at all but if off angle.... Very off angle you may see a slight haze on the screen still that will never go away. It's not the bubble wrap thing either. It isn't noticeable in anyway if you use it normally but it's annoying knowing it's there lol. Oh well 3 yr warranty on it for burn in so imma just use it for as long as that happens. Can't wait to see OLED become more common.
Hey Linus, you show these colour accuracy graphs a lot but but I don’t really understand how we get that data, could you perhaps do a video explaining the whole colour accuracy testing process?
Thanks, Aaron
^
He talks about it about halfway through the X-rite i1 Basic Pro 2 review.
UA-cam doesn't allow links even to UA-cam so you'll have to search on your own. Also talks about it in the Techquickie video about color calibrating your monitor a little bit.
Luke also talks about a bit in a separate Techquickie "Why do Monitors Display Colors Differently?" Though it's more in relation to what subsets of the graph show.
11:06 Is no one gonna talk about that clean hand wave to continue the slideshow on the website on the monitors....
Side Note: Currently bought 2 of the AW3423DWF hopefully to mount one above the other or do a side by side display.
If you look now at the cost as a u.s. customer it's now 1399 for the dwf and 1499 for the dw .
OLED displays are definitely the hotness. Waiting for Linus's video on how using the OLED TV's as his daily drivers has been.
I don't think that's planned to be a video yet given it's like his 4th Oled TV for a monitor in a row. And they've kinda already talked about it in some videos previously and even in this video he mentions having burn in after only a month lol
Image retention is not burn in. OLED TV as a monitor been smooth sailing here very minor complaints.
@@daymianhogue1634 i want to buy an Oled TV to upgrade to, but that is the major fear I have. I don't want to spend $1000+ just to end up with a display that gets burn in which I know Oled's are known for lol. Been keeping an eye on the Samsung Quantam Oled since Linus stated those have less chance of burn in.
I have put thousands of hours on my 55 lg CX OLED and not a single sign of anything wrong with it. It even tells me when it's time to do its own maintenance on the screen.
@@TheCommanderTaco if you're just gonna use it as a TV you're less likely to get stuff burned in. And you can get oleds for well below 1k now for TV. You can get like a 55" LG A1 for like $600ish from best buy on clearance and just a couple weeks ago they had the A2s on sale for like $550. Wouldn't be surprised if the A1 goes on an even bigger clearance blackfriday and/or if the A2 goes back on that sale again on blackfriday.
Burn in more so comes from using it as a monitor because like UI elements stay static on screen for too long.
7:44 Translation- HDR is a joke in general with current standards.
After watching this video, I finally bought the DWF; I have an AMD GPU and waiting for the $200 price drop / freesync was so worth it. I know that buying it is a little ridiculous, but who cares what others think of my purchases. Ever since I built my first computer in 2016, I have ALWAYS wanted an Oled 34" 3440*1440p high refresh rate monitor. I remember a 34" 2560*1080 VA 60hz costing 600$ back in the day and that was too expensive. Now that my dream spec monitor has finally come out I have saved up since the announcement of the DW and pulled the trigger. Can't wait till this Monday to get it. Thanks Linus, a 7 year fan.
How was it ?
@@aayaan1935 it's a fantastic monitor. It's exorbitant but god it's worth every penny. It's gonna be hard to get anything better for quite a while I'm still very happy with the purchase and highly advise you to get it if you've been on the fence!
The DW G-Sync Variant can now be updated via DisplayPort with an Nvidia GPU. Input Latency is now cut in half and fan ramp up problem is solved.
Linus Media Group should consider getting a dedicated captioning team. The "generic" captions [or whatever service provider they use to caption these videos] tend to miss a lot of the specialty terminology. idk if there's a way to avoid that and specify the terms that appear in the vid, but things like "Blue" ["Plouffe", 0:08] and "cutie" ["QD", 10:17] are weird to see on an otherwise high quality video that hasn't been auto-captioned
unless youtube decided to remove the "auto-generated" description for w/e reason from the option and those are actually the consequence of automatic captions
The Lab is going to be amazing, i'm looking forward to making companies make quality stuff
🤔 whoa 🤯
I have the AW3423DW and so far really happy with it, i feel the dwf model is not really an upgrade, but a different version/model that's great for people with AMD cards that don't have any use for the g-sync module.
It's not intended to be an upgrade. Just a cheaper alternative of equal quality. Though firmware updates may eventually make it a slight upgrade.
@Daymian Hogue I don't think it will get that many new features, but it's great to be able to update if you have any bugs, so far I have not experienced any bugs with my model, I have the model with M0B102 firmware, so the second iteration. I believe a few people with the M0B101 version have some annoying bugs.
@@TheSolheim that's fair was just saying.
Gsync v2 modules can be used with either amd or nvidia gpu.
@@anthosm really, well, that's good news :)
This is the exact video I was waiting for. I planned on picking up the DW in the next few weeks as I was sure the extra $200 was going to make for a better gaming experience with my 3090. Even without the gsync module it seems the DWF is the better buy as long as freesync plays nice with my GPU (I've never used a freesync panel, nor have I done any gaming in HDR). The black color scheme of the monitor is just giving on the cake at this point.
as stated by linus in this video - if you have a beefy enough GPU, youre almost always gonna be playing with an uncapped framerate for the most minimal response time (higher then monitor refresh rate still equates to more responsive peripheral inputs - even moreso, screen tearing is nigh on indistinguishable at framerates that high); the only reason you'd ever want the DW model is for games you know will run BELOW 60fps to help smoothe out the jarring frametimes/screen tearing - the DWF is definitely a better buy, however, im not sure if going strictly freesync was a great idea, it still requires a bit of a workaround on nvidia cads IIRC from my last freesync monitor (returned it for a proper gsync module'd monitor btw). G-sync, while stupid that it has multiple tiers of 'certification' is still definitely king when it comes to VRR so *realistically* the only drawback of the DWF model would be slightly less VRR friendliness, but again, if you have a beefy enough gpu, there really is no need to even enable gsync until games finally tank your GPU to below 60fps... which will be quite a while lol
@@kendog0013 Have you even used an HFR or VRR display? Because it sounds like you haven't and are just regurgitating information you don't really understand.
Any drop in framerate below your monitor's refresh rate introduces judder. It doesn't matter if it's 140FPS on a 144Hz display, it won't look smooth to your eyes.
Gsync/freesync stops that, full stop. It's also a much better experience running at 140FPS/144Hz Gsynced than it is 180FPS/144Hz nosynced, because 'uncapped framerate' above your refresh rate suffers from tearing.
Also, you can only sync so low until the lack of frames severely detracts from the visual smoothness. As someone used to HFR, I can't even tolerate 60FPS games, because they look as bad as 30FPS did back when we were all using 60Hz monitors. VRR does not help at all in this case. But, this is more a user experience things, because anyone still using 60Hz won't be that bothered, because their eyes simply haven't been trained on better. My comfort zone is at least 90FPS, gsynced of course.
Additionally, as someone with a G-sync compatible display (34GK950F-B, not even verified), 'Freesync' based gsync works absolutely fine. In fullscreen (or borderless). Look up MPO's (multiplane overlays) if you want to use VRR in windowed mode. Never, ever, use the windowed mode option in the NVCP; it's horrific.
Gsync (VRR) is an absolute requirement for gaming for me. It's an invaluable technology that should've been in spec since LCD technology was developed. Crazy it took some proprietary BS to become adopted decades later, but that seems to be the industry norm now..
@@kendog0013 i get screen tearing usually when games go above 144hz (my current monitor) --- quite distinguishable.... end up capping the FR
Buy the DW, the HDR1000 on the DWF is broken (dell says that it will fix it on an update but who knows ).
@@Gambit8319I've been gaming on PCs since the late 90s, back when screen tearing was often just a thing you dealt with. Some of the worst screen tearing I've ever had was when VRR was new tech and video card drivers didn't yet include max frame rate settings. Going above the refresh rate with VRR enabled can be a horrible experience.
I bought an AW3423DW a couple of months ago on firmware version 2. In HDR It's still so pretty that I could cry. Zero regrets.
No regerts
The only review that talks about PIP as one of the differences between this and the 'F'.. worth the 15m+ subscribers for sure !
I have 9 months of amazing experiences from my OG QD-OLED that I wouldn’t have had if I waited. I’m happy. And I would have played through lots of games with a sub par experience by using an IPS or VA panel. No regrets. We’ll kind of. I really wanted 2 displayports.
Cope lol
Dammit Linus I literally just purchased the OG model a day ago after going back and forth trying to decide whether to get the F or non-F, and thought the few extra bells and whistles would be worth the relatively small price increase; never saw anything mentioned about the firmware update ability or better color accuracy. At least I didn't pay full price for it (found a 15% discount code for Dells' website) and it still seems like it'll be an excellent monitor judging from friends who already have it, but still a little bummed I didn't wait just a bit longer for this video.
F in comment section for the timing. :(
If you just bought it, you could return it, assuming there isn't a massive availability wait for the new model.
@@affieuk it's already out for delivery, pretty sure it's too late at this point. I'm not too miffed about it, just bad timing for me.
Both versions look incredible! I get that the new one does have a number of benefits, cheaper, and that 165hz vs 175hz is a negligible difference, I just hate that it loses some hz compared to it's predecessor. The DW also having better HDR than the DWF, but 4k HDR is big for new consoles. Ugh it's such a tough choice!
Better wait for further (users) reviews and feedback first.
At the moment the DW 10-bit colour is supported up to 144Hz, but would drop down to 8-bit if using 175Hz; whereas accordingly some someone posted on the Overclockers UK forum stating that according to the Dell's rep response, the DWF would only do 10-bit colour up to 100Hz, and above 100Hz it will drop down to 8-bit colour.
Not sure if that would be a deal-breaker for you.
I just bought the DW because I was able to stack the Black Friday discount plus a 10% off discount code bring the total down to £879. I got the Amex offer of getting £100 credit back for purchase over £750 at Dell, so effectively I got the DW for just £779. The DWF is most likely going to take forever to become available in the UK, and but the time they do become available, most likely they would be at over £1000-£1100 so there's really no point for waiting for me.
I feel you, I'm in the same boat. I'm leaning towards better HDR though. I'm also considering/waiting for other manufacturers to release their 21:9 34" QD monitors, like Samsung Odyssey OLED G8 G85SB or MSI MEG 342C.
@@MarineRX179 so if i were to buy one of these 2, for the better color option combined with higher fps, i should get the DW instead of the DWF? sorry if you've already answered this, just dont know much about high end monitors, ha
@linustechtips at 11:05 the way Linus moves his hands and the website sliders change is phenomenal and perfectly synced by pure coincidence 😂👍
I love the fact that at this point, there are so many people at LMG that name collisions are just a normal part of the job.
"Just get me Jake!" "Which one?"
"Nevermind, get me Nick!" "Which one, dude?"
"Oh screw it, get me Alex" "... Are you being serious right now?"
Would be real nice if they ever made this thing available in most of Europe, or even gave a time table for when it's going to happen.
Plouffe went for it, paving the way for the rest. Yeah it can suck but when it works, sweet deal. Nice breakdown of the concept with an example 😀
Not much of an early adopter tax.
Just ordered one of these today. I’m thinking I should have shelled out the few extra hundred dollars for the GSync variant but I think the dwf will be just fine
Is this still worth buying? Early Fall 2023?
11:06 Linus's timing with the Hand gestures there holy.
The LTT Store website had the images flying in from the right looking like Linus swiped over a touchscreen xD
This is why I'm holding out for LTT screwdriver 2.0. It's rumored to allow user firmware upgrades and HDR10+.
i still think NVIDIA should just design an interface and module that the owner of a device can install or upgrade based on what they need
kind of like a CM module from PI and the interface could always have more pins if needed
Gsync module is an fpga. They have to re-write the whole thing if they want to make it simply arm based.
I said "F it" and bought both the DW and DWF. Was always keen on Alienware's 3440x1440 ultrawides, as my older AW3418DW (x2) monitors are great still. Excited to build my new PC and test these new monitors out, and truly experience QD OLED for the first time outside of stores.
damn you got both....why choose when you can have both lmao...one for gaming in hdr and one for everything else xD
I have three ultrawides - 2x AW3418DWs and 1x MSI MAG341CQ... the Alienwares have been incredible, the MSI has been a disappointment. I've been debating going for the 3420 or 3423 lineup to replace the MSI - Dell/Alienware computers may be overpriced and questionable quality, but their peripheral game is 100% on-point.
So far, the 3423DW and DWF are amazing. The only thing I wasn't really prepped for was the QD-OLED pixel refresh cycle it has to run every 4 hours to avoid burn in. Lasts like 3-6mins at most. You can skip this or delay it, and manually do it, but they prompt you on the DWF as you go to turn it off, and the DW does it automatically before putting itself in rest, or off. The picture quality on 400nit true black is my preferred choice on both, and I've lowered the brightness to around 55. But they also have the 1000nit peak brightness for ridiculous range, and amazing visuals. Honestly, no issues with them at all. They are new so maybe I'll update this post every so often and keep feedback if people want to see the monitor's worth, or if they have issues down the road. But for now, highly recommend them for anyone who has the cash to spend
@@thecowboyfromcali what do you mean the refresh cycle has to run every 4 hours??? Like the monitor has to reset or something? Or like if it’s just sitting on desktop screen not doing anything then it has to make something change and warm the colors back up???
@@brianfender6811 The OLED pixel refresh as far as I understand, is to eliminate burn in and make sure your picture is always as crisp as can be. By leaving your monitor on for over 4 hours, you run the risk of certain pixels "sticking" where it will imprint color. So basically the refresh cycle is just to allow the system to do a diagnostic check, and make sure no pixels are showing signs of burn in. It serves the same purpose as turning the monitor off for 4ish minutes so the pixels can "refresh"
Hope this is the explanation you were looking for, idk much more in terms of technicality
Title says "being an early adopter sucks" and the very start of video shows a pic of the new screwdriver XD
Thank you Linus! Btw, that Daniel haircut is hilarious, kind of like a topiary bush and Linus now has some bangs to play with. Interesting seeing how Covid is changing our lifestyles and fashion (or lack there of. ) lol
I’m confused, is the Gsync version better because of it’s HDR Tone mapping? Or is source moe lose enough to make the choice clear. Wish there was a real. Review out.
what i learned from this video....is that the gsync DW is better at HDR while gaming since that's the purpose of gsync ultimate qualification. obviously not a garrentee on hdr nit peak brightness at 1000 but it certainly can do hdr while gaming with VRR on.
@@arthurpendragon8192 Gsync is dead. Only fan boys pay 200 more $ and pretend it's doing something.
@@thomasfischer9259 I guess you never experienced a proper G-Sync monitor bro.
@@thomasfischer9259 What if I can actually get the G-Sync model for $100 cheaper? Now what to do....
@@SticksUWP you still can't firmware, still have less HDR modes, still have less calibration settings - means you would be paying less for less but get a cool logo sticker.
0:35 "I own a display" -- Paid in full, not financing it! It counts!
While the Samsung version is not out yet, they have shown us what it will look like and it looks amazing! No matter which one you get its the same panel and it has been blowing my mind for the last 5 months. The DWF version for $1100 is a KILLER deal.
I considered the other options coming and picked the Alienware because it won't be a "smart" display.
I'm connecting this to a computer, I don't need spyware baked in.
Best monitor I’ve seen yet (aesthetically speaking) has to be the Sony Inzone M9. Thing looks beautiful. Not sure of the full specs but it’s a 4K 144hz HDR monitor.
An upgrade to this monitor from nearly anything else is essentially a better purchase compared to an RTX 4080. An absolutely insane deal in comparison, you get true HDR and you don't take a performance hit to run it. I have the non-F version, and I can confidently say that the monitor provides more value than my 3080 does over the 1080ti I had before it.
Essentially, if you have a graphics card that can handle 3440x1440 at framerates and settings you're happy enough with, this monitor will trump any other graphical upgrade you can possibly get. Don't bother touching a 4000 series card until you have one of these things. These things are an absolute STEAL if you care about the top end.
I spent 900$ for one of the first Oled gaming monitors to hit the market, in the size range I wanted (27" to 32") there were only 1 panel readily available at the time, with 3 or so brands each selling there version of it, with basically the exact same specs.
A few months ago I looked at the spec for more details, and noticed the price already went down to 750$.
It's only 450 nits (more around 480 nits according to calibration results, I'm guessing they just couldn't reliably guarantee 500 nits across all units), and although that does mean the HDR doesn't make a big difference compared to SDR, it's still amazing for movies, especially since I had a 21" LCD before.
I had a driver issue however, windows for some unholy reason blocked the automatic driver installation, and didn't notify me at any point. It blocked me out of using HDR and streaming at native resolution for months before I finally noticed while fixing something else.
Just an FYI.... Dell has the newer DWF 165 Hz 34in. OLED for $200 off $799 + tax, free shipping. I just purchased two of them.
Agreed that it can suck for early adopters. But thank goodness they exist so companies keep improving their product lines since they see value (read: profit) in it.
G sync module doesn’t make sense anymore but the fact that Nvidia works close with manufacturers to make sure their monitors get the best performance and picture quality is the reason why g-sync ultimate certification is still worth getting. The messed up HDR1000 and EOFT tracking in the DWF is the perfect example of this.
Agreed. I'm glad I waited even for a $212 price drop on Amazon. That's $1088 before 10.3% sales tax..
hdr with adaptive refresh.
Just got my DW about a week ago after waiting for 4 months (these things are sold out everywhere here) and I don't regret anything, this thing looks absolutely stunning. I spent the whole week re-watching and re-playing my favorites and I absolutely loved the time, I've never seen those shows and games like that. OLED makes my midrange IPS from just a few years ago look like a joke by comparison and I'm excited to see what comes next. If I regret anything about this monitor it's the fact that I didn't buy it earlier. If you're still considering getting one of these, go with either and you won't be disappointed.
Man, I had a Gigabyte Auros 27" 1440p144hz IPS before I got the DW... It's night and day. It makes that monitor look like ancient tech. Using anyone else's gaming setup is going to disappoint me from here on out, all I'll be able to see is the lack of detail compared to this one.
Awesome video and advice. I've ordered the DWF! Cannot wait! Sending back the Samsung Odyssey Neo G7 for it. Was far too curvey and not OLED! Keep up the great work!
how is the dwf?
Seeing this video for the first time. Just bought the DW. It was $200(CAD) less than the DWF so I regret nothing. I’m not too upset about firmware upgrades. I’ve been using my current LG monitor for 3 years and have never even thought about looking at its firmware. I’m not a power user so I just need something that looks pretty when gaming and watching movies. Very happy with my purchase and won’t be upset about new tech that comes out because I won’t be monitor shopping now for another few years 🤙🏻
Yep. The Samsung CRG9 has a lot of “early adopter woes” since it was one of the first 21:9 monitors. Many bugs and things that just don’t work. They’ll never fix it. They’ve moved on to the Odyssey etc.
Not a fan of Samsung, terrible QA, known for making their own products obsolete earlier then they should be by way of replacement or non-existent software support.
I've got an odyssey, but it's one of the first ones so it's bugs won't be patched because they're focused on the new odyssey refreshes :)
Won't be buying samsung monitors anymore, no matter how much you pay some part of the experience will feel cheap.
@@Suilujx Yep i owned the Odyssey G9 for about 2 weeks before i returned it due to QA issues for a refund. Samsung always make cutting edge, high spec but very poorly built / QA'd products. Been burned with Samsung products a few times so now i actively avoid them.
@@SecretOfMonkeyIsland784 samsungs ssd's, vram etc anything memory/storage related is always solid. with monitors they just throw in better specs every year without designin things to actually last
I've got an old dell monitor u27 something. It got some oxifdation issues, but I fixed it. I bought samsung in the meanwhile, and I also chose the best from 600 bucks range. As I fixed dell, side by side samsung has so much worse colors, even the creosshair in games is barely visible. Such a cheap experience with it, I hate samsung as a brand. Bad commutation, loses signal from time to time, the only plus it's big 32 inches, and blacks for movies are better. I switched to old dell, as it's so much better in colors. I would not recommend samsung monitors as well.
I’ve owned mine for a few months now and I can honestly say there is no comparison to lcd monitors for me. Saving a couple hundred bucks by waiting months to see if the price would come down is simply not worth the wait. My enjoyment time is far more valuable.
I am watching this right now, with the DW. I got it and installed it this afternoon and when I saw this I had a sense of dread that I should have gotten the DWF but this thing is beautiful and it was on sale for a steal. It is gorgeous and I am using colour calibration on it for professional reasons anyway.
How much did you get it for?
@@Mushroomhaus0001 :) half price. I'm in Australia and usually it sell for 2000 AUD I got it for 950 including shipping. Don't know why. Don't care.
I'm actually thinking about returning the DWF and get the DW. I'm using 3080 and for the 2 main games I play the FPS does drop to below 60 sometimes (especially in Cyberpunk). So I am very curious if the g-sync module would help here
@@yushengcen4656 cyberpunk is just hard to run its not the monitor
@@yushengcen4656 rtx 4090 and the DW display with every setting at ultra/psycho (ray tracing max too) and it hovers around 60fps in real time man but saying that i dont notice any tearing
I also bought the AW3423DW and im still absolutely stunned by dark loading screens with small symbols and text floating in the abyss. Im happy the new version is less expensive making the technology more accessable to everyone. You cant complain about more features for less money. Also im not into changing settings that much making me not bother about the firmware at all. Since it was set up i cant stop myself from freaking out every time something is literally floating in the air when its dark and i cant make out the edge of the screen.
3:07, wrong approach, you can compare 60/70 to 165/192.5, because you know, percentages. If im not mistaken, those two are comparable in terms of gain, barely noticable.
Also, uncapped framerate for lowest latency? Im pretty sure i've seen guys prove that the best you can do is to cap your framerate with ingame limiters to be a bit below your refresh rate, so for example 144hz with 140 cap. And this also varied between GPUs and displays.
thank you to all early adopters for making the sacrifice so that us plebs may benefit later.
Same my dwf gets here tomorrow
Thought about buying this a couple days ago but ultimately decided that I wanted to see the other brands' QD offerings first (as well as a solid review like this one) before I dropped over $1k.
All QD oled monitors and TVs are gonna be pretty similar as it is, Samsung only makes 1 model of mother glass at the moment for it. So they're all gonna be pretty similar to the point most differences will be on a spec sheet or small written review.
Exactly the same as me, i'm waiting to see if any of the new ones will be 4k and have DP 2.0 and HDMI 2.1. Once one of these has that, i'm diving in. Can't wait for 2023
@@daymianhogue1634 Totally, mostly just hoping the competition results in a good option coming around for sub $1k. If not, the DWF here seems like a solid option.
@@Beokabatuka I dont think sub 1k is coming there's like 4 other QDoleds announced and all of them are as expensive or more expensive than the DW. The Dwf is likely to be the "budget" option for qdoled/oled monitors for atleast a little while. The panels are made by Samsung and even Samsungs announced qdoled is more expensive. So Dell/Alienware must've gotten a very good deal from them, that's gotta be hard to beat.
Its rteally incredible howw much monitors have progressed because of the competition. Imagine if we had the same with graphics cards
sorry but theres hardly any competition, this monitor was born because of the lack of competition, the monitor market is literally reached 5 years behind the tv market, to which that is the monitor markets competition, all the fald displays released are bugger all competition as well.. the pg27uq released years ago and is better than the buggy falds they're selling at still high af prices today. sorry but I'm sick of people like you only just entering the monitor community recently and claiming competition
@@StevoHDA we have 150€ 144hz displays today
@@hytalefanboi7471 with what? 6ms plus gtg response times? 800:1 or less contrast ratio?
Graphics cards constantly get faster whilst CPUs are slowpokes in comparison. SSD storage space grows slowly too.
@@StevoHDA You’re an idiot if you think monitor competition doesn’t exist. Anybody who has been playing PC games for the last decade can see that. This isn’t some exclusive knowledge that only you or a select few have…. These monitors have the same display tech that TVs do. QD-OLED is literally cut from the same sheet. So instead of rambling on all that garbage you just spouted, how about some actual examples to prove your point?
11:10 please can Linus lab include a number from something regular like in this case a led panel. this help understand how good it is, As i got no idea of a average led panel time otherwise.. Like for testing the 4080 a graph of the 2060 or one the most used gpu
Been there done that, with every Microsoft product from Lumia 920 (dust in the front camera), to xbox one (with the huge power brick), to surface book ( don’t let me start on this). Still never regretted it, happy to be from the first people trying cool products ☺️
I bought into LG Oled as an early adopter and I'm still quite happy. The tech just gets better and I do look forward to upgrading to a matured QD-Oled panel in a few years but the major features and advantages are still present with the earlier LG models. Personally I really like G-Sync built into the panel. Running a 3080ti at 4k in HDR still doesn't get you 60 frames consistently in all games, especially if your running mods. I wouldn't have an issue with dropping the Extreme version (what is it anyways) but G-Sync is significantly better than Freesync at 4k. With 40-series cards we might actually start seeing these techs disappear, replaced by new DLSS tech on the die rather than tech in the monitor.
The difference between a modern LCD and W-OLED is like the difference between W-OLED and QD-OLED.
W-OLED looks incredible, don't get me wrong. The blacks are the biggest advantage. However, LCD largely caught up on colors. Quantum dots bring the OLED blacks with even better colors. LCD and W-OLED have narrow blue spectra but wider and more inaccurate red and green spectra. Quantum dots produce extremely narrow spectra for red, green, and blue making the colors look more accurate and vibrant in a way that probably isn't captured by the percentage of DCI-P3 color space listed on the calibration paper. Plus there are no color filters and essentially zero loss of energy so the panel runs at a lower voltage and temperature, significantly extending lifespan.
It is a really big difference. I planned to get a W-OLED, but then held off when Samsung announced QD-OLED and it was worth the wait. You should see one for yourself.
So does source tone mapping enabled on the DWF fix the HDR1000 mode and make it enjoyable to use?
Exactly my question...
1 week of reading all around the web and haven’t find an answer to this.
Linustechtips channel in a nutshell: make an outrageous clickbait title and then proceed to confute it and prove it wrong during the course of the video.
I am getting excited lots of big upgrades coming. I got the legendary acer predator xb271hu almost 7 years ago and it still has some of the best contrast on the market. QD OLED is coming soon so I might have a good enough reason to upgrade, lots of new GPUs coming, used market is becoming acceptable again.
still got my acer predator xb271hu ips, using it in portrait mode next to the DW and C1 :)
I bought the DW not long after it came out and I don't regret it in the slightest. It was the first time in my life I had gotten a job and saved up money and I spent it to buy myself an amazing monitor and build myself the best computer I've ever owned and I don't think anything else will ever be as satisfying.
Ive had this monitor in my wishlist since that video. But in that video you did mention that this was the first of its kind, and that waiting should mean competition and revisions and price drops. Now, im not waiting for that reason, I'm waiting because that monitor is not in my budget and wont be any time soon, but glad to see this happening. Hoping by the time I do want to buy this, we might be another version or two in, and maybe Samsung or someone else will attempt to compete as well. the new version not being white is a huuuuge plus haha.
Is it possible that the DWF could have a firmware update to allow it to better improve the calibration (or provide the options) to match the DW?
No, calibration is done with external tools (to see what the colours are in real life and calibrate it with a screen’s software)
DWF came in yesterday. It cost twice as much as my last monitor. It looks infinitely better than my last monitor.
It’s seriously f#cking stupid how good this looks.
My only complaint is that it’s not 38 inches. (That’s what she said.)
I mean it would kill her
4:15 You do not get above 165fps minimum in a lot of games at 3440x1440 even on a 4090. The VRR is very welcome. Am I missing something here?
4:21 I own an RTX 3080 Ti with a 1440p monitor and I cap my frame rate to 143 FPS for almost everything - if I were to play Fortnite, League, Rocket League, CS;GO etc I would leave uncapped. But even for World of Tanks and Warships, 143 FPS is already a ton, and more than I really need, so I save power and screen tearing from ever happening. The NVCP cap is really good with latency and reflex makes better still.
also in Control, RDR2, Metro Enhanced, etc I BARELY ever hit my frame cap, and it is usually WITH DLSS Quality enabled - as most games look better in most ways with DLSS on instead of off. Hell I use DLDSR on WoT and WoWS for better clarity, at 2.25x - which is 4k native.
When did he start looking like the guy who lives in his mom's basement and watches kids at the park?
Seriously. Wash your hair, dude.