From a purely gaming perspective it's a huge upgrade. The extra detail in a 4K image is hard to perceive with a moving camera, and you'll actually get more perceived detail from a higher refresh rate. The lower resolution gives you a smoother gaming experience and lets you leverage that higher refresh rate more often, and the wider field of view will be a huge plus, especially since you didn't care about the extra vertical height.
@@ultrawidetechchannel I tried a 34in ultrawide and it felt too small compared to a 32in 3840x2160. I was wrong about the height, because I had set my monitor too high. I think a 40in 3840x1600 would be golden for me.
@@domagojoinky8262 Agreed, this would be the ideal screen size and resolution. Sadly there don't seem to be any gaming displays with 140Hz+ refresh at this spec (certainly not OLED ones).
I have a 3440x1440 OLED and an LG 4K OLED, and I used to have an IPS 3440x1440. I could always tell the difference between my TV and the IPS, but when I got the 3440x1440 OLED I could barely tell the difference. I think the 4K visually does have the edge, but it's slim and isn't worth the price increase and performance decrease.
Contrast does so much for perceived image sharpness and detail. I saw a video with a blind comparison between an IPS 8K TV and a 4K OLED, both 65 inches, and the testers all thought the OLED was the higher resolution of the two.
@@ultrawidetechchannel Yes, to an almost magical degree. To add to my comparison: I was 50cm away from my 3440x1440, versus around 8 feet away from my 4K LG TV. The reason I love this so much is that I know, at least for myself, that I don't need to chase resolution upgrades anymore.
@@mattfm101 I wish I could feel the same way. Logically I know that the 3440x1440 OLEDs are the sweet spot for gaming, letting you have good resolution and high immersion without sacrificing high refresh rates, but I can't help but lust after the rumored 45in 5120x2160 240Hz OLED.
I got a 42in Asus OLED that can easily go ultrawide with black bars at a smaller size = best of both worlds, but personally I still use 4K most of the time. I have a 4090, so performance isn't a big concern.
I agree that 3440x1440 is right now the sweet-spot resolution for gaming. It would be nice if consoles started supporting ultrawide, but they are so TV-focused that I don't see that happening.
If you're getting 90-110, that puts you slower than both the 4080 and the 7900 XTX. Are you sure you're not using DLAA or something else that increases the rendering difficulty but makes the visuals look better? Or maybe some of the ray tracing features?
Yes, even ones that turn DLSS or FSR on by default. Otherwise the 4090 will be more likely to be CPU-bound at the lower resolution and won't reflect as accurately the performance difference for someone going for max visuals using DLAA.
@@arl.v39 There is no 1920x1080 in this video; it's exclusively comparing 4K (3840x2160) to 3440x1440. For reference, 4K is 4x the pixel count of 1080p, and the ultrawide is about 2.39x.
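If you want to sanity-check those ratios yourself, the arithmetic is a few lines (a plain Python sketch; the resolution labels are just for readability):

```python
# Pixel counts for the resolutions discussed, and their ratio vs 1080p.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "ultrawide (3440x1440)": 3440 * 1440,
    "4K (3840x2160)": 3840 * 2160,
}

base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x vs 1080p")
# 4K works out to exactly 4.00x 1080p; the ultrawide to about 2.39x.
```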
I understand the data on the graphs, but I will say I was hoping there would be a bigger boost when going from 4K to WQHD. Similar frame rates make me assume similar power draw, for what is in actuality 67% less detail, in a way? Man, I'm still not sure what I wanna get lol
A perfect pair; the AW3423DW has just the right amount of resolution and refresh rate to fully take advantage of the 4090 without leaving anything on the table.
@@ultrawidetechchannel If you are a brokie and just have a 3080, get a 1440p OLED UW. If you have a 4090, get a 4K PHOLED 32" 16:9. It's as simple as that.
Once you go ultrawide, you don't go back. I love it for media and RPGs. Unless you are a die-hard FPS gamer, it's a great experience for gaming.
Agreed. I'd like to go higher than 3440x1440 but I don't want to give up the ultrawide aspect ratio. There are ultrawides that go higher but they are ridiculously expensive.
BS, I had multiple ultrawide monitors, including the AW3423DW, and I went back to 16:9 with the AW3225QF.
PPI must suck on ultrawides
I fully agree. Ever since going ultrawide I've only been looking at the next better ultrawide monitor. Never even considering a standard aspect ratio.
A 27in 1440p monitor has the same PPI as a 34in ultrawide. A 32in 4K monitor has the same PPI as a 40in 5K by 2K ultrawide.
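For anyone checking, those PPI figures follow from the standard diagonal formula (a small Python sketch; the sizes are nominal diagonals):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p:     {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'34" 3440x1440: {ppi(3440, 1440, 34):.0f} PPI')  # ~110
print(f'32" 4K:        {ppi(3840, 2160, 32):.0f} PPI')  # ~138
print(f'40" 5120x2160: {ppi(5120, 2160, 40):.0f} PPI')  # ~139
```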
I gave up after five minutes; these graphs are extremely unintuitive. Graphs are supposed to make it easier to understand data; these do the opposite. Also, don't use white text on light, colorful backgrounds; many people won't be able to read it.
I have had a few complaints about the contrast and have increased it in later similar videos, along with new, clearer audio descriptions.
He doesn't work for you lol
You don't have to watch, or at minimum, if you're going to critique, it's good to be nice.
The graphs are extremely high quality and must have taken a very long time to be produced. They take some time to understand, but that's expected of something of that level. Just stop being lazy and take 5 minutes to study something before saying it's bad.
Graphs are 100% readable, full of useful info and understandable. It's fully on you. Be nice.
If you don't understand these graphs then that is most definitely a skill issue lol. We always need people flipping burgers, so we appreciate that!
Very interesting comparison! Indeed the factors are not just number of pixels, but what those pixels are used for. Since most 3D games maintain a 1st- or 3rd-person perspective at ground level and orientation, more width would tend to mean more objects in the viewport (assuming same FoV as 4K), less occlusion optimization, and exponentially more items in the distance (for LoD optimization). All of this results in more triangles for the GPU and more work for the CPU in calculating positions, movement, collisions, etc. The actual composition of the game world (how open are the spaces, how cluttered it is vertically) could lead to vast differences in that gap between games. Of course, for many games widescreen is an afterthought and no attention was given to optimizing level/world design for it, which would also have a large effect.
A cool project could be to graph the comparative decrease in performance for 1) increasing pixel count while maintaining the standard 16:9 aspect ratio versus 2) increasing pixel count only horizontally. For your next video. ;)
Almost all the things you mention are bang-on reasons why scaling isn't linear, except one: "not optimized for ultrawide" doesn't seem to be a valid reason (vs 21:9) for 4K outperforming linear scaling. When I did 1440p vs ultrawide, the ultrawide resolution scaled much better than 1:1 against the 16:9 1440p monitor. Some performance-affecting metrics just aren't influenced by pixel count, and scaling is better than 1:1 regardless of aspect ratio. 4K is not four times as hard to drive as 1080p, for example (path tracing may get you close, though, since the number of RT cores will be 99% of the limiting factor).
Though with that said, super ultrawide does cause some anomalies, because it is showing a 50% different image than the 16:9 monitors.
Guy picks gray, white, and slightly-off white for reference... dude, what, red and blue were not an option?
The gray and white aren't the really important info; it's the colored bars that appear later, and I don't want the background data distracting from the important info.
You've really muddied the data in your charts. We have one bar doing way too much. The colors need to be more distinct between the UW and 4K. I'm not sure why we're interjecting an "expected drop"; yes, we expect a drop, but as you state at the beginning of the video, the change won't be linear.
If I just pay attention to the UW and 4K numbers it's a good video.
The white and gray bars are just background data for the important thing we are measuring: the actual performance difference, which is represented by the colored bars. They are meant not to draw focus, which is why they aren't in more contrasting colors.
Most people expect performance to scale linearly with resolution, so that is there as a guide to how things differ from expectations, and it highlights when things go really wrong, like when you run out of memory on an 8GB card and actually underperform the expectation.
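For what it's worth, that "expected" bar is simple to reproduce: assuming performance scaled linearly with pixel count (which the video shows it doesn't), the expected 4K frame rate from a measured ultrawide result would be (a sketch; the 100fps input is a made-up example number):

```python
UW_PIXELS = 3440 * 1440   # 4,953,600
K4_PIXELS = 3840 * 2160   # 8,294,400

def expected_4k_fps(uw_fps: float) -> float:
    """Expected 4K fps if performance scaled linearly with pixel count."""
    return uw_fps * UW_PIXELS / K4_PIXELS

measured_uw = 100.0                  # hypothetical ultrawide result
print(expected_4k_fps(measured_uw))  # ~59.7 fps "expected"; a real result
                                     # above this beats linear scaling
```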
@@ultrawidetechchannel
No, he's right. The graphs are an absolute mess. And the super pale grey compared to slightly darker pale grey is a poor choice too.
General rule with graphs is you should be able to just look and the data is clear. If it's not, then it's a bad graph.
These are bad graphs.
@@ultrawidetechchannel Just take the feedback, man. It's a great video but the graphs are a mess. Adding to that, they are only useful to look at for the first couple of seconds shown, since afterwards you add the percentage and you can't even see the 4K framerate anymore.
I agree, I was confused at first. Just make them different colors.
Would be nice to see 3840x1600 kept in the benchmarks, as it's a nice middle ground between 3440x1440 and 3840x2160.
These are really just one-to-one comparisons because they're easier to understand for most people. I still have the 3840x1600 resolution in all my other testing, and I may do a direct comparison to either 4K or 1440p, like I did for this one, in the future.
@@ultrawidetechchannel Thanks! :)
If you absolutely insist on having a QD-OLED panel and also want to do some work on it, unfortunately, the ultrawide option is not feasible (at least not for me). The text fringing is terrible on all current models (except for the 32-inch 4K models). Awesome video! Just subbed.
The perspective I was approaching was purely gaming-focused, but yeah, if you want a work-from-home monitor that you also game on, you might want to look at one of those new Dell IPS Black ultrawides.
I guess it's a case of different strokes for different folks. It took a little getting used to but I don't even notice the text fringing anymore.
I work in IT and I'm perfectly fine on my AW3423DWF QD-OLED; guess it depends on the person.
Nice vid...thanks for the work
My pleasure.
IMHO 3840x1600 is the best resolution! 4K is too pixel heavy and the image crispness is not worth the in-game FPS cost compared to the extra field of view you get on an ultra wide monitor.
I agree with you but the big problem is there's a distinct lack of modern monitors at that resolution.
The issue with 4K displays is there are literally three cards that can play games at native 4K at 60fps+, and not in all games, of course. With ray tracing enabled, even an expensive 4090 struggles to deliver 60fps at 4K without upscaling technology.
4K will be a tough resolution to run in visual-showcase games for a few more GPU generations as path tracing finds its way into more and more games. The 3440x1440 resolution should still be able to leverage the fastest cards of those generations without being wasted in high-fidelity titles.
So I have the DWF 34" and usually sit about 3ft away. I made the mistake of hooking up my PC (7900 XTX) to my family room 65" LG G2, and OMG, even sitting just 5ft away from the 65", the difference is massive! I just found myself exploring maps more in games because they looked so much more stunning. So now I may have to change my DWF 3440x1440 for a 32" 4K OLED.
You sit a bit far from your monitor and very close to your TV. I don't think it's the 4K; I think it's the proximity versus screen size. You should really pull your monitor about a foot closer to you, and most people would have a TV of that size around 12ft away.
I think you will be very disappointed if you buy that 32in and expect it to feel like your tv did.
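The proximity point can be made concrete with the visual-angle formula (a rough sketch; the screen widths are derived from the nominal diagonals and aspect ratios, so treat them as approximations):

```python
import math

def horizontal_angle_deg(screen_width_in: float, distance_in: float) -> float:
    """Horizontal angle the screen subtends at the viewer's eye."""
    return math.degrees(2 * math.atan(screen_width_in / (2 * distance_in)))

# A 34" 21:9 ultrawide is ~31.4" wide; a 65" 16:9 TV is ~56.7" wide.
print(horizontal_angle_deg(31.4, 36))   # 34" UW at 3 ft:  ~47 degrees
print(horizontal_angle_deg(56.7, 60))   # 65" TV at 5 ft:  ~51 degrees
print(horizontal_angle_deg(56.7, 144))  # 65" TV at 12 ft: ~22 degrees
```

So at 5ft the TV fills more of your view than the ultrawide at 3ft, but at a typical 12ft living-room distance it fills far less.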
Hey, can you help me? Is it a downgrade in quality? I have a 28" 4K 144Hz high-end IPS monitor and I'm planning to change to the G8 34 OLED. Is it worth it, or will I take a hit in picture quality? I have a 7900 XTX GPU.
Thank you, this is what I've been looking for.
Glad I could help.
My employer got me a 32 inch 4K 60Hz IPS monitor for design work about 5 years ago. I've always also used it for gaming. I'm currently using an RX 6700 XT and a Ryzen 5 7600. Most games run at 1440p 60Hz. Spending between $299 and $550, I could get a 27 inch 1440p 240Hz IPS, a 34 inch ~180Hz VA, or a 32 inch 4K 144Hz IPS. I'll likely upgrade to something equivalent to a 4070 Ti - 4080 next year. Is 32 inch 1440p a bit low resolution for design work?
I think you'll find a 32in 1440p monitor to be quite blurry and unsatisfactory even for general web browsing, much less content creation, especially coming from a 4K monitor. When I got my 5K by 2K 40 inch ultrawide for video editing, I checked multiple times that my 3440x1440 34 inch ultrawide wasn't set to a lower resolution than it should be, because it looked so blurry in comparison, and that has a much better PPI than a 32 inch 1440p monitor does.
@@ultrawidetechchannel 32 inch 4k gaming it is then. I like the MSI MAG 323UPF for its brightness and USB-C 90W PD and KVM so I can use my MacBook Pro and gaming PC on the same monitor. Also looking at the new, hopefully cheaper MSI MAG 322UPF and the slightly older Gigabyte M32U.
Pixel fillrate is very important with this. You might hear some people say that it's irrelevant, but it's actually one of the most important things in this matter. Cheaper, and especially older, cards will bottleneck at these high resolutions. That's why benchmarks on normal monitors don't matter: most cards will not get bottlenecked by fillrate at 1080p.
Nvidia is notoriously worse in this department. 3440x1440 has about 5 million pixels, so to reach 100fps the GPU needs to refresh half a billion pixels per second. Even a 4090 cannot do that, as it only does 393 million pixels per second. So if your game has more changing pixels and your pixel fillrate is far away from your target resolution, your fps will suffer.
That's also the reason why upscaling gives so much performance: it kind of goes around that by rendering the game at a lower resolution, and then other cores simply interpolate.
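Setting aside the specific hardware figures above, the underlying arithmetic is just resolution times frame rate. A quick sketch (note this is a lower bound on throughput, not a fillrate spec: real rendering shades each pixel several times per frame through overdraw and post-processing passes):

```python
# Back-of-envelope pixel throughput needed for a target resolution and fps.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

uw_100fps = pixels_per_second(3440, 1440, 100)
print(f"{uw_100fps / 1e6:.0f} million pixels/s")  # ~495 million, i.e. roughly
                                                  # half a billion per second
```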
What monitor would you recommend for a 3080 12GB? Also, I play a mix of all types of games, not really into a specific genre. (Also, I'm planning to upgrade to a 5080/5070 Ti when they are available, if reasonably priced.)
I am of the opinion that right now 3440x1440 is the optimal resolution for gaming: low enough that a high refresh rate is almost always on the table, but high enough and big enough that it vastly upgrades the experience from smaller 16:9 monitors.
Once I'm able to get the 5K2K OLED I'll be so happy. Best of three worlds: ultrawide, 2160p, and a fast, crisp panel.
It is the dream, but I'm afraid it's going to be at least 12 months out still :(
I wish it was just a side by side fps comparison lol
I'm currently on a 3840x2160 43" 60Hz TV. Do you recommend changing to a 34" 3440x1440 165Hz, considering I have an RTX 3080?
I mostly play at 4K using DLSS.
Changing to the ultrawide would allow you a much higher refresh rate than you could achieve on the 4K monitor, even if you're upscaling from the same native resolution. If you're worried about the size shrink, there are 3440x1440 39" models out there that will make that hit less noticeable.
Thanks for making the video. I was thinking of swapping my 34 ultrawide for a 32 4K monitor, and I shouldn't; my card and CPU would be destroyed LOL
Glad I could help
Do you recommend an ultrawide OLED 3440x1440 for an RTX 4090, or a 4K OLED 16:9?
Bro, what's that background noise? Just let them out of the basement already.
I live in an apartment and sadly can't control my neighbors, and I have very limited times of day I can record. I wish I could afford a studio with proper sound isolation, but that dream is a long way off.
I would greatly appreciate it if you could make a video discussing the impact of CPU performance on ultra-wide resolution. I was always very curious about how much the CPU bottlenecks games on ultra-wide, but no one ever provides those benchmarks. The only thing I know is that the CPU rarely ever matters at 4k, but it somewhat matters at 1440p 16:9.
Ultrawide lovin
That's what I'm about.
What do you think of a monitor: Xiaomi MI Curved 32" Ultra wide monitor?
I have owned and been happy with several Xiaomi products, but I have never owned a monitor of theirs. I assume you're referring to the Xiaomi Curved Gaming Monitor G34WQi; the price is very attractive for all the features it has. A quick Reddit search shows most users seem to be happy with it, and the ones that did have defective units were able to get them RMAed no problem, so it looks like a strong choice in that price range.
Sorry, I don't understand the graph. I thought you would show us how gaming looks on both monitors so it would help us decide which one to buy.
You deserve lots of subscribers :)
Not bad. I would still like to see the performance costs when DLSS is enabled in supported titles, and VRAM usage.
All of the cards featured here have their own dedicated videos that show DLSS/FSR Quality performance when ray tracing, so I do have that info available on my channel. I didn't use it in these videos because it would throw off the 4090 results by a lot and would affect the 4080 and 7900 XTX as well.
If you're using the ultrawide and getting more fps than you need on a 4090, then you can use things like DLAA to make the image quality pretty much match that of native 4K.
We know that a 34-inch ultrawide screen is more immersive for gamers and offers a better refresh rate for PC games. But the main question is: do we notice a real difference between an ultrawide and a 4K monitor? I mean, if the picture quality looks better on 4K, is it something like the big difference in picture quality between VA/IPS and OLED panels? Summarizing, panels: IPS > VA > Mini LED > OLED > QD-OLED; resolutions: HD > Full HD > QHD 1440p > UWQHD ultrawide > UHD > 4K 2160p.
Or... you can run your 4K monitor at 1440p for gaming and bump it up when you want. My 4K monitor at 1440p still looks better than my 1440p monitor, and the FPS cost is minimal. Thanks for your video, great job.
I currently have both the LG C2 42 as well as the LG 45GR95QE-B. The UW LG45 is better for PC gaming as it's easier to run while the LG 42 is better for console gaming and general media consumption.
That is pretty much the perfect gaming and media combo you have going on there.
Why choose when you can set your TV to ultrawide? Some modern TVs have this feature built in, but you can always set it manually on your PC.
Sure, if you have something as large as a TV on your desk you can get away with it, but if you're using a 32in or, god forbid, a 27in 4K monitor, the cropped image is just going to be so small that it's not worth giving up the extra screen real estate for the cinematic aspect ratio.
1440p all the time, all year long. I have a 1080p ultrawide matched with my old Nvidia 1080. Next will be a 4080 and a 1440p ultrawide upgrade. 4K is not worth the fps loss, and you will forever be upgrading if you wish to play newer games at 60fps 4K...
I am of the same opinion; 4K doesn't bring enough to the table at the moment to be worth the performance cost. Maybe in two to three more generations it'll be trivial, but today it's not.
@@ultrawidetechchannel Yip! I also have my 4K TV connected to my computer, with a controller, for the PS3 & Nintendo emulators I run in 4K :). Best of both worlds hehe.
All I wonder is whether there's a big graphical difference between an ultrawide and a normal 4K monitor. Just bought a 34in OLED ultrawide and am so happy with it, but I wonder if the other one has that much more quality. Great vid, thx btw boss.
In game the extra pixel density isn't as readily noticeable as it is on the desktop working with text. I have a 5120x2160 ultrawide that I test and edit on, and sadly am only occasionally able to game on, and while I can tell the difference between it and the 3440x1440 monitor above it, if I were to build a gaming machine from scratch today the monitor would be a 34" 3440x1440 OLED.
@@ultrawidetechchannel thx a lot boss, helped a lot
As a graphic designer, this video causes me great pain. You need to pick better-contrasting colors for the bars. Two white tones and a grey aren't working so well, especially when you're reviewing monitors. If someone's watching on a lower-quality screen, you can't even tell the difference.
Is going from a 32in 3840x2160 60Hz to a 34in 3440x1440 144Hz a downgrade or an upgrade? I would like a bit more width. 32in has too much height for me.
From a purely gaming perspective it's a huge upgrade. The extra detail of a 4K image is hard to perceive whenever the camera is moving, and you will actually get more perceived detail from having a higher refresh rate. The lower resolution will give you a smoother gaming experience and let you leverage that higher refresh rate more often, and the wider field of view will be a huge plus, especially since you didn't care for the extra vertical height.
@@ultrawidetechchannel I tried a 34in ultrawide and it feels too small compared to 32in 3840x2160. I was wrong about the height because I set my monitor too high. I think that 40in 3840x1600 will be golden for me.
@@domagojoinky8262 I don't know of any 40in 3840x1600 monitors but there are 38" ones.
@@domagojoinky8262 Agree, this would be the ideal screen size and resolution. Sadly there don't seem to be any gaming displays with 140Hz+ refresh at this spec (certainly not OLED ones).
I have a 3440x1440 OLED and an LG 4K OLED, and I used to have an IPS 3440x1440. I used to be able to tell the difference between my TV and the IPS, but when I got the 3440x1440 OLED I could barely tell the difference. I think the 4K visually does have the edge, but it's slim and not worth the price increase and performance decrease.
Contrast does so much for perceived image sharpness and detail. I saw a video with a blind comparison between an IPS 8K TV and a 4K OLED, both 65 inch, and the testers all thought the OLED was the higher resolution of the two.
@@ultrawidetechchannel Yes, to an almost magical degree. And to add to my comparison, I was 50cm away from my 3440x1440 versus around 8 feet away from my 4K LG TV.
The reason I love this so much is I know, at least for myself, that I don't need to chase resolution upgrades anymore.
@@mattfm101 I wish I could feel the same way. Logically I know that the 3440x1440 OLEDs are the sweet spot for gaming, giving you good resolution and high immersion without sacrificing high refresh rates, but I can't help but lust after the rumored 45in 5120x2160 240Hz OLED.
I have no idea what im looking at
I am still waiting for a QD-OLED 38" 3840x1600 240Hz. Then I will buy an ultrawide.
The info is good, I think, but it's hard to follow; the graphs seem overcrowded.
Got a 42in Asus OLED that can easily go ultrawide with black bars so it's effectively smaller = best of both worlds, but personally I still use 4K most of the time. I have a 4090, so performance isn't a big concern.
Once you go desk-destroyer size with your monitor, you can pull off things like that if you want to boost your FPS.
In my opinion the best resolution for gaming is 3440x1440 at above 120 FPS. Hope PlayStation sees this better than the PS5 Pro does.
I agree that 3440x1440 is the sweet spot resolution for gaming right now. It would be nice if consoles started supporting ultrawide, but they are so TV focused that I don't see that happening.
@@ultrawidetechchannel Yup, too bad you are right... more than 50% of console gamers play on monitors, but they're focused on TVs, I don't know why.
Can't say I've ever seen 194+ fps in Diablo 4 at 4K on my 4090 with a 7950x3d processor.
4K no DLSS 90-110 fps.
DLSS balanced I get maybe 165 -_-
If you're getting 90-110, that puts you slower than both the 4080 and the 7900 XTX. Are you sure you're not using DLAA or something else that increases the rendering difficulty but makes the visuals better looking? Or maybe some of the ray tracing features?
@@ultrawidetechchannel had to reinstall drivers.
I am getting 165+ with ultra and dlaa.
No ray tracing as that sits at 100-115
DLSS off in all games?
Yes, even ones that turn it or FSR on by default, because otherwise the 4090 would be more likely to be CPU bound at the lower resolution and wouldn't reflect as accurately the performance difference for someone going for max visuals using DLAA.
@@ultrawidetechchannel That's good news. I thought 3440x1440's FPS would be around 70% of 1920x1080's. Now this video shows it's under 65%. 😂
@@arl.v39 There is no 1920x1080 in this video; it exclusively compares 4K (3840x2160) to 3440x1440. For reference, 4K is 4x the resolution of 1080p and the ultrawide is 2.39x.
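For anyone who wants to double-check those ratios, they are pure pixel-count arithmetic on the resolutions named in the thread (nothing here comes from the video's benchmark data):

```python
# Pixel counts for the resolutions discussed in the thread.
resolutions = {
    "1080p": (1920, 1080),
    "ultrawide": (3440, 1440),
    "4K": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 4K renders exactly 4x the pixels of 1080p.
print(pixels["4K"] / pixels["1080p"])                    # 4.0

# 3440x1440 ultrawide vs 1080p.
print(round(pixels["ultrawide"] / pixels["1080p"], 2))   # 2.39

# Ultrawide as a share of 4K: ~0.6, i.e. ~40% fewer pixels.
print(round(pixels["ultrawide"] / pixels["4K"], 2))      # 0.6
```

As several comments above note, fps does not scale linearly with these ratios, since plenty of per-frame work (CPU, geometry, game logic) is independent of pixel count.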
I understand the data on the graphs, but I will say I was hoping there would be a bigger boost when going from 4K to UWQHD. Similar frame rates make me assume similar power draw, for what is in actuality around 40% fewer pixels. Man, I'm still not sure what I wanna get lol.
This video is exactly why I won't be getting a 4K screen for my 4090. I'm using the AW3423DW 3440x1440 OLED and it's excellent.
A perfect pair. The AW3423DW has just the right amount of resolution and refresh rate to fully take advantage of the 4090 without leaving anything on the table.
@@ultrawidetechchannel If you are a brokie and just have a 3080, get a 1440p OLED UW. If you have a 4090, get a 4K PHOLED 32" 16:9. It's as simple as that.
The graphs make no sense.
This makes me glad I play at 1080p haha
It certainly is cheaper to run.
quit talking with your hands
They move that way on their own and there's nothing I can do to stop them.