Ya me too . I recently upgraded from 3440x1440 to a 4K but it has a 3840x1600 mode which I use for racing and Witcher/Fallout type games. Best of both worlds and automatically detected in the driver so no futzing after switching. First video which actually gives a closer idea of where it falls between the 2. Thumbs up.
Same, I compare 4k to 1440p and pick the numbers in-between them for a rough guess it's worked well for the last 10 years I have had my old 2560x1080 (calculated from 1440p-1080p) and then 3440x1440p monitor, and if I get better performance all the better. I am a firm believer in "better to have it and not need it, than to need it and not have it" Who complains that they have performance left on the table? Unless of course the game runs like crap and all your hardware is snoozing, I have always had odd resolution monitors even back in the 90's I used to have to modify/hack games to get them to work on my CRT monitor, I guess it says something about me lol I upgraded to a 4070Ti Super from a RTX2080 because the 8GB VRAM was becoming a problem. Even DLSS couldn't fix that in a lot of the games. As i like to keep my graphics card for a while i didn't feel 12GB would be enough, and in some cases already 12GB can struggle in a few titles which isn't a good sign for the future...
@@Rafael_B Samsung Odyssey Neo G7 43". It had some reported issues when it first launched which i fortunately missed as I bought it early 2024 (for 1/2 MSRP). I applied the latest firmware right after hooking it up and am very happy with it. Watched a few YT videos on tweaking it for HDR which got me in the ballpark to tweak it further. Only running it @ 120 Hz as that's where 4:4:4 chroma support ends but my graphics card can't push it even this fast usually and I'm in need of a GPU upgrade now. Ah, I've opened the proverbial can of worms ...
You have a killer setup and I would agree that 3440x1440 is kind of the sweet spot resolution right now where almost no card is too powerful for the resolution, but a lot of cards can still work very well at that resolution. And it has some damn fine monitors available, whose prices aren't too outrageous.
I'm currently using a 27inch 1440p monitor, I'm giving serious consideration into getting a ultrawide Monitor. its a bit annoying as not all games support the 3440 resolution but there is usually a workaround. Question: could you ever go back to a 16:9 monitor after using a 21:9 ultrawide for so long?
@@Lennox032 Personally I'm not interested in going back. For a brief time there i was eyeballing one of those 42" LG tvs but then the AW3423DW OLED came out and I lost all interest in them.
@@Lennox032 I switched my main monitor LG 27" IPS 1440p 180hz to LG 34" IPS ultrawide 160hz about half a year ago, I would never go back. Having that extra space is simply amazing. Very minor downsides are cutscenes being 16:9 in games and most youtube content is 16:9 but there are extensions where you can toggle the aspect ratio / crop the video.
Thanks, I do beleive the ultrawide comunity will continue to grow it seems 9/10 ultrawide users never go back to standard wide monitors. And with so many new sizes, resolutions and price points in the ultrawide ecosystem will keep it going .
Once you go ultrawide you can never go back ......... For the first time I can play First Person without feeling like I have horse blinders on wrecking my peripheral vision and Third Person games feel even more cinematic than ever .... The larger problem is the idiot game developers that never do their cutscenes in ultrawide meaning I usually have to go into the game's EXE with a Hex Editor and hack it so the cutscenes are ultrawide too .... Sure it's simple enough to do but that's all the more reason it's stupid that the game developers don't do with with a few simple lines of code.
Thanks a lot for this video. I am currently investigating buying a new graphics card for my 3440x1440 screen, but most fps examples I find for 1440 and this helps to understand where I would roughly end up. Initially I was doing the same mistake, by multiplying the fps with a factor based on the resolution quotes.
I have both individual and head to head videos for all the graphics cards for the resolution you're interested in. They cover off both averages and 1% low performance as well as show DLSS and FSR performance for ray tracing.
What I have observed trough numerous reviews online trough the last few years, you need approx. 73FPS at 1440p to have 60FPS in 1440p UW. Interestingly enough, that is almost exactly reflecting your findings :D
Ya doubling your resolution like that will definitely give you a hard time in cyberpunk. it really hurts you for every pixel added especially when ray tracing.
Very informative, i noticed this while playing Star citizen in 3840x1600, and while using all 3 of my screens for a 6000x1600 resolution, i saw only a small drop in frames, 6800xt and 5900x 32gb ram. Unfortunately the screens on the side are not freesync so i get some tearing and not full screen on them because of their resolution.
I'll be doing a 4k vs super ultrawide video next and I'm very currious if ill get a lower margin than the 12.5% resolution scale as you would expect when just scaling up or if the nature of the super ultrawide showing a 50% different sceen will make it actuly scale better than the 12.5% like you were seeing using a multimonitor display.
I've done various triple monitor setups for the last 15yrs. AMD used to have far more advanced driver support for triple monitors. Back during the AMD 5850 1gb GPU's days of 2009 I had triple 1080p 60hz matching monitors and it worked great. I then tried an Ultrawide 1080p in the middle with two similar bezel matching LG 16:9 1080p monitors on the sides. I never could get away from tearing or a proper combined resolution with black bars. I now have on my wife's PC combined three 27" 1080p 240hz LG's all on Display Port and even though one of them isn't the same model they all play nice together due to the matching resolution. They all run 240hz with VRR working correctly on my 6800XT. I even have a 4th 27" 1080p 144hz installed as the center top monitor. The default configuration tool in AMD driver doesn't allow for proper config always placing the 4th monitor in a quad wide setup instead of being a standalone. What I learned from a YT video is AMD has this eyefinity monitor configuration application inside the AMD driver folder that you can run which actually did provide a bit more config options. I likely won't fix your mix matched resolution issues with black bars but it did allow me to configure my setup properly. I have to select 3x1 and then pick which of the 4 monitors I wanted to setup. It then set the 4th monitor as a stand alone and it properly now positions itself on top of the combined triple setup. The built in config application is called EYEFINITYPRO.EXE and can be found here C:/program files/AMD/CNEXT/CNEXT/eyefinitypro.exe.
I love PLP setups. Used to have HP 20-30-20 PLP back in the day. Now also 3840x1600 but still interested in side monitors. What 1600p monitors do you use for the portrait monitors?
This is why I also factor in Panel size + Seating distance into my performance budget. It saves me a lot. My usual seating distance of 2.5 - 3 feet away allows me to use 2560 x 1080P @ 29" quite comfortably, including with the panel size. With a 3440 x 1440P monitor, I'd need to sit much closer, but lose peripheral vision benefits of having the full monitor in view without needing eye movement. I would also need to use 125% @ 3440 x 1440p windows scaling to see better. Cards like the 6950 XT ($499)/7900 GRE ($550) become much for viable for me versus needing $1000 cards, like 7900 XTX to satisfy the 4 million pixel budget of UW1440P. UW1080P is nice.
Went from 1920x1080 24inch to 2560x1080 29 inch and it was all i wanted. Performance Hit was there, but managable (i7 8700 + RX 6600). Upgraded to 3440x1440 in May this year and also upgraded to R7600 + RX 6800. It is wonderful, BUT. While 1440p is great and clear. 1080p UW does not look bad. It really is something to consider when not much budget and 29inch UW works wonderful next to a 24 inch 16:9. If you really really are drawn to UW, get it. Do not get it, if your System already struggles with 16:9 Resolution. But 1080p UW exists and it not that bad.
I broke my ultrwide 1080p this week within an hour of setting it up in a new spot. I have the 5800x3d and rx7600. I'm looking at the 1440 ultrawide moniters now. I'll be watching movies and all at 1440 but playing games at 1080p. How noticeable is the blurriness from downscaling while playing?
@@bliN.K_ray it won't go on sale as they stopped production...anyway for 1440p UW the 4080 is enough, pretty happy with my 4080 on 144hz 1440p ultrawide
While I don't have any hands on experience for a true budget monitor, the KOORUI 34 Inch Ultrawide 165HZ amzn.to/3Pl2I2M is getting some good press on several outlets and has a pretty satisfied amazon user score. It’s not IPS it’s using what I think is one of Samsung’s fast VA screens but those are very good panels. Here is a little bit more expensive LG option amzn.to/3veCNCP again haven’t used it myself but have had good experiences with my LG monitor and TVs and here is the Samsung G5 amzn.to/48TR1qt the quintessential midrange ultrawide.
If you just set the internal game resolution, while in full screen mode, to whatever resolution you want your recording to be, then it will record it at that resolution even if in the game it shows the screen all stretched out. The recording should be in standard 16:9
I would also like to see 3440x1440 vs 3840x2160. Watching reviews of whatever new hardware that comes out, Ive always estimated about half of the difference between 1440 results and 4k. I actually have no idea what the difference should be
@@ultrawidetechchannel One more question, is it possible to consider the power consumption of those cards for future videos like this? This way we can compare which resolution is more energy efficient or to have more data to compare in addition to the fps. Thanks again for all your work and dedication, you really are the only UA-cam channel creating content like this
@@lucashernandez922 I don't really have the equipment to properly collect that data and on the tiny bit of money I'm getting from UA-cam I’m not going to be able to invest in it for a while. Eventually I would love to be able to test everything but that is still a while down the road.
The only problem I have with ultra wide is the fact that now every lesser quality monitor I look at has poor quality in comparison. Once you’re playing in 5k everything else looks like shit
Well said, using the math in your video as well as the one tier above your own typical 1440p GPU of choice is a good decision maker. No need to overspend on some cards unless raytracing or other features are of importance, because from pure rasterization performance or even with some upscaling you could easily get 100+ in a lot of these titles. But would you still recommend the 7800XT though for 1440p ultrawide if you're going off by purely rasterization performance/FSR without raytracing and high/optimised settings instead of ultra? Or is it worth to spend the extra for the 7900XT in the long run? I still haven't seen many 7800XT ultrawide benchmarks out there despite it being months now since the cards release.
Sadly, I don't have the 7800XT so I can't say definitively but personally I would like to have the extra power of the 7900XT for the ultrawide resolution just so I know that high refresh rates are almost always going to be on the table if I feel like the game will really benefit from it. At standard 1440p i think the 7800XT would satisfy but for the ultrawide I would still want the 7900XT the same way that I recommend the 4070Ti over the 4070 at this resolution.
Use SAM (rebar) and the auto OC of adrenaline. Then it can do most games 70+fps on ultrawide. Its better than going for a more expensive card right now. Just save that upgrade money and upgrade in 5 years instead of 6 years.
so, ive got a 3440x1440p monitor paired with a 3070.. been seeing the limits of just 8gbs and not sure where to go from here, 4070TS/4080(S) or as much as i enjoy it, sell the monitor. Any Thoughts?
The 16GB on the 4070TiS and the 4080S will keep you safe at that resolution for quite a long while. Even if you were to go down to a 16:9 1440p monitor it wouldn't mean you would be safe with only 8GB of VRAM. Some newer games are still going to get you.
Thank you, this is very helpful! I myself has just upgraded to 4080 super and was seriously considering 4090. However, after trying out 4080s on my ultrawide AW3423dw I'm really happy I didn't pay extra for 4090 - everything runs great on perfect FPS and high graphics settings
@@yasink1995 i really like my OLED AW3423dw, so never thought of replacing it with a 4k screen. Plus a 4k oled would cost extra money and it will hit my fps
great test....been gaming on 3440x1440 since 2015 and loving it....will never go back to 16:9 was thinking of going to 3840x1600 when I upgraded to 4080 last year but in the end I stayed with 34" 3440x1440 the 34" is perfect for me, with 38" I had to move my head to look at the corners when I had it at comfortable distance to read the text (same distance as my 34" as they both have almost the same PPI so the same distance provides the same text readability at 100% scaling)
@@ultrawidetechchannel yeah well in 2024 there are much better options for pure productivity than 34" 3440x1440 but in 2015 that screen was a gamechanger ;o]] there were only few 4k monitors in existance in 2015 and all kinda small so 34" ultrawide was actually the productivity king ;o]] I think the 3840x1600 38" will see a second wind and with some time become what 3440x1440 34" is today, the common accessible gaming ultrawide 5K is too many pixels for a few more gpu generations to be widely adopted for gaming
I went to 34” a couple years ago but accidentally broke my monitor so had to go back to 27” for a whole, I hated it after trying ultrawide so now i’m back with a 34” monitor lol but finding benchmarks for it isn’t the easiest.
What I'd really love to know is how much the two side monitors I have next to my center 3440x1440p screen are hurting my gaming performance. I have a 27" 3840x2160p 60Hz monitor on the left, my center 35" 3440x1440p 100Hz screen in the middle and a 27" 2560x1440p (60Hz) on the right. I usually only have STEAM and Discord running on the side monitors so nothing that should be using a lot of GPU horsepower but still, would be interesting to know if I'm losing noticeable performance by having those side monitors on while gaming.
Honestly, that's my main hesitation. I know a lot or newer games are doing better with natively supporting it but I also play a lot of older games as well and I'm debating whether the additional hassle would be worth it to me or if I should just get a normal oled instead of a nice ultrawide
@@slayerdwarfify once you get used to it, it’s not that bad. But, it can be annoying when you want to play a game that just released, but doesn’t support it, and there’s no fix available for a couple days to a few weeks or possibly months. Or if there’s no support at all, and you’re forced to play the game with black bars. I hate that even more. Actually there’s even worse 😬 when there is no support, and instead of black bars, it’s white bars! That sucks
@@slayerdwarfifyjust use the ultra wide monitor with a 2560x1440p resolution if there is no other way the sides are just black Bars in this resolution but you get used to it to the point that you think you play on a non ultrawide
When i upgraded to a 1440p ultrawide, my 1070 was having a hard time driving total war warhammer series. Had to upgrade to a 2080ti to get a decent framerate. x70 series cards would be the bare minimum for getting good fps on UW1440P tho id recommend x80. Great content, i subscribed
Thanks for the sub. Yeah, the 4070 series at that resolution is providing you with good performance today but likely not so good performance in a few years. Where is the 4080 will give you amazing performance now and good performance in a couple years.
Noob here with monitors. Can I change the resolution down to 2560 if needed so that there is black bars on the sides but can increase the FPS? I also have a Sony 50" OLED 60hz and my 6800 has had no problems keeping up with 60hz. However I just need something for programing and the occasional RTS / FPS games since my PC has been used mainly for iRacing and AC. Thanks.
I could be wrong but it seems the risk of burn in would be greater if the monitor was running in a “fake” 16:9 ratio rather than its native 21:9. As in the pixels on the edges would receive less wear than the ones being used in the middle. Proper pixel refreshes could likely combat this, all depends on how many hrs you use it at a time plus if you’re doing a lot of static image work
If you plan to work as much as you play I would look for an LCD based Ultrawide, most games that are 16:9 only will default to a letterboxed 2560x1440 if you want to force a game to be 16:9 that supports ultrawide it is doable though i would just use DLSS if i wanted to increase my fps rather than cutting down 34% of my screen real-estate.
@@JonoLB Thanks for response. Oh sorry I would be using my LG 34GP950G-B mainly for programming, browsers, linux shell, docs, and lite photoshop editing. This will be done on a Macbook pro. MS Windows with AMD 7800X3d and 6800 GPU will be used mainly for gaming and lite audio editing/ DJing until I get a dedicated machine for DJing. Likely a m1 or m2. Still not wanting to pay too much for a SOC from Apple that can't be upgraded... :) OLED is only used for sim racing and watching movies. Response time is amazing for only 60hz.. But I'm not an expert with Monitors. I've been using my Macbook for work with old retina screen and cheap 24inch 1080p monitor. So wanted to give myself a better monitor to help optimize my work space and also utilize my PC for other lite gaming such as FIFA. Thanks for your help.
This video makes me hesitating between OLED 3440x1440p and 4K a lot …. If my performance will decrease by switching on ultrawide( even a little) would it be smarter to move to 4K 32’ for sharper and crisp image ? Nice video btw 😊
I'm not quite clear on what your saying, but 4K (3840x2160) would be significantly harder to drive than Ultrawide (3440x1440). In game the sharpness advantage of 4K will be less noticable than the frame rate drop will be, but on the desktop and in other apps the the sharpness advantage will be clearly noticable.
I’m my experience, there isn’t a massive visual difference between the image clarity of my 3440x1440 Alienware oled and my 4K tv. What made the biggest difference is the extra width for immersion and going from ips to oled. I used to always lug my computer upstairs to play games in 4K, but since the ultra wide, I never do. Because I much much prefer the aspect ratio and the image clarity is super high fidelity. Difference is nil
@@ghulcaster I bought a MSI Qd Oled 3440x1440p and yes I full agree, the ultrawide ratio is awesome and I don’t regret my purchase I can notice less clarity in games because I just love pause and looking the landscape but it doesn’t bother me a lot ( except for games with bad TAA). When the games is a little bit blurry you can use the NVIDIA filter and enjoy the ultrawide clarity :)
Even with DLSS on, I'm having trouble playing big games like The Witcher 3 on my Dell G15 with RTX 4050 on an LG 34" monitor. I can't get past 45 fps, is that normal?
Support the creators who natively support ultrawide and don't support those who won't do it. Otherwise, use mods if you're willing to go for the hassle that shouldn't even exist. In the end it doesn't even matter.
3440×1440 is amazing. My GTX1080 does really well even on newer games. I can run medium to high graphics on Forza Horizon 5 and average between 50-90FPS I have a 144hz monitor and although I dont really ever hit it in games 8 years old or newer, I rarely ever get anything below 90fps and its perfect as is
Actually it does scale fairly linearly with resolution. It's just not a pure 1:1 scaling, It's more like 0.9:1 versus the pixel count increase (which comes to about a 22% difference, down from 34%). Which is to be expected given that not ALL of the graphical work is granular, some of it has more vector-like properties. I wasn't seeing "half" of the projected performance drop as you say in summary. Ray tracing has more CPU involvement... which can somewhat skew graphics card comparisons. But for your purposes here, I don't think it would skew the effect you're trying to show. A big part of why you're not seeing a 1:1 linear decrease is that 1440p ultrawide is using regular 1440p textures (just in somewhat larger quantity). When you go from 1080 to 1440, or 1440 to 4k, you're dealing with a lot more detailed texture work. The ultrawide steps in resolution skip that, which saves some performance. But even full-rez jumps aren't quite 1:1 (but they can get pretty close to 1:1) I don't think it has much to do with what's displaying in the center versus the edges of the screen though. Unless you're in games with detailed players & enemies, but rather un-detailed settings (of which there are several popular titles, admittedly). In games with detailed settings, there's not going to be any benefit there though. Still, you're looking at 25-40% OF 34%... so a 1:1 guestimate is off by what? 8-13%? (in the ultimate FPS) ...not sure how often that's going to make a significant difference in video card choice. It's a bigger difference than an overclock, but it's not a huge difference. I dunno. Generally I advise against buying the bare minimum card for your needs... there's no bigger waste of money than getting a card which is "almost good enough". On the other hand, if you overshoot a bit, you have some insulation against obsolescence. So it's not really a waste unless you overshoot by a lot.
I don't think you can call it scaling linearly even at a lower ratio because the game to game variability can sometimes be pretty big. If it was linear at a lower ratio then most games would see a similar loss. The end average discounting the 2080 was 17.8% which is not quite half but not too far off, definitely closer to half than 75% so I'm ok with that fact that I said "about half". As for it being better to over shoot, well it depends where you are in the stack because if you over shot from a 4080/7900XTX to a 4090 because you though you needed 34.4% more performance rather than 20%/16% more than that is a $1000 over spend (at the moment) which is the cost of a nice OLED Ultrawide monitor these days.
@@ultrawidetechchannel Most "steps" in gpu tier are no more than $200. 4090 is a bad example as it's kind of in an ultra-tier of it's own. Personally, I've under-estimated gpu needs, and it was quite a waste of money to do so. Twice actually. Meanwhile I have also over-estimated gpu needs (1080 Ti), which at 1440p allowed me to skip 2 generations of gpu's (rather than the usual one generation), and it _more_ than paid for the up front premium on cost, while offering really 'above & beyond' performance early in it's 6 year use cycle. You can't know for certain how well a card will perform on your specific machine, particularly with software applications which haven't even been released yet ..."a bit of padding is good". For example, right now 8GB has problems in 1440p in numerous game titles. And it usually shows up in benchmarks. But some titles are starting to ask for more than 12GB. Not very many ...yet. But personally, I would lean strongly towards 16GB as a buffer against this game requirements trend. It seems like nvidia is leaning into ram limitations as a method of built-in obsolescence in their more mid-tier cards. You make a valid point. It's not a 34% difference. More like 20-28%. I think i was more distracted by the "barely enough is good" thinking. I just look at it differently. To me "barely enough" is "almost obsolete".
@@kathrynck Where you’re upgrading from in the stack does have a big effect on pain of the price jump. Before the 4080 Super the price jump from the Ti to the 4080 was pretty ugly at $400 a 50% increase in price. While the 7900XTX did help out for those looking for only a raster boost it did nothing for those looking to increase RT performance. All that said the true solution to the problem is better data, and I am doing my damndest to provide that for the Ultrawide Community. I might be far from perfect but as far as I can tell I’m the best the community has and trust me I have looked trying to find someone who was doing this so I could follow them and watch their reviews. But that person didn’t seem to exist, so I set out to rectify the problem myself. I’m doing what I can with the resources I have to help ultrawide users make better decisions.
I've been using a 5120x1440 super ultra wide with my rtx 3080 and I feel like it seems the shit out of my card and it's over heating the monitor cables in the back amdy monitor black screens (not actual black screen). I down scaled to a 32" 1440 monitor hopen it fixed my issue. You have any insight?
Your 3080 is not strong enough for that ultra wide. You are basically playing games at higher resolution than 4K with that monitor. If you play on standard 1440p monitor you see huge boost in performance and lower heat dissipation.
A month ago I weighted the idea of changing my RTX 2060 for a RX 7700. But, I considered I would not notice any significant difference since I had a 1080p 60hz monitor. Very basic. The RTX 2060 already allowed me to play 98% of my games at ultra settings and achieving 60 fps. an RX 7700 would have not made any difference. my real bottleneck was my monitor. So I went ultrawide 1440p 160hz. In the case of Helldivers 2, which I played at high settings, the change to 1440p took a heavy hit on the poor 6VRAM of my gpu, so I had to downscale the settings and use upscaling to barely achieve 60 fps. But in the end, you are wondering, was it worth it?. HELL YES. I CAN SEE MORE!
I run an ultrawide 1080P monitor and I was considering a 1440P one to pair up with my 5800X3D and 7800 XT. And have been wondering how much of a drag ultra wide 1440P will be on that. But I also typically run games with FSR (frame generation it available) and no ray tracing at high to ultra settings. Now I may be able to make a more educated decision.
I have a Westinghouse 34" ultrawide being pushed by a 5800x3d and 6900xt. The set up workes pretty well. Cyberpunk runs at around 65 fps with ultra settings and high RT settings.
I'm very hesitant on getting an ultra wide 3440x1440 but I'm getting a 4070 super. I'm not sure how performance will be especially when I want to lock at 100 fps
It is also worth to mention than in ultrawide you can activate vrs without really notice it, increasing even more the performance with basically no quality difference.
Depends on the game and if you are willing to use DLSS or FSR upscaling and frame gen. If you can get to about 90fps with upscaling then frame gen can take you the resto of the way to your monitors max refresh rate.
3840x1600 is a wonderful resolution that doesnt seem common enough with monitors. Those 45 ultrawides released in 2022-2023 with 3440x1440 resolution really should have been 3840x1600 instead.
The same applies to 4K and even 8K, increasing resolution does not increase load across all aspects of the render pipeline equally, even if you calculate pixels rendered per-second you're almost always getting the best value using a higher resolution display.
when it comes to gaming is ultra wide really worth all the hassle? having to worry about performance, paying more money for the monitor and whether the game properly supports the setting doesn't seem like a good trade-off just to get some more screen on the sides, which might even make the experience worse for many people. why not go from 1440p to 4k instead where you would still have to pay the extra money and need a better gpu but the visual benefits are more noticeable and you don't get the cringe wide screen that might cause nausea nor do u have to worry whether it will actually work for certain games?
I have a 34" 1440p ultrawide and I still haven't found a game made in the last 10 years that doesn't support this resolution. And even for older games that don't support it, there's usually a mod that makes it work (Skyrim for example). I personally like ultrawide because I play a lot of games where there's a lot of UI navigating (RTS, city builders, management, simulation). I no longer want a non-ultrawide because the games I play need a lot of info to be displayed at all times. And yes, I have tried multiple monitors. I barely used the second one while I always have 2 windows open on my ultrawide. Watching content is the real issue, yes, but I got used to the black bars when in fullscreen. Just like we got used to watching movies with the black bars on top and below. And I sit slightly more than an arm's length away from the screen so there's absolutely no risk of feeling sick.
@@metalface_villain Unfortunately I haven't. But I guess it wouldn't be incredibly different from what I currently do, as I would be able to place smaller windows comfortably on a 16:9 4k display as I do on my 21:9 1440p. Same goes for UI elements in games.
What are your thoughts on the effect of changing the field of view? Theoretically you should run a bigger FOV angle on an ultrawide. And in theory this could cause lower fps. From my tests and a little looking around on YT this performance impact is a lot less than I expected. So, if you didn't factor that into your test, I think you've been right to do so, but I would be interested in your thoughts on this.
I used just the base settings you get when setting a preset as most people will never touch the FOV slider in a game. Most games i don't bother with it at all I don't feel 21:9 is wide enough to need and adjustment, 32:9 depending on what kind of experience you want to get out of it may make FOV a necessary setting to play around with.
I’m going pc next year, selling my ps5 and Series X. Bought a Rog Ally and fell in love with the pc platform. My question is, can I play ultra wide on my 65 oled tv? Sorry the ignorance.
As @Dystopikachu said you can use your driver control panel to change the resolution to display an ultrawide image with black bars at top and bottom same as a movie shot is cinematic widescreen, though i would probably choose 3840x1600p for the resolution as that will not cause any blurriness as it will match your TV's resolution. Make sure you chose the 1:1 pixel setting so you don't just get your image stretched tall.
My brother has changed his monitor to a 34 inch ultrawide 1440p oled. In a few weeks I will build him a new pc and I am still hesitating whether to recommend him the 7900 xt or the 4070ti super. The 20gb of the 7900 xt seems to me a great option for a monitor like this but I know that the nvidia is a safe bet; dlss, rt, nvidia codec etc. What would you recommend me in this case?
I'm now running a 4060 on a 24" 1080p monitor......do you think I should get a 27" 1440p or a 32" ultrawide? I know I probably won't be able to play at 1440p, but I can still rely on DLSS and frame generation. What should I do?
I think a 4060 might struggle a bit too much with ultrawide personally. I suppose it partly depends on what types of games you like. If its games like Fortnight and CS Go than you would be fine but if you're playing games like Black Myth Wukong or Alan Wake 2 I'd stick with 1440p.
Hope I found this video before I bought my 5120x1440 monitor... I had a 4060 Ti at that time, ended up with an 4070 Ti Super which I'm really happy with! I now have a proper GPU with my 5120x1440 and 3440x1440 screen.
Thank you very much for this review that no one ever seem to have domne or didn't care about.. Never are there any game reviews using ultrawide screens. Been using 2 ultra wide 3440x1440 now for many years.., (60 Hz Samsung se790c for 8 years and 144 Hz AOC CU34G2X for 4 years) and I love them to bits. I always had a feeling I got better framerates than extrapolated from the 34½% performance drop claimed. My games ran faster than i expected based on game reviews with my graphics cards. glad to see it was not just wisful thinking on my side Very happy with the present gaming screen. However I would like to have a 3840x 2160 screen beside it for office and the few games that look better with non ultrawide screen. Hate it I can't really play skyrim on ultrawide screen.. and for that a 32 " 4K ish screen would be awesome .
Thanks, I do have some game review but its hard and expensive to keep up with game releases as I don't get any press codes or anything like that. I to try to tackle all the big releases that hit PC GamePass on day one though.
just got the new alienware oled ultrawide absolute beast although on certain setting i get headaches so had to do some tinkering , last ultrawide i had was the 2015 asus pg349q they have come a long way!
Any way you could get your hands on a 2080TI and run it like 2080 just to see how much of a performance difference it would be if it had more VRAM? As a 2080 owner that moved from 1440p 16:9 to 1440p 21:9 a few months back, the 8GB really aged the card, performance wise. It may not have be a 30% performance reduction, but it sure felt like one! Now I have a 4080S on the way, and seeing an over 100% performance increase it makes spending all that money feel a little bit better. 😅
I don't know anybody who has one and the cost of acquiring one will definitely not pay for itself with the content I can make from it. Right now the 2080 is a good standing for the 4060 and any other 8 gigabyte card that you might be considering. You will be delighted to no end by the performance improvement you're getting with the 4080 super.
I'm on two ultrawide's, an Oled and IPS plus a third 24" IPS. My 3080ti handles the games I play just fine with max settings and ray tracing, but latest games I've played are RE4 remake and RE Village. Been meaning to replay Elden Ring and start Cyberpunk, maybe I'll notice it more then.
Great video, but you're not a loud talker. That's completely fine, but get closer to the mic, please. :) Edit: I thought the sound that plays, when new bars appear, is my HDD scratching, scary stuff.
The true test for me would be lending ultra wide monitors to FPS pros. The extra vision alone would be great, especially for battle royal games, but I’m very worried about the supposed image warping at the edges. Besides that having two monitors where you can game on one and watch something on another screen is very nice. Right now I have a performance monitor and a picture quality monitor. I would love to just have one monitor for convenience sake.
The fact that my 3060 is handling high to ultra graphics in ultrawide 1440p and hovering between 40-80 fps in every title pretty much proves that you dont need overwhelmingly good specs to run it.
Once GPUs are powerful enough, I think the 5K by 2K ultrawides are going to become the new high watermark. As it will allow larger size monitors without compromising clarity.
@@ultrawidetechchannel In my honest opinion, driving the 5K2K screens with great performance(90+ fps for me) will get really annoying really fast. I'm praying for a 38" 3840x1600 OLED 240hz+, literally perfect endgame monitor in my eyes. That, or a 38" 4320x1800(way better PPI and clarity but still less pixels to drive than 4K) 1800p is the long lost, missing stop-gap between 1440p and 4K.
@@epiccontrolzx2291 the sad thing is I don't think that panel is ever coming. the 1600p ultrawide seems to be a forgotten size. Now you're either seeing 3440x1440 at larger sizes or it just jumping up to 5120x2160.
@@ultrawidetechchannel it's a shame for sure, I'll probably have to just bite the bullet with a UW 2160p and use upscaling or resolution scaling to claw back decent performance. Do you have any insight as to how 1600p or 1800p content looks on a 2160p screen?
Thanks for the video! I was wondering between the msi 271QRX vs 341CQP, specifically around how bad the performance would be. You made a great point that the extra pixels on the sides are not seeing much action, so it actually doesn't need that much GPU!
I've been gaming on Ultrawide since 2019. I'm currently using an Alienware AW3821DW that I bought in the beginning of 2022. It can be irritating that I still have to do a hex edit or some other modification for some games, but it is what it is.
Luckily most games i've played I've found a way to even edit the cut scenes to display in ultrawide. Pre-renderered scenes are a different story though. @@ultrawidetechchannel
That's bechause the games never are made for ultrawide, or even super ultrawide 32:9. The image is stretched more and more the wider the screen, and your performance is mostly affected by going from 1080p to 1440p. That's why my 3070ti could still run most games with 90+ fps on my 5120x1440p monitor. It was struggeling in the newest titles tho, so i did upgrade to a 4070ti that i got for cheap and it is amazing.
*Fun Fact...* The "Rise of the Tomb Raider" benchmark got a higher FPS score with a hard drive than it did with an SSD. Why? Because many of the textures didn't load in quickly enough to get processed. So you got more pop-in but a higher FPS. Benchmarking can get weird (I'm referring to the lack of VRAM for this video causing difficulties).
Hopefully you respond before tomorrow. But I'm going to Micro Center tomorrow to build my pc. And I'm having a hard time choosing between a 4070 TI SUPER or a 4080 Super. I don't want to get a a 4080 Super if I can be fine with with 4070 TI SUPER. It would save me about 200 bucks.
If you don't mind using DLSS here and there the 4070 ti can provide a great play experience at 3440x1440. All my testing I do at max or Ultra settings, which often are not worth the performance hit over High. I don't think there is a game you wont be able to hit 60 on when using DLSS. (using path tracing doesnt count)
Huh, it would never have occurred to me to do the calculation that way (divide by pure increase of pixel count), precisely because that doesn't work between 1440P and 4K either. My thought process was to say this resolution includes about 25% of the extra pixels you'd render when going from 1440P to 4K. So i'll estimate performance by adding 75% of the 1440P framerate to 25% of the 4K framerate. I'd be interested to know how far off I am with this approach. I figure it would work pretty well when there are no memory walls involved - it'd likely still underestate cards where 4K would be memory walled but 3440x1440 isn't, but it should do OK where that isn't the case.
The XTX it can match the standard Ti in RT even in Cyberpunk and best it in most other games and the raster performance is just so much higher. Sure you have slightly worse upscaling tech but with the kind of performance the XTX can provide at 3440x1440 you don't need it in most cases. If it was between the Ti Super and the XT i would lean Ti super.
@@ultrawidetechchannel eh... from the benchmarks I have seen, that is not true. The 4070 ti is roughly 14.5% faster than the 7900xtx in RT at 1440p when looking at Cyberpunk 2077 only. When you add in the option for ray reconstruction, the 4070 ti is just the better option if using CP2077 as the metric to go by.
@@deuswulf6193 If you are only going to play cyberpunk and only use RT in cyberpunk then ya the 4070Ti edges it out and has a couple of nice extras, but in these benchmarks here I have 7 other games that use RT and the 7900XTX beats the 4070ti in all of them most not by much but a couple have it with a decent lead. With the most significant one being Fortnite where the XTX performs a fair bit faster and Lumin RT is likely to be quite common in future games. If you want RT power then the 4080 Super is the one to get at the $1000 price point if you can stll find any at that price.
@@ultrawidetechchannelI'm actually surprised at how well Unreal Engine's Lumen lighting system works with AMD GPUs, in games that is. Back when deciding between a 4090 and a 7900xtx, partially for game dev/3D work, one of the issues with the AMD was that it was chock full of issues when it came to lumen in the editor itself. Epic did a good job with Fortnite in terms of outright compatibility across the board. Unfortunately with rendering in say something like Blender, even a rtx 3060 can win out over the 7900xtx. Just goes to show that once you throw in productivity work, things get a bit weird. Anyways, I agree the 4080 super sits in the sweet spot as far as RT goes when comparing it to the 7900xtx. As far as MSRP goes, should'nt be too hard. One just has to buy from Nvidia, Asus, gigabyte...etc storefront directly. That's how I was able to get a 4090 for $1599. They don't upcharge there.
I have the 4070ti, I did some calculation because that just doesn't make sense as well to hear how much you lose performance going to an ultra wide and thank you so much for proving me right lol I have a 3440x1440 and I love the extra screen immersion you can get while not loosing "34%" performance as they say
I just got a 4070Super from 1080Ti, planning to get alienware aw3423dwf, seen some testing where certain games(alan wake 2 for example) were reaching towards 12Gb VRAM, should I be worried about VRAM?
12 GB should be fine for that resolution for everything except path tracing in Cyberpunk and Alan wake 2. Regular ultra ray tracing especially with DLSS on which you will definitely be using will still stay within your memory budget.
it doesn't tell the whole story. going from 1080p to 2160p doesn't require 4x gpu performance either. does ultrawide have smaller impact on performance than the increase in pixel count would predict?
I did build a system using 4070ti and 5800x a couple years ago. Great 16:9 1440p rig. Then without much research I got an ultrawide a month after. I don’t regret it, and most games look and run well, but I do wish I had a 4080 or didn’t need to lean too heavily on Frame generation. I think the biggest problem is legacy and some newer games don’t natively support UW. I mean… Elden Ring… what are we doing Fromsoft? And UW “fixes” can trip the anti-cheat
If you overspend and find out the GPU can't keep up, maybe your monitor can do picture-by-picture so one 5:9 region can be for the OS "second monitor" and remaining 16:9 can be for "narrow" widescreen 1440p. Might regain some FPS then. Just be sure to auto-hide the taskbar and use a rotating desktop background to reduce OLED burnin for the OS region :)
"hundreds of dollars of needless spending" is pretty much the entire hobby
haha, it feels that way sometimes.
Before I bought my 3840 x 1600 monitor I used to look at benchmarks for 1440p and 4K and then just figure out the average FPS between the two.
Ya getting good benchmarks for ultrawides has been hard for a long time but I'm trying to fix that.
Ya me too . I recently upgraded from 3440x1440 to a 4K but it has a 3840x1600 mode which I use for racing and Witcher/Fallout type games. Best of both worlds and automatically detected in the driver so no futzing after switching. First video which actually gives a closer idea of where it falls between the 2. Thumbs up.
Same, I compare 4k to 1440p and pick the numbers in-between them for a rough guess it's worked well for the last 10 years I have had my old 2560x1080 (calculated from 1440p-1080p) and then 3440x1440p monitor, and if I get better performance all the better. I am a firm believer in "better to have it and not need it, than to need it and not have it"
Who complains that they have performance left on the table? Unless of course the game runs like crap and all your hardware is snoozing, I have always had odd resolution monitors even back in the 90's I used to have to modify/hack games to get them to work on my CRT monitor, I guess it says something about me lol
I upgraded to a 4070Ti Super from a RTX2080 because the 8GB VRAM was becoming a problem. Even DLSS couldn't fix that in a lot of the games. As i like to keep my graphics card for a while i didn't feel 12GB would be enough, and in some cases already 12GB can struggle in a few titles which isn't a good sign for the future...
@@tonyc1956 to which 4k Monitor did you upgrade to?
@@Rafael_B Samsung Odyssey Neo G7 43". It had some reported issues when it first launched which i fortunately missed as I bought it early 2024 (for 1/2 MSRP). I applied the latest firmware right after hooking it up and am very happy with it. Watched a few YT videos on tweaking it for HDR which got me in the ballpark to tweak it further. Only running it @ 120 Hz as that's where 4:4:4 chroma support ends but my graphics card can't push it even this fast usually and I'm in need of a GPU upgrade now. Ah, I've opened the proverbial can of worms ...
Been using 3440x1440 for 4-5 years now, currently an Alienware oled with a 4090. I truly believe 3440x1440 is the sweetspot.
You have a killer setup and I would agree that 3440x1440 is kind of the sweet spot resolution right now where almost no card is too powerful for the resolution, but a lot of cards can still work very well at that resolution. And it has some damn fine monitors available, whose prices aren't too outrageous.
Same setup for me, but I'm very tempted by the new 32 inch 4k OLEDs because of the uneven wear with 16:9 content
I'm currently using a 27inch 1440p monitor, I'm giving serious consideration into getting a ultrawide Monitor. its a bit annoying as not all games support the 3440 resolution but there is usually a workaround. Question: could you ever go back to a 16:9 monitor after using a 21:9 ultrawide for so long?
@@Lennox032 Personally I'm not interested in going back. For a brief time there i was eyeballing one of those 42" LG tvs but then the AW3423DW OLED came out and I lost all interest in them.
@@Lennox032 I switched my main monitor LG 27" IPS 1440p 180hz to LG 34" IPS ultrawide 160hz about half a year ago, I would never go back. Having that extra space is simply amazing. Very minor downsides are cutscenes being 16:9 in games and most youtube content is 16:9 but there are extensions where you can toggle the aspect ratio / crop the video.
This guy is actually the goat, making videos for a niche community which will hopefully grow!!!
Thanks, I do beleive the ultrawide comunity will continue to grow it seems 9/10 ultrawide users never go back to standard wide monitors. And with so many new sizes, resolutions and price points in the ultrawide ecosystem will keep it going .
Once you go ultrawide you can never go back ......... For the first time I can play First Person without feeling like I have horse blinders on wrecking my peripheral vision and Third Person games feel even more cinematic than ever .... The larger problem is the idiot game developers that never do their cutscenes in ultrawide meaning I usually have to go into the game's EXE with a Hex Editor and hack it so the cutscenes are ultrawide too .... Sure it's simple enough to do but that's all the more reason it's stupid that the game developers don't do with with a few simple lines of code.
@@longjohn526 Preach, even when the gameplay is flawless in Ultrawide those stupid 16:9 cutsceens jsut kick you right out of the action.
Yup needs alot more subscribers if you ask me!
Thanks a lot for this video. I am currently investigating buying a new graphics card for my 3440x1440 screen, but most fps examples I find for 1440 and this helps to understand where I would roughly end up. Initially I was doing the same mistake, by multiplying the fps with a factor based on the resolution quotes.
I have both individual and head to head videos for all the graphics cards for the resolution you're interested in. They cover off both averages and 1% low performance as well as show DLSS and FSR performance for ray tracing.
I think if we forget about too powerful GPU and too weak ones, it's fair to evaluate to 20 % the fps we loose with UW.
What I have observed trough numerous reviews online trough the last few years, you need approx. 73FPS at 1440p to have 60FPS in 1440p UW. Interestingly enough, that is almost exactly reflecting your findings :D
Glad I could help reinforce your personal research
I made the jump to my 49" ultrawide and definitely see a hit with my 3080 FTW3, but mainly in Cyberpunk.
Ya doubling your resolution like that will definitely give you a hard time in cyberpunk. it really hurts you for every pixel added especially when ray tracing.
Between 1440p to ultra wide on my 3080 in Cyberpunk ultra with rt I loose about 10 fps
But it looks so good on the 49" lol. I can't go back.
Very informative, i noticed this while playing Star citizen in 3840x1600, and while using all 3 of my screens for a 6000x1600 resolution, i saw only a small drop in frames, 6800xt and 5900x 32gb ram. Unfortunately the screens on the side are not freesync so i get some tearing and not full screen on them because of their resolution.
I'll be doing a 4k vs super ultrawide video next and I'm very currious if ill get a lower margin than the 12.5% resolution scale as you would expect when just scaling up or if the nature of the super ultrawide showing a 50% different sceen will make it actuly scale better than the 12.5% like you were seeing using a multimonitor display.
I've done various triple monitor setups for the last 15yrs. AMD used to have far more advanced driver support for triple monitors. Back during the AMD 5850 1gb GPU's days of 2009 I had triple 1080p 60hz matching monitors and it worked great. I then tried an Ultrawide 1080p in the middle with two similar bezel matching LG 16:9 1080p monitors on the sides. I never could get away from tearing or a proper combined resolution with black bars.
I now have on my wife's PC combined three 27" 1080p 240hz LG's all on Display Port and even though one of them isn't the same model they all play nice together due to the matching resolution. They all run 240hz with VRR working correctly on my 6800XT. I even have a 4th 27" 1080p 144hz installed as the center top monitor. The default configuration tool in AMD driver doesn't allow for proper config always placing the 4th monitor in a quad wide setup instead of being a standalone. What I learned from a YT video is AMD has this eyefinity monitor configuration application inside the AMD driver folder that you can run which actually did provide a bit more config options. I likely won't fix your mix matched resolution issues with black bars but it did allow me to configure my setup properly.
I have to select 3x1 and then pick which of the 4 monitors I wanted to setup. It then set the 4th monitor as a stand alone and it properly now positions itself on top of the combined triple setup.
The built in config application is called EYEFINITYPRO.EXE and can be found here C:/program files/AMD/CNEXT/CNEXT/eyefinitypro.exe.
I love PLP setups. Used to have HP 20-30-20 PLP back in the day. Now also 3840x1600 but still interested in side monitors. What 1600p monitors do you use for the portrait monitors?
This is why I also factor in Panel size + Seating distance into my performance budget. It saves me a lot.
My usual seating distance of 2.5 - 3 feet away allows me to use 2560 x 1080P @ 29" quite comfortably, including with the panel size.
With a 3440 x 1440P monitor, I'd need to sit much closer, but lose peripheral vision benefits of having the full monitor in view without needing eye movement. I would also need to use 125% @ 3440 x 1440p windows scaling to see better.
Cards like the 6950 XT ($499)/7900 GRE ($550) become much for viable for me versus needing $1000 cards, like 7900 XTX to satisfy the 4 million pixel budget of UW1440P. UW1080P is nice.
Went from 1920x1080 24inch to 2560x1080 29 inch and it was all i wanted. Performance Hit was there, but managable (i7 8700 + RX 6600). Upgraded to 3440x1440 in May this year and also upgraded to R7600 + RX 6800. It is wonderful, BUT. While 1440p is great and clear. 1080p UW does not look bad. It really is something to consider when not much budget and 29inch UW works wonderful next to a 24 inch 16:9.
If you really really are drawn to UW, get it. Do not get it, if your System already struggles with 16:9 Resolution. But 1080p UW exists and it not that bad.
I broke my ultrwide 1080p this week within an hour of setting it up in a new spot. I have the 5800x3d and rx7600. I'm looking at the 1440 ultrawide moniters now. I'll be watching movies and all at 1440 but playing games at 1080p. How noticeable is the blurriness from downscaling while playing?
Im looking at 32-34 inch ultrawide qhd btw
You didn't show 1% lows and 0.1% lows and i have an 34" alienware and a 4090, but the immersion is much better on Ultrawide.
Yup especially racing games and I prefer shooters in ultrawide also
Thats my plan. I recently bought AW3423DWF and now waiting for 4090 to go on sale
Crazy how many people have the same monitor including myself.
same! but a 4070 super, still amazing experience
@@bliN.K_ray it won't go on sale as they stopped production...anyway for 1440p UW the 4080 is enough, pretty happy with my 4080 on 144hz 1440p ultrawide
Is there a budget IPS 3440x1440p 144hz-170hz monitor you would recommend? Just upgraded to a 7800x3d and 4080 super.
While I don't have any hands on experience for a true budget monitor, the KOORUI 34 Inch Ultrawide 165HZ amzn.to/3Pl2I2M is getting some good press on several outlets and has a pretty satisfied amazon user score. It’s not IPS it’s using what I think is one of Samsung’s fast VA screens but those are very good panels. Here is a little bit more expensive LG option amzn.to/3veCNCP again haven’t used it myself but have had good experiences with my LG monitor and TVs and here is the Samsung G5 amzn.to/48TR1qt the quintessential midrange ultrawide.
@@ultrawidetechchannel Awesome recommendations! Just found your channel and its pretty great!
@@ajhylton thanks
Lg 34gp83a-b
Think its a shame to cheap out on monitor. You have killer setup dude
so my 4070 can actually play at this resolution.... since my esports days are far behind me ill go with ultrawide then
Ya the 4070 will be fine at this resolution, you will need to leveraged DLSS in some of the ray tracing titles to get the frames you want.
And those who are well won't go with an ultrawide? I don't agree.
FYI, it is hard to read the colors of 2560 vs 3440 as they are both kinda white/gray.
i have made things more contrasty in later videos after getting this kind of feedback.
do you know how can i record my gameplay at 16/9 while using an ultrawide monitor? im having problems man
If you just set the internal game resolution, while in full screen mode, to whatever resolution you want your recording to be, then it will record it at that resolution even if in the game it shows the screen all stretched out. The recording should be in standard 16:9
I would also like to see 3440x1440 vs 3840x2160. Watching reviews of whatever new hardware that comes out, Ive always estimated about half of the difference between 1440 results and 4k. I actually have no idea what the difference should be
Thanks for taking the ideas. Love this video and truly, this was very helpful. Love your channel to death
Thank you very much.
@@ultrawidetechchannel
One more question, is it possible to consider the power consumption of those cards for future videos like this? This way we can compare which resolution is more energy efficient or to have more data to compare in addition to the fps.
Thanks again for all your work and dedication, you really are the only UA-cam channel creating content like this
@@lucashernandez922 I don't really have the equipment to properly collect that data and on the tiny bit of money I'm getting from UA-cam I’m not going to be able to invest in it for a while. Eventually I would love to be able to test everything but that is still a while down the road.
@@ultrawidetechchannel
Still incredible!
Thank you for your dedication and for your videos
Very nice video, keep up the good work
Thanks, will do!
Great video. Liked and subscribed.
Not as much good info about Ultrawide as I wish, but this guy is helping.
This is a real gem! I’m considering buying the 4070 Super for my 2k ultrawide monitor. Thanks for the good video!
The only problem I have with ultra wide is the fact that now every lesser quality monitor I look at has poor quality in comparison. Once you’re playing in 5k everything else looks like shit
Even if you go 1440p QD-Oled? That's wild
@@AshLordCurryYou know ultrawides also come in OLED, right?
U guys must be rich
Well said, using the math in your video as well as the one tier above your own typical 1440p GPU of choice is a good decision maker. No need to overspend on some cards unless raytracing or other features are of importance, because from pure rasterization performance or even with some upscaling you could easily get 100+ in a lot of these titles.
But would you still recommend the 7800XT though for 1440p ultrawide if you're going off by purely rasterization performance/FSR without raytracing and high/optimised settings instead of ultra? Or is it worth to spend the extra for the 7900XT in the long run? I still haven't seen many 7800XT ultrawide benchmarks out there despite it being months now since the cards release.
Sadly, I don't have the 7800XT so I can't say definitively but personally I would like to have the extra power of the 7900XT for the ultrawide resolution just so I know that high refresh rates are almost always going to be on the table if I feel like the game will really benefit from it. At standard 1440p i think the 7800XT would satisfy but for the ultrawide I would still want the 7900XT the same way that I recommend the 4070Ti over the 4070 at this resolution.
Use SAM (rebar) and the auto OC of adrenaline. Then it can do most games 70+fps on ultrawide.
Its better than going for a more expensive card right now. Just save that upgrade money and upgrade in 5 years instead of 6 years.
I love it when a diagram uses the same color for 2 different values
so, ive got a 3440x1440p monitor paired with a 3070.. been seeing the limits of just 8gbs and not sure where to go from here, 4070TS/4080(S) or as much as i enjoy it, sell the monitor. Any Thoughts?
The 16GB on the 4070TiS and the 4080S will keep you safe at that resolution for quite a long while. Even if you were to go down to a 16:9 1440p monitor it wouldn't mean you would be safe with only 8GB of VRAM. Some newer games are still going to get you.
@@ultrawidetechchannel ok, thats very helpful Thankyou. the more people I talk to about this, Im leaning more and more towards the 4080S.
Thank you, this is very helpful! I myself has just upgraded to 4080 super and was seriously considering 4090.
However, after trying out 4080s on my ultrawide AW3423dw I'm really happy I didn't pay extra for 4090 - everything runs great on perfect FPS and high graphics settings
Ya the 4080 Super is plenty for that resolution and the only reason to really go more is for full path tracing.
Your 4080s could handle 4k why stick with 1440 if i may ask
@@yasink1995 i really like my OLED AW3423dw, so never thought of replacing it with a 4k screen. Plus a 4k oled would cost extra money and it will hit my fps
This is a good video. I cant go back from ultrawide now. Just got an oled 240hz 3440x1440 monitor, its so unreal!!
This is the kind of data I've always wanted to see but no one else seems to do it. Subbed!!
I love you thx, trying to make sense of benchmarks to have an idea where my ultrawide frame rate would be, was a nightmare.
great test....been gaming on 3440x1440 since 2015 and loving it....will never go back to 16:9
was thinking of going to 3840x1600 when I upgraded to 4080 last year but in the end I stayed with 34" 3440x1440
the 34" is perfect for me, with 38" I had to move my head to look at the corners when I had it at comfortable distance to read the text
(same distance as my 34" as they both have almost the same PPI so the same distance provides the same text readability at 100% scaling)
the 38401600p 38in monitors seem to be a dying breed and the 3440x1440p ones just aren't useful for anything outside gaming.
@@ultrawidetechchannel yeah well in 2024 there are much better options for pure productivity than 34" 3440x1440 but in 2015 that screen was a gamechanger ;o]]
there were only few 4k monitors in existance in 2015 and all kinda small so 34" ultrawide was actually the productivity king ;o]]
I think the 3840x1600 38" will see a second wind and with some time become what 3440x1440 34" is today, the common accessible gaming ultrawide
5K is too many pixels for a few more gpu generations to be widely adopted for gaming
I went to 34” a couple years ago but accidentally broke my monitor so had to go back to 27” for a whole, I hated it after trying ultrawide so now i’m back with a 34” monitor lol but finding benchmarks for it isn’t the easiest.
I'm here trying to help you out I won't be able to get to every thing but but ill try to do as best I can for ultrawide users.
@@ultrawidetechchannel Thanks!
Great video!
Glad you enjoyed it
What I'd really love to know is how much the two side monitors I have next to my center 3440x1440p screen are hurting my gaming performance.
I have a 27" 3840x2160p 60Hz monitor on the left, my center 35" 3440x1440p 100Hz screen in the middle and a 27" 2560x1440p (60Hz) on the right.
I usually only have STEAM and Discord running on the side monitors so nothing that should be using a lot of GPU horsepower but still, would be interesting to know if I'm losing noticeable performance by having those side monitors on while gaming.
Likey not much maybe like 1-3% depending not the lovely of activity in the feed and the number of media shares
If you want to go ultra wide, be comfortable with needing to either mod, config edit, or run third party apps to make it work in a lot of games
Honestly, that's my main hesitation. I know a lot or newer games are doing better with natively supporting it but I also play a lot of older games as well and I'm debating whether the additional hassle would be worth it to me or if I should just get a normal oled instead of a nice ultrawide
@@slayerdwarfify once you get used to it, it’s not that bad. But, it can be annoying when you want to play a game that just released, but doesn’t support it, and there’s no fix available for a couple days to a few weeks or possibly months. Or if there’s no support at all, and you’re forced to play the game with black bars. I hate that even more. Actually there’s even worse 😬 when there is no support, and instead of black bars, it’s white bars! That sucks
@@slayerdwarfifyjust use the ultra wide monitor with a 2560x1440p resolution if there is no other way the sides are just black Bars in this resolution but you get used to it to the point that you think you play on a non ultrawide
When i upgraded to a 1440p ultrawide, my 1070 was having a hard time driving total war warhammer series. Had to upgrade to a 2080ti to get a decent framerate.
x70 series cards would be the bare minimum for getting good fps on UW1440P tho id recommend x80.
Great content, i subscribed
Thanks for the sub. Yeah, the 4070 series at that resolution is providing you with good performance today but likely not so good performance in a few years. Where is the 4080 will give you amazing performance now and good performance in a couple years.
Noob here with monitors. Can I change the resolution down to 2560 if needed so that there is black bars on the sides but can increase the FPS? I also have a Sony 50" OLED 60hz and my 6800 has had no problems keeping up with 60hz. However I just need something for programing and the occasional RTS / FPS games since my PC has been used mainly for iRacing and AC. Thanks.
I could be wrong but it seems the risk of burn in would be greater if the monitor was running in a “fake” 16:9 ratio rather than its native 21:9. As in the pixels on the edges would receive less wear than the ones being used in the middle.
Proper pixel refreshes could likely combat this, all depends on how many hrs you use it at a time plus if you’re doing a lot of static image work
If you plan to work as much as you play I would look for an LCD based Ultrawide, most games that are 16:9 only will default to a letterboxed 2560x1440 if you want to force a game to be 16:9 that supports ultrawide it is doable though i would just use DLSS if i wanted to increase my fps rather than cutting down 34% of my screen real-estate.
@@ultrawidetechchannelThanks for the recommendations!
@@JonoLB Thanks for response.
Oh sorry I would be using my LG 34GP950G-B mainly for programming, browsers, linux shell, docs, and lite photoshop editing. This will be done on a Macbook pro. MS Windows with AMD 7800X3d and 6800 GPU will be used mainly for gaming and lite audio editing/ DJing until I get a dedicated machine for DJing. Likely a m1 or m2. Still not wanting to pay too much for a SOC from Apple that can't be upgraded... :)
OLED is only used for sim racing and watching movies. Response time is amazing for only 60hz.. But I'm not an expert with Monitors. I've been using my Macbook for work with old retina screen and cheap 24inch 1080p monitor. So wanted to give myself a better monitor to help optimize my work space and also utilize my PC for other lite gaming such as FIFA.
Thanks for your help.
This video makes me hesitating between OLED 3440x1440p and 4K a lot ….
If my performance will decrease by switching on ultrawide( even a little) would it be smarter to move to 4K 32’ for sharper and crisp image ?
Nice video btw 😊
I'm not quite clear on what your saying, but 4K (3840x2160) would be significantly harder to drive than Ultrawide (3440x1440). In game the sharpness advantage of 4K will be less noticable than the frame rate drop will be, but on the desktop and in other apps the the sharpness advantage will be clearly noticable.
I’m my experience, there isn’t a massive visual difference between the image clarity of my 3440x1440 Alienware oled and my 4K tv. What made the biggest difference is the extra width for immersion and going from ips to oled. I used to always lug my computer upstairs to play games in 4K, but since the ultra wide, I never do. Because I much much prefer the aspect ratio and the image clarity is super high fidelity. Difference is nil
@@ghulcaster I bought a MSI Qd Oled 3440x1440p and yes I full agree, the ultrawide ratio is awesome and I don’t regret my purchase
I can notice less clarity in games because I just love pause and looking the landscape but it doesn’t bother me a lot ( except for games with bad TAA). When the games is a little bit blurry you can use the NVIDIA filter and enjoy the ultrawide clarity :)
Even with DLSS on, I'm having trouble playing big games like The Witcher 3 on my Dell G15 with RTX 4050 on an LG 34" monitor. I can't get past 45 fps, is that normal?
funnily enough i went from a 1440p CRT that i got for free straight to ultrawide 1440p
ran out of vram immediately
How to watch yt videos with ultrawide and play games that doesn't support Ultrawide in not streched Ultrawide?
mods
@@Chuck15How is the mod called?
Support the creators who natively support ultrawide and don't support those who won't do it.
Otherwise, use mods if you're willing to go for the hassle that shouldn't even exist.
In the end it doesn't even matter.
I was surprised to see my 4060 still got reasonable fps when i switched to UW
The uptick isn't that harsh and going from Ultra to High or High to Medium will off set most of the fps loss.
I always wondered how much FPS dropped from going UW. Thank you for testing
My pleasure! It's my goal to fill in the gaps of ultrawide knowledge gaming community
I can't unhear "Ulta wide"
*whispers* ultra...wide.
Have you ever heard Benedict cumberbatch pronounce "pinguin"?😂
3/5 people should go ultrawide. The two being competitive FPS gamers and super casuals.
I prefer ultra heigth. Everything closer, not much information in those sides anyway. You are closer to action. Squarer ftw.
3440×1440 is amazing. My GTX1080 does really well even on newer games. I can run medium to high graphics on Forza Horizon 5 and average between 50-90FPS
I have a 144hz monitor and although I dont really ever hit it in games 8 years old or newer, I rarely ever get anything below 90fps and its perfect as is
Keep having a blast with your ultrawide.
Does ultra wide reduce cpu bottleneck cos of the extra resolution?
Yes the extra resolution of the ultrawide puts more pressure on the GPU so that a CPU bottleneck is less likely.
Actually it does scale fairly linearly with resolution.
It's just not a pure 1:1 scaling, It's more like 0.9:1 versus the pixel count increase (which comes to about a 22% difference, down from 34%).
Which is to be expected given that not ALL of the graphical work is granular, some of it has more vector-like properties.
I wasn't seeing "half" of the projected performance drop as you say in summary.
Ray tracing has more CPU involvement... which can somewhat skew graphics card comparisons. But for your purposes here, I don't think it would skew the effect you're trying to show.
A big part of why you're not seeing a 1:1 linear decrease is that 1440p ultrawide is using regular 1440p textures (just in somewhat larger quantity). When you go from 1080 to 1440, or 1440 to 4k, you're dealing with a lot more detailed texture work. The ultrawide steps in resolution skip that, which saves some performance. But even full-rez jumps aren't quite 1:1 (but they can get pretty close to 1:1)
I don't think it has much to do with what's displaying in the center versus the edges of the screen though. Unless you're in games with detailed players & enemies, but rather un-detailed settings (of which there are several popular titles, admittedly). In games with detailed settings, there's not going to be any benefit there though.
Still, you're looking at 25-40% OF 34%... so a 1:1 guestimate is off by what? 8-13%? (in the ultimate FPS) ...not sure how often that's going to make a significant difference in video card choice. It's a bigger difference than an overclock, but it's not a huge difference. I dunno. Generally I advise against buying the bare minimum card for your needs... there's no bigger waste of money than getting a card which is "almost good enough". On the other hand, if you overshoot a bit, you have some insulation against obsolescence. So it's not really a waste unless you overshoot by a lot.
I don't think you can call it scaling linearly even at a lower ratio because the game to game variability can sometimes be pretty big. If it was linear at a lower ratio then most games would see a similar loss. The end average discounting the 2080 was 17.8% which is not quite half but not too far off, definitely closer to half than 75% so I'm ok with that fact that I said "about half".
As for it being better to over shoot, well it depends where you are in the stack because if you over shot from a 4080/7900XTX to a 4090 because you though you needed 34.4% more performance rather than 20%/16% more than that is a $1000 over spend (at the moment) which is the cost of a nice OLED Ultrawide monitor these days.
@@ultrawidetechchannel Most "steps" in gpu tier are no more than $200. 4090 is a bad example as it's kind of in an ultra-tier of it's own.
Personally, I've under-estimated gpu needs, and it was quite a waste of money to do so. Twice actually. Meanwhile I have also over-estimated gpu needs (1080 Ti), which at 1440p allowed me to skip 2 generations of gpu's (rather than the usual one generation), and it _more_ than paid for the up front premium on cost, while offering really 'above & beyond' performance early in it's 6 year use cycle.
You can't know for certain how well a card will perform on your specific machine, particularly with software applications which haven't even been released yet ..."a bit of padding is good".
For example, right now 8GB has problems in 1440p in numerous game titles. And it usually shows up in benchmarks. But some titles are starting to ask for more than 12GB. Not very many ...yet. But personally, I would lean strongly towards 16GB as a buffer against this game requirements trend. It seems like nvidia is leaning into ram limitations as a method of built-in obsolescence in their more mid-tier cards.
You make a valid point. It's not a 34% difference. More like 20-28%. I think i was more distracted by the "barely enough is good" thinking. I just look at it differently. To me "barely enough" is "almost obsolete".
@@kathrynck Where you’re upgrading from in the stack does have a big effect on pain of the price jump. Before the 4080 Super the price jump from the Ti to the 4080 was pretty ugly at $400 a 50% increase in price. While the 7900XTX did help out for those looking for only a raster boost it did nothing for those looking to increase RT performance.
All that said the true solution to the problem is better data, and I am doing my damndest to provide that for the Ultrawide Community. I might be far from perfect but as far as I can tell I’m the best the community has and trust me I have looked trying to find someone who was doing this so I could follow them and watch their reviews. But that person didn’t seem to exist, so I set out to rectify the problem myself. I’m doing what I can with the resources I have to help ultrawide users make better decisions.
@@ultrawidetechchannel Yeah the 4080 was waaay overpriced. Nvidia's upper stack was a mess. it's more sorted out now.
More info is always good info :D
I've been using a 5120x1440 super ultra wide with my rtx 3080 and I feel like it seems the shit out of my card and it's over heating the monitor cables in the back amdy monitor black screens (not actual black screen). I down scaled to a 32" 1440 monitor hopen it fixed my issue. You have any insight?
Your 3080 is not strong enough for that ultra wide. You are basically playing games at higher resolution than 4K with that monitor. If you play on standard 1440p monitor you see huge boost in performance and lower heat dissipation.
A month ago I weighted the idea of changing my RTX 2060 for a RX 7700. But, I considered I would not notice any significant difference since I had a 1080p 60hz monitor. Very basic. The RTX 2060 already allowed me to play 98% of my games at ultra settings and achieving 60 fps. an RX 7700 would have not made any difference. my real bottleneck was my monitor.
So I went ultrawide 1440p 160hz.
In the case of Helldivers 2, which I played at high settings, the change to 1440p took a heavy hit on the poor 6VRAM of my gpu, so I had to downscale the settings and use upscaling to barely achieve 60 fps. But in the end, you are wondering, was it worth it?.
HELL YES. I CAN SEE MORE!
I felt the same way when getting my first one. I always end up buying more monitor than i have GPU then upgrading to accommodate the monitor.
so for my 3440x 1440, how much vram should I get? 12 gb or 16 gb?
I got ultrawide monitor and since then I can't imagine life without it.
I jave a 7800x3d and a 4070ti. Can you do a 1440p to 4k and their ultrawides thanks.
So you want a video comparing 2560x1440p + 3440x1440p + 3840x2160 + 5120 x 2160?
@@ultrawidetechchannel mhm. I'm curious what modern games look like FPS wise on my hardware for going ultrawide.
This is such an excellent video! Ultrawide is very rarely covered so well. I appreciate this a great deal!
I run an ultrawide 1080P monitor and I was considering a 1440P one to pair up with my 5800X3D and 7800 XT. And have been wondering how much of a drag ultra wide 1440P will be on that. But I also typically run games with FSR (frame generation it available) and no ray tracing at high to ultra settings. Now I may be able to make a more educated decision.
Check this video out and just assume like a 15% lower performance. ua-cam.com/video/utUF_3z-lPE/v-deo.html
I have a Westinghouse 34" ultrawide being pushed by a 5800x3d and 6900xt. The set up workes pretty well. Cyberpunk runs at around 65 fps with ultra settings and high RT settings.
I'm very hesitant on getting an ultra wide 3440x1440 but I'm getting a 4070 super. I'm not sure how performance will be especially when I want to lock at 100 fps
I run a 32:9 samsung oddesy. I can never go back. It has its pain, but playing third person games on this one is such a blast.
It is also worth to mention than in ultrawide you can activate vrs without really notice it, increasing even more the performance with basically no quality difference.
hey i’m using a samsung g9 odyessey and i was wondering what would the difference be with 32:9 or 5120x1440p?
so, lets say to a 160Hz / 3440x1440 monitor, it requires a 4090 to properly run, right?
Depends on the game and if you are willing to use DLSS or FSR upscaling and frame gen. If you can get to about 90fps with upscaling then frame gen can take you the resto of the way to your monitors max refresh rate.
3840x1600 is a wonderful resolution that doesnt seem common enough with monitors.
Those 45 ultrawides released in 2022-2023 with 3440x1440 resolution really should have been 3840x1600 instead.
Is wish 16:10 was more common
21:10 would be king tho, but most comp games are standard at 16:9 (which blows)
The same applies to 4K and even 8K, increasing resolution does not increase load across all aspects of the render pipeline equally, even if you calculate pixels rendered per-second you're almost always getting the best value using a higher resolution display.
when it comes to gaming is ultra wide really worth all the hassle? having to worry about performance, paying more money for the monitor and whether the game properly supports the setting doesn't seem like a good trade-off just to get some more screen on the sides, which might even make the experience worse for many people. why not go from 1440p to 4k instead where you would still have to pay the extra money and need a better gpu but the visual benefits are more noticeable and you don't get the cringe wide screen that might cause nausea nor do u have to worry whether it will actually work for certain games?
I have a 34" 1440p ultrawide and I still haven't found a game made in the last 10 years that doesn't support this resolution. And even for older games that don't support it, there's usually a mod that makes it work (Skyrim for example).
I personally like ultrawide because I play a lot of games where there's a lot of UI navigating (RTS, city builders, management, simulation). I no longer want a non-ultrawide because the games I play need a lot of info to be displayed at all times.
And yes, I have tried multiple monitors. I barely used the second one while I always have 2 windows open on my ultrawide.
Watching content is the real issue, yes, but I got used to the black bars when in fullscreen. Just like we got used to watching movies with the black bars on top and below.
And I sit slightly more than an arm's length away from the screen so there's absolutely no risk of feeling sick.
@@fluttzkrieg4392 how would you say ultra wide 1440p compares to 4k, in case you have tried em both?
@@metalface_villain Unfortunately I haven't. But I guess it wouldn't be incredibly different from what I currently do, as I would be able to place smaller windows comfortably on a 16:9 4k display as I do on my 21:9 1440p. Same goes for UI elements in games.
What are your thoughts on the effect of changing the field of view? Theoretically you should run a bigger FOV angle on an ultrawide. And in theory this could cause lower fps.
From my tests and a little looking around on YT this performance impact is a lot less than I expected. So, if you didn't factor that into your test, I think you've been right to do so, but I would be interested in your thoughts on this.
I used just the base settings you get when setting a preset as most people will never touch the FOV slider in a game. Most games i don't bother with it at all I don't feel 21:9 is wide enough to need and adjustment, 32:9 depending on what kind of experience you want to get out of it may make FOV a necessary setting to play around with.
I’m going pc next year, selling my ps5 and Series X. Bought a Rog Ally and fell in love with the pc platform. My question is, can I play ultra wide on my 65 oled tv? Sorry the ignorance.
You can always manually set the resolution to 3440x1440 or whatever you like on PC, but you will have large black borders on a 4K television.
As @Dystopikachu said you can use your driver control panel to change the resolution to display an ultrawide image with black bars at top and bottom same as a movie shot is cinematic widescreen, though i would probably choose 3840x1600p for the resolution as that will not cause any blurriness as it will match your TV's resolution. Make sure you chose the 1:1 pixel setting so you don't just get your image stretched tall.
@@ultrawidetechchannel nice will do. Just waiting for the next generation of RTX. Bought a used Ally to start dabbling.
My brother has changed his monitor to a 34 inch ultrawide 1440p oled. In a few weeks I will build him a new pc and I am still hesitating whether to recommend him the 7900 xt or the 4070ti super. The 20gb of the 7900 xt seems to me a great option for a monitor like this but I know that the nvidia is a safe bet; dlss, rt, nvidia codec etc. What would you recommend me in this case?
Go with 4070TiS or with the one that has good price (promo)
all amd cards are shit unless you are playing one specific game that works with one specific driver
@user-vl4iq7bj5e
I have yet to find one that doesn't run on my 7900 XTX.
Superb video! My 3080 and 165hz ultra wide salute you and are saying "it's not that bad!"
I'm now running a 4060 on a 24" 1080p monitor......do you think I should get a 27" 1440p or a 32" ultrawide? I know I probably won't be able to play at 1440p, but I can still rely on DLSS and frame generation. What should I do?
I think a 4060 might struggle a bit too much with ultrawide personally. I suppose it partly depends on what types of games you like. If its games like Fortnight and CS Go than you would be fine but if you're playing games like Black Myth Wukong or Alan Wake 2 I'd stick with 1440p.
Hope I found this video before I bought my 5120x1440 monitor... I had a 4060 Ti at that time, ended up with an 4070 Ti Super which I'm really happy with!
I now have a proper GPU with my 5120x1440 and 3440x1440 screen.
I'm glad to hear you ended up with a super pleasing setup.
Thank you very much for this review that no one ever seem to have domne or didn't care about.. Never are there any game reviews using ultrawide screens. Been using 2 ultra wide 3440x1440 now for many years.., (60 Hz Samsung se790c for 8 years and 144 Hz AOC CU34G2X for 4 years) and I love them to bits. I always had a feeling I got better framerates than extrapolated from the 34½% performance drop claimed. My games ran faster than i expected based on game reviews with my graphics cards. glad to see it was not just wisful thinking on my side
Very happy with the present gaming screen. However I would like to have a 3840x 2160 screen beside it for office and the few games that look better with non ultrawide screen.
Hate it I can't really play skyrim on ultrawide screen.. and for that a 32 " 4K ish screen would be awesome .
Thanks, I do have some game review but its hard and expensive to keep up with game releases as I don't get any press codes or anything like that. I to try to tackle all the big releases that hit PC GamePass on day one though.
you can mod skyrim to handle ultrawide
just got the new alienware oled ultrawide absolute beast although on certain setting i get headaches so had to do some tinkering , last ultrawide i had was the 2015 asus pg349q they have come a long way!
That's a sweet monitor hopefully you have solved your headache problem.
Thanks for the video, now i can enjoy my 1080p 60Hz screen with laptop 1650
Any way you could get your hands on a 2080TI and run it like 2080 just to see how much of a performance difference it would be if it had more VRAM?
As a 2080 owner that moved from 1440p 16:9 to 1440p 21:9 a few months back, the 8GB really aged the card, performance wise. It may not have be a 30% performance reduction, but it sure felt like one!
Now I have a 4080S on the way, and seeing an over 100% performance increase it makes spending all that money feel a little bit better. 😅
I don't know anybody who has one and the cost of acquiring one will definitely not pay for itself with the content I can make from it. Right now the 2080 is a good standing for the 4060 and any other 8 gigabyte card that you might be considering.
You will be delighted to no end by the performance improvement you're getting with the 4080 super.
wonderful video thank you for making it
Glad you enjoyed it!
thank you for testing these for us
Glad you liked it.
I'm on two ultrawide's, an Oled and IPS plus a third 24" IPS. My 3080ti handles the games I play just fine with max settings and ray tracing, but latest games I've played are RE4 remake and RE Village. Been meaning to replay Elden Ring and start Cyberpunk, maybe I'll notice it more then.
As long as you use DLSS you should be able to avoid most of the VRAM pitfalls.
Great video, but you're not a loud talker. That's completely fine, but get closer to the mic, please. :)
Edit: I thought the sound that plays, when new bars appear, is my HDD scratching, scary stuff.
Thanks for the feedback I do most my recording while the kids are sleeping so I can't get too loud, and sorry for the tech scare.
The true test for me would be lending ultra wide monitors to FPS pros. The extra vision alone would be great, especially for battle royal games, but I’m very worried about the supposed image warping at the edges. Besides that having two monitors where you can game on one and watch something on another screen is very nice. Right now I have a performance monitor and a picture quality monitor. I would love to just have one monitor for convenience sake.
You may be interested in checking out this video of mine ua-cam.com/video/GvyKtMzInT4/v-deo.html
The fact that my 3060 is handling high to ultra graphics in ultrawide 1440p and hovering between 40-80 fps in every title pretty much proves that you dont need overwhelmingly good specs to run it.
Right now it's the sweet spot resolution for immersion and high FPS.
That means rtx 4060 ti should do fine
@@kasadam85 yes, the 4060 ti is by no means a bad card, it is just very expensive for what you get.
It’s a good thing you don’t own a OLED 4K monitor those need at least a rtx4080 to get 4k 120 on medium graphics
@@kasadam85 IF you get the 16gb version that is
great video thanks!!
Glad you liked it.
What do you like on the ultrawide?
Been on 3440x1440 since 2020. Never going back and never going up to 4k
Once GPUs are powerful enough, I think the 5K by 2K ultrawides are going to become the new high watermark. As it will allow larger size monitors without compromising clarity.
@@ultrawidetechchannel In my honest opinion, driving the 5K2K screens with great performance(90+ fps for me) will get really annoying really fast. I'm praying for a 38" 3840x1600 OLED 240hz+, literally perfect endgame monitor in my eyes.
That, or a 38" 4320x1800(way better PPI and clarity but still less pixels to drive than 4K) 1800p is the long lost, missing stop-gap between 1440p and 4K.
@@epiccontrolzx2291 the sad thing is I don't think that panel is ever coming. the 1600p ultrawide seems to be a forgotten size. Now you're either seeing 3440x1440 at larger sizes or it just jumping up to 5120x2160.
@@ultrawidetechchannel it's a shame for sure, I'll probably have to just bite the bullet with a UW 2160p and use upscaling or resolution scaling to claw back decent performance. Do you have any insight as to how 1600p or 1800p content looks on a 2160p screen?
can you use better colors in your graphs i have no idea which is which for anything
Thanks for the video! I was wondering between the msi 271QRX vs 341CQP, specifically around how bad the performance would be.
You made a great point that the extra pixels on the sides are not seeing much action, so it actually doesn't need that much GPU!
I'm glad I could help you make that decision with more confidence.
I've been gaming on Ultrawide since 2019. I'm currently using an Alienware AW3821DW that I bought in the beginning of 2022.
It can be irritating that I still have to do a hex edit or some other modification for some games, but it is what it is.
Most new games have no problem with 21:9 for game play but often cutscenes lamely remain 16:9
Luckily most games i've played I've found a way to even edit the cut scenes to display in ultrawide. Pre-renderered scenes are a different story though. @@ultrawidetechchannel
That's bechause the games never are made for ultrawide, or even super ultrawide 32:9.
The image is stretched more and more the wider the screen, and your performance is mostly affected by going from 1080p to 1440p.
That's why my 3070ti could still run most games with 90+ fps on my 5120x1440p monitor.
It was struggeling in the newest titles tho, so i did upgrade to a 4070ti that i got for cheap and it is amazing.
*Fun Fact...*
The "Rise of the Tomb Raider" benchmark got a higher FPS score with a hard drive than it did with an SSD. Why? Because many of the textures didn't load in quickly enough to get processed. So you got more pop-in but a higher FPS. Benchmarking can get weird (I'm referring to the lack of VRAM for this video causing difficulties).
I ordered a UWQHD OLED 34”. I've had vanilla 1440 until now. I'm afraid my 4080 won't handle this resolution well.
Your 4080 will do very well on a 3440x1440p ultrawide, no need to worry.
Hopefully you respond before tomorrow. But I'm going to Micro Center tomorrow to build my pc. And I'm having a hard time choosing between a 4070 TI SUPER or a 4080 Super. I don't want to get a a 4080 Super if I can be fine with with 4070 TI SUPER. It would save me about 200 bucks.
If you don't mind using DLSS here and there the 4070 ti can provide a great play experience at 3440x1440. All my testing I do at max or Ultra settings, which often are not worth the performance hit over High. I don't think there is a game you wont be able to hit 60 on when using DLSS. (using path tracing doesnt count)
As someone who is aiming for 60fps on highest settings i must say 3440x1440p is the sweetspot (cost to performance)
Indeed it is
Huh, it would never have occurred to me to do the calculation that way (divide by pure increase of pixel count), precisely because that doesn't work between 1440P and 4K either.
My thought process was to say this resolution includes about 25% of the extra pixels you'd render when going from 1440P to 4K. So i'll estimate performance by adding 75% of the 1440P framerate to 25% of the 4K framerate.
I'd be interested to know how far off I am with this approach. I figure it would work pretty well when there are no memory walls involved - it'd likely still underestate cards where 4K would be memory walled but 3440x1440 isn't, but it should do OK where that isn't the case.
So what u will take if u want to buy a GPU r8 now for this res 4070 ti super or 7900xtx .
The XTX it can match the standard Ti in RT even in Cyberpunk and best it in most other games and the raster performance is just so much higher. Sure you have slightly worse upscaling tech but with the kind of performance the XTX can provide at 3440x1440 you don't need it in most cases. If it was between the Ti Super and the XT i would lean Ti super.
@@ultrawidetechchannel eh... from the benchmarks I have seen, that is not true. The 4070 ti is roughly 14.5% faster than the 7900xtx in RT at 1440p when looking at Cyberpunk 2077 only. When you add in the option for ray reconstruction, the 4070 ti is just the better option if using CP2077 as the metric to go by.
@@deuswulf6193 If you are only going to play cyberpunk and only use RT in cyberpunk then ya the 4070Ti edges it out and has a couple of nice extras, but in these benchmarks here I have 7 other games that use RT and the 7900XTX beats the 4070ti in all of them most not by much but a couple have it with a decent lead. With the most significant one being Fortnite where the XTX performs a fair bit faster and Lumin RT is likely to be quite common in future games.
If you want RT power then the 4080 Super is the one to get at the $1000 price point if you can stll find any at that price.
@@ultrawidetechchannelI'm actually surprised at how well Unreal Engine's Lumen lighting system works with AMD GPUs, in games that is.
Back when deciding between a 4090 and a 7900xtx, partially for game dev/3D work, one of the issues with the AMD was that it was chock full of issues when it came to lumen in the editor itself. Epic did a good job with Fortnite in terms of outright compatibility across the board.
Unfortunately with rendering in say something like Blender, even a rtx 3060 can win out over the 7900xtx. Just goes to show that once you throw in productivity work, things get a bit weird.
Anyways, I agree the 4080 super sits in the sweet spot as far as RT goes when comparing it to the 7900xtx. As far as MSRP goes, should'nt be too hard. One just has to buy from Nvidia, Asus, gigabyte...etc storefront directly. That's how I was able to get a 4090 for $1599. They don't upcharge there.
I have the 4070ti, I did some calculation because that just doesn't make sense as well to hear how much you lose performance going to an ultra wide and thank you so much for proving me right lol I have a 3440x1440 and I love the extra screen immersion you can get while not loosing "34%" performance as they say
I just got a 4070Super from 1080Ti, planning to get alienware aw3423dwf, seen some testing where certain games(alan wake 2 for example) were reaching towards 12Gb VRAM, should I be worried about VRAM?
12 GB should be fine for that resolution for everything except path tracing in Cyberpunk and Alan wake 2. Regular ultra ray tracing especially with DLSS on which you will definitely be using will still stay within your memory budget.
@@ultrawidetechchannel Thanks again for the response!!
it doesn't tell the whole story. going from 1080p to 2160p doesn't require 4x gpu performance either. does ultrawide have smaller impact on performance than the increase in pixel count would predict?
What if I already have both types
Me buying a 7800 xt cause I cannot afford any of these cards tested😂 great vid. I hope my plan goes ok for some cod on ultra wide😢
You'll be fine the AMD cards do very well in CoD
I did build a system using 4070ti and 5800x a couple years ago. Great 16:9 1440p rig. Then without much research I got an ultrawide a month after. I don’t regret it, and most games look and run well, but I do wish I had a 4080 or didn’t need to lean too heavily on Frame generation.
I think the biggest problem is legacy and some newer games don’t natively support UW. I mean… Elden Ring… what are we doing Fromsoft? And UW “fixes” can trip the anti-cheat
If you overspend and find out the GPU can't keep up, maybe your monitor can do picture-by-picture so one 5:9 region can be for the OS "second monitor" and remaining 16:9 can be for "narrow" widescreen 1440p. Might regain some FPS then. Just be sure to auto-hide the taskbar and use a rotating desktop background to reduce OLED burnin for the OS region :)
While an interesting idea I think just turning up the DLSS or lowering the settings is a more applicable option.
what about mid range gpu such as 3060ti?