Linus, try this: connect your gaming monitor to your gaming GPU, then add a second GPU, but a really low-end card, like a 3 or 4 GB variety, for a second monitor. Then test the gaming quality. I bet this will fix the issue and give GPU makers a new market.
Interestingly, you can actually get an FPS boost with an external monitor on a gaming laptop, as most HDMI ports connect directly to the dGPU and bypass the iGPU. Many built-in screens with Optimus route the dGPU's output through the iGPU, so the integrated graphics act as a bottleneck. This matters less these days with the MUX switch becoming more common, but it's still a nice trick.
On a side note to the MUX switch: Fallout 3. On some laptops (specifically AMD) it does not switch from iGPU to dGPU on the built-in screen automatically, like most games do. You must manually switch to a higher power mode through AMD Software: Adrenalin Edition, otherwise the game won't start. Not really sure why, but Bethesda breaking games before you can even play them is impressive tbh. I'm sure there are more games that do this, but I don't know of them. edit: I know why it won't start (not enough power on the iGPU); it's why the GPU doesn't switch that I don't know.
@@PeterKane379 The HDMI port goes directly into the laptop's big graphics card. Gaming laptops generally have two graphics cards; for ease of explanation, a "small graphics card" and a "big graphics card", the bigger one being the faster one. Normally laptops route the big graphics card's output through the small one, which adds latency and reduces performance. Plugging the HDMI in, and thus going directly into the big graphics card, bypasses this issue (as long as you turn off the laptop display, for the reasons listed above). Hope this helps you understand!
@@_minzez_2926 I bet if you try other older games that won't place a huge load on the GPU, this will still happen at least from time to time; it's probably related to the game being so light relative to modern hardware that the driver doesn't realise it's a game. Also, any modern iGPU should play Fallout 3 just fine, given I was able to keep playing an FO:NV playthrough when my GTX 275 died back in the day and I had to temporarily use an HD 4200 iGPU.
I actually thought a second monitor would have a far larger impact on your in game performance, it’s really nice to know in most cases it’s fairly negligible.
@@antikommunistischaktion A monitor would be cheaper, or better quality for the price, than a laptop. If you bought two of the worst laptops just because of the small price, their screens would be much worse than if you'd spent the same amount on the monitors themselves.
@@smokyz_ Yeah, you're absolutely right, but I think that guy did not care for facts. He just said something extra because he is an internet guy who will comment whatever he thinks will make him sound cool.
I also am a two-monitor guy and a pretty casual gamer. In half the games I play, I've locked the framerate at 60, and with my current build and those games it never dips below that. But having that extra screen for things outside of gaming is huge for me. The option to have a chat open on one screen while watching hockey on the other, as one example. Also, I do some music production as a hobby, and having a second screen for that... I couldn't go back. My first two-screen experience was just connecting a monitor to a laptop, and since then I've refused to daily a single monitor.
It's worth mentioning that having a multi-monitor set-up doesn't REQUIRE you to use all of your available monitors in tandem. Sometimes you can just turn on only your main monitor and chill that way.
@@KroLeXz So if you have multiple monitors, even if you're only using the main one, the other monitors are technically on too, which means they use some resources and add latency. Is there a way to, like, fix this without having to unplug unused monitors all the time?
I as well would like to see if additional graphics cards help with multi-monitor setups. In a streamer scenario it would be interesting to know if a dedicated GPU for streaming, another for gaming, and another for the monitors could increase performance, or even just one for gaming and a second for the stream + monitors.
@@diedrichg I was using one monitor plugged into my motherboard using the integrated GPU and one plugged into my dedicated GPU for a while, and I noticed watching videos on the second monitor caused lots of stuttering; it was fixed when I plugged it into my dedicated GPU.
This is the stuff I am stoked to see more data on. I have 3 monitors and a few follow-up questions related to this subject. If I have an older high-refresh-rate monitor with G-Sync, does that hurt performance on my primary gaming monitor, since the GPU is trying to sync with more than one G-Sync module? Would it be best to disable G-Sync on that monitor? If I set my iGPU to run other applications, like Chrome's hardware acceleration, Discord, etc., how does that affect things? How about if I have extra monitors set to a higher refresh rate? I have a 60Hz TV, a 144Hz side monitor, and a 175Hz primary monitor for gaming. Is it best practice to set the monitors I am not gaming on to 60Hz? Does it not affect it? How about color depth differences, mixing HDR and non-HDR, 10-bit color, etc.? I am very excited for more lab data.
It's kind of disappointing, but the answer is that it all depends on your system health and the CPU you are using. The GPU doesn't really care if it is rendering to 1, 2, 3, or 10 monitors. Most of the work in rendering pixels is done almost instantaneously. G-Sync is just a buffer, so it is not so different performance-wise from V-Sync. You do need a good driver install, though. Every rendering queue starts with your CPU: the CPU sends your GPU the information about what is to be rendered and at what resolution, and orders which data goes where. So the answer is: get yourself a better CPU.
I'm surprised you guys didn't test how much the input lag worsens with each monitor added, or whether plugging the main monitor into the GPU for gaming and all the secondary (and so on) monitors into the integrated GPU for non-gaming tasks helps with performance. I hope you guys will make another video addressing these issues.
@@carlosgarcia1165 Nope, just a limited scope. They were trying to answer a specific question. There's no reason they couldn't do a follow-up looking at latency. However, I doubt it will add much to the story.
@@HerbaMachina Agreed. I mean, this is computers we're talking about; you can't get into everything. Half the time I get a game that won't start just because I have my controller plugged in before I start it.
It should work if you have an iGPU built into your CPU. Plugging the extra monitor into that should negate the performance hit. But I agree they should have shown this.
A high-end system was used, and while it was surprising to see how minimally this impacts performance, I would also have loved to see the effects on mid- and low-end systems. Then again, those who buy low- or mid-range PCs might not be using multiple 4K monitors, since 4K monitors cost a lot, and depending on the GPU they might have to opt for display output through the iGPU rather than the GPU. This could have been a longer video with more variation in setups, as this is an interesting topic to know more about.
It's one of those things where they could probably do an 8-hour video with all the variables. I don't know about low-end parts; it's not really worth testing, because you can probably work out the pattern from mid and high. Like, an iGPU or video card with less than 8GB of VRAM isn't really worth testing: you've got 1080p, and adding more 1080p monitors to your already stressed system is going to be slightly worse.
This would've made more sense, but at the same time this is exactly what I was worried about with LTT Labs. This might be the most useless video I've ever sat through, and I'm not joking. You're telling me the computer playing back 3 videos hits my performance slightly! Thank god I had hard data to back up the obvious lol. Also, the "twist" that a monitor not doing anything doesn't impact performance is just very odd. Duh guys, you're not drawing anything new.
No kidding. I honestly have no desire to game in 4K. It's pissing away performance for an infinitesimally minor boost in picture quality. (And I'm not just speculating. I've played the same games on a 50" 1080p vs a 70" 4K, and the extra pixels did jack for my experience. Pretty visuals were still pretty, regardless of how many millions were spent on a given screen's marketing.)
I've tested multiple monitors, and I highly recommend one monitor when gaming. The less stuff ON, the better, especially with older gfx cards. The Steam overlay cost me performance too, super noticeable on an older gfx card. If you want dual monitors, keep the 2nd monitor to slim usage, like a website/tab/app, but it will still hurt the fps, maybe under 10%??? That's just my experience 😊
In a deeper dive, what about monitors with different resolutions and/or refresh rates? I doubt most people use identical monitors for their 2nd, 3rd, and 4th.
@@TwiztedMannix87 Watch the stuff at 5:20 and you might realize it doesn't matter if you don't do anything on the 2nd screen. Also, you must have a really old card, like a 7XX or earlier from Nvidia or an AMD equivalent, to have such performance issues.
Would be cool to test response time with different monitors connected (diff. refresh rate and resolutions) to see if having something like main 165hz 1440p with a 2nd 60hz 1080p would alter performance or response time.
This is exactly what we needed the most! Even though the average frames were only slightly affected, the overall render spikes and response times do feel different with different monitor configurations. Would LOVE to see a follow up video :)
I got you. I run a 165Hz 4K panel for my main display, and a portrait-oriented 1080p at 144Hz next to it for reference and multitasking windows... The response-time difference (though not scientifically measured to the capabilities of LMG) is not noticeable.
I have this setup! And I can tell you, at least on my setup, that when playing a game I lose like 40% of my fps if the 1080p monitor is playing any video at a different refresh rate. edit: I also tried a 1440p and a 1080p at the same rates, and that doesn't make a noticeable difference in fps, for me at least; it's the difference in refresh rate that screws me over, by like 40%-ish.
@@PinHeadSuplicium wtf, I doubt VRAM will be much of an issue unless you're really close to maxing out the VRAM. You probably only add a couple hundred MB at most if you have two extra monitors open for really basic web tasks or whatever. So on an 8GB card that would be maybe 3% of the VRAM?
I can assure you the performance impact on games can be really, really strong... As my poor old PC is running a GT 640 3GB and 8GB of DDR3 RAM with an i5-3350P, I sometimes have trouble viewing a video in FHD (even though my monitors are both 60Hz 1366x768, the video is way smoother) while playing something demanding for my PC, like Genshin Impact or GTA V... But I second this idea; I would love to see benchmark results there.
I would've loved to see an exploration of how this impacts lower-end systems. A few years ago I used to run triple monitors off an RX 480 4GB, and the performance increase I got from removing one of my monitors was SIGNIFICANT. I don't remember any exact numbers, but there was a big jump in less demanding games like Minecraft, finally allowing me to use shaders, and heavier games went from noticeably stuttery to very smooth, sometimes almost doubling FPS, like in R6 Siege, where I went from low 30s to high 40s and even 50s.
Yeah I used to have to constantly unplug my 2nd and 3rd monitor to consistently hit my 144FPS. I got like a 40FPS difference I think (from memory, might be wrong).
Same. I had worse specs, and honestly I'd get fewer frames with Discord up, let alone the extra monitors. I had to turn everything off often for harder games.
Viva La Dirt League! Nice.. Thanks for finally covering this topic -- I once asked some former engineers who worked on DirectX at MS, and they didn't seem to have a high-confidence explanation of how things like v-sync and g-sync work, with multi-mon (and potentially with monitors fused together at the driver level, via Eyefinity/Surround). And how it intersects with things like the Win10 "FSO" (fullscreen-optimizations). There's a lot of confusion, and it's very hard to measure display latency, with any certainty or precision.
The whole point is that Microsoft shouldn't have to be aware of those issues. While OS-level optimizations have their place, fundamentally the OS is not responsible for that kind of technique. The drivers are not Microsoft's business. The OS delivers an anchor point with DirectX (or straight into the OS), and getting a particular proprietary technique to work is on the originating company.
Must be sooooo useful to be able to have Discord or YouTube open on another monitor... not. 90%+ of people don't need these extra monitors and do it cos everybody else does it.
What would be cool is seeing this effect on laptops, as many of us may use gaming laptops to work or complete school/college assignments and then come home and plug monitor(s) in to play with additional accessories. I also think this will vary a lot based on the GPU, let's say a 3060 or a 4060 or whatever other GPUs are currently in use by the gaming community.
I am wondering if having my laptop's dedicated graphics card directly display the game on my main display, while the integrated graphics run voice chat interfaces, will hurt my performance (not taking into account the CPU hit, since I could also run voice in the background).
Actually, plugging your laptop into an external monitor will in 90% of cases boost your performance (if you have a dedicated GPU, that is). That's because your internal display routes the dedicated graphics card's output through the integrated one when you need more performance (say, gaming). By doing this your laptop saves power: it uses the iGPU when light tasks are being done and switches to the dGPU only when needed. However, this rerouting (known as Nvidia Optimus) causes latency and performance loss. The dedicated GPU, in most cases, is directly connected to your HDMI/DP, so connecting an external monitor would actually increase performance and reduce latency. This is for laptops with dGPUs ONLY, and not in all cases.
@@HowToArter This was the case on laptops before 2021. Now they come with MUX switches, so you don't need another monitor to gain extra performance; the MUX switch can just disable the iGPU.
@@Alias_Anybody Mainly gaming laptops, but some work laptops can have a MUX switch too. There's usually no point, though, because they mostly get moved from place to place, so having Optimus is necessary to keep their battery life if they do have a dGPU.
Yes, 2. I have used 2 monitors for the last 3 years, and just the possibility of having messages/youtube on my other screen while I'm playing something is so useful.
It's not the same, but if Task Manager is correct, then videos on my second monitor are played through the iGPU while gaming on my dedicated GPU, and I still lose quite a lot of performance with 1080p videos on my laptop, which is unfortunate.
I actually was wondering about that lately. Benchmarks were basically the same with the iGPU enabled or disabled, so I thought, why not put the secondary monitor and video decoding on the iGPU? However, I quickly ran into issues. Firstly, I couldn't get the iGPU to do 100% of the work (the impact of a 1080p video on the second monitor was at best halved; there was always a few % of GPU usage no matter what). More importantly, while running benchmarks I noticed that my CPU performance plummeted by 15-20% in short tests (Ryzen 5600G). After more testing, I noticed that the power usage of the CPU was lower when the iGPU was doing something, at least at the beginning of the benchmark, slowly ramping to normal usage after ~20s. I'm not sure how that plays out in real-world use, but short spikes of CPU usage are pretty frequent, so I'm not taking that risk and disabled my iGPU. Not sure how a different CPU would act.
By default, Windows will use your dedicated graphics for graphics acceleration/encode/decode; it doesn't matter which physical connection the monitor is connected to. What you can do is go to Windows Settings -> Display -> Graphics. You can then set which graphics card you want a certain application to use. This will gain back a small amount of performance if you like to game on one monitor and watch video on the second.
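For anyone who wants to script that instead of clicking through Settings: as far as I can tell, that page just writes per-app string values under a per-user registry key, so a minimal sketch like the one below should do the same thing. The app path is a made-up example, and the key name is my assumption of what Settings writes.

```python
# Sketch only: set a per-app GPU preference the way Windows Settings ->
# Display -> Graphics appears to. "GpuPreference=1;" = power saving
# (typically the iGPU), "GpuPreference=2;" = high performance (dGPU).
import winreg

APP_PATH = r"C:\Tools\SomeVideoPlayer\player.exe"  # hypothetical app

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
winreg.SetValueEx(key, APP_PATH, 0, winreg.REG_SZ, "GpuPreference=1;")
winreg.CloseKey(key)
```

The app usually has to be restarted before the new preference takes effect.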
Even if I lost upwards of 5% of my game's performance, there's no way I'm giving up my triple monitor setup. I'll sooner add a fourth screen, a big 4K TV for media viewing, than I will discard a single monitor! >:U But, it's fun to know what the difference may be anyways, inconsequential as it may be.
I have 9. Search for ADT-LINK PCI to M.2 adapters; I have two of those with AMD cards and run 8 monitors that way. It takes the load off the CPU thanks to the two AMD cards and the chipset. I am using AMD cards because it is very hard to mix Nvidia cards of different generations ;)
As a long-time multi-monitor user, I have been wondering for a long time whether it really makes a big difference if you have static images on the second or third screen while gaming. I recently moved to a Ryzen 7000 system and have now connected the second and third monitors to the mainboard via the CPU, while the main monitor is still connected to my Nvidia GPU. Works perfectly, even with VMs and Citrix Workspace across different monitors. I just haven't been able to determine the difference in performance yet. Maybe something for a follow-up? Thanks for the video.
I was looking for a comment like this. I have 3 monitors right now (I have a 4th, but I need an adapter). I was curious if it'd be better to have just the 4K on the GPU and the other 2 on the onboard graphics.
This was quite unexpected; I never thought a multi-monitor setup might affect some games that much. I would also love to see this experiment on Linux with a few more variations (e.g. Proton and native titles, Wayland/Xorg, etc.). Keep up the great quality!
I had a similar problem: a 240Hz WQHD main monitor and one 60Hz 1920x1200 monitor on a Vega 64. When I use both at the same time I can't utilize 240Hz mode; it just doesn't work. Look into it. I always thought it had something to do with the 60Hz monitor, because it has only VGA and DVI (I use a DVI-to-HDMI adapter). Edit: I forgot to mention that the 240Hz monitor can only run 144Hz WQHD when both monitors are connected.
I'd rather have a little lower performance than not have 3 monitors. That being said, I wonder if I could just add another small graphics card to handle the other 2 monitors.
@@adsads196 This. I use my IGPU for my secondary monitor and configured my browser to always use the IGPU to get the decoding / 3D load away from the main GPU. Works great :)
I used to run a cheap GT 610 alongside a GTX 660, each driving half of the monitors in a quad-monitor setup. That was on a HEDT platform, so it had enough PCIe lanes to keep both cards at x16. A bit later I used NVIDIA Surround on the GTX 660 for 3 of the monitors and the GT 610 for the extra monitor. Tbh the difference wasn't that big, so at some point I stole the GT 610 for a different system and ran all 4 on the GTX 660. Nowadays I'm still running 4 monitors on just a single GTX 1080, though I don't use NVIDIA Surround anymore because of the inconvenience of 3 monitors showing up as one.
Couldn't you also use the secondary monitor with the integrated graphics in the CPU by hooking it up to the motherboard output? That would take load off the GPU, though I'm curious how much load if any it would put on the CPU. Could be helpful for addressing bottlenecks.
Question. Is there a reason Labs didn't run the same tests with secondary monitors connected to CPU integrated graphics? Since current gen CPUs from both teams now have integrated graphics (basically across the board), I feel like this might have less of an impact on FPS as (especially at higher resolutions) more load is on the GPU vs CPU.
The igpu "steal" memory from the ram you have and use as vram. So it's better to use the dedicated GPU to run the second monitor. When i connected the second monitor to my motherboard (using the igpu) my ram go from 16gb to 12gb.
I feel like it's important to clarify that it's not because you have more monitors plugged in, but what you are doing on those monitors: having a windowed game along with a video playing on a single ultrawide will have the same effect. If your GPU is slow/old enough, you will notice a performance impact just from having multiple monitors attached, due to limited bandwidth. When I want to be immersed in a game, I just turn my 3 side monitors off.
Glad someone said this. This just seems like a case of "if you do more with your computer, it's more of a performance hit", and not actually anything to do with the number of screens!
@@on99kfc Even if they were, when the window is not visible, modern browsers and up-to-date video drivers generally stop rendering the video portion and only play the audio. You can test this yourself by opening a performance monitoring program, playing back a video, and then completely blocking the video with a static application like Notepad. On a slow enough system, I have personally witnessed the video player canvas being blank after it has been covered for a while, then showing again after a couple of seconds.
@@vintagemotorsalways1676 Windows will continue to render the video in the background. Just play a UA-cam video and open Task Manager to see the GPU video decoding utilization. I tested this with 1 to 3 videos playing: video decoding utilization went up to 10%. I opened Discord and made the Discord window full screen, and GPU utilization didn't go down at all. The video is played regardless of whether it's being viewed, because the audio is playing. The decoding of the video is handled by the GPU, so even when playing the video in the background with sound only, the video is still being decoded by the GPU.
@@on99kfc I did; that's why I said it. It is also dependent on your browser and what enhancements you have installed/enabled. I believe you, but it doesn't make my experience incorrect.
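On the Task Manager test a couple of comments up: if you'd rather log the decode engine than eyeball it, the same data is exposed through Windows performance counters. A rough sketch using the built-in typeperf tool; the partial wildcard in the counter path is an assumption and may need to be widened to (*) on some systems.

```python
# Sample the GPU video-decode engine utilization a few times via the
# built-in typeperf tool (Windows 10+ counter names assumed).
import subprocess

subprocess.run([
    "typeperf",
    r"\GPU Engine(*engtype_VideoDecode)\Utilization Percentage",
    "-sc", "5",  # take 5 one-second samples
])
```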
VRAM usage will also increase as you add monitors. Also, it's a bit of a niche case, but I noticed that running two displays on different GPUs (e.g. on a laptop, the built-in display running on the iGPU and the second one running on the discrete GPU) will also increase driver overhead.
I have a bad random laptop and run an Acer monitor from it. I run into some issues, possibly caused by this. But the laptop is bad and needs to be cleaned, dust-wise, too. Everything works fine most of the time, just not for gaming, and it has sometimes been turning itself off recently (though this is likely caused by some other issues I've not yet fixed, haha).
Running two monitors on different cards, both Nvidia but different generations; previously a single GPU drove both monitors. Even with essentially two drivers for the different generations, performance is better with two GPUs, one driving each monitor. As to why: so I can game without a performance hit and watch stuff on the other monitor, and Nvidia Iray doesn't care whether the GPUs, cards, generations, or memory match; as long as they're Nvidia, it will use them to render.
A year ago I had a 5 monitor setup on a Radeon RX580. They were all 1080p and arranged in two rows - the top row having two monitors and the bottom having three. At IDLE with NOTHING open on Windows 11, it was consuming ~3GB of VRAM. Because of this arrangement, I had a virtual resolution of 5760x2160 with dead space on either side of the top row of monitors. I’m not sure if that contributed to the VRAM usage or not, but I did want to second your statement - that more monitors = more VRAM usage.
One thing you didn’t mention is multi device setups. I have 3 laptops, none of which is particularly powerful, but a combination of software like Synergy, networking, and a video capture device make them feel like one quite capable multi-tasking monster. And also it’s cool that I have Linux, Windows, and MacOS all working together and each typically does what it’s best at. The MacBook Air records, streams, and edits, the HP has midrange AMD graphics and a ton of RAM, and the touchscreen goes between being a streamdeck-like launcher and a dedicated SSH terminal. Plus it also runs servers on the local network. Maybe it’s niche, but it seems like people should know it is completely viable to have a streaming / recording setup where one machine is the multi tasker running most of the monitors and handling the stream, while another machine just handles gaming. This is even nice that you can have a separate keyboard and mouse for say like chat without leaving the game and potentially bugging out the cursor
A look at what happens if the monitors are on independent GPUs would be interesting, especially in terms of what happens if you use an iGPU for a secondary display. For my part, doing the gaming VM in a NAS thing it’s been an easy path to enabling multiple VMs when I actually want a second seat.
I too want to see this. Intel back to 3rd gen has 3 monitor outputs; 12th gen (maybe 11th too?) has 4 monitor outputs. Also, a second card with multiple outputs is $15-20: NVS 510, FirePro W4100 or W5100, etc.
I'd love this, I've got a spare 1050 hanging out in my computer from my old build and would love to see if running my second monitor off it would help. My CPU doesn't have integrated graphics so that isn't an option unfortunately.
I put 2x 1080 Ti in my gaming PC but had one going to my monitor and one going to my TV. It worked shockingly well, but I couldn't control the RGB on the second one, and using one heated up the other because they were literally stacked on top of one another
This was my original plan for my 12600k and 4070ti build, but I only found out after buying that the BIOS on the Z690 platform doesn't allow you to enable the iGPU if you have a dedicated GPU enabled. It's one or the other. Which is pretty stupid. I check the BIOS updates occasionally to see if they've changed that, but haven't seen it yet.
Love your recent tests, LMG, though I find the conclusion here a bit incomplete. It would be nice to see in more depth how extra monitors affect overall system delays like input lag and render consistency. Personally, I've always noticed that something was off whenever using the second monitor; even though the frames are very similar, the feeling is way different. Examples of tests:
- Timings + FPS consistency with only 1 monitor connected to the GPU
- Timings + FPS consistency with 2 monitors connected to the GPU
- Timings with 1 monitor connected to the GPU and 1 to integrated graphics
- Adding more monitors to the mix: 1, 2, 3, 4 with the same specs vs. different specs
Yeah, I was also disappointed that it wasn't tested. Not sure if it's placebo, but I did "notice" it when I upgraded to my 1080 Ti and couldn't use my 2nd monitor cuz no DVI port.
This is more of a Gamers Nexus type analysis and not the sort of "tip of the iceberg" that LMG does. If you haven't noticed, the primary point of the video is to sell the monitors linked in the description below.
@@cjbtlr Not sure if you've noticed, but LMG Labs is partially up and running. If they find a way of automating the testing method, they can gather a ton of data about latencies and stuttering.
100% agree on this. "2 screens on a tower PC" and "2 screens on a laptop (which means one integrated screen and one external screen)" is a completely different deal.
Is it? More monitors means more system resources used, which will lower your game performance, same as running multiple programs while you game. This has been known forever, and frankly shouldn't be an issue unless you're running a very under-powered machine. Overdriven monitors causing unintended clock behavior and affecting power consumption is a much bigger rabbit hole.
Note: this performance loss can be entirely circumvented by having either a 2nd GPU or the iGPU handle the 2nd display. It's as easy as plugging the display into the other GPU, and then anything on that display should run on the 2nd GPU; if not, you can always force it in Windows settings. It is a bit overkill to have a 2nd GPU for like 2 fps average, but if you already have an iGPU, why not?
That doesn't work; the workload is still taken by the primary GPU, it's just shunted through the second's connector. Even with SLI this is how it was done. It's been quite a few years since I ran a multi-GPU setup, but even after the death of SLI, the best you could ever get the second GPU to do was PhysX workloads.
@@stoney202 I'm sure that depends on the config, though? If you set your primary GPU to X, I think it makes sense for the workload to go through the primary; and if you don't, I don't know why they'd assume you want it to go through a second GPU automatically.
Curious that you never mentioned refresh rates; they made a huge difference in frame timing for me. With a secondary 60Hz monitor I had lots of stutters on my 144Hz main monitor, but once I switched to a 144Hz 2nd monitor those issues went away.
I swear this has been something that's been hitting me for years whenever I tried to run games at high fps. Surprisingly little coverage of this issue, since a lot of people use old cheap monitors for the second display that can't match the high refresh rate of their main monitor.
I use a 240Hz S2522HG and a 70Hz monitor (an old 1080p 60Hz BenQ with a 10Hz OC, no FreeSync or G-Sync) and see no difference in stutters on my main monitor. In tests where I was troubleshooting crashes and stuttering in a game, I completely unplugged the 2nd monitor and removed any leftover Windows drivers for it. I have an i3-12100F, so no iGPU; both monitors are connected to one GPU.
Forgot to add that I managed to solve the stutter/crash and tested again for curiosity's sake to see how much performance I was losing: it was negligible, no stutters. PCs are wonderful, but sometimes the random issues people can get with a combination of hardware, software, drivers, and monitors can make you pull your hair out.
@@Xio_XD Now that makes me wonder if I was running my second monitor with the 10Hz OC applied at the time I was troubleshooting. I'll have to do some tests to see if running at 60 or 70 has stutter problems, but I've not seen any in CSGO/BattleBit; all smooth for now running 240/70.
@@Xio_XD Does the little variation have an effect on this? For example, my main is at 239.964 and my other is at 119.982. I just lowered it from 143.981, but its max is 239.757. Would it be best to keep the second one at 120 or increase it to match the main 240? I only lowered it for power consumption.
Would be interesting to see another video like this involving a similar multi-monitor issue, that being the fact that having multiple monitors with _differing_ refresh rates can sometimes cause inconsistent frame timings or even actual refresh rate drops with a faster monitor trying to match a slower one (ex: 144Hz briefly dipping down to 60Hz) While I've never experienced the refresh rate drops myself, my frame timings have always felt inconsistent (even with G-sync) and the only way I can get a valid result on the UFO test is by setting my 144Hz monitor down to 60Hz to match my secondary monitor (the test always stutters or fails to sync at 144Hz). So I'm very curious to know if the labs team could produce any helpful information about all of this
I myself use a 144Hz 1440p monitor for gaming and a second 1080p 60Hz one for video while playing; I've had zero issues with frame drops on my AMD GPU for like 6 years.
I run a 240Hz HDR monitor and a 60Hz for my secondary. The only real solution I use is Borderless Windowed mode, and it seems to help out a ton, though I do sacrifice a few frames to make it work like that. Ideally all the monitors should be the same spec.
That's why it's highly recommended to use multiples of the same display for a multi-monitor setup. You get the same refresh rate, same resolution, size, and no weird behaviour when moving windows from one display to another (or if it's on multiple monitors!).
I have the opposite problem. Strangely and occasionally, my secondary monitor (60Hz) will drop to something like 10Hz on startup. Turning it off and on again does the trick. It might actually just be a problem with the monitor itself, because I seem to remember that unplugging it from and replugging it back into the GPU does not fix it.
Honestly, the last point hits the hardest for me. I used to have videos running on a second monitor, or an active chat or whatever, but somewhere along the line I remembered how much more enjoyable the gaming experience is when you actually focus on it.
During Covid I started plugging my laptop into an external monitor. Having a Zoom call open on the laptop display and using the external monitor for everything else was such a liberating feeling. It definitely increases my productivity and makes some of the things I do on my computer easier. When I game, I run the game on the laptop display because it's superior to my external monitor in terms of resolution, refresh rate, etc., and have Discord, Spotify, HWiNFO, and other apps going on the external one. Really makes for a nice setup. At least with a laptop, I couldn't see myself getting rid of the external monitor for my home base. With a desktop I'd be more inclined to go with a single monitor, so long as it was sufficiently large.
There's no reason why it would affect latency; any sort of latency would be seen as a drop in FPS, not in how fast those frames get to the screen. Using more screens and having pixels move on them obviously uses GPU power; I honestly don't know why this video made it seem so shocking or unexpected.
@@blaness13 For a long time, people said it was no issue. No downsides. 1:1 performance. Even today, you can see that this wasn't true when you search back to days past and look at the driver issues Nvidia used to (and, in my experience, still in some cases do) have with mismatched refresh rates for example.
@@OGPatriot03 I guess it depends on what you consider a real downside. If they dropped FPS just by being plugged in and turned on, then sure, that's a downside, but I don't consider losing 1 or 2 frames for active content on them a downside when you can just stop the content. It's a common-sense trade-off, like getting a higher-res monitor.
A video about the best vertical monitors would be awesome. The viewing angle on the 27-inch I flipped vertically isn't the best, but I'm not sure what to look into for a replacement 🎉
Sounds like your 27" is a TN, so the viewing angles are asymmetrical. A good IPS screen should do you for 178 degrees in both directions (meaning it'd be totally fine for vertical flipping), rather than TN's typical 170/160 with horrific off-axis colour shifts.
You ALWAYS always always always (seriously, always) wanna use a curved monitor for portrait mode, even if you don't like them in landscape as a main monitor. I swear it's like curved monitors were actually created to solve this exact problem.
Genuine question: What if you’re mirroring your primary display, like say for example sending video to a capture card for a 2 PC gaming/streaming setup?
I feel it would have been interesting to see what running secondary displays on a second GPU does to performance, for example using integrated graphics for the secondary displays and your GPU for the single primary display.
I run a second video card connected to my second display to watch streams while gaming, and it removes a massive amount of in-game stuttering. No need to even run any benchmarks; it's so painfully obvious. If you do this, you have to make sure to set your secondary display as your main display in Windows, or else the gaming GPU will still be decoding video.
That's what I used to do. I could notice the difference in performance and would highly recommend it. It's one of the reasons why I think you shouldn't buy an F-series Intel CPU.
How can I dedicate the integrated GPU (on the CPU) to handle tasks like that, please? (edit .. wait lol .. is it just as simple as plugging the other display into the mobo port for the IGPU .. or is there more to it than that .. settings, optimizations, etc.?)
Back in the days of the CRT I had three of these heavyweights in my setup. If I was going to do some serious gaming I would always turn two off because it made a very big difference in performance back then. I am a little surprised the performance hit is much smaller these days. Good to know.
@@camerontechstuffs I actually had to use kitchen countertops to DIY myself a corner desk for my retro cave, because a 19" CRT monitor made a crappy IKEA cardboard desk make some *very* unhealthy noises... didn't want it to go right through the damn thing (and that was just ONE monitor).
I'm using one monitor straight from the graphics card and the 2nd monitor from the motherboard (yes, Intel integrated graphics). From what I can tell, the 2nd monitor still uses the graphics card and does affect performance a little bit. It'll be interesting if the Labs team can follow up this video with this configuration.
I didn’t do any scientific testing, but when I upgraded from a GTX 770 to a 1080, I put both cards in for a while to run each monitor, and there was a noticeable lag when moving a window across the boundary between displays that went away when I took out the 770 and ran both off the 1080.
@@sja2095 Considering that 32GB is starting to become the norm (heck, I have 128GB), a few megs being used by your iGPU doesn't even matter. Plus, considering that a non-F CPU still has the same heat output as an F CPU, it doesn't matter. It's not like the iGPU is costing you 100W.
@@sja2095 Some intel integrated graphics actually used to be hidden on the northbridge chip. I don't think they've done that in a while, but it is worth noting.
I don't know how it works on Windows, but on Linux, when you have a dedicated GPU but also some screens connected to the iGPU, it works kind of like a multi-GPU laptop: you can pick which programs run on which GPU, and if the display GPU is different from the render GPU, it copies the window contents over to the other GPU (which obviously has a small performance impact, but a negligible one for me).
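For reference, that per-program GPU choice on Linux is usually done with the PRIME render offload environment variables. A minimal sketch, assuming glxgears is installed as the example program and using the standard Mesa and NVIDIA offload switches:

```python
# Launch a program with PRIME render offload so it renders on the dGPU
# while the desktop stays on the iGPU (sketch; glxgears is just an example).
import os
import subprocess

env = os.environ.copy()
env["DRI_PRIME"] = "1"                       # Mesa (AMD/Intel) offload
env["__NV_PRIME_RENDER_OFFLOAD"] = "1"       # NVIDIA offload
env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"  # pick the NVIDIA GLX vendor

subprocess.run(["glxgears"], env=env)
```

Only the variables matching your driver stack are needed; the others are ignored.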
Could you do a video on running multiple displays at different framerates? This should be quite common, with many people buying a new high-refresh-rate monitor for gaming and using their old 60Hz as a secondary. Apparently Windows really does not like this. In my case, plugging both monitors into the GPU completely kills G-Sync, and there are massive framerate drops on the main monitor whenever something is moving on the secondary. My only fix for this was plugging the secondary 60Hz monitor into the mainboard to run on the iGPU.
I sometimes use my old 60Hz display as a secondary to my 165Hz main, and I don't have these issues, plugged into the same GPU and all. Performance drops and/or stutter when video is playing on the 2nd, sure; it depends on how heavy the game is, since I'm using an old GTX 1060. But VRR on the main display works fine in this scenario, and I have no massive FPS drops. However, my 2nd monitor is plugged in via DVI while the main is on DisplayPort, so this could be the reason why.
I suspect this is one of those issues that's been known about for years and companies will shuttle the blame back and forth between each other. I've absolutely seen this issue before - a friend's computer which otherwise ran flawlessly would hitch down to ~20 fps in 3d games, if his (older) secondary monitor was plugged in. On the other hand, I've built and used plenty of setups with multiple monitors *without* Windows having a problem, including using g-sync with both monitors plugged into the GPU. It just seems to break on certain hardware configurations, especially with older GPUs / monitors.
I have a problem with different scaling; it's just blurry on the 100%-scale monitor. I also have a projector on an HDMI switch: one input is the PC and another is an Android TV stick. If the switch is set to Android, the PC stutters and frame-skips, even if I'm not using the projector. Also, if I use the projector while the PC is turned off, the fans will glow dimly.
Most people nowadays use 2 monitors, the second monitor being a cheap 60Hz one. Running 2 monitors at different refresh rates creates major stutters and jerky motion when both monitors are simultaneously displaying motion. Now, I heard a Windows update partially fixed the problem. You could make a video about this issue; it would be very interesting to see the results.
Would be nice if he talked about having multiple monitors where, for example in the case of 2 monitors, the main gaming monitor is plugged into the GPU and the second monitor is plugged into the motherboard. I'd be curious to see the comparison between 2 monitors on a GPU and 1 on each.
Should also test mixed refresh rates, turning hardware acceleration off/on in various pieces of software, exclusive fullscreen vs borderless/windowed, and mixed HDR. This headache has gotten better in recent years, but especially from Vista to Win10, DWM with mixed refresh rates and windowed-mode games would seemingly cause the game to stutter down to the lowest refresh rate consistently. EDIT: Also, I do think layout can mess with it as well, especially portrait next to landscape, which it does appear y'all tested. I think the "virtual canvas" has to be even larger because of monitor layout.
@@bdubs85 I had the same issue wanting to run my 1440p at 165Hz with another 1440p at 60Hz; dropping the 165Hz down to 120Hz meant no more stutters/hitches/tears. More recently I OC'd the 60Hz panel to 72Hz, and 144Hz is fine on the main. With the 165/72 combo it becomes way less noticeable than 165/60, but it's still present.
@@ericmullen14 So, in theory, as long as your secondary monitor runs at a refresh rate that divides evenly into your main screen's, in your case 144/72, it should run best, rather than using more mismatched values. Still, which OS do you have?
YES. A high-refresh monitor + 60Hz second monitor is a super common setup and the source of so many issues. I'm running 144Hz + 60Hz and was getting around 30ms of lag every couple of seconds when Discord was open on my second monitor. At one point my main screen would be capped to 60Hz when watching a video on the second monitor. A different 60Hz monitor amplified this problem so much more: having Discord running on it would make it (Discord) super laggy and almost unusable, but only when the main monitor was set to 144Hz.
@@Sebaex Windows 10. If it's half, at least, the initial logic for trying it was that frame pacing would be equal between the two: the slower one doubles frames, but the frame pacing stays equal.
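That even-division theory is easy to sanity-check on paper; a tiny sketch (my own, with example rates, not anything from the video):

```python
# Does the secondary refresh rate divide evenly into the main one?
# If the ratio is a whole number, every slow-monitor refresh lines up
# with a fast-monitor refresh, so frame pacing should stay even.
def rates_align(main_hz: float, secondary_hz: float, tol: float = 0.01) -> bool:
    ratio = main_hz / secondary_hz
    return abs(ratio - round(ratio)) < tol

print(rates_align(144, 72))  # True: 2:1, evenly paced
print(rates_align(165, 60))  # False: 2.75:1, mismatched pacing
```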
If you feel you need the extra performance in certain games or whatever else just temporarily turn off the second monitor and turn it on when you need it again.
This was a good video. A while back I tried to determine through benchmarks whether my 2nd screen was hindering my performance; as far as I could tell, it wasn't having an effect. But my 2nd screen always had idle apps on screen, and now I better understand why that was the case. For me, the extra utility is worth the minute performance hit. I appreciate what you guys and the other tech channels do, so I don't have to figure everything out for myself.
It'd be interesting to know if there's a performance loss when running a single BIG monitor with both the game (in borderless window mode) and video streams showing concurrently, since the latter seems to affect performance the most.
@@Masterrunescapeer How? All of the tests had three videos playing, and having them minimized instead of full screen returned identical results. How could an FPS disparity between one monitor with three videos playing minimized and four monitors with three videos minimized be because of the videos and not the monitors?
@@cheesefriesandgangsigns The answer to this is simple: the testing methodology here is flawed. When a video is running in the background, there is no rendering happening, so there is no usage on the video card (yes, there is still decoding and playing of the audio, but the rendering isn't done). So of course there is no impact on the FPS until you start having videos visible on a monitor.
Now I'm curious about what happens when you run a 1080p screen with a 4K one, or a 60Hz monitor with a 144Hz one, or a mix of everything! Thanks for making me question a point I never thought of before!
Works OK for me. I run 2 4K60 TVs and 1 4K120 TV for Truck Simulator. I also run a 4th display for PC stats at 1080p60. If I'm playing a game on only my main screen, I will usually have Twitch, OBS, and a webpage or two running on the other monitors. I can confirm there is some performance penalty, but not much, and that was even on a 6600XT (I recently got a 6950XT for more VRAM and frames). Just do a little research before mixing and matching, though; I hear Nvidia cards can be pickier with multi-monitor setups.
It works great for me since I switched to Win11 (and did all the other updates for drivers, etc.). I'm running one 32" 144Hz 1440p with HDR and a second 27" at 1080p 60Hz without HDR on an RTX 3070. Good performance in MSFS, DCS, and UE4 games.
I run one 1440p 165Hz display as primary and one 1080p60 as secondary on an RTX 3070; never had a single hiccup with them. The primary is on DP and the secondary on HDMI.
Thanks for making this video and clearing things up and quantifying the effects. My personal take from this is that having a browser or chat window open on the 2nd screen doesn't have that huge of an impact. 👍
Sure, but a second monitor costs money and takes up space. I can do all my browsing and Discord on my phone while gaming, and it's way more space-efficient.
The worst thing is that both AMD and NVIDIA are seemingly unable to fix the high-power-draw bug for multi-monitor setups! My 2070S draws 50-60W while my two monitors are connected. Sometimes there are tweaks and workarounds to get power draw back to 15-20W, but that doesn't work for everyone or in every case. It's insane how both of these huge companies have not solved these issues for years(!) now.
While in college I switched to three monitors, with my main monitor in landscape in the center and two monitors in vertical orientation on the sides. Worked great for having course guidelines on one screen, my reference material on the left, and my active document in the middle. Super cool getting to see more and more data coming out of the Labs' testing. I'll definitely be turning the side two off when jumping into a game; I never really thought about this and am curious how it will do for me.
As a multiple-monitor guy, I can tell you that whenever I boot up a game, I press Win+P and select "PC screen only" (running two 2K 27" monitors). I do that mostly because the other one can be distracting, but I also do it for the performance. I thought it would be more demanding, but your experiment really brought me some clarity. Thanks!
Interesting video, but I think you should have tested a range of graphics cards from both AMD and NVIDIA - we might find something interesting to recommend one brand over the other for multi monitor users similar to how Hardware Unboxed uncovered the driver overhead discrepancy between the two
Weirdly enough, I've heard more complaints about Nvidia drivers not playing along with the DWM of Windows 10/11 than about AMD. In my opinion, these kinds of which-one's-better comparisons are basically useless (including the driver overhead thing), but if you really care, there you go.
Yeah, especially with Labs now seemingly pretty operational, it would be nice to start seeing LMG cover more hardware scenarios in these types of tests.
Not only that, but there will be VRAM usage overhead, which is about 8MB per 1080p frame buffer; 4K is going to be 4x that per display. I'm going to assume it's double or triple that when V-Sync is enabled, as it keeps 2 frames buffered at any given moment.
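The ~8MB figure checks out with simple back-of-the-envelope math; a quick sketch, assuming a 32-bit (4 bytes per pixel) buffer:

```python
# MiB for a single 32-bit frame buffer at a given resolution.
# Double or triple buffering multiplies this per display.
def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 2**20

print(f"1080p: {framebuffer_mib(1920, 1080):.1f} MiB")  # ~7.9 MiB
print(f"4K:    {framebuffer_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB (4x 1080p)
```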
I mean, it would be difficult considering how different the desktop environments and display servers or protocols (X11 or Wayland) can be, but would you test whether similar results can be found on Linux distros as well? Maybe try the most mainstream DEs, such as GNOME or KDE Plasma.
Honestly, I feel like two monitors are necessary nowadays, especially when games don't let you dashboard out to, like, Discord or something without breaking the game to the point where you need to restart it.
As a software engineer I find it very hard to be productive on a single display. I have a 49” ultra wide and two 27” vertical monitors. I’d be curious to see if this also happens on Linux both with X11 and wayland since I don’t use windows.
I'm pretty sure this happens in both X11 and Wayland. Also, I've worked with 2 monitors vs a single monitor, and for me productivity is the same: since I use a window manager, I can just switch to another workspace with a single keypress, which is often faster than turning to look at the other monitor and dragging my mouse to it.
@@roccociccone597 Hmm, it's never been a problem for me, but I see what you mean. Maybe you could configure it with a fade effect, if your WM allows, to sort of "smooth" the transition.
@@VictorRodriguez-zp2do I use Hyprland, so yes, I already have it. But multiple monitors are just nicer, especially when you're reading docs and have to copy-paste something.
First of all, nice video! I would, however, have loved it if more in-depth talking points were discussed, such as the effect of variable refresh rates, pixel-count differences from using different monitors, or the impact of G-Sync and FreeSync, just to name a couple. You have the lab and the equipment; what's holding you back?
The worst problem I had with a multi-monitor setup came from having different refresh rates on the two of them. My main display is 144Hz, the secondary 60Hz. It turns out that when the GPU is used on both of them as shown in this video, the overall refresh rate drops to the lowest one, so my 144Hz display was only getting 60Hz. This, combined with Wallpaper Engine, held me locked at 60Hz for longer than I'm willing to admit. From what I've read, this issue only affects Nvidia GPUs and AMD resolved it years ago, but correct me if something has changed.
I can't fully confirm, since I only own a 75Hz main monitor alongside a few 60Hz ones, but I recall running an online framerate checker before, which reported the proper 75Hz and 60Hz respectively. I did have to change my monitor's resolution from "1080p Recommended" to "1080p PC" to get the higher refresh rate, tho; for some reason it would cap at 60Hz on the other 1080p setting.
I use 4K 120Hz + 1080p 60Hz on an RTX 3080. Works fine; the 4K is an LG C2 and it has an input signal information overlay. Whatever I do on the second monitor, the overlay reports a 4K 120Hz signal without any changes.
Correct me if I'm wrong, but is there a way to set certain apps to use the iGPU instead of the discrete graphics if you have it? At that rate I would set everything that isn't a game, or doesn't need serious power, to run on the iGPU. Give it some use instead of it sitting there ready for the inevitable troubleshooting.
I'd be curious to see if there is any performance difference with multi-monitor and streaming if you use a dedicated encoding GPU, like an Intel Arc for AV1 encode/decode, and AMD or Nvidia for playing the game. Also, what would happen if you plug the secondary monitor into the Intel GPU vs just using it for decode, and how can you set that up? It could be a relatively inexpensive way to get all that performance back.
I had my second monitor plugged into the iGPU for a while, but didn't do extensive performance testing. The problem is that whenever you drag a window that requires some GPU rendering (be it for video decoding, or something like the deskpad configurator) from one monitor to the other, the screen would blank for a second when it switched GPUs. It's possible to avoid this by setting your browser (and other applications that can cause this) to be rendered by your main GPU through Windows settings, but that of course destroys any benefit you might get.
I'm curious how multiple GPUs driving the displays would work, or whether it's a good idea to have the integrated GPU drive the second monitor. From my experience, performance still suffered a lot, but I thought that didn't make any sense, as there are 2 GPUs doing different work.
The compositor only works on a single GPU. If you connect a display to the second GPU, the first one will copy the image over to the second, which is a lot more work than just connecting the monitor directly to the first GPU.
I've found a fair bit of success plugging my secondary monitor into my integrated GPU, it's only 1080p and used for videos so it works great! Takes my GPU load off at least. There are a few caveats but pretty great overall.
The only drawback I can see is when you're in CPU-limited games: taking that power allowance away from the processing cores could have an even bigger impact on overall performance.
@@dippst In that case there's the option to plug in another, cheaper second GPU, assuming you don't hit a PCIe lane limit. But idk if that would help or not, given other overheads. I actually did that years ago with a GTX 560 and an AMD Radeon HD GPU in the same system lmao; it worked and it was jank.
This is why I plug my second monitor into my motherboard to take advantage of integrated graphics. Sure, it probably makes very little difference, but I definitely thought I FELT a difference when both monitors were outputting from my GPU.
This is what I do as well. I've got my main monitor on my GPU and my second monitor on the motherboard (iGPU). I do have my TV connected to the GPU as well, but that one is typically disabled in Windows unless I'm actually using it. This setup probably still has some performance impact, but it seems to be noticeably less than running both monitors on the GPU. Mainly, I guess it "moves" the load from the GPU to the RAM and affects the CPU temperature. It might also take some bandwidth from the CPU. My thinking is this: in games where the CPU (or RAM, I guess) is the bottleneck, having both monitors connected to the GPU is preferable. In games where the GPU is the bottleneck, moving the secondary monitor to the iGPU would be preferable. However, switching back and forth would be more hassle than it's worth, and therefore most people would benefit from utilizing the iGPU, since most games seem to be GPU-heavy. Unless there's some other drawback to utilizing the iGPU that I'm not seeing. If you're having temp issues with any particular component, that could also affect the choice.
You didn't mention the additional VRAM usage that comes with additional monitors. Even if you're not running anything on the other monitors and they're just chilling on the desktop, their additional VRAM usage can be significant if you're trying to play one of the newer VRAM-hungry games on an 8GB GPU. This is the main reason I started turning off my 2nd monitor while playing heavy games, not the FPS loss.
@anunoriginaljoke5870 It will matter in the details and also the FPS of the game. I'm using a 3070 and it sometimes exceeds my 8GB of VRAM in The Last of Us; when that happens the game looks terrible and the FPS starts stuttering.
@anunoriginaljoke5870 I'm not the author of the main comment; I just want to answer your question. It will depend on the game you play. Some are affected, some not.
@NordiXXL Same. I have an RX 6600 XT and 8 GB was not enough; it would stutter like crazy. Moved to Linux and the game runs butter smooth now on High settings. The problem is Windows is not optimized; even sitting idle it can use over 1 GB of VRAM.
Was this really the best the labs could do? What about using an igpu for secondary monitors? What about AMD and Intel GPUs? I thought the point of labs was to go above and beyond, over the top, and maybe find unexpected results along the way.
Ran into this issue a handful of years back, running a 144Hz 1080p gaming monitor as my main and two portrait 60Hz 1080p IPS monitors on either side of it (web dev / DBA) on a GTX 1080; everything seemed to be OK. Then I picked up a 4K monitor for my PlayStation, had it above my main PC gaming monitor, and figured "it's just sitting there during work hours, I should hook my PC up to it and watch videos on it during the day instead of on the portrait monitors". My poor GPU had a meltdown with the added load and was running sub-30fps just existing in that config. Nowadays, the PC is on a single 240Hz 1080p gaming monitor, with an LG DualUp hooked up to my MacBook. The PC is much happier since I'm still running that GTX 1080; it's given me a few extra years out of this system without a hit to my productivity.
Have you tried using the function that allows you to use onboard graphics to run a second/third monitor? I know Intel can, and probably the new Zen processors can too. It should be the same as running a single monitor off your graphics card, as long as your processor is powerful enough to not be running at 100% during gaming. This would be an interesting test to make sure.
This was my setup: I used to have my main display plugged into the GPU and the secondary display on the iGPU (UHD 630). The interesting part is that if you place a window on the secondary display, say a YouTube tab, the GPU and the iGPU work in tandem decoding the video. ua-cam.com/video/mi8xlCz42CM/v-deo.html is a video from TYC explaining this niche topic. My previous use case was using Intel Quick Sync to encode video through OBS, whether recording or streaming.
I did some tests, GPU encoding vs QS encoding, recording 1080p footage. The results:
GPU encode decreases FPS by around 10-13 compared to stock.
iGPU encode decreases FPS by around 7-9 compared to stock.
Streaming is lighter than recording, around a 2-3 FPS decrease across the board, and streaming plus recording only costs around 1-3 FPS more than recording alone.
I have only tested it on my own hardware, your mileage may vary. Core i5 10400, RX 580 8GB, 16GB DDR4, 1080p main and 1600x900 second monitor.
I run 3 monitors right now and I'll happily sacrifice some frames for the convenience. When working, I use one for my IDE, one for unity editor and one for a browser. When gaming I use the secondary ones for videos/music/discord and the other for hwmonitor to check on stats. Main monitor is 1440p and side ones are 1080, so I barely notice the loss in performance.
Having the editor and IDE on separate monitors for code debugging is a godsend, wouldn't change it for anything. And while coding, the main monitor can be the IDE and the secondary can be Google
I am a web developer and have a laptop and a main monitor. Usually I leave the IDE on the monitor and docs/Google on the laptop. For gaming or personal use I find another monitor distracting; when my friend lent me a monitor until I bought mine, I tried to use them both, and most of the time it was just showing the desktop, so I would turn it off when playing.
Another thing about multi-monitor setups (on Windows) is that borderless fullscreen applications can have frametime issues due to the DWM only having one compositor for all monitors. Before, I think, W10 version 2004, it used to be even worse: if there was ANY activity on your secondary monitors, the frame rate of your differing-refresh-rate monitors would suffer. These issues happen only if the refresh rates of your monitors aren't exactly the same, as far as I can tell.
I would've liked to see some comparison of whether it increases the input lag to your main monitor when gaming, or whether it increases tearing. Otherwise a good in-depth investigation!
It doesn't, assuming identical framerate in all scenarios. Obviously if you're running some intensive tasks which lower your framerate, your input latency will increase as well. But just having multiple monitors enabled doesn't cause any of that. By definition it is input lag, not output lag. The output signal is instant and synchronized across all outputs. One common issue is having monitors with different resolutions and refresh rates. Windows doesn't handle that too well. Like if you have a 120 Hz and a 60 Hz screen, Windows might limit both displays to 60 Hz. There are ways to work around this.
Only thing missing in the testing is asynchronous displays, like having that nice 4K display as main and 2K or 1080p displays as secondary. Probably wouldn't make much of a difference though. Very informative video
I'm currently running such a setup. While I don't have RDR or Cyberpunk, I am playing Diablo 4, and I'm in the process of swapping in a fourth monitor, a Costco-derivative LG UltraGear that couldn't meet its advertised DisplayPort 1.4 spec; shame on me for trying to get a good deal. I can make my RX 5700 XT chug if I'm moving around programs while YouTube or Twitch vids are playing. During the beta I did have Diablo crash my GPU driver and reset my PC; since then I haven't had a repeat.
I’ve been running 4 monitors on my corporate computer for a decade, but only one 27” monitor on my personal computer. It’s perfect for RTS and world building games, but I’ve been contemplating getting it a second monitor for popping out a full size preview for video and photo editing. My biggest concern is how to arrange my desk so I can fit two 27” monitors side by side, have enough room to see everything, and not get a sore neck from staring diagonally for hours. With my 2X2 work monitors, I’m constantly moving from one to another, which prevents strain.
I'd like to see this test done again with Hardware Acceleration disabled in whatever browser you're using, I imagine the hit to the CPU when encoding/decoding is probably more impactful on the overall experience. I apologize in advance if this was already mentioned in the video and I happened to miss it somehow.
I would really like to see them test this, but with the side monitor plugged into another (presumably older and less powerful) GPU. That's how I run my setup and I feel that it works quite well.
it would be an interesting test but may I ask why you use that setup? Is your gaming GPU not very powerful so you're trying to give it every advantage possible? I wonder if there are driver overheads etc that hurt performance having multiple GPUs
I was hoping you would expand testing a bit to show if you get any performance hit on your main GPU if you have your extra monitors on a separate one? My guess would be no hit since those tasks should now be done on the other GPU, but maybe there is a surprising answer?
If you're someone struggling with having a short attention span and you use dual monitors try switching back to using single monitor. It personally really helped me at getting back the ability to focus on just one thing for longer periods of time without constantly checking discord or some other crap every 3 minutes
yo linus why did you make this video isn't this widely known
Just use a Tablet and keep a single display - best of both worlds
Why didn't you add mismatched monitor resolutions? That has a larger impact on FPS.
We need more testing. How about using the iGPU for a 2nd monitor? What is the effect on input latency? There is so much more...
Thank you linus, I will run my other monitors in 240p now :)
open up a rgb gif on your displays for more performance
@@tombi_106 Never thought of that, thanks :)
That's what I did for years, since I lose around 20% FPS with 1080p videos on my crappy laptop.
You don't have to ignore. You can choose whatever you want. You are just now informed of those decisions. Better than not knowing, yeah?
@@Seedbro are u not a bot i looked u up on social blade and saw u on seytonics vid
As a multiple monitor guy: I’m just gonna ignore this
Same
to be honest it's fine for you probably because you have a gaming pc
Yeah I'm just gonna buy a more powerful card, I ain't alt tabbing
Exactly
Same
Absolutely true, but it usually only applies to laptops without a MUX switch for the GPUs, which aren't that common nowadays
You can always disable the laptop monitor and get the igpu benefits without losing the 2 fps from having a second one
I'll save you 10 mins: you only really lose 1-2 fps
We need more people like you O7
Thank you so much, the grandeur behind a simple question was just way too much
But spending that 10mins is the point... we all know already it's not meaningful.
Thank you good sir 🫡
Thanks
Just have a secondary laptop next to your monitor, or two laptops.
Depends on the PC. If I play Factorio on my NUC and I just open a browser on my 2nd monitor the frames drop from 60 to 45 and lower.
Having it plugged in does add input latency. Wish they had done a test to explain why
@@KroLeXz so with multiple monitors, even if you're only using the main one, the other monitors are technically on too, which means they use some resources and add latency. Is there a way to fix this without having to unplug unused monitors all the time?
@@sukitoru- hook up your secondaries to their own power bar and just flick the switch when you don't want them using resources
@@KroLeXz it's probably so minimal that you don't even notice it in 99% of games.
You can have a multi-monitor setup but you don't have to use them both all the time - is that really worth mentioning, or is it totally obvious?
I believe it would be interesting to see how much difference using iGPU or a second GPU for secondary monitors would make.
Wanted to make the same comment. What about the iGPU for monitor 2 with monitor 1 on the GPU?
Yeah, using processors with integrated graphics for such things is what I do. I have 3 monitors, and it didn't make a difference in performance
A lot. I'm using secondary GPUs or iGPUs all the time for second and third monitors. On my main it would add +10°C of heat and a performance hit.
I as well would like to see if additional graphics cards help with multi-monitor setups. In a streamer scenario it would be interesting to know if a dedicated GPU for streaming, another for gaming, and another for monitors could increase performance, or even just one for gaming and a second for stream + monitors
@@diedrichg I was using one monitor plugged into my motherboard using the integrated GPU and one plugged into my dedicated GPU for a while, and I noticed watching videos on the second monitor caused lots of stuttering; it was fixed when I plugged it into my dedicated GPU
This is the stuff I am stoked to see more data on. I have 3 monitors and a few follow-up questions related to this subject. If I have an older high-refresh-rate monitor with G-Sync, does that hurt my performance on my primary gaming monitor, since the GPU is trying to sync with more than one G-Sync module? Would it be best to disable G-Sync on that monitor? If I set my iGPU to run other applications like Chrome's hardware acceleration, Discord, etc., how does that affect things? How about if I have extra monitors set to a higher refresh rate? I have a 60Hz TV, a 144Hz side monitor, and a 175Hz primary monitor for gaming. Is it best practice to set the monitors I am not using for gaming to 60Hz? Does it not affect it? How about color depth differences, mixing HDR and non-HDR, 10-bit color, etc.? I am very excited for more lab data.
It's kind of disappointing, but the answer is it all depends on your system health and the CPU you are using. The GPU doesn't really care if it is rendering to 1, 2, 3, or 10 monitors. Most of the work in rendering pixels is done almost instantaneously. G-Sync is just a buffer, so it is not so different performance-wise from V-Sync. You do need to have a good install of drivers though.
Every rendering queue starts with your CPU. The CPU sends your GPU the information it needs about what is to be rendered, in what resolution, and orders which data goes where. So the answer is: get yourself a better CPU.
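That pipelining argument is easy to see with a toy model: per-frame CPU and GPU costs overlap, so the steady-state frame time is set by the slower stage. The numbers below are made up purely for illustration:

```python
# Toy frame-time model: CPU prepare and GPU render run pipelined,
# so throughput is governed by max(cpu_ms, gpu_ms).
def steady_state_fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

print(steady_state_fps(6.0, 8.0))  # 125.0 fps, GPU-bound
# A busy second monitor stealing ~1 ms of GPU time per frame:
print(steady_state_fps(6.0, 9.0))  # ~111.1 fps
# Extra CPU load changes nothing until the CPU becomes the slower stage:
print(steady_state_fps(7.5, 9.0))  # still ~111.1 fps
```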
@@ph0ax497this video is 9 months old
I'm surprised you guys didn't test how much the input lag worsens with each monitor added, or whether plugging the main monitor into the GPU for gaming and all the secondary monitors into the integrated GPU for non-gaming tasks helps with performance. I hope you guys will make another video addressing these issues.
they made a very easy video to earn some cash, nothing serious..
@@carlosgarcia1165 nope, just a limited scope. They were trying to answer a specific question. There's no reason they couldn't do a follow-up looking at latency. However I doubt it will add much to the story.
@@HerbaMachina agreed, I mean this is computers we're talking about, you can't get into everything. Half the time I get a game that won't start, all because I have my controller plugged in before I start it
@@HerbaMachina this topic is too broad to talk just about some simple stuff. I think they are creating more disinformation than actually helping
It should work if you have an iGPU built into your CPU. Plugging the extra monitor into that should negate the performance hit.
But I agree they should have shown this
A high-end system was used, and while it was surprising to see how minimally this impacts performance, I would also have loved to see the effects with mid and low-end systems. But those who buy low or mid-range PCs might not be using multiple 4K monitors, since 4K monitors cost a lot, and depending on the GPU they might have to opt for display through the iGPU instead of the GPU. This could have been a longer video with more variation in setups, as this is an interesting topic to know more about.
I hope they are just testing the waters
Yeah I wish they covered scenarios more people actually use, eg. secondary 1080p monitor on their low/mid end PC, using Wallpaper Engine etc.
It's one of those things where they could probably do an 8-hour video with all the variables. I don't know about low-end parts; it's not really worth testing, because you can probably work out the pattern from mid and high. Like, an iGPU or video card with less than 8GB of VRAM isn't really worth testing: you've got 1080p, and adding more 1080p monitors to your already stressed system is going to be slightly worse.
This would've made more sense but at the same time this is exactly what I was worried about for them with LTT labs. This might be the most useless video I've ever sat through and I'm not joking. You're telling me the computer playing back 3 videos hits my performance slightly! Thank god I had hard data to back up the obvious lol. Also the "twist" of a monitor not doing anything not impacting performance is just very odd. Duh guys, you're not drawing anything new.
No kidding. I honestly have no desire to game in 4K. It's pissing away performance for an infinitesimally minor boost in picture quality. (And I'm not just speculating. I've played the same games on a 50" 1080p vs a 70" 4K, and the extra pixels did jack for my experience. Pretty visuals were still pretty, regardless of how many millions were spent on a given screen's marketing.)
Would like to see a deeper dive into this kind of testing. HDMI vs DP, 60 fps refresh vs a higher one etc
Yes, and also they didn't specify if hardware acceleration was on/off (I suppose on), and whether turning it off helps or not
I've tested multiple monitors, and I highly recommend one monitor when gaming. The less stuff on, the better, especially with older graphics cards. The Steam overlay cost me performance too, super noticeable on an older card.
If you want dual monitors, keep the 2nd monitor to slim usage, like a website/tab/app, but it will still hurt the FPS, maybe under 10%?
That's just my experience 😊
In a deeper dive, what about monitors that are different resolution and/or refresh rates? I’m sure most people use identical monitors for their 2nd, 3rd, and 4th?
@WayneMcCormick they should do more monitor testing, I'm interested to see the legit results. I recommend identical set up as much as possible.
@@TwiztedMannix87 Watch the stuff at 5:20 and you might realize it doesn't matter if you don't do anything on the 2nd screen.
Also, you must have a really old card, like 7XX or prior from Nvidia or an AMD equivalent to have such performance issues.
If your computer is having performance issues from having 2 monitors, I'm pretty sure your computer has much bigger issues
Would be cool to test response time with different monitors connected (diff. refresh rate and resolutions) to see if having something like main 165hz 1440p with a 2nd 60hz 1080p would alter performance or response time.
This is exactly what we needed the most! Even though the average frames were only slightly affected, the overall render spikes and response times do feel different with different monitor configurations. Would LOVE to see a follow up video :)
I got you. I run a 165Hz 4K panel for my main display, and a portrait-oriented 1080p at 144 next to it for reference and multitasking windows... The response rate (though not scientifically measured to the capabilities of LMG) is not noticeable.
Or the effect of adding a low tier secondary graphics card. Like pairing a 3080 with a 3050 to run the extra monitor(s).
@@Skyhausmann YES YES YES!
I have this setup! And I can tell you, at least on my setup, playing a game I lose like 40% of my FPS if the 1080p monitor is playing any video at a different refresh rate.
edit: I also did try a 1440p and 1080p at the same rates, and that doesn't make a noticeable difference in FPS. For me, at least, it's the difference in refresh rate that screws me over, by like 40%-ish
Would've been nice to also include the test on lower end hardware, maybe something like a 2060 to see what the difference would be there
Yeah curious about VRAM limited cards.
I had 2 monitors on a 2060, wasn't a terrible hit at all, but it still performed better on my new 6700 XT
Indeed. Also curious about games where the CPU is the bottleneck. For example on an i3 where there are no extra cores for background tasks
@@PinHeadSuplicium wtf
I doubt VRAM will be much of an issue unless you're really close to maxing out the VRAM. You probably only add a couple hundred MB's at most if you have two extra monitors open for really basic web tasks or whatever. So on an 8GB card that would be maybe 3% of the VRAM?
I can assure you the performance hit in games can be really, really strong... My poor old PC is running a GT 640 3GB and 8GB of DDR3 RAM with an i5-3350P, and I sometimes have trouble viewing a video in FHD (even though my monitors are both 60Hz 1366x768, the video is way smoother) while playing something demanding for my PC, like Genshin Impact or GTA V... But I second this idea, would love to see benchmark results there
I would've loved to see an exploration on how this impacts lower end systems. A few years ago I used to run triple monitors off an RX 480 4GB and the performance increase I got from removing one of my monitors was SIGNIFICANT.
I don't remember any exact numbers but there was a big jump in less demanding games like Minecraft, allowing me to finally use shaders, and heavier games went from noticeably stuttery to very smooth, sometimes even almost doubling FPS, like it did in R6 Siege, where I went from low 30s to high 40s and even 50s.
VRAM usage is the most affected when you run multiple monitors.
Yeah I used to have to constantly unplug my 2nd and 3rd monitor to consistently hit my 144FPS. I got like a 40FPS difference I think (from memory, might be wrong).
You could hit Win + P and switch to "PC screen only" mode to avoid physically unplugging them.
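That menu can also be scripted: Windows ships a DisplaySwitch.exe whose /internal and /extend flags correspond to the "PC screen only" and "Extend" options. A small sketch, assuming a stock Windows install:

```python
# Toggle display modes without the Win+P menu via DisplaySwitch.exe,
# which lives in System32 and is on the PATH by default.
import subprocess

def main_display_only() -> None:
    subprocess.run(["DisplaySwitch.exe", "/internal"], check=True)

def all_displays() -> None:
    subprocess.run(["DisplaySwitch.exe", "/extend"], check=True)

if __name__ == "__main__":
    main_display_only()  # e.g. run before launching a demanding game
```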
Same, I had worse specs, and honestly I'd get fewer frames with Discord up, let alone the extra monitors. I often had to turn everything off for harder games
My thought exactly.
the twitch / kick joke at around 6:50 absolutely destroyed me, soo funny
nice!
nice!
didnt laugh
Having the extra real estate is worth more to me than a couple of FPS.
BTW - I love that you had VLDL on the lab's monitors
Give me the gold or i'll mug ya!
@@Pro_Triforcer - What are we ? - We are Mugger. - What does mugger do ? They muggle people - (both) SO ...... LET'S .... MUG HIM !!
Nice little touch adding the video title above one of the monitors too
Hey. There's a gamer over there. He's probably got a lot of FPS on him. Let's mug 'em and take all his FPS!
@@LyokoisGreat2 lol, i had not seen that, thats even better
Viva La Dirt League! Nice.. Thanks for finally covering this topic -- I once asked some former engineers who worked on DirectX at MS, and they didn't seem to have a high-confidence explanation of how things like v-sync and g-sync work, with multi-mon (and potentially with monitors fused together at the driver level, via Eyefinity/Surround). And how it intersects with things like the Win10 "FSO" (fullscreen-optimizations). There's a lot of confusion, and it's very hard to measure display latency, with any certainty or precision.
Which is why I would have expected LTT to do exactly that research in this video...
The point is that Microsoft isn't necessarily aware of those issues. Although OS-level optimizations matter to a point, the basics are that the OS is not responsible for that kind of technique. The drivers are not Microsoft's business. The OS delivers an anchor point with DirectX or straight into the OS, and getting a particular proprietary technique to work is on the originating company.
The thing is, if you have a second monitor, you don't have to use it all the time. It's just amazingly useful when you do need it.
I have 3 monitors when I'm working, but when I'm playing a game I turn off the other 2 and keep one on. Have to keep all that FPS
must be sooooo useful to be able to have discord open or youtube on another monitor.... not. 90%+ of people don't need these extra monitors and do it cos everybody else does it.
@@JamieReynolds89 As a laptop user: when you're doing nearly anything productive, they're really useful
@@JamieReynolds89 guessing you've never held a 40hr/week office job. come back when you grow up, kid
@@JamieReynolds89 super useful for anything creative
What would be cool is seeing this effect on laptops as many of us may use gaming laptops to work or complete our school/college assignments and then come home and plug monitor(s) in to play using additional accessories.
I also think that this will vary a lot based on the gpu for lets say a 3060 or a 4060 or whatever other gpus are currently in use by the gaming community.
I am wondering if my dedicated graphics card in a laptop directly displaying the game on my main display, while the integrated graphics runs voice chat interfaces, will hurt my performance. (Not taking account of the CPU hit, since I could also run voice in the background)
Actually, plugging your laptop into an external monitor will boost your performance in 90% of cases (if you have a dedicated GPU, that is). That's because your internal display routes the dedicated graphics card through the integrated one when you need more performance (say, gaming). By doing this your laptop saves power, as it uses the iGPU when light tasks are being done and switches to the dGPU only when needed. However, this rerouting (known as Nvidia Optimus) causes latency and performance loss.
The dedicated GPU, however, in most cases is directly connected to your HDMI/DP, so connecting an external monitor will actually increase performance and reduce latency.
This is for laptops with dGPUs ONLY, and not in all cases.
@@HowToArter This was the case on laptops before 2021. Now they come with MUX switches so you don't need another monitor to gain extra performance, as mux switch can just disable the igpu.
@@kaishajacob ye
Question is if this applies to all laptops with a dGPU or just the GAMING laptops.
@@Alias_Anybody Mainly gaming laptops, but some work laptops can have a MUX switch, though there's no point because they mostly get moved from place to place, so having Optimus is necessary to keep their battery life if they do have a dGPU.
As a gamer, I despair at knowing I'm sacrificing performance. But as a person that watches LTT on my second monitor, AIN'T NO WAY I'M GIVING THAT UP!
Yes. I have used 2 monitors for the last 3 years, and just the possibility of having messages/youtube on my other screen while I'm playing something is so useful
You need a 3rd monitor with LTT store open
😂I actually noticed the performance drop cos I'm using a laptop that had a 144HZ refresh rate and my external monitor is only 60HZ
Do you guys not have phones?
The performance drop just isn't large enough to outweigh the convenience of having a 2nd monitor
Out of curiosity, how does the performance change if the secondary monitors are connected to the IGPU (integrated graphics) of the CPU?
Or any second gpu for that matter.
It's not the same, but if Task Manager is correct, then videos on my second monitor are played through the iGPU while gaming on my dedicated GPU, but I still lose quite a lot of performance with 1080p videos on my laptop, which is unfortunate
I'd really love to know!
I actually was wondering that lately. Benchmarks were basically the same with the iGPU enabled or disabled, so I thought, why not put the secondary monitor and video decode on the iGPU? However, I quickly ran into issues. Firstly, I couldn't get the iGPU to do 100% of the work (the impact of a 1080p video on the second monitor was at best halved; there were always a few % of GPU usage no matter what). And more importantly, while running benchmarks I noticed that my CPU performance plummeted by 15-20% in short tests (Ryzen 5600G). After more testing, I noticed that the power usage of the CPU was lower when the iGPU was doing something, at least at the beginning of the benchmark, slowly ramping to normal usage after ~20s. I'm not sure how that plays out in real-world use, but short spikes of CPU usage are pretty frequent; I'm not taking that risk, so I disabled my iGPU. Not sure how a different CPU would act.
By default, Windows will use your dedicated graphics for graphics acceleration/encode/decode. It doesn't matter which physical connection the monitor is connected to.
What you can do is go to Windows Settings -> Display -> Graphics. You can then set which graphics card you want a certain application to use. This will gain back a small amount of performance if you like to game on one monitor and watch video on the second.
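That Settings page stores its choices in a per-user registry key, so the toggle can be scripted too. The sketch below assumes the documented Windows 10 1803+ format (HKCU\Software\Microsoft\DirectX\UserGpuPreferences, value name = full exe path, data = "GpuPreference=N;"), and the Firefox path is just a hypothetical example; verify both on your build:

```python
# Set a per-app GPU preference the same way Settings -> Display ->
# Graphics does: 1 = power saving (usually iGPU), 2 = high performance
# (usually dGPU). Windows-only; uses the stdlib winreg module.
import winreg

def set_gpu_preference(exe_path: str, preference: int) -> None:
    key = winreg.CreateKey(
        winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\DirectX\UserGpuPreferences",
    )
    with key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          f"GpuPreference={preference};")

# Hypothetical example: pin the browser to the iGPU so video decode on
# the second monitor stays off the gaming GPU.
set_gpu_preference(r"C:\Program Files\Mozilla Firefox\firefox.exe", 1)
```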
Even if I lost upwards of 5% of my game's performance, there's no way I'm giving up my triple monitor setup. I'll sooner add a fourth screen, a big 4K TV for media viewing, than I will discard a single monitor! >:U
But, it's fun to know what the difference may be anyways, inconsequential as it may be.
Yeah, suggesting people "give up" their monitor setup is completely senseless. Just turn them off when not in use.
@@leonidas14775 Yep. I can turn off a monitor if need be, and I've got GPU horsepower to spare. Multi-monitor setups are the way
I have 9. Search for PCI ADT-LINK to M.2; I have two of those with AMD cards, and 8 monitors that way. It takes the load off the CPU thanks to the two AMD cards and the chipset. I am using AMD cards because it is very hard to mix generations of Nvidia cards ;)
@carlosgarcia1165 why do you need 9 monitors 💀it doesnt even sound _that_ useful tbh
3 is the limit. At 4, you are better off with a dual-PC setup, usually with some cheap laptop.
As a long-time multi-monitor user, I have been wondering for a long time whether it really makes a big difference if you use static images on the second or third screen while gaming. I have recently been a user of a Ryzen 7000 system and have now connected the second and third monitors to the mainboard via the CPU and the main monitor is still connected to my nvidia GPU. Works perfectly, even with VMs and Citrix Workspace across different monitors. I just haven't been able to determine the difference in performance yet. Maybe something for a follow up? thanks for the video
I was looking for a comment like this. I have 3 monitors right now (Have a 4th, but I need an adapter) I was curious if it'd be better to have just the 4k on the gpu and the other 2 on the on-board
This was quite unexpected; I never thought a multi-monitor setup might affect some games that much. I would also love to see this experiment on Linux with a few more variations (e.g. Proton and native titles, Wayland/Xorg etc.)
Keep up great quality!
I had a similar problem: a 240Hz WQHD main monitor and one 60Hz 1920x1200 monitor with a Vega 64. When I use both at the same time I can't utilize the 240Hz mode, it just doesn't work. Look into it. I always thought it had something to do with the 60Hz monitor, because it only has VGA and DVI (I use a DVI-to-HDMI adapter)
Edit: I forgot to mention that the 240Hz monitor can only run 144Hz WQHD when both monitors are connected
@@denis2381 the refresh rate being locked to 144hz is a limitation of xorg I think
My main is at 144hz. But when I start games, the TV video, aka 4k UA-cam streaming does stutter for a few seconds.
@@ShimmerismYT what is that
up
1:21 HOLY CRAP THAT'S VIVA LA DIRT LEAGUE ON THE TEST SCREENS (Epic NPC Man series) 😂 I'm so frickin happy 🤣
Same
Rather have a little lower performance than not have 3 monitors. That being said, I wonder if I can just add another small graphics card that handles the other 2 monitors
You can. Best would be from the same manufacturer so you don't have to install 2 drivers
I'd have personally used my igpu for the second monitor for watching vids etc if I had one.
@@adsads196 This. I use my IGPU for my secondary monitor and configured my browser to always use the IGPU to get the decoding / 3D load away from the main GPU. Works great :)
I used to run a cheap GT 610 with a GTX 660 for half of the monitors in a quad-monitor setup. That was on a HEDT platform, so it had enough PCIe lanes to keep both cards at x16
A bit later I used NVIDIA Surround on the GTX 660 for 3 of the monitors and the GT 610 for the extra monitor. Tbh the difference wasn't that big so at some point I stole the GT 610 for a different system and ran all 4 on the GTX 660.
Nowadays I'm still running 4 monitors on just a single GTX 1080, though I don't use NVIDIA Surround anymore because of the inconvenience of 3 monitors showing up as one.
@@deano1699 that is not how things work
Couldn't you also use the secondary monitor with the integrated graphics in the CPU by hooking it up to the motherboard output? That would take load off the GPU, though I'm curious how much load if any it would put on the CPU. Could be helpful for addressing bottlenecks.
I've done that since 2015, with a 4790 and now my R5 3600, with minimal issues. A VR second screen is way harder on the system than a couple of monitors
Question. Is there a reason Labs didn't run the same tests with secondary monitors connected to CPU integrated graphics? Since current gen CPUs from both teams now have integrated graphics (basically across the board), I feel like this might have less of an impact on FPS as (especially at higher resolutions) more load is on the GPU vs CPU.
most gaming Intels don't have an integrated GPU
@@SkyblowDK only the F models
The igpu "steal" memory from the ram you have and use as vram. So it's better to use the dedicated GPU to run the second monitor.
When i connected the second monitor to my motherboard (using the igpu) my ram go from 16gb to 12gb.
@@Masterblack1991 only as much as you set in the bios, and generally by default its 128mb to 256mb, so it should be nothing to be concerned about
This is exactly what I do: the gaming screen runs on the GPU, the secondary screen off integrated graphics
I feel like its important to clarify that it's not because you have more monitors plugged in but what you are doing on those monitors - having a windowed game along with a video playing on a single ultrawide will have the same effect. If your GPU is slow/old enough you will notice a performance impact just from having multiple monitors attached due to limited bandwidth. When I want to be immersed in a game I just turn my 3 side monitors off.
I thought those 3 videos were always playing/running in all the tests? Even the 1-monitor test had the 3 videos running in the background?
Glad someone said this. This just seems like a case of "if you do more with your computer, it's more of a performance hit", and not actually anything to do with the number of screens!
@@on99kfc Even if they were, when the window is not visible modern browsers and up to date video drivers generally stop rendering the video portion and only play the audio, you can test this yourself by opening a performance monitoring program and playing back a video then completely blocking the video with a static application like notepad. When using a slow enough system I have personally witnessed the video player canvas being blank when it has been covered for a while and then it shows again after a couple of seconds.
@@vintagemotorsalways1676 Windows will continue rendering the video in the background. Just play a UA-cam video and open Task Manager to see the GPU video-decoding utilization. I tested this with 1 to 3 videos playing; video-decoding utilization went up to 10%. I opened Discord and fullscreened the Discord window, and GPU utilization didn't go down at all. The video is played regardless of whether it's viewed or not, because the audio is playing. The decoding of the video is handled by the GPU, so even when a video plays in the background with sound only, it is still being decoded by the GPU.
@@on99kfc I did - that's why I said it. It is also dependent on your browser and what enhancements you have installed/ enabled. I believe you but it doesn't make my experience incorrect.
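One way to settle that covered-window question without eyeballing Task Manager is to poll the decode engine while covering and uncovering the video. A sketch, assuming an NVIDIA card and an nvidia-smi build that supports the utilization.decoder query field:

```python
# Sample the GPU's video-decode engine once a second for 30 seconds;
# cover/uncover the playing video mid-run and watch if the load moves.
import subprocess
import time

def decoder_utilization() -> float:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.decoder",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])

for _ in range(30):
    print(f"decoder: {decoder_utilization():.0f}%")
    time.sleep(1)
```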
The vram usage will also increase as you add up monitors. Also, it's a bit of a niche case but I noticed that running two displays on different GPUs (e.g on a laptop, the built in display running on the igpu and the second one running on the discrete gpu) will also increase the driver overhead
I have a bad random laptop and running Acer monitor from that. I run into some issues, possibly caused by this. But, the laptop is bad and needs to be cleaned, dust-wise, too. But, everything works fine most of the time, just not for gaming, and it sometimes turns itself off recently (though this is likely caused by some other issues I've not yet fixed, haha).
I'm running two monitors on different cards, both Nvidia but different generations; previously it was a single GPU driving both monitors. Even with essentially two drivers, given the different generations, performance is better with the two GPUs, one driving each monitor. As to why: so I can game without a performance hit and watch stuff on the other monitor. And Nvidia Iray doesn't care whether GPUs, cards, generations or memory match; as long as they're Nvidia, it will use them to render.
A year ago I had a 5 monitor setup on a Radeon RX580. They were all 1080p and arranged in two rows - the top row having two monitors and the bottom having three. At IDLE with NOTHING open on Windows 11, it was consuming ~3GB of VRAM. Because of this arrangement, I had a virtual resolution of 5760x2160 with dead space on either side of the top row of monitors. I’m not sure if that contributed to the VRAM usage or not, but I did want to second your statement - that more monitors = more VRAM usage.
One thing you didn’t mention is multi device setups. I have 3 laptops, none of which is particularly powerful, but a combination of software like Synergy, networking, and a video capture device make them feel like one quite capable multi-tasking monster.
And also it’s cool that I have Linux, Windows, and MacOS all working together and each typically does what it’s best at. The MacBook Air records, streams, and edits, the HP has midrange AMD graphics and a ton of RAM, and the touchscreen goes between being a streamdeck-like launcher and a dedicated SSH terminal. Plus it also runs servers on the local network. Maybe it’s niche, but it seems like people should know it is completely viable to have a streaming / recording setup where one machine is the multi tasker running most of the monitors and handling the stream, while another machine just handles gaming. This is even nice that you can have a separate keyboard and mouse for say like chat without leaving the game and potentially bugging out the cursor
theres a lot he didnt mention what is your point.
A look at what happens if the monitors are on independent GPUs would be interesting, especially in terms of what happens if you use an iGPU for a secondary display. For my part, doing the gaming VM in a NAS thing it’s been an easy path to enabling multiple VMs when I actually want a second seat.
I liked the idea of arc, not for a gaming Gpu, but rather to handle the workload of second and third monitors.
I too want to see this. Intel back to third gen has 3 monitor output. 12th gen (maybe 11th too?) Has 4 monitor outputs. Also a second card with multiple outputs is $15-20. NVS510, FirePro W4100 or W5100 etc.
I'd love this, I've got a spare 1050 hanging out in my computer from my old build and would love to see if running my second monitor off it would help. My CPU doesn't have integrated graphics so that isn't an option unfortunately.
I put 2x 1080 Ti in my gaming PC but had one going to my monitor and one going to my TV. It worked shockingly well, but I couldn't control the RGB on the second one, and using one heated up the other because they were literally stacked on top of one another
This was my original plan for my 12600k and 4070ti build, but I only found out after buying that the BIOS on the Z690 platform doesn't allow you to enable the iGPU if you have a dedicated GPU enabled. It's one or the other. Which is pretty stupid. I check the BIOS updates occasionally to see if they've changed that, but haven't seen it yet.
Love your recent tests, LMG, though I find the conclusion here to be a bit incomplete. It would be nice to see more in depth how extra monitors affect the overall system delays like input lag and render consistency. Personally I've always noticed that something was off whenever using the second monitor. Even though the frames are very similar, the feeling is way different.
Examples of tests:
Timings + fps consistency when only 1 monitor connected to GPU
Timings + fps consistency when 2 monitors connected to GPU
Timings when 1 monitor is connected to GPU and 1 to integrated graphics
Try to add more monitors to the mix. 1, 2, 3, 4 with same spec vs different specs.
Yeah, I was also disappointed that it wasn't tested. Not sure if it's placebo, but I did "notice" it when I upgraded to my 1080 Ti and couldn't use my 2nd monitor cuz no DVI port.
This is more of a Gamers Nexus type analysis and not the sort of "tip of the iceberg" that LMG does.
If you haven't noticed, the primary point of the video is to sell the monitors linked in the description below.
@@cjbtlr Not sure if you've noticed, but LMG Labs is partially up and running. If they find a way of automating the testing method, they can gather a ton of data about latencies and stuttering
100% agree on this. "2 screens on a tower PC" and "2 screens on a laptop (which means one integrated screen and one external screen)" is a completely different deal.
Is it? More monitors means more system resources used, which will lower your game performance, same as running multiple programs while you game. This has been known forever, and frankly shouldn't be an issue unless you're running a very under-powered machine. Overdriven monitors causing unintended clock behavior and affecting power consumption is a much bigger rabbit hole.
Note: this performance loss can be entirely circumvented with either a 2nd GPU or an iGPU handling the 2nd display. This is as easy as plugging the display into the other GPU, and then anything on that display should run on the 2nd GPU; if not, you can always force it to in Windows settings
It is a bit overkill to have a 2nd GPU for like 2 fps average, but if you already have an iGPU, why not
How do i plug it to my igpu
@@mehmetariftasl8277 Plug it into your motherboard. But only do this for secondary displays.
@@mehmetariftasl8277 Usually the display port (hdmi, dp, vga, etc) on the motherboard instead of the gpu itself. Depends on your CPU too.
That doesn't work; the workload is still taken by the primary GPU, it's just shunted through the second's connector. Even with SLI this is how it was done. It's been quite a few years since I ran a multi-GPU setup, but even after the death of SLI, the best you could ever get the second GPU to do was PhysX workloads.
@@stoney202 I'm sure that depends on config though? I do know if you set your primary GPU to x, I think it would make sense for it to go through the primary, but if not, I don't know why they'd assume you want it to go through a second GPU automatically
Yooo, love that you guys were streaming VLDL! They’re really underrated!
curious that you never mentioned refresh rates, they made a huge difference in frametiming for me. with a secondary 60hz monitor i had lots of stutters on my 144hz main monitor, but once i switched to a 144hz 2nd monitor those issues went away.
I swear this has been something that's been hitting me for years whenever I tried to run games at high FPS. Surprisingly little coverage on this issue, since a lot of people will use old cheap monitors for the second display that cannot match the high refresh rate of their main monitor
I use a 240Hz S2522HG and a 70Hz (old 1080p 60Hz BenQ with a 10Hz OC, no FreeSync or G-Sync) and have no difference in stutters on my main monitor, from tests where I was troubleshooting crashes and stuttering in a game and completely unplugged the 2nd monitor while also removing any Windows drivers left over from it. I have an i3 12100F so no iGPU; both monitors are connected to one GPU.
Forgot to add that I managed to solve the stutter/crash and tested again for curiosity sake to see how much performance I was losing and it was negligible, no stutters. PCs are wonderful but sometimes the random issues people can get with a combination of hardware, software, drivers and monitors can make you pull your hair out.
@@Xio_XD now that makes me wonder if I was running my second monitor with the 10Hz OC applied at the time I was troubleshooting. I'll have to do some tests to see if running 60 or 70 has stutter problems, but I've not seen any in CSGO/BattleBit, all smooth for now running 240/70
@@Xio_XD does the little variation have an effect on this? For example, my main is at 239.964 and my other is at 119.982. I just lowered it from 143.981, but its max is 239.757. Would it be best to keep the second one at 120 or increase it to match the main 240? I only lowered it for power consumption
Would be interesting to see another video like this involving a similar multi-monitor issue, that being the fact that having multiple monitors with _differing_ refresh rates can sometimes cause inconsistent frame timings or even actual refresh rate drops with a faster monitor trying to match a slower one (ex: 144Hz briefly dipping down to 60Hz)
While I've never experienced the refresh rate drops myself, my frame timings have always felt inconsistent (even with G-sync) and the only way I can get a valid result on the UFO test is by setting my 144Hz monitor down to 60Hz to match my secondary monitor (the test always stutters or fails to sync at 144Hz). So I'm very curious to know if the labs team could produce any helpful information about all of this
I myself use a 144Hz 1440p monitor for gaming and a second 1080p 60Hz one for video while playing. I've had zero issues so far with frame drops on my AMD GPU, for like 6 years
I run a 240 Hz HDR monitor and a 60 Hz for my secondary. The only real solution I use is to use Borderless Windowed mode and it seems to help out a ton though I do sacrifice a few frames to make it work like that.
Ideally all the monitors should be the same spec.
That's why it's highly recommended to use multiples of the same display for a multi-monitor setup. You get the same refresh rate, same resolution, size, and no weird behaviour when moving windows from one display to another (or if it's on multiple monitors!).
This is still a thing? Good lord
I have the opposite problem. Strangely and occasionally, my secondary monitor (60Hz) will drop to something like 10Hz on startup. Turning it off and on again does the trick. Might actually just be a problem with the monitor itself, because I seem to remember that unplugging it from and replugging it back into the GPU does not fix it.
Honestly, the last point hits the hardest for me. I used to have videos running on a second monitor, or an active chat or whatever, but somewhere along the line I remembered how much more enjoyable the gaming experience is when you actually focus on it.
During Covid I started plugging my laptop into an external monitor. Having a Zoom call open on the laptop display and using the external monitor for everything else was such a liberating feeling. It definitely increases my productivity and makes some of the things I do on my computer easier. When I game, I run the game on the laptop display, because it's superior to my external monitor in terms of resolution, refresh rate, etc., and have Discord, Spotify, HWiNFO, etc. going on the other. Really makes for a nice setup. At least with a laptop, I couldn't see myself getting rid of the external monitor for my home base. With a desktop I'd be more inclined to go with a single monitor, so long as it was sufficiently large.
Would be interesting to see how having a second monitor connected affects the latency when playing games.
There's no reason why it would affect latency, any sort of latency would be seen as a drop in FPS, not how fast those frames get to the screen.
Using more screens and having pixels move on them obviously uses GPU power, I don't honestly know why this video made it seem so shocking or unexpected,
and power draw
@@blaness13 I have to assume a bunch of multi-monitor users said "There's no downside" and that must've been a running myth.
@@blaness13 For a long time, people said it was no issue. No downsides. 1:1 performance.
Even today, you can see that this wasn't true when you search back to days past and look at the driver issues Nvidia used to (and, in my experience, still in some cases do) have with mismatched refresh rates for example.
@@OGPatriot03 I guess it depends on what you consider a real downside. If they dropped FPS just by being plugged in and turned on, then sure, that's a downside. But I don't consider losing 1 or 2 frames for active content on them a downside when you can just stop the content; it's a common-sense trade-off, like getting a higher-res monitor.
A video about best vertical monitors would be awesome. The viewing angle on the 27 inch I flipped vertically isn’t the best but not sure what to look into for a replacement 🎉
you should try using an ips or va panel, they have great viewing angles
Rtings has good statistics for viewing angles and which monitors can be rotated vertically without a vesa mount
Sounds like your 27" is a TN, so the viewing angles are asymmetrical. A good IPS screen should do you for 178 degrees in both directions (meaning it'd be totally fine for vertical flipping), rather than TN's typical 170/160, with horrific off-axis colour shifts.
You ALWAYS always always always (seriously always) wanna use a curved monitor for portrait mode, even if you don't like them in landscape as a main monitor.
I swear it's like curved monitors were actually created to solve this exact problem.
At 1:55
You have Viva La Dirt League up on three monitors!!
Nice!!
Genuine question: What if you’re mirroring your primary display, like say for example sending video to a capture card for a 2 PC gaming/streaming setup?
I feel it would have been interesting to see what running secondary displays does to performance when those displays are on a second GPU, for example using integrated graphics for the secondary displays and your GPU for the single primary display.
I run a second video card connected to my second display to watch streams while gaming and it removes a massive amount of in-game stuttering. No need to even run any benchmarks it's so painfully obvious. If you do this you have to make sure to set your secondary display as your main display in windows or else the gaming GPU will still be decoding video.
Yeah, I got a CPU with integrated graphics for that reason. Pls talk about it in the next video!
that's what I used to do. I could notice the difference in performance, and would highly recommend it. it's one of the reasons why I think you shouldn't buy an F-series Intel CPU.
How can I dedicate the integrated GPU (on the CPU) to handle tasks like that, please?
(edit .. wait lol .. is it just as simple as plugging the other display into the mobo port for the IGPU .. or is there more to it than that .. settings, optimizations, etc.?)
@@THE-X-Force connect the second monitor to the motherboard
Back in the days of the CRT I had three of these heavyweights in my setup. If I was going to do some serious gaming I would always turn two off because it made a very big difference in performance back then. I am a little surprised the performance hit is much smaller these days. Good to know.
Jeez. What was the desk made out of? Lol
@davidhodge0201 probably wood, unlike the cardboard ikea stuff these days lol
because nowadays in proportion, running a monitor it's a much smaller % of a gpu capabilities compared to back then
@@davidhodge0201 solid Oak! It was insanely heavy to say the least.
@@camerontechstuffs I actually had to use kitchen countertops to DIY myself a corner desk for my retro cave, because a 19" CRT monitor made a crappy ikea cardboard desk make some *very* unhealthy noises... didn't want it to go right through the damn thing (and that was just ONE monitor)
6:50 Kick 🔛🔝
I'm not watching this video willing to remove my other monitor, I'm just watching to see how badly my performance is affected.
I'm using one monitor straight from the graphics card and the 2nd monitor from the motherboard (yes, Intel integrated graphics). From what I can tell, the 2nd monitor still uses the graphics card and does affect performance a little bit. It'll be interesting if the Labs team can follow up this video with this configuration.
I didn’t do any scientific testing, but when I upgraded from a GTX 770 to a 1080, I put both cards in for a while to run each monitor, and there was a noticeable lag when moving a window across the boundary between displays that went away when I took out the 770 and ran both off the 1080.
@@sja2095 Considering that 32GB is starting to become the norm (heck, I have 128GB), a few megs being used by your iGPU doesn't even matter.
Plus, considering that a non-F CPU still has the same heat output as an F CPU, it doesn't matter. It's not the iGPU costing you 100W.
@@sja2095 Some intel integrated graphics actually used to be hidden on the northbridge chip. I don't think they've done that in a while, but it is worth noting.
@@dustinbrueggemann1875 now the igpu is located inside the cpu
I don't know how it works on Windows, but on Linux when you have a dedicated GPU but also some screens connected to the iGPU, it works kinda like a multi GPU laptop, where you can pick what programs run on which GPU, and if the display GPU is different from the render GPU, it copies the window contents over to the other GPU (which obviously has a small performance impact, but negligible for me)
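For the curious, a minimal sketch of that per-program GPU selection on Linux, using Mesa's DRI_PRIME render-offload switch and NVIDIA's documented equivalent for their proprietary driver; glxgears here is just a stand-in for any binary:

```python
# Launch a program on the discrete GPU while the desktop stays on the
# iGPU. DRI_PRIME=1 is Mesa's switch (AMD/Intel); the two __NV/__GLX
# variables are NVIDIA's PRIME render offload equivalent.
import os
import subprocess

def launch_on_dgpu(cmd: list[str], nvidia: bool = False) -> None:
    env = os.environ.copy()
    if nvidia:
        env["__NV_PRIME_RENDER_OFFLOAD"] = "1"
        env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"
    else:
        env["DRI_PRIME"] = "1"
    subprocess.run(cmd, env=env, check=True)

launch_on_dgpu(["glxgears"])
```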
Could you do a video on running multiple displays at different framerates? This should be quite common with many people buying a new high refresh rate monitor for gaming and using their old 60Hz as a secondary.
Apparently windows really does not like this. In my case plugging both monitors into the gpu completely kills gsync and there are massive framerate drops on the main monitor whenever something was moving on the secondary. My only fix for this was plugging the secondary 60Hz monitor into the mainboard to run on the iGPU
I sometimes use my old 60hz display as a secondary to my 165hz main, and I don't have these issues. Plugged into the same GPU and all.
Performance drops and/or stutter when video on 2nd is playing sure - depends on how heavy the game is since I'm using an old GTX1060. But VRR on main display works fine in this scenario and I have no massive FPS drops.
However, my 2nd monitor is plugged in via DVI while main is on DisplayPort so this could be the reason why.
I suspect this is one of those issues that's been known about for years and companies will shuttle the blame back and forth between each other.
I've absolutely seen this issue before - a friend's computer which otherwise ran flawlessly would hitch down to ~20 fps in 3d games, if his (older) secondary monitor was plugged in.
On the other hand, I've built and used plenty of setups with multiple monitors *without* Windows having a problem, including using g-sync with both monitors plugged into the GPU.
It just seems to break on certain hardware configurations, especially with older GPUs / monitors.
I have a problem with different scaling; it's just blurry on the 100% monitor. I also have a projector on an HDMI switch, where one input is the PC and another is an Android TV stick. If the switch is set to Android, the PC stutters and frame-skips, even if I'm not using the projector. Also, if I use the projector while the PC is turned off, the fans will glow dimly.
This has already been fixed in Windows 10. If you're still experiencing that, there's some other issue you're experiencing.
Most people nowadays use 2 monitors, the second monitor being a cheap 60Hz one. Running 2 monitors at different refresh rates creates major stutters and jerky motion when both monitors are simultaneously displaying motion. Now, I heard a Windows update partially fixed the problem. You could make a video about this issue; it would be very interesting to see the results
all 3 of my monitors have different refresh rates, ahhh
I would think having the second monitor run off the iGPU would fix this.
120Hz main and some unbranded cheap 60Hz second monitor, and yep, everything (games, browser) lags/stutters on the main monitor!
I have a 4070 GPU and can't game and watch a single YouTube video without it lagging.
@@IamWeezyHD make sure any game modes/optimizers within windows are turned off
0:40 Did Linus have a bicycle horn in his throat? What was that noise?
Would be nice if he'd talked about, in the case of two monitors for example, having the main gaming monitor plugged into the GPU and the second monitor plugged into the motherboard. I'd be curious to see the comparison between two monitors on the GPU versus one on each.
Should also test mixed refresh rates, turning hardware acceleration on/off in various pieces of software, exclusive fullscreen vs borderless/windowed, and mixed HDR. This headache has gotten better in recent years, but from Vista through Win10, DWM with mixed refresh rates and windowed-mode games would consistently cause the game to stutter down to the lowest refresh rate.
EDIT: I also think layout can mess with it, especially portrait next to landscape, which it does appear y'all tested. I think the "virtual canvas" has to be even larger because of monitor layout.
I'm running into this right now. A 175Hz display paired with two 60Hz screens is causing micro stutters even with them plugged into integrated graphics.
@@bdubs85 I had the same issue wanting to run my 1440p at 165Hz with another 1440p at 60Hz; dropping the 165Hz down to 120Hz meant no more stutter/hitches/tears. More recently I OC'd the 60Hz panel to 72Hz, and 144Hz is fine on the main. With the 165/72 combo it becomes way less noticeable than 165/60, but it's still present.
@@ericmullen14 So, in theory, as long as your secondary monitor runs at a refresh rate that divides evenly into your main screen's - in your case 144/72 - it should run best, rather than using mismatched values.
Still, which OS do you have?
YES. A high refresh monitor plus a 60Hz second monitor is a super common setup and the source of so many issues. I'm running 144Hz + 60Hz and was getting around 30ms of lag every couple of seconds when Discord was open on my second monitor. At one point my main screen would be capped to 60Hz when watching a video on the second monitor. A different 60Hz monitor amplified the problem so much more: having Discord running on it would make it (Discord) super laggy and almost unusable, but only when the main monitor was set to 144Hz.
@@Sebaex Windows 10. The initial logic for trying it was that if the secondary runs at exactly half the rate, the frame pacing should line up between the two - the slower display just repeats every frame once, but the cadence stays even.
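That pacing logic is easy to sanity-check with a toy model: two fixed-refresh displays tick in unison gcd(a, b) times per second, so the bigger the gcd, the more regular the shared cadence. This is only arithmetic, not a claim about how DWM actually schedules frames:

```python
# Toy model of vblank alignment between two fixed-refresh displays.
# Their refresh ticks coincide gcd(a, b) times per second; a secondary
# rate that divides the main rate evenly means every one of its frames
# lands on a main-display tick.
from math import gcd

def alignment(main_hz, second_hz):
    g = gcd(main_hz, second_hz)
    return g, 1000 / g  # coincidences per second, ms between them

for main_hz, second_hz in [(165, 60), (144, 72), (120, 60)]:
    g, period_ms = alignment(main_hz, second_hz)
    print(f"{main_hz}Hz + {second_hz}Hz: aligned {g}x/s (every {period_ms:.1f} ms)")
```

For 165/60 the ticks line up only 15 times a second, while for 144/72 every single 72Hz frame lands on a 144Hz tick - consistent with 165/60 stuttering worse than 144/72 in the comments above.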
I don't know why I'm surprised at this point... Seeing VLDL videos playing on the monitors was something I did not expect. Cool!
If you feel you need the extra performance in certain games or whatever else, just temporarily turn off the second monitor and turn it back on when you need it again.
Or run your extra screens off a Laptop.
This was a good video. A while back I tried to determine through benchmarks whether my second screen was hindering my performance; as far as I could tell, it wasn't having an effect. But my second screen always had idle apps on it - now I better understand why that was the case. For me, the extra utility is worth the minute performance hit. I appreciate what you guys and the other tech channels do, so I don't have to figure everything out for myself.
It'd be interesting to know if there's a performance loss when a single BIG monitor is running both the game (in borderless window mode) and video streams concurrently, since the latter seems to affect performance the most.
It's the same, this is overblown as it's based on the videos playing, not the screens themselves.
@@Masterrunescapeer How? All of the tests had three videos playing, and having them minimized instead of fullscreen returned identical results. How could the FPS disparity between one monitor with three videos minimized and four monitors with three videos minimized be because of the videos and not the monitors?
@@cheesefriesandgangsigns The answer is simple: the testing methodology here is flawed. When a video is running in the background, no rendering is happening, so there's no load on the video card (yes, the audio is still decoded and played, but the video rendering isn't done). So of course there's no impact on FPS until you have videos visible on a monitor.
You had Viva La Dirt League on the monitors, I love them so much
1:20 Lol, I was literally just watching Viva La Dirt League right before this. 😂
Now I'm curious about what happens when you run a 1080p screen with a 4K one, or a 60Hz monitor with a 144Hz one, or a mix of everything! Thanks for making me question a point I never thought of before!
Yup, would love to see a mix-match of different resolution setups tested.
Works fine for me. I have 3 different monitors: one is 1080p, one is 4K, and one is a super shitty monitor from like 2005.
Works ok for me. I run 2 4k60 TVs and 1 4k120 TV for Truck Simulator. I also run a 4th display for PC stats at 1080p60. If I'm playing a game on only my main screen, I will usually have Twitch, OBS, and a webpage or two running on the other monitors. I can confirm there is some performance penalty, but not much. That was even on a 6600XT (I recently got a 6950XT for more VRAM and frames). Just do a little research before mixing and matching, though; I hear Nvidia cards can be pickier with multi-monitor setups.
It works great for me since I switched to Win11 (and did all the other updates for drivers etc.). I'm running one 32" 144Hz 1440p monitor with HDR and a second 27" at 1080p 60Hz without HDR on an RTX 3070. Good performance in MSFS, DCS, and UE4 games.
I run one 1440p 165Hz display as primary and one 1080p60 as secondary on an RTX 3070, and I've never had a single hiccup with them. Primary is on DP and secondary on HDMI.
I want to see Linus and the team do a sound setup using a home theater receiver hooked up via HDMI
I personally plug my secondary monitors into my integrated GPU. I'm curious and would have liked to see if that helps offset the performance losses.
Yass, they totally should have shown this. I bet many people do this.
Loving the Viva La Dirt League videos in the tests! :D They’re so funny and a great bunch of gaming skits/film/video makers :)
Thanks for making this video and clearing things up and quantifying the effects. My personal take from this is that having a browser or chat window open on the 2nd screen doesn't have that huge of an impact. 👍
Sure but a second monitor costs money and takes up space. I can do all my browsing and discord on my phone while gaming and it’s way more space efficient
0:57 Oh look, an ACTUAL segue for once
Guys, I have been waiting for a video like this for so long, and I've been watching most of your videos for years now - how could I have missed this?
The worst thing is that both AMD and Nvidia are seemingly unable to fix the high-power-draw bug for multi-monitor setups! My 2070S uses 50-60W while my two monitors are connected. Sometimes there are tweaks and workarounds to get power draw back down to 15-20W, but they don't work for everyone or every case. It's insane that both of these huge companies have not solved these issues for years(!) now.
While in college I switched to three monitors, with my main monitor horizontal in the center and two monitors in vertical orientation on the sides. Worked great for having course guidelines on one screen, my reference material on the left, and my active document in the middle. Super cool getting to see more and more data from the Lab's testing being put to use. I'll definitely be turning the side two off when jumping into a game; I never really thought about this and am curious how it will work out for me.
2:22 Yep, that's EXACTLY what I was thinking 😂😂 and he said it.
As a multiple-monitor guy, I can tell you that whenever I boot up a game I press Win + P and select only the primary (I'm running two 2K 27" monitors). I do that mostly because the other one can be distracting, but also for the performance. I thought it would be more demanding, but your experiment really brought me some clarity. Thanks.
Interesting video, but I think you should have tested a range of graphics cards from both AMD and NVIDIA - we might find something interesting to recommend one brand over the other for multi monitor users similar to how Hardware Unboxed uncovered the driver overhead discrepancy between the two
Weirdly enough, I've heard more complaints about Nvidia drivers not playing along with the DWM of Windows 10/11 than AMD. In my opinion, these "which one's better" comparisons are basically useless (including the driver overhead thing), but if you really care, there you go.
Yeah, especially with Labs now seemingly pretty operational, it would be nice to start seeing LMG cover more hardware scenarios in these types of tests.
Not only that, but there's VRAM overhead too: the frame buffer is about 8MB per display at 1080p, and 4K is 4x that. I'd assume it's double or triple that when vsync is enabled, since it keeps two frames buffered at any given moment.
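Those numbers check out as rough back-of-the-envelope math, assuming an uncompressed 4 bytes per pixel (8-bit RGBA); the buffer-count multiplier for vsync is the commenter's assumption:

```python
# Frame buffer cost per display at 4 bytes/pixel (8-bit RGBA).
def framebuffer_mib(width, height, buffers=1, bytes_per_px=4):
    return width * height * bytes_per_px * buffers / 2**20

print(f"1080p, 1 buffer:  {framebuffer_mib(1920, 1080):.1f} MiB")    # ~7.9 MiB
print(f"4K,    1 buffer:  {framebuffer_mib(3840, 2160):.1f} MiB")    # ~31.6 MiB (4x)
print(f"4K,    3 buffers: {framebuffer_mib(3840, 2160, 3):.1f} MiB") # triple-buffered
```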
only heathens use VSync
I mean, it would be difficult considering how different the desktop environments and display servers or protocols (X11 or Wayland) can be, but would you test whether similar results show up on Linux distros as well? Maybe try the most mainstream DEs such as GNOME or KDE Plasma.
Honestly, I feel like two monitors are necessary nowadays, especially when games don't let you tab out into Discord or something without breaking to the point where you need to restart the game.
As a software engineer I find it very hard to be productive on a single display. I have a 49" ultrawide and two 27" vertical monitors. I'd be curious to see if this also happens on Linux, both with X11 and Wayland, since I don't use Windows.
I work on one PC with a 27", one with a 49" + 27", and one PC with a touch screen. Definitely couldn't do my job with one, haha.
I'm pretty sure this happens on both X11 and Wayland. Also, I've worked with two monitors vs a single monitor, and for me productivity is the same: since I use a window manager, I can just switch to another workspace with a single keypress, which is often faster than turning to look at the other monitor and dragging my mouse to it.
@@VictorRodriguez-zp2do I also use a WM, but tbh switching workspaces kinda feels too abrupt to me. I’ve never gotten used to it
@@roccociccone597 Hmm, It's never been a problem for me, but I see what you mean. Maybe you could configure it with a fade effect if your WM allows, to sort of "smooth" the transition
@@VictorRodriguez-zp2do I use Hyprland, so yes, I already have that. But multiple monitors are just nicer, especially when you're reading docs and have to copy-paste something.
First of all, nice video! However, I would've loved it if more in-depth talking points were discussed, such as the effect of variable refresh rates, differing pixel counts across monitors, or the impact of G-Sync or FreeSync, just to name a couple.
You have the lab and the equipment. What's holding you back?
The worst problem I had with a multi-monitor setup came from having different refresh rates on the two of them.
My main display is 144Hz, secondary 60Hz. Turns out that when the GPU is used on both of them as shown in this video, the overall refresh rate drops to the lowest one, so my 144Hz display was only getting 60Hz. This, combined with Wallpaper Engine, held me locked at 60Hz for longer than I'm willing to admit.
From what I've read, this issue only affects Nvidia GPUs and AMD resolved it years ago, but correct me if something has changed.
I can't fully testify since I only own a 75Hz main monitor alongside a few 60Hz ones, but I recall running an online framerate checker before which reported the proper 75Hz and 60Hz respectively. I did have to change my monitor's resolution from "1080p Recommended" to "1080p PC" to get the higher refresh rate, though. For some reason it would cap at 60Hz on the other 1080p setting.
I use 4K 120Hz + 1080p 60Hz on an RTX 3080. Works fine. The 4K is an LG C2 and it has an input signal information overlay; whatever I do on the second monitor, the overlay reports a 4K 120Hz signal without any changes.
Correct me if I'm wrong, but is there a way to set certain apps to use the iGPU instead of the discrete graphics if you have one? At that rate I would set everything that wasn't a game or something needing serious power to run on the iGPU. Give it some use instead of just sitting there, ready for the inevitable troubleshooting.
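There is: Windows 10/11 exposes this under Settings > System > Display > Graphics, where each app can be set to "Power saving" (usually the iGPU) or "High performance". Under the hood that choice lands in the registry; the sketch below shows the key and value format as commonly observed, with a hypothetical app path - the Settings page is the supported way to do it.

```python
# Sketch of the per-app GPU preference Windows stores when you pick a GPU in
# Settings > System > Display > Graphics. GpuPreference=1 is "power saving"
# (usually the iGPU), 2 is "high performance". Key path and value format as
# commonly observed on Windows 10/11; the app path here is hypothetical.
import winreg

APP = r"C:\Program Files\MyApp\myapp.exe"  # hypothetical: the app to pin

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
winreg.SetValueEx(key, APP, 0, winreg.REG_SZ, "GpuPreference=1;")
winreg.CloseKey(key)
```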
I'd be curious to see if there's any performance difference with multi-monitor and streaming if you use a dedicated encoding GPU, like an Intel Arc for AV1 encode/decode, and AMD or Nvidia for playing the game. Also, what would happen if you plug the secondary monitor into the Intel GPU vs just using it for decode, and how do you set that up? It could be a relatively inexpensive way to get all that performance back.
I had my second monitor plugged into the iGPU for a while, but didn't do extensive performance testing. The problem is that whenever you drag a window that requires some GPU rendering (be it for video decoding, or something like the deskpad configurator) from one monitor to the other, the screen would blank for a second when it switched GPUs. It's possible to avoid this by setting your browser (and other applications that can cause this) to be rendered by your main GPU through Windows settings, but that of course destroys any benefit you might get.
I'm curious how multiple GPUs driving the displays would work, and whether it's a good idea to have the integrated GPU drive the second monitor. From my experience, performance still suffered a lot, but I thought that didn't make sense, since there are two GPUs doing different work.
The compositor only works on a single GPU. If you connect a display to a second GPU, the first one has to copy the image over to the second, which is a lot more work than just connecting the monitor directly to the first GPU.
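That copy isn't free, either: as a rough estimate, every composited frame for the display on the second GPU has to cross the PCIe bus. A sketch of the math, assuming uncompressed frames at 4 bytes per pixel:

```python
# Rough PCIe traffic generated by copying composited frames to a second GPU,
# assuming uncompressed frames at 4 bytes/pixel.
def copy_bandwidth_mb_s(width, height, hz, bytes_per_px=4):
    return width * height * bytes_per_px * hz / 1e6

print(f"1080p60 on the second GPU: ~{copy_bandwidth_mb_s(1920, 1080, 60):.0f} MB/s")
print(f"4K120  on the second GPU: ~{copy_bandwidth_mb_s(3840, 2160, 120):.0f} MB/s")
```

At 1080p60 that's roughly 500 MB/s of extra traffic, which is cheap on modern links but not nothing; at 4K120 it approaches 4 GB/s.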
I've found a fair bit of success plugging my secondary monitor into my integrated GPU; it's only 1080p and used for videos, so it works great! Takes some load off my GPU at least. There are a few caveats, but it's pretty great overall.
The only drawback I can see is when you're in CPU-limited games: taking that power allowance away from the processing cores could have an even bigger impact on overall performance.
Yeah, but realistically, when are you ever running into a CPU-bound game on any CPU built in the last 10 years?
@@dippst In that case there's the option to plug in another, cheaper second GPU, assuming you don't hit a PCIe lane limit. But I don't know whether that would help, given the other overheads.
I actually did that years ago with a GTX 560 and an AMD Radeon HD GPU in the same system, lmao. It worked, and it was jank.
@@oso1165 When you run an assload of mods. My Farming Sim mods folder is twice the size of Farming Sim itself, lol.
1:22 a VLDL reference on LTT? I'll take it.
This is why I plug my second monitor into my motherboard, to take advantage of integrated graphics. Sure, it probably made very little difference before, but I definitely thought I FELT a difference when both monitors were outputting from my GPU.
This sounds like a great idea.
I'll definitely need to try it out myself.
This is what I do as well. I've got my main monitor on my GPU, and my second monitor on the motherboard (iGPU). I do have my TV connected to the GPU as well, but that one is typically disabled in Windows unless I'm actually using it.
This setup probably still has some performance impact, but it seems noticeably lower than running both monitors on the GPU. Mainly, I guess, it "moves" the load from the GPU to the RAM and affects the CPU temperature. It might also take some bandwidth from the CPU.
My thinking is this: in games where the CPU (or RAM, I guess) is the bottleneck, having both monitors connected to the GPU is preferable. In games where the GPU is the bottleneck, moving the secondary monitor to the iGPU would be preferable. However, switching back and forth is more hassle than it's worth, and therefore most people would benefit from utilizing the iGPU, since most games seem to be GPU-heavy. Unless there's some other drawback to utilizing the iGPU that I'm not seeing.
If you're having temp issues with any particular component, that could also affect the choice.
You didn't mention the additional VRAM usage that comes from additional monitors. Even if you're not running anything on the other monitors and they are just chilling on the desktop, their additional VRAM usage can be significant if you're trying to play one of the newer VRAM hungry games on an 8GB GPU. This is the main reason I started turning off my 2nd monitor while playing heavy games and not the FPS loss.
It might also eat up a bit more PCIe bandwidth. If you're decoding or encoding video that data must be streamed through the PCIe bus to the GPU.
@anunoriginaljoke5870 It will matter for the detail settings and also the game's FPS. I'm using a 3070, and The Last of Us sometimes exceeds my 8GB of VRAM; when that happens the game looks terrible and the FPS starts stuttering.
@anunoriginaljoke5870 I'm not the author of the main comment, I just want to answer your question. It will depend on the game you play: some are affected, some not.
Laughs in 3090
@NordiXXL Same, I have an RX 6600 XT and 8GB was not enough - it would stutter like crazy. Moved to Linux and the game runs butter smooth now on High settings. The problem is that Windows is not optimized: even sitting idle it can use over 1GB of VRAM.
Was this really the best the labs could do? What about using an igpu for secondary monitors? What about AMD and Intel GPUs? I thought the point of labs was to go above and beyond, over the top, and maybe find unexpected results along the way.
And different monitor resolutions and refresh rates too!
@@Sopel997 Yes, exactly!
Ran into this issue a handful of years back. I was running a 144Hz 1080p gaming monitor as my main, with two portrait 60Hz 1080p IPS monitors on either side of it (web dev / DBA), on a GTX 1080, and everything seemed to be OK. Then I picked up a 4K monitor for my PlayStation, put it above my main PC gaming monitor, and figured "it's just sitting there during work hours, I should hook my PC up to it and watch videos on it during the day instead of on the portrait monitors". My poor GPU had a meltdown with the added load and was running sub-30 fps just existing in that config.
Nowadays the PC is on a single 244Hz 1080p gaming monitor, with an LG DualUp hooked up to my MacBook. The PC is much happier since I'm still running that GTX 1080; it's given me a few extra years out of this system without a hit to my productivity.
Have you tried using the feature that lets you run a second/third monitor off onboard graphics? I know Intel can, and probably the new Zen processors can too. It should be the same as running a single monitor off your graphics card, as long as your processor is powerful enough not to be running at 100% during gaming. This would be an interesting test to confirm.
EXACTLY this!
This was my setup. I used to have my main display plugged into the GPU and the secondary display into the iGPU (UHD 630). The interesting part is that if you place a window on the secondary display - say, a YouTube tab - both the GPU and the iGPU work in tandem decoding the video. ua-cam.com/video/mi8xlCz42CM/v-deo.html is a video from TYC explaining this niche topic. My previous use case was using Intel Quick Sync to encode video through OBS, whether recording or streaming.
I have done some tests, GPU encoding vs Quick Sync encoding when recording 1080p footage. The results:
GPU encode decreases FPS by around 10-13 compared to stock.
iGPU encode decreases FPS by around 7-9 compared to stock.
Streaming is lighter than recording, around a 2-3 FPS decrease across the board, and streaming plus recording only costs around 1-3 FPS more than recording alone.
I have only tested this on my own hardware; your mileage may vary.
Core i5 10400, RX 580 8GB, 16GB DDR4, 1080p main and 1600x900 second monitor.
I run 3 monitors right now and I'll happily sacrifice some frames for the convenience. When working, I use one for my IDE, one for the Unity editor, and one for a browser. When gaming, I use one secondary for videos/music/Discord and the other for HWMonitor to check on stats. My main monitor is 1440p and the side ones are 1080p, so I barely notice the loss in performance.
Having the editor and IDE on separate monitors for code debugging is a godsend; I wouldn't change it for anything. And while coding, the main monitor can be the IDE and the secondary can be Google.
I am a web developer and have a laptop and a main monitor. Usually I leave the IDE on the monitor and docs/Google on the laptop. For gaming or personal use I find that another monitor is distracting: when my friend lent me a monitor until I bought mine, I tried to use them both, and most of the time the second one was just showing the desktop, so I would turn it off when playing.
8:39 just skipped a heartbeat there 😅
Exactly 😂😂
Another thing about multi-monitor setups (on Windows) is that borderless fullscreen applications can have frametime issues because the DWM only has one compositor for all monitors. Before, I think, W10 version 2004, it used to be even worse: if there was ANY activity on your secondary monitors, the frame rate on monitors with differing refresh rates would suffer. As far as I can tell, these issues only happen if the refresh rates of your monitors aren't exactly the same.
I would've liked some comparison of whether it increases input lag on your main monitor when gaming, or whether it increases tearing. Otherwise, a good in-depth investigation!
It doesn't, assuming identical framerate in all scenarios.
Obviously if you're running some intensive tasks which lower your framerate, your input latency will increase as well. But just having multiple monitors enabled doesn't cause any of that. By definition it is input lag, not output lag. The output signal is instant and synchronized across all outputs.
One common issue is having monitors with different resolutions and refresh rates. Windows doesn't handle that too well. Like if you have a 120 Hz and a 60 Hz screen, Windows might limit both displays to 60 Hz. There are ways to work around this.
The only thing missing from the testing is asynchronous displays, like having that nice 4K display as the main and 2K or 1080p displays as secondaries. It probably wouldn't make much of a difference, though. Very informative video.
I'm currently running such a setup. While I don't have RDR or Cyberpunk, I am playing Diablo 4, and I'm in the process of swapping out a fourth monitor - a Costco-variant LG UltraGear that couldn't meet its advertised DisplayPort 1.4 spec; shame on me for trying to get a good deal. I can make my RX 5700 XT chug if I'm moving programs around while YouTube or Twitch vids are playing. During the beta I did have Diablo crash my GPU driver and reset my PC; since then I haven't had a repeat.
I love the initial test loop. VLDL are so funny. Love seeing all the different YouTube channels I watch reference one another.
I’ve been running 4 monitors on my corporate computer for a decade, but only one 27” monitor on my personal computer. It’s perfect for RTS and world building games, but I’ve been contemplating getting it a second monitor for popping out a full size preview for video and photo editing. My biggest concern is how to arrange my desk so I can fit two 27” monitors side by side, have enough room to see everything, and not get a sore neck from staring diagonally for hours. With my 2X2 work monitors, I’m constantly moving from one to another, which prevents strain.
I'd like to see this test done again with Hardware Acceleration disabled in whatever browser you're using, I imagine the hit to the CPU when encoding/decoding is probably more impactful on the overall experience.
I apologize in advance if this was already mentioned in the video and I happened to miss it somehow.
But why would you do this?
@@unvergebeneid Hardware acceleration gives me screen tearing and scrolling hitches pretty often in Chrome.
@@SteamPunk96 ouch. What GPU do you have? Just so I can steer clear of those drivers 😬
I would really like to see them test this, but with the side monitor plugged into another (presumably older and less powerful) GPU. That's how I run my setup and I feel it works quite well.
It would be an interesting test, but may I ask why you use that setup? Is your gaming GPU not very powerful, so you're trying to give it every advantage possible? I wonder if there are driver overheads etc. that hurt performance when running multiple GPUs.
I feel like it would be more realistic to do the testing using the iGPU, since many newer CPUs include one and many people aren't utilizing them.
I was hoping you would expand testing a bit to show if you get any performance hit on your main GPU if you have your extra monitors on a separate one? My guess would be no hit since those tasks should now be done on the other GPU, but maybe there is a surprising answer?
If you're someone struggling with a short attention span and you use dual monitors, try switching back to a single monitor. It personally really helped me get back the ability to focus on just one thing for longer periods without constantly checking Discord or some other crap every 3 minutes.