Thomas S At least on console, if you have forced V-SYNC, then everyone else does too. It maintains a reasonably level playing field. Though this argument could completely fall apart when you consider the enhanced consoles coexisting with the base consoles.
Rocket League lets you disable it at the cost of tearing, and that's a 60fps game. Of course I play it at 200fps on PC, but the option is there, just saying.
Jean Lepage Ultimately, it's not a significant difference either way. It's just that the game would then force players to deal with the tearing or willingly choose to play at a disadvantage. I've seen games that do this, and I don't think it's had a particularly negative impact on the online experience. I'm just saying that a consistent experience between players shouldn't be completely devalued.
I never use v-sync in any game; I prefer having more fps with tearing over input lag, especially in shooter games, which is what I mostly play. I don't mind tearing at all.
Screw Nvidia for ruining the majority of gamers' experience by forcing them to buy a G-Sync monitor. G-Sync could have just been a standard for high quality monitors with "superior smoothness". But it should be a choice, not shoved down my throat. Thanks for holding gamers back. Now I'm saving up for an AMD... I will have to wait for that variable sync smoothness. You shouldn't even be able to ignore open standards. That is anti-consumer and anti-competitive.
You're trying to argue that, because Nvidia cards only support Gsync, if you want a variable refresh monitor and own an Nvidia card, you have to buy a gsync monitor, which is anti-competitive and anti-consumer, but it's actually the complete opposite, as is evidenced by the fact that you've said you're going to buy an AMD graphics card next time you upgrade. Nvidia having gsync has created a competition between free sync and gsync and further fuelled the competition between nvidia and AMD, which increases competition in the market, which will then drive prices down as they try and compete for consumers, which is pro-consumer. In other words, it's the exact opposite of what you've claimed it to be and it is an excellent thing for gamers.
@@David-ud9ju This might be true if both companies were on even footing to start with, but they're not. Nvidia is the only viable option for high-end gaming GPU's, and their introduction of exclusive proprietary technology was designed to make the status quo as sturdy as possible. People buy G-Sync monitors because they have Nvidia GPU's, and then choose Nvidia for their next GPU upgrade because they have a G-sync monitor. This is anti-competitive and pro-status quo.
My cousin doesn't feel the input lag either... but I definitely do when I play on his PC... his senses are just not that sensitive... I'm sure it's the same with you people... some people still say there is no difference between 60 and 144Hz
The downside of this shift to VRR is going to be the frame pacing issues when recording a game… trust me, it's been an absolute pain. The only fix so far is to disable G-Sync and lock V-Sync to a fixed refresh matching the capture framerate… so 60fps. ☹️
Yeah, my old FX-8350 would really struggle to run games with v-sync turned on, but then I upgraded to an i5-6600k and now v-sync doesn't affect performance at all. I still turn it off for multiplayer gaming though.
My general experience has been, keep it simple: use gsync/freesync if you have it, use fast sync if you don't. The side effects of fast sync when running at closer to display refresh or under display refresh aren't all that bad, and it's generally going to result in the least amount of latency you can get without tearing, whatever your framerate.
20 years of tearing, hopefully my grandchildren won't face this monster.
It has been a long battle, brother.
50,000 people used to experience tearing here... Now it's a ghost town.
Hmmm... 20 years of vsync, never had to worry about tearing! Of course, I don't pretend I'm going to be a pro CS:GO player and obsess over negligible amounts of input lag, like it even matters unless you're a pro player!
gsync
We just need those G-Sync or Freesync or whatever variable refresh rate monitors to improve.
It’s shocking how many last-gen games suffered from screen tearing and the average person had no idea it was a thing
Ah yes, nothing like waking up and learning about V-Sync.
but what is g sync
This literally just happened to me haha woke up, brewed coffee, checked youtube, opened this xD
@@melxb it's basically an Nvidia version of v-sync that's built into the hardware, and it has less input lag than v-sync
@@t4ky0n how does free sync compare
@@melxb if you're asking about what it is, it's the AMD version of G-Sync. I'm not certain about the difference in performance, but I'm about to look it up anyway, so I'll get back to you on that.
Loving the PC-focused content, keep up the good work DF.
Why does this "PC master race" thing shine through every PC-focused comment? I don't get why so many PC players are such platform Nazis.
It should not matter where one plays. Can't we all just enjoy games and technology?
@@Ample17 seriously dude?? Have you read the console war comments on their console content? Btw, nothing in that comment hinted at the "master race" perspective.
For this comment to be true you would have to include consoles in the category of 'PC-focused content'; this comment is a self-contradictory statement!
Ermmm. Not just pc
@@Ample17 Lmao I don't know how you got all of that from his comment but OK.
It's really funny though, console gamers are more rabid when it comes to being "platform nazis". But when PC gamers do it OH GOD NO PLZ THINK OF THE CONSOLE CHILDREN!!
I had to explain VSync & triple buffering to a developer the other day when they were perplexed about their engine not exceeding 60 FPS and their profiling tools reporting a mysterious chunk of CPU activity that took up more than 12 milliseconds of their 16.666ms frame. It was a bit of a weird conversation to have with someone who was in charge of a game's development.
@@Relex_92 There's actually too much information to keep up with. If the developer has a small team then it's possible.
Unfortunately that, or usually even worse, is very common among software devs. I know many who've made applications for Windows for 20 years and STILL somehow don't know even the very basics of how the OS works (and therefore have no idea how to make a decent application that does things "right" for that OS)... and I'm talking about not knowing shit like "you need to reboot your PC every once in a while to prevent glitches and poor performance, or as one of the first things to try if something stops working properly". It's that bad for MOST developers today, who, keep in mind, are usually developing apps and not games, since the talented ones often go into game development
@@zombievac The talented developers going into game dev is something I noticed recently. I was out with a friend and I met the owner of a software company. He asked me what languages I know, and when I mentioned C/C++ he lost his shit, saying that there's such huge demand for C/C++ developers right now. I thought it was weird because I'm used to everyone around me knowing C/C++; not knowing the language is the weird thing. But then I realised the problem is that C/C++ developers these days move immediately into high-performance game development or critical application development for very large projects, leaving the rest of the software industry oddly starved of low-level, hard-core developers.
I honestly find it not that hard to believe. I can drive a car but I have no idea how the thing actually works, for example
To be fair, in the manual for most game engines it will just say it prevents tearing and limits your framerate to the device refresh rate.
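For anyone curious what that mysterious chunk looks like in practice: with vsync enabled, the driver blocks the render thread until the next vertical blank, and that wait typically gets billed to the buffer swap (exactly where it shows up varies by driver and queue depth). A minimal GLFW/OpenGL sketch that makes the wait visible; nothing here is from the video, just a standard demonstration:

```cpp
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(1280, 720, "vsync demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);
    glfwSwapInterval(1);  // vsync on: swaps wait for the vertical blank

    while (!glfwWindowShouldClose(window)) {
        double t0 = glfwGetTime();
        glClear(GL_COLOR_BUFFER_BIT);   // stand-in for the real rendering work
        double t1 = glfwGetTime();
        glfwSwapBuffers(window);        // blocks until the display refreshes
        double t2 = glfwGetTime();
        // On a 60Hz display this prints ~16.7ms per loop even when "render"
        // took 2ms: the remainder is the driver waiting, not actual work.
        printf("render: %.2f ms, swap (vsync wait): %.2f ms\n",
               (t1 - t0) * 1000.0, (t2 - t1) * 1000.0);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```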
I'm a PC gamer who pretty much *always* turns V-sync on no matter what simply because I legitimately *despise* screen tearing (it actually legit hurts my eyes)!
I know it adds input lag but honestly, I don't really care too much about that because since I'm not at all interested in speedrunning or competitive online multiplayer, my gaming tastes are such that the lag V-sync adds doesn't really affect me all that much.
I find that it depends on what framerate you are getting. If you are getting just a bit above your refresh rate, the tearing is horrible. If you are getting way above, I don't notice it. I always turn vsync off, and try to adjust my settings so I get 80-100 fps.
To get the best results, before even changing in-game settings, get your PC performance settings and GPU control panel settings right for your setup.
I use it for singleplayer games. One thing that could help you out is fast sync, which eliminates screen tearing while not capping your fps to your monitor's refresh rate, and it doesn't add input lag. It's only for Nvidia GPUs from what I recall though; you could also just buy a G-Sync monitor if you have a big wallet.
Funnily enough I haven't had screen tearing in about 8 years or so. Not sure how but I guess I am doing something right.
FreeSync/G-Sync is the only sync I'll ever use; I hate screen tearing, but I hate input lag even more.
FreeSync/G-Sync is *better* than VSync: it makes the game feel smooth even when I hit low FPS.
This was explained very well. I'm someone with rather limited knowledge of this behind-the-scenes kind of information, and I found it easy to digest. Good job Alex!
It would be great to have an updated video on this topic, since we now have things like Fast Sync and RTSS Scanline Sync. Both options are great to use with motion blur reduction (which can't be used with FreeSync/G-Sync).
What DF refers to as "true triple buffering" is Nvidia Fast Sync / AMD Enhanced Sync.
But yeah, I use RTSS scanline sync and it is like black magic to me. I've tried to look into how it works multiple times, but it's like a black box with zero documentation anywhere I could find online.
I feel like I mostly understand it... but how the hell does it just know how to move the tear off screen like that? And why does it work exactly the same on classic games and on new games where I can have 60-400 FPS, but when I try to play a really old game like Half-Life it breaks and needs to be reconfigured? It's so bizarre.
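There's no official documentation, but a plausible guess at the mechanism can be sketched with public APIs: Direct3D 9 exposes the current raster position, so a tool could present with vsync off, timed so the buffer flip happens while the beam is at a chosen line (ideally inside the vertical blank, i.e. off screen). The calls below are real D3D9; the mechanism is an assumption about what RTSS does, and a crude busy-wait stands in for whatever smarter timing it actually uses:

```cpp
#include <d3d9.h>

// Present with vsync OFF (D3DPRESENT_INTERVAL_IMMEDIATE in the present
// parameters), but only once the raster beam reaches targetScanline, so the
// tear lands where we chose instead of wandering around the screen.
void presentOnScanline(IDirect3DDevice9* device, UINT targetScanline) {
    D3DRASTER_STATUS rs = {};
    do {
        device->GetRasterStatus(0, &rs);  // where is the beam right now?
    } while (!rs.InVBlank && rs.ScanLine < targetScanline);
    device->Present(nullptr, nullptr, nullptr, nullptr);  // returns immediately
}
```

Old games breaking would fit this picture: anything that presents more than once per frame, or at odd times, throws the timing off.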
Should have mentioned the new screen-tear-eliminating feature of RivaTuner Statistics Server called "Scanline Sync". Works wonders, especially for older games!
Thunder Run explain si
This shit's perfect for emulators.
Sounds interesting, thanks for letting me know about it.
Yeah, but it took SO MUCH time to arrive; they should have included this back in the late 2000s.
Yeah, for older games definitely. I believe you need 2x or more of your refresh rate for a smooth experience, otherwise I get judder, but it's great when it works perfectly
On my pc, if I can reach over 90 fps I just disable vsync entirely as it generally doesn't distract me on the games I play. If I play a game that runs at 60-70fps on average with the chance of it going below that, I use adaptive vsync. If a game goes below 60fps, I care more about maintaining the best input latency I can rather than if there's tearing or not.
My PC currently has a gtx 980 in it, and since Nvidia seems to not want to make gsync affordable, I'll probably switch to an AMD gpu on my next PC so that I can get variable refresh rates without having to destroy my wallet.
Agreed, I just turn down settings until I get a solid 120fps, I would rather play 900p 120hz than 1440p 60hz
But why? Are you a pro with lots of money on the line? Vsync has been good enough for a long time now for the input lag to be nearly negligible if your PC has decent specs like you mentioned, and tearing is so much more distracting (and vsync has the nice benefit of capping your FPS to your refresh rate, because variable FPS and therefore variable input lag is even worse because you can’t adjust to it)
Well, pro or not, I still don't want to be handicapped by the input lag it introduces. It feels sluggish and I don't like that feeling.
On a 60 hz monitor in many games, if you get over 100FPS it can still look way way way more jittery than 60fps vsync. Sad reality.
@@zombievac That's probably why most games since the feature's inception have given the option to turn V-Sync off, right?
You don't have to be a pro to appreciate occasionally lower input lag over the same, but higher, input lag all the time. Also, that's not correct: if you drop down to the next V-Sync step, your input lag doubles within a single frame. That's more inconsistent than V-Sync off will ever get you.
I’m surprised this video was only made now
This video didn't need to be made. You just need a ten second clip saying "Vsync is the work of Satan and developers should just optimize performance better."
@@GuySocket they could do that if everyone had the exact same hardware, and wanted the same resolution and framerate. We don't.
@@GuySocket With high refresh rates you get even more tearing without V-Sync (or some other sync method).
@@GuySocket sounds like you need to watch the video because that's not what causes tearing.....
I've wondered what it does, and I've seen conflicting suggestions everywhere about how it affects performance.
It turns out the only correct option for casual players without expensive graphics cards is to disable it
I love that you're using Serious Sam games in the background.
Criminally underrated gems! SS3:BFE alone was like the FAR superior, less casual nu-DOOM.
Really hope television sets with VRR become commonplace soon. AMD freesync on some Samsung TVs is a start but it's nowhere near as polished as PC freesync/gsync yet. HDMI 2.1 was supposed to lead the way but the industry is dragging its collective feet on it.
I feel the same way about HDR with monitors. One day we will get the perfect display.
HDMI 2.1 won't even be on 2019 TVs, very frustrating. HDMI 2.1 won't be ready for consumers till the end of 2019, so 2020 TVs will get it. VRR on my Samsung Q8 and Q9 is amazing. Sometimes I hook up the PC and play 1440p @ 120fps (120Hz); at 4K it holds 40-60 really well... Xbox One X games are starting to take advantage too, with unlocked frame rates.
I'm loving those Serious Sam tracks.
SS in VR is pretty awesome :)
SS in VR is SERIOUSLY awesome! I'm doing a VR live walkthrough these days, check my channel for it!
Serious Sam VR is magical. The ability to aim dual guns in any direction like you're in Equilibrium? Hell yeah!
VR is something I've never tried. Serious Sam in VR looks really awesome though!
Alex this was an absolutely awesome video! This will be the definitive video I lead people to so they can better understand what V-sync is, its variations, and how they should be used. V-sync has to be the most misunderstood aspect about PC gaming because it's believed by many to be the enemy or just something that caps your frame rate. Great job again, Alex.
Visiting in 2024 - this video answered sooo many questions I had. You covered all the bases - VSync AND adaptive sync/VRR. An evergreen video. Great job on this one, Alex!!
Input latency with VSync happens when the input polling is blocked on the same thread as the rendering (inputs during Vsync are held until the start of the frame). id Tech 5 is a good example of an engine that moves the input polling onto its own thread; inputs are consumed the moment they happen rather than the start of the next frame, so there should be less perceived input latency.
Another cool thing I was investigating a couple of years ago was using late-frame re-projection with mouse input to get absolutely, incredibly smooth first-person mouse-look. This came at the cost of some subtle pixel smearing and increased GPU workload, but it's something I think competitive FPS games should consider as an option: poll mouse input at the start of the frame, then poll a second time just before VSync and do re-projection. VR games use this technique (as does PSVR).
Personally, I can't stand the input delay of VSync. It just absolutely ruins FPS games for me; as much as I hate tearing I will live with it for the sake of avoiding weird floaty mouse input.
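The threaded-input idea described above is easy to sketch. A minimal C++ outline, with the platform input call stubbed out (a real version would read Raw Input or similar; everything here is illustrative, not any particular engine's code):

```cpp
#include <atomic>
#include <chrono>
#include <thread>

struct MouseState { float dx = 0, dy = 0; };

// Hypothetical raw-input read, stubbed so the sketch is self-contained.
MouseState pollMouseDelta() { return {}; }

std::atomic<MouseState> g_latestMouse{MouseState{}};  // trivially copyable, so atomic works
std::atomic<bool> g_running{true};

// Dedicated input thread: samples at ~1000Hz, independent of the render
// thread, which may be sitting blocked inside a vsync'd buffer swap.
void inputThread() {
    while (g_running.load(std::memory_order_relaxed)) {
        g_latestMouse.store(pollMouseDelta(), std::memory_order_relaxed);
        std::this_thread::sleep_for(std::chrono::milliseconds(1));
    }
}

void renderFrame(const MouseState& m) { (void)m; /* build & submit the frame */ }

int main() {
    std::thread input(inputThread);
    for (int frame = 0; frame < 600; ++frame) {
        // Grab the freshest input *now*, not whatever was queued when the
        // previous swap started; that gap is the latency win described above.
        renderFrame(g_latestMouse.load(std::memory_order_relaxed));
        // ... the vsync'd swap would block here ...
    }
    g_running.store(false);
    input.join();
    return 0;
}
```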
Yes DF, educate the masses!
You're doing a great thing for every video game fan curious about the technical side of their in-game settings or why certain things are the way they are.
I've always been using V-sync, whatever the outcome. Tearing is an absolute no-no for me.
Awesome work Alex! This year I started using G-Sync on a 144Hz display; it's glorious.
G-Sync & Free-Sync.
Thank you.
Yea, I did not like V-Sync. Free-Sync is amazing.
I've used G-Sync for the past 3 years and never looked back.
Me too. V sync can be borked on some games, g-sync just bypasses all that nonsense and runs your games smooth and lag free, as long as you cap fps a few under the monitor hz limit.
Same. Been using a 1440p 165Hz overclocked screen w/ G-Sync. It's faaaantastic
word, gsync is the ssd of monitor upgrades.
@ So for 144Hz do you cap it at 142, or is it guesswork? Most games just have a 144fps limit, so do you use RTSS?
X34 predator with gsync here! Love it!!
My gaming experience totally changed since I got my G-Sync monitor. No more tearing or stuttering when I am below my refresh rate. I would not go back
Wyatt Cheng I can't wait! 😂
Wyatt Cheng go play candy crush
Agree man, g-sync is glorious
I'm getting tired of trying to find a compromise between tearing and lag lol, think I might just get one too
I think the video should have included more timeline graphs of the time it takes to render/refresh, to make it easier to understand. Also, for people that are not familiar with this stuff, it should have been pointed out that higher refresh rates significantly reduce tearing.
I know I am answering after a long time, but is higher fps = less tearing still a thing when the fps gets higher than the monitor's refresh rate?
@@anteksadkowski5166 You may get more tearlines, but the difference between the frames (the offset) is smaller, so each tearline should be a bit less noticeable.
Great video!
I didn't know how triple buffering worked before; it's great to finally understand it
Whoa, yesterday i was looking for a video explaining v-sync and couldn't find anything that helped. Crazy timing.
Variable refresh rate is the way, G-sync cost a lot but damn it's good !!
Very helpful. Had no idea what v-sync did before this and why it said it would up my frame rate when it actually dipped it. Thank you.
Alex, you continue to impress me with your knowledge of the topics you discuss.
Wow I've been into technical gaming for about 6 years and this is the best video I've ever seen explaining vsync.
RivaTuner Statistics Server now has an option to control the tearline; it's quite interesting.
Really? I need to check that out.
Yes, I even heard you can get the tearline to appear "nowhere" with that method, so it's not on your screen at all.
Really, no one could have explained it better than this channel. Keep up the good work; I like how you explain things in a way that makes them so easy to understand.
A noteworthy disadvantage of variable refresh rates (G-Sync, FreeSync) is that you can no longer use backlight strobing / motion blur reduction.
how about turning off motion blur in the first place?
Thanks a lot for your time and effort in sharing this information! :D
I thought I knew everything about it when my brain said "v-sync = no tear, that's all there is to it"... turns out there's a LOT more to it.
HOLY SHIT. SOMEONE FIXED THE ANIMATION INTERPOLATION IN HALO PC????
You can use Chimera and the 60FPS animation feature built into it.
@@KhangHoang-ks3nm I know it's stingy, but it's actually full on interpolation. So whatever your fps is, that is how many times the animations will update per second.
This is an amazing explanation. Been looking for something like this video for years. Finally I can refer to this video every time someone asks what vsync is or does.
What I still don't fully understand is why I get tearing on my 144Hz screen even with a constant 144 fps, no matter whether vsync is on or I use a framelock
Turn vsync on in nvidia settings
Hi, Alex & team. A long-time subscriber of the channel here. I already watched this video months ago, but I just wanted to clarify a couple of things (a bit of an OCD person when it comes to these things, lol). So in summary, not counting "No VSync" of course, your video basically presented us with FIVE "kinds" of VSync or syncing solutions or whatever you want to call it: 1) Double Buffer VSync 2) Triple Buffer VSync 3) "True" Triple Buffer VSync (aka Enhanced Sync/Fast Sync) 4) Adaptive VSync 5) Variable Refresh Rate (aka VESA Adaptive-Sync/FreeSync/G-Sync).

So based on that summary, are you saying that "Adaptive VSync" is actually different from VESA's "Adaptive-Sync", despite the nearly identical names (literally just one letter apart)? If so, where else is Adaptive VSync implemented (you DID mention Call of Duty & Crysis in the video, but is that it)? And should people care about it ultimately, or is it simply outclassed & superseded now by Adaptive-Sync and its popular implementations FreeSync & G-Sync? Thanks for the clarification, guys. And very good video by the way, highly informative & VERY relevant.
I usually turn on v-sync to get rid of screen tearing.
No shit! (-_-)'
Same if playing on TV instead of my G-sync monitors. But I also make sure I hit the constant 60 fps.
I usually use v-sync to lock the game at 60fps so my laptop won't overheat
Yeah lmao
I usually breathe air to survive. I know, I know... that’s way too interesting for this thread!
More tech focus videos please. Let's go through the entire graphical settings menu.
Frame limiting with MSI Afterburner works best with vsync, because it smooths out frame times.
Dante GTX Can I use Afterburner to lock a 60 FPS game to 30 FPS?
@@CanaldoZenny You can type your desired fps in the settings
What should I set the fps limiter and vsync to on 75Hz?
@@simorx580 Same as the monitor refresh rate (Hz)
@@DanteGTX I have it set to 75 and I've still got tearing
Digital Foundry never misses a beat with their videos explaining technology.
Great video and thank you very much.
I find screen tearing and frame pacing stutters incredibly distracting. Years ago I got a G-Sync monitor, but, I dunno, maybe the frame rates are not right for G-Sync, because I really do not notice a difference compared to v-sync or adaptive v-sync.
If u never get dips in fps u will not notice a difference. Its when the framerate fluctuates u notice the power of g-sync. With a gsync monitor 45 fps will feel about the same as 60 fps with a normal monitor.
@@underflip2 That's not what G-Sync does, lol.
V-SYNC has been a mystery to all of us for over 20 years. We all know the basics, but we never fully understood how it works. After watching tons of videos and reading articles year after year, still a mystery. Then this video comes along and voilà, we finally know... thanks for the video bro, nice work
Hey Digital Foundry can you do a video on the new Scan Line Sync in Rivatuner Statistics Server?
i wish they would teach us cool stuff like this at school
freesync master race (30-144Hz range)
When business monitors adopt it, it will be the end of tearing
@@madson-web And hopefully the end of overpriced G-Sync technology.
@@madson-web No need in business, unless your PowerPoint presentation really needs it lol. Creative, yes.
@@tomtalk24 so yeah, it will never be a standard this way. This thing needs to break free from the "gamer" tag somehow
Not everyone plays games, and not all gamers play on PC. So it wont ever be a "standard" as it has no use anywhere else other than in rendering graphics in real time. Thus a very small portion of the display market. We're talking about gsync/freesync, not HD/4K/HDR or whatever else thats in use (and usable) across the entire market.
Long overdue. Fantastic video. Thank you, DF.
I'm using fast sync for all my games and have never used vsync since... please do a video on fast sync vs FreeSync and G-Sync...
Actually, what Alex called "True" Triple-Buffered V-Sync is Fast Sync. I'm not sure why he called it that to be honest.
@Wyatt Cheng I'm not playing Diablo games... and I don't like mobile games either...
@@sd19delta16 Yep, it's a kind of triple buffering. The GPU produces frames without limit because it thinks vsync is off, and then fast sync picks the best (most recent) frame and shows it on screen... correct me if I'm wrong!!
FastSync needs the frame rate to be at least double the refresh rate, preferably 3 times or more, so it's best used for games like CS:GO. It massively reduces delay and removes tearing because it shows the most up-to-date frame every time. Say you have a 120Hz display but the game runs at 480 FPS: the frame delay is not 8.333 ms but 2.083 ms. Good stuff it is. But remember, you need at least double the display's refresh rate; if not, it's pretty much just normal vsync, so useless
sd19delta, basically the difference is that fast sync doesn't limit your GPU's frame rate while still sending only complete frames to your monitor (to reduce stuttering and input lag)
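Side note on mechanics: what this thread describes (the GPU rendering unthrottled while the display always gets the newest completed frame at each refresh) is exactly what Vulkan calls mailbox presentation, so Fast Sync / Enhanced Sync behavior can be seen right in the API. A minimal selection sketch, not a full swapchain setup; FIFO is classic vsync and the only mode the spec guarantees, hence the fallback:

```cpp
#include <vulkan/vulkan.h>
#include <vector>

// Pick a Fast-Sync-like present mode if the driver offers it.
VkPresentModeKHR pickPresentMode(VkPhysicalDevice gpu, VkSurfaceKHR surface) {
    uint32_t count = 0;
    vkGetPhysicalDeviceSurfacePresentModesKHR(gpu, surface, &count, nullptr);
    std::vector<VkPresentModeKHR> modes(count);
    vkGetPhysicalDeviceSurfacePresentModesKHR(gpu, surface, &count, modes.data());

    for (VkPresentModeKHR m : modes)
        if (m == VK_PRESENT_MODE_MAILBOX_KHR)  // newest frame wins: no tearing, low lag
            return m;
    return VK_PRESENT_MODE_FIFO_KHR;           // plain vsync: always available
}
```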
This is a tearable video. 2020 represents.
Every game in existence should have an option for adaptive V-sync; it's objectively the best solution outside of investing in a G-Sync monitor.
The game doesn't have to support it, only your monitor and GPU do.
@@_Thred_ No. For example, Batman Arkham Nightmare uses adaptive v-sync; no need for an AMD graphics card or FreeSync monitor
I don’t completely agree, if you truly hate tearing (even infrequently) it’s not necessarily the best. I noticed this on the Xbox one x version of call of duty wwII where the in game cutscenes were plagued with tear, whereas the PS4 pro version used triple buffered vsync which meant stutter instead. I’d personally take the stutter over tearing, but that’s why I consider it to be more a subjective matter than an objective one.
If you have an Nvidia card you can use the control panel to turn it on in the vast majority of games.
@@Archivist42 I don't believe that's the same thing as adaptive v-sync
Another type of vsync that DF didn't explain is half refresh rate vsync. It's pretty self-explanatory: this type of vsync renders frames at half the refresh rate of the monitor, meaning each new frame is followed by a duplicate frame to make it seem smoother. This is what they use for games on consoles, and it's why many people think 30fps is smoother on consoles than on PC; but you can do the exact same thing on a PC using Nvidia Inspector. For AMD users this option is unfortunately not directly available in the drivers, but they can do the same thing using RTSS's scanline sync by setting it to "x/2", where 'x' is the refresh rate of the monitor. Scanline sync is like a custom vsync where you can adjust the position of the tearline yourself; it's an alternative vsync with less input lag. It's basically what DF explained at 3:24: you can move the tearline away from the centre of the screen or even out of the screen space, so that tearing will not occur at all in gameplay.
EDIT: OK, they kinda explained this at 4:45, but they didn't fully explain half refresh rate vsync.
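For reference, on PC the half refresh rate trick is literally just a swap interval of 2: the driver holds every presented frame for two vertical blanks, so a 60Hz display shows a locked 30fps with even frame pacing. Reusing the GLFW sketch from earlier in the thread, it's a one-line change (assuming the driver honors intervals above 1, which desktop GPUs generally do through the swap-control extensions):

```cpp
glfwSwapInterval(2);  // wait 2 vblanks per swap: 60Hz display -> locked 30fps,
                      // every frame held on screen for exactly two refreshes
```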
Yay, finally another Tech Focus video :d
Using Freesync for 3 years now. No tearing and it's buttery smooth!
The clips with tearing were painful to watch
The new RTSS version has an "s-sync" feature which lets you control where the tearline appears. Works quite well!!
you should definitely use it in Waulfrenshtain The New Colossus
Nice phonetic spelling, but where does the 'r' come from 🤔
Wolfenstein
Hahahahahahahahaha!
@@huawafabe r/woosh
That's how you say it in German. WolfenSCHtein.
Really great work there! Now things are much clearer to me!
G-sync FTW. Haven't seen tearing once for over 3 years now :-)
Same here. Tearing? What is tearing? lol
Really cool video. The two things I don't understand are why doubling the refresh rate helps with free sync/g-sync and also why it can introduce ghosting.
This is v sync
Digital Foundry: "[...] und auf Wiedersehen."
Me, a german: "Wait a minute"
Yeah man, a German on the DF team :D
I'm just at 1:22 and I already noticed the Condemned main menu theme and the first-mission music from Serious Sam: The First Encounter.
I love this soundtrack already and there are still 17 minutes to go with a full explanation of Vsync.
My Saturday can't get any better now XD
G-Sync here, after using V-Sync at 60Hz for 10 years... I'm not sad that I changed to G-Sync. :) If anybody loves games, please get a G-Sync monitor; yes, they are expensive, but they are worth it. Cheers.
Am I the only one who can't tell the difference? GSync feels no different to me
@@nO_d3N1AL U will only notice the difference when the framerate is unstable. If u use for example V-sync and your framerate are locked to 60 and u never dip under 60 the experience will be the same. If your framerate dips or your framerate is unlocked G-sync is vastly superior.
I have a freesync monitor and it's night and day in the improvement that it makes to my gaming
Team red here. If you don't have Nvidia, get Freesync/adaptive sync instead. It's not more expensive and therefore even more worth it.
Thank you for this particular topic choice Alex! Been trying to make proper sense of Enhanced Sync for months now (currently a Radeon user) and this video is the perfect explanation to me. A needed and great watch this was.
G-sync is the best sync based on my own experience. Too bad it's Nvidia only tech because at this point I'm really not sure if my next GPU will be from Nvidia.
Why not?
@@ik1llpeeple4fun rtx prices are insane
MadFinnTech you can use an AMD card on a G-Sync monitor, it just won't have FreeSync support, & FreeSync is garbage anyway
@@Sonu666, "FreeSync is garbage anyway"
You have no clue what you are talking about. You probably use a console, so your fps is too low to use G-Sync or FreeSync.
ik1llpeeple4fun Price fixing, gimping and shady business practices.
These are great. You’re an awesome addition to the DF team, Alex.
It depends. If you play with a controller and/or you play single player games, you should use it. But if you play competitive shooters on PC that require very low input lag, you should definitely turn it off, even if you are using a lower refresh rate monitor. Sure, you'll experience screen tearing, but you will get much lower input lag which is critical in games like Siege or CSGO. One more thing: I cap the refresh rate even if I don't use Vsync (via RivaTuner). I don't like fluctuating framerate and response time. People like to play totally uncapped with FreeSync and G-Sync monitors but I like consistent framerates and response time.
No tearing for me. It's too distracting.
I play with a controller and disable it almost whenever I can. Consoles rarely get the option, but the lower latency is preferable unless the tearing is too obnoxious or the game doesn't need responsive controls.
Videos like this are why I continue to sub to this channel. Thank you Alex!!
I cannot play a single game without vsync
I use G-Sync. I can live without V-Sync
Yup me neither, screen tearing is horrendous.
I use freesync, no need for vsync.
I also can't, but it's because my laptop overheats and v-sync locks it at 60fps
Yeah, I am also very sensitive to tearing. Quake was the first game I turned VSync on in (on a 3DFX Voodoo1). I still remember the console command vid_wait 1. Since then I haven't played a single game without Vsync, because it turns out I hate tearing much more than slow framerates, input lag or low resolutions. One of the worst and ugliest things that can happen to a game, for me, is if it starts tearing. It's also the reason I don't bother with consoles, where you are basically left at the discretion of developers in this matter.
Fantastic video and the Toasty! made my day - great memories of MK2 on the Megadrive.
After 15 years of playing on console I finally bought my first gaming PC; I hope I will not be disappointed
@@dayko. I'm waiting for it to be delivered, that's 2 weeks at best (Amazon international)
@@Neiva71 don't care, I'm only interested in gaming
Whether you will be disappointed or not depends on the specs, so what are they?
@@madalinradion Ryzen 5 2600, RX 580 and 16GB RAM with a 144Hz monitor (144Hz is one of the main reasons I got a PC)
@@DmxAng It's a good PC, just don't expect to reach 144fps in the newest games at the highest settings
I am so glad I finally got a FreeSync QHD monitor. I used to hate vsync so much in the old days, especially the double-buffered kind
Lol is that the PC version of Turok? Completely forgot it even existed.
There's also a great version of Turok 2 on GOG.
sweet 3dfx times!
You do know that HD remasters of T1 & 2 came out like a year ago?
There was always a PC version of Turok 1 & 2, but a while ago both got good remasters; I think they are even on Xbox/PS4 now.
Next topic: optimization of PC components, choosing the right ones for the right resolution, airflow, positioning of the fans, A to Z guys, A to Z... I want more like this... Hell of a good job ;)
I remember last gen when consolers swore Vsync didn't exist, the human eye couldn't see it, just like above 720p, just like above 30fps.
Funny how they change their tune when consoles get it. Textures: I remember "debating" with consolers about improved textures and they said "who stands and looks at rocks when playing a game"; now they bicker and argue with each other about which plastic box's rocks look better. ha ha
console hypocrite race
Hilarious how they chop and change every other minute, pretending not to see stuff until they have it, then all of a sudden it's like the second coming of Christ if the other console doesn't have it. Then when they don't have it, it's back to being unable to see the difference.
Most console gamers are content with what they have. Engaging in a technical discussion is mostly useless.
This explains the different sync options so clearly. Thanks!
I use it in every game, or the screen tearing gets really bad, it's distracting
@Wyatt Cheng where did diablo immortal came from???
Free-sync is finally available in NVIDIA (10 and 20 series)
G-Sync is the best, but it comes with the high licensing costs of Fu-King Nvidia…
Freesync to the rescue. All you have to do is give even less money to Nvidia and give some to AMD instead.
@Ignignokt did you delete your comment?
@@nextlifeonearth FreeSync isn't as good as G-Sync, and you can't use Nvidia GPUs with FreeSync, which means you'll have to use an AMD GPU, and they're just not up to the standard of Nvidia's GPUs. AMD have really been playing catch-up with Intel and Nvidia for years and, until they start actually coming out with products that can compete with Nvidia and Intel, we should avoid them, or otherwise they won't change their business practices.
Not sure how FreeSync is not as good as G-Sync: it 100% works, and the measured delay matches G-Sync's best results, with G-Sync showing inconsistent measured delays (source: Bitwit and Hardware Unboxed). The RX 580 competes directly with the 1060 in price and performs on par with it. The next step up, Vega 56 vs GTX 1070: they are the same price and the Vega beats it convincingly in most games. Add to that the fact that you can buy a Vega 56 and a FreeSync monitor for the same money as a 1060 and a G-Sync monitor, and that the only reason you can't use Nvidia GPUs with FreeSync is that Nvidia specifically won't let you (it would take nothing more than a firmware update), and then consider who should be avoided until they change their business practices.
I was just doing research on this topic and not even a day later DigitalFoundry puts a Tech Focus video up on it.
I can't be the only one that uses V-Sync to lock the game at 60fps so my PC won't overheat
Usually, people lock their framerate using RivaTuner
No, only on older games like the first Witcher and Divinity.
Cap at 60, force V-Sync.
Otherwise they run terribly.
@@mrmagoo-i2l Exactly.
Sometimes when my laptop gets really hot
You can get RivaTuner or, even better, Nvidia Inspector
I feel so lucky, I got a G-SYNC compatible monitor for my first gaming PC, so I haven't really had to deal with any tearing or latency issues.
FYI, Blur Busters has a great series of articles on G-SYNC and input latency. It's full of measurements and solid analysis.
Long story short:
- enable G-Sync for fullscreen mode
- globally enable V-Sync in the NVIDIA control panel (V-Sync and G-Sync were intended to work together)
- disable V-Sync in the in-game settings
- either in-game or in the NVIDIA control panel, set max FPS to your refresh rate minus at least 3, so 144Hz would be a max of 141fps
All of the above settings mean you always get the minimum possible input lag without any screen tearing, regardless of what FPS your GPU is outputting.
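A quick worked version of that last rule, with the reasoning in comments. The minus-3 margin follows the Blur Busters guideline summarized above; the function name is my own:

```cpp
#include <cstdio>

// Cap a few fps below the panel's refresh so the framerate never collides
// with the top of the G-Sync range, where the driver falls back to
// V-Sync-style buffering and the input lag comes back.
static int recommended_fps_cap(int refresh_hz) {
    return refresh_hz - 3;  // e.g. 144 Hz -> 141 fps
}

int main() {
    for (int hz : {60, 100, 144, 240})
        std::printf("%3d Hz panel -> cap at %3d fps\n", hz, recommended_fps_cap(hz));
}
```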
I use Fast Sync, way better than V-Sync: low latency and no tearing, it's great
I use Fast Sync whenever possible, but unfortunately it doesn't work with all games. From my testing, Fast Sync works perfectly with 50% of my game library, while the other 50% either doesn't work at all, shows weird input behavior, or has visual artifacts.
Fallout 4 is one game where Fast Sync works wonders and makes the game a lot more playable, while the Trackmania 2 games (Stadium, Canyon, Valley, etc.) just go haywire when Fast Sync is enabled.
@Wyatt Cheng But I don't have a phone :(
Phenomenal content, please carry on this magnificent work Alex
Wow, that lament about Nvidia in the last part, but you don't say anything about the hassle of shopping for a FreeSync monitor? About how its VRR ranges can be limited? About how upgrading to a new GPU means your current FreeSync monitor may not play well with it anymore? But hey, it's easier to just put a target on big bad Nvidia. I wish AMD would do better, I really do. It would give me better choices and options when upgrading my PC. By comparison, however, my per-dollar investment has always done better with Nvidia/Intel.
Been using G-Sync for 3 years. Love it.
I wish you could disable v sync on console (for multiplayer games)
Thomas S why?? It will run at what, 31fps INSTEAD of 30. 😂😂😂😂😂🤣🤣🤣🤣🤣
Thomas S At least on console, if you have forced V-SYNC, then everyone else does too. It maintains a reasonably level playing field.
Though this argument could completely fall apart when you consider the enhanced consoles coexisting with the base consoles.
Rocket League lets you disable it at the cost of tearing. And that is a 60fps game. Of course I play it at 200fps on PC, but the option is there, just saying.
Jean Lepage Ultimately, it's not a significant difference either way. It's just that the game would then force players to deal with the tearing or willingly choose to play at a disadvantage. I've seen games that do this, and I don't think it's had a particularly negative impact on the online experience. I'm just saying that a consistent experience between players shouldn't be completely devalued.
Rainbow six has v-sync. On console.
Another great Alex video with great music.
I used V-Sync when I had a 60Hz monitor. At 144Hz+ it's useless.
True brøther
I have a 240hz monitor. Microstutters and tearing are still there, even if they are less noticeable. Vsync is still needed.
Same with 120Hz. Rarely do my games exceed 120fps on ultra settings. Usually 60-100 depending on the game
Not true. Why do you think G-Sync monitors exist?
@@Relex_92 Minimising the problem to the point where you don't notice it… is the same as fixing the problem completely
This video is too underappreciated.
I never use V-Sync in any game, I prefer having more fps with tearing over input lag, especially in shooter games, which is what I mostly play. I don't mind tearing at all.
Haven't seen a tear since switching to g-sync. I love it
Me: **turns v-sync off**
V-sync: Am I a joke to you?
I'm sorry but that's the meme I thought about when seeing the thumbnail 😂
My head is hurting! Thanks for explaining 👍🏻
Screw Nvidia for ruining the majority of gamers' experience by forcing them to buy a G-Sync monitor. G-Sync could have just been a standard for high-quality monitors with "superior smoothness". But it should be a choice, not shoved down my throat. Thanks for holding gamers back. Now I'm saving up for an AMD... I will have to wait for that variable-sync smoothness
You shouldn't even be able to ignore open standards like that. It's anti-consumer and anti-competitive.
You're trying to argue that, because Nvidia cards only support G-Sync, anyone who wants a variable-refresh monitor and owns an Nvidia card has to buy a G-Sync monitor, and that this is anti-competitive and anti-consumer. But it's actually the complete opposite, as evidenced by the fact that you've said you're going to buy an AMD graphics card next time you upgrade. Nvidia having G-Sync has created competition between FreeSync and G-Sync and further fuelled the competition between Nvidia and AMD. That increases competition in the market, which drives prices down as they compete for consumers, which is pro-consumer. In other words, it's the exact opposite of what you've claimed, and it is an excellent thing for gamers.
@@David-ud9ju This might be true if both companies were on even footing to start with, but they're not. Nvidia is the only viable option for high-end gaming GPUs, and their introduction of exclusive proprietary technology was designed to make the status quo as sturdy as possible. People buy G-Sync monitors because they have Nvidia GPUs, and then choose Nvidia for their next GPU upgrade because they have a G-Sync monitor. This is anti-competitive and pro-status quo.
Awesome stuff 😁 goodbye, Alex 😉👍
Enabling V-Sync in multiplayer games is a joke. The amount of input lag you can get is insane
@@Neiva71 Depends on the game, and if you're also capping to 60fps (or whatever your screen refresh rate is), it can reduce the lag significantly
I'd prefer nearly anything over tearing.
You can also try this trick: disable a poorly implemented in-game V-Sync, then force V-Sync from the NVIDIA/AMD control panel. It works for some games.
@@Neiva71 As I said
My cousin doesn't feel the input lag either... but I definitely do when I play on his PC... his senses are just not that sensitive... I'm sure it's the same with you people... some people still say there is no difference between 60 and 144Hz
The downside to this shift to VRR is going to be frame-pacing issues when recording a game… trust me, it's been an absolute pain. The only fix so far is to disable G-Sync and lock V-Sync to a fixed refresh matching the capture framerate… so 60fps. ☹️
Yeah it is something we struggle with too
V-Sync is awful, I disable it 100% of the time if I can
Yeah, my old FX-8350 would really struggle to run games with V-Sync turned on, but then I upgraded to an i5-6600K and now V-Sync doesn't affect performance at all. I still turn it off for multiplayer gaming though.
You enjoy screen tearing? Each to their own I guess, I prefer good image quality and stable performance myself.
My general experience has been to keep it simple: use G-Sync/FreeSync if you have it, use Fast Sync if you don't. The side effects of Fast Sync when running close to or under the display refresh aren't all that bad, and it's generally going to give you the least latency you can get without tearing, whatever your framerate.
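Conceptually, Fast Sync behaves like a mailbox-style triple buffer: the game renders unthrottled, and at each vblank the display flips to the newest completed frame while older pending frames are simply discarded. Here is a rough sketch of that buffer rotation in C++, my own simplification rather than Nvidia's actual driver code:

```cpp
#include <array>

struct Buffer { int frame_id = -1; };

class FastSyncChain {
    std::array<Buffer, 3> buffers{};  // front buffer + two back buffers
    int front = 0;                    // buffer currently scanned out
    int newest_ready = -1;            // most recently completed back buffer

public:
    // Called by the render thread as fast as the GPU can produce frames.
    void submit(int frame_id) {
        // Pick any buffer that is neither on screen nor the newest pending one.
        for (int i = 0; i < 3; ++i) {
            if (i != front && i != newest_ready) {
                buffers[i].frame_id = frame_id;
                newest_ready = i;  // an older pending frame is silently dropped
                return;
            }
        }
    }

    // Called once per vblank: flip to the newest completed frame, no tearing.
    void on_vblank() {
        if (newest_ready != -1) {
            front = newest_ready;
            newest_ready = -1;
        }  // else: no new frame finished, keep showing the current one
    }

    int displayed_frame() const { return buffers[front].frame_id; }
};
```

This is roughly the same policy as Vulkan's VK_PRESENT_MODE_MAILBOX_KHR, which is why Fast Sync keeps latency low at high framerates: the frame on screen is never more than one refresh interval stale, no matter how far ahead the game is rendering.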