Only to discover edge-cases where the easy solution falls apart. Fixing these is a recipe for spaghetti code. Sometimes the harder approach ends up simpler than the easy one.
@@BenLubar can't tell you the number of times I've hit a problem and after some reading couldn't believe why nobody had thought of this simple solution. After implementation I understand exactly why it's not used. That said, occasionally, very occasionally you hit upon a unique simple approach and with trepidation you realise it is actually new and innovative.
19:53 - in case anyone is confused about that, the trick is making sure your GPU isn't oversaturated at 100% usage, that's basically what reflex is doing. For example: - 131 fps, 99% GPU -- less responsive - 120 fps, 91% GPU -- more responsive That said, if your GPU is underutilized, then it's the other way around: the more fps, the better. - 131 fps, 57% GPU -- slightly more responsive - 120 fps, 51% GPU -- less responsive
As someone who has had to change the refresh rate on two friends' monitors because they never changed the setting i've got to say: spreading the word is important. CHECK YOUR MONITOR SETTINGS, PEOPLE
Getting Nvidia itself to sponsor you and send you testing hardware definitely feels like a badge of honor that you've earned after all these years of creating my favorite tech and gaming related videos across all of your channels. Congrats!
in b4 the amd fanboys calling him an "nvidia shill" just like phil's previous vid regarding Ray reconstruction in his last video, its fun and hilarous to see amd fanboys cry
@@Eleganttf2 how ironic, you ridicule AMD fanboys before they utter a single sentence revealing yourself as an even more rabid fanboy. For reference I currently game on an nvidia card and have happily criticised both AMD/ATI and nvidia for decades.
@@beardedchimp lol you don't know how much amd fanboys stormed phil's last vid when he's literally just talk about Ray reconstruction go see for yourself if you dont believe me, and im just stating the truth here to prevent more amd fanboys from trashing on this vid's comment section
@@beardedchimp but sure if you wanna keep catering to amd just like any other tech tubers fishing for the subs and likes because the popular narrative is now "nvidia=bad, greedy. Amd=generous, good guy" then go ahead
it is not always best to set it to the max. there are cases where setting the mouse's polling rate to >500 can make games unplayable. but in 2023 you probably wont have to worry about this, just set it to the max
@@slonkazoidvery true i play on a very high poll rate on both my mouse and keyboard, some software just straight up cant handle it, i get double inputs on my keyboard and specifically when trying to use optifine in minecraft if you use high dpi it will break and over accelerate your mouse
@@griffin1366 I know many years ago mouse manufacturers used to use artificial sampling 'boosters' to give highly marketable huge numbers. The chipset had a native max refresh above which they were less accurate. I'm not sure if that is done any more as technology progressed. Gamers were also obsessed about using the lowest sampling rate, or the "holy grail" of 400hz. I'm glad this video dispelled the myth that 400hz was optimal for input latency.
the liquid crystals (from LCD) in the display are actuated electromagnetically, overdrive just overvolts these tiny electromagnets in the hopes that it accelerates the crystals faster. but as the electromagnets don't actively slow the crystals down for their new position, they can overshoot and then bounce back a bit
This is actually the best video i've seen at explaining all the different components that go into rendering an image and what each of the options you can add on to that process do. awesome work!
finally a video that actually explains what an ldac is instead of assuming the viewer knows all about it and it's history of an obscure device they'll never see
Everyone I've ever seen use an LDAT has explained what it is and how it works at least once. If you just watch a later video of theirs where they don't repeat that basic info that's on you. You can always Google what an LDAT is.
@@cezarsoba "erm if you werent an idiot and used google you would maybe not have this problem you little plebian". The delivery is very superiority complex esk. and it just comes off a little weird. Formality means alot in text.
This video saved my sanity. I have had the black reflex testing box on my screen for months. No idea where it came from or how to fix it. I scoured the internet with the most common response being a dying monitor or various other issues, whose solutions did not solve my problem. IT HAS BEEN MONTHS!!! I quite literally debated clicking this video. I'm not concerned with my input lag in CS2 as I don't notice a problem..........If I hadn't, I would still be playing with that little black box that flashes every time you click. Thank you for giving my sanity back
Well done Philip. This is the most comprehensive video on input latency I've ever seen. You do brilliant work with your platform in the CS community. Cheers 🍺
@@NulJern So you didn't watch it then? Just because something is sponsored by a company doesn't mean that the information contained in the video isn't useful, accurate and comprehensive. SMH.
@@NulJern Since you can hardly speak English, I'd advise you to sit down and take a breath. BTW, I'm probably twice your age. Unfortunately, you are not showing much intelligence.
Hi Phillip, Reflex+GSYNC+VSYNC is theoretically supposed to limit your framerate to a percentage below your refresh rate. I believe that's why your vsync+gsync testing showed 58fps and such a reduction in latency! However I have noticed this feature does not kick in with 100% consistency. Maybe something to look into a little further? Great video!
Great work mr philip, there's not many resources out there that go this in depth on latency. I'm sure this video will be a lot of use to people over the years
Awesome video! One thing I wanna note, all of these different things add up really quick, to the point where your latency could be 4-6ms faster then if you didn’t fine tune everything. That doesn’t sound like alot but you definitely can notice the difference. That paired with making sure your cpu and gpu aren’t at a super high usage in order to make sure you have the most consistent frames with the highest 1% lows, is the most important thing you can do for your setup. The highest framerate ≠ the best and smoothest experience. Frame caps are very important, finding the middle ground between latency and fps is the hardest part. My rule of thumb is to make sure your gpu and cpu arent over 85% usage, because past that it starts affecting the consistency of your frames. Cap your fps to something you can constantly obtain within that 85%. Higher 1% lows = less microstuttering and buttery smooth gameplay. There is also a GAINT rabbit hole of tweaks and different things you can change on your pc in order to lower latency, some of it is not good so always be careful and do your research before listening to what some random person online says
13:20 As a side note: when using gyro input and flick stick for playstation 4 controllers on steam input, be sure to use a wired cable. When I tried wireless, the input lag was extremely noticable due to bluetooth delay, or at least to me. If you want an idea for a continuation for this video, I think it'd be interesting to see how controllers stack up to keyboard/mouse.
Just another reason why Valve needs to get off their butts and release a second-gen Steam Controller with all the features the DS4 has but none of the reliance on Bluetooth.
@@stevethepocket I'll take bluetooth. Just give me the option like that Razer mouse Philip has shown. I don't want a single-purpose USB dongles when I don't need them.
Bluetooth was never meant to have good response times. Although for some reason PS3 controllers do well wirelessly. That's not to say that wireless is slow, just the Bluetooth protocol is.
@@vasilis23456 I'm pretty sure Sony controllers all have a 2.4G mode that relies on a proprietary receiver inside the console. They wouldn't be able to do stuff like PSVR head tracking without making people sick otherwise.
I love this video! You totally nailed what's been bugging me since I ditched my old CRT for a crappy LCD. Had no clue about input latency, and here I was, wondering why everything felt off. Thanks for clearing that up, seriously I feel enlightened now
I have been searching for a video like this all week. I have been using my 240fps camera to test different setups. Thanks for the video. I learned a lot.
Just to make it clear, you want to enable vsync when you use gsync to get completely rid of tearing. When you have vsync and gsync on, it acts differently than regular vsync. But you need to cap your framerate slightly below your refresh rate, or use nvidia reflex which does it for you. If you don't cap your framerate, it'll hit your refresh rate and default to regular vsync which introduces a lot of latency.
No, Vsync will have as much Input latency as Gsync/Freesync,/Adaptive Sync these technologies adapt the refresh rate of the monitor to the actual framerate you create. If you have as much frames as you have refreshes V-Sync and adaptive Sync act the same. Adaptive sync lowers the monitor refresh rate when your FPS drops below the monitor refresh rate, which leads to less input latency than V-Sync in that scenario, because V-Sync will simply output the same frame again thus halving FPS for that moment. The best solution is to use triple buffering solutions like Fast Sync (Nvidia) or Enhanced Sync (AMD) together with an adaptive sync technology. Because these triple buffering solutions always render a new frame and only output the newest one, thus reducing latency(if you have more FPS than monitor refreshes). (But be aware there is another triple buffering technology which act as a frame queue that greatly increases latency, so don't assume triple Buffering will always decrease latency)
@MrDavibu No, wrong. OP is right. Enabling Vsync with Gsync enabled IS NOT the same as only using Vsync. When using Gsync you need to also enable Vsync in Nvidia settings, otherwise you still get some screen tearing on top and bottom of your screen. Specifically, you also need to cap framerate to AT LEAST 3fps below monitor refresh rate to guarantee Vsync never kicks in. Enabling Vsync while Gsync is enabled will not increase latency like it does with only Vsync enabled.
When I recorded my 240hz monitor with a 240 fps cam and tested different overdrive settings the main difference between them was that at lower overdrive levels a new frame often looked transparent. The white object that I used for testing was often grey the first frame it was visible and turned white in the next frame. With the highest overdrive level this still happened sometimes but a lot less often. I'd assume that the transition from black to grey is already enough to trigger the LDAT.
I feel like the title shouldn't be limited to CS2. Anyone searching for videos on input latency would learn so much from this, but they might not click the video if they think it only applies to CS.
Hey Philip! Always love these videos! I'd really like to see some testing with Gsync's intended use with vsync on + a capped framerate! I remember reading blur busters' articles testing gsync both with and without vsync on. They basically came to the conclusion that frametimes were more consistent with it turned ON and the framerate capped. Would like to see some additional testing to backup you saying in this video that you should always turn vsync off
Wow. I've been a regular binge watch viewer of your channels but this video might be one of your top 10 masterpieces. The way you stated that this company is the best with pronunciation of the exact word while stating your personal opinion about advertise - the very thing you are doing right there - was just word artistry. And then the level you invest into testing and research! This has at least the potential for a standalone Paper journal contributed, if not even a whole PhD thesis in computer science. I think I have never seen somebody so seemingly unintentionally achieving such an academic level of depth while doing something so obviously entertaining for oneself at the same time.
@@Eleganttf2 lol i wrote my point you dumb kid. Read what i wrote instead of reading into what i wrote, so no that's NOT what i am trying to say, and you must be ignorant even thinking that, considering it wasn't what i wrote.
V-Sync can be useful (it gets rid of tearing occurring at the lower part of the monitor) without adding any additional latency when used with G-Sync and capping the frame rate below the refresh rate of your monitor, please correct this misinformation I am of the constant slander towards V-Sync by people who don't know how to use it.
Philip, I would love to see the "Going Low" series in CS2, like you did in CS:GO again after 10 years. It was such a good series, I would love to see it again.
@@3kliksphilip I've heard similar comments from my English teachers before. I think their main issue with it is that it is kind of generic and they wanted us to use more creative language when writing essays and stuff. If I remember correctly, they suggested we could instead restate the thesis with slightly different language, but I could be off about that. That kind of thing never really bothered me personally, loved the video as always!
@@3kliksphilip I've read student essays where they constantly end each section with "in conclusion". When you finally reach the end and the actual conclusion they've already concluded several times. In conclusion, explain your point rather than tell the reader your surmising is what they should conclude. (This is here purely to bump up the word count to submit the essay) Final conclusion, Jane Austen ruined the 90's for me, I'll never forgive GCSE English lit.
@@3kliksphilip Like other people have said I think it’s because it’s considered a cliché. I never really fully understood that since it just means the audience 100% knows it is one. However they are the teacher so they know much better than my whiny complaints.
Extremely detailed explanation, thanks! I'm wondering how much of a factor a keyboards input lag would be in CS. Counter strafing is very important in this game, and standard mechanical keyboards have a debounce delay of around 5ms. Keyboards with optical and analog switches don't have that 'problem'. Could be an idea for a follow up video.
Sometimes, I think that my input latency in CS2 genuinely reaches 400 ms- it always happens shortly after a lagspike before going down again, and it's not very frequent, but it's really annoying, especially since I never had that problem in Global Offensive.
@@ToodelsALXno, he literally said his problem is directly related to his network connection, why tf would hardware acceleration do anything. I mean let’s be honest guys, this video is great, but in reality we, as users, barely have any control over any of the graphics pipelines being used in games. That’s why he even points out that Nvidia has to calibrate these things, we can affect it don’t get me wrong, but if you genuinely think you can fix 400ms response delays with a flip of a switch you might be crazy
I think the only thing that wasn't answered by this video was how the latency compares between GO and CS2, other than that, this video is incredibly well done! I'd be curious how the measure up with the tools you now have to measure such things!
With the latest CS2 update they got rid of all the input latency that was present in csgo and made firing client sided instead of server sided. There was an upto 16ms input lag in csgo and it doesnt exist anymore with cs2 which means firing is now instant
@@Acidity01 I know they changed the firing animation to play sooner, but I mean the input latency itself, like does it actually take longer for CS2's inputs to register than CSGO's? I know subtick means that the shot will register where the input was detected via the game client, meaning --sort of-- directly after the CPU processes it and it's sent to the game, which would happen more or less in parallel with the display of the shot, but I'd just be interested how the "feeling" of latency compares between the games now.
@@Max128ping I mean, I personally don't think it does, but there's been so much anti-CS2 rhetoric, it'd be nice to at least put it in perspective, that's all
If you have a G-sync monitor turn V-Sync ON and enable Nvidia Reflex. It will automatically cap your FPS just below your monitors max refresh rate. This eliminates tearing and has other benefits.
we are talking about 3-10 miliseconds differences here, average human reaction is 150-300ms. Its much more worth it to have GSYNC + REFLEX ON+BOOST and cap fps -3 below refresh rate for the ultimate smooth experience imho
Something to bear in mind is how most reaction time tests use a physical feedback system. Your brain can recognise changes much quicker than 150-300ms, it's just that it takes that long for your nervous system to create a signal from visual changes, process it in the brain, send the signal to your hand and then actuate the muscles. Which means that a noticeable/detectable difference in latency is probably a lot smaller than assumed
@@nade5557 true and fair point, but if we are talking about input latency being an advantage then we need to take into consideration the whole chain, the server, tickrate, visual change, brain recognizing it, sending signal to move your arm and click, then comes into play your peripherals, your system HW and its processing time, CPU sending data to GPU to render and then into monitor etc. Thats basically why i am saying that i enjoy more a game that is silky smooth with no tearing with potential 5ms higher input lag then a micro stutter/teary mess with ab solute minimal input latency. But hey maybe its the age talking, ive been playing competitively from 1999 to 2015 and now prefer the visual enjoyable experience over extra few ms sacrificed :D
I was really expecting a latency comparison between CS2 and CSGO. Also as a man silly enough to daily drive a 4090 and 13900k as well, I was expecting to see which CS2/PC settings offer the absolute lowest input latency on that specific setup. Great video nonetheless!
I'm pretty sure ive seen tests with CSGO before and it was kinda bad in terms of latency (mouse click VS seeing shot on screen). Also CSGO lacked any sort of reflex implementation so I wouldnt be surprised if it performs worse than CS2
@@Ferdam I'm quite sure it would, I was thinking it would be a fun statistic to throw at people noceboing themselves into thinking CS2 is the worst game ever developed. However a bit more consideration and reading some comments here makes me think it wouldn't really matter and there will always be some other unrelated thing made up to support their view.
3kliks is one of the most valuable resources to the general gaming community, he answers questions I don't even know I want the answer to (not this one obviously lol, but I mainly mean his other videos)
FYI, if you're using GSync or FreeSync and you're given the option of VSync, always keep it on. You were never supposed to be given that option in the first place.
small clear up regarding adaptive sync. it is supposed to be used alongside vsync with a framerate limit 1 frame below your maximum refresh rate. this has minimal additional input latency compared to fixed refresh rate vsync which doubles your input latency sometimes. so gsync + vsync and limit framerate 1 frame below refresh rate
Testing for muzzleflash vs a flashing box is going to be slower because the game engine must register the fire input, pass it through game logic, generate the appropriate animation, particles, sounds, client-side predicted hitscan, and rendering of all of these things before you see it on the screen. One is specific to the game itself, the other is purely system latency before whatever a game adds on top. Some games end up buffering 2-3 frames, or have other situations going on like using a message/event system to pass input and game logic events around (i.e. input message is entered into the queue one frame, the next frame the game logic processes it and issues more events to generate animation/particles/sounds which only start the frame after, and might not even be visible until another frame after that) which at 100FPS is automatically ~30ms on top of system latency. Each game is going to be different depending on what the developers cared about. The original PC release of Halo: Combat Evolved was the worst thing I'd seen before or since. It had a good solid 200+ milliseconds of input latency in spite of running at a decent framerate, and coming from playing id Software FPS games for a decade (at the time) and playng Halo:CE with a mouse meant YUCK!
I don't think you really know how rendering works. It is so stuff doesn't just "appear", muzzleflashes are slightly delayed so your brain feels better about it. It is not because it "has to pass through game logic"... Input events will usually only be a frame delayed at most, and not super common to be delayed.
Putting the polling rate to max on most gaming mice will result in the accuracy of the movement (particularly rapid movement like flick shots) of the cursor tracking getting worse compared to its default. This is because manufacturers are running that higher polling by "overclocking" the individual components of the mouse, making them perform worse under pressure. This is done so they can have a bigger number on the box, not because it's actually superior. If it was objectively better, it would be the default, or the option to adjust polling rate would be removed altogether! In my experience, if a mouse offers 250hz, 500hz and 1000hz as options, 500hz strikes a good middle-ground, while 250hz is usually the most accurate.
Good point. Also, can't wait for Linux to become mainstream and games to have native builds for Linux, as OS bloat in Linux is much less as compared to Windows. (well, if you install a lot of stuff it can get the same or worse, but Linux itself is much more configurable and can be stripped down much more. And configured to have no telemetry at all (like not even check if it's enabled or not) for example)
It depends on what "tweaks" the ISO does. There's a few things you can do that drops input latency by 10-15% (1-3ms depending on the game / scene etc.) but it's not worth breaking your install for. Also while it may be a bit faster, it's less stable so to speak.
@@Winnetou17 "bloat in Linux is much less as compared to Windows", you say that but have you ever looked at the xorg source code? It is ancient bloat layered with decades of bloat on top of bloat. I'd be really interested to see this tested with xorg vs wayland, nvidia vs amd. I was gifted a 2080ti and use i3wm, nvidias hatred of open standards made using sway (wayland) a nightmare. While I can get it running these days there are all sorts of issues and last time I tried to game on it the input lag was horrible. With the AMD driver being open source, I'd bet that if there was serious effort put into minimising latency across the entire pipeline they could achieve incredible numbers.
I've been testing my ZOWIE 2540K vs newly purchased LG OLED 27GR95QE. On 1080p max settings the 2540K average PC latency displayed on NVIDIA's in-game overlay is around 7ms average vs LG's 1440p max setting settings at an average of about 10ms. In conclusion, if you're strictly a CS player, ZOWIE is still king, IMO. Thank you very much for this video - learned a bunch!
18:14 There actually is one scenario where (double-buffer) vsync can be used with the express purpose of minimizing input lag, and that is when it is paired with variable refresh rate modes (freesync/gsync.) Battle(non)sense covered this in detail in his videos. However you also must cap your frame rate to slightly below your maximum refresh rate in this scenario. If you prefer to play at an unlimited frame rate, then instead you avoid vsync and free/gsync and instead use nvidia ultra low latency or AMD anti-lag to keep the input lag down.
Can you use the LDAT on a CRT monitor? I have a fairly nice Philips CRT that I can run at 1920x1440 at 85Hz. It feels so buttery smooth because there isn't the sample-and-hold method used on digital monitors. I don't know if that means there is less "real" latency but I'd love to see someone do a proper CRT test with an LDAT. A Trinitron-based CRT would be the best to use for a test.
CRT smoothness is something else. Not even OLED or any BFI tech gets close to what a well tuned high end CRT can do. Prices of good CRTs still make me cry though.
A CRT isn't displaying a constant image, each time it scans it will be flashing bright and dark which we perceive as a constant image. If you are trying to detect a change in brightness then you'd have to differentiate between the flash during each refresh and the brightness between refreshes. The phosphors in different CRTs also have different decay curves after being stimulated complicating matters. That effect along with bleed is part of the reason why it feels so smooth, just like film our eyes don't see a single discrete frame but the smooth blur between movement. Though with film it is done during exposure, CRT is from its display. Also CRTs are superior simply due to the de-gauze button. I can't tell how many times I pressed it in the 90's for that beautifully satisfying sound. I was worried I might damage my fancy monitor I bought just for CS 1.3, but I couldn't help myself.
15:14 your cat is maybe suffering from infection. His left eye is a little swelling. one female cat always came to my home in the evening for milk. And i noticed swelling and it makes it harder to see for her. Luckily There is a medicine for it.
Despite all of the problems behind this release, I don't think I've experienced much issues with input lag. In fact, I'd say it feels much more accurate than GO. However, there are parts in 2 where I experience chugging and that *feels* like input lag even if it isn't. I miss a lot more shots but I'm able to now blame it on my bad aim, not on the servers. It's a marked improvement.
You can enable V-Sync to avoid tearing without a latency penalty if your game + hardware + screen combination supports both G-Sync and NVIDIA Reflex (and you enable all three) Source: ua-cam.com/video/Gub1bI12ODY/v-deo.html
The Latency actually changes if you measure it on the top, mid or bottom of the display. At least with LCD panels. I got a really handy tool build by Leo Bodnar. It's similar to the LDAT, but it generates a Testsignal on an HDMI Output. This allows you to specifically determine the latency of your display. And it got 3 test areas. Top, Mid and Bottom.
Can we get a comparison between CSGO and CS2 latency? Like stats with uncapped and capped? Would be really interesting to see if cs2 has actually higher input latency in the engine or can the perception of more lag come just from the lower fps.
Considering the comment from nvidia that CS2 testing was so low latency they thought it was a bug, I can't see how CSGO could be better. As you said, I suspect lower fps combined with gamers being hyper critical and having the perception that updates/new releases are broken and inferior.
@@beardedchimp The "idea" that something is true is often more impactful on a person's perception than reality. Especially considering that gamers rarely actually know these details.
@@beardedchimp had 4-6ms with nvidia reflex tool on windows 7 5600x build and viper 8k mouse on csgo, yes getting windows 7 to work on a 5600x was pain and suffering but god damn did it feel snappy
@@fridging that does sound satisfying after enduring the pain. I remember doing that with an ancient laptop in 2009, the CPU had such an old instruction set that even linux/unix distros had abandoned it. Those that "hadn't" would actually segfault all over the place, just nobody was around still using that CPU to tell them. I eventually found a distro that actually worked and once I got XFCE running, my god it was gloriously snappy. Couldn't believe it. Oh and it had no cd drive and couldn't boot off usb, had to network boot from another computer via TFTP. God the pain, but oh my it made the satisfaction ever so euphoric. How did you measure end-to-end input latency as 4-6ms? It takes a 360Hz monitor to even approach those numbers. My 144hz love child (or similar) uses all of those milliseconds even before the mouse input and rendering pipeline has a say.
I've also had the same suspicion of input latency lowering when you use the fps max command to limit to slightly lower than your average, im glad you tested it because I don't really have the tools to test it properly lmao. I also have noticed that on my rig frame time consistency is slightly better when you use fps max with an actual number rather than 0, even if its 999 and you NEVER hit 999. very odd behavior I wish you had also tested the AMD low latency thats built into the drivers, as it feels slightly odd to me but I cant place why. I suspect it COULD be input latency related as I don't see an obvious frame rate or frame pacing issue
Hey philip, I had a question after watching the video and wanted to ask you as you stated "if you have stupid questions, do ask". Since the Operating System (and the USB SW which I assume mean drivers) is part of the latency chart, I wonder if different operating systems would have resulted on different results. What do you think? Will it be a noticable latency change or barely noticable?
Yes, in fact the Linux kernel has an alternative specialized "realtime" configuration, which offers the smallest latency possible. It is used by industrial and scientific applications mainly, where response time is _critical_ (as in, heavy machinery might crash if the program "misses a beat"), and audio processing hardware (yeah some audio gear is just a computer running Linux). This is not viable for desktop use because it assumes programs won't try to hog all the CPU, but there are other, less dramatic configurations you can make to reduce desktop latency, and in fact some Linux distributions do use these settings by default.
If you look at the diagram with the green blocks, you'll see the composite stage. He mentioned this regarding fullscreen vs. windowed in Windows. Since the GUI uses different compositors on Linux, I'm sure there would be a difference here. How much? I don't know. There are probably other differences regarding drivers and USB handling, as well as how games are rendered in the first place.
My freesync monitor is one of the beat purchases I ever made, I didn't know what I was missing until I had it. Visual tearing has always driven me nuts, and when I first got into PC gaming I didn't know how much delay vsync added and always had it on. I've come so far
Tearing drives me absolutely nuts. Since the mid 90's technology has seemed to fluctuate where tearing is getting better, then suddenly it is the bane of my existence. The worst is watching films where they have a slow cinematic panning shot totally ruined and torn up, LOTR in particular.
Depends on the refresh rate on the monitor (like said in the video) i also dont notice screen tearing at 500 fps with a 240hz monitor... but with my 4k 60hz monitor/tv i certainly do a lot! @@budgetking2591
Nah, This is IMPOSSIBLE. I just like minutes ago went to your channels wondering where u were or you took a small vacation for Valve to update the game.
You forgot a mention for Reflex, Reflex is also acting as a FPS limiter, to prevent the card using too much of the BUS or the graphic card itself, because a maxed BUS or GPU cores increase drasticaly the inputlag, Reflex is preventing that and is keeping the GPU and BUS usage in line to not bottleneck the card, Battlenonsens made a pretty good analysis of this and how FPS cap, Reflex, GSYNC settings alone or mixed reduce or not the inputlag, even just limiting the FPS from 10-20% away from your max FPS reduce the inputlag massively, even on high end hardware. Also, IIRC, VSync SHOULD BE ON on the Nvidia control panel when using GSYNC (NOT IN THE INGAME SETTINGS).
I've always found that capping FPS does wonders for reducing jitter. When it comes to human perception, jitter can really ruin perceived responsiveness and input lag. It also feels nasty, at least input lag tends to be consistent.
From my understanding no. Subtick is specifically the system that timestamps inputs from the player and passes it to the server. Which makes outcomes technically more accurate. This has nothing to do with latency, as input latency is (as far as I know) tied to the players hardware and doesn't have anything to do with the server or how it communicated information.
As stated above Subtick is like a time stamp. It works well and Imo is a huge improvement over cs go. The only issue is that subtick is still time stamping 64tick servers and broken hit registry lol!
Something about V sync is that it not only depends on the game but how you do it. For example battle(non)sense did videos testing v sync using an ldat in other games like overwatch. He found that limiting the fps below the refresh by small amount (142 fps cap in a 144 Hz monitor) did not add any latency and in fact reduced it. Using v sync with uncapped fps is the main culprit for the delay
Phillip after testing input latency using specialized expensive hardware and very complicated software techniques: CS2 is fine and its impossible for humans to notice input latency as long as your pc itself is build correctly, meaning there's nothing you can do to make your latency better or worse. Random CS pro after doing zero tests and basing his entire knowledge on random reddit posts: Yeah CS2 is broken guys it takes 1000 years to shoot idk bad gamve valve pls fix.
@huthunterhut Some people feel it more than others. My friend is one of them. He is pretty skilled when it comes to gaming in general, and also has a pretty good PC. Says he feels like his hand is "sick" when the latency is "too" high.
@@griffin1366 I get what you mean, but he caps it at around 120 in COD. He spends hours every week just testing and tweaking settings, both in and outside of the game(s). He tries everything from getting a new "correct" mouse to trying out different types of mouse pads. Only game he really uncaps the framerate in is CSGO. He can't play CS2, even with a cap he says it's like swimming through mud.
while this maybe alot to ask we need to see the cost of each graphic setting in term of fps and input latency like setting global shadow to the highest and see how it affect input latency. that would be freaking sick
Always wonder about 'power management mode' on nvidia control panel. Yes it won't be big difference when it comes to FPS, but how about latency *by maintaining gpu clock speed in CPU bound scenario? Will 'prefer maximum performance'+ just reflex be same as reflex + boost?
No, reflex works by letting the GPU instantly push the frames out to your monitor instead of passing the frames through the CPU first. It doesn't really have anything to do with clock speed or power usage.
@@smurfeeNnCPU PASS the frame to GPU. Reflex makes CPU sending frames as much as GPU can handle in realtime(by limiting some of those, and more complex things than that), thus lower latency as GPU can render frames fast as possible. Most important, I'm not saying about reflex, I'm talking about 'power management mode.'
Prefer Maximum Performance isn't the same as boost. Even in CS:GO with PMP enabled I wouldn't hit p0 states on the GPU. The only way you can get around that is to disable power savings with a registry key but then you're always running at p0 state (don't worry it's safe, my GPU has been at p0 for 4 years). +Boost does all that for you. In case you were wondering, not running p0 has a latency impact.
Love a good, chunky, "I casually got nvidia to send me a bunch of industry-grade latency testing devices" Phil video.
Chunkyyyyyyy
They liked his sponsored vid so much, hes practically a nvidia pr guy now.
Philip when the same nvidia he talked shit about sends him money to do a video : 🤯🤯🤯‼️‼️
@@Andytlpthat nvidia shill even has an nvidia christmas sweater.
@@Andytlp Ironic since philip has always been an AMD fanboy
I'm not deaf, but i still appreciate you taking the time ti make subtitles on EVERY video.
Sometimes there are little Jokes hidden 🥹
@@kugi518 He puts a ridiculous amount of effort in, i truly admire his dedication to his work.
Also, I'm happy youtube fixed their subtitles, for a couple of months, it showed the whole text for the first 5 secondes and then nothing
They are useful for non native English speakers too :P
I also love it cuz if im eating or something is happening in the video other than him talking which rarely happens i can just read it.
"The solution is usually easy but it's hard to figure out which easy solution is the correct one" describes literally every problem in computing
Only to discover edge-cases where the easy solution falls apart. Fixing these is a recipe for spaghetti code. Sometimes the harder approach ends up simpler than the easy one.
@@beardedchimp"For every complex problem there is an answer that is clear, simple, and wrong." --H. L. Mencken
@@BenLubar can't tell you the number of times I've hit a problem and after some reading couldn't believe why nobody had thought of this simple solution. After implementation I understand exactly why it's not used.
That said, occasionally, very occasionally you hit upon a unique simple approach and with trepidation you realise it is actually new and innovative.
19:53 - in case anyone is confused about that, the trick is making sure your GPU isn't oversaturated at 100% usage, that's basically what reflex is doing. For example:
- 131 fps, 99% GPU -- less responsive
- 120 fps, 91% GPU -- more responsive
That said, if your GPU is underutilized, then it's the other way around: the more fps, the better.
- 131 fps, 57% GPU -- slightly more responsive
- 120 fps, 51% GPU -- less responsive
As someone who has had to change the refresh rate on two friends' monitors because they never changed the setting i've got to say: spreading the word is important. CHECK YOUR MONITOR SETTINGS, PEOPLE
played 4 months on my new omen 165hz monitor before realizing it was set to 60hz in nvidia settings like 2 weeks ago... cheers!
@@mcwurzn8194Spread the word to everyone u know. Even your grandma's cat
if you do your opponents will get better!!
I had spent almost 6 years playing on a 75hz freesync monitor locked to 60hz and didn't even realize it lol
Bruh idiots
Getting Nvidia itself to sponsor you and send you testing hardware definitely feels like a badge of honor that you've earned after all these years of creating my favorite tech and gaming related videos across all of your channels. Congrats!
in b4 the amd fanboys calling him an "nvidia shill" just like phil's previous vid regarding Ray reconstruction in his last video, its fun and hilarous to see amd fanboys cry
@@Eleganttf2 how ironic, you ridicule AMD fanboys before they utter a single sentence revealing yourself as an even more rabid fanboy. For reference I currently game on an nvidia card and have happily criticised both AMD/ATI and nvidia for decades.
get a life@@Eleganttf2
@@beardedchimp lol you don't know how much amd fanboys stormed phil's last vid when he's literally just talk about Ray reconstruction go see for yourself if you dont believe me, and im just stating the truth here to prevent more amd fanboys from trashing on this vid's comment section
@@beardedchimp but sure if you wanna keep catering to amd just like any other tech tubers fishing for the subs and likes because the popular narrative is now "nvidia=bad, greedy. Amd=generous, good guy" then go ahead
Thumbs up for the K'nex bolt build striking the mouse, well done Philip.
including all your latest testings . could you device a best settings video for cs2 . An optimisations of sorts
s@@3kliksphilip
11:34
3KP: You probably set this to the highest and think it's the best...
Gamers: Oh no 😳
3KP: ... And you're right.
Gamers: Sigh of relief
it is not always best to set it to the max. there are cases where setting the mouse's polling rate to >500 can make games unplayable. but in 2023 you probably wont have to worry about this, just set it to the max
@@slonkazoidvery true i play on a very high poll rate on both my mouse and keyboard, some software just straight up cant handle it, i get double inputs on my keyboard and specifically when trying to use optifine in minecraft if you use high dpi it will break and over accelerate your mouse
@@slonkazoid 8000hz has issues in most games. People run them at 2000 or 4000hz.
@@griffin1366 I know many years ago mouse manufacturers used to use artificial sampling 'boosters' to give highly marketable huge numbers. The chipset had a native max refresh above which they were less accurate. I'm not sure if that is done any more as technology progressed. Gamers were also obsessed about using the lowest sampling rate, or the "holy grail" of 400hz. I'm glad this video dispelled the myth that 400hz was optimal for input latency.
@@griffin1366 i dont get why one would go above 4000hz anyways. what are they doing? why do they need 8000 position packets per second?
the liquid crystals (from LCD) in the display are actuated electromagnetically, overdrive just overvolts these tiny electromagnets in the hopes that it accelerates the crystals faster. but as the electromagnets don't actively slow the crystals down for their new position, they can overshoot and then bounce back a bit
Yeah, OD is meant to combat ghosting, but it won't do anything about the latency.
"accelerate" the crystal faster? tf does this mean
Not electromagnetically, electro-statically.
@@Andy_M.S.cthey move faster
@@Andy_M.S.c moving from one shape into another requires some acceleration, which can be stronger or weaker, the chnage voltage changes that strength
Unbelievable how professional this video is, even compared to your normal ones.
well it's a well paid ad. just like the other "nvidia good!" videos.
It's hardly an ad.
This is actually the best video i've seen at explaining all the different components that go into rendering an image and what each of the options you can add on to that process do. awesome work!
We've had this video for like several years. I forgot who made it but this isn't the first one.
You never cease to amaze me Philip. This is actual insanity. I can't believe you went through all of that for this game. I love this community !
I know this logo.
finally a video that actually explains what an ldac is instead of assuming the viewer knows all about it and it's history of an obscure device they'll never see
Everyone I've ever seen use an LDAT has explained what it is and how it works at least once. If you just watch a later video of theirs where they don't repeat that basic info that's on you.
You can always Google what an LDAT is.
@@mechanicalmonk2020least obnoxious redditor
@@fnutek3720 ¯\_(ツ)_/¯
@@fnutek3720just because he's got a point, he's a redditor?
@@cezarsoba "erm if you werent an idiot and used google you would maybe not have this problem you little plebian".
The delivery is very superiority complex esk. and it just comes off a little weird.
Formality means alot in text.
This video saved my sanity. I have had the black reflex testing box on my screen for months. No idea where it came from or how to fix it. I scoured the internet with the most common response being a dying monitor or various other issues, whose solutions did not solve my problem. IT HAS BEEN MONTHS!!!
I quite literally debated clicking this video. I'm not concerned with my input lag in CS2 as I don't notice a problem..........If I hadn't, I would still be playing with that little black box that flashes every time you click.
Thank you for giving my sanity back
really interesting to see all these factors measured.
Well done Philip. This is the most comprehensive video on input latency I've ever seen. You do brilliant work with your platform in the CS community. Cheers 🍺
it's an ad.... wtf are you talking about lol.
@@NulJern So you didn't watch it then? Just because something is sponsored by a company doesn't mean that the information contained in the video isn't useful, accurate and comprehensive. SMH.
@@gnomecs2 you are naive little boy if you actually think that. Yes people promotes other for free in this world
@@NulJern Since you can hardly speak English, I'd advise you to sit down and take a breath. BTW, I'm probably twice your age. Unfortunately, you are not showing much intelligence.
THAT BLACK REFLEX SQUARE!!!
I used to have it on HDR games and thought my PC was broken/weird for it because I had no idea what it was
This is a ridiculously impressive research project. Great work Philip!
Hi Phillip, Reflex+GSYNC+VSYNC is theoretically supposed to limit your framerate to a percentage below your refresh rate. I believe that's why your vsync+gsync testing showed 58fps and such a reduction in latency! However I have noticed this feature does not kick in with 100% consistency. Maybe something to look into a little further? Great video!
Ducks need HUGS
this is all true to the LAN setups tho. but there is also server fps, which also apparently is dependant to the amount of dust2 in valve computers.
Great work mr philip, there's not many resources out there that go this in depth on latency. I'm sure this video will be a lot of use to people over the years
Awesome video! One thing I wanna note, all of these different things add up really quick, to the point where your latency could be 4-6ms faster then if you didn’t fine tune everything. That doesn’t sound like alot but you definitely can notice the difference. That paired with making sure your cpu and gpu aren’t at a super high usage in order to make sure you have the most consistent frames with the highest 1% lows, is the most important thing you can do for your setup. The highest framerate ≠ the best and smoothest experience. Frame caps are very important, finding the middle ground between latency and fps is the hardest part. My rule of thumb is to make sure your gpu and cpu arent over 85% usage, because past that it starts affecting the consistency of your frames. Cap your fps to something you can constantly obtain within that 85%. Higher 1% lows = less microstuttering and buttery smooth gameplay.
There is also a GAINT rabbit hole of tweaks and different things you can change on your pc in order to lower latency, some of it is not good so always be careful and do your research before listening to what some random person online says
13:20 As a side note: when using gyro input and flick stick for playstation 4 controllers on steam input, be sure to use a wired cable. When I tried wireless, the input lag was extremely noticable due to bluetooth delay, or at least to me. If you want an idea for a continuation for this video, I think it'd be interesting to see how controllers stack up to keyboard/mouse.
Just another reason why Valve needs to get off their butts and release a second-gen Steam Controller with all the features the DS4 has but none of the reliance on Bluetooth.
@@stevethepocket I'll take bluetooth. Just give me the option like that Razer mouse Philip has shown. I don't want a single-purpose USB dongles when I don't need them.
Bluetooth was never meant to have good response times. Although for some reason PS3 controllers do well wirelessly. That's not to say that wireless is slow, just the Bluetooth protocol is.
@@vasilis23456 I'm pretty sure Sony controllers all have a 2.4G mode that relies on a proprietary receiver inside the console. They wouldn't be able to do stuff like PSVR head tracking without making people sick otherwise.
Ds4 is extremely fast via bt, especially when it’s overclocked to 1000hz. Your high latency comes from shitty bt module in your pc.
I love this video! You totally nailed what's been bugging me since I ditched my old CRT for a crappy LCD. Had no clue about input latency, and here I was, wondering why everything felt off. Thanks for clearing that up, seriously I feel enlightened now
I'm glad you got an Ldat , and you have it the 3Kliks treatment , logical , informative and entertaining. Thank you for all your work 🙏
Now this is truly in depth... Amazing video
I have been searching for a video like this all week. I have been using my 240fps camera to test different setups. Thanks for the video. I learned a lot.
Just to make it clear, you want to enable vsync when you use gsync to get completely rid of tearing. When you have vsync and gsync on, it acts differently than regular vsync. But you need to cap your framerate slightly below your refresh rate, or use nvidia reflex which does it for you. If you don't cap your framerate, it'll hit your refresh rate and default to regular vsync which introduces a lot of latency.
No, Vsync will have as much Input latency as Gsync/Freesync,/Adaptive Sync these technologies adapt the refresh rate of the monitor to the actual framerate you create.
If you have as much frames as you have refreshes V-Sync and adaptive Sync act the same. Adaptive sync lowers the monitor refresh rate when your FPS drops below the monitor refresh rate, which leads to less input latency than V-Sync in that scenario, because V-Sync will simply output the same frame again thus halving FPS for that moment.
The best solution is to use triple buffering solutions like Fast Sync (Nvidia) or Enhanced Sync (AMD) together with an adaptive sync technology.
Because these triple buffering solutions always render a new frame and only output the newest one, thus reducing latency(if you have more FPS than monitor refreshes).
(But be aware there is another triple buffering technology which act as a frame queue that greatly increases latency, so don't assume triple Buffering will always decrease latency)
@MrDavibu No, wrong. OP is right. Enabling Vsync with Gsync enabled IS NOT the same as only using Vsync. When using Gsync you need to also enable Vsync in Nvidia settings, otherwise you still get some screen tearing on top and bottom of your screen. Specifically, you also need to cap framerate to AT LEAST 3fps below monitor refresh rate to guarantee Vsync never kicks in. Enabling Vsync while Gsync is enabled will not increase latency like it does with only Vsync enabled.
Lol so ON 144hz gsync i have to cap to 140fps for IT to Work Prob? Nice 100€ Price tag
When I recorded my 240hz monitor with a 240 fps cam and tested different overdrive settings the main difference between them was that at lower overdrive levels a new frame often looked transparent. The white object that I used for testing was often grey the first frame it was visible and turned white in the next frame. With the highest overdrive level this still happened sometimes but a lot less often.
I'd assume that the transition from black to grey is already enough to trigger the LDAT.
I feel like the title shouldn't be limited to CS2. Anyone searching for videos on input latency would learn so much from this, but they might not click the video if they think it only applies to CS.
Hey Philip! Always love these videos! I'd really like to see some testing with Gsync's intended use with vsync on + a capped framerate! I remember reading blur busters' articles testing gsync both with and without vsync on. They basically came to the conclusion that frametimes were more consistent with it turned ON and the framerate capped. Would like to see some additional testing to backup you saying in this video that you should always turn vsync off
u know its gonna be a good phillip video when theres a Fluffykins sighting in the first minute
Leave it to you to make going through all of these settings extremely interesting and helpful...bravo and thank you!
Awesome and meticulously researched. Very impressive video!
Wow. I've been a regular binge watch viewer of your channels but this video might be one of your top 10 masterpieces.
The way you stated that this company is the best with pronunciation of the exact word while stating your personal opinion about advertise - the very thing you are doing right there - was just word artistry.
And then the level you invest into testing and research! This has at least the potential for a standalone Paper journal contributed, if not even a whole PhD thesis in computer science.
I think I have never seen somebody so seemingly unintentionally achieving such an academic level of depth while doing something so obviously entertaining for oneself at the same time.
I swear this is your coolest video yet. I'm so glad Nvidia sponsored you and gave you all this equipment
what are you talking about? it's an ad.. it's as biased as can be,
@@NulJern what's your point ? Are you trying to say the ldat tool from nvidia isn't legit just because its from nvidia ? 🤣 stupid amd fanboy
@@Eleganttf2 lol i wrote my point you dumb kid. Read what i wrote instead of reading into what i wrote, so no that's NOT what i am trying to say, and you must be ignorant even thinking that, considering it wasn't what i wrote.
@@NulJern Show me at least one part of the video that was heavily biased.
@@roamn4979 use your brain to figure that out and if you can't figure it out then you might lack the brain power, not that it takes a lot.
Please make a part two on the topic of network latency while taking into consideration everything already mentioned here!
I cant believe he uploaded a 30 minute video!! I’m so excited
I didn't realize it was 30 mins until I read your comment O_O
Nice content i was looking for it, found it and get even more... thanks
V-Sync can be useful (it gets rid of tearing occurring at the lower part of the monitor) without adding any additional latency when used with G-Sync and capping the frame rate below the refresh rate of your monitor, please correct this misinformation I am of the constant slander towards V-Sync by people who don't know how to use it.
Loved the vid. Of everyone I'm subbed to, a big substantial vid like this from Phil gets me more excited than most channels
Philip, I would love to see the "Going Low" series in CS2, like you did in CS:GO again after 10 years. It was such a good series, I would love to see it again.
Fantastic video! Whenever you say conclusion for your summary in these videos I always am reminded about how my English teacher teachers hated that :)
@@3kliksphilip I've heard similar comments from my English teachers before. I think their main issue with it is that it is kind of generic and they wanted us to use more creative language when writing essays and stuff. If I remember correctly, they suggested we could instead restate the thesis with slightly different language, but I could be off about that. That kind of thing never really bothered me personally, loved the video as always!
@@3kliksphilip I've read student essays where they constantly end each section with "in conclusion". When you finally reach the end and the actual conclusion they've already concluded several times.
In conclusion, explain your point rather than tell the reader your surmising is what they should conclude.
(This is here purely to bump up the word count to submit the essay)
Final conclusion, Jane Austen ruined the 90's for me, I'll never forgive GCSE English lit.
@@3kliksphilip Like other people have said I think it’s because it’s considered a cliché. I never really fully understood that since it just means the audience 100% knows it is one. However they are the teacher so they know much better than my whiny complaints.
Extremely detailed explanation, thanks!
I'm wondering how much of a factor a keyboards input lag would be in CS.
Counter strafing is very important in this game, and standard mechanical keyboards have a debounce delay of around 5ms.
Keyboards with optical and analog switches don't have that 'problem'.
Could be an idea for a follow up video.
Interesting!
amazing video philip! Awesome to see the LDAT in action, its a super sweet piece of tech.
Sometimes, I think that my input latency in CS2 genuinely reaches 400 ms- it always happens shortly after a lagspike before going down again, and it's not very frequent, but it's really annoying, especially since I never had that problem in Global Offensive.
Have you tried disabling hardware accelerated GPU scheduling in windows?
@@w04hWait. Does that help?
do you still have those problems after the new updates?
@@ToodelsALXno, he literally said his problem is directly related to his network connection, why tf would hardware acceleration do anything. I mean let’s be honest guys, this video is great, but in reality we, as users, barely have any control over any of the graphics pipelines being used in games. That’s why he even points out that Nvidia has to calibrate these things, we can affect it don’t get me wrong, but if you genuinely think you can fix 400ms response delays with a flip of a switch you might be crazy
@@avramcschill
As someone who watches videos with subtitles on, I thank you.
I think the only thing that wasn't answered by this video was how the latency compares between GO and CS2, other than that, this video is incredibly well done! I'd be curious how the measure up with the tools you now have to measure such things!
With the latest CS2 update they got rid of all the input latency that was present in csgo and made firing client sided instead of server sided. There was an upto 16ms input lag in csgo and it doesnt exist anymore with cs2 which means firing is now instant
@@Acidity01that's great to hear!
@@Acidity01 I know they changed the firing animation to play sooner, but I mean the input latency itself, like does it actually take longer for CS2's inputs to register than CSGO's? I know subtick means that the shot will register where the input was detected via the game client, meaning --sort of-- directly after the CPU processes it and it's sent to the game, which would happen more or less in parallel with the display of the shot, but I'd just be interested how the "feeling" of latency compares between the games now.
Is CS so SUPER DUPER competitive focus that even millisecond latency matters? Like, holy
@@Max128ping I mean, I personally don't think it does, but there's been so much anti-CS2 rhetoric, it'd be nice to at least put it in perspective, that's all
Love seeing Philips analysis leap to a new level. Hoping now he's got these cool tools we'll be seeing further deep dives into the topic!!
If you have a G-sync monitor turn V-Sync ON and enable Nvidia Reflex. It will automatically cap your FPS just below your monitors max refresh rate. This eliminates tearing and has other benefits.
And dyac is it better than g-sync?
But will it increase input lag?
I like how in depth you go with your videos.
Thank you, mr Philip! I was waiting for such a test. :)
we are talking about 3-10 miliseconds differences here, average human reaction is 150-300ms. Its much more worth it to have GSYNC + REFLEX ON+BOOST and cap fps -3 below refresh rate for the ultimate smooth experience imho
Something to bear in mind is how most reaction time tests use a physical feedback system. Your brain can recognise changes much quicker than 150-300ms, it's just that it takes that long for your nervous system to create a signal from visual changes, process it in the brain, send the signal to your hand and then actuate the muscles. Which means that a noticeable/detectable difference in latency is probably a lot smaller than assumed
@@nade5557 true and fair point, but if we are talking about input latency being an advantage then we need to take into consideration the whole chain, the server, tickrate, visual change, brain recognizing it, sending signal to move your arm and click, then comes into play your peripherals, your system HW and its processing time, CPU sending data to GPU to render and then into monitor etc. Thats basically why i am saying that i enjoy more a game that is silky smooth with no tearing with potential 5ms higher input lag then a micro stutter/teary mess with ab solute minimal input latency. But hey maybe its the age talking, ive been playing competitively from 1999 to 2015 and now prefer the visual enjoyable experience over extra few ms sacrificed :D
@BeowulfGaming yeah, that's fair.
I was really expecting a latency comparison between CS2 and CSGO. Also as a man silly enough to daily drive a 4090 and 13900k as well, I was expecting to see which CS2/PC settings offer the absolute lowest input latency on that specific setup. Great video nonetheless!
I'm pretty sure ive seen tests with CSGO before and it was kinda bad in terms of latency (mouse click VS seeing shot on screen). Also CSGO lacked any sort of reflex implementation so I wouldnt be surprised if it performs worse than CS2
@@Ferdam I'm quite sure it would, I was thinking it would be a fun statistic to throw at people noceboing themselves into thinking CS2 is the worst game ever developed. However a bit more consideration and reading some comments here makes me think it wouldn't really matter and there will always be some other unrelated thing made up to support their view.
@@meryman7669 i totally agree, would be another thing in favor of cs2
What exactly is silly about daily driving the best system in the world? If he has the money for it, why not? Christ.
As a game dev and hardcore cs2 player , I cannot stress the value of the video: great job !
3kliks is one of the most valuable resources to the general gaming community, he answers questions I don't even know I want the answer to
(not this one obviously lol, but I mainly mean his other videos)
Seeing those clips of 1999 Unreal Tournament at the end sent me straight back to high school....
FYI, if you're using GSync or FreeSync and you're given the option of VSync, always keep it on. You were never supposed to be given that option in the first place.
small clear up regarding adaptive sync.
it is supposed to be used alongside vsync with a framerate limit 1 frame below your maximum refresh rate.
this has minimal additional input latency compared to fixed refresh rate vsync which doubles your input latency sometimes.
so gsync + vsync and limit framerate 1 frame below refresh rate
I simply say what nvidia's engineers think is best. If you know better then you do you :)
Testing for muzzleflash vs a flashing box is going to be slower because the game engine must register the fire input, pass it through game logic, generate the appropriate animation, particles, sounds, client-side predicted hitscan, and rendering of all of these things before you see it on the screen. One is specific to the game itself, the other is purely system latency before whatever a game adds on top. Some games end up buffering 2-3 frames, or have other situations going on like using a message/event system to pass input and game logic events around (i.e. input message is entered into the queue one frame, the next frame the game logic processes it and issues more events to generate animation/particles/sounds which only start the frame after, and might not even be visible until another frame after that) which at 100FPS is automatically ~30ms on top of system latency. Each game is going to be different depending on what the developers cared about. The original PC release of Halo: Combat Evolved was the worst thing I'd seen before or since. It had a good solid 200+ milliseconds of input latency in spite of running at a decent framerate, and coming from playing id Software FPS games for a decade (at the time) and playng Halo:CE with a mouse meant YUCK!
I don't think you really know how rendering works.
It is so stuff doesn't just "appear", muzzleflashes are slightly delayed so your brain feels better about it.
It is not because it "has to pass through game logic"...
Input events will usually only be a frame delayed at most, and not super common to be delayed.
exceptionally good work
Dude how long were you testign for?
This video is incredibly valuable, good job man. I cant really verify the results myself but this is amazing
His whole channel is this amazing 🎉
@@WaltuhBlackjr ya i know but video caught me off guard since it kept going on haha
Putting the polling rate to max on most gaming mice will result in the accuracy of the movement (particularly rapid movement like flick shots) of the cursor tracking getting worse compared to its default. This is because manufacturers are running that higher polling by "overclocking" the individual components of the mouse, making them perform worse under pressure. This is done so they can have a bigger number on the box, not because it's actually superior. If it was objectively better, it would be the default, or the option to adjust polling rate would be removed altogether!
In my experience, if a mouse offers 250hz, 500hz and 1000hz as options, 500hz strikes a good middle-ground, while 250hz is usually the most accurate.
There are various stripped-down windows 10/11 versions. A test of them with a comparison of input latency and fps would be pretty cool
Good point. Also, can't wait for Linux to become mainstream and games to have native builds for Linux, as OS bloat in Linux is much less as compared to Windows. (well, if you install a lot of stuff it can get the same or worse, but Linux itself is much more configurable and can be stripped down much more. And configured to have no telemetry at all (like not even check if it's enabled or not) for example)
@@Winnetou17 Alright alright lol. I'd like to switch to linux too, if not for the fact that gaming still sucks on it
It depends on what "tweaks" the ISO does.
There's a few things you can do that drops input latency by 10-15% (1-3ms depending on the game / scene etc.) but it's not worth breaking your install for. Also while it may be a bit faster, it's less stable so to speak.
@@Winnetou17 "bloat in Linux is much less as compared to Windows", you say that but have you ever looked at the xorg source code? It is ancient bloat layered with decades of bloat on top of bloat.
I'd be really interested to see this tested with xorg vs wayland, nvidia vs amd. I was gifted a 2080ti and use i3wm, nvidias hatred of open standards made using sway (wayland) a nightmare. While I can get it running these days there are all sorts of issues and last time I tried to game on it the input lag was horrible.
With the AMD driver being open source, I'd bet that if there was serious effort put into minimising latency across the entire pipeline they could achieve incredible numbers.
@@Concodroid gaming sucks on linux? I've found my steamdeck to be absolutely fantastic, by far the best gaming handheld I've ever used.
I've been testing my ZOWIE 2540K vs newly purchased LG OLED 27GR95QE. On 1080p max settings the 2540K average PC latency displayed on NVIDIA's in-game overlay is around 7ms average vs LG's 1440p max setting settings at an average of about 10ms. In conclusion, if you're strictly a CS player, ZOWIE is still king, IMO.
Thank you very much for this video - learned a bunch!
Right on time. thank you
great video man, been looking for this stuff for a long time
18:14 There actually is one scenario where (double-buffer) vsync can be used with the express purpose of minimizing input lag, and that is when it is paired with variable refresh rate modes (freesync/gsync.) Battle(non)sense covered this in detail in his videos. However you also must cap your frame rate to slightly below your maximum refresh rate in this scenario. If you prefer to play at an unlimited frame rate, then instead you avoid vsync and free/gsync and instead use nvidia ultra low latency or AMD anti-lag to keep the input lag down.
Your suspicions are true that locking fps dramatically decreases input lag but for some reason the decrease is minimal in source 2!
Can you use the LDAT on a CRT monitor? I have a fairly nice Philips CRT that I can run at 1920x1440 at 85Hz. It feels so buttery smooth because there isn't the sample-and-hold method used on digital monitors. I don't know if that means there is less "real" latency but I'd love to see someone do a proper CRT test with an LDAT. A Trinitron-based CRT would be the best to use for a test.
CRT smoothness is something else. Not even OLED or any BFI tech gets close to what a well tuned high end CRT can do. Prices of good CRTs still make me cry though.
+1 to this. The whole time I was hoping he'd try on a crt! I imagine it'll be less, but no clue by how much.
A CRT isn't displaying a constant image, each time it scans it will be flashing bright and dark which we perceive as a constant image.
If you are trying to detect a change in brightness then you'd have to differentiate between the flash during each refresh and the brightness between refreshes. The phosphors in different CRTs also have different decay curves after being stimulated complicating matters. That effect along with bleed is part of the reason why it feels so smooth, just like film our eyes don't see a single discrete frame but the smooth blur between movement. Though with film it is done during exposure, CRT is from its display.
Also CRTs are superior simply due to the de-gauze button. I can't tell how many times I pressed it in the 90's for that beautifully satisfying sound. I was worried I might damage my fancy monitor I bought just for CS 1.3, but I couldn't help myself.
This is an AWESOME rabbit hole, thanks Philip.
15:14 your cat is maybe suffering from infection. His left eye is a little swelling. one female cat always came to my home in the evening for milk. And i noticed swelling and it makes it harder to see for her. Luckily There is a medicine for it.
Despite all of the problems behind this release, I don't think I've experienced much issues with input lag. In fact, I'd say it feels much more accurate than GO.
However, there are parts in 2 where I experience chugging and that *feels* like input lag even if it isn't. I miss a lot more shots but I'm able to now blame it on my bad aim, not on the servers. It's a marked improvement.
You can enable V-Sync to avoid tearing without a latency penalty if your game + hardware + screen combination supports both G-Sync and NVIDIA Reflex (and you enable all three)
Source: ua-cam.com/video/Gub1bI12ODY/v-deo.html
It's 1-2ms depending on your monitor. So 5-20% depending on your refreshrate. It's noticable, I only use it in single player games.
The Latency actually changes if you measure it on the top, mid or bottom of the display. At least with LCD panels.
I got a really handy tool build by Leo Bodnar. It's similar to the LDAT, but it generates a Testsignal on an HDMI Output. This allows you to specifically determine the latency of your display.
And it got 3 test areas. Top, Mid and Bottom.
Can we get a comparison between CSGO and CS2 latency? Like stats with uncapped and capped? Would be really interesting to see if cs2 has actually higher input latency in the engine or can the perception of more lag come just from the lower fps.
Considering the comment from nvidia that CS2 testing was so low latency they thought it was a bug, I can't see how CSGO could be better. As you said, I suspect lower fps combined with gamers being hyper critical and having the perception that updates/new releases are broken and inferior.
@@beardedchimp
The "idea" that something is true is often more impactful on a person's perception than reality.
Especially considering that gamers rarely actually know these details.
@@beardedchimp had 4-6ms with nvidia reflex tool on windows 7 5600x build and viper 8k mouse on csgo, yes getting windows 7 to work on a 5600x was pain and suffering but god damn did it feel snappy
@@fridging that does sound satisfying after enduring the pain. I remember doing that with an ancient laptop in 2009, the CPU had such an old instruction set that even linux/unix distros had abandoned it. Those that "hadn't" would actually segfault all over the place, just nobody was around still using that CPU to tell them.
I eventually found a distro that actually worked and once I got XFCE running, my god it was gloriously snappy. Couldn't believe it. Oh and it had no cd drive and couldn't boot off usb, had to network boot from another computer via TFTP. God the pain, but oh my it made the satisfaction ever so euphoric.
How did you measure end-to-end input latency as 4-6ms? It takes a 360Hz monitor to even approach those numbers. My 144hz love child (or similar) uses all of those milliseconds even before the mouse input and rendering pipeline has a say.
@@beardedchimp i have a 360hz monitor when i did that, now use a 390hz with no reflex lkatency analyser
I've also had the same suspicion of input latency lowering when you use the fps max command to limit to slightly lower than your average, im glad you tested it because I don't really have the tools to test it properly lmao.
I also have noticed that on my rig frame time consistency is slightly better when you use fps max with an actual number rather than 0, even if its 999 and you NEVER hit 999. very odd behavior
I wish you had also tested the AMD low latency thats built into the drivers, as it feels slightly odd to me but I cant place why. I suspect it COULD be input latency related as I don't see an obvious frame rate or frame pacing issue
Hey philip, I had a question after watching the video and wanted to ask you as you stated "if you have stupid questions, do ask".
Since the Operating System (and the USB SW which I assume mean drivers) is part of the latency chart, I wonder if different operating systems would have resulted on different results.
What do you think? Will it be a noticable latency change or barely noticable?
Yes, in fact the Linux kernel has an alternative specialized "realtime" configuration, which offers the smallest latency possible. It is used by industrial and scientific applications mainly, where response time is _critical_ (as in, heavy machinery might crash if the program "misses a beat"), and audio processing hardware (yeah some audio gear is just a computer running Linux).
This is not viable for desktop use because it assumes programs won't try to hog all the CPU, but there are other, less dramatic configurations you can make to reduce desktop latency, and in fact some Linux distributions do use these settings by default.
Does CS2 run on Linux? Time for more tests...
@@Ixb3rs3rkxI It does! I don't think Valve would have launched the Steam Deck without making sure their most popular games worked perfectly on it xD
If you look at the diagram with the green blocks, you'll see the composite stage. He mentioned this regarding fullscreen vs. windowed in Windows. Since the GUI uses different compositors on Linux, I'm sure there would be a difference here. How much? I don't know. There are probably other differences regarding drivers and USB handling, as well as how games are rendered in the first place.
USB port used matters far more.
Great and valuable resource, thank you for this
The weapon sway probably also adds to the illusion of increased latency, I remember turning it off in css for it to feel/look more responsive
wonderful as always
wow I love Phillip he is awesome and his music is great and his history in life is just fantastic. I am no bot, no no.
Stfu "you won free giftcard" looking ass
@@hautoa1513 that means
@@hautoa1513thats mean I mean toi say
This is the science we've been looking for. Many thanks
Ever since some mid-october update I can't hit anything, but it seems like a problem with cs2 servers and not input, never had a problem with it.
@huthunterhutyes yes, bullets ghosting throught player models is skill issue XD. After these updates they again fucked something
My freesync monitor is one of the beat purchases I ever made, I didn't know what I was missing until I had it. Visual tearing has always driven me nuts, and when I first got into PC gaming I didn't know how much delay vsync added and always had it on. I've come so far
i dont use freesync in cs, i dont notice any tearing in 500 fps.
Tearing drives me absolutely nuts. Since the mid 90's technology has seemed to fluctuate where tearing is getting better, then suddenly it is the bane of my existence. The worst is watching films where they have a slow cinematic panning shot totally ruined and torn up, LOTR in particular.
Depends on the refresh rate on the monitor (like said in the video) i also dont notice screen tearing at 500 fps with a 240hz monitor... but with my 4k 60hz monitor/tv i certainly do a lot! @@budgetking2591
Nah, This is IMPOSSIBLE. I just like minutes ago went to your channels wondering where u were or you took a small vacation for Valve to update the game.
actually pretty cool and surprisingly educational . learned a lot abt my monitor that i never knew existed
test gpu scale vs display, please
YESS an LDAT, maaan would I love to have one of those, such a cool piece of tech to have around
You forgot a mention for Reflex, Reflex is also acting as a FPS limiter, to prevent the card using too much of the BUS or the graphic card itself, because a maxed BUS or GPU cores increase drasticaly the inputlag, Reflex is preventing that and is keeping the GPU and BUS usage in line to not bottleneck the card, Battlenonsens made a pretty good analysis of this and how FPS cap, Reflex, GSYNC settings alone or mixed reduce or not the inputlag, even just limiting the FPS from 10-20% away from your max FPS reduce the inputlag massively, even on high end hardware.
Also, IIRC, VSync SHOULD BE ON on the Nvidia control panel when using GSYNC (NOT IN THE INGAME SETTINGS).
I've always found that capping FPS does wonders for reducing jitter. When it comes to human perception, jitter can really ruin perceived responsiveness and input lag. It also feels nasty, at least input lag tends to be consistent.
My goodness, I can already tell I need to rewatch this video.
Question: Could the very low imput latency of CS2 be the result of the subtick system?
From my understanding no. Subtick is specifically the system that timestamps inputs from the player and passes it to the server. Which makes outcomes technically more accurate. This has nothing to do with latency, as input latency is (as far as I know) tied to the players hardware and doesn't have anything to do with the server or how it communicated information.
As stated above Subtick is like a time stamp. It works well and Imo is a huge improvement over cs go. The only issue is that subtick is still time stamping 64tick servers and broken hit registry lol!
honestly amazing vid, up to the level to that of million subs youtubers
Meanwhile AMD users got banned for using AMD's attempts at reflex, because it triggered VAC.
Something about V sync is that it not only depends on the game but how you do it. For example battle(non)sense did videos testing v sync using an ldat in other games like overwatch.
He found that limiting the fps below the refresh by small amount (142 fps cap in a 144 Hz monitor) did not add any latency and in fact reduced it.
Using v sync with uncapped fps is the main culprit for the delay
Phillip after testing input latency using specialized expensive hardware and very complicated software techniques: CS2 is fine and its impossible for humans to notice input latency as long as your pc itself is build correctly, meaning there's nothing you can do to make your latency better or worse.
Random CS pro after doing zero tests and basing his entire knowledge on random reddit posts: Yeah CS2 is broken guys it takes 1000 years to shoot idk bad gamve valve pls fix.
@huthunterhut Some people feel it more than others. My friend is one of them. He is pretty skilled when it comes to gaming in general, and also has a pretty good PC.
Says he feels like his hand is "sick" when the latency is "too" high.
Remember when shroud didnt even notice the game had vsync on at s tournament and game was capped at 60hz
@@TheBloopers30 Tell him to cap the FPS for a stable latency. I can't believe people still run uncapped.
@@griffin1366 I get what you mean, but he caps it at around 120 in COD. He spends hours every week just testing and tweaking settings, both in and outside of the game(s). He tries everything from getting a new "correct" mouse to trying out different types of mouse pads.
Only game he really uncaps the framerate in is CSGO. He can't play CS2, even with a cap he says it's like swimming through mud.
@@TheBloopers30 He is his own worst enemy by "tweaking". I've been down the rabbit hole myself.
while this maybe alot to ask we need to see the cost of each graphic setting in term of fps and input latency like setting global shadow to the highest and see how it affect input latency. that would be freaking sick
I liked my own comment
I'm sorry you don't like it anymore. I hope you find that self love again some day.
The results between reflex and reflex+boost correlates well with my personal experiences
Always wonder about 'power management mode' on nvidia control panel.
Yes it won't be big difference when it comes to FPS, but how about latency *by maintaining gpu clock speed in CPU bound scenario? Will 'prefer maximum performance'+ just reflex be same as reflex + boost?
No, reflex works by letting the GPU instantly push the frames out to your monitor instead of passing the frames through the CPU first. It doesn't really have anything to do with clock speed or power usage.
@@smurfeeNnCPU PASS the frame to GPU. Reflex makes CPU sending frames as much as GPU can handle in realtime(by limiting some of those, and more complex things than that), thus lower latency as GPU can render frames fast as possible.
Most important, I'm not saying about reflex, I'm talking about 'power management mode.'
Prefer Maximum Performance isn't the same as boost. Even in CS:GO with PMP enabled I wouldn't hit p0 states on the GPU. The only way you can get around that is to disable power savings with a registry key but then you're always running at p0 state (don't worry it's safe, my GPU has been at p0 for 4 years). +Boost does all that for you. In case you were wondering, not running p0 has a latency impact.
That measuring device is so cool. Every computer should come with one!