I (and many, many others) notice lag in games with v-sync turned on vs off, and we're talking about single digit milliseconds there.
It's possible that Stadia might be "good enough" for casual gaming (and granted, that's a very large market segment), but I don't think it will ever be good enough for anyone who is sensitive to latency.
>I (and many, many others) notice lag in games with v-sync turned on vs off, and we're talking about single digit milliseconds there.
Indeed, I found this out recently in a double-blind kind of way, when running at 165Hz. I had to take out my GPU to fix the fan, so I wanted to try playing some demanding games on my integrated Intel GPU. I switched the settings to the lowest (and the game itself auto-downgraded some settings too). After I put my GPU back, I put all the settings back, or so I thought. For a day or two I felt some input lag and the mouse was a little "floaty", but I thought I was just imagining it. Then I checked the settings and lo and behold, VSync was on.
I was surprised that I could tell the difference when running at 165Hz and 165 FPS.
I feel like there must be something badly wrong with the way vsync is implemented in most games. It often feels like it adds massive latency (an entire extra buffered frame, like a bad triple-buffering implementation), but if my uncapped FPS is already close to the refresh rate, I don't see why vsync should add much latency.
That is to say, on my 60Hz monitor, I don't feel much difference between 60 FPS and anything above it with vsync off. But if I go from 70 FPS (vsync off) to 60 FPS (vsync on), it's yarrgh unplayable, as far as first-person shooters go.
With vsync, a late frame delays input by a whole refresh cycle. When your hardware can only manage an average frame rate close to your refresh rate, there will be plenty of those.
What is your screen's refresh rate? With VSync on, you are capped to the refresh rate. If you have a 144Hz screen and you're running 165 FPS with VSync off, you will run 144 FPS with VSync on. But if your GPU struggles to hit 144 FPS, with VSync on it will drop straight to an even divisor of the refresh rate: 72 FPS, then 48, then 36 (assuming no G-Sync/FreeSync).
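A rough sketch of where those steps come from, assuming plain double-buffered vsync (standalone C#, the numbers are just examples): a frame that isn't ready at a vblank has to wait for the next one, so the sustained rate quantizes to the refresh rate divided by a whole number.

    // Sustained frame rate under plain double-buffered vsync: a frame that
    // misses a vblank waits for the next one, so presentation quantizes to
    // whole refresh intervals (144 -> 72 -> 48 -> 36 ... on a 144Hz panel).
    using System;

    class VsyncQuantization
    {
        static double EffectiveFps(double refreshHz, double uncappedFps)
        {
            double refreshIntervalMs = 1000.0 / refreshHz;
            double frameTimeMs = 1000.0 / uncappedFps;
            int intervalsPerFrame = (int)Math.Ceiling(frameTimeMs / refreshIntervalMs);
            return refreshHz / intervalsPerFrame;
        }

        static void Main()
        {
            foreach (double fps in new[] { 165.0, 150.0, 140.0, 100.0, 71.0 })
                Console.WriteLine($"{fps} FPS uncapped -> {EffectiveFps(144.0, fps)} FPS with vsync on");
        }
    }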
All major engines (Unity, Unreal, and even Godot) provide two different callback mechanisms for your scripts: a callback before each rendered frame and a callback before each physics step (Update() and FixedUpdate() in Unity).
Now, physics always runs at a fixed time step, usually 60 times per second. If you move your player, the game always has to wait for the next physics callback to move the entity around. The same goes for spawning projectiles or checking hitscan weapons with raycasts. You always need to wait.
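For reference, the shape being described looks roughly like this (a minimal Unity-style C# sketch; the movement code is just an illustration): input is read every rendered frame, but the physics world only advances in the fixed-rate callback, so that is where the input actually takes effect.

    using UnityEngine;

    public class PlayerMover : MonoBehaviour
    {
        public float speed = 5f;
        private Rigidbody rb;
        private Vector3 pendingInput;

        void Start()
        {
            rb = GetComponent<Rigidbody>();
        }

        // Runs once per rendered frame (e.g. 165 times/s on a 165Hz setup).
        void Update()
        {
            pendingInput = new Vector3(Input.GetAxisRaw("Horizontal"), 0f,
                                       Input.GetAxisRaw("Vertical"));
        }

        // Runs at the fixed timestep (50Hz by default in Unity), so input
        // sampled above waits here before it can affect the physics world.
        void FixedUpdate()
        {
            rb.MovePosition(rb.position + pendingInput.normalized * speed * Time.fixedDeltaTime);
        }
    }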
In this case, when V-Sync is on and your screen updates at 60Hz, everything is fine. But on a 144Hz screen, there is inevitable input lag.
Can someone tell me how on earth this can be improved by disabling V-Sync? It's basically not possible to reduce input lag in this scenario.
I know there are ways to hide the lag through input prediction and entity interpolation, but those are necessary band-aids for multiplayer games, where you accept lag that is already an order of magnitude larger than local input lag. And for single-player games? If you introduce physics, your oh-so-nice 1000 frames per second are basically useless. But please tell me if I missed something.
The controller function in Unreal is separate from the physics engine. Raycasts and spawning do not need to wait for a new physics tick to do their thing.
Interp and extrapolation do not delay your input. Interpolation fills in lag time with predicted movement for other players' characters to hide 50ms+ of lag from the player. Your client-side input still exists and is applied client-side in real time. It may get corrected after the fact because of lag, but your input is not delayed to match their lag.
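In other words, interpolation only changes how remote entities are drawn. A rough sketch of the idea (Unity-style C#, all names assumed, clock sync glossed over): remote players are rendered ~100ms in the past, blended between the two buffered snapshots that straddle that time, while the local player is driven directly from input and never touches this buffer.

    using System.Collections.Generic;
    using UnityEngine;

    public class RemotePlayerInterpolator : MonoBehaviour
    {
        // Snapshots received from the server: (timestamp, position).
        private readonly List<(float time, Vector3 pos)> snapshots =
            new List<(float time, Vector3 pos)>();
        private const float InterpDelay = 0.1f; // render remote players ~100ms in the past

        public void OnSnapshotReceived(float snapshotTime, Vector3 position)
        {
            snapshots.Add((snapshotTime, position));
        }

        void Update()
        {
            float renderTime = Time.time - InterpDelay;

            // Find the two snapshots straddling renderTime and blend between them.
            for (int i = 0; i < snapshots.Count - 1; i++)
            {
                var (t0, p0) = snapshots[i];
                var (t1, p1) = snapshots[i + 1];
                if (t0 <= renderTime && renderTime <= t1)
                {
                    float alpha = (renderTime - t0) / (t1 - t0);
                    transform.position = Vector3.Lerp(p0, p1, alpha);
                    break;
                }
            }
            // The local player's transform is driven directly from input elsewhere
            // and never waits on this buffer.
        }
    }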
Network lag is not an order of magnitude larger than local input lag.
E.g. Dota 2 and CSGO servers in Europe have <40ms ping for most people. Local lag should be at most around 16ms from a 60Hz display plus 1ms from a 1000Hz mouse/keyboard, half of that on average. With bad hardware it could easily be an extra 30ms from the display and maybe 16ms from the mouse/keyboard -- think a 125Hz poll rate and a bad debounce implementation.
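Back-of-the-envelope with those numbers (just the figures quoted above, not measurements):

    using System;

    class LagBudget
    {
        static void Main()
        {
            double displayWorstMs = 1000.0 / 60.0;   // up to one 60Hz refresh interval (~16.7ms)
            double inputWorstMs   = 1000.0 / 1000.0; // one 1000Hz poll interval (1ms)
            double localWorstMs   = displayWorstMs + inputWorstMs; // ~17.7ms worst case
            double localAvgMs     = localWorstMs / 2.0;            // ~8.8ms on average
            double pingMs         = 40.0;                          // the EU Dota 2 / CSGO figure above

            Console.WriteLine($"local lag: ~{localAvgMs:0.#}ms avg, ~{localWorstMs:0.#}ms worst");
            Console.WriteLine($"network:   <{pingMs}ms, i.e. roughly {pingMs / localAvgMs:0.#}x local, not 10x");
        }
    }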
I remember Arma 3 (a pretty serious milsim game) having 100ms+ input lag at one point, so there is definitely room to play with, but I don't think it's ever going to be good enough for the stereotypical PC gamer - equally, I don't think they're the target market.
Input lag is annoying for sure, but it's the sort of thing where as long as it's consistent and not sporadic, you would mostly adapt to it... Especially assuming you're not playing some kind of competitive real-time game (which most of these games aren't).
Vsync off can sometimes produce tearing, but I find this preferable to vsync on, which eliminates tearing but introduces latency.
Edit: FreeSync/G-Sync is an attempt to eliminate tearing AND latency. Unfortunately, on many monitors it forces a slower response-time setting. For example, on my simracing setup with a Samsung CHG90[1], you can't enable FreeSync and the fastest response-time setting at the same time. I opt for the faster response time, as it just feels faster and most of the time I don't notice the tearing.