I do not understand how people say there's virtually no latency. There is, and it's _huge_, because light is actually quite slow and no tech can improve on that.
Makes me think the people who say this have never played on a high-end PC, which has lower latency than a last-gen console. And that's considering that even modern PCs have a TON of latency. NVIDIA seems to be working on that, thankfully.
I bet playing Quake 3 Arena multiplayer on Stadia would be noticeably worse than on a PC from 20 years ago.
I'd say there's effectively no latency, since most games don't need or benefit from <10 ms response times.
I mainly play twitchy shooters on a fairly high-end PC (CSGO, Tarkov, Q3A back in the day) and was super impressed with Stadia for games like Assassin's Creed. It felt like I had my PC with me anywhere, but I attribute that to the game's forgiving latency requirements.
I wouldn't expect CSGO to work as well (though I'd definitely try it).
Because sending a light beam 10 miles away is slower than not doing so, obviously. Light isn't instantaneous in this universe.
It's surprising people still don't get this. Operating a computer miles away will always be slower than operating a computer centimetres away. You can be smart about it and optimise as much as possible, but Google isn't running alien tech that's magically orders of magnitude better than consumer hardware and overcomes the distance issue.
Light takes about a millisecond to travel 200 miles. Compared to the latency introduced by computation, it is pretty insignificant, unless you are connected to a server very far away.
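To put a rough number on it, here's a back-of-the-envelope sketch; the ~0.68c figure for light in fiber and the straight-line distances are assumptions, and real routes are always longer:

    # Rough propagation-delay estimate. Assumes light travels at ~0.68c in
    # optical fiber and that routes are straight lines (they never are).
    C_MILES_PER_MS = 186.3   # speed of light in vacuum, miles per millisecond
    FIBER_FACTOR = 0.68      # assumed fraction of c in glass fiber

    def one_way_delay_ms(miles: float) -> float:
        return miles / (C_MILES_PER_MS * FIBER_FACTOR)

    for d in (10, 50, 200):
        print(f"{d:>4} miles: {one_way_delay_ms(d):.2f} ms one-way, "
              f"{2 * one_way_delay_ms(d):.2f} ms round-trip")

Even at 200 miles the glass itself only costs a few milliseconds round-trip; the rest of the chain costs more.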
No. However fast your Internet is, sending data out over the Internet and back is still slower than doing the computation locally.
A 5 ms ping to your local Stadia server is at least 5 ms of additional latency compared to a high-end PC. Add virtualisation overhead, CPU steal time, packet loss, video compression and decompression, etc. for another measurable increase.
How long is the rest of the latency chain? For example, keyboard input over USB is gonna be ~15 ms, processing time is gonna be >5 ms, and display time like 12 ms. Adding those comes to a minimum of around 32 ms. [1]
I'm not sure if I can tell the difference between 32 ms and 37 ms.
Your 37 ms is based on a 5 ms round-trip and 0 ms CPU time, which is impossible. And add network jitter, which might be worse than the static latency.
And the Stadia market isn't people with fast monitors, Ethernet connections, and ultra-stable internet; it's people with high-latency TVs, average Wi-Fi, and slow hardware for decoding video.
I meant to assume 5 ms of CPU time: 15 input + 5 processing + 12 output = 32. Add 5 for the network round-trip to get 37.
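To make that arithmetic explicit, here's a toy budget using the rough numbers above; nothing here is measured, and the encode/decode and jitter costs mentioned elsewhere in the thread are deliberately left out:

    # Toy end-to-end latency budget built from the rough figures in this thread.
    local_ms = {"input (USB)": 15, "processing": 5, "display": 12}
    streaming_extra_ms = {"network round-trip": 5}  # ignores encode/decode and jitter

    local_total = sum(local_ms.values())
    streamed_total = local_total + sum(streaming_extra_ms.values())

    print(f"local PC:  {local_total} ms")    # 32 ms
    print(f"streamed:  {streamed_total} ms") # 37 ms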
> 5 ms round-trip
A 5 ms network round-trip or less is common in offices or homes with fiber. Here's an ICMP ping to 1.1.1.1 from my office just now:

    rtt min/avg/max/mdev = 4.075/4.700/6.407/0.459 ms

(UDP wouldn't be so different.) Of course, on Wi-Fi or low-speed broadband it wouldn't be so fast.
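If you want to check UDP rather than ICMP yourself, a minimal round-trip timer is enough; the echo host below is a placeholder, and it assumes some UDP echo service (RFC 862, port 7) or an endpoint you control is answering:

    # Minimal UDP round-trip timer. ECHO_HOST is a placeholder; it assumes
    # something on the other end echoes each datagram back.
    import socket
    import statistics
    import time

    ECHO_HOST, ECHO_PORT = "echo.example.net", 7  # hypothetical endpoint
    SAMPLES = 20

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)

    rtts = []
    for i in range(SAMPLES):
        start = time.perf_counter()
        sock.sendto(f"ping-{i}".encode(), (ECHO_HOST, ECHO_PORT))
        try:
            sock.recvfrom(1024)
        except socket.timeout:
            continue  # treat a timeout as packet loss and skip the sample
        rtts.append((time.perf_counter() - start) * 1000)

    if rtts:
        print(f"rtt min/avg/max = {min(rtts):.3f}/{statistics.mean(rtts):.3f}/"
              f"{max(rtts):.3f} ms ({SAMPLES - len(rtts)} lost)")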
> high-latency TVs
High-latency displays make network latency less noticeable relative to a conventional console game (but more noticeable relative to a PC game on a fast-updating screen).
Lucky you. I can't even play on most TVs hooked up as-is via HDMI because of horrendous render lag.