As a web developer you can just throttle your connection in developer tools, though, no self-limiting required. But hardly anyone does that at the big corporations building most of the sites that people on slow connections depend on.
Yeah, though it doesn’t quite capture all of the experience of working with slower broadband.
For example if you have a website that is meant to be used alongside a video call or while watching video, it’s difficult to really simulate all of that “feel”.
Using a link that is slow in practice is an invaluable experience.
In my experience, browsers limit speeds in a way that's kind of nice and stable. You tell them to stick to 100kbps and they'll give you 100kbps. Packet loss, jitter, it's all collapsed into a single, rather stable number. It's like a 100kbps fiber optic connection that just happens to be very long.
In my experience, real life slow internet isn't like that. Packet loss numbers jump around, jitter changes second by second, speeds vary wildly, and packets arrive out of order more often than in order. Plus, with satellites, the local router sends fake TCP acknowledgements to hide the slow data transfer, so the browser thinks it's connected while the traffic is still half a second away.
There are software tools that limit connectivity in a more realistic way, often using VMs or network emulators, but they're not used nearly as often as the nice browser speed limiter.
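To make the difference concrete: on Linux, `tc netem` can emulate delay, jitter, loss, and reordering on a real interface, but the same idea can be sketched in a few lines. Below is a toy Python model (my own illustration, not any particular tool's algorithm) of the "unstable link" behavior described above: every packet gets a base latency plus random jitter, some packets are silently dropped, and reordering falls out naturally whenever the jitter exceeds the gap between packets. The function name and parameters are made up for the sketch.

```python
import random

def simulate_link(packets, base_delay_ms=300, jitter_ms=400,
                  loss_rate=0.05, gap_ms=10, seed=0):
    """Toy model of an unstable slow link.

    Each packet is sent gap_ms apart, suffers base_delay_ms plus a
    uniform random jitter, and is dropped with probability loss_rate.
    Delivered packets are returned in arrival order, so out-of-order
    delivery emerges whenever jitter exceeds the inter-packet gap.
    """
    rng = random.Random(seed)  # seeded so runs are reproducible
    arrivals = []
    for i, pkt in enumerate(packets):
        if rng.random() < loss_rate:
            continue  # packet lost in transit
        send_time_ms = i * gap_ms
        delay = base_delay_ms + rng.uniform(0, jitter_ms)
        arrivals.append((send_time_ms + delay, pkt))
    arrivals.sort()  # sort by arrival time, not send order
    return [pkt for _, pkt in arrivals]

# With 400ms of jitter against a 10ms send gap, the delivered
# sequence is heavily reordered compared to the send order.
delivered = simulate_link(list(range(100)))
```

Contrast this with the browser throttle in the comment above: that is the `jitter_ms=0, loss_rate=0` case, a perfectly smooth pipe that just happens to be narrow, which is exactly why it misses the "feel" of a bad real-world link.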
Good points, but it would still be a major step forward if websites started handling browser-simulated 3G well. Right now the typical webshit used by regular people more often than not ranges from barely usable to completely unusable on browser-simulated 3G, let alone browser-simulated 2G or real-world bad connections. As a first step, make your site work well at, say, 200ms of latency and 1Mbps of bandwidth.