I had a similar experience to the author's on a boat in the South Pacific. Starlink was available but often went unused because of its high power draw (60+ watts), so we got local SIM cards instead, which provided 4G internet in some locations and EDGE (2G) in others.
EDGE by itself isn't too bad on paper - you get a couple dozen kilobits per second. In reality, it was much worse. I ran into apps with short timeouts that would have worked just fine if the authors had taken into account that loading can take minutes instead of milliseconds.
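If you're writing such an app, even a generous timeout plus dumb retries goes a long way. A minimal sketch with curl (the URL is a placeholder, and --retry-all-errors needs a reasonably recent curl):

    # give the connection a full minute instead of failing fast,
    # and keep retrying with a pause instead of giving up
    curl --connect-timeout 60 --retry 8 --retry-delay 15 \
         --retry-all-errors -O https://example.com/app-data.json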
An issue that the anonymous blog author didn't have was metered connections. Doing OS or even app upgrades was pretty much out of the question for cost reasons. Luckily, every few weeks or so, we got to a location with an unmetered connection to perform such things. But we got very familiar with the various operating systems' ways to mark connections as metered/unmetered, disable all automatic updates, and save precious bandwidth.
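On Linux with NetworkManager, for example, it's a one-liner (the connection name here is made up; use whatever yours is called):

    # tell the OS this hotspot is metered so updaters back off
    nmcli connection modify "Boat Hotspot" connection.metered yes
    # check what the system currently assumes
    nmcli -f connection.metered connection show "Boat Hotspot"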
The South Pacific should be very sunny. I guess that you didn't have enough solar panels to provide 60+ watts. I am genuinely surprised.
And "local SIM cards" implies that you set foot on (is)lands to buy said SIM cards. Where did you only get 2G in the 2020s? I cannot believe any of this is still left in the South Pacific.
My previous smartphone supported 4G/3G/EDGE, but for some reason the 4G didn't work. At all, ever, anywhere (not a provider/subscription or OS settings issue, and WiFi was fine).
In my country, 3G was turned off a while ago to free up spectrum, so the phone fell back to EDGE all the time.
That phone died recently. I'm temporarily using an older phone which also supports 4G/3G/EDGE, and where the 4G bit works. Except... in many places where I hang out (rural / countryside) 4G coverage is spotty or non-existent. So it also falls back to EDGE most of the time.
Just the other day (while on WiFi) I installed Dolphin as a lightweight browser alternative. Out in the countryside, it doesn't work ("no connection"), even though Firefox works fine there.
Apps won't download unless on WiFi. Not even if you're patient: downloads break somewhere, don't resume properly, or what's downloaded doesn't install because the download was corrupted. None of these issues over WiFi. Same with some websites: round trips take too long, the server drops the connection, images don't load, etc.
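The sad part is that resuming is a solved problem wherever the server supports HTTP Range requests. With curl, for instance (placeholder URL):

    # -C - resumes from wherever the previous attempt died
    curl -C - -O https://example.com/big-download.apk
    # and verify, since a silently truncated file is worse than no file
    sha256sum big-download.apk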
Bottom line: app developers or online services don't (seem to) care about slow connections.
But here's the thing: for the average person in this world, fast mobile connections are still the exception, not the norm. Big city / developed country / 4G or 5G base stations 'everywhere' doesn't apply to a large % of the world's population (who do own smartphones these days, even if low-spec ones).
Note that some low-tier mobile plans also cap connection speeds. Read: slow connection even if there's 4G/5G coverage. There's a reason internet cafés are still a thing around the world.
I live in a developed country with 4G/5G everywhere and it's still no better than the 3G era I remember. Modern apps and sites have gobbled up the spare bandwidth, so the general UX feels the same to the user in terms of latency. On top of that, there are frequent connection dropouts even with the device claiming a decent connection to the tower. Often 4G can't bring the speed to load a modern junked-up news or recipe site in any amount of time.
In the Marquesas and Tuamotus, you don't see a lot of 4G reception, no matter what Vini's pretty map claims.
Re: sunny - there's quite a bit of cloud cover, and other devices onboard like the water maker and fridge (more important than Starlink!) also need a lot of power.
> Low bandwidth, high latency connections need to be part of the regular testing of software.
One size does not fit all. It would be a waste of time and effort to architect (or redesign) an app just because a residual subset of potential users might find themselves on a boat in the middle of the Pacific.
Let's keep things in perspective: some projects even skip testing WebApps on more than one browser because they deem that wasteful and an unjustified expense, even though it's trivial to include more browsers in a test matrix, and that's UI-only work.
Websites regularly break because I don't have perfect network coverage on my phone every single day. In a lot of places, I don't even have decent reception. This is in Germany, in and around a major city.
Why do you think this only applies to people on a boat?
> Websites regularly break because I don't have perfect network coverage on my phone every single day.
Indeed, that's true. However, the number of users that go through similar experiences is quite low, and even those who do are always an F5 away from circumventing that issue.
I repeat: even supporting a browser other than the latest N releases of Chrome is a hard sell to some companies. Typically the test matrix is limited to N versions of Chrome and the latest release of Safari when Apple products are supported. If budgets don't stretch to cover even the basics, of course even rarer edge cases such as a user accessing a service over a crappy network will be far down the list of concerns.
Low bandwidth, high latency connections need to be part of the regular testing of software. For Linux, there's netem (https://wiki.linuxfoundation.org/networking/netem) that will let you do this.
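For anyone who hasn't used it: netem hangs off tc, so emulating a boat-grade link is roughly this (interface name and numbers are just examples; needs root, and the rate option needs a non-ancient kernel):

    # shape egress on eth0: 2G-ish bandwidth, high jittery latency, some loss
    tc qdisc add dev eth0 root netem rate 100kbit delay 600ms 200ms loss 1%
    # remove the emulation again when done
    tc qdisc del dev eth0 root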