I was pondering how internet latency seems to be just barely sufficient for a decent fast-paced online multiplayer gaming experience. If human cognition were, say, 20x faster relative to the speed of light, we'd be limited to playing many games only against players in the same city.
More significantly, single-threaded compute performance relative to human cognition would effectively be capped at the equivalent of about 300 MHz (6 GHz / 20), which I suspect would make it a challenge to run even barebones versions of many modern games.
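Putting rough numbers on both scalings - and these are back-of-envelope assumptions on my part (light in fiber at roughly 2/3 of c, a ~50 ms round-trip ping counting as "decent" for fast-paced play, 6 GHz as a ballpark top single-core clock), not measurements:

```c
#include <stdio.h>

int main(void)
{
    /* All inputs are assumptions for a back-of-envelope estimate, not measurements. */
    const double fiber_speed_km_s  = 200000.0; /* light in fiber, roughly 2/3 of c */
    const double decent_ping_s     = 0.050;    /* ~50 ms round trip feels fine for fast-paced games today */
    const double cognition_speedup = 20.0;     /* the hypothetical 20x faster cognition */
    const double top_clock_ghz     = 6.0;      /* ballpark best single-core clock today */

    /* Best-case one-way reach: half the round-trip distance, ignoring routing and switching overhead. */
    double reach_today_km  = fiber_speed_km_s * decent_ping_s / 2.0;
    double reach_scaled_km = reach_today_km / cognition_speedup;

    printf("One-way reach today:    ~%.0f km (cross-continent)\n", reach_today_km);
    printf("Reach at 20x cognition: ~%.0f km (one metro area, at best)\n", reach_scaled_km);
    printf("Effective clock at 20x:  %.0f MHz\n", top_clock_ghz * 1000.0 / cognition_speedup);
    return 0;
}
```

Real-world pings are worse than the fiber-only figure because of routing, queuing, and last-mile hops, so the "same city" intuition is, if anything, generous.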
This led me to wonder how software development would have progressed if CPU clock speeds had effectively been 20x slower.
Might the greater overall pressure for performance have kept us writing lower-level code with more bugs while shipping fewer features? Or has all the free compute to throw around actually gotten us into trouble, because it let us rapidly prototype and eschew more formal methods and professionalization?
Windows 95 could do a decently responsive desktop UI on an 80386. Coding was a lot less elegant in one sense - C code that returns a HWND and all that - but with the layers of indirection and abstraction we rely on these days, we've made some things easier at the cost of making others more opaque.
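For anyone who never wrote that style of code, a minimal Win32 program looks roughly like this (simplified from memory, names and strings purely illustrative):

```c
#include <windows.h>

/* Every window funnels its messages through a callback like this one. */
static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_DESTROY) {
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}

int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, int nCmdShow)
{
    WNDCLASS wc = {0};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInstance;
    wc.lpszClassName = TEXT("DemoWindow");
    RegisterClass(&wc);

    /* CreateWindow hands back an opaque HWND; nearly every later call takes it as the first argument. */
    HWND hwnd = CreateWindow(TEXT("DemoWindow"), TEXT("Hello from 1995"),
                             WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                             CW_USEDEFAULT, CW_USEDEFAULT, 320, 240,
                             NULL, NULL, hInstance, NULL);
    if (hwnd == NULL)
        return 1;

    /* Hand-written message pump: the "event loop" the framework would hide today. */
    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return (int)msg.wParam;
}
```

Verbose, yes, but every call maps fairly directly onto something the machine does, which is exactly the trade-off I'm wondering about.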