
It's only not a tech problem if every service has Google levels of server capacity and performance. Websites still regularly fall over when posted on aggregation sites.



They fall over because they're misconfigured, not because websites are bandwidth-limited. The usual culprit is the default Apache config on a $5 VPS: it works fine at low traffic, but throw some more at it and Apache spawns more processes than you have RAM, starving the DB and the OS filesystem cache and then swapping to death.
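
A back-of-envelope sketch of that failure mode (the RAM and per-process figures are assumptions, not measurements from any real box): a sane worker cap is roughly the RAM you can spare divided by the per-child footprint, which comes out far below the prefork default.

    # Rough sizing for Apache prefork on a small VPS: cap the worker
    # count so child processes never outgrow RAM and push the DB and
    # OS filesystem cache into swap. All figures below are assumed.
    total_ram_mb  = 1024  # a 1 GB VPS
    reserved_mb   = 512   # headroom for the OS, the database, and page cache
    per_child_mb  = 40    # rough RSS of one mod_php Apache child

    max_request_workers = (total_ram_mb - reserved_mb) // per_child_mb
    print("MaxRequestWorkers", max_request_workers)  # ~12, vs the prefork default of 256

With the default of 256 children, a traffic spike can try to commit roughly 10 GB of worker memory on a 1 GB machine, which is exactly the swap-to-death spiral described above.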


I'm not just talking about simple static websites. Please, let's not pretend that upgrading everyone to fiber will be sufficient. You know better than that.


I don't know what systemic tech problem you think exists for web hosts. There's no lack of bandwidth on their side of the internet -- heck, if you excluded Netflix and YouTube, over 60% of last-mile capacity in the country would be completely unused during peak hours. The few sites that are bandwidth-limited, like video streaming services, also tie their revenue to consumption -- more bandwidth on the consumer end just widens their potential customer base.


There are many infrastructure changes needed to increase the performance of the Internet besides faster home connections: CDNs / caching / DNS performance / SPDY / distributed computation and on and on. The OP makes it sound like the only thing between us and 0 ms response times on every service and every web page is some politics.

In other words, the reason we don't have Google Fiber levels of performance is that 1) we don't have Google Fiber and 2) the infrastructure to maximize the potential of that last-mile connection isn't there. 2) is a problem!
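
A rough sketch of why 2) matters (plain sockets, port 80, and the host name is just a stand-in): it breaks a page fetch into DNS, TCP connect, time to first byte, and transfer. Only the last phase is the one a faster last-mile connection meaningfully speeds up; the earlier phases are exactly where CDNs, caching, and DNS work pay off.

    import socket, time

    host = "example.com"  # stand-in; substitute any site

    t0 = time.time()
    addr = socket.gethostbyname(host)            # DNS lookup
    t1 = time.time()

    sock = socket.create_connection((addr, 80))  # TCP handshake
    t2 = time.time()

    request = ("GET / HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n" % host).encode()
    sock.sendall(request)
    first_byte = sock.recv(1)                    # server think time plus a round trip
    t3 = time.time()

    while True:                                  # drain the rest of the response
        chunk = sock.recv(65536)
        if not chunk:
            break
    t4 = time.time()
    sock.close()

    print("DNS lookup : %6.1f ms" % ((t1 - t0) * 1000))
    print("TCP connect: %6.1f ms" % ((t2 - t1) * 1000))
    print("First byte : %6.1f ms" % ((t3 - t2) * 1000))
    print("Transfer   : %6.1f ms" % ((t4 - t3) * 1000))  # the only part raw bandwidth helps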




