Hacker News
The web feels slower and more broken each day (nngroup.com)
19 points by kluck on June 27, 2015 | 15 comments


I just came across this post: https://news.ycombinator.com/item?id=6825557 where another user expresses their feelings about the perceived performance of websites these days.

And I feel that it is a real problem too. Developers (like myself and a lot of Hacker News folks) should tackle this. I constantly find myself switching browsers because none really fits the bill. No browser works with all websites - there is too much complexity going on, while in theory it could be so simple...

A lot of the blame goes to heavy use of JavaScript where it is not actually needed. But I am sure there are many more reasons.


I think you've just forgotten how things used to be. If you encounter a BSOD or something today, you get upset. But do you remember how often Windows 3.1 freeware/shareware apps crashed? I also remember sites that stated up front that they only worked with IE 6 or similar, refusing to load at all in the wrong browser.

When I create web sites, I always focus on performance. Most of my pages require just a single item to be downloaded - and no, I'm not inlining stuff, I'm reusing what has already been downloaded. Even the amount of that stuff is minimized. The benefit? On a slow mobile connection my pages load in under a second, versus many sites which might take minutes(!) - yes, that's right - to load.

I'm sure anyone using NoScript has noticed how full of s*t many popular web sites are today. My sites all load from a single domain over https, no third-party junk.
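
The "reusing already downloaded stuff" above comes down to long-lived caching headers, so repeat views fetch almost nothing. A sketch of what such a response might carry - the max-age value and ETag are just examples:

```http
Cache-Control: public, max-age=31536000, immutable
ETag: "v1-styles"
```

With headers like these, a shared stylesheet or logo is fetched once and then served from the browser cache on every subsequent page.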


I try to create fast websites as well. For that reason I use no JavaScript. At all. None. With all the new HTML5 media tags there is really no reason to.
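
For example, a native video player needs no script at all - the browser supplies the controls (file names below are placeholders):

```html
<!-- Native HTML5 video: the browser renders the player UI,
     no JavaScript required. File names are placeholders. -->
<video controls preload="metadata" poster="poster.jpg" width="640">
  <source src="movie.webm" type="video/webm">
  <source src="movie.mp4" type="video/mp4">
  Sorry, your browser does not support embedded video.
</video>
```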

Another problem is that a typical website these days contains a lot of other stuff around the actual content that needs to be loaded, like adverts...


The browsing experience on my quad-core MBP is slower than on the 300 MHz PII I had in 1998 (on a 256k DSL connection), and it isn't one bit better for all the bling. I actually feel pangs of bitterness when I see websites in 1990s movies. Just text and images, like God intended.

At some point, the eye rapists took over all media. TV, movies, the web. Everything is more in your face and terrible than it was 20 years ago.


It should be possible to create visually appealing websites without JavaScript and using standard system fonts.


That guy [1] used search engine autocomplete to find out that many people search for phrases like "why is facebook so slow". This is really just another indication of something being terribly wrong here...

[1] http://calendar.perfplanet.com/2013/why-is-the-web-so-slow/


With properly adjusted content filters, the web today is blazingly fast. The NY Times loads in under 2 seconds, the most popular German news site (spiegel.de) in roughly a second. No external sources loaded.

Very heavy pages like Google Maps do have some speed issues, but I guess that's at least partly down to my Core i5 rather than my internet connection.

Sounds like a client problem to me.


That it is not a problem of "how fast the browser renders stuff" can easily be demonstrated with a small experiment: just create a website that contains an infinite JS loop (maybe through window.setTimeout() instead of a plain recursive call...) and watch the browser break...
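
A minimal page for that experiment might look like this - a deliberate stress test, so don't leave the tab open:

```html
<!-- A deliberately hostile page: a setTimeout loop that does heavy
     work and allocates memory on every tick. Open it and watch the
     tab's CPU and RAM usage climb. -->
<!DOCTYPE html>
<html>
<head><title>Browser stress test</title></head>
<body>
<script>
  var junk = [];
  function spin() {
    // Burn CPU and hold on to memory on each iteration.
    for (var i = 0; i < 1e6; i++) {
      junk.push(Math.random());
    }
    // setTimeout instead of plain recursion, so the stack never
    // overflows and the loop runs forever.
    setTimeout(spin, 0);
  }
  spin();
</script>
</body>
</html>
```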

Sometimes I feel that a website that I chose to load, and that should serve me, instead takes control over my browser and ultimately over my system (RAM, CPU). The browser is not the good "sandbox" it used to be.


Under two seconds? Wow. What kind of latency and bandwidth do you have to NYC? Transfer time should be well under half a second, and processing time should be negligible. If that's what you consider a good loading time, you've done a great job proving the author's point.


With Chrome, the full page is already rendered while there are still resources loading. That makes load times mostly on par with my brain's content-capturing speed.

If that's the article's point, I don't see it.


This leads me to another thing that is really annoying: during page load, stuff bounces around and changes its size and position. It frequently happens that just as I want to click on something, it moves away and I click on some advert - really stupid.


I'm interested in properly adjusting my content filters. How do I do that?


I mostly use uBlock with many lists and a few custom filters/rules for services I don't like. I don't want pages to appear broken, therefore no NoScript etc.


Install RequestPolicy, turn off all third-party requests, turn on requests to the same domain, and turn on the usability/browser subscriptions. Install NoScript, uBlock, etc.
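
With uBlock Origin alone, a rough equivalent of RequestPolicy's default-deny behavior can be sketched with dynamic filtering rules (the "My rules" pane). This is an assumed setup, and example.com is a placeholder for whatever site you choose to unbreak:

```
* * 3p block
example.com * 3p noop
```

The first rule blocks all third-party requests everywhere; the second relaxes that on example.com so it can load its third-party resources again.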


Disappointing that the observations from the article (which is from 2010) still apply, and appear to be even worse.



