
Half the web you don't really want to use anyway... the majority of sites I come across in search results are perfectly fine as static content, and if they require JS just to show that content, I'm more likely to go find the same content somewhere else (i.e., the next search result). A quick sanity check is sketched below.
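A minimal sketch of that sanity check, assuming the `requests` library and a placeholder URL and keyword: fetch the raw HTML without executing any JavaScript and see whether the text you came for is already there. If it isn't, the page is script-rendered and probably not worth whitelisting.

    # Hypothetical check: is the content present in the server-rendered HTML,
    # i.e. visible even with scripts disabled?
    import requests

    def text_present_without_js(url: str, keyword: str) -> bool:
        """Return True if `keyword` appears in the raw HTML fetched without JS."""
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        return keyword.lower() in resp.text.lower()

    if __name__ == "__main__":
        # Placeholder URL and keyword for illustration only.
        print(text_present_without_js("https://example.com/article", "headline"))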


It seems to be getting a lot worse lately. I've been browsing with NoScript for years, both on mobile and desktop, but I think I've caught a case of NoScript fatigue.

HN is one of my main news sources, and because it's driven by link submissions I frequently land on sites I've never visited before. It seems like 90 percent of submissions need at least one round of whitelisting just to see the text content, and frequently a second or third round to get embedded code snippets or other relevant content to load.

It's tiring, and I've noticed that I frequently just give up and copy-paste the URL into an alternative browser without blockers.



