Not sure the metaphor works here. For example, most sites let Google scrape them as much as it likes, but go out of their way to block other robots. By doing so they are effectively forcing the whole world to use (or support, since smaller search engines have to piggyback on the big ones with special status, and pay them) proprietary spyware.

In your analogy, most websites block everyone except the biggest pervert known to man.
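
For the curious, a minimal robots.txt sketch of that policy (a hypothetical example, not copied from any real site) allows Googlebot everywhere and shuts everyone else out:

    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /

An empty Disallow line means "nothing is disallowed", so Googlebot gets the whole site while every other crawler is told to stay out of /.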

Isn’t that a choice the website owner should be able to make?


Of course it's your choice to make.

Is someone forcing you to respond to requests you'd prefer to ignore?


Yes: people like OP, who get hit by farms of scrapers.

The website owners make their preferences clear with robots.txt, IP blocks and other anti-bot technology. Scrapers intentionally ignore the owners' wishes and force them to respond anyway.
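
For reference, a compliant crawler checks robots.txt before every fetch. Here's a minimal sketch using Python's standard urllib.robotparser (the URLs and bot name are placeholders):

    from urllib import robotparser

    # Fetch and parse the site's robots.txt
    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # placeholder URL
    rp.read()

    # A well-behaved bot asks before fetching; the scrapers in question skip this step.
    if rp.can_fetch("MyBot", "https://example.com/some/page"):
        pass  # fetch the page

The scrapers being complained about simply never do this check.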


If crawlers are stealth DDoSing my site then I lose the ability to respond entirely.
