
Why is discriminating against robots unfair? There are valid reasons to do it (for instance, robots take a lot of resources to serve and don't generate revenue).



Just because it's a robot doesn't mean it takes more resources to load a page. A robot that loads 1000x more pages than a normal user, sure. But then rate-limit everyone rather than blocking bots specifically.
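
To illustrate, here's a minimal token-bucket limiter in Python that treats every client the same, bot or not. Keying by client IP and the 5 req/s rate with a burst of 10 are assumptions for the sketch, not anything from the article:

    import time

    class TokenBucket:
        """Refills at `rate` tokens/sec, holds at most `capacity` tokens."""
        def __init__(self, rate=5.0, capacity=10.0):
            self.rate = rate
            self.capacity = capacity
            self.tokens = capacity
            self.last = time.monotonic()

        def allow(self):
            now = time.monotonic()
            # Refill in proportion to elapsed time, capped at capacity.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1.0:
                self.tokens -= 1.0
                return True
            return False

    buckets = {}  # client key (e.g. IP address) -> TokenBucket

    def handle_request(client_ip):
        bucket = buckets.setdefault(client_ip, TokenBucket())
        if not bucket.allow():
            return 429  # Too Many Requests, human or robot alike
        return 200

A heavy scraper burns through its bucket and gets 429s; a normal reader never notices the limiter exists.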

And whether a bot leads to revenue depends on why it's crawling your page, no? If it's an indexer that links back to your website, and it's a popular index, you may end up with more revenue thanks to that bot than from a normal user.


Accepting robots + humans takes more resources than only accepting humans.

Your arguments about revenue are website-dependent, and the website owner is in the best position to decide whether robots are good for them (and plenty of sites don't ban bots in their robots.txt). In this case, the company that ran the bots directly competes with LinkedIn's products that sell aggregated data to employers and such, and LinkedIn clearly decided it's not going to lead to more revenue for them.
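
Side note: a well-behaved bot can honor robots.txt with Python's standard urllib.robotparser. A sketch, with example.com and the "ExampleBot" user agent as placeholders:

    from urllib import robotparser

    # Hypothetical crawler checking whether it may fetch a page.
    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetches and parses the file

    if rp.can_fetch("ExampleBot", "https://example.com/some/page"):
        print("allowed to crawl")
    else:
        print("disallowed by robots.txt")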


My browser is a robot that renders your page.


What exactly is the difference between a robot and a person using a browser?

Does an ad-blocking browser count as a bot or as a human? What about something that concatenates all of your infinite scrolling into a paginated view? Something that changes the structure of your page? Something that concatenates different pages before displaying them?



