> The Robots Exclusion Protocol is a mechanism for publishers to discriminate between what users and crawlers are allowed to access, and between different crawlers (for example, allow Bingbot to crawl but not Googlebot).
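The per-crawler distinction the quote describes can be sketched with Python's standard `urllib.robotparser` module; the robots.txt content and URL below are hypothetical, chosen only to mirror the Bingbot/Googlebot example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block Googlebot entirely, allow Bingbot everything.
robots_txt = """\
User-agent: Googlebot
Disallow: /

User-agent: Bingbot
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/page"))  # False
print(parser.can_fetch("Bingbot", "https://example.com/page"))    # True
```

An empty `Disallow:` line permits everything, so the same page is reachable or not depending purely on which crawler asks.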
To me as a search engine end user, this kind of behavior is undesirable. Why would I want a website to selectively degrade my experience because of my choice of search engine or browser?
Brings back horrible flashbacks of “this website is only compatible with IE6”.