The robots.txt file does not exclusively list what not to scrape.
It states which parts of the site crawlers are allowed to access and which are disallowed.
It can also point crawlers to sitemaps as a starting point with more information (e.g. which pages exist and how often they are updated).
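As a minimal sketch of how these rules look and how a crawler might honor them, here is an assumed robots.txt parsed with Python's standard `urllib.robotparser` module. The hostnames, paths, and bot name are purely illustrative, not taken from any real site.

```python
import urllib.robotparser

# Hypothetical robots.txt content: one explicitly allowed sub-path,
# one disallowed section, and a Sitemap pointer for crawlers.
# (Allow is listed first because urllib.robotparser applies rules
# in order, first match wins.)
robots_txt = """\
User-agent: *
Allow: /private/press-kit/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check which URLs a (hypothetical) bot may fetch.
print(rp.can_fetch("MyBot", "https://example.com/private/reports.html"))  # False
print(rp.can_fetch("MyBot", "https://example.com/private/press-kit/"))    # True

# Sitemap URLs listed in the file (Python 3.8+).
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```

In practice you would call `set_url()` and `read()` to fetch the live robots.txt instead of parsing a string, but the permission checks work the same way.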