robots.txt is more like a sign that asks certain people not to look at a bunch of other publicly visible signs.
One can't post a sign in public that tells people not to look at other publicly visible signs and expect the government to arrest or fine those who ignore it.
What if I use curl to pipe web content to my mail so that I can read it in a quirky way? What if I write a Chrome extension to crawl a site? Where does w3m stand?
This is not a question of the tool (UA) but of the intent (mass crawling, indexing, mass-replicating stuff). robots.txt provides hints for crawlers and the like; it's not an ACL determining whether something is public or not.
The ACL is the robots.txt. A door, with or without a lock, doesn't determine whether the place is public or not.
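The "hints, not enforcement" point is visible in how robots.txt is actually consumed: nothing happens unless the client voluntarily consults the file before fetching. A minimal sketch with Python's standard `urllib.robotparser` (the rules and URLs here are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt asking all user agents to stay out of /private/.
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# Purely advisory: the server can't stop a fetch, it can only publish
# its wishes. A polite crawler checks before requesting.
print(rp.can_fetch("MyCrawler", "https://example.com/private/data.html"))  # False
print(rp.can_fetch("MyCrawler", "https://example.com/public/page.html"))   # True
```

A crawler that skips the `can_fetch` call fetches everything anyway, which is exactly why robots.txt is a sign rather than a lock.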