You're right, and that's exactly why I'm questioning why Anubis implemented it this way. Many of the big AI companies are at least honest about their crawlers and send proper user agents (which Anubis blocks outright). So "unethical" companies that spoof a normal user agent gain an advantage under Anubis's current default setup.
I'm aware that end users can modify the rules, but in reality most will just use the defaults.
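For context, the default rules in question are roughly of this shape — a sketch from memory of an entry in Anubis's bot policy file, where the exact field names and file layout are assumptions on my part (check the Anubis docs before copying this):

```json
{
  "bots": [
    {
      "name": "gptbot",
      "user_agent_regex": "GPTBot",
      "action": "DENY"
    }
  ]
}
```

The point being: a crawler that spoofs a stock browser user agent never matches a regex like this, so it falls through to the normal challenge/allow path while the honestly-labeled crawler gets denied.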
Despite broadcasting their user agents properly, those AI companies still ignore robots.txt and waste my server resources. So yes, the dishonest botnets will have an advantage, but I don't give swindlers a pass just because they rob me to my face. I'm okay with defaults that punish all bots.