I don't trust robots.txt after a site of mine was systematically removed from Google over the course of three months...because of a crawl-delay directive in the file. It had been there for a decade (and Google ignored it anyway). I removed it, and the pages slowly came back.
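For reference, the directive in question is just a per-crawler rate hint; a minimal robots.txt along those lines (the wildcard user-agent and the delay value here are illustrative, not my actual file) looks like:

    User-agent: *
    Crawl-delay: 10

Google has never officially supported Crawl-delay, which is what made the deindexing so baffling.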
I do have AI bot blocking active on Cloudflare. I wish some sites with a lot of data would find a way to "poison" all the AI bots with false information, to the point that their services become useless. I'm tired of having "AI" shoved in my face by every application I open now; it's not even "intelligence," just an updated algorithm.
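The Cloudflare blocking is a one-click toggle in the dashboard, but if you'd rather express it as an explicit WAF custom rule, something like this expression with the action set to Block should be roughly equivalent (a sketch, assuming the cf.verified_bot_category field is available on your plan):

    (cf.verified_bot_category eq "AI Crawler")

That only catches verified, self-identifying crawlers, of course; anything spoofing a browser user-agent sails right past it.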