There's something people should consider before calling "robots.txt" and adjacent efforts not good enough: the alternative is worse. The only way to reliably exclude AI scrapers would be to turn every platform into a closed-off silo that exposes nothing to the public internet. You'd need an account for each one, plus some way of proving you're human. If the whole internet did that, it would fucking suck. It's as if everything turned into Discord servers you can't peek into. And it's basically what all the AI bros have been saying to justify their scraping: "Don't make things public, it's your own fault."
Do you really want that? I don't. I'd rather see laws passed that make obeying robots.txt et al. mandatory.
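For context, opting out via robots.txt is just a plain-text file at the site root that crawlers are expected to honor voluntarily — which is exactly why it needs legal teeth. A minimal sketch (GPTBot and CCBot are published crawler names for OpenAI and Common Crawl; check each operator's docs for their current user-agent tokens):

```
# robots.txt — advisory only; nothing enforces compliance
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Everyone else may still crawl
User-agent: *
Allow: /
```

Note the asymmetry: this keeps the content fully public for humans and well-behaved crawlers, while a scraper that ignores the file faces no technical barrier at all — only a legal mandate would change that.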