What are the ways that US domains can block AI scrapers? Paywalls and CAPTCHAs come to mind, but is there anything we can add to robots.txt that has real teeth against AI scraping? Would we even know if they obeyed it anyway? How do we set traps to catch the crawlers that ignore it and keep this shit out? Something like the snippet below is what I have in mind.
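For robots.txt specifically, my understanding is that the major AI crawlers publish their user-agent strings (GPTBot, ClaudeBot, CCBot, Google-Extended), so a sketch like this would at least state the policy; compliance is voluntary, which is why the decoy path at the bottom is the trap part (the `/ai-trap/` path name is just a placeholder): anything that fetches a disallowed URL is ignoring the file.

```
# Opt-out request for known AI crawlers (user-agent strings as published
# by each vendor; compliance is voluntary, so this is a request, not a block)
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
User-agent: Google-Extended
Disallow: /

# Hypothetical decoy path: never linked anywhere visible, explicitly disallowed.
# Any client that requests it anyway is ignoring robots.txt and can be logged or banned.
User-agent: *
Disallow: /ai-trap/
```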