Websites are Blocking the Wrong AI Scrapers (Because AI Companies Keep Making New Ones)
This is an example of “how much of a mess the robots.txt landscape is right now,” the anonymous operator of Dark Visitors told 404 Media. Dark Visitors is a website that tracks the constantly shifting landscape of web crawlers and scrapers, many of them operated by AI companies, and helps website owners keep their robots.txt files updated to prevent specific types of scraping. The site has seen a huge increase in popularity as more people try to block AI from scraping their work.
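As a rough sketch of the kind of update Dark Visitors helps with, a site owner might add per-crawler entries like the following to robots.txt. The user agents shown (OpenAI's GPTBot, Common Crawl's CCBot, and Google-Extended) are a small illustrative sample, not a complete list; as the article notes, any such list goes stale as companies introduce new crawlers.

```
# Block some known AI-related crawlers (illustrative, not exhaustive)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Note that robots.txt is purely advisory: it only works against crawlers that choose to honor it, and only if the right user-agent names are listed.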
“We just blocked everyone and everything. We don’t particularly care about it being found; it’s not like the blogosphere is still a viable entity.”