Locking out AI bots with the Dark Visitors API
The Robots Exclusion Protocol was developed in the early years of the web to keep small websites from being overwhelmed by search engine crawlers. Its use has since expanded to excluding robots for a variety of reasons, most recently to stop artificial intelligence bots from feeding web content into their large language models. Various hand-picked lists of known AI bots circulate among web developers, and there’s also Dark Visitors, which maintains a comprehensive list of known bots, categorised by type…
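To give a sense of how this works in practice, here is a minimal sketch of fetching a generated robots.txt from the Dark Visitors API. It assumes the `robots-txts` endpoint, the bearer-token authentication, and the `agent_types`/`disallow` request fields as described in the Dark Visitors documentation; the token value and the particular agent-type strings are placeholders you would substitute with your own.

```python
import json
import urllib.request

# Assumed endpoint from the Dark Visitors docs; the token is a placeholder
# for the per-project access token from your Dark Visitors dashboard.
API_URL = "https://api.darkvisitors.com/robots-txts"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

# Ask for a robots.txt that disallows the listed bot categories site-wide.
# The category names here are examples of the types Dark Visitors tracks.
payload = {
    "agent_types": ["AI Data Scraper", "AI Assistant", "AI Search Crawler"],
    "disallow": "/",
}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
)

# The response body is a ready-to-serve robots.txt; write it out verbatim.
with urllib.request.urlopen(request) as response:
    robots_txt = response.read().decode("utf-8")

with open("robots.txt", "w") as f:
    f.write(robots_txt)
```

Since new bots are added to the list over time, a script like this would typically be run periodically (from cron, say) so the served robots.txt stays current rather than being generated once by hand.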