Cloudflare updates robots.txt for the AI era – but publishers still want more bite against bots

Robots.txt got some much-needed TLC last week, courtesy of Cloudflare’s latest update. 

Cloudflare’s new Content Signals Policy effectively upgrades the decades-old honor system, adding a way for publishers to spell out how they do (and, perhaps more importantly, how they don’t) want AI crawlers to use their content once it’s scraped.

For publishers, that distinction matters because it turns the robots.txt file from a blunt yes-or-no tool into a way of distinguishing between search, AI training and AI outputs. And that distinction goes to the heart of how their content is used, valued and potentially monetized.
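As a rough sketch of what this looks like in practice, a publisher’s robots.txt under the Content Signals Policy can declare separate preferences for search, AI input (content used in AI-generated answers) and AI training. The signal names below follow Cloudflare’s announced scheme, but treat the exact syntax as illustrative rather than authoritative:

```
# Hypothetical robots.txt using content signals:
# allow indexing for search, but opt out of AI training
# and of content being used in AI-generated outputs.
Content-Signal: search=yes, ai-input=no, ai-train=no

User-agent: *
Allow: /
```

The key point is that the signals sit alongside, rather than replace, the familiar Allow/Disallow crawl rules: a crawler may still be permitted to fetch a page while being told not to use it for training.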

Continue reading this article on digiday.com.