Graber replied that generative AI companies are “already scraping public data from across the web,” including from Bluesky, since “everything on Bluesky is public like a website is public.” So she said Bluesky is trying to create a “new standard” to govern that scraping, similar to the robots.txt file that websites use to communicate their permissions to web crawlers.
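For context, robots.txt is a plain-text file served at a site's root that tells crawlers which paths they may fetch; compliance is voluntary, which is part of why Bluesky sees room for a new standard. The sketch below is purely illustrative of how robots.txt works today, not of Bluesky's proposal, and uses Python's standard urllib.robotparser with a hypothetical domain and bot name.

```python
# Minimal sketch: how a well-behaved crawler checks robots.txt before fetching.
# The domain "example.com" and user-agent "ExampleAIBot" are hypothetical.
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # hypothetical site
parser.read()  # fetch and parse the site's robots.txt rules

# A robots.txt might contain, for instance:
#   User-agent: ExampleAIBot
#   Disallow: /
# which asks that crawler not to fetch anything from the site.

if parser.can_fetch("ExampleAIBot", "https://example.com/posts/123"):
    print("Crawling this URL is permitted by robots.txt")
else:
    print("Crawling this URL is disallowed by robots.txt")
```

Nothing technically prevents a scraper from ignoring these rules, which is why any Bluesky-led standard would, like robots.txt, ultimately depend on companies choosing to respect it.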