2019 Week 27 SEO News Recap

Google’s robots.txt parser is now open source

Google announced, as part of its effort to standardize the Robots Exclusion Protocol, that it is open sourcing its robots.txt parser. That means the way Googlebot reads and interprets robots.txt files is now available for any crawler developer to study or reuse. It is rare for Google to share anything it does in core search with the open source world, but here Google has published the parser to GitHub for all to access.
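The open-sourced parser itself is a C++ library, so the exact API is best checked in the GitHub repository, but the core job of any robots.txt parser is the same: read the rules and answer whether a given user agent may fetch a given URL. As a rough sketch of that idea (not Google's code), Python's standard library ships an equivalent in urllib.robotparser; the rules and URLs below are made up for illustration:

    from urllib import robotparser

    # A robots.txt body, inlined here for brevity (normally fetched from
    # https://example.com/robots.txt).
    rules = """
    User-agent: *
    Disallow: /private/
    """

    parser = robotparser.RobotFileParser()
    parser.parse(rules.splitlines())

    # The same question Googlebot answers before crawling a URL.
    print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
    print(parser.can_fetch("Googlebot", "https://example.com/public/page"))   # True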

Google to stop supporting noindex directive in robots.txt

Effective September 1, Google will stop supporting unsupported and unpublished rules in the Robots Exclusion Protocol, the company announced on the Google Webmaster blog. That means Google will no longer honor robots.txt files with a noindex directive listed within them. “We’re retiring all code that handles unsupported and unpublished rules (such as noindex) on September 1, 2019. For those of you who relied on the noindex indexing directive in the robots.txt file, which controls crawling, there are a number of alternative options,” the company said.
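For reference, the retired pattern and its main supported replacement look like this. The unofficial robots.txt syntax below is an assumption based on how sites commonly wrote it (Google never documented it), and the path is a placeholder:

    # robots.txt: the unofficial rule Google stops honoring on September 1, 2019
    User-agent: *
    Noindex: /private/

    <!-- Supported alternative: a robots meta tag in the HTML of each page -->
    <meta name="robots" content="noindex">

Google's announcement also lists other supported options: the noindex robots meta tag (in HTML or as an X-Robots-Tag HTTP header), 404/410 status codes, password protection, Disallow rules in robots.txt, and the Search Console Remove URL tool.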

Bing: We Never Supported Noindex In Robots.txt

Frédéric Dubut from Bing said that its search engine has never supported the noindex directive in a robots.txt file, so nothing is changing with Bing on that front. Google, by contrast, did unofficially support it and will stop doing so on September 1st.
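Since Bing never read noindex from robots.txt and Google is dropping it, the portable way to keep a page out of both indexes is to put the signal on the page response itself, for example via the X-Robots-Tag header (the response below is a minimal illustration):

    HTTP/1.1 200 OK
    Content-Type: text/html
    X-Robots-Tag: noindex

Note that a crawler can only see this header if it is allowed to fetch the page, so the URL should not also be blocked by a Disallow rule in robots.txt.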
