Industry giant Google has officially announced on its Twitter page that it will be “saying goodbye to undocumented and unsupported rules in robots.txt”.
A post on Google’s official blog explains that this decision was made “in the interest of maintaining a healthy ecosystem and preparing for potential future open source releases.” Beginning September 1, 2019, “all code that handles unsupported and unpublished rules (such as noindex directive)” will no longer be supported by Google.
Google Solutions for Page Indexing
However, for companies that relied on robots.txt to keep their pages from being crawled, Google has provided some alternatives on its blog:
Noindex in robots meta tags: According to the blog post, this is the most effective way to remove your URLs from the index when crawling is allowed.
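The meta-tag approach is a single line of HTML in the page’s head; for non-HTML files such as PDFs, the same noindex signal can be sent as an X-Robots-Tag response header instead. A minimal Python sketch (the helper names are ours, not Google’s):

```python
def noindex_meta_tag() -> str:
    """Return the robots meta tag to place inside the page's <head>."""
    return '<meta name="robots" content="noindex">'

def noindex_header() -> tuple:
    """Return the equivalent HTTP response header, usable for non-HTML resources."""
    return ("X-Robots-Tag", "noindex")
```

Either form tells Google to drop the page from its index the next time it is crawled.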
404 and 410 HTTP status codes: Returning one of these “not found” / “gone” status codes tells Google that the page no longer exists, so it will be dropped from the index once it is recrawled.
Password Protection: Placing pages behind a login will generally stop them from being indexed. If, however, you want password-protected pages to remain indexed by Google, you can use markup that identifies the content as subscription or paywalled content.
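Google’s documentation describes this markup as schema.org JSON-LD structured data using `isAccessibleForFree`. As a sketch, the snippet could be generated server-side like this (the function name and CSS class are hypothetical):

```python
import json

def paywall_jsonld(headline: str, section_css_class: str) -> str:
    """Build a JSON-LD script tag marking part of an article as paywalled,
    so Google can index it without treating the paywall as cloaking."""
    data = {
        "@context": "https://schema.org",
        "@type": "NewsArticle",
        "headline": headline,
        "isAccessibleForFree": False,  # the article is behind a paywall
        "hasPart": {
            "@type": "WebPageElement",
            "isAccessibleForFree": False,
            # CSS selector for the part of the page that is paywalled.
            "cssSelector": "." + section_css_class,
        },
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"
```

The resulting script tag goes in the page’s HTML alongside the paywalled section it points at.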
Search Console Remove URL tool: This tool lets you quickly remove a URL from Google search results on a temporary basis.
If your business has been relying on these unsupported rules, don’t wait until September to make the required updates.
Contact CY Digital NET today. Our SEO experts will review your website and make sure your indexing rules are set up correctly.
Whether you want to build a website or improve your rankings in search results, we can help.