If you have a website, you probably have pages you don't want showing up in public search results, such as login pages, thank-you pages, and download pages.

For years, one popular way to keep a page's URL from being indexed was to list the URL under a noindex: directive in a robots.txt file and upload that file to the server hosting the site.
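A typical entry looked something like this, with your own page paths in place of these placeholder ones:

    User-agent: Googlebot
    Noindex: /login/
    Noindex: /thank-you/
    Noindex: /downloads/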

This was always an 'unofficial' method, but it became very common because the alternatives were confusing and time-consuming to implement.

Yesterday, Google announced that as of September 1, 2019, it will no longer recognize this method, and those page URLs may become indexed. In short, if you do not want sensitive pages indexed, you need to take action now.

The official announcement: https://webmasters.googleblog.com/2019/07/a-note-on-unsupported-rules-in-robotstxt.html

Action needed: Insert a noindex robots meta tag into the <head> section of each page you want kept out of the index, before the closing </head> tag.
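The tag itself is a single line. Placed in a page's head, it looks like this:

    <head>
      ...
      <meta name="robots" content="noindex">
    </head>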

See more info here: https://developers.google.com/search/reference/robots_txt

If your site is a WordPress site, using a popular plugin to suppress pages is an option to explore, BUT I don't like using that plugin because it fills my dashboard with "you need to buy stuff from us" notices to the point that I won't use it. So, being who I am, I created a lightweight little plugin that inserts the tag on each page per the new protocol (a simplified sketch of the idea follows below). It's quick and easy.
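For the curious, here is a minimal sketch of how a plugin like that can work. This is not my plugin's actual code: the plugin name, function name, and page slugs below are placeholders, and you would swap in the pages you want suppressed. It hooks into WordPress's standard wp_head action, which runs inside the page head, so the tag lands before the closing </head> tag as required:

    <?php
    /*
    Plugin Name: Simple Noindex Tag (sketch)
    Description: Prints a noindex robots meta tag on selected pages.
    */

    function snt_print_noindex_tag() {
        // Placeholder slugs -- replace with the pages you want kept out of the index.
        $noindex_slugs = array( 'login', 'thank-you', 'downloads' );

        // is_page() accepts an array of slugs, IDs, or titles.
        if ( is_page( $noindex_slugs ) ) {
            echo '<meta name="robots" content="noindex">' . "\n";
        }
    }
    // wp_head fires inside <head>, so the tag prints before </head>.
    add_action( 'wp_head', 'snt_print_noindex_tag' );

Saved as a single .php file in wp-content/plugins/ and activated from the dashboard, something along these lines does the whole job with no upsell notices.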

If you have any questions and would like some assistance, please call and leave me a message.