When I start a new blog, I often go into Settings > Privacy and block search engines from accessing the site until the homepage, secondary pages, and posts are developed, proofread, and checked for errors. Once the site is ready, I go back into Settings and allow search engines to find the pages.

I started a new blog yesterday called SodastreamReviewed.com and followed the exact steps above to block the search engines while I developed the home page. However, after I finished and unblocked the search engines, I was left with a problem.

When I went into Google Webmaster Tools and tried to add a sitemap to be crawled, I received an error stating “URL Blocked by Robots.txt”. Now, WordPress uses what’s called a virtual robots.txt file; in other words, there isn’t actually a file on the server to edit, but when a visitor or bot requests /robots.txt, WordPress generates it on the fly. So, the error about a blocked URL had me a bit confused.
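For context, while the Privacy setting has search engines blocked, the virtual robots.txt that WordPress serves looks roughly like this (the exact output can vary a little by WordPress version):

    User-agent: *
    Disallow: /

That bare "Disallow: /" tells every crawler to stay away from the entire site, which is exactly the kind of rule that produces the “URL Blocked by Robots.txt” error in Webmaster Tools.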

Luckily, I found a great plugin to fix the issue. In the WordPress plugin repository I discovered WP Robots TXT by Christopher Davis. It adds a field to the Privacy settings where you can review the robots.txt content or change it yourself. Once I installed and activated the plugin, my problem was solved.
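If you go this route, what you typically want in that box is an open ruleset along these lines (the sitemap URL is just a placeholder; use your own):

    User-agent: *
    Disallow:
    Sitemap: http://www.example.com/sitemap.xml

An empty "Disallow:" line means nothing is blocked, so crawlers are free to index the whole site.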

I went back into Google Webmaster Tools, added a sitemap for the domain, and all was well again. So, the next time Google reports that your robots.txt file is blocking search engines from indexing your site even though it shouldn't be, try installing this plugin.
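If you'd rather not install a plugin, you can usually get the same result with a small snippet in your theme's functions.php, because WordPress builds the virtual file through its robots_txt filter. This is only a rough sketch of that approach, not the plugin's actual code, and the rules shown are just an example:

    // Rough sketch: override the virtual robots.txt that WordPress generates on the fly.
    // Hooks WordPress's 'robots_txt' filter; adjust the rules to suit your own site.
    add_filter( 'robots_txt', function ( $output, $public ) {
        if ( 0 !== (int) $public ) { // only when "allow search engines" is switched on
            $output = "User-agent: *\nDisallow:\n";
        }
        return $output;
    }, 10, 2 );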

Note: In some cases, Google has cached your robots.txt file, so you just have to wait 24 hours or so until Googlebot returns to your site and fetches it again. Sometimes patience is the answer.
