If you own a WordPress site and have it set up in Google Webmaster Tools, there’s a good chance you received an email today stating something like, “Googlebot cannot access CSS and JS”. What does this mean? Should you fix it? The answers are relatively straightforward, and we’ll walk you through them:
Why is this occurring?
About a year ago, Google started rendering full versions of web pages, much as a human visitor would see them. The issue is that almost all CMS systems (like WordPress) have files within their wp-includes and wp-admin directories that are, by default, blocked from search engines. So when Googlebot tries to render the full site, it is denied access to those files.
What you should do…
The change is simple: we just need to either create or modify your robots.txt file so that Google’s bots can access the CSS and JS files within your wp-admin and wp-includes directories. To do this, you will need FTP access to your site’s root directory. Once logged in, check whether a robots.txt file already exists. If it does, great; just update its contents with the following:
#Googlebot
User-agent: Googlebot
Allow: *.css
Allow: *.js
# Other bots
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
If not, create a new robots.txt file in your site’s root directory and paste the same text above into it.
Either way, making this change grants Googlebot access to all of your JS and CSS files and prevents any negative SEO impact Google may assess as a result of being unable to render your full site.
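Before uploading the file, you can sanity-check the rules with Python’s built-in urllib.robotparser module; this is just a rough sketch, and example.com stands in for your own domain. One caveat: Python’s parser matches paths literally and ignores the * wildcards that Google’s crawler understands, but because the Googlebot group contains no Disallow lines, it still correctly reports Googlebot as allowed everywhere:

```python
# Sanity-check the robots.txt rules above with the standard library's parser.
# example.com is a placeholder; substitute your own domain when testing.
from urllib import robotparser

ROBOTS_TXT = """\
#Googlebot
User-agent: Googlebot
Allow: *.css
Allow: *.js

# Other bots
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A typical WordPress asset path inside wp-includes:
js_url = "https://example.com/wp-includes/js/jquery/jquery.js"

print(parser.can_fetch("Googlebot", js_url))  # True: Googlebot's group has no Disallow
print(parser.can_fetch("OtherBot", js_url))   # False: wp-includes is disallowed for other bots
```

If the first call prints False, the Googlebot group in your file is still blocking something and needs another look before you upload it.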
As always, if you need any assistance with this, feel free to contact us. You can also visit our partner site, WP Cover, for full security, maintenance, and website update plans that include changes such as this.