Googlebot cannot access CSS and JS – What You Should Do…
Thousands of people have been receiving this email throughout the day, as today is the first time Google has been notifying site owners about a policy it actually implemented about a year ago.
Why is this occurring?
About a year ago, Google began rendering the full page of your website as a user would see it. To render the full page, the CSS and JavaScript files it includes must be accessible to the bot, which by default they are not in systems such as WordPress.
Should I modify my site?
Google’s thought process behind this is that if it cannot view your entire site, it will not fully understand what you’re all about, which may result in sub-par rankings, and for the most part, this makes sense. Therefore, we recommend our clients modify their robots.txt file to allow Googlebot to access the protected directories.
The robots dot what file?
It’s really a simple change: go into the root directory of your WordPress install. If a robots.txt file exists, modify the code within to say:
#Googlebot
User-agent: Googlebot
Allow: *.css
Allow: *.js

# Other bot spiders
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
If you do not currently have a robots.txt file, create a new file, name it robots.txt, and place the same code in it. Save your changes and you should be all set.
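If you want to double-check that your new rules behave as intended before Google recrawls your site, here is a small sketch using Python's standard-library robots.txt parser (the domain and bot names are placeholders). One caveat worth knowing: `urllib.robotparser` follows the original robots.txt conventions and does not implement Google's wildcard extension, so the `Allow: *.css` lines are effectively inert in this simulation; Googlebot passes simply because its group contains no Disallow rules.

```python
# Sanity-check the robots.txt rules above with Python's standard library.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
#Googlebot
User-agent: Googlebot
Allow: *.css
Allow: *.js

# Other bot spiders
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot may fetch theme assets (its group has no Disallow rules)...
print(rp.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery.js"))

# ...while other bots are still kept out of the protected directories.
print(rp.can_fetch("SomeOtherBot", "https://example.com/wp-admin/options.php"))

# Ordinary pages remain open to everyone.
print(rp.can_fetch("SomeOtherBot", "https://example.com/blog/hello-world/"))
```

Running this should print True, False, True in that order, confirming that Googlebot is no longer blocked while the admin directories stay off-limits to everything else.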
Some people argue that this change is a further intrusion by Google into files they do not want to allow access to. I tend to disagree; rather, Google is trying to better understand the full extent of your website in order to provide better rankings.
If you need assistance with this setup, why not sign up for one of our WP Cover WordPress plans? This would definitely be included in our small-task support!
Thanks for this. I have had a lot of clients asking for further information.