Why Blocking JS and CSS May be Harmful After Panda 4.0

Published Jul 08, 2014 by James Parsons in SEO
Estimated read time of 3 minutes and 24 seconds
The views of contributors are their own, and not necessarily those of SEOBlog.com


Across the internet, when Panda 4.0 rolled out, people noticed a shift in their rankings. Many went down, of course. Some, surprisingly, went up – an effect of the slackened penalties for small businesses. Still others noticed an interesting correlation: sites that blocked the crawling of their JavaScript and CSS tended to be penalized.

Why Blocking JS and CSS is Bad

First and foremost, Google stepped forward with some responses to the issue. As usual for Google, they aren’t the most clear and concise responses, but they do give some insight into the problem. Essentially, what Google says is that there is no direct penalty for blocking JavaScript or CSS in your robots.txt.

That said, your site ranking might still go down when you block those scripts, for a completely different reason. One good example is a CSS-controlled site layout. If you have worked long and hard on a responsive design, and that design is based on CSS, you’re going to encounter problems when you tell Google not to crawl your CSS. As far as Googlebot can tell, your desktop and mobile sites look identical. That means either your desktop navigation looks thin or your mobile website looks poorly laid out. Both of those can incur Google penalties, or at least hurt your site in some small way.
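To illustrate, a robots.txt along these lines is the kind of configuration that caused trouble. The directory names here are hypothetical, but the pattern is common:

```
User-agent: *
# Blocking these directories means Googlebot fetches your HTML
# but never the files that make the design responsive.
# (Directory names are hypothetical; yours will differ.)
Disallow: /css/
Disallow: /js/
```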

Another scenario cited in the Google response is JSON calls populating your site with content. When the scripts making those calls are blocked from Google’s view, that content may as well not exist. Even if users love your page because of that content, Google won’t promote you for it.
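As a sketch of what that looks like in practice – the endpoint and element ID are hypothetical – a page might fill in its main content like this. If robots.txt blocks the script, Googlebot never runs it and never sees the text it inserts:

```javascript
// Sketch: the article body is loaded by a JSON call after the page loads.
// A crawler that is forbidden from fetching this script sees an empty div.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/article.json'); // hypothetical endpoint
xhr.onload = function () {
  var data = JSON.parse(xhr.responseText);
  document.getElementById('article-body').innerHTML = data.html;
};
xhr.send();
```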

Can Google Read?

One common perception is that Google can’t parse JavaScript and other forms of scripts. SEO advice often says to avoid such scripts and content calls because, as far as Google is concerned, their output may as well not exist on your site. Worse, using JavaScript to obscure spam content is a black hat technique and can earn your site a penalty.


In reality, Google has become a very sophisticated piece of machinery. It’s also backed by thousands of human workers who do, in some cases, go through search results manually to teach the algorithm how to react to certain kinds of content. It’s true that at one time Google could not parse JS or CSS very well; those files obscured site content. By now, however, Google very much can parse and read common scripts, so long as those scripts are well coded. Malformed scripts or broken code can trip up the web crawler, and that can be grounds for a diminished site ranking.
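Here’s a contrived example of the difference. The broken version has an unclosed string and a missing brace, so a parser – whether a browser or a crawler – gets nothing useful out of the script:

```javascript
// Broken: the string literal is never closed and the function never ends,
// so the whole script fails to parse. (Shown commented out so it can't run.)
//   function showPrice() {
//     document.getElementById('price').textContent = '$19.99;

// Fixed: valid syntax that a parser can read and execute.
function showPrice() {
  document.getElementById('price').textContent = '$19.99';
}
```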

A Coincidence of Panda Timing

Google Panda has always focused on thin content, content quality issues and problems with duplicate and spun content. It is a focused algorithm that disregards other types of site issues. If the crawler looking for Panda violations sees a problem with your links, it may report it to Penguin, but Panda itself won’t penalize you for it.

Of course, that was more true before, when Panda was a separate beast. These days, it’s part of the main algorithm, updating along with other parts of search in general. Panda updates still happen, but they can include bleed from other updates in the background.

This is more or less the case for any penalties received for blocking CSS and JS when Panda 4.0 rolled out. Panda doesn’t care about those issues, unless the blocked code is what was providing the content Panda grades. However, in the background, Google is always tweaking results. It’s likely that any webmasters penalized by the much narrower blocked-code parsing issue checked their rankings post-Panda, saw that they had declined, drew the connection with their blocked scripts, and assumed Panda caused the problem. It did not.

Giving Google the Full Picture

Hiding anything on your site is asking for trouble. Telling Google not to parse scripts may be completely inconsequential if those scripts aren’t doing much beyond loading ads, a comments box or some other minor piece of distributed content. Blocking CSS may hide the fact that your H1 tags are orange, but it won’t hide the fact that you use H1 tags, which is the information relevant to Google.
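Here’s a small sketch of why that is. The heading lives in the HTML itself, which Googlebot always downloads; only the stylesheet – and therefore the color – can be hidden:

```html
<!-- index.html: the H1 is in the markup Googlebot downloads directly. -->
<link rel="stylesheet" href="/css/style.css">  <!-- this file may be blocked -->
<h1>Welcome to the Site</h1>

<!-- /css/style.css (hypothetical) holds only the presentation:
       h1 { color: orange; }
     Blocking it hides the color, not the heading. -->
```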

On the other hand, if Google detects that you’re hiding something behind the code that would earn you a penalty, they will find out and they will penalize your site. So that’s not a good reason to block scripts from being parsed.


There are, meanwhile, many good reasons to allow Google to parse your scripts. In the CSS example, Google doesn’t care much about colors and formatting unless they interfere with the user experience in some way. That means hiding your CSS either does nothing or hurts you, while revealing it either does nothing or benefits you.

JavaScript is a larger issue, in that it’s a more robust language that can do more things for your site. CSS has plenty of power now, too, as the above example for responsive design indicates. Essentially, both types of code – and any other you might be using in your web design – are part of your website. You want to show Google your website in the best possible light, which generally means showing them everything. Anything hidden may be suspicious, after all.

Google wants the full picture, so it can analyze the user experience. This includes your responsive design, your script-pulled content, your dynamic user experience and everything else. If you’re doing everything right, presenting a complete picture to Google will benefit your site. After all, you’re giving users a robust site with easy navigation. On the other hand, if you’re doing something wrong, Google will find out sooner or later. It might be through parsing your scripts despite your wishes. It might be through human intervention. It might be in some other way. The point is, you would earn your decline in search ranking whether or not you block Google’s ability to read your scripts.
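If you’ve blocked these resources in the past, undoing it is usually a small change. Here’s a minimal robots.txt sketch that keeps design assets crawlable – the directory names are hypothetical, and note that Allow is a Google-supported extension rather than part of the original robots.txt standard:

```
User-agent: *
# Let Googlebot fetch the files it needs to render the page.
Allow: /css/
Allow: /js/
# Keep genuinely private areas blocked if you must, just not your design assets.
Disallow: /admin/
```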

Written by James Parsons

James Parsons is a blogger and marketer, and is the business development manager at AudienceBloom.com. When he isn’t writing at his personal blog, he is working on his next big project.

