Identifying files with crawl issues, such as blocked linked images, CSS, and JavaScript, is now easier thanks to a recent update to Google Webmaster Tools.

Looking for Blocked Resources

Google announced that starting March 11, 2015, webmasters have access to the Blocked Resources Report, which identifies which of their website's resources are being blocked from Googlebot.

CSS, JavaScript files, and linked images are often blocked from being crawled by Googlebot, typically by rules in a site's robots.txt file. When this happens, all the hard work that went into making your page beautiful and functional goes to waste. Until now, you likely never even knew which files were being blocked, or why.
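Blocking of this kind usually comes from an overly broad robots.txt rule. A hypothetical example of the pattern (the directory name is made up for illustration):

```
# Hypothetical robots.txt: this Disallow rule blocks Googlebot from an
# entire directory, including the CSS and JavaScript files inside it.
User-agent: *
Disallow: /assets/

# One possible fix: explicitly allow the subdirectories Googlebot
# needs in order to render the page properly.
Allow: /assets/css/
Allow: /assets/js/
```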

Google’s latest update enables webmasters to identify and resolve these types of issues easily.

The report works by first listing the hosts that serve the blocked resources. Clicking on a host reveals a more detailed diagnosis, including a list of the blocked resources (JavaScript, CSS, or images) and a step-by-step guide for dealing with the issues.

Why Do These Blocked Resources Matter?

Google also updated Fetch and Render to answer this question.

When webmasters request a URL to be fetched and rendered, they receive screenshots of how the page looks to Googlebot and to a typical user. This makes it easier to spot blocked resources and understand why they are problematic.

This builds on the Fetch and Render update from October 2014, which gave webmasters an opportunity to optimize resources for better rendering and indexing.

If you have a lot of blocked resources, Google recommends fixing the ones that will make the biggest difference on the page’s visuals when unblocked.
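You can also check locally whether a robots.txt file blocks a given resource. A minimal sketch using Python's standard `urllib.robotparser`, with hypothetical URLs and rules:

```python
from urllib import robotparser

# Hypothetical robots.txt for illustration. Note: Python's parser applies
# rules in file order (first match wins), so the Allow line comes first;
# Google itself resolves conflicts by longest-path matching instead.
ROBOTS_TXT = """\
User-agent: *
Allow: /assets/css/
Disallow: /assets/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Resources a page might reference; URLs are made up for the example.
resources = [
    "https://example.com/assets/css/site.css",
    "https://example.com/assets/app.js",
]

for url in resources:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(f"{url}: {verdict}")
```

Here the stylesheet under `/assets/css/` is allowed, while the script under `/assets/` is reported as blocked, which is exactly the kind of file the report would flag.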

The report, however, will only show files from hosts Google believes you have some influence over. You won't see blocked files from third-party hosts that are shared by many different sites.

Hopefully, the new feature will make it easier to track and unblock resources used by your site.
