There has been a great deal of debate over whether or not duplicate content penalties exist. The truth is that every action you take as a web marketer has consequences, whether or not search engines take manual action. That is what I want to talk about in today’s blog.

Duplicates

Duplicate content comes in three broad types. The first is the exact duplicate, where every bit of content, from the text to the images, videos, and tags, is exactly the same; these copies are sometimes placed on two different inner pages of the same domain. The second is the partial duplicate, where only some elements of your content are an exact match. Finally, there is cross-domain duplication, where an exact or partial copy of your content appears on an entirely different domain.
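To make the exact-versus-partial distinction concrete, here is a rough sketch in Python that compares the text of two pages. The sample strings and the 0.8 similarity threshold are illustrative assumptions, not values any search engine publishes.

```python
# Rough sketch: classify two pieces of page text as exact, partial,
# or not a duplicate. The threshold below is an arbitrary example.
from difflib import SequenceMatcher

def classify_duplicate(text_a: str, text_b: str, partial_threshold: float = 0.8) -> str:
    """Return a simple duplicate-content label for two blocks of text."""
    if text_a == text_b:
        return "exact duplicate"
    similarity = SequenceMatcher(None, text_a, text_b).ratio()
    if similarity >= partial_threshold:
        return "partial duplicate"
    return "not a duplicate"

# Hypothetical example: two product descriptions sharing most of their wording.
page_one = "Our handmade leather wallet ships free within three days."
page_two = "Our handmade leather wallet ships free within five days."
print(classify_duplicate(page_one, page_two))  # likely "partial duplicate"
```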

Manual Action Against Duplicate Content

Posting duplicate content is a mistake many web marketers and website owners still make today. Some do it on purpose to try to trick people into visiting their site, or to monetize others’ content. This is considered unethical, and it may lead to search engines taking manual action against the site’s rankings.

Copy Cat

Google in particular is very stringent about this issue. It penalizes sites in its index if it sees that you posted duplicate content on purpose. Google’s algorithm also works out which version is the original and ranks it higher, filtering out the versions posted by scrapers.

Algorithmic ‘Penalty’

The majority of duplicate content is unintentional, and in such cases Google may not take action to punish the website. Instead, it constantly updates its algorithm to give higher rankings to sites that carry the original content. Your organic placement falls if search engine crawlers detect that your version is not the original.

Avoiding Algorithmic Penalties

There are ways to avoid algorithmic penalties. First, if you have duplicate content within your own website, make sure to update the sitemap and indicate your preferred URL for ranking. You could also remove the exact duplicate and set up a 301 redirect to the page you want to keep. If the duplicate page has no significant traffic, remove it altogether and return a 404 error. You may also leave the page up, but make sure to edit the robots.txt file to block it from search crawlers.
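As a rough illustration of the last three options, here is a minimal sketch using Python and Flask. The routes and URLs are hypothetical, and most sites would handle redirects and robots.txt at the server or CMS level rather than in application code.

```python
# Minimal sketch (Python/Flask) of the 301-redirect, 404, and robots.txt
# options described above. All paths here are hypothetical examples.
from flask import Flask, redirect, abort

app = Flask(__name__)

# Option 1: the duplicate URL permanently redirects to the preferred page.
@app.route("/old-duplicate-page")
def old_duplicate_page():
    # A 301 tells crawlers the move is permanent, so ranking signals
    # consolidate on the preferred URL.
    return redirect("/preferred-page", code=301)

@app.route("/preferred-page")
def preferred_page():
    return "This is the version we want search engines to rank."

# Option 2: a low-traffic duplicate is removed entirely and returns a 404.
@app.route("/retired-duplicate-page")
def retired_duplicate_page():
    abort(404)

# Option 3: keep the page but ask crawlers not to fetch it, by serving a
# robots.txt with a Disallow rule for that path.
@app.route("/robots.txt")
def robots_txt():
    body = "User-agent: *\nDisallow: /kept-but-blocked-page\n"
    return body, 200, {"Content-Type": "text/plain"}

if __name__ == "__main__":
    app.run()
```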

Search engines also provide URL removal tools, which give you control over which duplicate pages are dropped from their index. They also have parameter blocking tools, where you can indicate which pages to ignore and which to crawl.

All of our SEO products include a duplicate content audit as soon as the project starts. Contact us to find out whether your sites are carrying duplicate content penalties.