The Google Duplicate Content Penalty
First of all, was my last post really in March? Whoops! I have been kind of busy. I now have 17 websites on the go and I admit to losing track a little. Overall everything is going well and I am making a small amount of regular cash. The amount coming in is doubling slowly and so I am hopeful about seeing some genuine residual income in the future.
However, this post is about the Google duplicate content penalty and my real-life experiences of it, rather than what SEO “experts” will tell you.
Some Real World Experiences of the Duplicate Content Penalty
Regular readers will know that I joined an online group back at the start of March in an effort to make a regular income from writing. That experiment has not yet run its course, so I have not commented on it much. What has happened is that I now know an awful lot more about SEO and about making sure that people actually see what you write, instead of blogging to an audience that basically consists of friends and the neighbour’s cat. Admittedly, much of what I have learned I still haven’t applied to this site (more’s the pity), but I will do so in time.
The Importance of Google
Like it or not Google is extremely important to webmasters. The majority of search traffic goes through big G. If you rank badly in Google then a significant number of potential visitors will most likely never even see your site. SEO “experts” say content is king, but great content with no traffic is pointless. If you don’t care about traffic then you should write a diary and keep it under the bed, rather than writing web pages.
What I want to talk about today is events surrounding two of the sites I have set up.
The first is a product review site, monetised through Amazon Associates, and the second is an online gaming site that uses auto-feeds to load Flash games.
Despite a recent PageRank update by Google, both of these sites are currently PR0 – which effectively means they are at the bottom of the pile, hidden from search engine traffic like a naughty stepchild locked in the basement. Based on the links I have pointed at these sites, and the results those same links have produced for my other websites, these pages should have seen PageRank improvements and SERP gains.
Google maintains that there is no such thing as a duplicate content penalty. They say that all that happens when two pages show the same content is that they will decide which one is authoritative and show results for that one only. Webmasters are told that, for a product review site, there is no point in listing the same information as Amazon, unless you add significant original content and reader value alongside the content from Amazon’s page. This is exactly what I have done, but it appears that this information is wrong.
On my product review site I show an affiliate link to the product on sale, along with a review that runs from 350 to 550 words. The Amazon text is marked up as a blockquote, as I believed a search engine would recognise it for what it is: additional information from a reliable source. Quoting from an authoritative source is standard academic practice and a principle on which the internet is founded – always cite your source.
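For reference, the page markup followed this general pattern (the product, text, and URL here are invented for illustration):

```html
<!-- My own review: several hundred words of original content -->
<p>I tested this camera for a fortnight and here is what I found...</p>

<!-- The Amazon description, clearly marked as quoted material,
     with the cite attribute pointing at the source page -->
<blockquote cite="https://www.amazon.com/dp/EXAMPLE">
  <p>Official product description as supplied by Amazon...</p>
</blockquote>
```

The `cite` attribute is the standard HTML way to attribute a quotation to its source, though there is no guarantee any particular search engine treats it as exonerating the quoted text.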
But I was wrong.
The whole site remains at PR0 despite having nine incoming PR6 links along with a variety of others. The site does not rank in the top 150 in Google even for its own domain name! This final point is what nails it.
The site you are reading this article on is currently PR2, with only 3 PR6 links and a lot of nofollow PR0 blog links from places where I have left comments. My other sites have ranked with minimal linking, but all my other sites have 100% original content, bar one.
And guess what? The other site is also PR0 and doesn’t rank even for its own domain name, despite being over 600 pages in size. So what is the problem? I suspect that, again, it is duplicate content. The second site imports a game along with a description of what the game is about, how to play it, and so on. This description comes from the game publisher and is definitely duplicate content, as it is widely syndicated. I did not alter the text because I thought it was wrong to change the original author’s work. To me, the service I am offering users is the ability to play games online, not words on a screen. Again, at least as far as Google is concerned, I was wrong.
To be clear on this, I have no problem with Google penalising duplicate content. A few years ago the top ten or twenty web pages for a given search would be pr0n or the same content duplicated by a number of webmasters. This was frustrating and had to stop.
Is Google’s Duplicate Content Advice Wrong?
What annoys me in this instance is that Google advises two things:
The first claim is that duplicate content penalties only affect the page the duplicated content is on. This doesn’t appear to be true: on the product review site about half of the pages carry no Amazon content at all, yet none of them has a PageRank above zero and none ranks appropriately in Google’s SERPs. Incidentally, the same pages rank well in Bing and Yahoo, so this is definitely a Google issue.
The other thing Google implies is that partially duplicated content is acceptable, so long as the page adds significant value beyond that content. Each page of my product review site carries about 100 words from Amazon and, on average, around 450 words of original review, yet the page gets spanked by Google. In fact, not only does the page get spanked, the whole site does, despite half of its pages being 100% original content.
Google maintains that there is no such thing as a site-wide duplicate content penalty. Forgive me, but this looks like a site-wide penalty to me.
What annoys me further is that I regularly see sites with reasonable PageRank and SERPs that provide nothing but duplicate content. How are they getting away with it? Something is clearly wrong here, and as yet I do not understand why. No doubt time and experience will reveal all.
Avoiding the Google Duplicate Content Penalty
Looking to the future, what can be done? Well, I have signed up for copyscape.com’s services. I now know exactly which pages are being flagged for duplicate content and have started the long process of re-writing them.
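Copyscape does the job, but as a rough local check you can also estimate how much of a page overlaps a known source yourself, using word shingles and Jaccard similarity. This is only a sketch of the general technique, not what Copyscape actually does; the example texts are made up:

```python
def shingles(text, n=5):
    """Return the set of n-word shingles (overlapping word windows) in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: |A & B| / |A | B|."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Example: a page that quotes a chunk of product blurb verbatim
blurb = "this camera features a twelve megapixel sensor and optical zoom"
my_page = "my honest review of the camera " + blurb + " which I tested for a week"
overlap = jaccard(shingles(blurb), shingles(my_page))
```

A high score against a syndicated description is a good signal that a page needs rewriting; a score near zero suggests the page is substantially original.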
Once each page is rewritten I will throw a few new links at the sites and affected articles, and hope that Google’s spider notices the content has changed. If that does not work, I guess I will have to look at other options.
One thing is certain, I will be a lot more careful about employing affiliate links and product information in future.