Matt Cutts at Google announced a change in the search giant’s algorithm which will target sites with low levels of original content. The upshot is that sites which create original content will be more likely to gain search exposure for that content than sites which copy it. This ties in neatly with previous posts on the need for quality content: copying someone else’s great-looking articles and blog posts may be quick and easy, but it is also called “cheating”, and search engines will hate you for it.
The Google Algorithm Change
Algorithm changes happen all the time, so by itself this is no big deal. What is a big deal is that the change means if your site contains duplicate content you are going to be penalized and ranked lower in the search engine results. The change means a direct negative impact on your rankings if you have copied content; equally, it means a positive ranking impact if your site carries 100% original content.
The algorithm change is motivated by increasingly negative press about Google’s results. Criticism includes a lack of freshness in the results Google returns, together with the incidence of spam results. Also, site content may be original but of poor quality – after all, anyone can create reams of content which is 100% original but provides no real value to readers.
Sites which carry duplicate content are known as “scraper” sites: if you cut and paste content you are content scraping, though some websites conduct scraping in a fully automated fashion. That said, carrying some duplicate content is inevitable. For instance, if I quote Matt Cutts’ comment on the algorithm change:
“…we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.”
This is effectively duplicate content! The difference between the odd quote and full-scale replication is a question of judgment, but there is a basic principle for webmasters to follow here: make sure your content is as close to 100% original as you can make it, and never plagiarize other sites’ content.
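The difference between an occasional quote and wholesale copying can be thought of in terms of text overlap. As a rough, hypothetical illustration only (Google does not publish how it measures duplication, and real systems are far more sophisticated), a simple word-shingle comparison shows how a short quote scores much lower than a full copy:

```python
# Hypothetical sketch: scoring how much one text duplicates another using
# word "shingles" (overlapping n-word sequences) and Jaccard similarity.
# This is an illustration of the concept, not Google's actual method.

def shingles(text, n=5):
    """Return the set of n-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a, b, n=5):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not (sa | sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "the quick brown fox jumps over the lazy dog near the river bank"
copied = "the quick brown fox jumps over the lazy dog near the river bank"
quoted = "as one writer put it the quick brown fox jumps over the lazy dog"

print(overlap(original, copied))  # a verbatim copy scores 1.0
print(overlap(original, quoted))  # a partial quote scores well below 1.0
```

A page that is mostly quotation would score high on a measure like this; a page with the odd attributed quote would not, which is the judgment call described above.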
Content Originality versus Quality
I’ve written on website content issues recently, dealing with how you are providing content for two sets of readers: the search engine algorithms and human beings. Both are important, but writing and creating content which best serves the human reader is the more important of the two. If you have original content but it’s rubbish, you may be able to fool a search engine, but not for long, because visitor numbers and how long visitors stay on a site are also factors in the algorithm. If your content is poor quality, as a real person I’m simply not going to waste my time with you, and I’ll hit the back button on the browser.
Being original is important, but you also need to deliver real value to your readership – your human readers.
SEO shapes content because it is a primary ranking factor which we can control, and I suspect that the dominance of SEO thinking is where webmasters start to go wrong. The content needs to interest and engage human readers, but the focus is typically on gaining high rank fast or, once at the top of the heap, throwing more content out to attack other keywords and consolidate the existing position. This has serious ramifications for websites which house a lot of original but low-quality content. Google is going to deal with this issue, so expect more algorithm changes to follow.
Create content for real people; the search engines come last. SEO should be a natural, flowing activity which helps your site rather than taking it over. The search engines will not buy your product or service, but real people will if you give them real value.
Being original is good, but you also need to be useful and relevant to your readers.