Today's question comes from Gary Taylor, asking from Stratford-upon-Avon in the United Kingdom. Gary asks, "How does Google handle duplicate content, and what negative effects can it have on rankings from an SEO perspective?" This is a question we hear a lot, so we figured it was worth tackling again.

It's important to realize that if you look at content on the web, something like 25% or 30% of all the web's content is duplicate content. There are man pages for Linux mirrored across sites, all those sorts of things. So duplicate content does happen. People will quote a paragraph of a blog and then link to that blog, that sort of thing. So it's not the case that every single time there's duplicate content, it's spam. If we made that assumption, the resulting changes would probably end up hurting our search quality rather than helping it.

So the fact is, Google looks for duplicate content, and where we can find it, we often try to group it all together and treat it as if it's just one piece of content. Suppose we're about to return a set of search results and we've got two pages that are essentially identical. Typically we would say: rather than show both of those pages, since they're duplicates, let's just show one of them and crowd the other result out. Then, if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering to say, "I want to see every single page," and you'd see that other page.

But for the most part, duplicate content is not treated as spam. It's treated as something we need to cluster appropriately and make sure ranks correctly. Duplicate content does happen.
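To make the clustering idea above concrete, here is a minimal sketch in Python of grouping results whose content is identical and showing one representative per group, with a toggle that mimics the "see every single page" filter at the bottom of the results. The URLs, the normalization step, and the hashing are illustrative assumptions for the sketch, not Google's actual pipeline.

```python
import hashlib

def normalize(text: str) -> str:
    """Collapse whitespace and lowercase so trivial formatting
    differences don't defeat duplicate detection."""
    return " ".join(text.lower().split())

def cluster_results(pages, show_all=False):
    """Group pages whose normalized body text is identical and
    return one representative URL per group (the first seen),
    unless show_all is set, which returns the exhaustive list."""
    seen = {}
    results = []
    for url, body in pages:
        key = hashlib.sha256(normalize(body).encode()).hexdigest()
        if key not in seen:
            seen[key] = url       # first copy becomes the representative
            results.append(url)
        elif show_all:
            results.append(url)   # duplicates only shown on request
    return results

# Hypothetical pages: two are duplicates apart from formatting.
pages = [
    ("https://example.com/post", "The quick brown fox."),
    ("https://mirror.example.net/post", "The  quick brown FOX."),
    ("https://example.com/other", "Something else entirely."),
]
print(cluster_results(pages))                 # duplicates collapsed
print(cluster_results(pages, show_all=True))  # exhaustive view
```

Real systems use near-duplicate detection rather than exact hashing, but the user-visible behavior described in the answer (one representative shown, the rest filtered unless you ask) is the same shape.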
Now, that said, it's certainly the case that if you do nothing but duplicate content, and you're doing it in an abusive, deceptive, or manipulative way, we do reserve the right to take action on spam. Someone on Twitter was asking how they could auto-publish an RSS feed to a blog and not have it be viewed as spam. The problem is, if you're automatically generating pages from nothing but an RSS feed, you're not adding a lot of value, so that duplicate content is a little more likely to be viewed as spam.

But if you're just making a regular website, and you're worried because you have the same content on the .com and the .co.uk, or you have two versions of your terms and conditions, an older one and a newer one, or something like that, that sort of duplicate content happens all the time on the web, and I really wouldn't get stressed about having a little bit of it. As long as you're not copying on a massive scale, say, showing the same boilerplate text for every city in every state in the entire United States ("sorry, no dentists found in this city either"), for the most part you should be in very good shape and not really have to worry about it.
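As an illustration of why those boilerplate city pages stand out as duplicates, here is a small near-duplicate check using word shingles and Jaccard similarity, a standard textbook technique rather than Google's actual detector. The sample page texts are made up for the sketch.

```python
def shingles(text, k=3):
    """Set of k-word shingles; the overlap between two pages'
    shingle sets approximates how much text they share."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets, from 0.0 to 1.0."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two hypothetical auto-generated city pages differ by one word;
# a genuine page shares essentially nothing with them.
city_page  = "Sorry, no dentists found in Springfield. Try a nearby city."
city_page2 = "Sorry, no dentists found in Shelbyville. Try a nearby city."
real_page  = "Dr. Smith's dental practice offers cleanings, fillings, and whitening."

print(jaccard(shingles(city_page), shingles(city_page2)))  # higher: shared boilerplate
print(jaccard(shingles(city_page), shingles(real_page)))   # lower: distinct content
```

A crawler scoring thousands of near-identical city pages this way would see them cluster tightly together, which is exactly the "massively copied boilerplate" pattern the answer warns against.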
Learn more about how Cultured Digital approaches SEO.
We've been doing this since 2001 and have helped hundreds of companies around the world. Cultured Digital is a Sunderland SEO company, and we'll be more than happy to put you in touch with current and past clients to talk about how we've helped them grow. Get in touch.