
Duplicate Content

I am working on a financial qualifications site of mine to which I would like to add the syllabus details of the qualifications. Would there be any negative effect in doing this in terms of SEO owing to it being 'duplicate content'?
 
If you copy/paste, it's not great.

If you add expertise and comment, it's fine (assuming there are no copyright issues?)
 
The copyright issue has been sorted. I think I will go ahead with adding some commentary, expertise and explanations to it.

Thanks for your help.
 
Or just don't index the page...

Yes, it's always worth remembering that your site should serve visitors as well as the search engines. So if you can give visitors a better experience by providing near-duplicate content on different qualifications, it's always an option to block the indexing of the pages in question: via nofollowing the links to them, NOINDEX in the header, and/or exclusions in the robots.txt file for the directory that they're in.
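
To illustrate the page-level option mentioned above: NOINDEX is a single meta tag in each page's <head> (or the equivalent X-Robots-Tag HTTP response header). A minimal sketch:

    <!-- in the <head> of any page you want kept out of the index -->
    <meta name="robots" content="noindex, follow">

The "follow" part lets the spiders keep passing link equity through the page even though the page itself stays out of the index.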
 
I think you're misunderstanding what duplicate content is. Having snippets of information or paragraphs on a page is not duplicate content and would not cause a penalty; it's syndicated or shared content.

Providing you have your own content on the page as well, and it's not a direct copy of the page you are referencing, you'll be fine. Adding credit linkbacks as references to the information is also a good idea, to show your source.
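
For instance, a credit linkback could be as simple as the following (the awarding body name and URL here are hypothetical placeholders):

    <!-- credit link back to the original source of the syllabus -->
    <p>Syllabus details reproduced courtesy of
      <a href="https://example.org/qualifications/xyz">the XYZ awarding body</a>.</p>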

Duplicate content is massively misunderstood. Although a penalty as such can happen, unless you've copied the entire page, including its structure and elements, it's unlikely to cause a problem. If there are too many similarities and you haven't made the page "your own" enough, it simply won't carry as much weight; it won't be penalised as such.
 
If a specific page on your site gives users a good experience, then blocking search engines from accessing it is a weird line of thinking.

If a page on a site is useful to a user then you should be doing all you can to make it accessible to search engines.

Using noindex and robots.txt combined is a bad idea: if robots.txt stops a page being crawled, Google never fetches it, so it never sees the noindex directive, and the URL can still end up indexed from external links. Nofollowing the links to them is no better.

I wouldn't block them anyway. You're not going to get penalised for having duplicate syllabuses on there. Look at lyric websites to put your mind at ease; they all rank fine.
 
Didn't a load of them get hit very hard lately?

Well, Rap Genius got a manual penalty after being caught encouraging bloggers to embed links, but it has some ties to Google investors, so the penalty didn't last long. I don't know of any other notable ones, though.

The lyrics industry is always going to be rife with algorithmic/filter or manual penalties, as it's pageview-economy stuff and they're pushing the boat out with links to rank #1. But that's not really about on-page content.

A site that puts its own unique spin on lyrics isn't going to be very popular :D unless it can do a better job than the original :lol:

My point was more that there are multiple sites all ranking with identical content, so it's not something I'd worry about in this instance, when it's industry syllabuses.

It's a case-by-case thing really. For this one, I wouldn't care if they're identical. If you can add some of your own unique stuff to sit alongside the syllabus, then happy days; if not, it's not a deal breaker.

On the flip side, if you're building an ecommerce site and just using manufacturer descriptions and nothing else, then you're going to have a very bad day.
 
There was indeed another cull by Google which seemed to target lyric sites and the like that have thin content, or content with little else to add (or so it's being said). That's not duplicate content, though.

Good article on it here recently: http://www.hmtweb.com/marketing-blog/april-2014-google-algorithm-updates-song-lyrics/

I saw a number of webmasters talking about it in post comments on Search Engine Land last month, but I can't find the post now.
 
I don't agree at all that blocking such pages is a weird line of thinking.

Imagine a site selling "widgets" that differ in a few aspects but are broadly the same. Each widget has a spec sheet, safety notice, etc. associated with it that could be 99% the same. That information is super-useful for users, because they could very well have come straight to that widget's page from the search engines (rather than navigating to it via homepage->category style navigation).

In other words, the user is coming fresh to the material, and having all the details on that widget just a click away is great for providing reassurance in advance of a sale.

However, Google's spiders spider pretty much EVERYTHING.

So Google might see 1,000 near-identical pages in the case of 1,000 widgets being on sale on that site. Hence the need to block certain material from the spiders, not from users.
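
As a rough sketch of that spider-only blocking (assuming, hypothetically, that the near-identical spec sheets all live under a /spec-sheets/ directory):

    # robots.txt - ask all crawlers to stay out of the near-duplicate spec pages
    User-agent: *
    Disallow: /spec-sheets/

Users still reach those pages through the site's normal links; only the spiders are asked to keep out. (Bear in mind the caveat above: robots.txt stops crawling, not indexing, so a blocked URL can still surface in results if it's linked to from elsewhere.)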
 
Instead of using widgets, give a real example and we'll go from there.
 