View Full Version : duplicate content



Cloughie
06-06-2008, 01:21 AM
Hi,

We have been offered about 100 high quality articles from a VERY reputable author who is well known to our customer base. So, it's super tempting to publish them all on my site www.atlargenutrition.com. Not only for SEO reasons, but our customers will love it, and most probably won't be aware of them already, so it will be pretty unique for them.

My dilemma is that they are currently all published as PDFs on his own site and indexed in Google. Will I get penalised if I publish them on my site?

What's the best way of doing this?

I really want to use this as it will give me 3-6 months worth of strong, regular content updates and add 500+ pages to my site. Just gotta work out the best way of going about it... Prefer not to have to change them!

Cloughie
06-13-2008, 10:15 AM
bump :(

allout
06-16-2008, 09:03 AM
If you post a few mixed with lots of your own unique articles, chances are there will be no problems. If this is your only source of content, Google may mark it as spam or duplicate content. The sites they normally target are those made up entirely of duplicated content.

Cloughie
06-16-2008, 09:07 AM
Thanks,

I had planned to launch them in a staggered fashion and there will be other articles which are exclusive to us alongside these, so on that basis we should be ok.

Anyone else have any thoughts?

Chris
06-16-2008, 01:06 PM
I think the fact that they are in PDF on the original site and in HTML on your site helps as well.

Any chance, though, that the author would be willing to block the ones on his site via robots.txt as part of the arrangement?
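
A minimal sketch of what that robots.txt block could look like on the author's site (the /articles/ path is just a placeholder; the pattern rule is a wildcard extension that Googlebot honours but the original robots.txt standard does not guarantee):

```
User-agent: *
# Block the directory holding the PDFs (hypothetical path)
Disallow: /articles/
# Or block every PDF site-wide (Google-supported wildcard syntax)
Disallow: /*.pdf$
```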

Dan Schulz
06-22-2008, 10:44 PM
I was just about to say the same thing, Chris. It would also be a good idea to submit a removal request to Google in case those PDFs have already been indexed.

bermuda
06-24-2008, 10:24 AM
Maybe the author could also use tags like NOINDEX and NOARCHIVE, though they might take some time to take effect.
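
One wrinkle: PDF files can't carry the HTML robots meta tag, so the equivalent for PDFs is the X-Robots-Tag HTTP response header. A sketch of how the author could send it, assuming his server is Apache with mod_headers enabled (both assumptions, not anything stated in the thread):

```apache
# Send noindex/noarchive on every PDF response (requires mod_headers)
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, noarchive"
</FilesMatch>
```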

Dan Schulz
06-28-2008, 04:11 AM
Bermuda, if the OP has the author block those files with the robots.txt file then there will be no need to use those tags - the search engines never even SEE them in the first place which means bandwidth gets saved (among other wonderful benefits).

Cloughie
07-14-2008, 05:43 AM
Here's a question.

Would I be better off publishing the articles on Wannabebig.com or www.atlargenutrition.com?

Chris
07-14-2008, 10:55 AM
It depends on your long-term goals for the two sites; for the most ALN benefit, though, publish directly there.

Cloughie
07-15-2008, 12:15 AM
On Atlarge it will go then :)

Thanks Chris.

Mr. Pink
10-30-2008, 06:05 AM
Bermuda, if the OP has the author block those files with the robots.txt file then there will be no need to use those tags [like NOINDEX and NOARCHIVE] - the search engines never even SEE them in the first place which means bandwidth gets saved (among other wonderful benefits).

Dan,

Are you saying that search engines completely ignore NOINDEX and NOARCHIVE tags, or are you saying that if you exclude a page with the robots.txt file, then the search engines never see those tags in that particular file, because they never go to that file?

Dan Schulz
11-10-2008, 10:19 AM
They'll never see it - at least they're not supposed to. (Sorry, was dealing with computer issues the past month.)

abusinessfinder
07-29-2009, 01:35 AM
You can always modify the articles a little bit: use synonyms where possible, rework some sentences, place your own comments throughout the article, etc.

I honestly don't believe that duplicate content is such a big issue. You will be amazed at how many other people write similar articles.