views: 246
answers: 4

I'm about to launch multiple affiliate sites on different domains that have one thing in common: the content. Reading about the problem of duplicate content and Google, I'm a little worried that the parent domain or the sub-sites could get banned from the search engine for duplicated content.

If I have 100 sites with a similar look and feel and basically the same content with some minor element changes, how do I go about preventing a ban and getting these indexed correctly?

Should I just prevent the sub-sites from being indexed completely with robots.txt (see the sketch below)?

If so, how will people be able to find those sites? I actually think the parent domain is the only one that should be indexed to avoid the problem, but I would love to hear other expert thoughts.
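By "robots" I mean serving something like this from each sub-site's root, to keep crawlers out of the whole site (illustrative only, not something I've settled on):

    # robots.txt at the sub-site root - tells all crawlers not to crawl anything
    User-agent: *
    Disallow: /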

+1  A: 

You won't get banned straight away. You will have to be reported by a person.

I would suggest launching with the duplicate content and then iterating over it in time, creating unique content that is dispersed across your network. This will ensure that not all sites are spammy copies of each other and will result in Google picking up the content as fresh.

Alex
+2  A: 

If I have 100 sites with a similar look and feel and basically the same content with some minor element changes, how do I go about preventing a ban and getting these indexed correctly?

Unfortunately for you, this is exactly the kind of thing Google downgrades in its search listings to make search results more relevant and less easily rigged or gamed.

Fortunately for us (i.e. users of Google), their techniques generally work.

If you want hundreds of sites to be properly ranked, you'll need to make sure they each have unique content.

DanSingerman
A: 

I would say go ahead with it, but try to work in as much unique content as possible, especially where it matters most (page titles, headings, etc).

Even if the sites did get banned (more likely they would just have results omitted, but it is certainly possible they would be banned in your situation), you're basically in the same spot you would have been in if you had decided to "noindex" all the sites.
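To illustrate, "noindex" here refers to the robots meta tag, which would look roughly like this in the head of every page you want kept out of the results:

    <!-- keeps the page out of search results while visitors can still reach it -->
    <meta name="robots" content="noindex">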

Eric Petroelje
+1  A: 

Google have recently released an update that allows you to include a link tag in the head of pages that use duplicated content, pointing back to the original version. These are called canonical links, and they exist for the exact reason you mention: to be able to use duplicated content without penalisation.
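As a rough sketch (the href is just a placeholder for wherever the original version lives), the tag goes in the head of each duplicate page:

    <!-- on the duplicate page, pointing search engines at the original version -->
    <link rel="canonical" href="http://www.example.com/original-article" />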

For more information, look here:

http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html

This doesn't mean that your sites with duplicated content will be ranked well for that content, but it does mean the original is "protected". For decent rankings on the duplicate sites you will still need to provide unique content.

Nick Allen - Tungle139
+1 since I had never heard of this before - but per the FAQ on that page, this method cannot be used across domains, so it doesn't really apply to the original question.
Eric Petroelje