
I like the way Google handles internationalization, with domains such as google.co.uk, google.nl, google.de etc. I'd like to do this for my own site, but I'm concerned that Google will interpret this as content duplication, particularly across countries that speak the same human language, as there won't be any translation to hint that the content is different. My site is a web application, not a content farm, so is this a legitimate concern? Would I be better off with subdomains of my .com? Directories?

+5  A: 

Most sites use something like this:

http://www.example.com/en-us/
http://www.example.com/en-au/

There's no reason, however, why you can't forward http://www.example.com.au to the second address above. It would be best to send a 301 status code too, telling Google and other bots that the content has moved permanently to the new address.
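For example, here is a minimal sketch of such a 301 redirect in Python using Flask (the framework choice and the handler below are illustrative assumptions, not something from the original answer):

from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical app served at www.example.com.au: permanently
# redirect its root to the consolidated .com address.
@app.route('/')
def au_root():
    return redirect('http://www.example.com/en-au/', code=301)

Any web server or framework can do the same; what matters is the 301 status code, which marks the move as permanent.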

Hosting everything on one domain has the benefit that SEO signals like PageRank are consolidated onto a single domain.

alex
+1  A: 

Yes, it's a legitimate concern. The site with the original content will get the PageRank, while the others will be penalised for duplicate content.

Alex is correct: the best and only way to avoid the penalty is a 301 redirect. That way you also get all of the link juice.

If you want to register different country-code domains for marketing or brand-name reasons, 301-redirect them to your original site. Whether you use subdomains or directories is up to you and how you want the URLs to look to robots and humans.

Neil
+2  A: 

Yes, Google will punish content duplication no matter where you put it: across multiple country domains, different subdomains, different subdirectories, anywhere.

Building on top of Alex's example:

http://www.example.com/en-us/
http://www.example.com/en-au/

Most probably, en-us and en-au will be near-duplicates, assuming the content is the same, since US English is hardly different from AU English.

Solution Suggestion #1: Use the robots meta tag

You can set a robots meta tag in the head section of all secondary pages so that Google knows to skip them. The meta tag looks like this:

<meta name="robots" content="noindex,follow"/>

This tells Google not to index the content of the page but still to follow any links on it.
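If you'd rather not touch the markup, the same directive can be sent as an HTTP header, X-Robots-Tag, which Google honours just like the meta tag. A minimal sketch in Python with Flask (the framework and the /en-au/ path prefix are assumptions for illustration):

from flask import Flask, request

app = Flask(__name__)

@app.after_request
def noindex_secondary(response):
    # Secondary-locale pages get the equivalent of the
    # noindex,follow meta tag via the X-Robots-Tag header.
    if request.path.startswith('/en-au/'):
        response.headers['X-Robots-Tag'] = 'noindex, follow'
    return response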

Following Google's own example, you would have a main domain (in Google's case, google.com) that checks the visitor's location and redirects accordingly. The main domain should be indexable, whereas a secondary domain (the destination of a redirection) should not be, as long as it uses the same language as the main domain (and would therefore be seen as a duplicate).
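A sketch of that location check, again in Flask (the country_for lookup is a hypothetical stub; in practice you would use a GeoIP database or service):

from flask import Flask, redirect, request

app = Flask(__name__)

def country_for(ip):
    # Hypothetical stub: replace with a real GeoIP lookup.
    return 'AU'

@app.route('/')
def main_root():
    # The main domain checks the visitor's location and forwards
    # them to the matching locale section.
    if country_for(request.remote_addr) == 'AU':
        return redirect('/en-au/')
    return redirect('/en-us/')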

Solution Suggestion #2: Specify the canonical

An alternative is to use a canonical hint to tell Google that the same page is accessible from multiple URLs. In Google's words:

Carpe diem on any duplicate content worries: we now support a format that allows you to publicly specify your preferred version of a URL. If your site has identical or vastly similar content that's accessible through multiple URLs, this format provides you with more control over the URL returned in search results. It also helps to make sure that properties such as link popularity are consolidated to your preferred version.
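Concretely, the format that post describes is a link element in the head of every duplicate page, pointing at the preferred URL. Reusing Alex's example URLs, the /en-au/ page would carry:

<link rel="canonical" href="http://www.example.com/en-us/"/>

Google then consolidates indexing signals such as link popularity onto the canonical URL.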

It's best to read about this on the Official Google Webmaster Central Blog itself. :)

Amry