So my company has made an affiliate deal with a news portal. They link to our e-shop in a prominent place on their home page; in return, we share the profit from any conversions that come through their link, and we also display their advertisements to visitors who reach us through them. That's all fine and dandy.

The unfair part is that at the end (it wasn't part of the deal) they insisted that our site be linked and made accessible through a subdomain of theirs (i.e. oursite.their-site.com).

To be clear, we do have our own site; they just point their subdomain at us under the URL above, siphoning off our hard-earned SEO in the process. Eventually, keywords that should bring visitors to our site will bring them to theirs, showing those visitors intrusive ads we get no revenue from and, more importantly, forcing us to share profits from the resulting sales with the news portal, even though those users found us via, say, Google rather than through the portal's link.

To counter this, I want to prevent search engines from indexing the site when it is reached via the subdomain. Two ways of doing this come to mind, but neither seems quite right to me:

  • serve a custom, deny-all robots.txt when the site is served under their subdomain (see the sketch after this list)
  • redirect requests that reach the subdomained site back to our own domain if the referer is not the news portal
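
A minimal sketch of the first idea, assuming both host names point at the same application; Flask and the host names are placeholders taken from the question, not anything specified there:

    from flask import Flask, Response, request

    app = Flask(__name__)

    # Placeholder host name from the question; adjust to the real subdomain.
    PORTAL_HOST = "oursite.their-site.com"

    DENY_ALL = "User-agent: *\nDisallow: /\n"   # block every crawler
    ALLOW_ALL = "User-agent: *\nDisallow:\n"    # allow everything

    @app.route("/robots.txt")
    def robots_txt():
        # Pick which robots.txt to serve based on the Host header the request
        # arrived with (strip any port, compare case-insensitively).
        host = request.headers.get("Host", "").split(":")[0].lower()
        body = DENY_ALL if host == PORTAL_HOST else ALLOW_ALL
        return Response(body, mimetype="text/plain")

Since the decision is keyed off the Host header, it applies to crawlers and visitors alike, regardless of where they came from.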

Has anyone encountered something similar? What would be the best way to solve this? (Renegotiating the deal is out of the question, sadly.)

+1  A: 

"serve a custom, deny-all robots.txt when under the domain" is the right option, it is simple and effective.

David Dorward
+4  A: 

Google supports a rel="canonical" link element that you can use to tell Google (and anyone else who supports it): "Even though you saw this page at http://oursite.theirsite.com, the URL I'd like you to remember is http://oursite.com."
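
The element goes in the <head> of every page served under the subdomain and points at the same page on your own domain; the path below is just a placeholder:

    <link rel="canonical" href="http://oursite.com/some-page" />

Search engines that honour it will then index and rank the http://oursite.com URL rather than the subdomain copy.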

James Polley