views: 24

answers: 1
I have a site whose search ranking has plummeted. It should be quite SEO friendly because it's built using XHTML/CSS and has been run against the SEO Toolkit.

The only things I can think of that may be annoying Google are:

  • The keywords are the same across the whole site rather than being page specific (I can't see why this would be a massive deal).
  • Another URL has been set up that simply points to my site, without redirecting (again, no big deal).
  • Non-UK users are automatically forwarded to the US version of the site, which is a different brand. I guess this could be the problem: if Google spiders my site from the US then it will never see the UK version.

So the question is: is it possible to detect whether a visitor to your site is actually a search engine spidering it? In that case I don't want to do any geo-location.

+2  A: 
  1. Do not use the same keywords across the entire site. Use keywords specific to each page.
  2. Do not let several URLs point directly to the same site, since the inlinks from the different domains will then be treated as belonging to different domains. If you point the extra URLs at the site via a redirect instead, all inlinks will be credited to the target domain and thus increase its "inlink score".
  3. To detect whether a request is from a crawler, use the browsercaps project: http://owenbrady.net/browsercaps/ (see the sketch below).
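A minimal sketch for #3, assuming an ASP.NET site: browsercaps supplies richer browser capabilities definitions, which is the data behind ASP.NET's built-in Request.Browser.Crawler flag. The User-Agent fallback strings are illustrative, not an exhaustive bot list:

    using System;
    using System.Web;

    public static class CrawlerDetection
    {
        // Returns true when the request appears to come from a search engine crawler.
        public static bool IsCrawler(HttpRequest request)
        {
            // Primary check: the browser capabilities data (enriched by
            // browsercaps definitions) flags known crawlers.
            if (request.Browser != null && request.Browser.Crawler)
                return true;

            // Fallback: crude User-Agent sniffing for a few common bots.
            string ua = request.UserAgent ?? string.Empty;
            return ua.IndexOf("Googlebot", StringComparison.OrdinalIgnoreCase) >= 0
                || ua.IndexOf("bingbot", StringComparison.OrdinalIgnoreCase) >= 0
                || ua.IndexOf("Slurp", StringComparison.OrdinalIgnoreCase) >= 0;
        }
    }

Your geo-forwarding code can then skip the US redirect whenever IsCrawler(Request) returns true, so Google always sees the UK version. And for #2, a sketch of the redirect (the target URL is a placeholder): the duplicate domain should answer with a permanent 301 rather than serving the same pages, e.g. in its Application_BeginRequest:

    // Issue an HTTP 301 so inlinks are credited to the canonical domain.
    Response.StatusCode = 301;
    Response.AddHeader("Location", "http://www.example.co.uk" + Request.RawUrl);
    Response.End();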
Andreas Paulsson
Nice point on #2
AJM
@AJM: yes, and unfortunately this is a pretty common error that can make a big difference.
Andreas Paulsson