Hi all, not really a problem, more of a question. If you could tell the difference between Googlebot and other users browsing your site, and then loaded different content depending on which it is, is there any way Google could find out? After all, they don't send a human to visually check things.
What you're talking about is called cloaking.
They will find out eventually, especially if one of your competitors tips them off, and then you'll get delisted.
The only way you'd be able to tell is via the IP address the HTTP requests come from, or through other request information, most likely the User-Agent header.
Edit: I've just searched on User-Agent and found the following interesting site. I've never used this info, but this may be your solution:
Google sends a different 'User-agent' header than a normal browser, so you can easily use that as the determining factor and route bot requests to a different page. However, bear in mind that they're probably a lot more advanced than you in the arms race against people trying to fool their crawlers.
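Just as a sketch, the check itself is trivial (the function and page names here are placeholders, and the header alone proves nothing since anyone can send a Googlebot User-Agent; verifying the requesting IP, e.g. with a reverse DNS lookup, is a stronger signal):

    // Naive sketch: decide whether a request looks like Googlebot purely
    // from its User-Agent header. Anyone can spoof this string, so it
    // proves nothing on its own.
    function isGooglebot(userAgent) {
      return /Googlebot/i.test(userAgent || "");
    }

    // Hypothetical use inside a server-side request handler:
    // var page = isGooglebot(request.headers["user-agent"]) ? botPage : normalPage;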
Specifically you should probably read this page before attempting shenanigans.
EDIT: actually this page is more apropos.
Google's bots can't and won't run JavaScript, so one thing you can do is serve one set of content in the page and then replace it with JavaScript on load.
People who have all-Flash sites and don't want to be invisible to Google sometimes use this: http://blog.deconcept.com/swfobject/
It's basically a JavaScript library that does the switcheroo.
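The basic pattern looks roughly like this (the movie name, id and dimensions are just placeholders): the div holds plain HTML that crawlers and non-JS browsers see, and SWFObject swaps in the Flash movie for everyone else.

    <div id="flashcontent">
      <!-- Plain HTML version of the content: this is what Googlebot
           and anyone without JavaScript actually sees. -->
      <p>Indexable text describing the site...</p>
    </div>

    <script type="text/javascript" src="swfobject.js"></script>
    <script type="text/javascript">
      // write() replaces the div's contents with the Flash movie.
      var so = new SWFObject("site.swf", "sitemovie", "800", "600", "8", "#ffffff");
      so.write("flashcontent");
    </script>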
But it has problems:
- For legitimate usage it means you have to keep your Flash and HTML code in sync.
- For anyone who doesn't have JavaScript enabled it will not work (they'll see the content you intended for the Google crawler).
- As others have pointed out, if you use it for anything other than making the HTML of your site match the Flash content, and Google ever catches on, you'll be banished pretty much forever.
But for a site that has to be all Flash and that you still want Google to see (and that isn't built on a recent version of Flash, which supposedly makes this unnecessary), you can use it. I've done it before.
Google makes it easy to identify their robot, but they also run a large network of people who allow Google to track their searches and the pages that they retrieve. This way Google can identify sites that display special content for the robot and drop them from the database.
This reminds me of the people who think that they have developed a new uncrackable encryption algorithm, but they won't tell anyone what the algorithm is because it is such a great one. Unfortunately, they end up making the same mistakes that countless others have made before them.
Kudos to you for asking the question openly before building the code.