views: 101

answers: 1

I've got a bunch of ajaxified links that do things like vote up, vote down, flag a post - standard community moderation stuff.

Problem is that the googlebot crawls those links, and votes up, votes down, and flags items.

Will adding this to robots.txt prevent the googlebot from crawling those links? Or is there something else I need to do?

User-agent: Googlebot 
Disallow: /item/*/flag/
Disallow: /item/*/vote/

Thanks!

EDIT: I changed the method to POST, but googlebot is still flagging posts.

The syntax is:

<a href="#" onclick="$.ajax({data: 'authenticity_token=' + encodeURIComponent('blahblahblah'), dataType: 'script', type: 'post', url: '/item/flag/236?method=post'}); return false;">

any thoughts?

+3  A: 

You should change these links to use POSTs rather than GETs; then you won't have to worry about crawlers. There are other reasons to do this: web caches may decide to cache the GET, and therefore not actually hit your server when the user clicks the link. They won't cache POSTs.
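As a rough sketch of what that can look like (the CSS class, data attribute, and the exact endpoint shape are assumptions, not from the question), keep any real URL out of the href so a crawler has nothing to follow, and send the POST from script:

<!-- the href carries no real URL, so a crawler has nothing useful to follow -->
<a href="#" class="flag" data-item-id="236">flag</a>

<script>
  // On click, send the action as a POST instead of a GET.
  // The /item/flag/:id endpoint and the token value are placeholders.
  $('a.flag').click(function () {
    var id = $(this).attr('data-item-id');
    $.ajax({
      type: 'POST',
      url: '/item/flag/' + id,
      data: { authenticity_token: 'blahblahblah' },
      dataType: 'script'
    });
    return false; // keep the browser from following the href
  });
</script>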

Ned Batchelder
+1. Don't use links for unsafe operations. http://www.w3.org/2001/tag/doc/whenToUseGet.html#checklist
David Dorward
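A minimal non-JavaScript version of that advice (the markup is assumed; the URL and token are the placeholders from the question) is to put the unsafe action behind a form button rather than a link, since well-behaved crawlers won't submit a POST form:

<!-- a POST form instead of a crawlable GET link -->
<form method="post" action="/item/flag/236">
  <input type="hidden" name="authenticity_token" value="blahblahblah">
  <button type="submit">flag</button>
</form>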
thank you guys!
kareem
So...how about accepting the answer? That was good advice
Rafe Lavelle