Using robots.txt, is it possible to restrict robot access for specific query string (parameter) values?
e.g.
http://www.url.com/default.aspx #allow
http://www.url.com/default.aspx?id=6 #allow
http://www.url.com/default.aspx?id=7 #disallow
User-agent: *
Disallow: /default.aspx?id=7 # disallow
Disallow: /default.aspx?id=9 # disallow
Disallow: /default.aspx?id=33 # disallow
etc...
You only need to specify the URLs that are disallowed; everything else is allowed by default. One caveat: robots.txt rules are prefix matches, so Disallow: /default.aspx?id=7 also blocks ?id=70, ?id=7&x=1, and so on. Major crawlers such as Googlebot support a $ end-of-URL anchor (e.g. Disallow: /default.aspx?id=7$) if you need an exact match, but that is an extension, not part of the original robots.txt convention.
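You can sanity-check rules like these before deploying them. A minimal sketch using Python's standard-library robots.txt parser (the file content below is hypothetical, mirroring the rules above; note that urllib.robotparser also uses prefix matching, so it won't honor the $ anchor):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the rules in the answer above
robots_txt = """\
User-agent: *
Disallow: /default.aspx?id=7
Disallow: /default.aspx?id=9
Disallow: /default.aspx?id=33
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# No rule matches these, so they are allowed by default
print(rp.can_fetch("*", "http://www.url.com/default.aspx"))       # True
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=6"))  # True

# Explicitly disallowed
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=7"))  # False

# Prefix matching: this is also blocked by the id=7 rule
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=70")) # False
```

Running a check like this against the URLs you care about is a quick way to catch unintended prefix matches before real crawlers do.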