You could check Request.UserAgent against a regex.
Peter Bromberg wrote a nice article about writing an ASP.NET Request Logger and Crawler Killer. Here is the method he uses in his Logger class:
public static bool IsCrawler(HttpRequest request)
{
    // Change the next line to "bool isCrawler = false;" to use this method
    // to deny certain bots instead.
    bool isCrawler = request.Browser.Crawler;
    // Microsoft doesn't properly detect several crawlers.
    if (!isCrawler)
    {
        // Add any additional known crawlers to the Regex below.
        // To deny certain bots instead, set "bool isCrawler = false;" on the
        // first line of the method and list only the bots you want to deny here.
        Regex regEx = new Regex("Slurp|slurp|ask|Ask|Teoma|teoma");
        isCrawler = regEx.Match(request.UserAgent).Success;
    }
    return isCrawler;
}
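As a minimal sketch of the same idea outside the ASP.NET pipeline, the check boils down to a single regex match on the user-agent string. The bot list below is illustrative, not exhaustive, and using RegexOptions.IgnoreCase avoids having to list both "Slurp" and "slurp":

```csharp
using System;
using System.Text.RegularExpressions;

class CrawlerCheckDemo
{
    // Hypothetical standalone version of the check, taking the raw
    // user-agent string instead of an HttpRequest.
    static bool LooksLikeCrawler(string userAgent)
    {
        if (String.IsNullOrEmpty(userAgent)) return false;
        // Illustrative bot names only; extend as needed.
        Regex regEx = new Regex("Slurp|ask|Teoma|Googlebot", RegexOptions.IgnoreCase);
        return regEx.IsMatch(userAgent);
    }

    static void Main()
    {
        Console.WriteLine(LooksLikeCrawler("Mozilla/5.0 (compatible; Googlebot/2.1)")); // True
        Console.WriteLine(LooksLikeCrawler("Mozilla/5.0 (Windows NT 10.0)"));           // False
    }
}
```

Note that user-agent strings are trivially spoofed, so this kind of check is only a heuristic for well-behaved crawlers.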