views: 1982

answers: 2

I learned why Request.Browser.Crawler is always false in C# (http://www.digcode.com/default.aspx?page=ed51cde3-d979-4daf-afae-fa6192562ea9&article=bc3a7a4f-f53e-4f88-8e9c-c9337f6c05a0).

Does anyone use a method to dynamically update the crawler list, so that Request.Browser.Crawler becomes really useful?

A: 

You could check (with a regex) against Request.UserAgent.

Peter Bromberg wrote a nice article about writing a Request Logger and Crawler Killer in ASP.NET.

Here is the method he uses in his Logger class:

// requires: using System.Web; and using System.Text.RegularExpressions;
public static bool IsCrawler(HttpRequest request)
{
   // set the next line to "bool isCrawler = false;" to use this method to deny certain bots
   bool isCrawler = request.Browser.Crawler;
   // Microsoft doesn't properly detect several crawlers
   if (!isCrawler)
   {
       // put any additional known crawlers in the Regex below
       // you can also use this list to deny certain bots instead, if desired:
       // just set bool isCrawler = false; for first line in method 
       // and only have the ones you want to deny in the following Regex list
       Regex regEx = new Regex("Slurp|slurp|ask|Ask|Teoma|teoma");
       isCrawler = regEx.Match(request.UserAgent).Success;
   }
   return isCrawler;
}
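To address the "dynamically update" part of the question, one approach is to keep the crawler patterns in a text file and rebuild the regex whenever that file changes, so the list can be edited without recompiling. This is a minimal sketch, not Bromberg's code; the file name (App_Data/crawlers.txt) and the one-pattern-per-line format are assumptions, and a production version would need locking around the reload:

```csharp
using System;
using System.IO;
using System.Text.RegularExpressions;
using System.Web;

public static class CrawlerDetector
{
    // Hypothetical pattern file: one user-agent substring (or regex alternative) per line,
    // e.g. "Slurp", "Ask", "Teoma", "Googlebot".
    private static readonly string PatternFile =
        HttpContext.Current != null
            ? HttpContext.Current.Server.MapPath("~/App_Data/crawlers.txt")
            : "crawlers.txt";

    private static Regex _crawlerRegex;
    private static DateTime _lastLoad;

    private static Regex GetCrawlerRegex()
    {
        // Reload only when the file's timestamp changes (not thread-safe; illustration only).
        DateTime stamp = File.GetLastWriteTimeUtc(PatternFile);
        if (_crawlerRegex == null || stamp > _lastLoad)
        {
            // Join the lines into one alternation; IgnoreCase covers "ask" vs "Ask".
            string pattern = string.Join("|", File.ReadAllLines(PatternFile));
            _crawlerRegex = new Regex(pattern, RegexOptions.IgnoreCase | RegexOptions.Compiled);
            _lastLoad = stamp;
        }
        return _crawlerRegex;
    }

    public static bool IsCrawler(HttpRequest request)
    {
        if (request.Browser.Crawler)
            return true;
        return request.UserAgent != null
            && GetCrawlerRegex().Match(request.UserAgent).Success;
    }
}
```

Editing crawlers.txt then takes effect on the next request, with no app restart needed.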
splattne
Warning - this is *not* fool-proof! If you install certain versions of the Ask.com toolbar (in IE, at least) it will modify the user-agent to include 'Ask' in some form, causing false-positives.
Kurt Schindler
+3  A: 

I've been happy with the results supplied by Ocean's Browsercaps. It supports crawlers that Microsoft's config files have not bothered to detect. It will even parse out which version of the crawler is hitting your site, not that I really need that level of detail.

DavGarcia
Nice! I will check it out.
Click Ok
Thanks for pointing out Ocean's - I've been stuck with a very old set of BrowserCaps on our 1.1 sites for some time now.
Zhaph - Ben Duguid