I am trying to detect whether a visitor is human or not. I just had an idea, but I'm not sure if it will work: I store a cookie in the visitor's browser and retrieve it while they are browsing my site. If I successfully retrieve the cookie, is that a good technique for detecting bots and spiders?
A well-designed bot or spider can certainly store -- and send you back -- whatever cookies you're sending. So, no, this technique won't help one bit.
Browsers are just code. Bots are just code. Code can do anything you program it to do, and that includes cookies.
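To make this concrete, here is a minimal sketch in C# (the URLs are placeholders) of a scripted client that round-trips cookies exactly the way a browser does, so an "I am human" cookie proves nothing:

    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    class CookieAwareBotSketch
    {
        static async Task Main()
        {
            // A CookieContainer makes the client store and resend cookies
            // automatically, just like a browser does.
            var handler = new HttpClientHandler { CookieContainer = new CookieContainer() };
            using (var client = new HttpClient(handler))
            {
                // First request: the server sets its "I am human" cookie.
                await client.GetAsync("https://example.com/");

                // Second request: the cookie goes back automatically, so the
                // server sees exactly what a real visitor would send.
                await client.GetAsync("https://example.com/some-page");
            }
        }
    }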
Bots, spammers and the like work on the principle of low-hanging fruit. They're after as many sites or users as they can get with as little effort as possible. Thus they go after popular packages like phpBB and vBulletin because getting into those will get them into a lot of sites.
By the same token, they won't spend a lot of effort to get into your site if the effort only pays off for your site (unless your site happens to be Facebook or the like). So the best defense against malicious activity of this kind is simply to be different, in such a way that an automatic script already written won't work on your site.
But an "I am human" cookie isn't the answer.
No, as Alex says, this won't work; the typical approach is to use a robots.txt file to ask well-behaved crawlers to behave. Beyond that, you can start to inspect the user-agent string (although this can be spoofed). Any more work than that and you're into CAPTCHA territory.
What are you actually trying to avoid?
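For reference, robots.txt only asks polite crawlers to keep out; it does nothing against a client that ignores it. A minimal file (the path is just an illustration) looks like this:

    User-agent: *
    Disallow: /private/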
You should take a look at the information in the actual HTTP headers and at how .NET exposes it to you; the extra information you have about whoever is hitting your website is there. To see what a browser sends, install the Live HTTP Headers plugin for Firefox and visit your own site. At the page level, the Request.Headers property exposes this information (I don't know whether it's the same in ASP.NET MVC, though). The important header for what you want is User-Agent. It can obviously be altered, but the major crawlers identify themselves by sending a unique User-Agent string, and the major browsers do the same.
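As a rough illustration (classic ASP.NET Web Forms shown here; the page class name and the crawler string are just examples), the check might look something like this:

    using System;
    using System.Web.UI;

    public partial class ExamplePage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // The raw User-Agent header, exactly as the client sent it.
            string userAgent = Request.UserAgent ?? string.Empty;

            // ASP.NET's browser-capabilities data also flags known crawlers.
            bool looksLikeCrawler = Request.Browser.Crawler;

            // Well-known crawlers identify themselves, e.g. Googlebot --
            // but remember any of this can be spoofed.
            bool claimsToBeGooglebot =
                userAgent.IndexOf("Googlebot", StringComparison.OrdinalIgnoreCase) >= 0;

            if (looksLikeCrawler || claimsToBeGooglebot)
            {
                // Log it, serve a lighter page, skip analytics, etc.
            }
        }
    }

In ASP.NET MVC the same data should be reachable through Request.UserAgent and Request.Headers on the controller's Request object.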
I wrote a bot that works with cookies and JavaScript. The easiest way to prevent bots and spam is to use the NoBot component in the Ajax Control Toolkit.
http://www.asp.net/AJAX/AjaxControlToolkit/Samples/NoBot/NoBot.aspx
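If memory serves, you declare the NoBot control in the page markup (something like an ajaxToolkit:NoBot tag with an ID of NoBot1) and then ask it in the code-behind whether the current postback looks human. A rough sketch, with the control ID and handler name as assumptions -- check the toolkit sample above for the exact API:

    using System;
    using System.Web.UI;
    using AjaxControlToolkit;

    public partial class ContactPage : Page
    {
        protected void SubmitButton_Click(object sender, EventArgs e)
        {
            // NoBot runs timing and JavaScript-based checks behind the scenes;
            // the NoBotState value reports why a request was rejected.
            NoBotState state;
            if (!NoBot1.IsValid(out state))
            {
                // Treat the submission as a bot: drop it, log the state, etc.
                return;
            }

            // Normal processing for what looks like a human visitor.
        }
    }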