Okay, as the other posts say: if you want your site to be search-engine friendly, bots will always be able to scrape it.
But there are still a few things you can do, and they may be effective against 60-70% of scraping bots.
Make a checker script like the one below:
If a particular IP is visiting very fast, then after a few visits (5-10) store that IP plus its browser info in a file or DB.
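A minimal sketch of that checker in Python. The thresholds (10 requests in a 5-second window) and the `suspicious` dict standing in for the file/DB are my own assumptions, not anything fixed:

```python
import time
from collections import defaultdict, deque

# Assumed thresholds: flag an IP after more than 10 requests in 5 seconds.
WINDOW_SECONDS = 5
MAX_REQUESTS = 10

_recent = defaultdict(deque)   # ip -> timestamps of its recent requests
suspicious = {}                # ip -> user agent; in practice a file or DB table

def record_visit(ip, user_agent, now=None):
    """Record one request; return True once the IP is flagged as suspicious."""
    now = time.time() if now is None else now
    times = _recent[ip]
    times.append(now)
    # Drop timestamps that have fallen out of the sliding window.
    while times and now - times[0] > WINDOW_SECONDS:
        times.popleft()
    if len(times) > MAX_REQUESTS:
        suspicious[ip] = user_agent
        return True
    return False
```

You would call `record_visit()` once per incoming request (e.g. from your web framework's middleware) and let the background script described next consume the `suspicious` store.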
Next step:
Make another script that keeps checking those suspicious IPs. (This would be a background process, running all the time or scheduled every few minutes.)
Case 1. If the user agent claims to be a known search engine like Google, Bing or Yahoo (you can find more info on user agents by googling), check the IP against the lists at http://www.iplists.com/ and try to match patterns. If the user agent appears faked, ask the visitor to fill in a captcha on the next visit. (You need to research bot IPs a bit more; I know this is achievable, and a WHOIS lookup on the IP can also help.)
Case 2. If the user agent is not that of a search bot, simply ask for a captcha on the next visit.
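One reliable way to implement Case 1 is forward-confirmed reverse DNS, which the major engines themselves recommend for verifying their crawlers: resolve the IP to a hostname, check the hostname belongs to a known crawler domain, then resolve it back and confirm it matches the original IP. A sketch, assuming Python; the domain suffixes listed and the UA keywords are examples, not a complete list, and the DNS functions are injectable so the logic can be tested without network access:

```python
import socket

# Example domain suffixes genuine crawler hostnames resolve to (verify
# the current values in each search engine's own documentation).
KNOWN_BOT_DOMAINS = (".googlebot.com", ".google.com",
                     ".search.msn.com", ".crawl.yahoo.net")

def claims_search_engine(user_agent):
    """Case split: does the UA string claim to be a major search bot?"""
    ua = user_agent.lower()
    return any(name in ua for name in ("googlebot", "bingbot", "slurp"))

def is_genuine_bot(ip, reverse_dns=socket.gethostbyaddr,
                   forward_dns=socket.gethostbyname):
    """Forward-confirmed reverse DNS: the PTR record must be under a known
    crawler domain AND resolve back to the same IP."""
    try:
        host = reverse_dns(ip)[0]
    except OSError:
        return False
    if not host.endswith(KNOWN_BOT_DOMAINS):
        return False
    try:
        return forward_dns(host) == ip
    except OSError:
        return False

def needs_captcha(ip, user_agent):
    # Case 1: claims to be a search bot -> captcha only if the claim is fake.
    # Case 2: ordinary user agent on a suspicious IP -> captcha on next visit.
    if claims_search_engine(user_agent):
        return not is_genuine_bot(ip)
    return True
```

The background script would run `needs_captcha()` over each entry from the suspicious-IP store and set a "show captcha" flag for that IP.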
Hope the above helps :)
Feel free to contact me if you need further help building this (it's kind of interesting stuff).