I have found Bot Trap to be extremely effective in blocking the sneakiest of the web scraper bots.
Bot Trap works by placing a hidden link on your homepage. That link can only be seen in the source code of the page. Ergo, the only things that should see and follow that link are web robots. But that link is disallowed in robots.txt, so polite robots will never try to follow it.
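To make that concrete, the trap boils down to two pieces. The /bot-trap/ path below is just an example; the real path is whatever you configure. A link that no human will ever see goes into your homepage HTML:

<a href="/bot-trap/"></a>

And robots.txt gets a matching rule telling polite robots to stay out:

User-agent: *
Disallow: /bot-trap/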
Robots that do follow that link automatically get Deny statements added to your site's .htaccess file.
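The end result is a growing blacklist section in .htaccess that looks something like this. This is only a sketch; the exact lines Bot Trap writes may differ, and these addresses are placeholders from the reserved documentation ranges:

Order Allow,Deny
Allow from all
Deny from 203.0.113.42
Deny from 198.51.100.7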
Sometimes legitimate web robots get out of sync with your robots.txt and wander into the trap, so to let the script run unattended, I recommend whitelisting them in your .htaccess file.
Bot Trap stops people from stealing your content to place on MFA sites, and it stops people from downloading your entire site for offline reading. I constantly see people trying to download my fifty-thousand-page web sites. It's a complete waste of bandwidth.
Here's my current whitelist:
Allow from 127.0.0.1
# Google
Allow from 126.96.36.199/19
Allow from 66.249
Allow from 188.8.131.52/18
Allow from 184.108.40.206/16
# MSN
Allow from 65.55
Allow from 207.46
# Yahoo!
Allow from 67.195
Allow from 72.30
Allow from 74.6
Allow from 202.160
# Baidu
Allow from 220.127.116.11
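Note that Apache treats a partial address like 65.55 as a prefix match, so that one line covers everything from 65.55.0.0 through 65.55.255.255.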
Bot Trap is friendly, though. Blocked users see a message telling them they've been blocked, and they only have to enter the word "access" into a form to be automatically unblocked.
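That unblock page can be as simple as a form like this. Again, just a sketch; the actual field and script names in Bot Trap will differ:

<form method="post" action="/bot-trap/unblock.php">
Enter the word "access": <input type="text" name="word">
<input type="submit" value="Unblock me">
</form>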
Every blocking or unblocking action generates an email to the site admin.
It's really a beautiful script.