views: 108

answers: 2

I'd like to make sure that my website blocks automation tools like Selenium and QTP. Is there a way to do that? What settings on a website is Selenium bound to fail with?

+3  A: 

With due consideration to the comments on the original question asking "why on earth would you do this?", you basically need to follow the same strategy that any site uses to verify that a user is actually human. Methods such as asking users to authenticate or to enter text from images will probably work, but this will likely have the side effect of blocking Google crawlers and everything else.

Doing anything based on user agent strings or anything like that is mostly useless. Those are trivial to fake.
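To see why user-agent filtering fails, here is a minimal sketch of a naive server-side check (the function name and blocklist are my own illustration, not anyone's real code). Any HTTP client can send a stock browser string and pass it:

```javascript
// Naive check: reject requests whose User-Agent mentions a known
// automation tool. Trivially defeated by sending a spoofed string.
function looksAutomated(userAgent) {
  const blocklist = ["selenium", "headless", "phantomjs"];
  const ua = (userAgent || "").toLowerCase();
  return blocklist.some((token) => ua.includes(token));
}

// A default automated browser might be caught...
console.log(looksAutomated("Mozilla/5.0 HeadlessChrome/90.0")); // true

// ...but a spoofed ordinary browser string walks right through.
console.log(looksAutomated("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")); // false
```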

Rate-limiting connections or similar measures might have limited effectiveness, but you're likely to inadvertently block legitimate web crawlers too.
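For concreteness, rate limiting usually looks something like the sketch below (an assumption on my part; a real deployment would use a shared store and account for proxies, which is exactly why well-behaved crawlers get caught alongside bots):

```javascript
// Minimal per-IP rate limiter: allow at most maxRequests hits
// per rolling window of windowMs milliseconds.
function createRateLimiter(maxRequests, windowMs) {
  const hits = new Map(); // ip -> timestamps of recent requests
  return function allow(ip, now = Date.now()) {
    const recent = (hits.get(ip) || []).filter((t) => now - t < windowMs);
    recent.push(now);
    hits.set(ip, recent);
    return recent.length <= maxRequests;
  };
}

// Allow at most 3 requests per second from one address.
const allow = createRateLimiter(3, 1000);
console.log(allow("10.0.0.1", 0));  // true
console.log(allow("10.0.0.1", 10)); // true
console.log(allow("10.0.0.1", 20)); // true
console.log(allow("10.0.0.1", 30)); // false: fourth request inside the window
```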

Gian
I agree with @Gian; the best way to prevent somebody from automating interaction with your site is to introduce something like a CAPTCHA: http://en.wikipedia.org/wiki/CAPTCHA
Dave Hunt
Take it to the next level by using reCAPTCHA (http://en.wikipedia.org/wiki/ReCAPTCHA). Then you prevent automated interaction while at the same time helping digitize old books and newspapers!
Zugwalt
thanks for your answer
sms169
A: 

While this question seems strange, it is a fun one, so I investigated the possibilities.

Besides adding a CAPTCHA, which is the best and ultimately the only reliable solution, you can block Selenium RC by adding the following JavaScript to your pages (this example redirects to the Google home page, but you can do anything you want):

<script>
// Selenium RC loads the application in a frame whose parent window
// is the test runner page, so the parent URL gives the automation away.
var loc = window.parent.location.toString();
if (loc.indexOf("RemoteRunner.html") !== -1) {
  // Running under Selenium RC: redirect (or take any other action)
  document.location = "http://www.google.com";
}
</script>

I do not know how you can block other automation tools, and I am not sure whether this check also interferes with Selenium IDE.
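
As a side note (not part of the original answer's technique), browsers driven through the W3C WebDriver protocol expose a `navigator.webdriver` flag set to `true`, so a page can also run an in-page check like this sketch:

```javascript
// Sketch: WebDriver-compliant browsers report navigator.webdriver === true.
// The navigator object is passed in here so the check is easy to test;
// in a real page you would call seemsWebDriven(window.navigator).
function seemsWebDriven(nav) {
  return nav.webdriver === true;
}

console.log(seemsWebDriven({ webdriver: true })); // true: automated session
console.log(seemsWebDriven({}));                  // false: ordinary browser
```

Like the user-agent check, this only catches tools that don't bother to hide; it is a speed bump, not a wall.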

ZloiAdun
thanks for your answer
sms169