Here is the scenario:

We are load testing a web application. The application is deployed on two VM servers with a hardware load balancer distributing the load.

There are two tools used here: 1. HP LoadRunner (an expensive tool). 2. JMeter (free).

JMeter was used by the development team to test with a huge number of users. It also does not have a licensing limit the way LoadRunner does.

How are the tests run? A URL is invoked with some parameters; the web application reads the parameters, processes them, and generates a PDF file.

When running the test with LoadRunner, we found that for a load of 1000 users spread over a period of 60 seconds, our application took 4 minutes to generate 1000 files. When we pass the same URL through JMeter, with 1000 users and a ramp-up time of 60 seconds, the application takes 1 minute and 15 seconds to generate 1000 files.
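To put the two runs side by side, the completion times above imply very different effective throughput (this is just arithmetic on the numbers already quoted, not new measurements):

```python
# Observed completion times for generating 1000 PDF files,
# taken from the two test runs described above.
files = 1000
loadrunner_seconds = 4 * 60   # 4 minutes = 240 s
jmeter_seconds = 60 + 15      # 1 min 15 s = 75 s

lr_rate = files / loadrunner_seconds  # effective files/second under LoadRunner
jm_rate = files / jmeter_seconds      # effective files/second under JMeter

print(f"LoadRunner: {lr_rate:.2f} files/s")
print(f"JMeter:     {jm_rate:.2f} files/s")
print(f"Ratio:      {jm_rate / lr_rate:.1f}x")
```

So the JMeter run is roughly 3.2x faster end to end, which is far too large a gap to attribute to tool overhead alone.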

I am baffled as to why there is such a huge difference in performance.

LoadRunner has the rstat daemon installed on both servers.

Any clues ?

A: 

Most likely the culprit lies in HOW the scripts are structured.

Things to consider:

  • Think/wait time: when recording, JMeter does not automatically insert waits.
  • Items being requested: is JMeter ONLY requesting the HTML pages, while LoadRunner also fetches all embedded resources?
  • Invalid responses: are all 1000 JMeter responses valid? If you ran 1000 threads from a single desktop, I would suspect you overloaded JMeter and not all your responses were valid.
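The first bullet is usually the biggest factor, and it can be sketched with a toy model. The service time and think time below are hypothetical placeholders, not measurements from this application:

```python
# Toy model (not a real load test): each virtual user sends one request.
# If the tool inserts recorded think time before the request, the last
# request -- and therefore the last generated file -- lands later on the clock.
def total_duration(users, ramp_up_s, service_s, think_s):
    """Time until the last user's response completes, assuming users
    start evenly across the ramp-up and the server keeps up."""
    last_start = ramp_up_s * (users - 1) / users  # last user's start offset
    return last_start + think_s + service_s

# Hypothetical numbers: 0.25 s server-side service time per file,
# 3 s of recorded think time (JMeter would record none by default).
with_think = total_duration(1000, 60, 0.25, think_s=3.0)
without_think = total_duration(1000, 60, 0.25, think_s=0.0)
print(with_think, without_think)
```

In a real LoadRunner script the think time is typically repeated at every step of every iteration, so the gap compounds well beyond this single-request sketch.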
BlackGaff
All the JMeter responses are valid; I checked the count of files. Another interesting thing to note is that this call to the server is an asynchronous request. You do not have to wait for the response to come back.
vsingh
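The asynchronous nature of the call matters for how the numbers are read: if the client does not wait for PDF generation, the measured time mostly reflects how fast requests were dispatched, not when the server finished writing files. A minimal sketch of that fire-and-dispatch pattern (the endpoint and 0.01 s round trip are placeholders, not the actual application's URL or timing):

```python
# Sketch: dispatching 1000 requests concurrently without waiting for
# the server-side PDF generation to finish.
import concurrent.futures
import time

def fire(i):
    # In the real test this would hit the PDF-generating URL, e.g.
    # urllib.request.urlopen(f"http://app/generate?doc={i}").read()
    time.sleep(0.01)  # stand-in for the HTTP round trip
    return i

start = time.monotonic()
with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(fire, range(1000)))
elapsed = time.monotonic() - start

# 'elapsed' measures only how quickly requests were dispatched and
# acknowledged; the server may still be generating PDFs after this point.
print(f"dispatched {len(results)} requests in {elapsed:.1f} s")
```

If one tool waits for the full response (or downloads embedded resources) while the other only fires the request, the reported durations are measuring two different things.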