views:

1220

answers:

3

I use Visual Studio Team System 2008 Team Suite for load testing of my web application (it uses ASP.NET MVC).

Load pattern: Constant (this means I have a constant number of virtual users all the time). I specify a configuration of 1000 users to analyze the performance of my web application under really stressful conditions. I run the same load test multiple times while making some changes to my application.

But while analyzing the load test results I come across a strange dependency: when the average page response time gets larger, the requests per second value increases too! And vice versa: when the average page response time is smaller, the requests per second value is smaller. This situation does not reproduce when the number of users is small (5-50 users).
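My expectation comes from a simple closed-loop model: each virtual user waits for the previous response before sending its next request, so with a fixed number of users, slower pages should mean fewer requests per second. A minimal sketch of that expectation (the response times and think time below are illustrative, not measured values):

```python
# Closed-loop model: N virtual users, each waiting for a response
# (plus optional think time) before sending the next request.
# Steady-state throughput = N / (response_time + think_time)  (Little's Law).

def expected_rps(users, avg_response_s, think_s=0.0):
    """Expected requests/sec for a closed-loop load test."""
    return users / (avg_response_s + think_s)

# Illustrative numbers only: 1000 users, no think time.
print(expected_rps(1000, 0.5))  # 2000.0 req/s at 500 ms responses
print(expected_rps(1000, 1.0))  # 1000.0 req/s at 1 s responses: slower pages, fewer req/s
```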

How can you explain such results?

+3  A: 

Perhaps there is a misunderstanding of the term Requests/Sec here. As I understand it, Requests/Sec just represents how many requests the test is pushing into the application (not the number of requests completed per second).

If you look at it that way, this might make sense.

A high Requests/Sec will cause a higher Avg. Response Time (due to a bottleneck somewhere, e.g. CPU bound, memory bound, or IO bound).

So as your Requests/Sec goes up and you have tons of objects in memory, the memory comes under pressure, triggering garbage collection that slows down your response time.

Or as your Requests/Sec goes up and your CPU gets hammered, requests might have to wait for CPU time, making your response time higher.

Or as your Requests/Sec goes up and your SQL is not tuned properly, blocking and deadlocking occur, making your response time higher.

These are just examples of why you might see this correlation. You might have to track it down further in terms of CPU, memory usage, and IO (network, disk, SQL, etc.).
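To illustrate the distinction I mean (a purely hypothetical toy sketch, not how VSTS actually computes its counters): if the tool reports requests *issued* per second rather than requests *completed*, the issued rate can stay high while a backlog builds up and responses slow down.

```python
# Toy open-loop simulation: requests are *issued* at a fixed rate,
# but the server *completes* at most `capacity` requests per second.
# The issued/sec figure stays high while the backlog (and wait time) grows.

def simulate(issue_rate, capacity, seconds):
    backlog = 0
    for t in range(seconds):
        backlog += issue_rate               # requests pushed in this second
        completed = min(backlog, capacity)  # the server can only do so much
        backlog -= completed
        avg_wait = backlog / capacity       # rough time to drain the backlog
        print(f"t={t}s  issued/s={issue_rate}  completed/s={completed}  "
              f"backlog={backlog}  approx_wait={avg_wait:.1f}s")

simulate(issue_rate=1200, capacity=1000, seconds=5)
```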

Jimmy Chandra
The trouble is that my test configuration is the same all the time (meaning the number of requests the test pushes into the application is the same each time I run the load test). So if my changes made the application work more slowly (page response time became higher), that should make the total test time higher and requests/sec lower.
Why does requests/sec go up if I run the same test and the page response time is higher? Your answer would be right if I had varied the number of virtual users, but I do not change the load test configuration.
A: 

A few more details about the problem: we are load testing our rendering engine [NDjango][1] against standard ASP.NET aspx pages. The web app we are using for the load test is very basic - it consists of 2 static pages - no database, no heavy processing, just rendering. What we see is that, in terms of avg response time, aspx is, as expected, considerably faster; but to my surprise the number of requests per second, as well as the total number of requests for the duration of the test, is much lower.

Leaving aside what we are testing against what, I agree with Jimmy that a higher request rate can clog the server in many ways. But it is my understanding that this would cause the response time to go up - right? If the numbers we are getting really reflect what's happening on the server, I do not see how this rule can be broken. So for now the only explanation I have is that the numbers are skewed - something is wrong with the way we are configuring the tool.

[1]: http://www.ndjango.org "NDjango"

mfeingold
You are using too many virtual users for the test machine. Add more test machines (you need a licence for the load agent) or reduce the number of virtual users.
Nat
A: 

This is a normal result: as the number of users increases, you load the server with a higher number of requests per second. Any server will take longer to deal with more requests per second, meaning the average page response time increases.

Requests per second is a measure of the load being applied to the application, and average page response time is a measure of the application's performance, where a high number = slow response.

You will be better off using a stepped number of users or a warm-up period where the load is applied to the server gradually.

Also, with 1000 virtual users on a single test machine, the CPU of the test machine will be absolutely maxed out. That is most likely what is skewing the results of your testing. Playing with the number of virtual users, you will find that there is a point where requests per second max out; adding or taking away virtual users beyond that point will result in fewer requests per second from the app.
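A rough sketch of that saturation point (a toy closed-loop model with illustrative numbers; `CAPACITY_RPS` stands in for whatever the server - or the overloaded test rig - can actually sustain): below the knee, more users mean more requests per second; past it, extra users only inflate the response time.

```python
# Throughput saturation in a closed-loop test: below the capacity limit,
# requests/sec grows with the number of users; past it, requests/sec stays
# flat and the response time inflates instead.

BASE_RESPONSE_S = 0.05  # illustrative service time per request
CAPACITY_RPS = 800      # illustrative capacity limit

def steady_state(users):
    unconstrained = users / BASE_RESPONSE_S  # rate the users could drive
    rps = min(unconstrained, CAPACITY_RPS)   # capacity caps the throughput
    resp = users / rps                       # Little's Law: R = N / X
    return rps, resp

for n in (10, 40, 100, 500, 1000):
    rps, resp = steady_state(n)
    print(f"{n:5d} users -> {rps:7.0f} req/s, avg response {resp * 1000:6.0f} ms")
```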

Nat