I'm running some load tests from Visual Studio against a WCF service and I would like some help interpreting/analyzing the results.
After enabling the performance counters in web.config, the host has been providing us with data for the following counters: "Calls Duration" and "Calls Per Second".
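For reference, the counters were switched on with something like the snippet below (a minimal sketch of the relevant web.config section; the exact diagnostics settings on the host may differ):

```xml
<configuration>
  <system.serviceModel>
    <!-- "ServiceOnly" publishes the per-service counters,
         including "Calls Duration" and "Calls Per Second" -->
    <diagnostics performanceCounters="ServiceOnly" />
  </system.serviceModel>
</configuration>
```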
I've assumed that "Calls Duration" is the figure I need to analyze, since "Test Time" (inside Visual Studio) implicitly depends on the latency of the call over the internet. The data provided by the host is sampled once per second.
- What is the relationship between the load (the number of users) and the value of "Calls Duration"? For example, if I have a constant load pattern of 10 users and a corresponding "Calls Duration" of 0.037, does this mean that 0.037 is the average time taken to process each call?
- Is there an "accepted" or "standard" maximum value for "Calls Duration"?
- Is "Calls Per Second" a value for throughput? For example, if the value is "0.9862" what does this tell me?
The objective of the tests is to find the limit of the service, i.e. to be able to say it will support XXX users.
All help is greatly appreciated.
Thanks,
Jose