Hi,

I'm writing a web app that uses a MySQL database. I want to report the running time of a particular query in a way that's useful to other developers trying to do the same thing. The point is to give them an idea of the cost of this query if they try the same web app pattern.

What is a good way to do this? I could run the query N times and average the results, or modify the dataset to produce expected-, best-, and worst-case scenarios. Is any of that actually useful to other developers, though? Is there some other way to go about this?
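If it helps, here's roughly what I mean by running it N times, using MySQL's session profiler (the table and column names are just placeholders):

    -- Enable per-statement profiling for this session
    -- (deprecated since MySQL 5.6.7, but still available).
    SET profiling = 1;
    SELECT x FROM people WHERE age > 13;
    SELECT x FROM people WHERE age > 13;
    SELECT x FROM people WHERE age > 13;
    SHOW PROFILES;  -- one row per statement, with its duration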

I see that MySQL Query Browser reports the time it took to run the query. Is that all that's needed to provide an accurate report?

I understand the same pattern will have different run times on different architectures.

Thanks

A: 

Determine the number of logical reads used by the query. This won't fluctuate like elapsed time will.
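In MySQL you can approximate this with the session Handler_read_* status counters (a sketch; the table and column are placeholders):

    FLUSH STATUS;                              -- reset session counters
    SELECT x FROM people WHERE age > 13;
    SHOW SESSION STATUS LIKE 'Handler_read%';  -- row-level read counts

Handler_read_rnd_next counts rows read during a full scan, and Handler_read_next counts rows read via an index, so together they give a stable picture of how many rows the query touched, independent of machine speed.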

Mitch Wheat
Hi Mitch, can you elaborate on this? Does a row fetch count as a read? For example, with a simple select [select x from table where age > 13], how many reads would that generate? At least one read per row where age > 13, but does each row with age < 14 also have to be 'read' to apply the filter?
depends on the row sizes...
Mitch Wheat
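On that follow-up question: EXPLAIN will show whether the non-matching rows get scanned for a specific table (a sketch with a hypothetical table):

    -- Hypothetical table matching the example in the comment.
    CREATE TABLE people (x INT, age INT);

    -- With no index on age, EXPLAIN reports type = ALL: a full table
    -- scan, so the age < 14 rows are read too and filtered afterwards.
    EXPLAIN SELECT x FROM people WHERE age > 13;

    -- After adding an index, the plan can become a range scan that
    -- touches only the matching index entries.
    CREATE INDEX idx_age ON people (age);
    EXPLAIN SELECT x FROM people WHERE age > 13;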