views: 568

answers: 6

Hi guys,

I guess no one will have a definitive answer to this, but considered predictions would be appreciated.

I am in the process of developing a MySQL database for a web application, and my question is:

Is it more efficient to make a single query that returns a single row using AJAX

or

To request 100 - 700 rows when the user will likely only ever use the results of two or three?

Really I am asking: what is heavier for the server, 2-3 requests with one result each, or 1 request with 100-700 results?

Thanks,

Mark

+1  A: 

If you only send requests for single rows, you will have more stages where caching the response is possible. If the data is not likely to change within minutes, the result can also be cached by the client by specifying an expiry date. For bigger responses you can even use ETags, which can save you a lot of bandwidth.

Zed
+1  A: 

It depends on the size of the returned data.

If your full result set is larger than, um... let's say 100 KB, your users will likely get a better experience from the Ajax version pulling a limited number of rows. Rendering large result sets also takes a lot longer in the browser.

It's a heavier database load to make several small queries, but bandwidth goes down significantly, so it appears faster to the user. The execution time for 10 versus 700 rows on the database side is probably pretty similar, but the transfer and rendering times are going to be significantly different.

Daren Schwenke
A: 

Returning 700 rows would be more than two orders of magnitude larger than returning 3 results. If you know your average user will only be interested in 5 or fewer results, it's safe to assume you can return those results up front and save yourself the trouble of wasting bandwidth on the extra 695 results nobody really cares about.

It is, after all, about providing users exactly what they want. If 700 is overkill, simply cut it back and save yourself the trouble.

If each row of data is 1kb, which may be an overestimation, that is a difference of 5kb per request (plus headers) vs. 700kb. That would add up very quickly given a number of users.
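The back-of-the-envelope math above, sketched out (the 1 KB per-row size is this answer's own assumption, not a measured figure):

```javascript
// Rough bandwidth comparison, assuming ~1 KB per row as above.
const bytesPerRow = 1024;
const smallResponse = 5 * bytesPerRow;   // the ~5 rows the user actually wants
const bigResponse = 700 * bytesPerRow;   // the full result set

const wastedPerRequest = bigResponse - smallResponse;
console.log(wastedPerRequest / 1024 + ' KB wasted per request'); // 695 KB wasted per request

// Over, say, 10,000 requests a day that waste adds up fast.
console.log((wastedPerRequest * 10000 / (1024 * 1024)).toFixed(1) + ' MB/day');
```

At hypothetical volumes like 10,000 requests a day, those 695 KB per request become gigabytes per week.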

cballou
+1  A: 

You may want to time how long it takes to get the data from the database. What counts as too long will need to be determined by you, but I would guess anything over 10 seconds; in that case you may want to break it into smaller requests, and 700 rows at once would be too much. Decide how long you want each request to take, try to get it under 1 second, and see how many requests that works out to.

Profiling is important, as the database may not be the problem. If you are building a large table, for example with 1000 rows, then fully creating the table and rendering it will take a while.

Your best bet is to use Firebug, in Firefox, to see how long it actually takes to get the data. Since you have already profiled how long the database query itself takes, you can then see how long it takes to display the data once the response comes back from the server.

Firebug also lets you profile the JavaScript, so you can tell whether you really need to optimize the database at all.
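A quick-and-dirty way to separate those timings yourself, if you don't want to rely on the profiler alone (a minimal sketch; the function names are illustrative):

```javascript
// Wrap any step (query, parse, render) to log how long it took.
function timed(label, fn) {
  const start = Date.now();
  const result = fn();
  console.log(label + ': ' + (Date.now() - start) + ' ms');
  return result;
}

// e.g. timed('fetch', () => runQuery()); timed('render', () => buildTable(rows));
// Here a stand-in "render" step builds a 1000-element structure.
const rows = timed('build', () => Array.from({ length: 1000 }, (_, i) => i));
console.log(rows.length); // 1000
```

Timing each stage separately tells you whether the database, the transfer, or the browser-side rendering is the real bottleneck.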

James Black
A: 

I pretty much agree with what everyone else has had to say. I will add this, though. It really depends on your data and how you intend to use it and/or cache it. If the 700-1000 rows can be cached in a hash table and everyone will pull bits from that set, then a combination of the two approaches would work. Pull and cache the big set periodically (invalidate the cache as needed) but only push what a particular client needs via Ajax. That gives you the advantage of a single large query (which is typically more efficient) but only pushing the data the client needs (since the network is usually the biggest bottleneck).
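A minimal sketch of that hybrid approach, assuming a simple in-memory cache with a time-to-live (all names here are illustrative, not from the original):

```javascript
const TTL_MS = 60 * 1000; // invalidate after a minute; tune to your data

// One big query result cached server-side; clients only ever get slices.
let cache = { rows: null, loadedAt: 0 };

// Stand-in for the single large database query.
function queryAllRows() {
  return Array.from({ length: 700 }, (_, i) => ({ id: i + 1 }));
}

function getRows() {
  const now = Date.now();
  if (!cache.rows || now - cache.loadedAt > TTL_MS) {
    cache = { rows: queryAllRows(), loadedAt: now }; // refresh the big set
  }
  return cache.rows;
}

// What a single Ajax request actually returns: a small slice of the cache.
function getPage(offset, limit) {
  return getRows().slice(offset, offset + limit);
}

console.log(getPage(0, 3).map(r => r.id)); // [ 1, 2, 3 ]
```

The database sees one large query per TTL window, while each client only receives the handful of rows it asked for.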

And always do what James Black says...profile. That's the only way to know for sure.

Michael Johnson
+1  A: 

Well, there are certain things to consider here.

Making several separate queries and then processing the results in your PHP script is obviously going to incur the overhead of network latency, as the data has to travel back and forth between the client and the MySQL server.

But since, as you say, the user will most probably only view a few of the results, it's better to limit the result set to, say, 10 records and trigger some kind of event to fetch more. One way of doing it would be to have a "more" link after the 10 records which fetches another 10, and so on.
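One way to sketch that "more" link idea is a helper that builds a LIMIT/OFFSET query for each click (the table and column names are made up for illustration):

```javascript
const PAGE_SIZE = 10;

// Build a parameterized query for page n (0-based).
// LIMIT/OFFSET go in as bound parameters, never string-concatenated.
function pageQuery(page) {
  return {
    sql: 'SELECT id, title FROM results ORDER BY id LIMIT ? OFFSET ?',
    params: [PAGE_SIZE, page * PAGE_SIZE],
  };
}

// First page on load, next page each time the user clicks "more".
console.log(pageQuery(0).params); // [ 10, 0 ]
console.log(pageQuery(2).params); // [ 10, 20 ]
```

Each click maps to one cheap query, so the server never ships the 690 rows nobody scrolls to.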

geeko