
I have an area that gets populated with about 100 records from a PHP/MySQL query.

I need to add AJAX pagination: load the first 10 records, then when the user clicks something like a "+" control, load the next 20.

So instead of displaying everything at once, the area will display 20 records, then on click the next 20, and so on, with no refresh.

Should I dump the 100 records into a session variable?

Should I run the MySQL query every time the user clicks the next-page trigger?

I'm unsure what the best approach is, or how to go about this.

My concern is that the table will grow from 100 records to 10,000.

Any help or direction is greatly appreciated.

+1  A: 

Your AJAX call should request a page that pulls only the exact rows it needs from the database: select the first 20 rows for the first page, and so on. The AJAX call can take a parameter called pagenum, and that determines which records you actually pull. There's no need for session variables.
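A minimal sketch of what that backend page might look like, using PDO with a LIMIT/OFFSET query. The table name `records`, its columns, and the connection details are all assumptions for illustration:

```php
<?php
// ajax.php?pagenum=1  --  returns one page of rows as JSON.
// Assumed: a `records` table with `id` and `title` columns.
$perPage = 20;
$pagenum = isset($_GET['pagenum']) ? max(1, (int)$_GET['pagenum']) : 1;
$offset  = ($pagenum - 1) * $perPage;

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare(
    'SELECT id, title FROM records ORDER BY id LIMIT :limit OFFSET :offset'
);
// Bind as integers: LIMIT/OFFSET cannot be quoted strings.
$stmt->bindValue(':limit',  $perPage, PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset,  PDO::PARAM_INT);
$stmt->execute();

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```

Because only one page of rows is fetched per request, this scales the same whether the table holds 100 records or 10,000.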

Caimen
I'm lost on the last part. So I keep pagenum in a session variable and then update it with the AJAX call, correct?
Codex73
I generally pass it as a query string, so my AJAX page would be something like ajax.php?pagenum=1.
Caimen
The AJAX page should return one of the following to be displayed: XML, JSON, or just straight HTML.
Caimen
OK, so each call will pass pagenum to the script, and the script will return, let's say for page two, records 21–40, sending back the incremented pagenum?
Codex73
+1  A: 

If you have a large record set that will be viewed often (but not often updated), look at APC to cache the data and share it among sessions. You can also write the data to a static file that is regenerated whenever the data changes.
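A rough sketch of that caching idea, assuming the APCu extension (the successor to APC's user cache) and a hypothetical load_records() helper that performs the real query:

```php
<?php
// Cache the full record set in shared memory so every request
// reuses one copy instead of re-querying MySQL each time.
// load_records() is a hypothetical function that hits the database.
function get_records() {
    $rows = apcu_fetch('all_records', $success);
    if (!$success) {
        $rows = load_records();               // query MySQL once
        apcu_store('all_records', $rows, 300); // cache for 5 minutes
    }
    return $rows;
}
```

The TTL is the trade-off knob: a short one keeps the data fresher, a long one keeps more load off the database.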

If the data needs to be sorted or manipulated on the page, you will want to limit the number of records loaded to keep the JavaScript from running too long. ExtJS has some nice widgets that do this; just provide them with JSON data (use PHP's json_encode() on your record set). We made one talk to Oracle and do the 20-record paging fairly easily.
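Feeding a paging widget like that usually comes down to encoding the current page plus a total count, roughly like this (the field names and variables are assumptions; check your grid's expected response shape):

```php
<?php
// Typical shape a paging grid expects: total row count for the
// pager, plus just the rows belonging to the current page.
$response = array(
    'total' => $totalRows,  // e.g. from a SELECT COUNT(*) query
    'rows'  => $pageRows,   // the 20 records for this page
);
header('Content-Type: application/json');
echo json_encode($response);
```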

If your large record set is frequently updated, and the data must be "real time" accurate, you have some significant challenges ahead. I would look at Comet, APE, or web workers for a polling/push solution, and build your API to deal only in updates to the "core" data, which again is probably cached on the server rather than pulled from the DB every time.

AutoSponge
Thanks for such a detailed answer.
Codex73