views:

93

answers:

5

I was wondering if it would be possible to use jQuery's ajax function to continually return a result to a page and update accordingly.

What I mean is this:

- User fires off a search.
- jQuery uses the ajax function to get the first 25 results and adds them to a table.
- While the user is looking at that list, jQuery keeps grabbing results 25 at a time and adding them to the table.

The idea behind this is, say your user's search has 10,000 results. I would like to load them into a table whose paging is controlled by JavaScript, so that we don't have to go back to the server each time the user wants to go to the next page. In the time it takes a user to look at the first 25 results, we could probably have the next 50 loaded, which means we will have a very snappy-looking interface with all the results.

Now there are definitely some downsides to this:

- The user can't immediately go to "Last".
- The user can't sort the table right away (or if they do, they might immediately have a table that is not sorted correctly).

Still, I think this is an interesting idea and would love to try it out...but I have no idea where to begin.

How do you make the ajax function continue to run until a certain result happens? Can you keep adding the results to a table and have that table continually update without giving the user a bad interface experience?

A: 

jQuery provides callbacks for $.post, $.get and $.ajax. So you can write a function to handle the returned data and keep passing that same function as the callback until the result sets are finished loading.

If the user does try to sort the table or jump down the results list, the table would need to reset, and the next callback would need to detect the sort/paging change and send those variables to the server.

Something like this:

var page_number = 1;
var sort_by = 'column_name';
var sort_direction = 'asc';

function handle_result(data) {
  // Do something with the result/return data from the POST
  // ...
  // then $.post again while there are more rows
  if (data.has_more_rows) {
    page_number++;
    $.post('process.php', { page_number: page_number, sort_by: sort_by, sort_direction: sort_direction }, handle_result);
  }
}

// The initial post . . .
$.post('process.php', { page_number: page_number, sort_by: sort_by, sort_direction: sort_direction }, handle_result);
pygorex1
A: 

Can't you use a callback? (pseudocode)

var keepCalling = function() {
  $.getJSON(url, function(data) {
    if (data.last) {
      // Finished -- stop requesting
    } else {
      // JavaScript has no sleep(); wait 5 seconds, then request the next batch
      setTimeout(keepCalling, 5000);
    }
  });
};
eipipuz
A: 

You could use a callback and then, in the returned data, just have a flag that says whether or not it's the last set of data. However, there are some downsides here you'll want to consider. Storing information in JavaScript is not free, so if you're storing 10,000 results, that's going to cost you something. How much will depend on the size of a record. The other issue you might run into is that if the user tries to do something else while your loop happens to be parsing the response of a few records, they'll notice a delay, because JavaScript is single-threaded. So it's possible that by trying to make your page snappier, you'll actually make it slower. Finally, keep in mind that you'll be making a lot of AJAX requests that won't ever be used, so you'll be putting some unnecessary load on your servers.
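
If you do load large batches, one way to soften the single-threaded problem is to append rows in small chunks and yield back to the browser between chunks. A rough sketch, where the table id, the row field and the chunk size are all placeholders, not anything from the question:

function appendRowsInChunks(rows, chunkSize) {
  // rows: array parsed from the AJAX response; chunkSize: e.g. 50
  var i = 0;
  function doChunk() {
    var end = Math.min(i + chunkSize, rows.length);
    for (; i < end; i++) {
      // '#results' and the 'name' field are placeholders for your real table/columns
      $('#results tbody').append('<tr><td>' + rows[i].name + '</td></tr>');
    }
    if (i < rows.length) {
      setTimeout(doChunk, 0); // give the browser a chance to handle clicks/repaints
    }
  }
  doChunk();
}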

Usually the compromise folks make is to preload and cache only some of the data, not all of it. So I'd suggest preloading maybe the next page or two and then caching a few of the pages so if the user backs up, that's fast. The YUI datasource is a nice piece of code that handles some of these complexities.
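
As a rough illustration of that compromise (not the YUI datasource itself), a small client-side cache that fetches the current page on demand and quietly prefetches the next one might look something like this; the URL, parameter names, data.rows and renderRows() are all assumptions:

var pageCache = {};   // page number -> array of rows already fetched
var PAGE_SIZE = 25;

function fetchPage(page, callback) {
  // 'search.php', 'page' and 'per_page' are made-up server details
  $.getJSON('search.php', { page: page, per_page: PAGE_SIZE }, function(data) {
    pageCache[page] = data.rows;
    if (callback) callback(data.rows);
  });
}

function showPage(page) {
  if (pageCache[page]) {
    renderRows(pageCache[page]);   // renderRows() is whatever draws your table
  } else {
    fetchPage(page, renderRows);
  }
  if (!pageCache[page + 1]) {
    fetchPage(page + 1);           // warm the cache so "next" feels instant
  }
}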

Sorry to play devil's advocate here, just some things to keep in mind.

Bialecki
A: 

It seems that both solutions provided will fire off one ajax call for every 25 results. So if there are 10,000 records, that would mean 400 ajax calls...per page load! This kind of defeats the purpose of the ajax call.

A better solution might be to load enough data for one page beyond the current page. Once the user reaches the last loaded page, load another batch of data, always staying one page ahead of the user.
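
A minimal sketch of that stay-one-page-ahead idea; the endpoint, parameter name and what you do with data.rows are assumptions:

var lastLoadedPage = 2;   // assume pages 1 and 2 were fetched up front

// 'results.php' and the 'page' parameter are made-up server details
function loadPage(page) {
  $.getJSON('results.php', { page: page }, function(data) {
    // cache/append data.rows for later display
  });
}

function onPageChange(newPage) {
  // render newPage from the data already on the client, then top up the buffer
  if (newPage === lastLoadedPage) {
    lastLoadedPage++;
    loadPage(lastLoadedPage);
  }
}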

czarchaic
A: 

The best way to do something like this is not to continually grab as many results as possible, but instead only grab what your user will need.

In a basic AJAX paging table that displays 25 results per page, you would grab the first page on pageload, and when your user hits the page 2 link, grab 26-50, page 3 = 51-75, etc... If you wanted to be a bit smarter and make the app a bit snappier, you'd grab the next set while they are looking at the current set.

function get_results(start_record) {
    // pass the offset to the server (the 'start' parameter name is just an example)
    $.get("results.php", { start: start_record }, function(data) {
        // do something with the data
    });
}

Events:

// Page load

$(document).ready(function() {
    get_results(0);  // page 1
    get_results(25); // page 2
});


// When user clicks second page link, results appear instantly, meanwhile...
get_results(50); // 3rd page data

If you really wanted to, you could grab 100 at a time. As others have mentioned, though, be careful about how much you are pulling per chunk; grabbing too much can defeat the purpose of using AJAX.

Also, if you are not going to need some of the data any more, delete it (drop the property that holds it, or set the variable to null) to free up some memory.
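
For example (cachedResults and someRows are made-up names; in JavaScript, delete removes object properties, while plain variables are freed once nothing references them):

var cachedResults = {};
cachedResults[1] = [ /* rows for page 1 would go here */ ];

// Later, when page 1 is no longer needed:
delete cachedResults[1];   // removes the property so the rows can be garbage-collected

// For a plain variable, just drop the reference instead:
// someRows = null;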

Derek Gathright