I'm working on a website where the frontend is powered by AJAX, interacting with the server in the ordinary RESTful manner and receiving responses as JSON.
It's simple enough to manage POST, DELETE, and PUT requests, since these won't differ at all from a traditional approach. The challenge comes in GETting content. Sometimes, the client needs to GET one resource from the server. Sometimes more.
Is there a performance issue with firing off each individual GET request asynchronously and populating the DOM elements as the responses come in?
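For reference, here's roughly what I mean by the naive approach, with each GET fired off independently (the `/api/*` endpoint names are made-up stand-ins for my actual resources):

```typescript
// Hypothetical endpoints; each fetch is an independent GET with its own round trip.
const endpoints = ["/api/users", "/api/pages", "/api/current_user"];

async function loadAll(): Promise<unknown[]> {
  // Fire all requests concurrently rather than awaiting them one by one.
  const responses = await Promise.all(endpoints.map((url) => fetch(url)));
  return Promise.all(responses.map((res) => res.json()));
}

loadAll().then(([users, pages, currentUser]) => {
  // Populate the relevant DOM elements once the responses arrive.
  console.log(users, pages, currentUser);
});
```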
I imagine so (but correct me if I'm wrong), and that a performance gain is possible by, say, providing an array of API queries in one request and having the server reply with a corresponding JSON array of responses.
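Something like this sketch is what I have in mind; the `/api/batch` endpoint and the response shape are purely hypothetical:

```typescript
// Hypothetical batch endpoint: one GET carries several query names,
// and the server replies with a matching array of JSON results.
type BatchResult = { query: string; data: unknown };

async function loadBatch(queries: string[]): Promise<BatchResult[]> {
  const url = `/api/batch?q=${encodeURIComponent(queries.join(","))}`;
  const res = await fetch(url); // single round trip for all queries
  return res.json();
}

loadBatch(["get_users", "get_pages", "get_current_user"]).then((results) => {
  for (const { query, data } of results) {
    console.log(query, data); // populate the DOM element for each query
  }
});
```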
POST, DELETE, and PUT are semantically incorrect for this sort of task, and a GET request seems wrong too. After all, sending a GET request to /ajax_handler?q=get_users,get_pages,get_current_user seems kind of weird, since I'm used to seeing a GET request address a single resource.
Yet another alternative is to prepare all the relevant data for every GET query up front (as you would for a regular non-AJAX page) and send it all together, leaving the client to figure out what's significant or new, possibly via a last-modified item in each JSON array.
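Concretely, the combined payload might look something like this on the client side (all the field names here are invented for illustration):

```typescript
// Invented shape: every section carries its own lastModified stamp so the
// client can skip re-rendering sections that haven't changed.
interface Section<T> {
  lastModified: string; // e.g. an ISO 8601 timestamp
  items: T[];
}

interface CombinedResponse {
  users: Section<{ id: number; name: string }>;
  pages: Section<{ id: number; title: string }>;
}

function renderIfChanged<T>(
  section: Section<T>,
  lastSeen: string,
  render: (items: T[]) => void
): void {
  // ISO 8601 strings compare correctly as plain strings.
  if (section.lastModified > lastSeen) {
    render(section.items); // only touch the DOM for stale sections
  }
}
```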
For fear of this being closed as subjective, my specific question is: is there a semantically sound way to use one GET request to fetch multiple, only distantly related pieces of data from a server, and is the performance gain even worth it?