views: 110
answers: 3

Hi all,

I have a form with 25+ fields whose values range from tiny values to long concatenated strings. It works like a search tool: when the user fills in the form and submits it, he sees all the records that match the criteria in the form. I display 15 records at a time and have implemented pagination so the user can see the remaining records as well.

THE MAIN PROBLEM:

Everything works up to the point where the user submits the form and gets the first set of results. The problem arises when the user tries to go to the 2nd page (or any page of his choice) via pagination. He can navigate to the other pages, but the query needed to pull the results from the DB is not triggered. Note that the initial form submission was a POST, while pagination performs a GET. So I lose the values the user entered in the form, and I want to retain those values and query the DB with them.

I am trying to avoid sending the form field values via GET because I fear the data may exceed the maximum permissible URL length (and because GET seems less secure than POST). Other operations can be performed on the results page (such as an update query) that could cause the form values to be lost if I tried to use POST. Sessions would not really work, because the user may run the same form in different tabs with different inputs to compare results, and the data from an older query would be replaced by the data from a newer one. I have not considered cookies, as the user may have chosen to block them. Pretty much all the options seem exhausted.

So how can I retain the form values, run the proper query, and get back the relevant results, no matter how many times the same user submits the form in different browser tabs/windows, without using sessions (given the restrictions on passing data via GET and the risk of losing it in POST operations), while still being able to perform other activities on the page?

Thank you in advance.

+1  A: 

First off, GET is not "less secure" than POST. Neither can be trusted at all (a URL string is more inviting to modify, but POST is by no means harder to tamper with)...

You have a few options:

One would be to keep a global "store" of search results. You can use a db table with columns id and data, where data is a serialized array of the form variables. When someone submits a search, check whether that data is already in the table. If so, use its id. If not, serialize the data and insert it (getting a new id). Then redirect to a page like results.php?id=4. That way the link can be shared, yet it stays reasonably safe from tampering (the search params can't be altered). The downside here is that the table could grow HUGE.
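A minimal PHP sketch of that idea, assuming PDO and an illustrative `searches(id, data)` table (names are assumptions, not from the original post):

```php
<?php
// Store (or reuse) a serialized search and return its id.
// Assumed schema: searches(id INT AUTO_INCREMENT PRIMARY KEY, data TEXT).
function storeSearch(PDO $db, array $params): int
{
    $data = serialize($params);

    // Reuse an existing row if this exact search was run before.
    $stmt = $db->prepare('SELECT id FROM searches WHERE data = ?');
    $stmt->execute([$data]);
    $id = $stmt->fetchColumn();
    if ($id !== false) {
        return (int) $id;
    }

    $stmt = $db->prepare('INSERT INTO searches (data) VALUES (?)');
    $stmt->execute([$data]);
    return (int) $db->lastInsertId();
}

// On form POST:
//   $id = storeSearch($db, $_POST);
//   header('Location: results.php?id=' . $id . '&page=1');
// results.php then unserializes the stored data and rebuilds the query
// for whatever page= the pagination links request.
```

Since only the id travels in the URL, every tab carries its own search and the pagination links stay short.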

Another would be to base64-encode the data and pass it through as a GET parameter (base64_encode(serialize($data));). I would stay away from this if you're concerned about tampering or URL length.
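A round-trip sketch of that encoding approach (the `q` parameter name is just an example):

```php
<?php
// Pack the search criteria into the URL itself.
$data = ['keyword' => 'test', 'min_price' => 10];

$token = base64_encode(serialize($data));
// urlencode() is needed because base64 output contains '+' and '='.
$url = 'results.php?page=2&q=' . urlencode($token);

// On the results page, reverse the process.
// Caution: the token is user-controlled, so the criteria can be tampered
// with, and unserialize() on untrusted input is risky in general.
$restored = unserialize(base64_decode($_GET['q'] ?? ''));
```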

Another solution would be to intercept the next-page link click in JS and use it to issue a POST back to your server from hidden form fields.

EDIT: Removed the session solution. Realized that it wouldn't work for your problem.

ircmaxell
@ircmaxell Thank you for the reply. Storing in the DB sounds good. I would just have to keep tabs on the table. I will give some more thought to how it can be done with this approach. Thanks.
Devner
@ircmaxell How will an index on the data column (the one holding the serialized array of variables) affect our searches? An index on it would definitely be faster, but considering the length of the data, what do you suggest?
Devner
Well, create two indexes: a PK on `id`, and an index on `data`. That way both operations are simple index lookups... Since the index on data will be truncated (use a TEXT column for the `data` field, so the index covers roughly the first 255 chars), it's not a true full-value lookup, but it should still be plenty fast even with tens or hundreds of thousands of searches...
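An illustrative MySQL schema for that two-index layout (table and index names are assumptions):

```sql
-- searches table with a PK on id and a prefix index on data.
CREATE TABLE searches (
    id   INT UNSIGNED NOT NULL AUTO_INCREMENT,
    data TEXT NOT NULL,
    PRIMARY KEY (id),
    KEY idx_data (data(255))  -- prefix index: only the first 255 chars are indexed
) ENGINE=InnoDB;
```

With the prefix index, the `WHERE data = ?` lookup narrows candidates by the first 255 characters and then compares the full values, which is usually fast enough here.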
ircmaxell
@ircmaxell Thank you for the clarification. Appreciate it.
Devner
A: 

I am trying to avoid sending the form field values via GET because I fear that the data may exceed the maximum permissible value in the URL

Then you're going to have to do your pagination with a form as well, so the page links also POST. Either that, or you'll have to store query terms on the server side somewhere, possibly in a session -- but don't forget that a user might have multiple tabs open at once, so you need to be able to store more than one query per user.

But there's no reason you can't store multiple queries in a single session, e.g. $_SESSION['queries'][1234]. Then your pagination links would look like ?query=1234&page=3
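A sketch of that multi-query session layout, assuming a `results.php` endpoint (the key scheme is illustrative):

```php
<?php
session_start();

// On form POST: store this tab's criteria under a fresh key.
$queryId = mt_rand(1000, 9999);  // simple illustrative key
$_SESSION['queries'][$queryId] = $_POST;

header("Location: results.php?query={$queryId}&page=1");
exit;

// results.php: each tab carries its own ?query= id in the URL,
// so concurrent tabs no longer overwrite each other's criteria:
//   $params = $_SESSION['queries'][(int) $_GET['query']] ?? null;
```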

Consider, however, that it may be useful for users to be able to share URLs of their search results. E.g., if Google used POST exclusively, I couldn't send you a link to http://google.com/search?q=somequery

(& as it is less secure than a POST operation)

Not really.

Frank Farmer
@Frank Thank you for the reply. Sessions are less useful in my case. Could you please explain what you mentioned about sharing the URLs? I kind of lost you on that.
Devner
Say your user, Bob, wants to share the page of results he's looking at in your application with his coworker, Sally. For that to work well, the URL in the address bar must contain enough information to reconstruct the whole query.
Frank Farmer
Thank you for the clarification.
Devner
A: 

A possible approach would be to store the whole query used for searching in your database, in a table called, say, search_queries. That table should contain essentially two columns: a hash and the query used for that search item.

When a user submits the search form, his query is evaluated and inserted into that table, and he is redirected to a page carrying his search hash. Every time he navigates to a different page, the query is pulled from the database by that hash and the results are re-evaluated accordingly -- with the proper LIMIT, of course.
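A PDO sketch of that hash-keyed store, assuming a `search_queries(hash, query, created_at)` table with a unique index on `hash` (all names are assumptions):

```php
<?php
// Remember a search under a deterministic hash and return the hash,
// which then travels in the pagination URL (e.g. results.php?h=...&page=2).
function rememberQuery(PDO $db, array $criteria): string
{
    $payload = serialize($criteria);
    $hash = sha1($payload);

    // INSERT IGNORE (MySQL) makes re-submitting the same search a no-op,
    // relying on the assumed UNIQUE index on hash.
    $stmt = $db->prepare(
        'INSERT IGNORE INTO search_queries (hash, query, created_at)
         VALUES (?, ?, NOW())'
    );
    $stmt->execute([$hash, $payload]);

    return $hash;
}
```

Because the hash is derived from the criteria, identical searches map to the same row, which keeps table growth down.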

Make sure you prune that table with a cron job (for that you might need a timestamp on each search item).

Another viable implementation of this approach would be to store the query in a SESSION variable and use it for your querying purposes. For pagination you would request /search?page=1, and your $_SESSION['query'] would be, for example, "SELECT * FROM Topics WHERE title LIKE '%test%'". You would essentially append "LIMIT " . (($page - 1) * $perpage) . ", $perpage".

However, the latter approach cannot distinguish between multiple windows the user has open on the site. You could use an array in $_SESSION['queries'] and have the user request /search?id=0&page=1, where id identifies which query in the array that window is using.
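The per-window variant can be sketched like this (endpoint and variable names are illustrative; requires PHP 7.3+ for array_key_last):

```php
<?php
session_start();

$perPage = 15;  // records per page, as in the question

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // Each submission appends a new slot; the slot index travels as ?id=N,
    // so every window keeps its own criteria.
    $_SESSION['queries'][] = $_POST;
    $id = array_key_last($_SESSION['queries']);
    header("Location: /search?id={$id}&page=1");
    exit;
}

$id       = (int) ($_GET['id'] ?? 0);
$page     = max(1, (int) ($_GET['page'] ?? 1));
$criteria = $_SESSION['queries'][$id] ?? null;
$offset   = ($page - 1) * $perPage;  // feed into LIMIT $offset, $perPage
```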

msakr
@mahmoudsakr Thank you for the reply. Sessions are not much use in my case. I have thought of storing the data in the DB as ircmaxell suggested, and I see you have suggested a similar approach in a slightly different way. Thanks for the same.
Devner