views: 171
answers: 13

In the design process of my framework, I have come to a point where I am considering merging POST and GET parameters into one single $parameters variable.

The advantage for the developer: the framework filters all parameter values to secure against XSS attacks (i.e. funny kids inserting bad JavaScript code to redirect visitors to a spam site) and other sorts of useful validation / filtering.

But as usual: is there any real advantage to keeping POST and GET separate, apart from the fact that they simply come from different sources?

I mean: does that matter? Would it ever be "good design" for a POST parameter to have the same name as a GET parameter, with both actually being used? In my eyes that's ugly, but maybe someone has a good explanation why I should not even attempt to merge POST and GET.

I would have POST override GET in any case. I hope for honest answers :-)
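For illustration, here is roughly what I have in mind (a minimal sketch; the sanitize() helper just stands in for whatever filtering the framework would actually apply):

<?php
// Hypothetical sketch: merge GET and POST into one array, POST overriding GET,
// and run every value through a basic XSS filter.
function sanitize($value) {
    // Placeholder filter: escape HTML special characters (recurses into arrays).
    return is_array($value)
        ? array_map('sanitize', $value)
        : htmlspecialchars($value, ENT_QUOTES, 'UTF-8');
}

// array_merge(): keys in the later array (POST) override the earlier one (GET).
$parameters = array_map('sanitize', array_merge($_GET, $_POST));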

+5  A: 

POST and GET requests have different semantics. A short description is available on Wikipedia. Basically, a GET request

should not be used for operations that cause side-effects, such as using it for taking actions in web applications. One reason for this is that GET may be used arbitrarily by robots or crawlers, which should not need to consider the side effects that a request should cause.

Note that this is not enforced by the HTTP protocol; it is something your application must ensure. Therefore you should keep the different HTTP verbs separate in your framework.

An example of what might happen if a GET request does not simply return a resource under the above-mentioned restrictions: Well-Intentioned Destruction.

0xA3
Well, from a security point of view any half-intelligent delete-bot could just fake POST requests. I see no big difference except that it's easier for a mad user to fool around with GET, but it's not hard to fool around with POST either. Just grab Firebug. Same thing ;)
openfrog
0xA3
(Continued) The differences are a bit better described here: http://carsonified.com/blog/dev/the-definitive-guide-to-get-vs-post/
0xA3
+1  A: 

I believe the popular Ruby on Rails framework combines them into a params variable (technically a method...) but also allows you to access the original GET or POST parameters through other means.

In my code, I have been combining them both and have yet to run into any problems.

Topher Fangio
A: 

I'm pretty sure that if you are building a RESTful web app, it definitely DOES matter.

DJTripleThreat
A: 

Ideally*, POST has (or can have, to be pedantic) side-effects and GET doesn't. So a third party can follow GET links without fear of, say, deleting things. Or whatever passes for modification in a system.

Responses to GET can also be safely cached under certain circumstances while POST should never be cached.

I wouldn't merge the two, simply because you lose these distinctions.

*OK, a lot of people screw this up, so you can't rely on this behavior; but why contribute to the problem?

Kevin Montrose
"Responses to GET can also be safely cached (in the absence of information to the contrary)" Actually, the spec says they SHOULD not be cached if they have any query params unless there's a caching directive that allows it.
EricLaw -MSFT-
To extend on Eric a bit: http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html#sec13.9 *We note one exception to this rule: since some applications have traditionally used GETs and HEADs with query URLs (those containing a "?" in the rel_path part) to perform operations with significant side effects, caches MUST NOT treat responses to such URIs as fresh unless the server provides an explicit expiration time.*
Arjan
Correct and corrected.
Kevin Montrose
+2  A: 

In some instances, accepting a GET rather than a POST could make you more subject to a CSRF attack. That's not a hard and fast rule, however, and you should take steps to prevent CSRF even when accepting POST.
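One common mitigation is a per-session token that every state-changing request must echo back; a rough sketch (csrf_token is just an illustrative name, and the PHP functions random_bytes() and hash_equals() are assumed to be available):

<?php
session_start();

// Issue a per-session token once.
if (empty($_SESSION['csrf_token'])) {
    $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
}

// Every state-changing form embeds the token as a hidden csrf_token field.

// On submission, reject requests whose token does not match.
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $sent = isset($_POST['csrf_token']) ? $_POST['csrf_token'] : '';
    if (!hash_equals($_SESSION['csrf_token'], $sent)) {
        header('HTTP/1.1 403 Forbidden');
        exit('Invalid CSRF token');
    }
}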

EricLaw -MSFT-
For example, Django's XSRF protection prevents spoofed POST requests, but not GET requests.
Tobu
+1  A: 

From W3:

9.1.1 Safe Methods

Implementors should be aware that the software represents the user in their interactions over the Internet, and should be careful to allow the user to be aware of any actions they might take which may have an unexpected significance to themselves or others.

In particular, the convention has been established that the GET and HEAD methods SHOULD NOT have the significance of taking an action other than retrieval. These methods ought to be considered "safe". This allows user agents to represent other methods, such as POST, PUT and DELETE, in a special way, so that the user is made aware of the fact that a possibly unsafe action is being requested.

Naturally, it is not possible to ensure that the server does not generate side-effects as a result of performing a GET request; in fact, some dynamic resources consider that a feature. The important distinction here is that the user did not request the side-effects, so therefore cannot be held accountable for them.

9.1.2 Idempotent Methods

Methods can also have the property of "idempotence" in that (aside from error or expiration issues) the side-effects of N > 0 identical requests is the same as for a single request. The methods GET, HEAD, PUT and DELETE share this property. Also, the methods OPTIONS and TRACE SHOULD NOT have side effects, and so are inherently idempotent.

However, it is possible that a sequence of several requests is non-idempotent, even if all of the methods executed in that sequence are idempotent. (A sequence is idempotent if a single execution of the entire sequence always yields a result that is not changed by a reexecution of all, or part, of that sequence.) For example, a sequence is non-idempotent if its result depends on a value that is later modified in the same sequence.

A sequence that never has side effects is idempotent, by definition (provided that no concurrent operations are being executed on the same set of resources).

So basically, GET is intended to be idempotent (if you resubmit the request, you end up with the same result as submitting it once: you don't get your Amazon order delivered twice as a consequence, for instance). POST is intended to be more liberal in how it behaves.

le dorfier
See Eric's comment at http://stackoverflow.com/questions/2030694/is-there-any-good-reason-why-i-should-care-about-if-parameters-have-been-passed-v/2030743#2030743 (section 13.9 of that same RFC).
Arjan
+2  A: 

GET queries can be bookmarked, linked to, and are saved in the browser's history. This can be good or bad; for instance, your users wouldn't want other people seeing that they visited example.com/?password=jigglypuff, or to be tricked into clicking the link example.com/?changepasswordto=irh4x0r

anonymous
Well, that wasn't really the question... the question is: is it bad to merge POST and GET? Not: should I use only POST or only GET?
openfrog
There are times you would and wouldn't want to use GET, so it DOES answer the question: when you want to use GET, you could also use POST (though it's considered bad practice; why bother?), but there are times you absolutely don't want to use GET, and thus should use POST.
anonymous
+1  A: 

If you use POST, you can't bookmark the action directly. Imagine you have a method that creates a new item:

YourPage.aspx?action=create&param=abcde

If I happen to bookmark this URL (maybe by accident, because the resulting page is one I want to bookmark), then every time I open my bookmark I create a new item.

That's especially a concern when search engines come into play: combine it with bad authentication, and the moment Google starts indexing all those "?action=delete" links, the fun starts.

Maybe stick to literal English: use GET to get data, use POST to modify data.
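In practice that can be as simple as refusing to perform the action unless the request came in via POST; a rough sketch (the action parameter is just an example):

<?php
// Only perform state-changing actions on POST; GET merely renders pages.
if (isset($_GET['action']) || isset($_POST['action'])) {
    if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
        // A bookmarked or crawled GET link ends up here instead of
        // creating or deleting anything.
        header('HTTP/1.1 405 Method Not Allowed');
        header('Allow: POST');
        exit('This action must be submitted via POST.');
    }
    // ... perform the create/delete action using $_POST ...
}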

Michael Stum
A: 

In addition to the good points already made, there's often a limit on how long a URL can be, imposed by any one of the web browser, the web server, or the CGI framework. For example, the URL is often passed to the CGI process via an environment variable which has a limited size. If you're using a GET request to pass a lot of data, say the contents of a large text box, you can run up against that limit. Often that means the data is silently truncated.

Schwern
A: 

Interesting question. It actually took me quite some time to get the POST vs. GET question straight in my own projects, but for a while now I have done things like this: I usually use POST whenever forms are involved, and GET whenever I want to trigger something through a regular link.

Or to put it this way: I use POST for administration and GET for navigation.

Thinking of security, I suppose it actually doesn't make much difference: attacks can come through either POST or GET, so be sure to always check your users' input, no matter what method you use.
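For instance, you can validate and escape the same way regardless of which method delivered the value; just a sketch (the id and comment parameter names are made up):

<?php
// Validate and escape identically, no matter which method delivered the data.
$source = ($_SERVER['REQUEST_METHOD'] === 'POST') ? INPUT_POST : INPUT_GET;

$id      = filter_input($source, 'id', FILTER_VALIDATE_INT); // must be an integer
$comment = filter_input($source, 'comment');                 // free text, escape on output

if ($id === false || $id === null) {
    exit('Invalid or missing id');
}

// Escape on output to guard against XSS, whatever the source was.
echo htmlspecialchars((string) $comment, ENT_QUOTES, 'UTF-8');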

tillinberlin
+1  A: 

As a lot of people have stated, it kind of depends on the application you are building. The one thing that I have come across working on several web apps is that if you are using GET, there is a size limit on the URL (the exact limit varies: RFC 2616 warns that some older clients and proxies may not handle URIs over 255 bytes, and older Internet Explorer versions cap URLs at about 2,083 characters). Depending on what you are doing this might not be an issue, but if you have large amounts of text or many parameters being passed back to the server, you can hit this limit, and it will drive you crazy trying to figure out what happened!

Stephen
+2  A: 

I think everyone's missing the point of your question (or maybe I'm just misunderstanding it). You're not asking about the difference between GET and POST; you're wondering whether it's a good or bad idea for the framework you're building to automatically merge the two into one safe variable. Both .NET and PHP do this, so I don't see why not.

In PHP you can use $_GET or $_POST for a specific method, or just $_REQUEST. The same goes for .NET: Request.QueryString and Request.Form vs. Request. If someone has a reason to read only the POST or GET values, those variables are still there.
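Note that which value wins in $_REQUEST when GET and POST share a name depends on php.ini (the request_order / variables_order directives), so a framework that merges them may want to make the rule explicit. A small illustration:

<?php
// Given: POST /page.php?name=from_get   with request body   name=from_post

echo $_GET['name'];     // "from_get"
echo $_POST['name'];    // "from_post"
echo $_REQUEST['name']; // depends on request_order; with the usual "GP",
                        // POST overrides GET, so "from_post"

// A framework can make the rule explicit instead of relying on php.ini:
$params = array_merge($_GET, $_POST);   // POST always wins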

Chris Haas
True, but *using* things like `$_REQUEST` might introduce a vulnerability to cross-site request forgery (CSRF); see http://stackoverflow.com/questions/2030694/is-there-any-good-reason-why-i-should-care-about-if-parameters-have-been-passed-v/2093598#2093598
Arjan
A: 

Some calls really should be POST-only, or provide other means (like a simple confirmation question) to ensure it's indeed the user who requested some action. When allowing GET to change things, one might be vulnerable to cross-site request forgery (CSRF). So:

To ensure developers use your framework in a secure way, you could force them to explicitly define whether they expect GET or POST. Hence, you might not want to merge the parameters: when they are kept separate, a GET request will simply fail if the script refers to POST parameters.
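For instance, the framework could expose the verbs through separate accessors, so a handler that expects POST data simply fails on a plain GET link. A hypothetical sketch (Request, get() and post() are invented names):

<?php
// Hypothetical sketch: expose the verbs separately, so a handler that expects
// POST data fails when the request arrives as a forged GET link.
class Request
{
    public function get($key, $default = null)
    {
        return isset($_GET[$key]) ? $_GET[$key] : $default;
    }

    public function post($key, $default = null)
    {
        if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
            throw new RuntimeException('POST parameter requested on a non-POST request');
        }
        return isset($_POST[$key]) ? $_POST[$key] : $default;
    }
}

// Usage: an action that deletes something must come in via POST.
$request = new Request();
$idToDelete = $request->post('id');   // throws on a plain GET link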

Arjan