views: 107

answers: 4

Is it bad practice to issue the following POST request:

 /test?a=1&b=2
 POST data: c=3&d=4

Notice that 2 parameters are part of the URL and 2 parameters are part of the POST content.
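For reference, a request like that could be issued from Python with the requests library; example.com below is just a placeholder host:

    import requests

    # Two parameters travel in the URL (query string), two in the request body.
    response = requests.post(
        "http://example.com/test",
        params={"a": "1", "b": "2"},  # becomes /test?a=1&b=2
        data={"c": "3", "d": "4"},    # sent as form-encoded POST data: c=3&d=4
    )
    print(response.status_code)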

On another note, is the following rule still recommended:

  • GET request: retrieve content from the server but do not change anything on the server.
  • POST request: post content to the server, which may modify data on the server (a rough sketch follows this list).
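As a purely hypothetical server-side illustration of that rule (Flask, with an invented /items resource and an in-memory list standing in for real storage):

    from flask import Flask, jsonify, request

    app = Flask(__name__)
    items = []  # in-memory stand-in for real storage, for illustration only

    @app.route("/items", methods=["GET"])
    def list_items():
        # GET: read-only, changes nothing on the server
        return jsonify(items)

    @app.route("/items", methods=["POST"])
    def create_item():
        # POST: modifies server-side state
        items.append(request.form["name"])
        return "created", 201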

I am asking because I see a bit of everything online.

Laurent Luce

+1  A: 

There is nothing wrong with that. The reason data-modifying parameters should be sent in the POST body is that you don't want the data to be changed again if the user clicks the Refresh button. In that case, only the GET info will be sent.

Oren
If the person hits Refresh and the last operation was a POST (as in this request), the browser will still most likely redo the POST operation, unless the server issues a redirect to a GET operation (see the sketch after these comments).
notnoop
He probably meant bookmarking/being indexed by search engines. You really don't want data-modifying requests to be indexed...
Itay Moav
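The redirect mentioned above is the Post/Redirect/Get pattern; a minimal sketch of it, assuming Flask and invented route names:

    from flask import Flask, redirect, url_for

    app = Flask(__name__)

    @app.route("/order/<int:order_id>", methods=["GET"])
    def show_order(order_id):
        # Safe to refresh or bookmark: no side effects
        return f"Order {order_id}"

    @app.route("/order/<int:order_id>/update", methods=["POST"])
    def update_order(order_id):
        # ... apply the change here ...
        # Then redirect to a GET, so a browser refresh does not resubmit the POST
        return redirect(url_for("show_order", order_id=order_id), code=303)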
+2  A: 

Yes, your assumptions are correct. You should be consistent about how you pass your parameters, or about how you require them to be passed, but mixing them isn't really going to do any harm.

GET operations are supposed to be safe operations that don't have any side effects (besides caching, etc.), so they are easily cached by proxies and the like. POST operations, on the other hand, may incur side effects.

I would recommend reading the Wikipedia entry on the HTTP protocol:

GET

Requests a representation of the specified resource. Note that GET should not be used for operations that cause side-effects, such as using it for taking actions in web applications. One reason for this is that GET may be used arbitrarily by robots or crawlers, which should not need to consider the side effects that a request should cause. See safe methods below.

POST

Submits data to be processed (e.g., from an HTML form) to the identified resource. The data is included in the body of the request. This may result in the creation of a new resource or the updates of existing resources or both.

There are other operations too (e.g. HEAD, PUT, DELETE), and you should consider using them if you are designing an API. These are heavily discussed in RESTful API design.
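As a rough client-side illustration of those method semantics with Python's requests library (the /orders API below is entirely hypothetical):

    import requests

    BASE = "http://api.example.com"  # hypothetical API host

    # GET: safe, read-only retrieval
    order = requests.get(f"{BASE}/orders/21559")

    # HEAD: like GET, but returns headers only
    headers_only = requests.head(f"{BASE}/orders/21559")

    # POST: submit data that may create or modify a resource
    created = requests.post(f"{BASE}/orders", data={"quantity": "30"})

    # PUT: replace the resource at a known URL (idempotent)
    replaced = requests.put(f"{BASE}/orders/21559", data={"quantity": "30"})

    # DELETE: remove the resource (idempotent)
    removed = requests.delete(f"{BASE}/orders/21559")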

notnoop
+3  A: 

That rule is most definitely still recommended.

It's reflected in the refresh behaviour of modern browsers. These will happily refresh with GET values, but will pop up a warning dialog on a refresh of a POST ('are you sure you want to resubmit?' etc.).

It looks like you were trying to combine the two methods (GET and POST) by POSTing to a URL with GET values. While this should work fine, it isn't commonly done; forms usually use exclusively one or the other.

+1  A: 

Yes, the semantics of GET and POST should be respected.

Given that, there is often a very good reason for putting some parameters in the query string (GET) and some in the POST body. Consider the case where you have a web-based script which does something like:

UPDATE datatable SET quantity=30 WHERE order=21559

This might be represented as:

 /update?order=21559
 POST data: quantity=30
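A sketch of how a handler for that request might read the two sources, assuming Flask and sqlite3 (the table and column names simply mirror the query above; shop.db is a placeholder):

    import sqlite3
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/update", methods=["POST"])
    def update_order():
        order_id = request.args.get("order")     # from the URL: /update?order=21559
        quantity = request.form.get("quantity")  # from the POST body: quantity=30

        conn = sqlite3.connect("shop.db")
        with conn:
            # Parameterised version of: UPDATE datatable SET quantity=30 WHERE order=21559
            conn.execute(
                'UPDATE datatable SET quantity = ? WHERE "order" = ?',
                (quantity, order_id),
            )
        conn.close()
        return "updated"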

C.

symcbean