+2  A: 

You can use curl for this.

But you might have other issues to think about. Cookies, sessions, etc. are set when a browser physically posts to another URL; these might not be set when you post from the server. You should also check out the screen-scraping questions on SO for more on this.

Assuming this is not the case, you should be able to receive the post, validate the fields, and repost using curl. There are many examples of doing this.

edit

  • post form to your server.php
  • process/validate fields in server.php
  • post validated parameters using curl to remote.service (a minimal sketch follows below)
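
Here is a minimal sketch of what server.php might look like. The field names, validation rules, and remote URL are assumptions for illustration; substitute your own.

    <?php
    // server.php -- receive the browser's post, validate, then re-post with curl
    $name  = isset($_POST['name']) ? trim($_POST['name']) : '';
    $email = filter_input(INPUT_POST, 'email', FILTER_VALIDATE_EMAIL);

    if ($name === '' || !$email) {
        header('HTTP/1.1 400 Bad Request');
        exit('Invalid input');
    }

    // Re-post the validated parameters to the remote service
    $ch = curl_init('http://remote.service/endpoint'); // assumed URL
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
        'name'  => $name,
        'email' => $email,
    )));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the response instead of echoing it
    // If the remote service depends on cookies/session state, look at
    // CURLOPT_COOKIEFILE / CURLOPT_COOKIEJAR as well.

    $response = curl_exec($ch);
    if ($response === false) {
        header('HTTP/1.1 502 Bad Gateway');
        exit('Remote post failed: ' . curl_error($ch));
    }
    curl_close($ch);

    echo $response; // relay the remote response (or render your own page)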
Byron Whitlock
Thanks for the response, especially the edits! The David Walsh example was what I needed. I had gotten lost in the curl_setopt options.
jerrygarciuh
+2  A: 

I haven't tried it, but what I would do is submit the form to its final destination and add a JavaScript onSubmit() function that makes an Ajax request to your server and returns true or false.

That is, if you can rely on JavaScript...
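
A rough sketch of that idea, assuming a hypothetical /validate.php endpoint on your server that answers "true" or "false" (synchronous XHR keeps the example short, at the cost of blocking the page):

    <form action="http://remote.service/endpoint" method="post"
          onsubmit="return validateFirst(this);">
      <input type="text" name="email" />
      <input type="submit" value="Send" />
    </form>

    <script type="text/javascript">
    // Ask our own server whether the fields are valid before the browser
    // posts to the final destination; returning false cancels the submit.
    function validateFirst(form) {
      var xhr = new XMLHttpRequest();
      xhr.open('POST', '/validate.php', false); // false = synchronous, so we can block
      xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
      xhr.send('email=' + encodeURIComponent(form.email.value));
      return xhr.status === 200 && xhr.responseText === 'true';
    }
    </script>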

jeroen
This is a great approach and doesn't require any server-side resources.
Byron Whitlock
This would be acceptable as long as you don't care if the user knows what the final URL is, giving them the opportunity to post any data they want to your final destination.
Matt Huggins
@Matt Huggins: Very true. You could, of course, have that same Ajax request return or set the final destination if you want to avoid that.
jeroen
+2  A: 

Just have the request/response go like this:

+---------+   request    +--------+   curl request   +--------+
|         | -----------> |        | ---------------> |        |
| browser |              | url #1 |                  | url #2 |
|         | <----------- |        | <--------------- |        |
+---------+   response   +--------+   curl response  +--------+

The user sitting behind the browser won't know what the final URL (url #2 above) is, since it appears nowhere in the HTML source, so they can't skip past the middleman URL (url #1) and post whatever they want to url #2 directly.

Matt Huggins
+1  A: 

I find that issuing a "wget" tends to be easier to manage than curl.

    // -qO- writes the fetched page to stdout so the backticks can capture it
    $remoteContent = `wget -qO- http://someremoteurl`;
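
Since the goal here is re-posting form fields, wget's --post-data flag can do the POST as well. A hedged one-liner (the field names are assumed, and anything user-supplied should pass through escapeshellarg() before hitting the shell):

    // --post-data sends an application/x-www-form-urlencoded POST body
    $postBody = escapeshellarg(http_build_query(array('name' => $name, 'email' => $email)));
    $remoteContent = `wget -qO- --post-data=$postBody http://someremoteurl`;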

Aside from that, Matt's response is correct. However, if the response from the remote site you're screen-scraping contains links, you'll have to search and replace them (if you want to handle them yourself), at which point you're creating a proxy server....

-CF

ChronoFish