Hi, I've got a website that receives posted XML data from a third party.

I'm looking for a way to batch-post a number of XML files to this script for development/debugging purposes.

I have built a PHP script that loops through an array of files and uses cURL to post each file separately. However, due to the number of files I wish to post, I feel PHP isn't the best method, as the script times out.
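Roughly, the script looks like this (a simplified sketch; the endpoint URL and directory are placeholders for the real ones):

    <?php
    // Simplified version of the posting loop: every XML file in xml/
    // is POSTed to the endpoint as the raw request body.
    $url = 'http://example.com/receive.php'; // placeholder endpoint

    foreach (glob('xml/*.xml') as $file) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents($file));
        curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $response = curl_exec($ch);
        curl_close($ch);
        echo $file . ': ' . ($response === false ? 'failed' : 'ok') . "\n";
    }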

Ideally I'm looking for a terminal process/OS X application that will pick up all the files in a given directory and post the contents of each one to a defined URL, one by one.

Any suggestions/ideas gratefully received.

Jim

A: 

I'm a little confused. Why not just set the CURLOPT_TIMEOUT option appropriately in your posting application? Otherwise your PHP solution seems to be functioning OK?
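For example (a sketch, posting one file for illustration; note that CURLOPT_TIMEOUT caps each individual request, and if it's PHP's own execution limit that's expiring, set_time_limit(0) lifts that):

    <?php
    set_time_limit(0); // lift PHP's max_execution_time for the whole run

    $xml = file_get_contents('sample.xml');             // one file, for illustration
    $ch  = curl_init('http://example.com/receive.php'); // placeholder endpoint
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $xml);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // allow 10s to connect
    curl_setopt($ch, CURLOPT_TIMEOUT, 120);       // and up to 120s per request
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);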

If you don't want a PHP solution, have you looked at posting via HTTP in parallel, using a scripting language with threading support (e.g. Ruby or similar)? The only headache is that you're now loading your server more heavily for the benefit of your script running faster, and you need to determine what sort of load your server-side process can handle.
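Staying with PHP purely for illustration (since that's what's already in play), curl_multi gives a comparable parallel effect without threads. A rough sketch, with the endpoint and batch size invented:

    <?php
    // Fire off a batch of POSTs concurrently via curl_multi.
    $url   = 'http://example.com/receive.php';      // placeholder endpoint
    $files = array_slice(glob('xml/*.xml'), 0, 10); // batch of 10 (arbitrary)

    $mh      = curl_multi_init();
    $handles = array();
    foreach ($files as $file) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents($file));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$file] = $ch;
    }

    // Run until every transfer in the batch has finished.
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);

    foreach ($handles as $file => $ch) {
        echo $file . ': HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);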

Brian Agnew
Yeah, I could increase the timeout, but PHP doesn't seem to be a 'nice' solution. There are going to be 1,000+ separate files to post. Ideally I'm looking for a method/app that would let me specify a sleep time between each post (the kind of throttled loop sketched below), so I don't cripple the server with too many requests at once. Also, I don't want a local PHP script posting the files to take 30+ minutes to complete. I know I could wait for each cURL post to finish before posting the next, but I'm uncomfortable with running a script that will take ages to process :-)
th3hamburgler
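For illustration, the kind of throttled, one-at-a-time loop meant above, sketched in PHP (the 2-second gap and endpoint are invented):

    <?php
    set_time_limit(0); // the run is deliberately slow, so lift PHP's limit
    $url = 'http://example.com/receive.php'; // placeholder endpoint

    foreach (glob('xml/*.xml') as $file) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents($file));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_exec($ch);  // wait for each post to finish...
        curl_close($ch);
        sleep(2);        // ...then pause before the next one (invented gap)
    }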