Hello,
I'm trying to write a script that downloads search results from an HTTPS website using POST. So far, I'm able to download the web page before the form submission, but not the response page containing the search results. The problem seems to be that curl isn't waiting long enough for the response page to appear.

The website behaves like this:
Website appears -> input form data -> click submit -> progress icon appears -> new page with the search results is returned (new data, but the URL doesn't change)

My code:
curl -d "postData" -k url

A: 

There are a couple of options you should take a look at:

--connect-timeout <seconds>
-m/--max-time <seconds>
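
For example, here is a minimal sketch of your command with both of those timeouts raised; the POST data and URL are placeholders carried over from the question:

# Allow up to 10 seconds to establish the connection and up to
# 120 seconds for the whole transfer; -k skips certificate
# verification, as in the original command.
curl --connect-timeout 10 --max-time 120 -d "postData" -k url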
Marcus Adams
A: 

Nope. Still doesn't work. I'm going to try using PHP and see what happens.

Larry Battle
A: 

I solved the problem by using Chickenfoot 1.0.7, a Firefox extension. The advantages of Chickenfoot are that it can script browser interaction, delay an operation, and write to a file, though the API is a little hard to understand.
Anyhow, here's the general code I used.

//Type: JavaScript (Chickenfoot)
include( "fileio.js" ); // Provides the write() command.
go( "https://the-url" ); // Load the search page.
enter( element, desiredInputValue ); // Fill in the form; use the "Record actions" option for help.
click( "Search" ); // Clicks the search button, which POSTs the form.
sleep( 9 ); // Pause so the results page has time to load.
write( "searchResults.txt", document.body.innerHTML ); // Save the results page to a file.

Larry Battle