views: 52

answers: 3

I have several laptops in the field that need to pull information from our server daily. Each laptop has a server2go installation (basically Apache, PHP, and MySQL running as an executable) that launches a local webpage. The webpage calls a URL on our server using the following code:

// Open a read-only binary stream to the server URL and read the entire response into memory
$handle = fopen( $downloadURL , "rb");
$contents = stream_get_contents( $handle );
fclose( $handle );

The $downloadURL fetches a large amount of information from a MySQL database on our server and returns the results as output to the device. I am currently returning the results as ready-made SQL statements (i.e., if I query the database with "SELECT name FROM names", I might return the text string "INSERT INTO names SET name='JOHN SMITH'" to the device). This takes the info from the online database and hands it to the device as SQL statements ready for insertion into the laptop's database.
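For illustration, a rough sketch of what the server-side script might look like, assuming a mysqli connection in $db (the table and column names are just the ones from the example above):

$result = $db->query("SELECT name FROM names");
while ($row = $result->fetch_assoc()) {
    // emit one INSERT statement per row, escaped for the laptop's database
    echo "INSERT INTO names SET name='" . $db->real_escape_string($row['name']) . "';\n";
}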

The problem I am running into is that the amount of data is too large. The laptop webpage keeps timing out when retrieving info from the server. I have set the PHP timeout limits very high, but still run into problems. Can anyone think of a better way to do this? Will stream_get_contents stay connected to the server if I flush the data to the device in smaller chunks?
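For reference, a minimal sketch of reading the response in fixed-size chunks instead of buffering it all at once (same $downloadURL as above; what to do with each chunk is left open):

$handle = fopen( $downloadURL , "rb");
while ( !feof( $handle ) ) {
    // read 8 KB at a time so the data can be processed as it arrives
    $chunk = fread( $handle, 8192 );
    // ... append $chunk to a local file or execute complete statements here ...
}
fclose( $handle );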

Thanks for any input.

A: 

Write a script that compiles a text file from the database on the server, and download that file.

Martin Hohenberg
What does that offer that simply returning the text in an echo statement doesn't? Does a browser download a text file faster than a webpage with just text?
JKov
Your problem is most likely that generating the output from the database is relatively slow, so the request hits the maximum execution time for scripts. This only multiplies if several machines try to do it at the same time. If you generate the output into a text file on the server instead, you can sidestep the max execution time, and you only have to do it once per day (a rough sketch of this follows below).
Martin Hohenberg
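A minimal sketch of that pre-generated-file approach, run on the server from a daily cron job ($db, the output path, and the table are assumptions):

// regenerate the dump file once per day, outside of any web request
$out = fopen("/var/www/exports/daily_dump.sql", "w");
$result = $db->query("SELECT name FROM names");
while ($row = $result->fetch_assoc()) {
    fwrite($out, "INSERT INTO names SET name='" . $db->real_escape_string($row['name']) . "';\n");
}
fclose($out);

The laptops then download the static file, which is served without touching the database and so is not subject to the script execution-time limit.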
A: 

What if you just send over the data and generate the SQL on the receiving side? That would save you a lot of bytes in transit.

Is the data update incremental? I.e. can you just send over the changes since the last update?

If you do have to send over a huge chunk of data, you might want to look at ways to compress or zip it and then unzip it on the other side. (I haven't looked at exactly how, but I think it's achievable in PHP; a rough sketch follows below.)

milesmeow
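A rough sketch of the "send raw data, compressed" idea using PHP's zlib and JSON functions (the connection variables and the table are assumptions):

// server: emit the rows as gzip-compressed JSON instead of SQL text
$rows = array();
$result = $db->query("SELECT name FROM names");
while ($row = $result->fetch_assoc()) {
    $rows[] = $row;
}
echo gzencode(json_encode($rows));

// laptop: decompress, decode, and build the INSERTs locally against its own database ($local)
$rows = json_decode(gzdecode(file_get_contents($downloadURL)), true);
foreach ($rows as $row) {
    $local->query("INSERT INTO names SET name='" . $local->real_escape_string($row['name']) . "'");
}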
A: 

You might want to consider using a third-party file synchronization service, like Windows Live Sync or Dropbox, to get the latest file synchronized across all the machines. Then just have a daemon that loads the file into the database whenever it changes (a rough sketch of such a daemon follows below). This way, you avoid having to deal with the synchronization piece altogether.

Alex
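A rough sketch of such a daemon, assuming the sync client drops the file at a known path and $local is a mysqli connection to the laptop's database:

$path = "/path/to/synced/daily_dump.sql";   // hypothetical location managed by the sync client
$lastLoaded = 0;
while (true) {
    clearstatcache();
    if (filemtime($path) > $lastLoaded) {
        // the file changed since the last import: replay its SQL statements locally
        $local->multi_query(file_get_contents($path));
        while ($local->more_results() && $local->next_result()) { /* drain result sets */ }
        $lastLoaded = filemtime($path);
    }
    sleep(60);   // check once a minute
}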