I'm redesigning an app I inherited that sends digital photos from a laptop to a web server. The idea is to take photos out in the field and have them published on a web page almost instantly (plus some fancier features).
Typical scenario
1. Photos are transferred from the camera to the laptop using standard USB.
2. The photos are processed in various ways (the details aren't important here).
3. Each photo is POSTed in small pieces (~64 KB each) via a WebRequest to a standard Apache web server, where the pieces are merged back together.
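The app itself is .NET, but the chunking and reassembly logic in step 3 can be sketched language-neutrally. This is a minimal illustration (in Python, with hypothetical function names) of splitting a photo into ~64 KB numbered pieces and merging them back on the server side:

```python
CHUNK_SIZE = 64 * 1024  # ~64 KB, matching the piece size used by the app

def split_into_chunks(data, chunk_size=CHUNK_SIZE):
    """Split a photo's bytes into (index, piece) pairs, one per POST."""
    return [(i, data[off:off + chunk_size])
            for i, off in enumerate(range(0, len(data), chunk_size))]

def merge_chunks(chunks):
    """Server side: reassemble pieces by index, so out-of-order or
    retried uploads still produce the original file."""
    return b"".join(piece for _, piece in sorted(chunks))
```

Numbering each piece is what makes retries safe: a chunk that arrives twice or late just overwrites/reorders cleanly instead of corrupting the merged file.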
The problem with the current design is that it often hangs when the network connection is unreliable. As we're using a mobile network (3G) and often end up out of coverage, I need a way to handle this properly.
My question is whether there's a better solution for doing this that won't make the app hang when the connection drops every now and then.
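One pattern that avoids the hang: give every chunk upload a request timeout and retry it with backoff, rather than blocking indefinitely. A sketch of the retry loop (Python; `post_chunk` is a hypothetical callable that performs the actual POST with a timeout set and raises `OSError` on failure):

```python
import time

def post_with_retries(post_chunk, chunk, max_retries=5, base_delay=1.0):
    """Retry one chunk upload with exponential backoff instead of hanging.

    post_chunk must enforce its own request timeout so a dead connection
    fails fast; this wrapper only decides when to try again.
    """
    for attempt in range(max_retries):
        try:
            return post_chunk(chunk)
        except OSError:
            if attempt == max_retries - 1:
                raise  # give up; the caller can queue the photo for later
            time.sleep(base_delay * 2 ** attempt)
```

Because each piece is small (~64 KB), losing the connection mid-photo only costs one chunk's worth of re-upload, not the whole file.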
(Bonus question is how this could be properly unit tested without having to take a hike with the laptop.)
EDIT 2008-11-24: I've now managed to set up a proper test environment for this using a combination of NetLimiter and TMnetsim (freeware). I tried limiting bandwidth to 5 KB/s and dropping 1% of all packets; my app still works well with the new design.
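For the bonus question, an in-process alternative that needs no external tools: wrap whatever performs the send in a test double that fails on a scripted set of calls, so unit tests can exercise the retry path deterministically. A sketch (Python; the class and its interface are made up for illustration):

```python
class FlakyTransport:
    """Unit-test double: raises on scripted call numbers to simulate
    dropped connections, without touching a real network."""

    def __init__(self, real_send, fail_on):
        self.real_send = real_send   # the function that would do the POST
        self.fail_on = set(fail_on)  # 1-based call numbers that should fail
        self.calls = 0

    def send(self, chunk):
        self.calls += 1
        if self.calls in self.fail_on:
            raise ConnectionError("simulated network drop")
        return self.real_send(chunk)
```

Scripted failures make the tests repeatable, which random packet dropping can't guarantee; the NetLimiter/TMnetsim setup is still useful as an end-to-end check on top of that.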
EDIT 2008-12-11: Just to update on how I did this. I created one background worker (as suggested below) that is started whenever a camera is detected, to copy the photos from the camera to the PC. Another background worker is started when files arrive on the PC, to upload them using asynchronous HTTP transfer. It sure was a pain to get everything right, especially since the operation should be cancellable at any time... But anyhow, now it works. A big THANKS to everyone who helped me!
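The shape of that final design (the actual app uses .NET BackgroundWorkers) can be sketched as a two-stage producer/consumer pipeline with a shared cancellation flag checked between items. This Python sketch uses hypothetical `copy_one`/`upload_one` callbacks standing in for the camera-copy and HTTP-upload steps:

```python
import queue
import threading

SENTINEL = object()  # marks "no more files" on the queue

def run_pipeline(paths, copy_one, upload_one, cancel):
    """Copy files (camera -> PC) and upload them concurrently.

    cancel is a threading.Event checked between items, so the whole
    pipeline can be stopped at any point without killing threads.
    Returns the list of paths that were uploaded.
    """
    pending = queue.Queue()
    uploaded = []

    def copier():
        for p in paths:
            if cancel.is_set():
                break
            copy_one(p)          # stage 1: camera -> PC
            pending.put(p)
        pending.put(SENTINEL)    # always unblock the uploader

    def uploader():
        while True:
            p = pending.get()
            if p is SENTINEL or cancel.is_set():
                break
            upload_one(p)        # stage 2: PC -> web server
            uploaded.append(p)

    threads = [threading.Thread(target=copier),
               threading.Thread(target=uploader)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return uploaded
```

The sentinel is what makes cancellation clean: the copier always pushes it, so the uploader never blocks forever on an empty queue after a stop request.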