Without writing a novel, I'll try to explain this and hope it makes sense.

We have an app that sends data to a server via simple HTTPS; normally everything works fine. HOWEVER, when the packets are sent via a satellite phone, the transmission latency is far longer than the typical milliseconds of a land-line high-speed connection. The exact same packets are broken up automatically by Windows when sent through the following WININET.DLL functions:

DECLARE INTEGER InternetOpen IN WININET.DLL STRING sAgent, INTEGER lAccessType, STRING sProxyName, STRING sProxyBypass, INTEGER lFlags
DECLARE INTEGER InternetCloseHandle IN WININET.DLL INTEGER hInet
DECLARE INTEGER InternetConnect IN WININET.DLL INTEGER hInet, STRING sServerName, INTEGER nServerPort, STRING sUserName, STRING sPassword, INTEGER lService, INTEGER lFlags, INTEGER lContext
DECLARE INTEGER HttpOpenRequest IN WININET.DLL INTEGER hConnect, STRING sVerb, STRING sObjectName, STRING sVersion, STRING sReferrer, INTEGER lpszAcceptTypes, INTEGER lFlags, INTEGER lContext
DECLARE INTEGER InternetQueryOption IN WININET.DLL INTEGER hInet, INTEGER lOption, STRING @sBuffer, INTEGER @lBufferLength
DECLARE INTEGER InternetSetOption IN WININET.DLL INTEGER hInet, INTEGER lOption, STRING sBuffer, INTEGER lBufferLength
DECLARE INTEGER HttpSendRequest IN WININET.DLL INTEGER hRequest, STRING sHeaders, INTEGER lHeadersLength, STRING sOptional, INTEGER lOptionalLength
DECLARE INTEGER HttpQueryInfo IN WININET.DLL INTEGER hRequest, INTEGER lInfoLevel, STRING @sBuffer, INTEGER @lBufferLength, INTEGER @lIndex
DECLARE INTEGER InternetReadFile IN WININET.DLL INTEGER hFile, STRING @sBuffer, INTEGER lBytesToRead, INTEGER @lBytesRead
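One WinInet knob that directly targets high latency (rather than packet sequencing) is the timeout set. A minimal sketch, assuming hInet is the handle returned by InternetOpen; the INTERNET_OPTION_* values are the documented constants, and the 120-second figure is just an example:

#DEFINE INTERNET_OPTION_CONNECT_TIMEOUT  2
#DEFINE INTERNET_OPTION_SEND_TIMEOUT     5
#DEFINE INTERNET_OPTION_RECEIVE_TIMEOUT  6

DECLARE INTEGER InternetSetOption IN WININET.DLL INTEGER hInet, INTEGER lOption, STRING sBuffer, INTEGER lBufferLength

LOCAL lnMs, lcBuf
lnMs  = 120000                 && 120 seconds, in milliseconds (example value)
lcBuf = BINTOC(lnMs, "4RS")    && DWORD packed as a 4-byte little-endian string
= InternetSetOption(hInet, INTERNET_OPTION_CONNECT_TIMEOUT, lcBuf, 4)
= InternetSetOption(hInet, INTERNET_OPTION_SEND_TIMEOUT,    lcBuf, 4)
= InternetSetOption(hInet, INTERNET_OPTION_RECEIVE_TIMEOUT, lcBuf, 4)

Raising these won't change how packets are sequenced, but it stops WinInet from giving up while a satellite round trip is still in flight.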

I am not specifically controlling the "packets"; I just build an entire request, send it, and get an answer back.
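For context, the whole exchange looks roughly like this. This is a hedged sketch, assuming the functions above are declared with their full WinInet parameter lists; the server name, path, and body are placeholders, and error checking is omitted:

LOCAL hInet, hConn, hReq, lcBody, lcBuf, lnRead
hInet = InternetOpen("MyApp", 1, "", "", 0)                 && 1 = INTERNET_OPEN_TYPE_DIRECT
hConn = InternetConnect(hInet, "example.com", 443, "", "", 3, 0, 0)  && 3 = INTERNET_SERVICE_HTTP
hReq  = HttpOpenRequest(hConn, "POST", "/upload", "HTTP/1.1", "", 0, 0x00800000, 0)  && INTERNET_FLAG_SECURE

lcBody = "the data to send"
= HttpSendRequest(hReq, "Content-Type: application/x-www-form-urlencoded", -1, lcBody, LEN(lcBody))

lcBuf  = SPACE(4096)
lnRead = 0
= InternetReadFile(hReq, @lcBuf, LEN(lcBuf), @lnRead)
* the response text is LEFT(lcBuf, lnRead)
= InternetCloseHandle(hReq)
= InternetCloseHandle(hConn)
= InternetCloseHandle(hInet)

The point is that the body goes to the stack in one HttpSendRequest call; everything below that (segmentation, retransmission, ordering) is TCP's job, not the app's.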

What APPEARS to be happening is this: the packets are getting split into smaller chunks, as is typical with a large amount of data. However, by the time they get to the satellite they are NOT reassembled in the correct sequence, so acceptance at the final destination fails.

So, now the question... Is there a way I can tell Windows to slow down the rate at which packets are sent out, to HELP work around the latency? Since we don't have access to the satellite system we can't prove this is the issue, but it appears they receive the packets and just don't put them back together in the correct sequence.
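I can't find a WinInet option for per-packet pacing (TCP itself handles segmentation and puts segments back in order by sequence number at the receiver), but one thing an app can control is how fast it hands data to the stack. A hedged sketch of that idea, streaming the body in small chunks with a pause between writes via HttpSendRequestEx / InternetWriteFile / HttpEndRequest instead of one HttpSendRequest call; hReq and lcBody are assumed from the code above, and the 1 KB chunk size and 500 ms pause are arbitrary examples:

DECLARE INTEGER HttpSendRequestEx IN WININET.DLL INTEGER hRequest, INTEGER lpBuffersIn, INTEGER lpBuffersOut, INTEGER dwFlags, INTEGER dwContext
DECLARE INTEGER InternetWriteFile IN WININET.DLL INTEGER hFile, STRING sBuffer, INTEGER lBytes, INTEGER @lBytesWritten
DECLARE INTEGER HttpEndRequest IN WININET.DLL INTEGER hRequest, INTEGER lpBuffersOut, INTEGER dwFlags, INTEGER dwContext
DECLARE Sleep IN KERNEL32.DLL INTEGER dwMilliseconds

LOCAL lnPos, lcPiece, lnWritten
* A Content-Length header must already be on the request (HttpAddRequestHeaders),
* since the body is streamed instead of passed to HttpSendRequest in one shot.
= HttpSendRequestEx(hReq, 0, 0, 0, 0)
lnPos = 1
DO WHILE lnPos <= LEN(lcBody)
    lcPiece   = SUBSTR(lcBody, lnPos, 1024)    && 1 KB at a time (example size)
    lnWritten = 0
    IF InternetWriteFile(hReq, lcPiece, LEN(lcPiece), @lnWritten) = 0
        EXIT                                   && write failed; bail out
    ENDIF
    lnPos = lnPos + lnWritten
    = Sleep(500)                               && pause between writes (example)
ENDDO
= HttpEndRequest(hReq, 0, 0, 0)

This only throttles the application side; whether it helps depends on where the satellite gateway is actually dropping or stalling.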

Thanks.