views:

147

answers:

2

I want to send a large set of data to a WCF service. The data might consist of thousands of records (entities), depending on the parsed input file.

Now the question is: what is the optimal way to send this data?

a. Record by record?

This way I can be sure that I won't exceed the maximum allowed message size, and after a network failure I can resume from the last successfully sent entity. On the other hand, there will be A LOT of overhead from connecting to the same service thousands of times and transmitting the SOAP headers with every record.

b. All at once?

This saves the per-call overhead, but if the message size reaches, say, 500 MB or 2 GB, I will tie up the machine and exceed the maximum message size quota. And if a network failure occurs after 490 MB of a 500 MB upload, I have to re-send all 490 MB.

c. Parts?

By this I mean splitting the data into chunks of, say, 100 records and uploading them part by part. That saves some of the overhead.

Is there a better way to do this? Any ideas? Which option is optimal?
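For context on option (b): the message size limit is configured per binding, and WCF's default for maxReceivedMessageSize is 64 KB, so a large upload fails unless the quota is raised. A minimal configuration sketch (binding name and values are illustrative, not from any real project):

```xml
<!-- app.config / web.config fragment; values raised for larger messages -->
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- "largeMessageBinding" is a made-up name for illustration -->
      <binding name="largeMessageBinding"
               maxReceivedMessageSize="10485760"
               maxBufferSize="10485760">
        <readerQuotas maxArrayLength="10485760" />
      </binding>
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
```

Raising the quota alone does not solve the memory or retransmission problems of option (b); it only moves the ceiling.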

Thanks in advance.

+2  A: 

WCF supports streaming, which allows large files to be transferred to/from a service endpoint efficiently. Check out this article for more information.
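For reference, streaming is switched on per binding via the transferMode attribute; a minimal sketch (binding name and size value illustrative):

```xml
<!-- app.config fragment enabling streamed transfer on a basicHttpBinding -->
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- "streamedBinding" is a made-up name for illustration -->
      <binding name="streamedBinding"
               transferMode="Streamed"
               maxReceivedMessageSize="2147483647" />
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
```

The service contract then exposes a Stream parameter or return value instead of a buffered message, so the message is never held in memory in its entirety.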

pmarflee
A: 

I find "c. Parts" to be the most suitable way to solve this. The article mentioned by pmarflee is very good, but that way of uploading the rows would prevent me from using WS-ReliableMessaging, and that is very limiting because I need to make sure that all the entities are uploaded and their order is preserved.

Each part will be at most 64 KB.
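A minimal binding sketch for this approach, assuming wsHttpBinding (reliable sessions are not available on basicHttpBinding, and they cannot be combined with streamed transfer; binding name and timeout are illustrative):

```xml
<!-- app.config fragment: ordered, reliable delivery of the chunked parts -->
<system.serviceModel>
  <bindings>
    <wsHttpBinding>
      <!-- "chunkedReliableBinding" is a made-up name for illustration -->
      <binding name="chunkedReliableBinding"
               maxReceivedMessageSize="131072">
        <!-- ordered="true" preserves the sequence of the uploaded parts -->
        <reliableSession enabled="true"
                         ordered="true"
                         inactivityTimeout="00:10:00" />
      </binding>
    </wsHttpBinding>
  </bindings>
</system.serviceModel>
```

With 64 KB parts, the 128 KB quota above leaves headroom for SOAP headers and encoding overhead.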

Karim