tags:
views: 300
answers: 1

QUESTION:

Is it better to send large data blobs in JSON for simplicity, or send them as binary data over a separate connection?

If the former, can you offer tips on how to optimize the JSON to minimize size?

If the latter, is it worth logically connecting the JSON data to the binary data with an identifier that appears in both, e.g., "data" : "< unique identifier >" in the JSON, with the first bytes of the binary blob being < unique identifier >?
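A minimal sketch of that second approach, in Python purely for illustration (the identifier length and field names here are assumptions, not part of any established protocol):

```python
import json
import uuid

# Stand-in for the real binary payload (e.g., the accumulated history blob).
blob = bytes(range(256)) * 4

# A fixed-length identifier shared by both transfers: uuid4().hex is
# always 32 ASCII characters, so the client knows where the blob starts.
ident = uuid.uuid4().hex

# JSON side: reference the blob by identifier only.
metadata = json.dumps({"data": ident})

# Binary side: prepend the same identifier to the raw blob.
framed = ident.encode("ascii") + blob

# Client side: split the frame and pair it with the JSON metadata.
received_id = framed[:32].decode("ascii")
received_blob = framed[32:]
assert received_id == json.loads(metadata)["data"]
assert received_blob == blob
```

A fixed-length prefix keeps the client-side parsing trivial; a length-prefixed or delimiter-terminated identifier would work too, at the cost of slightly more parsing logic.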

CONTEXT:

My iPhone application needs to receive JSON data over the 3G network. This means that I need to think seriously about efficiency of data transfer, as well as the load on the CPU.

Most of the data transfers will be relatively small packets of text data for which JSON is a natural format and for which there is no point in worrying much about efficiency.

However, some of the most critical transfers will be big blobs of binary data -- definitely at least 100 kilobytes of data, and possibly closer to 1 megabyte as customers accumulate a longer history with the product. (Note: I will be caching what I can on the iPhone itself, but the data still has to be transferred at least once.) It is NOT streaming data.

I will probably use a third-party JSON SDK -- the one I am using during development is here.

Thanks

A: 

You could try to compress the JSON (gzip, perhaps) before you send it and then decompress it on the client side.
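A rough illustration of the round trip (Python shown only as a sketch; on the iPhone you would decompress with zlib or a similar library, and the payload below is a made-up example):

```python
import gzip
import json

# Build a JSON payload with a large, repetitive structure, loosely
# standing in for a 100 KB+ history transfer.
payload = json.dumps({"history": [{"event": "tap", "screen": "main"}] * 2000})
raw = payload.encode("utf-8")

# Server side: gzip before sending.
compressed = gzip.compress(raw)
print(len(raw), "->", len(compressed))  # repetitive JSON compresses well

# Client side: decompress, then parse as usual.
restored = json.loads(gzip.decompress(compressed))
assert restored["history"][0] == {"event": "tap", "screen": "main"}
```

Text-heavy, repetitive JSON tends to compress very well; already-compressed binary blobs (images, audio) will see little or no benefit.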

But I'm not sure how that affects iPhone performance.

– Vivin Paliath

That's certainly a possibility. Whether it would help depends on whether the bottleneck is ultimately the network transfer or the decoding on the phone. – Amagrammer

My first guess for the bottleneck would be the decompression. As long as you're not doing it too often, it shouldn't be too bad. – Vivin Paliath
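One way to sanity-check that guess is to time the decompression step in isolation (desktop Python shown as an illustration; timings on the actual device will differ):

```python
import gzip
import json
import time

# A payload in the megabyte range once serialized, as a stand-in.
payload = json.dumps({"history": [{"event": "tap"}] * 20000}).encode("utf-8")
compressed = gzip.compress(payload)

start = time.perf_counter()
restored = gzip.decompress(compressed)
elapsed = time.perf_counter() - start
print(f"decompressed {len(payload)} bytes in {elapsed:.4f}s")
```

If the measured decompression time is small compared to the 3G transfer time saved by the smaller payload, compression is a net win.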