I have to transfer a big array from one server to another using a file. It's a multidimensional but quite simple array. Now I'm searching for the most efficient way to get this file into my application on the second server. So this question is about the file->array part, not the array->file part on the first server.
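
For context, here is a minimal sketch of the three loading approaches compared below. The file names are hypothetical, and each file is assumed to hold the same array written out in the matching format:

    <?php
    // include: data.php contains "<?php return array(...);"
    $array = include 'data.php';

    // unserialize: data.ser contains the output of serialize($array)
    $array = unserialize(file_get_contents('data.ser'));

    // json_decode: data.json contains the output of json_encode($array);
    // passing true as the second argument returns associative arrays, not objects
    $array = json_decode(file_get_contents('data.json'), true);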

Of course I did some benchmarks on the 3 alternatives that seemed most promising. My complete benchmark data:

time:

  • include: 0.338...
  • unserialize: 0.180...
  • json_decode: 0.134...

peak memory usage:

  • include: 384374.64
  • unserialize: 201377.28
  • json_decode: 219528.08

file size:

  • include: 3135 kB
  • unserialize: 3142 kB
  • json_decode: 1838 kB
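
For what it's worth, a minimal sketch of how numbers like these can be taken (assuming the data file already exists; this is an illustration, not the exact harness I used):

    <?php
    // Hypothetical harness: time one load and report peak memory.
    $start = microtime(true);
    $array = json_decode(file_get_contents('data.json'), true);
    $elapsed = microtime(true) - $start;
    printf("time: %.3f, peak memory: %d bytes\n", $elapsed, memory_get_peak_usage(true));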

I think json_decode is the way to go, because peak memory usage is my smallest concern, and even there JSON is quite good. But the speed and file size of JSON just rock. I would never have thought it would be that fast.

Any objections or other suggestions?

+1  A: 

Kudos to Jan for (1) actually trying out different methods and (2) sharing the results.

Some time ago, I was working on an AI system where the knowledge base was stored in a large array. I found it was an order of magnitude faster to rebuild the entire array from database records (local MySQL, approx. 30,000 rows) than to unserialize the array from a file.

(This also meant that I was later able to amend the code to selectively load only the relevant parts of the knowledge tree, which sped it up even more.)
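
In case it helps, a rough sketch of that approach; the table layout, column names, and connection details here are assumptions for illustration, not the original code:

    <?php
    // Hypothetical schema: knowledge(id, parent_id, payload).
    $pdo = new PDO('mysql:host=localhost;dbname=kb', 'user', 'pass');

    // Rebuild the whole array from rows instead of unserializing a file.
    $tree = array();
    foreach ($pdo->query('SELECT id, parent_id, payload FROM knowledge') as $row) {
        $tree[$row['parent_id']][$row['id']] = $row['payload'];
    }

    // Selective loading: fetch only one subtree when the full array isn't needed.
    $rootId = 1; // hypothetical node id
    $stmt = $pdo->prepare('SELECT id, payload FROM knowledge WHERE parent_id = ?');
    $stmt->execute(array($rootId));
    $subtree = $stmt->fetchAll(PDO::FETCH_KEY_PAIR);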

HTH

C.

symcbean