views: 118

answers: 1

My question is based on experience: for those of you who have tried loading datasets into the browser, what's a reasonable amount of data to load? My users have relatively big pipes, so I don't have to worry about modem users, but I do concern myself with processing times. I'm guessing my limit is somewhere in the 300-1024k range, but does anyone have a method, or a website that has benchmarked this, that's a little more definitive?
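
To be concrete about what I mean by processing time: this is a minimal sketch of the kind of measurement I'm after, assuming a modern browser with the fetch API; /api/dataset is just a placeholder for a real endpoint.

    // Rough timing harness: separates download time from JSON parse time.
    // "/api/dataset" is a placeholder -- point it at a real endpoint.
    async function timeLoad(url: string): Promise<void> {
      const t0 = performance.now();
      const response = await fetch(url);
      const text = await response.text();   // body fully downloaded
      const t1 = performance.now();
      const data = JSON.parse(text);        // parsed into objects
      const t2 = performance.now();
      console.log(
        `${text.length} bytes: download ${(t1 - t0).toFixed(1)} ms, ` +
        `parse ${(t2 - t1).toFixed(1)} ms, ` +
        `records: ${Array.isArray(data) ? data.length : "n/a"}`
      );
    }

    timeLoad("/api/dataset");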

I've run across this resource. It's from 2005, so I'd consider it out of date even though the general lesson seems to be pretty sound:

http://blogs.nitobi.com/dave/2005/09/29/javascript-benchmarking-iv-json-revisited/

I also came across this:

http://www.jamesward.com/census/

Is there anything else out there worth checking into?

+3  A: 

A typical JSON packet can (and should) be compressed by the web server using gzip, to approximately 10% of its initial size. So you're really looking at 30-100k. If those responses can be cached, it's even less of a problem.
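
You can sanity-check that ratio against your own data by gzipping a saved response yourself; a quick sketch in TypeScript on Node, where sample.json is a placeholder for one of your real payloads:

    import { gzipSync } from "zlib";
    import { readFileSync } from "fs";

    // Compare raw vs. gzipped size for a sample response body.
    // "sample.json" is a placeholder -- save one of your real responses to disk.
    const raw = readFileSync("sample.json");
    const zipped = gzipSync(raw);
    console.log(
      `raw: ${raw.length} bytes, gzipped: ${zipped.length} bytes ` +
      `(${((zipped.length / raw.length) * 100).toFixed(1)}%)`
    );

Highly repetitive data (long arrays of similar records) will compress better than the 10% figure; already-dense data will compress worse, so measure with a representative payload.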

The size of the transmission should not be the deciding factor in whether a packet is "too much". Instead, look at how long it will take the browser to process it (updating the UI, etc.).

Actually parsing the JSON should be very fast, even up to many megabytes of data. Turning the parsed data into something new in the UI will depend largely on how complicated the HTML you're producing is.
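
If you want to see where the time actually goes, time the two phases separately; a rough sketch, where the Row shape and the "results" container id are made-up placeholders:

    // Hypothetical record shape and container id, for illustration only.
    interface Row { id: number; name: string; }

    function renderRows(json: string): void {
      const t0 = performance.now();
      const rows: Row[] = JSON.parse(json);   // phase 1: parse
      const t1 = performance.now();

      // Phase 2: build one HTML string and assign it once; this is usually
      // far cheaper than appending DOM nodes one at a time in a loop.
      const html = rows.map(r => `<li>${r.name}</li>`).join("");
      document.getElementById("results")!.innerHTML = `<ul>${html}</ul>`;
      const t2 = performance.now();

      console.log(
        `parse: ${(t1 - t0).toFixed(1)} ms, render: ${(t2 - t1).toFixed(1)} ms`
      );
    }

In most cases the render phase dominates, which is why the complexity of the generated HTML matters more than the raw packet size.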

Chase Seibert