
I'm writing a Google Maps page with about 160 records, one for each location. The original map data is held in a SQL table. As there is quite a lot of text, this could be a maximum of 900 characters per record.

I filter this data by month, so there may be 20-30 records shown as markers on the map. I want the user to quickly switch between months, so I want to keep the data local (not run a SQL query each time).

Is reading all this data from SQL into PHP arrays (say a max of 150 KB) a good idea (i.e. sitting in RAM), or does it not really matter?

Is JSON any better? (I have never tried this before.)

A: 

Loading data into memory only lasts for a single request: 10 requests = 10x the data loaded. If you have a lot of data, and it looks like you do, it is a BAD idea to load everything into memory on every request. You will be wasting memory and putting load on the database (especially if many users use it at the same time).

You are better off using AJAX to request only the data you need and caching it (storing it in a hash table) on the user's side. You will get a big performance gain. If you are new to AJAX and JSON, look into a JavaScript framework such as MooTools or jQuery. Both of them are fantastic frameworks; I personally prefer MooTools ;-).
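For illustration, here is a minimal sketch of that approach. The markers.php endpoint, the locations table and its columns, and the drawMarkers() function are all assumed names for this example, not anything from the question:

    <?php
    // markers.php -- returns the markers for one month as JSON (sketch only).
    $month = (int) $_GET['month'];
    $pdo   = new PDO('mysql:host=localhost;dbname=maps', 'user', 'pass');
    $stmt  = $pdo->prepare('SELECT lat, lng, title, description
                              FROM locations WHERE month = ?');
    $stmt->execute(array($month));
    header('Content-Type: application/json');
    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

On the client (jQuery shown here), each month is downloaded once and then served from a local cache, so switching back to a month you have already viewed is instant:

    // Keep the already-downloaded months in a plain object (hash table).
    var cache = {};

    function showMonth(month) {
        if (cache[month]) {        // already fetched: no request at all
            drawMarkers(cache[month]);
            return;
        }
        $.getJSON('markers.php', { month: month }, function (rows) {
            cache[month] = rows;   // remember it for instant switching later
            drawMarkers(rows);     // drawMarkers() would create the google.maps.Marker objects
        });
    }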

However, if you still want to do it your way, it would have to be done like this:

  1. Load the data from the database into an array of objects ($data)
  2. Encode the data: json_encode($data)
  3. Store the encoded data in a hidden input element on the page
  4. On the page, use JavaScript to read the stored data from the hidden input, decode it, and do stuff with it...

This way the load goes to the user's browser.
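A rough sketch of those four steps, again with assumed table and column names:

    <?php
    // Step 1: load everything from the database into an array.
    $pdo  = new PDO('mysql:host=localhost;dbname=maps', 'user', 'pass');
    $rows = $pdo->query('SELECT month, lat, lng, title, description FROM locations')
                ->fetchAll(PDO::FETCH_ASSOC);
    // Step 2: encode it as JSON.
    $json = json_encode($rows);
    ?>
    <!-- Step 3: store the encoded data in a hidden input on the page. -->
    <input type="hidden" id="map-data"
           value="<?php echo htmlspecialchars($json, ENT_QUOTES); ?>">

    <script>
    // Step 4: read it back, decode it, and filter by month in the browser.
    var all = JSON.parse(document.getElementById('map-data').value);
    function markersForMonth(month) {
        return all.filter(function (row) { return row.month == month; });
    }
    </script>

Note that all ~160 rows still travel to the browser on every page load, which is why the AJAX approach above is usually the better choice.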

Alex