Hello,

My application gets configured via a large number of key/value pairs (say 30,000, for instance).

I want to find the best deployment method for these configuration values, knowing that I want to avoid `define()`s so runtime re-configuration remains possible.

I have thought of

  • pre-compiling them into an array in a PHP file
  • pre-compiling them into an SQLite database on tmpfs
  • pre-loading them into memcached

What are my options for

  • the best random access time to these configurations (memory is not an issue)?
  • the best structured access time, if I can break these configurations up into families like (network, i18n, …)?

Thanks, Jerome

A: 

How about in a database?

With a schema like this:

| key | value |

You could have data like this:

| currency | pound |
| timezone | GMT   |

It also means you can query it like this:

SELECT * FROM options WHERE key = 'timezone'

Or even return many options:

SELECT * FROM options WHERE key IN ('timezone','currency')

This can be in any kind of database, including an SQLite database. If you're using a language such as PHP, you can use ADOdb for database abstraction so your application is portable between different database types — just a thought, in case you were concerned about being tied down to one database.
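A minimal sketch of this approach using PDO with SQLite (the table and column names are taken from the schema above; the file path is a placeholder):

```php
<?php
// Sketch: fetching options from an SQLite database via PDO.
$db = new PDO('sqlite:/dev/shm/options.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Fetch a single option.
$stmt = $db->prepare('SELECT value FROM options WHERE key = ?');
$stmt->execute(['timezone']);
$timezone = $stmt->fetchColumn();

// Fetch several options in one round trip, keyed by option name.
$stmt = $db->prepare('SELECT key, value FROM options WHERE key IN (?, ?)');
$stmt->execute(['timezone', 'currency']);
$options = $stmt->fetchAll(PDO::FETCH_KEY_PAIR);
```

`PDO::FETCH_KEY_PAIR` turns a two-column result set directly into a `key => value` array, which is a convenient shape for a configuration lookup.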

ILMV
The problem with a simple database setup is the access time. For each key or group of keys needed in a specific PHP code path, you need a query. Even with a cache, that would probably mean hundreds of database fetches.
Jerome WAGNER
I don't know if I'm misunderstanding your point, but are you saying you would have repeating code? Because you can easily use functions or even classes as a wrapper and fetch the options that way, for instance `Options::getOption('timezone')`.
ILMV
No. There is an in-process cache for already-fetched keys, but depending on the page being rendered, the keys may be different.
Jerome WAGNER
+2  A: 

Well, if memory isn't an issue, just serialize an array to a file. There is absolutely no faster solution than this. You don't have the I/O and library overhead of SQLite, and you don't have the network overhead of memcached.

But keep in mind, memory must really not be an issue. You'd be loading the entire 30,000 element array into memory at once, as opposed to using a database, where you could load them on an as-needed basis.

To structure the settings, you could put each family in its own file.
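A sketch of the serialize-to-file approach, split by family (file paths and keys here are made-up examples):

```php
<?php
// Deploy time: dump each configuration family to its own file.
$network = ['proxy' => '10.0.0.1', 'timeout' => 30]; // example data
file_put_contents('/dev/shm/config.network.ser', serialize($network));

// Request time: load only the family the current code path needs.
$network = unserialize(file_get_contents('/dev/shm/config.network.ser'));
echo $network['timeout']; // 30
```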

But really, you should be using a database. That's what they're there for. I really question why you would need to worry about 30k settings — you might want to reconsider your application design.

ryeguy
Random Access Memory always provides the best random access in my experience.
webbiedave
@webbiedave: can't argue with that statement.
ryeguy
By "serialize an array to a file", do you mean creating the array in a PHP file, or using `unserialize()` on a pre-serialized array? Does that make a performance difference?
Jerome WAGNER
Just use `serialize()`. You're worrying too much about performance. Yeah, I guess you could generate a `.php` file that contains an array, but then you'd have to manually loop through your massive array and output each key/value. You'd have to check the datatype to make sure it's quoted and escaped properly. You'd have to handle nested arrays correctly. Or, you could just use `serialize()`. That's what it's there for.
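For completeness, PHP's built-in `var_export()` can generate such a `.php` array file without hand-rolled escaping; a sketch (the path is a placeholder):

```php
<?php
$config = ['timezone' => 'GMT', 'currency' => 'pound']; // example data

// var_export() emits valid PHP source for scalars and nested arrays alike.
file_put_contents(
    '/dev/shm/config.php',
    '<?php return ' . var_export($config, true) . ';'
);

// Later: include returns the array directly, and an opcode cache
// can keep the compiled file in memory between requests.
$config = include '/dev/shm/config.php';
```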
ryeguy
And do you really need access to 30,000 settings at once?
ryeguy
No, but the keys accessed depend on what path the user's request takes through the code.
Jerome WAGNER
Typically, how many settings do you need to load per request? Couldn't you just figure out what settings they need and load them all at once at the end of the request using an `IN` clause?
ryeguy
A: 

If you want structured access, I'd say INI files, using the `parse_ini_file()` function.
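A sketch, assuming the families from the question map to INI sections (file name and keys are made up):

```php
<?php
/* settings.ini (example content):
[network]
proxy = "10.0.0.1"

[i18n]
locale = "fr_FR"
*/

// Passing true as the second argument groups values by section,
// which matches the "families" idea (network, i18n, ...).
$config = parse_ini_file('settings.ini', true);
echo $config['i18n']['locale'];
```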

Arkh