Okay, here's one for the pros:

For a couple of years now, I've been working on my own PHP ORM/ActiveRecord implementation that I named Pork.dbObject.

It's loosely based on the 'make your own site with Rails in 5 minutes' movie we all saw a couple of years ago. You can do things like:

$clients = dbObject::Search("Client", array("ID > 500"));

or

$client = new Client(218); // fetch row with id 218 from client table

or

$projects = $client->Find('Project');

This will fetch one or more rows from the database, wrap them in a dbObject and return them in one array, or return false if there are no results.

All of this has been working perfectly in dozens of sites and backends, but now my colleague is using it to build a huge log parser, and that's where the memory usage problems start.

The queries he runs can return over 20,000 rows, maybe even more, which is of course not a good thing to wrap in object wrappers all at once and return as a single array.

The obvious solution would be to return an object that implements the Iterator interface instead of an array. It shouldn't instantly fetch all the records from the result set, but just hold the result resource for the generated database query and call mysql_fetch_* internally as you traverse the object as if it were an array.
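A minimal sketch of what such a wrapper could look like (the class name is hypothetical, and it assumes the old mysql_* extension and a buffered result, since mysql_data_seek() requires one):

class LazyResultIterator implements Iterator
{
    private $result;   // raw MySQL result resource
    private $row;      // row at the current position, or false when exhausted
    private $key = 0;

    public function __construct($result)
    {
        $this->result = $result;
    }

    public function rewind()
    {
        if (mysql_num_rows($this->result) > 0) {
            mysql_data_seek($this->result, 0); // only valid on buffered results
        }
        $this->row = mysql_fetch_assoc($this->result);
        $this->key = 0;
    }

    public function valid()
    {
        return $this->row !== false;
    }

    public function current()
    {
        return $this->row; // a dbObject could be wrapped around the row here
    }

    public function key()
    {
        return $this->key;
    }

    public function next()
    {
        $this->row = mysql_fetch_assoc($this->result);
        $this->key++;
    }
}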

Now we get to my real question: can I just do this without any problems? Can databases handle multiple open result sets, mixing them and keeping them in memory for a while?

For example: fetch 20 objects, loop over them, let each of these 20 fetch 5 others, which in turn each fetch 3 others. This would create a nested loop where a number of different result handles are kept in memory at the same time, as the sketch below illustrates.
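In code, that nesting concern would look roughly like this (the 'Task' relation is made up for illustration, and Search()/Find() are assumed to return the lazy iterator described above):

$clients = dbObject::Search("Client", array("ID > 500")); // result handle 1 stays open
foreach ($clients as $client) {
    $projects = $client->Find('Project');                 // result handle 2, while 1 is still open
    foreach ($projects as $project) {
        $tasks = $project->Find('Task');                  // result handle 3, while 1 and 2 are still open
        foreach ($tasks as $task) {
            // three result sets are open at the same time here
        }
    }
}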

I know I can't serialize one of these objects, but will I be able to implement this without any problems in PHP 5, or will the database interfaces give me trouble?

+2  A: 

It depends on which database you're using and your database configuration.

For MySQL you need to make sure you use buffered queries. In PDO you set it like this:

$myPdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, true);

This means that all the data will be sent to the client (not the same as fetching it all in PHP).

The other (probably worse) alternative is to open a new database connection whenever you detect a query being run with a result set still open.

The normal mysql_query() uses a buffered query, so that will work with multiple result sets.
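For example, a rough sketch of what this could look like with PDO (the DSN, credentials and the clientId column are placeholders; the other names follow the examples in the question):

$pdo = new PDO('mysql:host=localhost;dbname=example', 'user', 'pass'); // placeholder connection details
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, true);

// PDOStatement is Traversable, so rows can be pulled from the
// buffered result set one at a time instead of fetchAll()'ing them.
$clients = $pdo->query('SELECT * FROM Client WHERE ID > 500');
foreach ($clients as $row) {
    // a second query can run while $clients is still open, because
    // its result set has already been buffered on the client side
    $projects = $pdo->query('SELECT * FROM Project WHERE clientId = ' . (int)$row['ID']);
    foreach ($projects as $project) {
        // wrap rows in dbObjects here, one at a time
    }
}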

Greg
Thanks, this is exactly what I'm looking for. I'm rewriting everything to use PDO now.
SchizoDuckie