views:

644

answers:

7

I have an application that creates and destroys thousands of objects. Is it worth caching and reusing objects, or is Delphi's memory manager fast enough that creating and destroying objects repeatedly is not that great an overhead (as opposed to the cost of keeping track of a cache)? When I say worth it, I'm of course looking for a performance boost.
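For context, the "cache" being weighed here is usually an object pool. A minimal, non-thread-safe sketch of one, where `TMyObject` and its `Reset` method are hypothetical stand-ins for the real class:

```pascal
unit ObjectPool;

interface

uses
  Contnrs; // TObjectList

type
  // Hypothetical stand-in for the class actually being pooled.
  TMyObject = class
  public
    procedure Reset; // return the instance to a "freshly created" state
  end;

  TMyObjectPool = class
  private
    FFree: TObjectList; // holds instances waiting to be reused
  public
    constructor Create;
    destructor Destroy; override;
    function Acquire: TMyObject;       // reuse a cached instance or create one
    procedure Release(Obj: TMyObject); // hand an instance back to the pool
  end;

implementation

procedure TMyObject.Reset;
begin
  // clear any per-use state here
end;

constructor TMyObjectPool.Create;
begin
  inherited Create;
  FFree := TObjectList.Create(True); // True: pool frees leftover objects
end;

destructor TMyObjectPool.Destroy;
begin
  FFree.Free; // frees any instances still in the pool
  inherited;
end;

function TMyObjectPool.Acquire: TMyObject;
begin
  if FFree.Count > 0 then
  begin
    // Extract removes the object without freeing it (unlike Remove).
    Result := TMyObject(FFree.Extract(FFree.Last));
    Result.Reset;
  end
  else
    Result := TMyObject.Create;
end;

procedure TMyObjectPool.Release(Obj: TMyObject);
begin
  FFree.Add(Obj);
end;

end.
```

Even this tiny sketch shows the hidden costs the answers below point at: every `Acquire`/`Release` pair does list bookkeeping, and `Reset` has to undo whatever state the previous use left behind.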

+4  A: 

Only a profiler will tell you. Try both approaches in a tight loop and see what comes out on top :-)
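One possible shape for such a tight-loop test, as a console program; `TMyObject` and the iteration count are placeholders, and `GetTickCount` is coarse but available in every Delphi version:

```pascal
program BenchCreateFree;
{$APPTYPE CONSOLE}

uses
  Windows, SysUtils;

type
  // Hypothetical stand-in for whatever class you actually allocate.
  TMyObject = class
    Data: array[0..63] of Byte;
  end;

const
  Iterations = 1000000;

var
  StartTick: Cardinal;
  i: Integer;
  Obj: TMyObject;
begin
  // Approach 1: create and destroy on every iteration.
  StartTick := GetTickCount;
  for i := 1 to Iterations do
  begin
    Obj := TMyObject.Create;
    Obj.Free;
  end;
  Writeln(Format('Create/Free: %d ms', [GetTickCount - StartTick]));

  // Approach 2: create once and reuse, simulating a perfect cache hit.
  // FillChar stands in for whatever state reset a real cache would need.
  StartTick := GetTickCount;
  Obj := TMyObject.Create;
  try
    for i := 1 to Iterations do
      FillChar(Obj.Data, SizeOf(Obj.Data), 0);
  finally
    Obj.Free;
  end;
  Writeln(Format('Reuse:       %d ms', [GetTickCount - StartTick]));
end.
```

As the other answers stress, a loop like this only bounds the allocator cost in isolation; the numbers can shift in a real application.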

Orion Edwards
A: 

I think this depends on the code your objects execute during create and destroy. The overhead of TObject.Create and TObject.Destroy themselves is normally negligible and may easily be outweighed by the caching overhead.

You should also consider that the state of a reused object may differ from that of a freshly created one.

Uwe Raabe
A: 

Often the only way to tell is to try it.

If current performance is adequate, then you don't have much reason to try to increase it. However, if you have performance issues, then some caching (or indeed some other strategy) may help.

Brody
A: 

You will also need some stats on how often a specific object (instance) is being used. If you're referencing the same set of data regularly, then caching may really improve performance, but if the accesses are distributed across all the possible objects, then your cache miss rate might be too high for it to be worthwhile.

Dana the Sane
+14  A: 

From recent testing: if object creation is not expensive (i.e. it doesn't depend on external resources - accessing files, the registry, a database ...) then you'll have a hard time beating Delphi's memory manager. It is that fast.

That of course holds if you're using a recent Delphi - if not, get FastMM4 from SourceForge and use it instead of Delphi's internal MM.

gabr
+12  A: 

Memory allocation is only a small part of why you would want to cache. You need to know the full cost of constructing a semantically valid object and compare it with the cost of retrieving items from the cache, and not just in a micro-benchmark: cache effects (the CPU cache, that is) may change the runtime dynamics of a real, live application.

Or to put it another way, measure it and find out. If you're not measuring, you're not engineering, just guessing.

Barry Kelly
I know you are right. I was just wondering whether it was even worth benchmarking this area, or whether I should concentrate my efforts elsewhere. Good job with the new features in the D2009 compiler, btw!
Steve
+2  A: 

You absolutely have to measure with real-world loads to answer questions like this. Depending on what resources are held in those objects, any resource contention, construction cost, size, etc., the answer may surprise you, and may even change depending on the nature of the load.

It is usually very difficult to determine where your performance issues will be without measuring.

Brent Rockwood
I hear what you are saying. I just wanted to find out whether this area was even worth profiling!
Steve