Given a data structure (e.g. a hash of hashes), what's the clean/recommended way to make a deep copy for immediate use? Assume reasonable cases, where the data's not particularly large, no complicated cycles exist, and readability/maintainability/etc. are more important than speed at all costs.

I know I can use Storable, Clone, Clone::More, Clone::Fast, Data::Dumper, etc. What's the best practice in 2008?

+9  A: 

My impression is that Storable::dclone() is somewhat canonical.
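A minimal sketch of what that looks like in practice (the data structure here is illustrative):

```perl
use strict;
use warnings;
use Storable qw(dclone);

my $original = {
    name  => 'example',
    list  => [ 1, 2, 3 ],
    inner => { deep => 'value' },
};

# dclone() takes a reference and returns a reference to a deep copy.
my $copy = dclone($original);

# Mutating the copy leaves the original untouched.
$copy->{inner}{deep} = 'changed';
push @{ $copy->{list} }, 4;

print $original->{inner}{deep}, "\n";       # value
print scalar @{ $original->{list} }, "\n";  # 3
```

Storable ships with core Perl, so this needs no extra installation.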

chaos
+4  A: 

Clone is probably what you want for that. At least, that's what all the code I've seen uses.
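Usage is nearly identical to dclone() — a sketch with made-up data:

```perl
use strict;
use warnings;
use Clone qw(clone);

my $config = {
    hosts => [ 'alpha', 'beta' ],
    opts  => { retries => 3 },
};

# clone() returns a deep copy of the referenced structure.
my $copy = clone($config);
$copy->{opts}{retries} = 10;

print $config->{opts}{retries}, "\n";  # 3
```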

Dan
+6  A: 

Clone is much faster than Storable::dclone, but the latter supports more data types.

Clone::Fast and Clone::More are pretty much equivalent if memory serves me right, but less feature-complete than even Clone, and Scalar::Util::Clone supports even less but IIRC is the fastest of them all for some structures.

With respect to readability these should all work the same; they are virtually interchangeable.

If you have no specific performance needs I would just use Storable's dclone.

I wouldn't use Data::Dumper for this simply because it's so cumbersome and roundabout. It's probably going to be very slow too.
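For illustration of why it's roundabout, the Data::Dumper approach amounts to serializing the structure to Perl source and eval'ing it back (a sketch only — string eval makes this slower and riskier than dclone or clone):

```perl
use strict;
use warnings;
use Data::Dumper;

# Terse makes Dumper emit just the structure, without '$VAR1 = '.
local $Data::Dumper::Terse = 1;

my $original = { a => [ 1, 2 ], b => { c => 3 } };

# Round-trip: structure -> Perl source string -> new structure.
my $copy = eval Dumper($original);

$copy->{b}{c} = 99;
print $original->{b}{c}, "\n";  # 3
```

Note this sketch doesn't handle shared or self-referential substructures; that needs $Data::Dumper::Purity and more ceremony, which is part of the cumbersomeness.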

For what it's worth, if you ever want customizable cloning then Data::Visitor provides hooking capabilities and fairly feature complete deep cloning is the default behavior.

nothingmuch