We have an application that compares data objects to determine whether one version of an object differs from another. The application also caches these objects extensively, and we've run into a performance issue with these comparisons.
Here's the workflow:
- Data item 1 is the current item in memory. It was originally retrieved from the cache and deep cloned (including all sub-objects such as Dictionaries, etc.). Data item 1 is then edited and its properties are modified.
- We then compare this object against the original version stored in the cache. Since Data item 1 was cloned and its properties were changed, the two objects should be different (a rough sketch of this flow follows the list).
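In rough pseudocode (DataItem, cache, Settings, and Equals are placeholder names for illustration, not our actual API), the sequence looks like this:

// Placeholder names, illustration only.
DataItem cached = cache.Get(itemKey);          // original stays in the cache
var working = (DataItem)cached.Clone();        // deep clone into memory
working.Settings["Theme"] = "dark";            // user edits the working copy
bool hasChanges = !working.Equals(cached);     // compare the edited copy to the cached original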
There are a couple of issues here.
The main issue is that our deep clone method is very expensive. We profiled it against a shallow clone and it was 10x slower, which is far too slow for us. Here's our deep clone method:
public object Clone()
{
    using (var memStream = new MemoryStream())
    {
        // Serialize the entire object graph to an in-memory stream, then deserialize it
        // back to get a fully independent copy.
        var binaryFormatter = new BinaryFormatter(null, new StreamingContext(StreamingContextStates.Clone));
        binaryFormatter.Serialize(memStream, this);
        memStream.Seek(0, SeekOrigin.Begin);
        return binaryFormatter.Deserialize(memStream);
    }
}
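To show the behaviour we expect from the deep clone, here is a small, self-contained repro. DataItem and its Settings dictionary are simplified stand-ins for our real objects (which are far bigger), and every type in the graph has to be marked [Serializable] for this approach to work:

using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

// Simplified stand-in for our real data object.
[Serializable]
public class DataItem
{
    public Dictionary<string, string> Settings { get; set; } = new Dictionary<string, string>();

    public object Clone()
    {
        using (var memStream = new MemoryStream())
        {
            var binaryFormatter = new BinaryFormatter(null, new StreamingContext(StreamingContextStates.Clone));
            binaryFormatter.Serialize(memStream, this);
            memStream.Seek(0, SeekOrigin.Begin);
            return binaryFormatter.Deserialize(memStream);
        }
    }
}

public static class DeepCloneDemo
{
    public static void Main()
    {
        var original = new DataItem();
        original.Settings["Theme"] = "light";

        var copy = (DataItem)original.Clone();
        copy.Settings["Theme"] = "dark";   // only touches the copy's own Dictionary

        Console.WriteLine(original.Settings["Theme"]);                          // "light" - the graph was fully duplicated
        Console.WriteLine(copy.Settings["Theme"]);                              // "dark"
        Console.WriteLine(ReferenceEquals(original.Settings, copy.Settings));   // False - separate Dictionary instances
    }
}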
We were initially using the following to clone:
public object Clone()
{
    return this.MemberwiseClone();
}
This was much more performant, but because it performs a shallow clone, the complex objects held in this object's properties, such as Dictionaries, were not copied. The clone still held the same references as the object in the cache, so edits to those properties showed up in both objects and the comparison saw identical values.
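To illustrate, if the DataItem from the repro above exposes MemberwiseClone through a hypothetical ShallowClone() method instead, the shared reference is easy to see:

// Hypothetical ShallowClone() that just returns this.MemberwiseClone().
var copy = (DataItem)original.ShallowClone();
copy.Settings["Theme"] = "dark";

Console.WriteLine(original.Settings["Theme"]);                          // also "dark" - the edit leaked into the cached object
Console.WriteLine(ReferenceEquals(original.Settings, copy.Settings));   // True - same Dictionary instance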
So, does anyone have an efficient way of deep cloning C# objects that covers the entire object graph?