In the example below I intern the string in the constructor, which is fine. However, when I deserialise the object with the binary formatter I don't think the string will be interned, as the constructor shouldn't be called. How should I ensure the _name string is interned? ... or will it be interned OK?

Edit: So it seems to work (the strings are interned correctly) even without handling the OnDeserializedAttribute. How does it do that?

I'm checking with a memory profiler, and with or without the method below the strings still end up interned. Magic? :-/

    [OnDeserializedAttribute]
    private void OnDeserialized(StreamingContext context)
    {
        // Runs after the formatter has populated the fields,
        // so _name can be re-interned here.
        _name = string.Intern(_name);
    }

Thanks

using System;
using System.Runtime.Serialization;

[Serializable]
class City
{
    private string _name; // not readonly: the OnDeserialized callback above reassigns it

    public City(string t)
    {
        _name = string.Intern(t);
    }

    public string Name
    {
        get { return _name; }
    }

    public override string ToString()
    {
        return _name;
    }
}
+1  A: 

This is possible if you implement the ISerializable interface (not the attribute): it lets you take over deserialization via the serialization constructor and intern the string there.

But it seems very unnecessary. Are you sure you are accomplishing anything with this?

Henk Holterman
Yes, saving memory.
DayOne
That is a micro-optimization... you must have a really good reason, like working on an embedded system, for string interning to be worth it.
Dykam
I have a good reason, cheers.
DayOne
A: 

Look at OnDeserializedAttribute: a method marked with it runs after the object has been deserialized, which is the right place to re-intern the string (as in the snippet in the question).

maciejkow