I have a class Bar which contains a List<Foo>, with both Foo and Bar implementing ISerializable.

When deserializing a Bar, the List<Foo> is initially filled with (the correct number of) nulls; then on exiting the Bar deserialization ctor, each Foo's deserialization ctor is called, filling the List<Foo> with the (correctly deserialized) Foos.

Why is this happening? I can't replicate it in a test project: whatever I have tried has resulted in the Foo deserialization ctors being called before the Bar ctor. This is actually the behaviour I would like, as I need the list to be filled in order to do some initialization for the deserialized Bar!
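(For reference, a minimal sketch of the setup being described, assuming BinaryFormatter-style serialization; the concrete fields, such as the int payload on Foo, are illustrative assumptions rather than the real code:)

    using System;
    using System.Collections.Generic;
    using System.Runtime.Serialization;

    [Serializable]
    class Foo : ISerializable
    {
        public int Value;

        public Foo(int value) { Value = value; }

        // Deserialization ctor: in the behaviour described above, this may run
        // only *after* Bar's deserialization ctor has already finished.
        protected Foo(SerializationInfo info, StreamingContext context)
        {
            Value = info.GetInt32("Value");
        }

        public void GetObjectData(SerializationInfo info, StreamingContext context)
        {
            info.AddValue("Value", Value);
        }
    }

    [Serializable]
    class Bar : ISerializable
    {
        public List<Foo> Foos = new List<Foo>();

        public Bar() { }

        protected Bar(SerializationInfo info, StreamingContext context)
        {
            // The list may already have the right number of entries here,
            // but the entries themselves can still be null placeholders.
            Foos = (List<Foo>)info.GetValue("Foos", typeof(List<Foo>));
        }

        public void GetObjectData(SerializationInfo info, StreamingContext context)
        {
            info.AddValue("Foos", Foos);
        }
    }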

Anyone have an idea as to what could be causing the Foos to be deserialized so late? Thanks!

+2  A: 

This is by design. The deserializer works object by object and then follows references. So it first sets up the List with the correct number of slots, all of which are initially null.

Then it deserializes the objects one by one and patches each into the proper reference.

Any checking or initialization logic of yours should ONLY run AFTER deserialization has completed; by definition, the graph is in a partial/invalid state while the deserializer is running.
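(If the initialization can be deferred, one standard mechanism for "run only after the whole graph is restored" is IDeserializationCallback, whose OnDeserialization method the formatter calls once the entire object graph has been deserialized. A sketch, reusing the Foo/Bar from above; the InitializeFromFoos helper is hypothetical:)

    using System;
    using System.Collections.Generic;
    using System.Runtime.Serialization;

    [Serializable]
    class Bar : ISerializable, IDeserializationCallback
    {
        public List<Foo> Foos = new List<Foo>();

        protected Bar(SerializationInfo info, StreamingContext context)
        {
            Foos = (List<Foo>)info.GetValue("Foos", typeof(List<Foo>));
            // Do NOT inspect the Foos here; they may still be null placeholders.
        }

        public void GetObjectData(SerializationInfo info, StreamingContext context)
        {
            info.AddValue("Foos", Foos);
        }

        // Called by the formatter only after the whole graph is complete,
        // so the Foos are fully populated by this point.
        public void OnDeserialization(object sender)
        {
            InitializeFromFoos();
        }

        private void InitializeFromFoos()
        {
            // Hypothetical initialization that needs the fully-deserialized list.
            foreach (var foo in Foos) { /* ... */ }
        }
    }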

As for why it happens so late: probably your test scenario is much simpler than the real data, so something on the production side causes the serializer to reorder the work.

TomTom
That does not make sense to me. Either the serializer is going through the objects one at a time, in which case the test case would also behave like Bar, or it is looking at the object graph and starting at the bottom, in which case Bar should work differently.
Joel in Gö
And in neither case does it do what I would expect, which is start with Bar; then when it reaches the list of Foos, deserialize each Foo; then carry on with Bar.
Joel in Gö