I have a class Bar which contains a List<Foo>, with both Foo and Bar implementing ISerializable. When deserializing a Bar, the List<Foo> is initially filled with (the correct number of) nulls; then, on exiting the Bar deserialization ctor, each Foo's deserialization ctor is called, filling the List<Foo> with the (correctly deserialized) Foo instances.
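For reference, a stripped-down sketch of the setup (field names and details are placeholders; the real classes serialize more state):

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;

[Serializable]
public class Foo : ISerializable
{
    public int Value;

    public Foo(int value) { Value = value; }

    // Deserialization ctor
    protected Foo(SerializationInfo info, StreamingContext context)
    {
        Value = info.GetInt32("Value");
    }

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("Value", Value);
    }
}

[Serializable]
public class Bar : ISerializable
{
    public List<Foo> Foos;

    public Bar() { Foos = new List<Foo>(); }

    // Deserialization ctor: this is where I see the problem --
    // the list has the right Count, but every element is still null.
    protected Bar(SerializationInfo info, StreamingContext context)
    {
        Foos = (List<Foo>)info.GetValue("Foos", typeof(List<Foo>));
        // Foos[i] is null here; the Foo deserialization ctors
        // only run after this ctor returns.
    }

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("Foos", Foos);
    }
}
```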
Why is this happening? I can't replicate it in a test project: whatever I have tried has resulted in the Foo deserialization ctors being called before the Bar ctor. This is actually the behaviour I would like, as I need the list to be filled in order to do some initialization for the deserialized Bar!
Anyone have an idea as to what could be causing the Foo instances to be deserialized so late? Thanks!