Let's say I have classA, which contains a classB instance, and both are marked [Serializable].
I assumed that on deserialization classB would be deserialized first.
This is not the case, however, as I confirmed by logging when each [OnDeserialized] method was hit.
Now I have the following issue:
After classA's deserialization is complete, it is supposed to set itself up using values from classB. Unfortunately, classB has not been deserialized yet at that point, so classA is set up incorrectly.
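A minimal sketch of the situation (the class and member names here are placeholders for illustration, not my actual code):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
class ClassB
{
    public int Value = 42;

    [OnDeserialized]
    private void AfterDeserialize(StreamingContext ctx)
    {
        Console.WriteLine("ClassB [OnDeserialized]");
    }
}

[Serializable]
class ClassA
{
    public ClassB Child = new ClassB();

    [NonSerialized]
    public int Derived;

    [OnDeserialized]
    private void AfterDeserialize(StreamingContext ctx)
    {
        // This fires BEFORE ClassB's callback, so Child may not be
        // fully restored yet and Derived ends up wrong.
        Console.WriteLine("ClassA [OnDeserialized]");
        Derived = Child.Value * 2;
    }
}

class Program
{
    static void Main()
    {
        var formatter = new BinaryFormatter();
        using var stream = new MemoryStream();
        formatter.Serialize(stream, new ClassA());
        stream.Position = 0;
        var roundTripped = (ClassA)formatter.Deserialize(stream);
    }
}
```

Running this logs ClassA's callback before ClassB's, which is exactly the ordering problem described above.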
My problem would be solved if I could force the BinaryFormatter to deserialize classB before classA, or to resolve the object graph bottom-up instead of top-down.
Another obvious solution would be to have classB fire an event when it is deserialized and have classA set itself up in response, but I want to stay away from this inelegant workaround.
So I would appreciate it if somebody knows of a better solution.