I'm running into problems serializing lots of objects in .NET. The object graph is pretty big with some of the new data sets we're using, so I'm getting:
System.Runtime.Serialization.SerializationException
"The internal array cannot expand to greater than Int32.MaxValue elements."
Has anyone else hit this limit? How have you solved it?
I'd like to keep using the built-in serialization mechanism if possible, but it seems like I'll have to roll my own (and maintain backwards compatibility with the existing data files).
The objects are all POCOs and are being serialized using BinaryFormatter. Each object being serialized implements ISerializable to selectively serialize its members (some of them are recalculated during loading).
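For reference, each class follows a pattern roughly like this (the class name and the recalculated member are invented for illustration; the real types are domain POCOs):

    using System;
    using System.Runtime.Serialization;

    // Hypothetical stand-in for one of the real POCOs; the name and the
    // recalculated member are made up, but the pattern is the same.
    [Serializable]
    public class DataPoint : ISerializable
    {
        public double Raw { get; private set; }
        public double Derived { get; private set; }   // rebuilt on load, never written

        public DataPoint(double raw)
        {
            Raw = raw;
            Derived = Recalculate(raw);
        }

        // Deserialization constructor called by BinaryFormatter.
        protected DataPoint(SerializationInfo info, StreamingContext context)
        {
            Raw = info.GetDouble("Raw");
            Derived = Recalculate(Raw);               // recalculated instead of read back
        }

        public void GetObjectData(SerializationInfo info, StreamingContext context)
        {
            info.AddValue("Raw", Raw);                // Derived is deliberately skipped
        }

        private static double Recalculate(double raw)
        {
            return raw * raw;                         // placeholder for the real calculation
        }
    }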
It looks like this is an open issue for MS (details here), but it's been resolved as Won't Fix. The details (from the link) are:
Binary serialization fails for object graphs with more than ~13.2 million objects. The attempt to do so causes an exception in ObjectIDGenerator.Rehash with a misleading error message referencing Int32.MaxValue.
Upon examination of ObjectIDGenerator.cs in the SSCLI source code, it appears that larger object graphs could be handled by adding additional entries into the sizes array. See the following lines:
    // Table of prime numbers to use as hash table sizes. Each entry is the
    // smallest prime number larger than twice the previous entry.
    private static readonly int[] sizes = {5, 11, 29, 47, 97, 197, 397, 797,
        1597, 3203, 6421, 12853, 25717, 51437, 102877, 205759, 411527,
        823117, 1646237, 3292489, 6584983};
However, it would be nice if serialization worked for any reasonable size of the object graph.
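In case it helps, here's a rough sketch of the kind of repro I'd expect to hit the same exception without needing my actual data files (the Node type and the count are made up; the graph just needs more distinct tracked objects than the prime table above can handle):

    using System;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    // Hypothetical minimal object type; any distinct reference type will do,
    // since the formatter assigns every instance an ID via ObjectIDGenerator.
    [Serializable]
    public class Node
    {
        public int Value;
        public Node(int value) { Value = value; }
    }

    public static class Repro
    {
        public static void Main()
        {
            // Just past the reported ~13.2 million object threshold.
            const int count = 14000000;

            Node[] graph = new Node[count];
            for (int i = 0; i < count; i++)
            {
                graph[i] = new Node(i);   // each element is a separate tracked object
            }

            BinaryFormatter formatter = new BinaryFormatter();

            // Stream.Null keeps memory usage down; the exception should come from
            // ObjectIDGenerator.Rehash once its prime table is exhausted.
            formatter.Serialize(Stream.Null, graph);
        }
    }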