In the project I'm currently working on, I need to save a sizable data structure to disk (edit: think dozens of MBs). Being an optimist, I thought there must be a standard solution for such a problem; however, so far I haven't found one that satisfies the following requirements:
- .NET 2.0 support, preferably with a FOSS implementation
- Version-friendly (meaning: reading an old version of the format should remain relatively simple if the changes to the underlying data structure are simple, say adding/dropping fields)
- Ability to do some form of random access, where part of the data can be extended after initial creation without deserializing the collection created up to that point (think of this as extending intermediate results; see the sketch right after this list)
- Space- and time-efficient (XML has been excluded as an option given this requirement)
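To make requirement 3 concrete, here is a rough sketch of the access pattern I have in mind, using nothing beyond the .NET 2.0 BCL (`RecordFile`, `AppendRecord` etc. are made-up names, and the `byte[]` payload stands in for whatever serializer ends up being used):

```csharp
using System;
using System.IO;

// Minimal sketch: an append-only file of length-prefixed records.
public static class RecordFile
{
    // Appends one record and returns its offset, so the caller can keep
    // an index of offsets without rewriting anything written earlier.
    public static long AppendRecord(string path, byte[] payload)
    {
        using (FileStream fs = new FileStream(path, FileMode.Append, FileAccess.Write))
        {
            long offset = fs.Position;
            byte[] length = BitConverter.GetBytes(payload.Length);
            fs.Write(length, 0, length.Length);
            fs.Write(payload, 0, payload.Length);
            return offset;
        }
    }

    // Reads a single record at a known offset without touching
    // anything else in the file.
    public static byte[] ReadRecordAt(string path, long offset)
    {
        using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            fs.Seek(offset, SeekOrigin.Begin);
            byte[] lengthBytes = new byte[4];
            ReadExactly(fs, lengthBytes);
            byte[] payload = new byte[BitConverter.ToInt32(lengthBytes, 0)];
            ReadExactly(fs, payload);
            return payload;
        }
    }

    private static void ReadExactly(Stream s, byte[] buffer)
    {
        int read = 0;
        while (read < buffer.Length)
        {
            int n = s.Read(buffer, read, buffer.Length - read);
            if (n <= 0) throw new EndOfStreamException();
            read += n;
        }
    }
}
```

This is roughly the bookkeeping (framing, offsets, an index) that I would prefer the file format to take care of by itself rather than maintaining it by hand.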
Options considered so far:
- XmlSerializer: turned down since XML serialization does not meet requirements 3 and 4.
- SerializableAttribute: does not support requirements 2 and 3.
- Protocol Buffers: turned down by the verdict of the documentation about Large Data Sets; since that advice amounts to adding another layer on top, it would introduce extra complexity that I would rather have handled by the file format itself (see the protobuf-net sketch after this list).
- HDF5, EXI: do not seem to have .NET implementations.
- SQLite/SQL Server Compact Edition: the data structure at hand would translate into a rather complex table structure, which seems too heavyweight for the intended use.
- BSON: does not appear to support requirement 3.
- Fast Infoset: only seems to have commercial (paid) .NET implementations.
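For completeness, the "layer on top" hinted at in the Protocol Buffers documentation would presumably look something like the sketch below, here using the protobuf-net implementation (`IntermediateResult` and `ResultLog` are made-up names for illustration; `SerializeWithLengthPrefix`/`DeserializeWithLengthPrefix` are, as far as I can tell, the actual protobuf-net API):

```csharp
using System.Collections.Generic;
using System.IO;
using ProtoBuf;

[ProtoContract]
public class IntermediateResult
{
    [ProtoMember(1)] public int Id;
    [ProtoMember(2)] public double Value;
    // New fields would get new tags (3, 4, ...); old readers skip
    // unknown tags, which is what makes the format version-friendly.
}

public static class ResultLog
{
    // Appends one message; each message carries its own length prefix,
    // so nothing written earlier ever has to be rewritten or re-read.
    public static void Append(string path, IntermediateResult item)
    {
        using (FileStream fs = new FileStream(path, FileMode.Append, FileAccess.Write))
        {
            Serializer.SerializeWithLengthPrefix(fs, item, PrefixStyle.Base128);
        }
    }

    // Streams messages back one at a time instead of materializing
    // the whole collection in memory.
    public static IEnumerable<IntermediateResult> ReadAll(string path)
    {
        using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            while (fs.Position < fs.Length)
            {
                yield return Serializer.DeserializeWithLengthPrefix<IntermediateResult>(
                    fs, PrefixStyle.Base128);
            }
        }
    }
}
```

Note that versioning (requirement 2) is handled by the tag numbers, while the append/streaming behaviour (requirement 3) comes entirely from the hand-rolled framing layer, which is exactly the extra complexity I was hoping to avoid.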
Any recommendations or pointers are greatly appreciated. Furthermore, if you believe any of the information above is inaccurate, please provide pointers/examples to prove me wrong.