views: 430

answers: 5
I'm serializing/deserializing with BinaryFormatter; the resulting serialized file is ~80 MB and deserialization takes a few minutes. How can I improve on this? Here's the deserialization code:

    public static Universe DeserializeFromFile(string filepath)
    {
        using (FileStream fs = new FileStream(filepath, FileMode.Open, FileAccess.Read))
        {
            BinaryFormatter bf = new BinaryFormatter();
            try
            {
                return (Universe)bf.Deserialize(fs);
            }
            catch (SerializationException e)
            {
                Console.WriteLine("Failed to deserialize. Reason: " + e.Message);
                throw;
            }
        }
    }

Should I read the whole file into memory before deserializing, or use some other serialization technique?

A: 

Please take a look at this thread.

adatapost
A: 

Try reading the file into a memory stream first in one go, then deserialize using the memory stream.
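A minimal sketch of that approach: read the entire file into a byte array in one go, then let the formatter work from an in-memory stream (`Universe` is the asker's type; the method name here is illustrative):

    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    public static Universe DeserializeFromFileBuffered(string filepath)
    {
        // One sequential read pulls the whole file into memory,
        // so the formatter never stalls on disk I/O mid-deserialization.
        byte[] buffer = File.ReadAllBytes(filepath);
        using (MemoryStream ms = new MemoryStream(buffer))
        {
            BinaryFormatter bf = new BinaryFormatter();
            return (Universe)bf.Deserialize(ms);
        }
    }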

Binder
If that makes things better rather than worse, then the serialization format sucks. Why do an I/O bound task *followed by* a CPU-bound task when you could do both, interleaved?
hobbs
+1  A: 

Try UnsafeDeserialize. It is said to improve speed.
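For reference, a sketch of that call (the second argument is a `HeaderHandler` for remoting headers; passing `null` skips header processing):

    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    public static Universe DeserializeUnsafe(string filepath)
    {
        using (FileStream fs = new FileStream(filepath, FileMode.Open, FileAccess.Read))
        {
            BinaryFormatter bf = new BinaryFormatter();
            // UnsafeDeserialize skips some permission checks that
            // Deserialize performs, which can save a little overhead.
            return (Universe)bf.UnsafeDeserialize(fs, null);
        }
    }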

leppie
UnsafeDeserialize: 460138 ms, Deserialize: 459967 ms, i.e. Deserialize was actually faster! I passed null for the headers with UnsafeDeserialize; is that perhaps the reason?
Carlsberg
A: 

Implement ISerializable in the Universe class, so you control exactly what gets written instead of letting the formatter reflect over the whole object graph.
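A sketch of what that could look like; the `data` field is purely illustrative, since the real members of `Universe` aren't shown in the question:

    using System;
    using System.Runtime.Serialization;

    [Serializable]
    public class Universe : ISerializable
    {
        private int[] data; // illustrative field

        public Universe(int[] data)
        {
            this.data = data;
        }

        // Deserialization constructor invoked by the formatter.
        protected Universe(SerializationInfo info, StreamingContext context)
        {
            data = (int[])info.GetValue("data", typeof(int[]));
        }

        // Write only what is needed, rather than the default reflection walk.
        public void GetObjectData(SerializationInfo info, StreamingContext context)
        {
            info.AddValue("data", data);
        }
    }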

Gary
A: 

How complex is the data? If it is an object tree (rather than a full graph), then you might get some interesting results from trying protobuf-net. It is usually easy to fit onto existing classes, and is generally much smaller, faster, and less brittle (you can change the object model without trashing the data).

Disclosure: I'm the author, so might be biased - but it really isn't terrible... I'd happily lend some* time to help you try it, though.

*=within reason
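For illustration, a minimal protobuf-net sketch; the members of `Universe` are invented here, since the real class isn't shown:

    using System.IO;
    using ProtoBuf;

    [ProtoContract]
    public class Universe
    {
        [ProtoMember(1)]
        public string Name { get; set; }      // illustrative member

        [ProtoMember(2)]
        public int StarCount { get; set; }    // illustrative member
    }

    public static class UniverseIo
    {
        public static void Save(Universe u, string path)
        {
            using (var fs = File.Create(path))
                Serializer.Serialize(fs, u);
        }

        public static Universe Load(string path)
        {
            using (var fs = File.OpenRead(path))
                return Serializer.Deserialize<Universe>(fs);
        }
    }

Each `[ProtoMember(n)]` tag pins a field to a numbered wire slot, which is why the object model can evolve without invalidating old data.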

Marc Gravell