Hi, I'm working on a system (in C#) that requires high file I/O performance. Basically, I'm filling large files (~100MB) from the start of the file to the end. Every ~5 seconds I write another ~5MB bulk (sequentially from the start of the file), and after every bulk I flush the stream. Every few minutes I also need to update a structure that I write at the end of the file (some kind of metadata).
Flushing each of the bulks causes no performance issue. However, when I update the metadata at the end of the file I get really poor performance. My guess is that when the file is created (which also needs to be extra fast), the entire 100MB isn't actually allocated on disk, so when I flush the metadata all the space up to the end of the file has to be allocated first.
Any idea how I can overcome this problem?
Thanks a lot!
From comment:
Generally speaking, the code is as follows. First the file is opened:
m_Stream = new FileStream(filename,
                          FileMode.CreateNew,
                          FileAccess.Write,
                          FileShare.Write,
                          8192,     // buffer size
                          false);   // synchronous I/O
m_Stream.SetLength(100 * 1024 * 1024);   // pre-size the file to ~100MB
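If my guess about lazy allocation is right, I wonder whether touching the very last byte right after SetLength would force the whole 100MB to be allocated at creation time instead. Something like this (just a sketch of the idea, I haven't verified it's the right approach):

m_Stream.Seek(-1, SeekOrigin.End);       // move to the last byte of the pre-sized region
m_Stream.WriteByte(0);                   // write it, hoping the OS allocates/zeroes everything up to here now
m_Stream.Flush();
m_Stream.Seek(0, SeekOrigin.Begin);      // back to the start for the normal bulk writes

The downside is that creating the file also has to be extra fast, so I'm not sure this would be acceptable either.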
Every few seconds I write a ~5MB bulk:
m_Stream.Seek(m_LastPosition, SeekOrigin.Begin);
m_Stream.Write(buffer, 0, buffer.Length);
m_Stream.Flush();
m_LastPosition += buffer.Length;   // advance the write position past the bulk just written
Every few minutes I update the metadata at the end of the file:

m_Stream.Seek(-m_MetaDataSize, SeekOrigin.End);   // position the stream at the metadata block just before the end of the file
m_Stream.Write(metadata, 0, metadata.Length);
m_Stream.Flush();   // takes too long the first time (~1 sec)
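In case a complete example helps, here is a stripped-down, self-contained version of the pattern (the file name, sizes and metadata layout are made up for illustration; it's not my production code), which should show the same slow first metadata flush:

using System;
using System.Diagnostics;
using System.IO;

class Repro
{
    const int FileSize = 100 * 1024 * 1024;   // ~100MB file, like in my real system
    const int BulkSize = 5 * 1024 * 1024;     // ~5MB per bulk
    const int MetaSize = 4096;                // metadata block at the end of the file

    static void Main()
    {
        // CreateNew throws if "test.dat" already exists, same as in my real code.
        using (var stream = new FileStream("test.dat", FileMode.CreateNew,
                                           FileAccess.Write, FileShare.Write, 8192, false))
        {
            stream.SetLength(FileSize);

            // First bulk write: no performance problem here.
            var bulk = new byte[BulkSize];
            long lastPosition = 0;
            stream.Seek(lastPosition, SeekOrigin.Begin);
            stream.Write(bulk, 0, bulk.Length);
            stream.Flush();
            lastPosition += bulk.Length;

            // First metadata write at the end of the file: this is the slow part.
            var metadata = new byte[MetaSize];
            var sw = Stopwatch.StartNew();
            stream.Seek(-MetaSize, SeekOrigin.End);
            stream.Write(metadata, 0, metadata.Length);
            stream.Flush();   // in my real code this flush takes ~1 sec the first time
            sw.Stop();
            Console.WriteLine("Metadata flush took {0} ms", sw.ElapsedMilliseconds);
        }
    }
}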