I work mainly on Windows and Windows CE based systems, where CreateFile, ReadFile and WriteFile are the workhorses, no matter whether I'm in native Win32 land or in managed .NET land.
So far I have never had any obvious problem writing or reading a big file in one chunk, as opposed to looping until several smaller chunks have been processed. I usually delegate the IO work to a background thread that notifies me when it's done.
But looking at file IO tutorials and "textbook examples", I often find the "loop with small chunks" pattern used without any explanation of why it's preferred over the more obvious (dare I say!) "do it all at once".
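For concreteness, here is the kind of loop I mean. This is just a minimal portable sketch using C stdio rather than the Win32 ReadFile/WriteFile calls; the function name `copy_in_chunks` and the chunk size are my own illustration, not from any particular tutorial:

```c
#include <stdio.h>
#include <stdlib.h>

/* Copy src to dst in fixed-size chunks -- the "textbook" pattern.
 * Returns 1 on success, 0 on any failure. The chunk size is a
 * parameter purely for illustration. */
int copy_in_chunks(const char *src, const char *dst, size_t chunk)
{
    FILE *in  = fopen(src, "rb");
    FILE *out = fopen(dst, "wb");
    char *buf = malloc(chunk);
    int ok = (in && out && buf);
    size_t n;

    /* fread returns how many bytes it actually got; loop until EOF */
    while (ok && (n = fread(buf, 1, chunk, in)) > 0) {
        if (fwrite(buf, 1, n, out) != n)
            ok = 0;                 /* short write: report failure */
    }
    if (in && ferror(in))
        ok = 0;                     /* distinguish read error from EOF */

    free(buf);
    if (in)  fclose(in);
    if (out) fclose(out);
    return ok;
}
```

The "do it all at once" alternative I use simply allocates one buffer the size of the whole file and issues a single read and a single write instead of that loop.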
Are there any drawbacks to my approach that I haven't understood?
Clarification:
By "big file" I mean big relative to the chunk sizes in those examples: the multiple-chunk examples I mentioned often use chunk sizes on the order of 1024 bytes on Windows CE and about ten times that on the desktop. My big files are usually binary files such as camera photos from mobile phones, so on the order of 2-10 MB. Nowhere near 1 GB, in other words.