Hello again,
I've got the lovely task of working out how to handle large files being loaded into our application's script editor (it's like a VBA editor for quick macros in our internal product). Most files are about 300-400 KB, which load fine. But once they go beyond 100 MB the process has a hard time, as you'd expect.
What happens is that the file is read and shoved into a RichTextBox, which is then navigated - don't worry too much about that part.
The developer who wrote the initial code is simply using a StreamReader and calling [Reader].ReadToEnd(), which can take quite a while to complete on large files.
My task is to break this bit of code up, read it in chunks into a buffer and show a progressbar with an option to cancel it.
Some assumptions:
- Most files will be 30-40 MB.
- The contents of the file are text (not binary); some files are in UNIX format, some in DOS.
- Once the contents are retrieved, we work out which line terminator is used.
- No-one is concerned about the time it takes to render in the RichTextBox once loaded; it's just the initial load of the text.
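For context, the terminator check I mentioned is nothing fancy. Something like this sketch is what I had in mind (DetectTerminator is my own name, and it assumes a file uses one style throughout):

```csharp
static string DetectTerminator(string text)
{
    // Look at the first line break: if the '\n' is preceded by '\r',
    // the file is DOS-style; otherwise treat it as UNIX-style.
    int i = text.IndexOf('\n');
    if (i < 0) return Environment.NewLine;   // no line breaks; fall back to a default
    return (i > 0 && text[i - 1] == '\r') ? "\r\n" : "\n";
}
```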
Now for the questions:
- Can I simply use a StreamReader, check the underlying stream's Length property (to set the progress bar maximum), and issue Read calls for a set buffer size in a while loop, all inside a BackgroundWorker so it doesn't block the main UI thread? Then return the StringBuilder to the main thread once it's completed.
- The contents will be going into a StringBuilder. Can I initialise the StringBuilder with the size of the stream as its capacity, if the length is available?
Are these (in your professional opinion) good ideas? I've had a few issues in the past with reading content from Streams where the last few bytes go missing, but I'll ask a separate question if that turns out to be the case here.
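To make the question concrete, here's an untested sketch of what I'm describing. The class and method names and the 32 KB buffer size are my own placeholders, not existing code:

```csharp
using System;
using System.ComponentModel;
using System.IO;
using System.Text;

public class ChunkedFileLoader
{
    private readonly BackgroundWorker worker = new BackgroundWorker
    {
        WorkerReportsProgress = true,
        WorkerSupportsCancellation = true
    };

    public ChunkedFileLoader()
    {
        worker.DoWork += (s, e) =>
        {
            string path = (string)e.Argument;
            // Pre-size the StringBuilder from the file length in bytes;
            // for typical encodings the char count won't exceed this.
            var sb = new StringBuilder((int)Math.Min(new FileInfo(path).Length, int.MaxValue));
            char[] buffer = new char[32 * 1024];

            using (var reader = new StreamReader(path))
            {
                long total = reader.BaseStream.Length;
                int charsRead;
                // Read() returns the number of chars actually read; appending
                // exactly that many avoids the "missing last bytes" problem.
                while ((charsRead = reader.Read(buffer, 0, buffer.Length)) > 0)
                {
                    if (worker.CancellationPending) { e.Cancel = true; return; }
                    sb.Append(buffer, 0, charsRead);
                    // BaseStream.Position counts bytes, not chars, and jumps in
                    // steps because StreamReader buffers internally, but it's
                    // close enough to drive a progress bar.
                    worker.ReportProgress((int)(100 * reader.BaseStream.Position / total));
                }
            }
            e.Result = sb;   // picked up on the UI thread in RunWorkerCompleted
        };
    }

    public void LoadFile(string path) => worker.RunWorkerAsync(path);
    public void Cancel() => worker.CancelAsync();
}
```

The UI would hook ProgressChanged to update the progress bar and RunWorkerCompleted to check e.Cancelled and grab the StringBuilder from e.Result.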