Many hex editors, such as Hex Workshop, can open very large files with a relatively small memory footprint while still keeping scrolling smooth. I'm looking for the best way to achieve this, so I have several related questions.

Should I just use FileStream?
  - Is its buffering based on the current Seek location? (Will it usually page-fault when scrolling backwards?)
  - If I create a wrapper for FileStream which only uses Seek internally, will I hurt FileStream's ability to buffer properly? (i.e., will performance suffer greatly from repeated seeking, even if seeks are nearby? Can I rely on the buffering algorithm or the disk scheduler to keep performance up?)
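To make the question concrete, a minimal sketch of such a wrapper might look like the following (the class and its members are my own illustration, not an existing API; `FileOptions.RandomAccess` hints to the OS that reads won't be sequential):

```csharp
using System;
using System.IO;

// Hypothetical wrapper: reads an arbitrary window of the file on demand.
// FileStream's internal buffer tracks the current position, so nearby
// seeks that stay inside the buffer are cheap; a seek outside it forces
// a fresh read from disk.
public sealed class FileWindowReader : IDisposable
{
    private readonly FileStream _stream;

    public FileWindowReader(string path, int bufferSize = 64 * 1024)
    {
        _stream = new FileStream(path, FileMode.Open, FileAccess.Read,
                                 FileShare.Read, bufferSize,
                                 FileOptions.RandomAccess);
    }

    public long Length => _stream.Length;

    // Reads up to count bytes starting at offset; returns bytes actually read.
    public int ReadWindow(long offset, byte[] buffer, int count)
    {
        _stream.Seek(offset, SeekOrigin.Begin);
        return _stream.Read(buffer, 0, Math.Min(count, buffer.Length));
    }

    public void Dispose() => _stream.Dispose();
}
```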

Would it be better to use Memory-Mapped I/O? (I only really expect files up to maybe 100MB)
  - Would page faults from searching/jumping/fast scrolling create noticeable performance issues?

Ultimately the data has to be displayed. Should I render the whole file as a bitmap and invalidate parts of the image upon changes (letting the scrolling control do its own paging on the image), or should I just generate the current display area on scroll events?
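The "generate the current display area" option only needs to format the rows that are visible in the viewport on each scroll event. A sketch of the row-formatting part, assuming the classic offset / hex / ASCII layout (the helper is illustrative, not a WPF API):

```csharp
using System;
using System.Text;

public static class HexFormatter
{
    // Formats one 16-byte row: offset column, hex bytes, then ASCII.
    // Bytes outside the printable ASCII range are shown as '.'.
    public static string FormatRow(long offset, ReadOnlySpan<byte> row)
    {
        var sb = new StringBuilder();
        sb.Append(offset.ToString("X8")).Append("  ");
        for (int i = 0; i < 16; i++)
            sb.Append(i < row.Length ? row[i].ToString("X2") : "  ").Append(' ');
        sb.Append(' ');
        foreach (byte b in row)
            sb.Append(b >= 0x20 && b < 0x7F ? (char)b : '.');
        return sb.ToString();
    }
}
```

Formatting, say, 50 such rows per scroll event is trivial work; the cost is dominated by fetching the underlying bytes, which is exactly what the paging question above is about.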

So in short: do I page the data, the generated image, both, or do I get/generate them as needed? Which (WPF/.NET) libraries or API objects are best suited to this task?

+1  A: 

100MB really isn't that large, so keeping the whole file in memory would probably work on a modern machine.

But you don't want a solution that stops scaling as time goes by. The minute you assume the limit is 100MB, someone will try it with 200MB. So I'd recommend taking the "seek" route; that's the common way to do it.

Assaf Lavie
+2  A: 

You seem to already have the answer.
The general solution for this is to use memory-mapped files and let the OS deal with caching and seeking.
First try the simplest and most obvious solution. If it doesn't perform to your satisfaction, optimize the bottlenecks. Premature optimization is the root of all evil.
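For reference, .NET 4 and later expose this directly through `System.IO.MemoryMappedFiles`; a minimal read-only sketch (the file name is a placeholder):

```csharp
using System.IO;
using System.IO.MemoryMappedFiles;

// Map the file and read an arbitrary window; the OS pages data in and
// out on demand, so only the regions actually touched occupy memory.
long length = new FileInfo("sample.bin").Length;
using var mmf = MemoryMappedFile.CreateFromFile(
    "sample.bin", FileMode.Open, null, 0, MemoryMappedFileAccess.Read);
using var accessor = mmf.CreateViewAccessor(0, length, MemoryMappedFileAccess.Read);

byte[] window = new byte[4096];
long offset = 0;
int count = (int)System.Math.Min(window.Length, length - offset);
accessor.ReadArray(offset, window, 0, count);
```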

shoosh