In order to be able to do some benchmarks I need to clear the Windows disk read cache. How can I do this?

In fact I want to compare whether loading a big Unicode file from disk is faster as UTF-8 or as UTF-16, given that in memory I keep the text as UTF-16.

I know there should be no significant difference, but in order to benchmark it I need to be sure the file is not cached - I want to see whether the size on disk has more or less impact than the decoding of the file.
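
For reference, the kind of timing harness I have in mind looks roughly like this (file names are placeholders, and the numbers are only meaningful if the read cache is actually cold):

```cpp
// Rough timing sketch (hypothetical file names): read the whole file and
// time it; the decode step into the in-memory UTF-16 representation would
// go where the comment is.
#include <chrono>
#include <cstdio>
#include <fstream>
#include <iterator>
#include <vector>

int main()
{
    using clock = std::chrono::steady_clock;
    const char* paths[] = { "big-utf8.txt", "big-utf16.txt" };

    for (const char* path : paths) {
        auto start = clock::now();

        std::ifstream in(path, std::ios::binary);
        std::vector<char> raw((std::istreambuf_iterator<char>(in)),
                              std::istreambuf_iterator<char>());
        // ... decode 'raw' into the UTF-16 string kept in memory ...

        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      clock::now() - start).count();
        std::printf("%s: %lld ms (meaningful only on a cold read cache)\n",
                    path, static_cast<long long>(ms));
    }
    return 0;
}
```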

A: 

Asked and answered, my friend:

http://stackoverflow.com/questions/85595/flush-disk-cache-from-windows-cli

Otherwise, try here:

http://www.google.com/search?q=flush+the+disk+read+cache+Windows%3F&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a

Hila's Master
**read** is not the same as **write** :p
Sorin Sbarnea
That SO post is about the write cache - sync.exe (and the underlying NT native call) only flushes the write cache.
snemarch
-1 This is not the same. If you read the question, it's not about *flushing* the disk (write) cache, but *discarding* the read cache. Your answer is about making sure the write cache is flushed...
sleske
+1  A: 

The only solution I found so far is http://chadaustin.me/2009/04/flushing-disk-cache/ but this one takes too much time, so I hope we'll find a better one.

Sorin Sbarnea
+1  A: 

AFAIK, it's unfortunately not possible to discard the read cache under Windows. I spent some time looking into this some years ago, and only found out how to flush the write cache.

As I see it, you have three options, unless somebody else has found some magic:

  1. If possible, do your read file I/O in unbuffered mode (see the sketch after this list).
  2. Each time you want to benchmark, create a new copy of the test data, specifying unbuffered mode when creating the new copy (this should keep the copy out of the read cache, but I haven't tested it).
  3. Allocate enough memory that Windows has to discard the disk cache (ugh!).
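
A minimal sketch of option 1, assuming the Win32 API (the file name and the 4096-byte sector size are placeholders - unbuffered I/O requires sector-aligned offsets, sizes and buffer addresses):

```cpp
// Minimal sketch of option 1: read a file with the Windows buffer cache
// bypassed via FILE_FLAG_NO_BUFFERING. Unbuffered I/O requires the buffer
// address, read size and file offsets to be multiples of the sector size;
// 4096 is assumed here (query it with GetDiskFreeSpace to be exact).
#include <windows.h>
#include <cstdio>

int main()
{
    const DWORD kChunk = 4096 * 256;  // 1 MB, a multiple of the sector size

    HANDLE h = CreateFileA("testdata.bin", GENERIC_READ, FILE_SHARE_READ,
                           nullptr, OPEN_EXISTING,
                           FILE_FLAG_NO_BUFFERING, nullptr);
    if (h == INVALID_HANDLE_VALUE) {
        std::fprintf(stderr, "CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    // VirtualAlloc returns page-aligned memory, which satisfies the
    // alignment requirement of unbuffered reads.
    void* buf = VirtualAlloc(nullptr, kChunk, MEM_COMMIT | MEM_RESERVE,
                             PAGE_READWRITE);

    DWORD got = 0;
    unsigned long long total = 0;
    while (ReadFile(h, buf, kChunk, &got, nullptr) && got > 0)
        total += got;  // the final, partial chunk is still returned correctly

    std::printf("read %llu bytes without populating the read cache\n", total);

    VirtualFree(buf, 0, MEM_RELEASE);
    CloseHandle(h);
    return 0;
}
```
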
snemarch
Note that the program `flushmem` linked from Sorin Sbarnea's answer implements solution 3. It works, but allocating that much memory is apparently quite slow (probably because it causes a lot of paging/swapping).
sleske
@sleske: yep, it's not a pretty solution - it forces Windows to trim the working sets of all processes, paging out to swap... and then there's usually immediate page-in activity right afterwards.
snemarch
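
For completeness, a rough sketch of what option 3 (and, apparently, `flushmem`) boils down to: commit and touch roughly as much memory as the machine has physical RAM, so Windows is forced to evict the file cache. Slow and disruptive, as noted above; the chunk size and the target amount are arbitrary choices here.

```cpp
// Rough sketch of option 3: commit and touch roughly the machine's physical
// RAM so Windows evicts the file cache and trims working sets. Use only on
// a test machine; everything else gets paged out too.
#include <windows.h>
#include <cstdio>

int main()
{
    MEMORYSTATUSEX ms;
    ms.dwLength = sizeof(ms);
    GlobalMemoryStatusEx(&ms);

    const SIZE_T chunk = 64 * 1024 * 1024;        // allocate in 64 MB chunks
    unsigned long long target = ms.ullTotalPhys;  // aim at total physical RAM
    unsigned long long done = 0;

    while (done < target) {
        char* p = static_cast<char*>(VirtualAlloc(
            nullptr, chunk, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE));
        if (!p) break;                            // stop when allocation fails
        for (SIZE_T i = 0; i < chunk; i += 4096)  // touch every page so it is
            p[i] = 1;                             // really backed by memory
        done += chunk;
    }
    std::printf("touched ~%llu MB\n", done >> 20);
    return 0;                                     // process exit releases it all
}
```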