I have a problem where the system file cache grows so large that my server applications do not have enough free RAM. This is a known issue on 64-bit Windows Server 2003. Calling the SetSystemFileCacheSize API does limit the maximum size of the system file cache, but once the limit is in place, a system thread suddenly starts running RtlCompressBuffer, pegging one core at 100% CPU. It looks like Windows decides to start compressing memory pages when it thinks the system cache is not big enough. I get this behaviour regardless of the cache size I set (even with a 16 GB maximum).

The box is a dual-CPU, 8-core, 64-bit machine with 32 GB of RAM. We are running Lucene.NET against an index about 64 GB in size. The system cache grows to take up to 24 GB of RAM, leaving very little free RAM for running searches. Capping the cache at a smaller value such as 12 GB helps a lot with searches initially, but then the CPU usage described above kicks in and searches also seem to run slower.

Any idea what is going on with RtlCompressBuffer and the size of the system file cache? I am strongly considering moving this to Linux soon...
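For context, this is roughly how the cap is applied. A minimal sketch in Python via ctypes (our actual caller may differ); the constants are from the Windows SDK, the 12 GB figure is the cap mentioned above, and passing 0 as the minimum is an assumption. Note the call requires SeIncreaseQuotaPrivilege and only works on Windows:

```python
# Sketch: hard-cap the Windows system file cache with SetSystemFileCacheSize.
# Assumes the process holds SeIncreaseQuotaPrivilege; the 12 GB cap and the
# 0-byte minimum are illustrative values, not a recommendation.
import ctypes
import sys

FILE_CACHE_MAX_HARD_ENABLE = 0x1  # enforce MaximumFileCacheSize as a hard limit


def gb(n):
    """Convert whole gigabytes to bytes."""
    return n * 1024 ** 3


def cap_file_cache(max_bytes):
    """Hard-cap the system file cache; raises OSError on failure."""
    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    ok = kernel32.SetSystemFileCacheSize(
        ctypes.c_size_t(0),          # MinimumFileCacheSize (0 = no minimum; assumed OK)
        ctypes.c_size_t(max_bytes),  # MaximumFileCacheSize
        FILE_CACHE_MAX_HARD_ENABLE,  # Flags
    )
    if not ok:
        raise ctypes.WinError(ctypes.get_last_error())


if __name__ == "__main__" and sys.platform == "win32":
    cap_file_cache(gb(12))  # cap the system file cache at 12 GB
```

The guard at the bottom keeps the script inert on non-Windows machines; on Windows Server 2003 x64 the same call is what triggers the RtlCompressBuffer behaviour described above.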