Hi,

I've built some file-sorting functionality into a Java app, and it's designed to sort files larger than 20 GB. The general approach is an external merge sort: read the file in chunks, sort each chunk in memory, and write each sorted chunk to its own temporary file. On the second pass, I open all of the chunk files simultaneously and merge them into a single sorted output file.
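
For reference, here is a minimal sketch of the merge pass (line-based records; the ChunkMerger/ChunkReader names are purely illustrative, not my actual code):

    import java.io.*;
    import java.util.*;

    public class ChunkMerger {

        // Merge already-sorted chunk files into one sorted output (one record per line).
        public static void merge(List<File> chunks, File out) throws IOException {
            // The reader holding the smallest current line sits at the head of the queue.
            PriorityQueue<ChunkReader> pq = new PriorityQueue<ChunkReader>();
            for (File f : chunks) {
                ChunkReader r = new ChunkReader(f);
                if (r.line != null) {
                    pq.add(r);              // skip empty chunk files
                }
            }
            BufferedWriter w = new BufferedWriter(new FileWriter(out));
            try {
                while (!pq.isEmpty()) {
                    ChunkReader r = pq.poll();   // reader with the smallest line
                    w.write(r.line);
                    w.newLine();
                    if (r.advance()) {
                        pq.add(r);               // more lines left: re-queue it
                    } else {
                        r.close();               // chunk exhausted
                    }
                }
            } finally {
                w.close();
            }
        }

        // Wraps one chunk file and remembers its current (smallest unread) line.
        private static class ChunkReader implements Comparable<ChunkReader> {
            private final BufferedReader in;
            String line;

            ChunkReader(File f) throws IOException {
                in = new BufferedReader(new FileReader(f));
                line = in.readLine();
            }

            boolean advance() throws IOException {
                line = in.readLine();
                return line != null;
            }

            void close() throws IOException {
                in.close();
            }

            public int compareTo(ChunkReader other) {
                return line.compareTo(other.line);
            }
        }
    }

Each chunk contributes one buffered reader for the entire merge, so the number of simultaneously open files equals the number of chunks.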

Are there any practical limits I should be aware of when opening and reading a large number of files simultaneously?

On my own machine (Mac OS X), I've been able to open and read more than 250 files at once without issues. Is anyone aware of limits that might apply on other platforms?

Thanks for your time.

+2  A: 

Even though there is no specific limit in Java, the OS does impose a limit on the number of open files per process.

On Linux, you can change this limit for the current shell (and any processes it launches) with:

ulimit -n max_open_files
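
As a rough illustration (a throwaway probe; the class name is made up), you can watch the limit bite from Java by opening streams until the OS refuses. On Unix-like systems the failure surfaces as an IOException whose message mentions too many open files:

    import java.io.*;
    import java.util.*;

    public class FdLimitProbe {
        public static void main(String[] args) throws IOException {
            File tmp = File.createTempFile("probe", ".tmp");
            tmp.deleteOnExit();
            List<InputStream> open = new ArrayList<InputStream>();
            try {
                while (true) {
                    // Each stream consumes one file descriptor.
                    open.add(new FileInputStream(tmp));
                }
            } catch (IOException e) {
                System.out.println("Hit the limit after " + open.size()
                        + " opens: " + e.getMessage());
            } finally {
                for (InputStream in : open) {
                    in.close();   // release the descriptors we deliberately leaked
                }
            }
        }
    }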

ZZ Coder
A: 

On Linux, at least, you can check the current limit with ulimit -n

Justin K
A: 

It depends on the platform. On Linux, for instance, the JVM will happily open files until it hits the maximum number of open file descriptors (remember that on Unix, sockets are also file descriptors).

jskaggz