views: 318
answers: 2
I have a "processor" component that can process a single File, InputStream, Reader, etc.

For various reasons, I end up with several large files instead of one huge file.

Is there a way to construct an input stream (or reader) that transparently "appends" all these files, so that:

1) The "processor" does not know where one file starts and another ends.
2) No changes occur in the file system (e.g., no actual appending of files on disk).
3) Each file is read in order, so I do not pay the cost of loading all of them into memory and concatenating them before the processor starts reading.

I'm sure it is possible to write something like this, but I'm wondering if one already exists; it's been a while since I did file-based I/O.

+10  A: 

SequenceInputStream concatenates multiple streams.

List<InputStream> opened = new ArrayList<InputStream>(files.size());
for (File f : files) {
  opened.add(new FileInputStream(f));
}
InputStream is = new SequenceInputStream(Collections.enumeration(opened));

Exception handling (not shown) when opening each file is important; make sure all the files are eventually closed, even if the operation is aborted before the SequenceInputStream is created.
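A sketch of what that exception handling might look like — `ConcatFiles` and `concat` are made-up names here, and the cleanup strategy (close everything opened so far, then rethrow) is one reasonable choice, not the only one:

    import java.io.*;
    import java.util.*;

    public class ConcatFiles {
        // Opens each file in order; if any open fails, closes the
        // streams opened so far before rethrowing, so no file
        // descriptors leak.
        static InputStream concat(List<File> files) throws IOException {
            List<InputStream> opened = new ArrayList<InputStream>(files.size());
            try {
                for (File f : files) {
                    opened.add(new FileInputStream(f));
                }
            } catch (IOException e) {
                for (InputStream in : opened) {
                    try { in.close(); } catch (IOException ignored) {}
                }
                throw e;
            }
            // SequenceInputStream reads the streams one after another and
            // closes each as it is exhausted.
            return new SequenceInputStream(Collections.enumeration(opened));
        }
    }

The processor then sees a single InputStream and never knows where one file ends and the next begins.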

erickson
Damn, I was familiar with the piped streams, but didn't know this one existed; I should have taken a better look. Thank you!
Uri
+2  A: 

You can use something like SequenceInputStream to read one stream after the other.
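For the common case of exactly two streams, SequenceInputStream also has a two-argument constructor. A minimal sketch (using in-memory streams rather than files, just to keep it self-contained):

    import java.io.*;

    public class TwoStreams {
        public static void main(String[] args) throws IOException {
            InputStream first = new ByteArrayInputStream("one ".getBytes());
            InputStream second = new ByteArrayInputStream("two".getBytes());
            // Reads first until EOF, then continues seamlessly into second.
            InputStream combined = new SequenceInputStream(first, second);
            StringBuilder sb = new StringBuilder();
            int c;
            while ((c = combined.read()) != -1) {
                sb.append((char) c);
            }
            combined.close();
            System.out.println(sb); // prints "one two"
        }
    }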

yx