I have a Java program that accepts connections, receives HTTP requests, and sends HTTP replies along with data stored in a file (it is part of a caching proxy). With everything irrelevant removed, my code looks like this:
FileInputStream fileInputStream = new FileInputStream(file);
OutputStream outputStream = socket.getOutputStream();
byte[] buf = new byte[BUFFER_SIZE];
int len;
while ((len = fileInputStream.read(buf)) > 0) {
    outputStream.write(buf, 0, len);
}
outputStream.flush();
fileInputStream.close();
socket.close();
This code is executed in a separate thread for each connected client.
When I deal with small files (.htm, .gif, .swf, etc.), everything works fine and I don't see anything wrong in the browser. But when I download large files (.iso), especially several files simultaneously while the system is under load, I sometimes get really strange behavior. The browser downloads 99.99% of the file, and when there are fewer than BUFFER_SIZE bytes left, the download stalls for a few seconds and then the browser reports that an error has occurred. I cannot understand what happens, because all data is successfully read from the file and all of it is successfully written to outputStream. As you can see, I even call flush(), but it has no effect.
Can anyone explain what is happening?
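For experimenting with the send path in isolation, here is a self-contained sketch over a loopback socket (with a dummy payload instead of a real HTTP exchange). The shutdownOutput()-then-drain step before close() is only an assumption about a possible mitigation (closing a socket while unread or unsent data is pending can reset the connection and discard bytes still in the kernel send buffer), not something confirmed to fix the problem:

```java
import java.io.*;
import java.net.*;
import java.nio.file.*;

public class HalfCloseDemo {
    static final int BUFFER_SIZE = 8192;

    // Sends the file, then half-closes the socket so the peer sees EOF only
    // after every buffered byte has been transmitted, and waits for the peer
    // to close its side before fully closing.
    static void sendFile(Socket socket, File file) throws IOException {
        try (FileInputStream in = new FileInputStream(file)) {
            OutputStream out = socket.getOutputStream();
            byte[] buf = new byte[BUFFER_SIZE];
            int len;
            while ((len = in.read(buf)) > 0) {
                out.write(buf, 0, len);
            }
            out.flush();
            socket.shutdownOutput();                       // send FIN, keep reading
            while (socket.getInputStream().read(buf) != -1) {
                // drain until the client closes its side
            }
        } finally {
            socket.close();
        }
    }

    public static void main(String[] args) throws Exception {
        // A temp file stands in for the cached response body; its size is
        // deliberately not a multiple of BUFFER_SIZE.
        Path payload = Files.createTempFile("demo", ".bin");
        byte[] data = new byte[BUFFER_SIZE * 3 + 17];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        Files.write(payload, data);

        try (ServerSocket server = new ServerSocket(0)) {
            Thread sender = new Thread(() -> {
                try (Socket s = server.accept()) {
                    sendFile(s, payload.toFile());
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            sender.start();

            // The "browser" side: read until EOF and count the bytes.
            try (Socket client = new Socket("127.0.0.1", server.getLocalPort())) {
                ByteArrayOutputStream received = new ByteArrayOutputStream();
                byte[] buf = new byte[BUFFER_SIZE];
                InputStream in = client.getInputStream();
                int len;
                while ((len = in.read(buf)) != -1) {
                    received.write(buf, 0, len);
                }
                System.out.println("received " + received.size() + " bytes");
            }
            sender.join();
        } finally {
            Files.deleteIfExists(payload);
        }
    }
}
```

In this sketch all 24593 bytes arrive; whether the same ordering helps under real load is exactly what is unclear.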
EDIT
I have uploaded the project to filehosting.org.
Download the source files: a zip archive with the source code, Build.xml, and Readme.txt. Use ant to build the project. The described problem occurs in ClientManager.java; you'll find a comment there.