I am getting OutOfMemoryErrors when uploading large (>300MB) files to a servlet that uses Commons FileUpload 1.2.1. This seems odd, because the whole point of using DiskFileItem is to keep the (possibly large) file from residing in memory. I am using the default size threshold of 10KB, so that is all that should ever be loaded into the heap, right? Here is the partial stack trace:
    java.lang.OutOfMemoryError
        at java.io.FileInputStream.readBytes(Native Method)
        at java.io.FileInputStream.read(FileInputStream.java:177)
        at org.apache.commons.fileupload.disk.DiskFileItem.get(DiskFileItem.java:334)
        at org.springframework.web.multipart.commons.CommonsMultipartFile.getBytes(CommonsMultipartFile.java:114)
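For reference, the call path the trace points at looks roughly like the sketch below; the class and method names are placeholders rather than my actual code:

    import java.io.IOException;
    import org.springframework.web.multipart.MultipartFile;

    // Placeholder handler showing where CommonsMultipartFile.getBytes() gets called.
    public class UploadHandler {

        public void handleUpload(MultipartFile file) throws IOException {
            // getBytes() delegates to DiskFileItem.get(), which (per the trace above)
            // reads the whole temp file back into a single byte[] on the heap.
            byte[] contents = file.getBytes();
            // ... process contents ...
        }
    }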
Why is this happening? Is there some configuration I'm missing? Any tips/tricks to avoid this situation besides increasing my heap size?
I really shouldn't have to increase my heap, because in theory the most this operation should ever load into memory is a little over 10KB. Plus, my heap max (-Xmx) is already set to 1GB, which should be plenty.
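In case it matters, my multipart setup is essentially the stock Spring CommonsMultipartResolver with its defaults. A minimal sketch of what that amounts to (the class name is a placeholder, and the value shown is just the documented default, not something I've customized):

    import org.springframework.web.multipart.commons.CommonsMultipartResolver;

    // Sketch of the resolver setup; 10240 bytes is the default in-memory threshold.
    public class MultipartConfig {

        public CommonsMultipartResolver multipartResolver() {
            CommonsMultipartResolver resolver = new CommonsMultipartResolver();
            // Uploads larger than this are supposed to be written to a temp file
            // by DiskFileItem rather than held on the heap.
            resolver.setMaxInMemorySize(10240);
            return resolver;
        }
    }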