I get a java.lang.OutOfMemoryError while writing a big file to the ServletOutputStream. Every response is wrapped by default in an ehcache.constructs.web.filter object for GZIP compression, and according to the logs the exception is thrown inside that filter. Is there a way to increase the available memory so that the OutOfMemoryError does not occur?
Use the -Xmx Java command-line option, as shown below:
java -Xms256m -Xmx512m com.test.MyMain
-Xms sets the initial (lower) heap size and -Xmx sets the maximum (upper) heap size.
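If you want to confirm the setting actually took effect, a quick sanity check is to print what the running JVM reports. This is just a minimal sketch using the standard Runtime API:

public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() reflects -Xmx; totalMemory() is the heap currently committed
        System.out.println("max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("free heap:  " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}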
Set the following JVM options for your servlet container: -Xmx256m -Xms128m (in Tomcat they go in catalina.sh / catalina.bat).
You need to add the option "-Xmx1024m" to the java command that runs your servlet container. (Replace 1024m with whatever heap size you like; it means 1024 megabytes.) If you're using, say, Tomcat, you set this in the CATALINA_OPTS environment variable; other servers have their own configuration.
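For example, with a standard Tomcat install you can create bin/setenv.sh (catalina.sh sources it on startup if it exists) and put something like the following in it; the sizes here are just placeholders:

export CATALINA_OPTS="-Xms256m -Xmx1024m"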
However, the real problem is that such a big file is being buffered in memory at all, which is wasteful. Try a compression filter that doesn't buffer this way.
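The idea is to compress on the fly rather than collecting the whole response first. A minimal sketch of that approach done directly in a servlet (the class name and file path are placeholders, and a real version should also check the client's Accept-Encoding header before compressing):

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class BigFileDownloadServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("application/octet-stream");
        resp.setHeader("Content-Encoding", "gzip");
        // Wrap the servlet output stream: bytes are compressed and pushed out as
        // they are written, so the full file is never held in memory at once.
        try (InputStream in = new FileInputStream("/path/to/bigfile.dat");
             OutputStream out = new GZIPOutputStream(resp.getOutputStream())) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}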
Don't forget that you may also need to increase your PermGen size:
-XX:PermSize=64m -XX:MaxPermSize=256m
Also make sure you are streaming the file out efficiently; there may be unnecessarily large buffering somewhere in the InputStream/OutputStream pipeline.
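To illustrate the difference (a sketch with a made-up sendFile helper, not code from the question): reading the whole file into a byte array puts it all on the heap before anything is sent, whereas Files.copy streams it through a small internal buffer.

import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class StreamFileOut {
    static void sendFile(Path file, OutputStream out) throws IOException {
        // Wasteful: byte[] all = Files.readAllBytes(file); out.write(all);
        // puts the entire file on the heap before a single byte is written.
        // Streaming: Files.copy moves the data through a small internal buffer,
        // so memory use stays constant no matter how large the file is.
        Files.copy(file, out);
    }

    public static void main(String[] args) throws IOException {
        sendFile(Paths.get("/path/to/bigfile.dat"), System.out);
    }
}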