views: 681

answers: 6

I get the error shown in the trace below when I try to upload an 80,193KB FITS file for processing, in order to display selected fields. Basically I have a mock web interface that allows a user to select up to 6 FITS files for uploading and processing. I don't get an error when I upload two [different] FITS files, roughly 54,574KB each, in one go; the fields are displayed/printed on the console. However, on uploading a single 80,193KB file I get the error below. How do I resolve it?

I initially thought the iteration was computationally expensive, but I suspect the error occurs on invoking readHDU() for the 80MB file:

while ((newBasicHDU = newFits.readHDU()) != null) { 

How do I resolve it efficiently? I'm running the program on Windows 7. Cheers

Trace:

SEVERE: Servlet.service() for servlet FitsFileProcessorServlet threw exception
java.lang.OutOfMemoryError: Java heap space
    at java.lang.reflect.Array.multiNewArray(Native Method)
    at java.lang.reflect.Array.newInstance(Unknown Source)
    at nom.tam.util.ArrayFuncs.newInstance(ArrayFuncs.java:1028)
    at nom.tam.fits.ImageData.read(ImageData.java:258)
    at nom.tam.fits.Fits.readHDU(Fits.java:573)
    at controller.FITSFileProcessor.processFITSFile(FITSFileProcessor.java:79)
    at controller.FITSFileProcessor.doPost(FITSFileProcessor.java:53)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:849)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:454)
    at java.lang.Thread.run(Unknown Source)

Code:

/**
     * Handles the multipart upload request and hands each uploaded
     * FITS file off to processFITSFile().
     */
    public void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException {

        // Check that we have a file upload request
        boolean isMultipart = ServletFileUpload.isMultipartContent(request);

        if (isMultipart) {

            Fits newFits = new Fits();
            BasicHDU newBasicHDU = null;
            ServletFileUpload upload = new ServletFileUpload();                     // Create a new file upload handler

            // Parse the request
            try {
                //List items = upload.parseRequest(request);                        // FileItem
                FileItemIterator iter = upload.getItemIterator(request);

                // iterate through the number of FITS FILES on the Server
                while (iter.hasNext()) {
                    FileItemStream item = (FileItemStream) iter.next();
                    if (!item.isFormField()) {
                        this.processFITSFile(item, newFits,newBasicHDU );
                    }
                }
            } catch (FileUploadException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }       
        }
    }

    /**
     * Reads each HDU from the uploaded FITS file and prints
     * selected header fields to the console.
     */
    public void processFITSFile(FileItemStream item, Fits newFits, BasicHDU newBasicHDU) throws IOException {

        // Process the fits file
        if (!item.isFormField()) {
            String fileName = item.getName();                                       //name of the FITS File
            try {   
                System.out.println("Fits File Fields Printout: " +  fileName);
                InputStream fitsStream = item.openStream();                             
                newFits = new Fits(fitsStream);
                System.out.println( "number of hdu's if: " + newFits.getNumberOfHDUs());

                while ((newBasicHDU = newFits.readHDU()) != null)  {                // line 79 in the trace
                    System.out.println("Telescope Used: " + newBasicHDU.getTelescope());
                    System.out.println("Author: " + newBasicHDU.getAuthor());
                    System.out.println("Observer: " + newBasicHDU.getObserver() );
                    System.out.println("Origin: " + newBasicHDU.getOrigin() );
                    System.out.println("End of Printout for: \n" + fileName);
                    System.out.println();               
                }

                fitsStream.close();

            } catch (Exception e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }       
    }
+5  A: 

Sounds like you've not allocated enough memory to Tomcat - you can address this by specifying, for example, -Xmx512m to allocate up to 512MB of heap.

See here for more details on how to set this.
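For Tomcat specifically, the flag is usually set via the CATALINA_OPTS environment variable rather than by invoking java yourself. A minimal sketch (file names follow the standard Tomcat convention; adjust to your install):

```shell
# bin/setenv.sh in your Tomcat directory (create it if it doesn't exist);
# catalina.sh sources this script on startup.
# -Xms sets the initial heap size, -Xmx the maximum heap size.
export CATALINA_OPTS="-Xms128m -Xmx512m"
```

On Windows (as in the question), the equivalent is a `bin/setenv.bat` containing `set CATALINA_OPTS=-Xms128m -Xmx512m`.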

Brian
+2  A: 

You're trying to use more memory than the JVM has been given.

Try increasing the maximum heap size by adding the -Xmx flag when starting your program:

java -Xmx128m yourProgram

That would set 128 megabytes as the maximum heap size for your program.
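To verify the setting actually took effect, you can have the program print the JVM's maximum heap size using only the standard library (a quick sketch; the class name is arbitrary):

```java
public class MaxHeapCheck {
    public static void main(String[] args) {
        // Runtime.maxMemory() returns the maximum number of bytes the JVM
        // will attempt to use for the heap, i.e. roughly the -Xmx value.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Run it with and without the flag (e.g. `java -Xmx512m MaxHeapCheck`) to see the difference.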

OscarRyz
"That would assign 128 megabytes of memory to your program" - not quite - this would set the *maximum* size that the JVM will use for the heap to 128m. Unless otherwise specified (with -Xms) the *starting* size for the heap will still be the default value.
matt b
And it should be noted that the system will use a considerable amount of memory more than the heap for non-heap things. Typically I see 30-40 MB over the specified heap size.
Software Monkey
Oscar - generally yes but in this case the poster's using an app server (Tomcat) so the "normal" java -X route may not be that helpful
Brian
A: 

The java.lang.OutOfMemoryError means that you have exceeded the memory allocated to the JVM. Use the -Xmx flag to change the maximum heap size of your JVM. (thank you Software Monkey)

You can run this to see the current JVM heap usage:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

MemoryMXBean memoryBean = ManagementFactory.getMemoryMXBean();
System.out.println( memoryBean.getHeapMemoryUsage() );
Yannick L.
Incorrect - use `-Xmx` to change the maximum heap size. -Xms changes the initial heap size.
Software Monkey
+3  A: 

The best fix would be to enhance the code so that it does not unnecessarily duplicate the data in memory. The stack trace suggests that the code is trying to clone the file contents in memory. If it is not possible to change the (3rd-party?) code so that it processes the stream immediately instead, then configure/use commons FileUpload so that it does not keep the uploaded files in memory, but writes them to temporary storage on the local disk file system.

Your best bet to minimize the memory used would then be the DiskFileItemFactory with a small threshold (which defaults to 10KB, by the way, which is affordable):

ServletFileUpload upload = new ServletFileUpload(new DiskFileItemFactory());

This way you have enough memory left for the real processing.

If you still hit the heap memory limit, then the next step would indeed be to increase the heap.

BalusC
+10  A: 

Most of the answers have centered on increasing your heap size from the default (which is 64MB). 64MB would explain why you are able to upload the 54,574KB files successfully, because each is under that size, and I bet Tomcat and your program don't take up much more than 10MB when booted up. Increasing your memory is a good idea, but it's really treating the symptom, not the disease.

If you plan on allowing multiple users to upload big files then two users uploading 80MB files at the same time will require 160MB. And, if 3 users do this you'll need 240MB, etc. Here is what I would suggest. Find a library that doesn't read this stuff into RAM before writing it to disk. That will make your app scale, and that's the real solution to this problem.

Use jconsole (comes with the JDK) to look at your heap size while the program is running. JConsole is really easy to use and you don't have to configure your JVM to use it, so in that sense it's much easier than a profiler, though you won't get as much detail about your program. However, you can see the three generations of the heap (Eden, Survivor, and Tenured). Sometimes strange things can cause one of those areas to run out of memory even though you have allocated lots of memory to the JVM; JConsole will show you things like that.
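To illustrate the stream-to-disk idea, here is a minimal standard-library sketch (the ByteArrayInputStream stands in for the servlet's upload stream, and the class and method names are made up for the example):

```java
import java.io.*;

public class StreamToDisk {
    // Copies an InputStream to a temp file in fixed-size chunks, so memory
    // use stays at the buffer size (8 KB here) no matter how large the
    // upload is. The FITS library can then be pointed at the file on disk.
    static File spoolToTempFile(InputStream in) throws IOException {
        File tmp = File.createTempFile("upload-", ".fits");
        tmp.deleteOnExit();
        OutputStream out = new FileOutputStream(tmp);
        try {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            out.close();
        }
        return tmp;
    }

    public static void main(String[] args) throws IOException {
        byte[] fakeUpload = new byte[100000]; // stand-in for an 80MB upload
        File f = spoolToTempFile(new ByteArrayInputStream(fakeUpload));
        System.out.println("Spooled " + f.length() + " bytes to " + f.getPath());
    }
}
```

With this approach, two users uploading 80MB files simultaneously cost 160MB of disk, not 160MB of heap.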

chubbard
A: 

Hi guys, I'm Suresh. I created a blog post about reading FITS files using Java and the nom.tam.fits package:

http://sureshmssnc-java.blogspot.com/2010/06/my-code-requires-following-tools.html. See the site; I hope it's helpful to you.

suresh