I am trying to load big image files in Java and I get an OutOfMemoryError when the file is too big (I have already tried increasing the heap size via the command-line flag).
I load images in the following way. If the image is not a TIFF, I use this code:
BufferedImage img = ImageIO.read(fileToOpen);
And if the file is a TIFF, I use this code:
BufferedImage img = JAI.create("fileload",
fileToOpen.getAbsolutePath()).getAsBufferedImage();
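For context, the decoded size can be estimated without touching any pixel data, since a BufferedImage needs roughly width × height × 4 bytes for ARGB. A minimal sketch using the standard ImageIO reader API (`SizeEstimator` is a made-up name, and 4 bytes per pixel is an assumption):

```java
import javax.imageio.ImageIO;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;
import java.io.File;
import java.io.IOException;
import java.util.Iterator;

public class SizeEstimator {
    // Reads only the image header, not the pixel data, so this is cheap
    // even for huge files. Assumes 4 bytes per pixel (TYPE_INT_ARGB).
    public static long decodedSizeBytes(File file) throws IOException {
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            Iterator<ImageReader> readers = ImageIO.getImageReaders(in);
            if (!readers.hasNext()) {
                throw new IOException("No ImageIO reader for " + file);
            }
            ImageReader reader = readers.next();
            try {
                reader.setInput(in, true);
                return (long) reader.getWidth(0) * reader.getHeight(0) * 4L;
            } finally {
                reader.dispose();
            }
        }
    }
}
```

This is how I can tell in advance that, say, a 10000×8000 photo will need about 320 MB once decoded, regardless of its size on disk.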
My question actually boils down to this: how do image-manipulation programs (Photoshop, for instance) load files of hundreds of megabytes without running out of memory?
It is my understanding that a 20 MB JPEG is hard to load into memory because a BufferedImage stores the pixels uncompressed. One possible solution would be a class that subclasses the abstract Image class but keeps the data in compressed form; that would likely have speed issues, though, since the runtime would have to decompress the data while drawing. Another option would be to cache the raw uncompressed data to disk and read from there transparently, but the speed problem would still persist.
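A partial workaround I am aware of is to decode only a region (or a subsampled version) of the image via ImageReadParam, so memory use scales with the region rather than the whole file. A minimal sketch, assuming the format has a registered ImageIO reader (`RegionLoader` is a made-up name):

```java
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;
import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

public class RegionLoader {
    // Decodes only the requested rectangle, so memory use is
    // proportional to the region, not to the whole image.
    public static BufferedImage readRegion(File file, Rectangle region) throws IOException {
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            ImageReader reader = ImageIO.getImageReaders(in).next();
            try {
                reader.setInput(in, true);
                ImageReadParam param = reader.getDefaultReadParam();
                param.setSourceRegion(region); // decode only this rectangle
                // param.setSourceSubsampling(4, 4, 0, 0); // or: every 4th pixel, for a preview
                return reader.read(0, param);
            } finally {
                reader.dispose();
            }
        }
    }
}
```

This avoids the error for tiled viewing, but it still does not explain how a whole 200 MB image can be shown at once.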
So how do the big boys do it? How can Photoshop load a 200 MB JPEG into memory and draw it at every zoom level without any apparent issues?
(Final note: for speed, after I get my BufferedImage I draw its contents onto a VolatileImage of the same dimensions. This increases drawing speed greatly.)