tags:

views: 1237

answers: 2

I am trying to load big image files in Java and I am getting an out-of-memory error if the file is too big (I have already tried increasing the heap size via the command-line flag).

I am loading images in the following way.

If the image is not a TIFF, I use this code:

BufferedImage img = ImageIO.read(fileToOpen);

And if the file is a TIFF, I use this code:

BufferedImage img = JAI.create("fileload", 
    fileToOpen.getAbsolutePath()).getAsBufferedImage();

My question boils down to this: how do image-manipulation programs (Photoshop, for instance) load files of hundreds of megabytes without running out of memory?

It is my understanding that the reason a 20 MB JPEG is hard to load into memory is that a BufferedImage stores the image uncompressed. So one possible solution would be a Java class that subclasses the abstract Image class but stores the data in compressed form; that could have speed issues, though, since the runtime would have to decompress the data while drawing. Another option would be to cache the raw uncompressed data to disk and read from there seamlessly, but the speed problem would still persist.

So how do the big boys do it? How can Photoshop load a 200 MB JPEG into memory and draw it at every resolution without any apparent issues?

(Final note: for speed, once I get my BufferedImage I draw its contents onto a VolatileImage of the same dimensions. This increases drawing speed greatly.)

+2  A: 

The memory required for an uncompressed RGBA image is width * height * 4 bytes. Try sizing your heap accordingly. There might be size-cap limitations in the underlying JDK/DirectX/etc. layers, though.
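To put numbers on that formula, a quick sketch (the 10000 x 8000 dimensions are just an example):

```java
// Rough memory estimate for an uncompressed RGBA image:
// 4 bytes per pixel (one byte each for R, G, B, A).
public class MemoryEstimate {
    static long uncompressedBytes(int width, int height) {
        return (long) width * height * 4; // long cast avoids int overflow
    }

    public static void main(String[] args) {
        // e.g. a 10000 x 8000 pixel image:
        long bytes = uncompressedBytes(10000, 8000);
        System.out.println(bytes / (1024 * 1024) + " MB"); // about 305 MB
    }
}
```

So even a modest-looking JPEG can expand to hundreds of megabytes once decoded.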

The big boys exploit the structure of the JPEG image, so they don't need to load all of it into memory. Perhaps they draw it directly from the file every time.

BufferedImage has automatic acceleration capabilities similar to VolatileImage. Set its acceleration priority to 1 and, on recent JDKs, the first paint will move it to VRAM.
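That acceleration hint is a one-liner; a minimal sketch:

```java
import java.awt.image.BufferedImage;

public class AccelHint {
    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(640, 480, BufferedImage.TYPE_INT_ARGB);
        // 1.0f = highest priority: the runtime should try to cache
        // this image in VRAM after the first paint, if it can.
        img.setAccelerationPriority(1.0f);
        System.out.println(img.getAccelerationPriority());
    }
}
```

Note that this is only a hint; whether the image is actually accelerated depends on the pipeline in use.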

Edit: I presume you are running a 32-bit system. If your uncompressed image is quite large, more than about 1.4 GB, you won't be able to hold it in memory due to JVM restrictions. If the image is not a one-time image, you could find tools to stream-decompress it into a temporary raw file and use random file access to grab parts of it.
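A minimal sketch of the "temp-raw file + random file access" idea, assuming the temp file holds bare width * height * 4 RGBA bytes in row-major order with no header (that layout is an assumption for illustration, not a standard format):

```java
import java.io.IOException;
import java.io.RandomAccessFile;

// Reads a horizontal strip of rows from a headerless raw RGBA file
// without loading the rest of the image into memory.
public class RawTileReader {
    static byte[] readRows(RandomAccessFile raf, int width, int firstRow, int rowCount)
            throws IOException {
        byte[] buf = new byte[width * rowCount * 4];
        raf.seek((long) firstRow * width * 4); // byte offset of the first requested row
        raf.readFully(buf);
        return buf;
    }
}
```

Seeking directly to the row offset is what makes this cheap: only the requested strip is ever read.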

kd304
Thank you for your answer, especially the acceleration part. I am reserving the accept button in case someone comes up with a more applicable (for my case) answer.
Savvas Dalkitsis
You mean accepting the answer. Voting up a helpful answer is free, and you can do it for every answer; there is no need to reserve it, unless you have already run out of votes due to the SO limit. There is also the option of refining your question, and perhaps I (or anybody else) can come up with a more precise answer.
kd304
Hehe, yes, I did mean that. I also meant the vote-up button. I want my question to remain in the unanswered section a bit longer for more exposure, and then I'll accept. I hope that's not too unethical :P
Savvas Dalkitsis
Does anyone know of an implementation of such a "temp-raw image / random access" class, maybe?
Savvas Dalkitsis
I fear you would have to implement your own JPEG decompressor. JPEG is an 8x8-blocked format: index it by image blocks, then decompress the required regions on the fly.
kd304
+1  A: 

The trick is not to deal with the full-size image. Divide it into manageable pieces, say 50 by 50 pixels at the current resolution, and generate the content from the original file when it is needed. If a given piece is not needed (e.g. because it is off screen) it can be discarded (a weak reference would be perfect) or persisted to disk.
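A sketch of such a piece cache. I have used SoftReference rather than the WeakReference suggested above, since soft references are the usual choice for memory-sensitive caches (they are only cleared under memory pressure); decodeTile is a placeholder for whatever regenerates a piece from the original file:

```java
import java.awt.image.BufferedImage;
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

// Tiles are regenerated on demand and held via SoftReferences,
// so the GC may reclaim them when memory runs low.
public class TileCache {
    private final int tileSize;
    private final Map<Long, SoftReference<BufferedImage>> tiles = new HashMap<>();

    public TileCache(int tileSize) { this.tileSize = tileSize; }

    public BufferedImage getTile(int tx, int ty) {
        long key = ((long) tx << 32) | (ty & 0xFFFFFFFFL);
        SoftReference<BufferedImage> ref = tiles.get(key);
        BufferedImage tile = (ref != null) ? ref.get() : null;
        if (tile == null) {
            tile = decodeTile(tx, ty); // regenerate from the original file
            tiles.put(key, new SoftReference<>(tile));
        }
        return tile;
    }

    // Placeholder: a real implementation would decode only this region
    // of the source image rather than allocating a blank tile.
    private BufferedImage decodeTile(int tx, int ty) {
        return new BufferedImage(tileSize, tileSize, BufferedImage.TYPE_INT_ARGB);
    }
}
```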

Thorbjørn Ravn Andersen
But again: to generate the pieces, won't I need to load the image into memory at least once?
Savvas Dalkitsis
Not necessarily. A carefully crafted image decoder can read only the bytes needed to decode the area of the piece in question.
Thorbjørn Ravn Andersen
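Following up on that last comment: the standard ImageIO API can already request a sub-region of a file via ImageReadParam.setSourceRegion, so a custom decoder is not always necessary. A sketch (how much of the file the reader actually skips depends on the format plugin):

```java
import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.util.Iterator;
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

public class RegionLoader {
    // Decodes only the given rectangle of the image in 'file'.
    static BufferedImage loadRegion(File file, Rectangle region) throws IOException {
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            Iterator<ImageReader> readers = ImageIO.getImageReaders(in);
            if (!readers.hasNext()) throw new IOException("No reader for " + file);
            ImageReader reader = readers.next();
            try {
                reader.setInput(in);
                ImageReadParam param = reader.getDefaultReadParam();
                param.setSourceRegion(region); // only this rectangle is decoded
                return reader.read(0, param);
            } finally {
                reader.dispose();
            }
        }
    }
}
```

Combined with the tiling approach from the answer above, this lets each piece be decoded straight from the original file.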