views: 126

answers: 5

I have a Java program that reads a JPEG file from the hard drive and uses it as the background image for various other things. The image itself is stored in a BufferedImage object like so:

BufferedImage background = ImageIO.read(file);

This works great - the problem is that the BufferedImage object itself is enormous. For example, a 215 KB JPEG file becomes a BufferedImage object that's 4 MB and change. The app in question can have some fairly large background images loaded, but whereas the JPEGs are never more than a meg or two, the memory used to store the BufferedImages can quickly exceed hundreds of megabytes.

I assume all this is because the image is being stored in ram as raw RGB data, not compressed or optimized in any way.

Is there a way to have it store the image in RAM in a smaller format? I'm in a situation where I have more slack on the CPU side than RAM, so a slight performance hit to get the image object's size back down towards the JPEG's compressed size would be well worth it.

+1  A: 

The file size of the JPEG on disk is completely irrelevant.
What matters is the pixel dimensions of the image. If your image is 15 megapixels, expect it to require a crapload of RAM to hold a raw, uncompressed version.
Resize your image dimensions to be just what you need, and that is the best you can do without moving to a less rich colorspace representation.
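A minimal sketch of what "resize to just what you need" could look like in plain Java 2D, decoupled from any particular library (the method and class names here are illustrative, not from the thread):

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class ResizeSketch {

    // Scale src down so that neither dimension exceeds maxDim,
    // preserving the aspect ratio. Returns src unchanged if it
    // already fits.
    static BufferedImage scaleToFit(BufferedImage src, int maxDim) {
        double ratio = Math.min(
            (double) maxDim / src.getWidth(),
            (double) maxDim / src.getHeight());
        if (ratio >= 1.0) {
            return src; // already small enough
        }
        int w = Math.max(1, (int) Math.round(src.getWidth() * ratio));
        int h = Math.max(1, (int) Math.round(src.getHeight() * ratio));
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                           RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g.drawImage(src, 0, 0, w, h, null); // synchronous for BufferedImage sources
        g.dispose();
        return dst;
    }

    public static void main(String[] args) {
        BufferedImage big = new BufferedImage(3000, 2000, BufferedImage.TYPE_INT_RGB);
        BufferedImage small = scaleToFit(big, 1500);
        System.out.println(small.getWidth() + "x" + small.getHeight());
    }
}
```

The RAM saved is quadratic in the scale factor: halving both dimensions cuts the uncompressed footprint to a quarter.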

fuzzy lollipop
@fuzzy lollipop: I'm sorry but *a)* you aren't answering the question (hence the -1) and *b)* your comment that it is *"irrelevant"* is **HIGHLY** misleading (too bad I cannot give -2). The OP's problem is obviously that JPEG compression is **VERY** efficient on some kinds of images, and hence he's surprised by the size difference between the compressed JPEG and the uncompressed (A)RGB data. You aren't helping in any way.
Webinator
It is irrelevant how big the source file is. Just as the zip file's size tells you nothing about reading a zipped 1 GB TXT file, the on-disk file size has __nothing__ to do with the amount of memory it takes to hold the uncompressed image.
fuzzy lollipop
The answer is there: resize the image to smaller dimensions.
fuzzy lollipop
A: 

You could copy the pixels of the image to another buffer and see if that occupies less memory than the BufferedImage object. Probably something like this:

BufferedImage background = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);

int[] pixels = background.getRaster().getPixels(0, 0, background.getWidth(), background.getHeight(), (int[]) null);
karlphillip
@karlphillip: great code sample. Would you mind adding some more breaks in there to avoid scrolling?
Dinah
+2  A: 

I assume all this is because the image is being stored in ram as raw RGB data, not compressed or optimized in any way.

Exactly... Say a 1920x1200 JPEG fits in, say, 300 KB on disk; in memory, as (typical) RGB + alpha with 8 bits per component (hence 32 bits per pixel), it shall occupy:

1920 x 1200 x 32 / 8 = 9,216,000 bytes

so your 300 KB file becomes a picture needing nearly 9 MB of RAM (note that depending on the type of images you're using from Java and depending on the JVM and OS this may sometimes be GFX-card RAM).
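The arithmetic above can be checked in a couple of lines (the constant 4 bytes/pixel here is the 32-bits-per-pixel assumption from the answer):

```java
public class FootprintEstimate {
    public static void main(String[] args) {
        int width = 1920, height = 1200;
        int bytesPerPixel = 4; // 8 bits each for R, G, B, and alpha

        long bytes = (long) width * height * bytesPerPixel;
        System.out.println(bytes + " bytes");
        System.out.println(bytes / (1024.0 * 1024.0) + " MiB (approx)");
    }
}
```

Note the cast to long before multiplying: for very large images the pixel count times four can overflow a 32-bit int.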

If you want to use a picture as the background of a 1920x1200 desktop, you probably don't need a picture bigger than that in memory (unless you want some special effect, like sub-RGB decimation / color anti-aliasing / etc.).

So you have two choices:

  1. make your files less wide and less tall (in pixels) on disk
  2. reduce the image size on the fly

I typically go with number 2, because shrinking the image dimensions on disk means you're losing detail (a 1920x1200 picture is less detailed than the "same" one at 3840x2400: you'd be "losing information" by downscaling it).

Now, Java kinda sucks big time at manipulating pictures that big (from a performance point of view, a memory-usage point of view, and a quality point of view [*]). Back in the day I'd call ImageMagick from Java to resize the picture on disk first, and then load the resized image (say, one fitting my screen's size).

Nowadays there are Java bridges / APIs to interface directly with ImageMagick.

[*] There is NO WAY you're downsizing an image using Java's built-in API as fast and with a quality as good as the one provided by ImageMagick, for a start.

Webinator
+1  A: 

Do you have to use BufferedImage? Could you write your own Image implementation that stores the JPEG bytes in memory, converts to a BufferedImage as necessary, and then discards it?

Combined with some display-aware logic (rescale the image using JAI before storing it in your byte array as JPEG), this will be faster than decoding the large JPEG every time, and a smaller footprint than what you currently have (processing memory requirements excepted).
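A minimal sketch of the idea, under the assumption that you only need the decoded image transiently (the class name and API here are hypothetical, not an existing library): only the compressed bytes stay resident, and the BufferedImage is decoded on demand and left for the garbage collector afterwards.

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public class CompressedImageHolder {
    private final byte[] jpegBytes; // only the compressed form is kept resident

    public CompressedImageHolder(byte[] jpegBytes) {
        this.jpegBytes = jpegBytes;
    }

    // Decode on demand; the caller drops the reference when done,
    // so the uncompressed pixels are short-lived.
    public BufferedImage decode() throws IOException {
        return ImageIO.read(new ByteArrayInputStream(jpegBytes));
    }

    public static void main(String[] args) throws IOException {
        // Build a small test image and compress it to JPEG in memory.
        BufferedImage original = new BufferedImage(320, 200, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(original, "jpg", out);

        CompressedImageHolder holder = new CompressedImageHolder(out.toByteArray());
        BufferedImage decoded = holder.decode(); // only now does the 4x-per-pixel cost appear
        System.out.println(decoded.getWidth() + "x" + decoded.getHeight());
    }
}
```

This trades CPU (a re-decode per use) for RAM, which matches the OP's stated constraint.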

Stephen
A: 

In one of my projects, I just down-sample the image as it is being read from an ImageInputStream, on the fly. The down-sampling reduces the dimensions of the image to the required width and height without expensive resizing computations or modification of the image on disk.

Because I down-sample the image to a smaller size, it also significantly reduces the processing power and RAM required to display it. For extra optimization, I render the buffered image in tiles as well... but that's a bit outside the scope of this discussion. Try the following:

public static BufferedImage subsampleImage(
    ImageInputStream inputStream,
    int x,  // target width in pixels
    int y,  // target height in pixels
    IIOReadProgressListener progressListener) throws IOException {

  BufferedImage resampledImage = null;

  Iterator<ImageReader> readers = ImageIO.getImageReaders(inputStream);

  if (!readers.hasNext()) {
    throw new IOException("No reader available for supplied image stream.");
  }

  ImageReader reader = readers.next();

  ImageReadParam imageReaderParams = reader.getDefaultReadParam();
  reader.setInput(inputStream);

  // Pick one integer subsampling period that brings the decoded image
  // down to roughly the requested size while preserving aspect ratio.
  Dimension d1 = new Dimension(reader.getWidth(0), reader.getHeight(0));
  Dimension d2 = new Dimension(x, y);
  int subsampling = (int) scaleSubsamplingMaintainAspectRatio(d1, d2);
  imageReaderParams.setSourceSubsampling(subsampling, subsampling, 0, 0);

  reader.addIIOReadProgressListener(progressListener);
  resampledImage = reader.read(0, imageReaderParams);
  reader.removeAllIIOReadProgressListeners();

  return resampledImage;
}

public static long scaleSubsamplingMaintainAspectRatio(Dimension d1, Dimension d2) {
  long subsampling = 1;

  if (d1.getWidth() > d2.getWidth()) {
    subsampling = Math.round(d1.getWidth() / d2.getWidth());
  } else if (d1.getHeight() > d2.getHeight()) {
    subsampling = Math.round(d1.getHeight() / d2.getHeight());
  }

  return subsampling;
}

To get the ImageInputStream from a File, use:

ImageIO.createImageInputStream(new File("C:\\image.jpeg"));

As you can see, this implementation respects the image's original aspect ratio as well. You can optionally register an IIOReadProgressListener to keep track of how much of the image has been read so far. This is useful for showing a progress bar if the image is being read over a network, for instance... not required though; you can just pass null.

Why is this of particular relevance to your situation? The reader never decodes the entire image into memory at full resolution, just as much as you need so it can be displayed at the desired size. It works really well for huge images, even those that are tens of MB on disk.
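To see the core trick in isolation, here is a self-contained demonstration of setSourceSubsampling (the heart of the answer above). It writes a throwaway 400x400 JPEG, then reads it back keeping only every 4th pixel in each direction; the file name and dimensions are made up for the demo:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.util.Iterator;
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

public class SubsampleDemo {
    public static void main(String[] args) throws IOException {
        // Write a throwaway 400x400 JPEG so the example is self-contained.
        File tmp = File.createTempFile("subsample-demo", ".jpg");
        tmp.deleteOnExit();
        ImageIO.write(new BufferedImage(400, 400, BufferedImage.TYPE_INT_RGB), "jpg", tmp);

        try (ImageInputStream in = ImageIO.createImageInputStream(tmp)) {
            Iterator<ImageReader> readers = ImageIO.getImageReaders(in);
            if (!readers.hasNext()) {
                throw new IOException("No reader available for supplied image stream.");
            }
            ImageReader reader = readers.next();
            reader.setInput(in);

            // Keep every 4th pixel in each direction: 400x400 -> 100x100.
            ImageReadParam params = reader.getDefaultReadParam();
            params.setSourceSubsampling(4, 4, 0, 0);

            BufferedImage small = reader.read(0, params);
            System.out.println(small.getWidth() + "x" + small.getHeight());
            reader.dispose();
        }
    }
}
```

Only the subsampled pixels are ever materialized as a raster, which is why the approach scales to images much larger than the heap would otherwise allow.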

S73417H