Consider the following file:

-rw-r--r-- 1 user user 470886479 2009-12-15 08:26 the_known_universe.png

How would you scale the image down to a reasonable resolution, using no more than 4GB of RAM?

For example:

$ convert -scale 7666x3833 the_known_universe.png the_known_universe_scaled.png

What C library would handle it?

Thank you!

+2  A: 

You might be able to use the PNG specification to split the IDAT chunks into smaller pieces, shrink each piece one by one, and then sew the shrunken pieces back together under a modified IHDR header. One caveat: the IDAT chunks together form a single zlib stream, so the pieces can't be decompressed entirely independently.

Alex Reynolds
+4  A: 

I believe libpng has a streaming (progressive read) interface. I think this can be used to read parts of the image at a time; depending on how the image file was written (non-interlaced, in particular) you might be able to get the lines in order. You could then shrink each line (e.g. for 50% shrinking, shrink the line horizontally and discard every second line) and write it to an output file.

Using libpng in C can take a fair amount of code, but the documentation guides you through it pretty well.

+1 - this is the way to go. I had to do something similar with the NASA image and used the streaming API.
Thank you for this.
Dave Jarvis
+1  A: 

You could try making a 64-bit build of ImageMagick, or seeing if one already exists. My colleague wrote a blog post with a super-simple PNG decoder (it assumes you have zlib or equivalent), so you can see roughly the code you'd need to roll your own.

You would need to do the resample as you're reading it in.

Lou Franco
You also have to make sure you compile it with 8 bits per channel (ImageMagick's quantum depth, e.g. `--with-quantum-depth=8` at configure time), otherwise memory use doubles.
+1  A: 

I used cximage a few years ago. I think the latest version is hosted elsewhere after moving off of CodeProject.

Edit: sorry, it's C++ not C.

Windows programmer
C++ is okay, too.
Dave Jarvis
+1  A: 

You could use an image processing library that is intended to do complex operations on large (and small) images. One example is the IM imaging toolkit. It links well with C (but is implemented at least partly in C++) and has a good binding to Lua. From the Lua binding it should be easy to experiment.


Have you considered exploring pyramid-based images? Imagine a pyramid where the image is divided into multiple layers, each layer at a different resolution, and each layer split into tiles. This way you can display a zoomed-out version of the image, and also a zoomed-in partial view, without having to re-scale.

See the Wikipedia entry.

One of the original formats was FlashPix, which I wrote a renderer for. I've also created a new pyramid format, with a converter and renderer, which was used for a medical application: an actual scanner would produce 90GB+ scans of a slice of an organ for cancer research. Getting the converter to produce the pyramid images efficiently was actually pretty tricky. Believe it or not, it was Java based, used multithreading, and performed much better than you'd think; benchmarking showed it was unlikely that a C version would do a whole lot better. That was about 6 years ago; the original renderer I did over 10 years ago. You don't hear much about pyramid-based images these days, but it's really the only efficient way to produce scaled images on demand without having to generate cached scaled versions.

Jpeg2000 may or may not have an optional pyramid feature as well.

I recall that ImageMagick's supported formats and conversions include FlashPix. Googling for "image pyramid" reveals some interesting results. Brings back some memories ;-)


If you can move it to a 64-bit OS, you can open it as a memory-mapped file (or equivalent) and use pretty much any library you want. It won't be fast, and you may need to increase the page/swap file size (depending on the OS and what else you want to do with it), but in return you won't be limited to streaming libraries, so you'll be able to do more operations before resorting to resolution reduction or slicing.