views: 80

answers: 3
We would like to display very large (50 MB plus) images in Internet Explorer. We would like to avoid compression, as compression algorithms are not what CSI would have us believe they are, and the resulting files are too lossy.

As a result, we have come up with two options: Silverlight Deep Zoom or a Flash-based solution (such as Zoomify). The issue is that both of these require conversion to a tiled output and/or conversion to a specific file type (Zoomify supports a single proprietary file type, PFF).

What we are wondering is whether a solution exists that will allow us to view the image without a conversion beforehand.

PS: I know that you can write an application to tile the images (as needed or after the load process) and output them; however, we would like to do this without chopping up the file.

A: 

The browser isn't going to load a 50 MB file smoothly; if you don't chop it up, there's no reasonable way to keep it from lagging.

Dean J
+3  A: 

The tiled approach really is the right way to do it.

Your users don't want to download a 50 MB file before they can start viewing the image, and you don't want to spend the bandwidth to serve 50 MB to every user who might only view a fraction of your image.

If you serve the whole file, users will eventually be able to load and view it, but it won't run smoothly for most of them.

There is no simple non-tiled way to serve just a portion of an image unless you want to use a server-side library like ImageMagick or PIL to extract a specific subset of the image for each user. You probably don't want to do that because it will place a significant load on your server.
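For illustration, a minimal sketch of that kind of server-side extraction, assuming Pillow (the maintained PIL fork) is available; the filename and coordinates are placeholders, not from the question:

    # Minimal sketch of server-side region extraction with Pillow (PIL fork).
    from PIL import Image

    def extract_region(path, left, top, width, height):
        """Return only the requested rectangle of a large source image."""
        img = Image.open(path)  # opening is cheap; pixel decoding happens later
        box = (left, top, left + width, top + height)
        # Note: for most formats the whole source still gets decoded when the
        # crop is realized, which is the server load mentioned above.
        return img.crop(box)

    region = extract_region("huge_scan.tif", 2048, 1024, 800, 600)
    region.convert("RGB").save("region.jpg", quality=90)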

Alternatively, you might use something like Google's map tool to provide zooming and scaling. Some comments on doing that are available here:

http://webtide.wordpress.com/2008/08/27/custom-google-maps/
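If you go that route, the tile generation itself is simple. Here is a rough Pillow sketch that cuts an image into 256-pixel tiles and halves it for each coarser level; the paths are placeholders, and the level numbering just counts downsample steps rather than following Google's exact zoom convention:

    # Rough tile-pyramid generator sketch using Pillow; 256 px matches the
    # usual map-tile size. Paths and naming are illustrative only.
    import os
    from PIL import Image

    TILE = 256

    def build_tiles(src_path, out_dir):
        img = Image.open(src_path)
        level = 0
        while True:
            w, h = img.size
            level_dir = os.path.join(out_dir, str(level))
            os.makedirs(level_dir, exist_ok=True)
            for ty in range(0, h, TILE):
                for tx in range(0, w, TILE):
                    tile = img.crop((tx, ty, min(tx + TILE, w), min(ty + TILE, h)))
                    tile.save(os.path.join(level_dir, "%d_%d.png" % (tx // TILE, ty // TILE)))
            if w <= TILE and h <= TILE:
                break
            # Halve the image for the next, coarser level.
            img = img.resize((max(1, w // 2), max(1, h // 2)), Image.LANCZOS)
            level += 1

    build_tiles("huge_scan.tif", "tiles")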

Paul McMillan
That's what I was thinking; I just figured I'd throw it out to SO. Some solutions the community has come up with have blown me away, so I ask even though I'm 99% sure of the answer :)
Nate Noonen
I've definitely done the same thing and sometimes found very interesting suggestions. Wish I had a better solution for ya...
Paul McMillan
It's also worth noting that (depending on the variety of your images) you might be able to use Google's map tool as an alternative to Silverlight or Flash. I've updated the answer with a link.
Paul McMillan
A: 

If you don't want to tile, you could have the server open the file and render a screen-sized view of the image for display in the browser at the particular zoom resolution requested. This way you aren't sending 50 MB files across the line when someone only wants an overview of the image. That is, the browser requests a set of coordinates and an output size in pixels; the server opens the larger image, creates a smaller image that fits the desired view, and sends that back to the web browser.
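To make that concrete, here is a toy sketch of such an endpoint, assuming Python 3 with Pillow and the standard-library http.server; the parameter names and source path are made up for illustration:

    # Toy endpoint sketch: the browser asks for a region plus an output size,
    # the server crops/scales the big image and sends back a small JPEG.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs
    from io import BytesIO
    from PIL import Image

    SOURCE = "huge_scan.tif"  # placeholder for the large source image

    class ViewHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # e.g. /view?x=0&y=0&w=10000&h=8000&out_w=800&out_h=600
            q = parse_qs(urlparse(self.path).query)
            x, y = int(q["x"][0]), int(q["y"][0])
            w, h = int(q["w"][0]), int(q["h"][0])
            out_w, out_h = int(q["out_w"][0]), int(q["out_h"][0])

            img = Image.open(SOURCE)
            view = img.crop((x, y, x + w, y + h)).resize((out_w, out_h), Image.LANCZOS)

            buf = BytesIO()
            view.convert("RGB").save(buf, "JPEG", quality=85)
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg")
            self.end_headers()
            self.wfile.write(buf.getvalue())

    if __name__ == "__main__":
        HTTPServer(("", 8000), ViewHandler).serve_forever()

Every request re-decodes the full source, so in practice you would want to cache the opened image or pre-scaled versions on the server.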

As far as compression goes, you say it's too lossy, but if that's what you are seeing, you are probably using the wrong compression algorithm or settings for the type of image you have. The JPEG format has quality settings to control lossiness, and PNG compression is lossless (the pixels you get after decompressing are the exact values you had prior to compression). So consider changing the compression you are using, and don't just rely on the default settings in an image editor.
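A quick way to see the trade-off, again assuming Pillow; "original.tif" is just a placeholder filename:

    # Save the same image as JPEG at two quality settings and as lossless PNG,
    # then compare the resulting file sizes on disk.
    import os
    from PIL import Image

    img = Image.open("original.tif")
    rgb = img.convert("RGB")  # JPEG has no alpha channel

    rgb.save("high_quality.jpg", "JPEG", quality=95)  # lossy, few artifacts, bigger
    rgb.save("smaller.jpg", "JPEG", quality=60)       # lossy, more artifacts, smaller
    img.save("lossless.png", "PNG")                   # lossless: pixels round-trip exactly

    for name in ("high_quality.jpg", "smaller.jpg", "lossless.png"):
        print(name, os.path.getsize(name), "bytes")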

GrandmasterB
Any compression algorithm is going to be lossy after a specific point. You can't take a 50 MB JPG and shrink it down to a size that is legit for the web (about 1 MB or so) without losing fidelity. You may get it down, but not that far.
Nate Noonen
You are simply incorrect. PNG compression is lossless. As is GIF compression. See http://en.wikipedia.org/wiki/Lossless_data_compression Think about it... if all compression algorithms were lossy, how do you think you can zip up a 500 meg file and unzip it, getting the originals back?
GrandmasterB
What I meant is that there is a limit to the size you can get it down to without switching to lossy compression. You can take a PNG that's 50 MB and get it to 40 MB. If you deflate the stream into a zip, you may get it down to 20 MB. It is (to my knowledge) impossible to get a 50 MB image down to 1 MB without lossy compression.
Nate Noonen
You can get a *500* MB image down to a few bytes in size... if it's all black. Compression ratios depend as much on the data being compressed as on the algorithm. You specified neither the *type* of image you were compressing in your question (photo, map, etc.) nor a target size or quality requirement. The type of image is important because what's good for a photo is not necessarily what's good for, say, a map.
GrandmasterB