views: 103
answers: 3

Hi there. I have a database which stores .png images as the SQL "image" type. I have some code which retrieves these images as a byte[] and sends them to the page via the FileContentResult object in .NET. Performance is key in this application, and the images have to be retrieved and displayed as quickly as possible. My question is: can this operation be performed more quickly by passing a byte stream from the database to the browser, without at any point holding the whole byte array in memory? If this is possible and worthwhile doing, how do I do it?

Here is the code I have so far:

    // GET: /Image/Get/5
    public FileResult Get(int id)
    {
        Response.Cache.SetExpires(DateTime.Now.AddSeconds(300));
        Response.Cache.SetCacheability(HttpCacheability.Public);
        Response.Cache.SetValidUntilExpires(true);

        // Get full-size image by PageId.
        return base.File(page.getFullsizeImage(id), "image/png");
    }

And

    public byte[] getFullsizeImage(int pageId)
    {
        return (from t in tPage
                // Filter on pageId.
                where t.PageId == pageId
                select t.Image).Single().ToArray();
    }

Thanks for any help!

+1  A: 

Edit: Based on your comments, I think you should consider using Deep Zoom from Microsoft. Essentially, it allows you to generate a specialized image file on the server. When a user is browsing the image in full view, only the couple of million or so pixels that are actually displayed on the screen are sent to the browser via AJAX. Then, when the user zooms in, the appropriate pixels for that zoom level and x/y position are streamed out. There is a Deep Zoom Composer which can be driven from the command line to generate these image files on demand and write them to a network share. Your users will be really impressed.

Take a look at this example. It is a massive image - gigabytes in size. Around the middle of the image you will see some newspaper pages; you can zoom right in and read the articles.

End of Edit

Do you have to have images with a large file size? If they are only meant for display in the browser, they should be optimized for the web. All the main image editing applications have this ability.

If you do need the large file size, then you could provide optimized images and then when the user clicks on the image, allow them to download the full file. They should expect this download to take some time.

In Photoshop, the task is "Save for Web". There is a similarly named plugin for GIMP.
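
If you wanted to automate this on the server when scans are uploaded, something along these lines could work (a sketch only, using System.Drawing; the method name, scale factor and JPEG quality are illustrative assumptions, not code from your project):

    using System.Drawing;
    using System.Drawing.Imaging;
    using System.IO;
    using System.Linq;

    // Hypothetical post-upload step: save a smaller, web-optimised JPEG copy of
    // the scan so the default view loads quickly, keeping the original for zooming.
    public static void SavePreview(byte[] scanBytes, string previewPath)
    {
        using (var source = Image.FromStream(new MemoryStream(scanBytes)))
        using (var preview = new Bitmap(source, source.Width / 4, source.Height / 4))
        {
            ImageCodecInfo jpeg = ImageCodecInfo.GetImageEncoders()
                                                .First(c => c.MimeType == "image/jpeg");
            var options = new EncoderParameters(1);
            options.Param[0] = new EncoderParameter(Encoder.Quality, 75L);
            preview.Save(previewPath, jpeg, options);
        }
    }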

I know that this doesn't answer your direct question ("can this operation be performed quicker by passing a byte stream"), but it might help solve your problem.

Daniel Dyson
Hi. Thanks for the response. The images are scans which users upload to the system, and can therefore be of varying size and resolution. The users need to zoom in quite close to the images at times, so the images need to remain reasonably high quality; that sort of optimisation probably won't be appropriate.
shaw2thefloor
OK. If you want users to be able to quickly see a gallery of scans, then the thumbnail approach would perform better. It would not be difficult to add a task that automatically produces optimized images from what the users upload to your system. Then you could show the optimized image when your page first loads, and if they zoom in or download the high-definition image, they get the full version. Your site will perform a whole lot faster.
Daniel Dyson
Also, your users might be impressed by you providing a High Definition button. It reminds me of numerous episodes of CSI where they say "Can you enhance the number plate?" and all of a sudden the grainy CCTV footage is crystal clear. Never underestimate the gullibility of your users. ;)
Daniel Dyson
The thing is, there is only going to be one image on the screen at a time (the image area takes up most of the screen), and the default zoom will be quite high. The application lets users inspect the images, often at full resolution.
shaw2thefloor
With that said, I think having a local file cache as I described in my answer is the way to go. If the images are big, the IIS cache will fill up very quickly and entries will be purged.
Aliostad
We did consider using Deep Zoom initially, and it does look awesome, but we thought the overhead of using the composer would be too costly.
shaw2thefloor
It might be worth spending a few days prototyping. If you design your workflows right, the overhead would all be at upload time, out of process, maybe as a batch run - no human intervention. This would optimise performance for viewing the scans. The command-line API would allow full out-of-process automation, perhaps based on database triggers. Deep Zoom is a great technology looking for a problem to be applied to. I wish I had a business justification for trying it out at work. :)
Daniel Dyson
+1  A: 

Changing the LINQ from Single to First should give you nicer SQL; if PageId is the primary key, you can safely assume First and Single will return the same result.
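
For illustration, a minimal sketch of that change against the query from the question:

    public byte[] getFullsizeImage(int pageId)
    {
        // First() lets the generated SQL stop at the first match (TOP 1)
        // instead of also checking that no second row exists, as Single() does.
        return (from t in tPage
                where t.PageId == pageId
                select t.Image).First().ToArray();
    }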

Rob Rob
+1  A: 

A nice question.

In reality, the code required to send the image as a stream is minimal: it is just a Response.Write of the byte array plus setting the HTTP Content-Type header, which should be very fast.
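
For illustration, roughly what that boils down to (a sketch only, reusing the getFullsizeImage helper from the question):

    // Sketch: approximately what FileContentResult already does for you -
    // set the content type and write the bytes to the response stream.
    public void GetRaw(int id)
    {
        byte[] imageBytes = page.getFullsizeImage(id);

        Response.ContentType = "image/png";
        Response.OutputStream.Write(imageBytes, 0, imageBytes.Length);
    }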

To get it done any quicker you would essentially have to open your database up to the outside. That is probably possible using features that allow SQL Server to serve HTTP/interact with IIS directly (I looked at this a long time ago), but it is not a good idea and I do not believe you should take that risk.

You are already using caching, which is good, but with large files the cache gets purged frequently.

One thing you could do is keep a local file cache on the IIS box: when an image is first used, write it to a file on the web server, and from then on (until the cache is cleared, maybe the next day) return the URL of that static asset instead, so requests do not have to go through the ASP.NET layer at all. It is not a perfect idea, but it will achieve what you need with the least risk.
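
As a rough sketch of that idea (the ~/ImageCache folder and the file naming are illustrative assumptions, not part of your existing code):

    // GET: /Image/Get/5 - local file cache variant.
    public ActionResult Get(int id)
    {
        string virtualPath = "~/ImageCache/" + id + ".png";   // hypothetical cache folder
        string physicalPath = Server.MapPath(virtualPath);

        if (!System.IO.File.Exists(physicalPath))
        {
            // First request: pull the bytes from the database once and persist them.
            System.IO.File.WriteAllBytes(physicalPath, page.getFullsizeImage(id));
        }

        // Subsequent requests are redirected to the static file, so IIS can serve
        // (and cache) it without touching the database again.
        return Redirect(Url.Content(virtualPath));
    }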

Aliostad
OK. That's really what I was looking for. I will do some testing on the IIS cache and see if it improves processing. Thanks for all the responses.
shaw2thefloor
If you do this, I would recommend you do some very thorough testing at high loads. The ASP.NET cache is good but, as @Aliostad has alluded to, it does get quite full. Consider using a distributed, out-of-process cache such as NCache, Memcached or Velocity, preferably on a dedicated caching server farm.
Daniel Dyson