We are using a WCF service layer to return images from a repository. Some of the images are color and multi-page, and nearly all are in TIFF format. We are experiencing slowness - one of many issues.

1.) What experiences have you had with returning images via WCF?
2.) Do you have any suggestions or tips for returning large images?
3.) All messages are serialized via SOAP, correct?
4.) Does WCF do a poor job of compressing large TIFF files?

Thanks all!

A: 

In a previous project I worked on, we had a similar issue. We had a web service in C# that received requests for media. A media item could range from a file to an image and was stored in a database using BLOB columns. Initially, the web method that handled media retrieval requests read the content from the BLOB and returned it to the caller. This was one round trip to the server. The problem with this approach is that the client gets no feedback on the progress of the operation.

There is no problem in computer science that cannot be solved by an extra level of indirection.

We started by refactoring the method into three methods.

Method1 sets up the conversation between the caller and the web service. This includes information about the request (like the media Id) and a capabilities exchange. The web service responds with a ticket Id, which the caller uses in subsequent requests. This initial call is used for resource allocation.

Method2 is called repeatedly until there is no more data to be retrieved for the media. Each call includes the current offset and the ticket Id that was provided when Method1 was called. The return value updates the current position.

Method3 is called to finish the request once Method2 reports that reading of the requested media has completed. This frees the allocated resources.
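A minimal sketch of what such a three-call contract could look like in WCF (the operation names, the ticket type, and the chunk-size parameter are illustrative assumptions, not the original code):

    using System.ServiceModel;

    [ServiceContract]
    public interface IMediaRetrieval
    {
        // Method1: allocate server-side resources and exchange capabilities.
        // Returns a ticket Id the caller passes to the following calls.
        [OperationContract]
        string BeginRetrieval(int mediaId);

        // Method2: read the next chunk starting at the given offset.
        // An empty array signals that the media has been fully read.
        [OperationContract]
        byte[] ReadChunk(string ticketId, long offset, int maxBytes);

        // Method3: release the resources allocated by BeginRetrieval.
        [OperationContract]
        void EndRetrieval(string ticketId);
    }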

This approach is practical because you can give immediate feedback to the user about the progress of the operation. As a bonus, you can split the Method2 requests across different threads. Progress can then be reported per chunk, as some BitTorrent clients do.


Depending on the size of the BLOB, you can choose to load it from the database in one go or to read it in chunks as well. This means you could use a balanced mechanism that, based on a given watermark (the BLOB size), chooses between loading it in one go and reading it in chunks.


If there is still a performance issue, consider compressing the results using GZipStream, or read about message encoders, paying particular attention to the binary encoder and the Message Transmission Optimization Mechanism (MTOM).
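As a rough sketch, compressing the buffer with GZipStream before returning it might look like this (the method is illustrative, and already-compressed TIFFs may not shrink much):

    using System.IO;
    using System.IO.Compression;

    // Compress a raw image buffer before sending it over the wire.
    // The client reverses this with CompressionMode.Decompress.
    public static byte[] Compress(byte[] raw)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(raw, 0, raw.Length);
            }
            return output.ToArray();
        }
    }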

smink
Thank you for your comment. We are using an API to retrieve the document from a third party repository. Our issue is really that WCF seems to do a poor job of compressing the data when it's a large TIFF file.
schmoopy
With that said, your suggestion may still be of value if we can figure out a way to multithread getting chunks of the image back, but we may be limited by the 3rd party API.
schmoopy
Being limited to the 3rd party API, you are probably left with the single choice of using a binary transport channel and doing custom compression, using GZipStream for example. I assume the 3rd party library can only return the images in one go and not in chunks.
smink
+1  A: 

What bindings are you using? WCF will have some overhead, but if you use basic-http with MTOM you lose most of the base-64 overhead. You'll still have the headers etc.

Another option would be to (wait for it...) not use WCF here - perhaps just a handler (ashx etc) that returns the binary.
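As a rough illustration of that approach (ImageRepository.GetImageBytes is a hypothetical stand-in for whatever API actually fetches the image):

    using System.Web;

    // Minimal ashx-style handler that returns the TIFF bytes
    // without any SOAP envelope or base-64 encoding.
    public class ImageHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            byte[] image = ImageRepository.GetImageBytes(
                context.Request.QueryString["id"]);

            context.Response.ContentType = "image/tiff";
            context.Response.BinaryWrite(image);
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }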

Re compression - WCF itself won't have much hand in compression; the transport might, especially via IIS etc with gzip enabled - however, images are notorious for being hard to compress.

Marc Gravell
Good question: we are using netTcp. We are at the mercy of the API, and all calls to it must be performed through the WCF service tier.
schmoopy
+2  A: 

If you are using another .NET assembly as your client, you can use two methodologies for returning large chunks of data: streaming or MTOM.

Streaming will allow you to pass a TIFF image as if it were a normal file stream on the local filesystem. See here for more details on the choices and their pros and cons.
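For reference, a sketch of switching a binding to streamed transfer in code (the 64 MB cap is an arbitrary example value):

    using System.ServiceModel;

    // Use streamed rather than buffered transfer on the binding.
    var binding = new NetTcpBinding
    {
        TransferMode = TransferMode.Streamed,
        // Raise the message size cap to accommodate large TIFFs.
        MaxReceivedMessageSize = 64 * 1024 * 1024
    };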

Unfortunately, you're still going to have to transfer a large block of data, and I can't see any way around that, considering the points already raised.

ZombieSheep
+2  A: 

I just wanted to add that it is pretty important to make sure your data is being streamed instead of buffered.

I read somewhere that even if you set transferMode to 'Streamed', the message is not actually streamed unless the parameter or return value is a Stream itself, a Message, or an implementation of IXmlSerializable.

Make sure you keep that in mind.
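To illustrate the restriction (the contract and operation names are made up for the example), a streamed operation has to expose its payload as one of those types:

    using System.IO;
    using System.ServiceModel;

    [ServiceContract]
    public interface IImageStreaming
    {
        // Because the return type is Stream, WCF can stream the response.
        // Returning byte[] here would silently fall back to buffering.
        [OperationContract]
        Stream GetImage(string imageId);
    }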

sebastian
Great point, had not even considered that. Will investigate. We may have to move the image retrieval portion to a separate service that is streamed.
schmoopy
+5  A: 

Just to second the responses by ZombieSheep and Seba Gomez: you should definitely look at streaming your data. By doing so, you could seamlessly integrate GZipStream into the process. On the client side you can reverse the compression and convert the stream back to your desired image.
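A sketch of what that integration might look like on each side (the helper names are illustrative; note the server-side version buffers into memory, which a custom stream could avoid):

    using System.IO;
    using System.IO.Compression;

    // Server side: compress the source image into a stream WCF can return.
    public static Stream CompressToStream(Stream source)
    {
        var buffer = new MemoryStream();
        using (var gzip = new GZipStream(buffer, CompressionMode.Compress, leaveOpen: true))
        {
            source.CopyTo(gzip);
        }
        buffer.Position = 0;
        return buffer;
    }

    // Client side: read the received stream through a decompressing wrapper.
    public static Stream Decompress(Stream compressed)
    {
        return new GZipStream(compressed, CompressionMode.Decompress);
    }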

With streaming, only a select set of classes can be used as parameters/return types, and you do need to modify your bindings throughout.

Here is the MSDN site on enabling streaming. This is the MSDN page that describes the restrictions on streaming contracts.

I assume you also control the client-side code; this might be really hard if you don't. I have only used streaming when I had control of both the server and the client.

Good luck.

smaclell