I have a .NET 3.5 SP1 WCF service that is consumed exclusively by Silverlight 3 clients. Due to the business requirements we have to work with large object graphs that are hydrated from SQL Server on the WCF side and then sent to the Silverlight client. They are deep: a class might have two collection properties, and each item in those collections has collections inside of it. The fundamental design is what it is, something I inherited and must work within for the short term.

How big are we talking? An example top-level collection with 250 items serializes to 14 MB on the wire with no modifications (httpBinding and the DataContractSerializer). 250 items is small; the requirements we are facing call for working with 10,000+ items, which puts us well over 500 MB to pull across the wire. No walk in the park - in fact, you could probably take a walk in the park while that churned away.

So there are several things we are considering. One is to move away from the DataContractSerializer to the XmlSerializer, so we can move many of these properties into XML attributes and reduce the payload size. We are also looking at binary XML bindings.
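For the binary XML route: Silverlight 3 supports WCF's binary message encoder over plain HTTP via a customBinding. A minimal server-side configuration sketch (the binding, service, and contract names here are placeholders, not from the question):

```xml
<system.serviceModel>
  <bindings>
    <customBinding>
      <!-- Binary message encoding over HTTP; supported by Silverlight 3 clients -->
      <binding name="binaryHttp">
        <binaryMessageEncoding />
        <httpTransport maxReceivedMessageSize="2147483647" />
      </binding>
    </customBinding>
  </bindings>
  <services>
    <!-- "MyService" / "IMyService" are illustrative placeholders -->
    <service name="MyNamespace.MyService">
      <endpoint address="" binding="customBinding"
                bindingConfiguration="binaryHttp"
                contract="MyNamespace.IMyService" />
    </service>
  </services>
</system.serviceModel>
```

The client-side ServiceReferences.ClientConfig needs a matching customBinding; regenerating the service reference after the server change will usually produce it.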

My question is this: what would you do? Can IIS compression play a role here? Is moving away from the DataContractSerializer a bad idea? Is there a better technique? Am I up a creek without a paddle?

A: 

Do you absolutely need to pull down all of that data in one shot? If this is a Silverlight app, I can't see you physically displaying 10,000+ (or even 250 for that matter) records of the size you describe. Is it possible to use paging to reduce the amount of data pulled across the wire?

Instead of pulling 10,000+ records at once, you could show 10-20 records per page, so the service only queries and serializes 10-20 at a time.
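As a rough sketch of what that paging could look like on the service side (all names here are hypothetical, and the data access is assumed to be a LINQ-capable context such as LINQ to SQL):

```csharp
[ServiceContract]
public interface IItemService
{
    // Hypothetical paged operation: the client asks for one page at a time
    // instead of the whole 10,000+ item graph.
    [OperationContract]
    List<Item> GetItemsPage(int pageIndex, int pageSize);
}

public class ItemService : IItemService
{
    public List<Item> GetItemsPage(int pageIndex, int pageSize)
    {
        // Skip/Take translate to a paged SQL query, so only one page
        // is hydrated from SQL Server and serialized across the wire.
        using (var db = new ItemsDataContext())
        {
            return db.Items
                     .OrderBy(i => i.Id)
                     .Skip(pageIndex * pageSize)
                     .Take(pageSize)
                     .ToList();
        }
    }
}
```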

Not sure if that is an option for you, given the requirements you describe, but that seems to be the most obvious solution to me.


Also, to marginally help your performance, have you made sure serialization assemblies are being generated for your business objects? This would optimize at least the serialization/deserialization piece of your code. It may not be a huge performance boost, but it will help.
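For reference, if the service moves to the XmlSerializer (as the question considers), serialization assemblies can be pre-generated at build time rather than emitted on first use at runtime, via the project file (this setting applies to XmlSerializer, not the DataContractSerializer):

```xml
<!-- In the service project's .csproj -->
<PropertyGroup>
  <GenerateSerializationAssemblies>On</GenerateSerializationAssemblies>
</PropertyGroup>
```

The same result can be produced manually with the sgen.exe tool that ships with the SDK.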

Dan Herbert
Long term I will refactor things so that only the fields a given view needs come across the wire, but the reality is that for now we have considerable charting requirements that mean we operate over the full data set on the Silverlight side. I need all 10K items because different fields feed different elements of the chart, and their child objects are needed for aggregates and such.
keithwarren7
A: 

As others have said, first look at whether you really need to transfer this amount of data.

If you do, the best way to send large amounts of data over WCF is streaming; see:

http://msdn.microsoft.com/en-us/library/ms733742.aspx
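On the server side, streamed transfer is a binding setting; a minimal sketch (binding name is illustrative). Note that the operation contract must then return a Stream (or Message) rather than a buffered object graph, and it is worth verifying how far the Silverlight 3 HTTP stack honors streaming end to end:

```xml
<basicHttpBinding>
  <!-- StreamedResponse avoids buffering the entire response in memory;
       only applies when the operation returns a Stream or Message. -->
  <binding name="streamedHttp"
           transferMode="StreamedResponse"
           maxReceivedMessageSize="2147483647" />
</basicHttpBinding>
```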

Shiraz Bhaiji