I'm new to WCF Data Services so I've been playing. After some initial tests I am disappointed by the performance of my test data service.

I realize that because a WCF DS is HTTP-based there is overhead inherent in the protocol but my tests are still way slower than I would expect:

Environment:

  • All on one box: Quad core 64-bit laptop with 4GB RAM running W7. Decent machine.
  • Small SQL database (SQLExpress 2008 R2) with 16 tables... the table under test has 243 rows.
  • Hosted my test service in IIS with all defaults.

Code:

  • I've created an Entity Framework model (DataContext) for this database (stock codegen by VS2010).
  • I've created a data-service based on this model.
  • I've created a client with a direct service reference (ObjectContext) for this service (stock codegen by VS2010).
  • In the client I am also able to call the EF model directly and also use Native SQL (ADO.NET SqlConnection)

Test Plan:

  • Each iteration connects to the database (there is an option to reuse connections), queries for all rows in the target table ("EVENTS"), and then counts them (thus forcing any deferred fetches to be performed).
  • Run for 25 iterations each for Native SQL (SqlConnection/SqlCommand), Entity Framework (DataContext) and WCF Data Services (ObjectContext).
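A minimal sketch of what one WCF Data Services iteration looks like, assuming a generated service context named EventsEntities with an Events entity set (these names are placeholders, not from the original post):

```csharp
using System;
using System.Linq;

// Hypothetical names: EventsEntities is the VS2010-generated service
// context, and Events is its entity set for the EVENTS table.
var context = new EventsEntities(new Uri("http://localhost/TestService.svc"));

// Enumerating the query sends the HTTP request; counting the materialized
// results forces any deferred fetching to complete.
int rowCount = context.Events.ToList().Count;
Console.WriteLine(rowCount);
```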

Results:

  • 25 iterations of Native SQL: 436ms
  • 25 iterations of Entity Framework: 656ms
  • 25 iterations of WCF Data Services: 12110ms

Ouch. That's about 20x slower than EF.

Since WCF Data Services is HTTP, there's no opportunity for HTTP connection reuse, so the client is forced to reconnect to the web server for each iteration. But surely there's more going on here than that.

EF itself is fairly fast, and the same EF code/model is reused for both the service and the direct-to-EF client tests. There's going to be some overhead for XML serialization and deserialization in the data service, but that much!?! I've had good performance with XML serialization in the past.

I'm going to run some tests with JSON and Protocol-Buffer encodings to see if I can get better performance, but I'm curious if the community has any advice for speeding this up.
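As a quick way to compare payload formats before changing the client, WCF Data Services will return JSON instead of the default Atom/XML when the request carries an Accept header of application/json. A minimal sketch (the service URI and entity set name are placeholders):

```csharp
using System;
using System.IO;
using System.Net;

// Placeholder URI for the test service's EVENTS feed.
var request = (HttpWebRequest)WebRequest.Create(
    "http://localhost/TestService.svc/Events");

// Ask the data service for JSON instead of the default Atom feed.
request.Accept = "application/json";

using (var response = request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string json = reader.ReadToEnd();
    Console.WriteLine(json.Length); // compare against the Atom payload size
}
```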

I'm not strong with IIS, so perhaps there are some IIS tweaks (caches, connection pools, etc.) that could be set to improve this?

A: 

In order to eliminate most of the connection overhead, you can try batching all operations to the WCF DS to see if that makes a significant difference.

NorthwindEntities context = new NorthwindEntities(svcUri);

// someCustomerQuery and someProductsQuery are DataServiceRequest<T>
// instances prepared ahead of time.
var batchRequests =
    new DataServiceRequest[] { someCustomerQuery, someProductsQuery };

// A single HTTP request carries every query in the batch.
var batchResponse = context.ExecuteBatch(batchRequests);

For more info, see the MSDN documentation for DataServiceContext.ExecuteBatch.

ntziolis
A: 

How are you issuing those 25 iterations against WCF?

var WCFobj = new ...Service();
foreach (var call in CallList)
    WCFobj.Call(...);

If you call it like that, you hit the WCF service 25 separate times, which consumes too many resources. My approach is to build everything up into a DataTable: the table name identifies the stored procedure I'm calling, and each DataRow holds one set of parameters. When calling, I pass the whole DataTable in serialized (and optionally encrypted) form:

var table = new DataTable("PROC_CALLING");
// ... add one DataRow per call ...
var sb = new StringBuilder();
using (var xml = System.Xml.XmlWriter.Create(sb))
    table.WriteXml(xml);
var bytes = System.Text.Encoding.UTF8.GetBytes(sb.ToString());
// optionally GZip-compress the bytes here
WCFobj.Call(bytes);

The point is that you pass all 25 calls at once, which improves performance significantly. If the returned objects all share the same structure, return them as a DataTable in byte form too and convert it back on the client. I've used this method with GZip for import/export data modules, because passing a large number of bytes can make WCF unhappy. It depends on whether you would rather spend computing resources or networking resources.

888
A: 

Try setting security to "none" in the binding section of the configuration. This should make a big improvement.
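A minimal sketch of what that looks like in web.config, assuming the data service is exposed over webHttpBinding (the binding and service names below are placeholders):

```xml
<system.serviceModel>
  <bindings>
    <webHttpBinding>
      <!-- "NoSecurityBinding" is a placeholder name -->
      <binding name="NoSecurityBinding">
        <security mode="None" />
      </binding>
    </webHttpBinding>
  </bindings>
  <services>
    <!-- "MyDataService" is a placeholder for the test service type -->
    <service name="MyDataService">
      <endpoint binding="webHttpBinding"
                bindingConfiguration="NoSecurityBinding"
                contract="System.Data.Services.IRequestHandler" />
    </service>
  </services>
</system.serviceModel>
```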

fhj