My application has a WCF service tier between the front-end and the database. Since we currently host in IIS 6 we are using SOAP over HTTP. How can I find out how much real-world time I am spending on serialization activities in my application?
One of the dead simple things old mainframes always used to do was to include a "server time" field in the response. That way you know that anything that happened between your observed time and the server time is network/serialization overhead. I know it's kind of 1960s-ish, but it still works well.
If you want to avoid changing the method signature you can set it in an HTTP response header or something ;)
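For what it's worth, a minimal sketch of that trick in WCF (the X-Server-Time header name is made up for this example):

    using System;
    using System.ServiceModel;
    using System.ServiceModel.Channels;

    static class ServerTimeStamp
    {
        // Call this inside a service operation to stamp the outgoing
        // HTTP reply with the time the server finished its work.
        public static void Stamp()
        {
            var httpResponse = new HttpResponseMessageProperty();
            httpResponse.Headers["X-Server-Time"] = DateTime.UtcNow.ToString("o");
            OperationContext.Current.OutgoingMessageProperties[HttpResponseMessageProperty.Name] = httpResponse;
        }
    }

On the client, parse the header and compare it with your locally observed completion time; the difference is serialization plus network overhead (modulo clock skew between the two machines).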
Create a message inspector and add it to your endpoint behavior.
The message inspector implements IDispatchMessageInspector, which has the following two methods:
    public object AfterReceiveRequest(ref Message request, IClientChannel channel, InstanceContext instanceContext)
    {
        // Whatever you return here is handed back to BeforeSendReply
        // as the correlationState parameter.
        return DateTime.Now;
    }

    public void BeforeSendReply(ref Message reply, object correlationState)
    {
        // correlationState is the start time captured in AfterReceiveRequest.
        TimeSpan period = DateTime.Now - (DateTime)correlationState;
    }
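A minimal sketch of the endpoint behavior that wires the inspector in (assuming the two methods above live in a class called TimingInspector that implements IDispatchMessageInspector; both class names are made up for this example):

    using System.ServiceModel.Channels;
    using System.ServiceModel.Description;
    using System.ServiceModel.Dispatcher;

    public class TimingBehavior : IEndpointBehavior
    {
        public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher)
        {
            // Attach the inspector to the server-side dispatch pipeline.
            endpointDispatcher.DispatchRuntime.MessageInspectors.Add(new TimingInspector());
        }

        // The remaining IEndpointBehavior members are no-ops here.
        public void AddBindingParameters(ServiceEndpoint endpoint, BindingParameterCollection bindingParameters) { }
        public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime) { }
        public void Validate(ServiceEndpoint endpoint) { }
    }

You can then add TimingBehavior to the endpoint's Behaviors collection in code, or expose it through a BehaviorExtensionElement if you prefer configuration.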
The xperf tools are a good choice if you can run on Windows Server 2008 (and Vista on the client). These are ETW based and use the Windows sample profile interrupt to profile anything running on the system. Here is a series of posts on the xperf tools from myself. This post is specifically about profiling.
Note that you can download the latest version of the tools directly from this page.
You may want to experiment with some of the other kernel events, such as disk IO and hard faults.
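As a rough sketch, a capture session along those lines might look like this (flag names per the documented xperf kernel providers; adjust to what you want to see):

    rem Start a kernel session: sampled profiling plus disk I/O and
    rem hard-fault events, collecting stacks on each profile sample.
    xperf -on PROC_THREAD+LOADER+PROFILE+DISK_IO+HARD_FAULTS -stackwalk Profile

    rem ... exercise the WCF service under load ...

    rem Stop the session and merge the trace into an .etl file for analysis.
    xperf -d serialization-trace.etl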
Not about measuring - but about improving: I've been working on protobuf-net, an implementation of Google's "protocol buffers" (a compact, low-CPU binary serialization format) for use with .NET - including a WCF hook (to replace the DataContractSerializer). It has some pretty good metrics re serialization.
When used with the basic http binding, it also works with MTOM, so you don't even get the base-64 overhead of binary. There is a WCF sample here.
It might be of interest...
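If you want to get a feel for the difference yourself, here is a minimal sketch (the Order type and iteration count are invented for illustration) that times the default DataContractSerializer against protobuf-net's Serializer:

    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Runtime.Serialization;
    using ProtoBuf;

    [DataContract, ProtoContract]
    public class Order
    {
        [DataMember, ProtoMember(1)]
        public int Id { get; set; }

        [DataMember, ProtoMember(2)]
        public string Customer { get; set; }
    }

    static class SerializerTiming
    {
        static void Main()
        {
            var order = new Order { Id = 42, Customer = "Acme" };
            const int iterations = 100000;

            // Time WCF's default serializer.
            var dcs = new DataContractSerializer(typeof(Order));
            var watch = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
            {
                using (var ms = new MemoryStream())
                {
                    dcs.WriteObject(ms, order);
                }
            }
            watch.Stop();
            Console.WriteLine("DataContractSerializer: {0} ms", watch.ElapsedMilliseconds);

            // Time protobuf-net on the same object.
            watch = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
            {
                using (var ms = new MemoryStream())
                {
                    Serializer.Serialize(ms, order);
                }
            }
            watch.Stop();
            Console.WriteLine("protobuf-net: {0} ms", watch.ElapsedMilliseconds);
        }
    }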