Given a class:

[DataContract]
public sealed class ChangedField
{
    [DataMember(Name="I")] public ushort FieldId { get; set; }
    [DataMember(Name="V")] public object Value { get; set; }
}

Wireshark shows that, when sent via a WCF TCP binding, the message is encoded in binary (only printable characters shown here, but you get the idea):

ChangedFielda.I..a.V....e:double..e http://www.w3.org/2001/XMLSchema.s.....a.

But if I serialise an instance of this type like so...

var ser = new DataContractSerializer(typeof(ChangedField));
var stream = new MemoryStream();
ser.WriteObject(stream, new ChangedField { FieldId = 1, Value = 1.23d });

...then the stream contains XML resembling this:

<ChangedField>
  <I>1</I>
  <V i:type="a:double" xmlns:a="http://www.w3.org/2001/XMLSchema">1.23</V>
</ChangedField>

So my question is how can I control DataContractSerializer to produce this binary representation in my own code?

As an aside:

As you can see, the message is bloated by the fact that the object property must have its type encoded (hence the URI). I'm going to change this to use a custom binary encoding, as in my scenario, the field ID determines the type (in this case, double).

A: 

Have you read the documentation on the DataContractSerializer?

To use the DataContractSerializer, first create an instance of a class and an object appropriate to writing or reading the format; for example, an instance of the XmlDictionaryWriter. Then call the WriteObject method to persist the data. To retrieve data, create an object appropriate to reading the data format (such as an XmlDictionaryReader for an XML document) and call the ReadObject method.

You will then want to read about XmlDictionaryWriter.
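Concretely, XmlDictionaryWriter has a static factory method for the .NET binary XML format, which can be handed to the DataContractSerializer. A minimal sketch using the question's ChangedField type (note the output is the raw binary XML infoset, not the full framed message Wireshark captured, so the bytes won't match exactly):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Xml;

var ser = new DataContractSerializer(typeof(ChangedField));
using (var stream = new MemoryStream())
// CreateBinaryWriter emits the .NET Binary XML format used by the binary encoder.
using (var writer = XmlDictionaryWriter.CreateBinaryWriter(stream))
{
    ser.WriteObject(writer, new ChangedField { FieldId = 1, Value = 1.23d });
    writer.Flush();
    Console.WriteLine("Binary size: {0} bytes", stream.Length);
}
```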

John Saunders
@John - I don't see how XmlDictionaryWriter helps. It's an abstract type with many abstract members, and I couldn't find any public implementations of it in the framework. Could you elaborate?
Drew Noakes
+2  A: 

The TCP binding uses the binary message encoder by default, whereas in your second example you're just serialising the data contract to XML. The binary message encoder essentially provides the data contract serializer with a custom XmlWriter implementation that generates a proprietary binary format instead of XML.

If you want to use this with a different binding (say, HTTP), then you need to create a custom binding and add the BinaryMessageEncodingBindingElement instead of the usual TextMessageEncodingBindingElement.

tomasr
Thanks tomasr. I don't want to change the binding. What I want to do is write some test cases that prove the difference in message size, but I don't want to snoop with WireShark every time.
Drew Noakes
Drew: That's possible as well. I've posted an example app showing how to do it here: http://winterdom.com/2009/07/comparing-text-and-binary-serialization-in-wcf
tomasr
Ah, that looks much more like what I am after. I'll try it out at work tomorrow and get back to you. Thanks very much!
Drew Noakes
I've summarised the approach in code in another answer: http://stackoverflow.com/questions/1074008/wcf-measuring-approximate-message-sizes-programmatically/1206831#1206831
Drew Noakes
+1  A: 

You cannot control the DataContractSerializer - but you can control in your WCF binding whether it'll serialize to text format or binary.

As you noticed yourself, NetTcpBinding defaults to binary and is therefore very compact and fast. The HTTP-based bindings default to text for interoperability's sake and are therefore more verbose and less performant.

But there's nothing stopping you from creating your own binary-based HTTP binding - IF you're willing to accept that the interoperability with external clients is out the window if you do. This will ONLY work with your own internal clients that use the same custom binding.

Basically, what you need to do is build your own custom binding from scratch - in code or config. In config it's pretty easy:

<system.serviceModel>
  <bindings>
     <customBinding>
        <binding name="binaryHttpBinding">
           <binaryMessageEncoding />
           <httpTransport />
        </binding>
     </customBinding>
  </bindings>
</system.serviceModel>

The two minimum parts you must specify are a message encoding and a transport protocol - anything else is optional.
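The same binding can also be built in code. A minimal sketch (the variable name is illustrative; the encoding element must come before the transport element):

```csharp
using System.ServiceModel.Channels;

// Binary-over-HTTP custom binding: encoding first, transport last.
var binaryHttpBinding = new CustomBinding(
    new BinaryMessageEncodingBindingElement(),
    new HttpTransportBindingElement());
```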

Now you can configure your WCF service and client so that they'll use this new custom binary-over-HTTP binding:

<system.serviceModel>
   <services>
      <service name="YourService">
         <endpoint address="http://whatever.com:8888/YourAddress"
                   binding="customBinding"
                   bindingConfiguration="binaryHttpBinding"
                   contract="IYourContract" />
      </service>
   </services>
</system.serviceModel>

(and the same also works for the client in the <client> section, of course).

There you have it - it's not a matter of controlling the DataContractSerializer in your code in any way, shape or form - just create a custom binding and use it.

Marc

EDIT (after comment left):
Well, WCF is indeed very extensible, so you can use extension points such as message inspectors on the client's outgoing path. You could write a message inspector (implementing IClientMessageInspector) that does nothing but inspect each outgoing message, measure its size, and write that out to a file or database.
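A minimal sketch of such an inspector (the class name is illustrative; writing the buffered copy through a binary XmlDictionaryWriter only approximates what the binary encoder puts on the wire):

```csharp
using System;
using System.IO;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Dispatcher;
using System.Xml;

class SizeMeasuringInspector : IClientMessageInspector
{
    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        // Messages can only be read once, so buffer it and hand a fresh copy back.
        MessageBuffer buffer = request.CreateBufferedCopy(int.MaxValue);
        request = buffer.CreateMessage();

        using (var stream = new MemoryStream())
        using (var writer = XmlDictionaryWriter.CreateBinaryWriter(stream))
        {
            buffer.CreateMessage().WriteMessage(writer);
            writer.Flush();
            Console.WriteLine("Outgoing message: {0} bytes", stream.Length);
        }
        return null;
    }

    public void AfterReceiveReply(ref Message reply, object correlationState) { }
}
```

The inspector is attached to a client endpoint via an IEndpointBehavior that adds it to the runtime's ClientMessageInspectors collection.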

marc_s
Thanks Marc for this answer. It's interesting, but not really what I'm asking. I really just want to measure the number of bytes that would be sent for a particular object using whatever binary formatting the NetTcpBinding uses, but from code. I'm working on reducing the on-the-wire size, and it's easier to do this with a small app or unit test than with a client/server and a packet sniffer.
Drew Noakes
A: 

Here is the simplest solution I could find to the problem:

using System.IO;
using System.Runtime.Serialization;
using System.ServiceModel.Channels;

static byte[] Serialise(object obj, MessageEncodingBindingElement encoding)
{
    encoding.MessageVersion = MessageVersion.Soap12;
    var stream = new MemoryStream();
    var message = Message.CreateMessage(MessageVersion.Soap12, "", obj,
        new DataContractSerializer(obj.GetType()));
    encoding.CreateMessageEncoderFactory().Encoder.WriteMessage(message, stream);
    return stream.ToArray();
}

...where the encoding parameter is either:

new TextMessageEncodingBindingElement()

...or...

new BinaryMessageEncodingBindingElement()

...depending upon whether you want text or binary formatting.
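For example, comparing the two encodings for the question's ChangedField type might look like this (the reported sizes include the SOAP envelope, so they overstate the payload itself):

```csharp
using System;
using System.ServiceModel.Channels;

var field = new ChangedField { FieldId = 1, Value = 1.23d };

int textSize = Serialise(field, new TextMessageEncodingBindingElement()).Length;
int binarySize = Serialise(field, new BinaryMessageEncodingBindingElement()).Length;

Console.WriteLine("Text: {0} bytes, Binary: {1} bytes", textSize, binarySize);
```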

Thanks to tomasr for the link to his article on this topic.

Drew Noakes
Do you use this along with a message inspector? I mean, where do you hook that method in?
dragonfly