I have a WSDL that the consumer of my web service expects will be adhered to strictly. I converted it into an interface with wsdl.exe and had my web service implement it. Except for this problem, I have been generally pleased with the results.
A simple GetCurrentTime method will have the following response class generated from the WSDL in the interface definition:
[System.CodeDom.Compiler.GeneratedCodeAttribute("wsdl", "2.0.50727.3038")]
[System.SerializableAttribute()]
[System.Diagnostics.DebuggerStepThroughAttribute()]
[System.ComponentModel.DesignerCategoryAttribute("code")]
[System.Xml.Serialization.XmlTypeAttribute(Namespace="[Client Namespace]")]
public partial class GetCurrentTimeResponse {

    private System.DateTime timeStampField;

    [System.Xml.Serialization.XmlElementAttribute(Form=System.Xml.Schema.XmlSchemaForm.Unqualified)]
    public System.DateTime TimeStamp {
        // [accesses timeStampField]
    }
}
When I put the response data into the automatically generated response class, it gets serialized into an appropriate XML response. (Most of the web methods have much more complicated return types with multiple levels of arrays.)
The problem is that the default serialization of DateTime objects violates one of the requirements in the WSDL:
...
<xsd:simpleType name="SearchTimeStamp">
  <xsd:restriction base="xsd:dateTime">
    <xsd:pattern value="[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}(.[0-9]{1,7})?Z"/>
  </xsd:restriction>
</xsd:simpleType>
...
Note the last part of the pattern, where subseconds must be either 1 or 7 digits if they are included. The client seems to be rejecting the response because the serialized timestamp does not match that requirement.
The main issue is that when .NET serializes a DateTime object, it omits all trailing zeroes, so the resulting subsecond value varies in length (e.g., "12:34:56.700" gets serialized as "<TimeStamp>12:34:56.7</TimeStamp>" by default). We use millisecond precision, so I need all timestamps to be formatted with 7 subsecond digits in order to be compliant with the WSDL.
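The trimming is easy to reproduce with a small console program. This is only a demonstration sketch; the Demo class and its element name are made up, not part of the service:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Hypothetical type used only to demonstrate the default DateTime formatting.
public class Demo
{
    public DateTime TimeStamp;
}

public static class Program
{
    public static void Main()
    {
        var demo = new Demo
        {
            // Millisecond precision: .700 seconds.
            TimeStamp = new DateTime(2009, 11, 30, 12, 34, 56, 700, DateTimeKind.Utc)
        };

        var serializer = new XmlSerializer(typeof(Demo));
        var writer = new StringWriter();
        serializer.Serialize(writer, demo);

        // The TimeStamp element comes out with the subsecond trailing
        // zeroes dropped, e.g. "...T12:34:56.7Z".
        Console.WriteLine(writer.ToString());

        // A plain format string produces the fixed 7-digit form the WSDL wants,
        // e.g. "2009-11-30T12:34:56.7000000Z".
        Console.WriteLine(demo.TimeStamp.ToString("yyyy-MM-dd'T'HH:mm:ss.fffffff'Z'"));
    }
}
```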
It would be easy if I could specify a format string, but I'm not sure how to control the string a DateTime object uses when it is serialized to XML, or how to otherwise override the serialization behavior. How do I do this? Keep in mind the following...
- I would like to modify the generated code as little as possible... preferably not at all if the change can be made through a partial class or inherited class.
- Using an inherited class for the return type of the web method will cause the web service to no longer implement the auto-generated interface.
- The TimeStamp type occurs in other, more complex response types. So, manually overriding the entire serialization process may be prohibitively time-consuming.
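One idea I've considered is a string proxy property in a partial class, which would let me apply the format string myself. This is only a sketch (TimeStampString is a name I made up), and it has a catch: the generated TimeStamp property would also need an [XmlIgnore] attribute so the serializer doesn't try to emit both elements, and adding that means editing the generated code, which is what I'm trying to avoid:

```csharp
using System;
using System.Globalization;
using System.Xml.Serialization;

public partial class GetCurrentTimeResponse
{
    // Hypothetical proxy property: serialized in place of the DateTime.
    [XmlElement("TimeStamp")]
    public string TimeStampString
    {
        get
        {
            // Always emit 7 subsecond digits and a literal 'Z'.
            return TimeStamp.ToString("yyyy-MM-dd'T'HH:mm:ss.fffffff'Z'",
                                      CultureInfo.InvariantCulture);
        }
        set
        {
            TimeStamp = DateTime.Parse(value, CultureInfo.InvariantCulture,
                                       DateTimeStyles.RoundtripKind);
        }
    }
}
```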
Update:
I tried implementing IXmlSerializable early on, as John suggested, and got the following error:
System.InvalidOperationException: There was an error reflecting type '[...].GetCurrentTimeResponse'. ---> System.InvalidOperationException: Only XmlRoot attribute may be specified for the type [...].GetCurrentTimeResponse. Please use XmlSchemaProviderAttribute to specify schema type.
After that, I figured that I would have to hack up the generated code too much for my liking and kept searching for other solutions.
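For reference, the attempt looked roughly like the following (a reconstruction, not my actual code). The exception above is apparently triggered because the generated half of the partial class already carries the [XmlType] attribute, which the serializer disallows on IXmlSerializable types:

```csharp
using System;
using System.Xml;
using System.Xml.Schema;
using System.Xml.Serialization;

// Taking over serialization of the response type wholesale.
public partial class GetCurrentTimeResponse : IXmlSerializable
{
    public XmlSchema GetSchema() { return null; }

    public void WriteXml(XmlWriter writer)
    {
        // Write the timestamp with a fixed 7-digit subsecond format.
        writer.WriteElementString("TimeStamp",
            TimeStamp.ToString("yyyy-MM-dd'T'HH:mm:ss.fffffff'Z'"));
    }

    public void ReadXml(XmlReader reader)
    {
        reader.ReadStartElement();
        TimeStamp = DateTime.Parse(reader.ReadElementString("TimeStamp"));
        reader.ReadEndElement();
    }
}
```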