I'm setting a date on a DateTime field and passing the object to my WCF web service, which is hosted on the same server:

// on the client        
myObject.Date = DateTime.Now;
myChangedObject = proxy.DoNothing(myObject);  // passes back the object

// on the server
public MyObjectType DoNothing(MyObjectType source)
{
    var obj = new MyObjectType();
    obj.Date = source.Date;

    return obj;
}

When it reaches the server, the DateTime is precise down to the tick, but the object I receive back has a different number of ticks.

    Assert.IsTrue(myChangedObject.Date == myObject.Date);  // fails miserably
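
Comparing the raw Ticks values (a DateTime stores time as 100-nanosecond ticks) shows the mismatch directly:

// the low-order (sub-second) digits differ after the round trip
Console.WriteLine(myObject.Date.Ticks);
Console.WriteLine(myChangedObject.Date.Ticks);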

What am I doing wrong here? I verified that the values being stored in

+2  A: 

Storing a DateTime in SQL Server's datetime type loses precision, because datetime keeps time only in 1/300-second increments (values are rounded to .000, .003, or .007 seconds). I would use a function that truncates both values to whole seconds before comparing, like this:

bool AreDateTimesEqual(DateTime a, DateTime b)
{
    // Drop all sub-second ticks (a tick is 100 ns); removing only the
    // Millisecond component would leave sub-millisecond ticks behind.
    return a.AddTicks(-(a.Ticks % TimeSpan.TicksPerSecond))
        == b.AddTicks(-(b.Ticks % TimeSpan.TicksPerSecond));
}
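
With that helper, the assertion becomes:

    Assert.IsTrue(AreDateTimesEqual(myChangedObject.Date, myObject.Date));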

Or, if you have control over the schema and are on SQL Server 2008 or later, switch to datetime2, which has the full 100-nanosecond tick precision of DateTime.
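
If you go that route, make sure the parameter is sent as DateTime2 as well, since a parameter typed as plain datetime would still be rounded on the way in. A minimal sketch (the table, column, and connection are placeholders):

using System;
using System.Data;
using System.Data.SqlClient;

class DateTime2Example
{
    static void SaveDate(SqlConnection connection, int id, DateTime date)
    {
        // SqlDbType.DateTime2 preserves the full tick precision of DateTime;
        // "MyTable" and its columns are hypothetical names.
        using (var cmd = new SqlCommand(
            "UPDATE MyTable SET Date = @date WHERE Id = @id", connection))
        {
            cmd.Parameters.Add("@date", SqlDbType.DateTime2).Value = date;
            cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
            cmd.ExecuteNonQuery();
        }
    }
}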

Gabe
A: 

Maybe you should try DateTime.UtcNow instead?

I can't speak for Microsoft's implementation, but I had a similar problem with a DataSet passed via WCF: all of its DateTime columns had their DateTimeMode set to DataSetDateTime.UnspecifiedLocal, and the web service (I guess) tried to be clever, so it converted the times to UTC. It then tried to reverse the process on the client side (still using DataSetDateTime.UnspecifiedLocal), but the client was in a different time zone, so it failed miserably.
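
If you hit the same issue, one option is to take the guesswork away from the serializer by marking the columns as UTC explicitly. A minimal sketch (the table and column names are made up):

using System;
using System.Data;

class UtcColumnExample
{
    static DataTable BuildTable()
    {
        var table = new DataTable("Orders");
        var dateCol = new DataColumn("Date", typeof(DateTime));
        // Serialize and deserialize this column as UTC so the
        // DataSet performs no implicit time-zone conversion.
        dateCol.DateTimeMode = DataSetDateTime.Utc;
        table.Columns.Add(dateCol);
        return table;
    }
}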

To sum up: WCF may be converting your value to UTC somewhere along the way. The most robust approach is to pass everything as UTC and convert to local time only when you need to display it; that is also the best practice for storing dates and times.
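
Applied to the code in the question, that would look something like this:

// on the client: send UTC so no time-zone conversion happens in transit
myObject.Date = DateTime.UtcNow;
myChangedObject = proxy.DoNothing(myObject);

// convert back to local time only at the display boundary
DateTime displayTime = myChangedObject.Date.ToLocalTime();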

Paweł Dyda