I use an ASP.NET [WebMethod] to push a .NET object back to the Ajax call in the browser.
One of the properties of the object is a DateTime.
When it arrives at the browser, the time is seven hours earlier than the time stored in SQL Server.
Okay, so my browser is in Peru (GMT-5) and the server is in Germany (currently GMT+2); that's where the 7 hours come from.
As a fix, I send the client's UTC offset along with the Ajax request:
var d = new Date();
// minutes behind UTC, e.g. 300 for GMT-5
var clientOffsetMinutes = d.getTimezoneOffset();
then on the server I figure out the offset there:
// get a local time zone info
TimeZoneInfo tz = TimeZoneInfo.Local;
// get the base offset in whole hours (BaseUtcOffset does not include DST)
int offset = tz.BaseUtcOffset.Hours;
// add one hour if we are currently in daylight saving time
if (tz.IsDaylightSavingTime(DateTime.Now))
{
offset++;
}
Now I can fix the time field in my object before it is sent to the browser.
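Roughly, the adjustment looks like this (clientOffsetMinutes is the value from getTimezoneOffset() above; MyDto, TheDate and LoadFromDatabase are just placeholders for my real types, and GetUtcOffset is a shortcut for the offset calculation shown before):
[WebMethod]
public MyDto GetTheDate(int clientOffsetMinutes)
{
    // placeholder for however the object is really loaded
    MyDto dto = LoadFromDatabase();

    // the server's current UTC offset, DST included (Germany in summer: +2)
    TimeSpan serverOffset = TimeZoneInfo.Local.GetUtcOffset(DateTime.Now);

    // shift the value so that after the serializer converts it from
    // "server local" to UTC, and the browser shifts it into its own
    // local time, the original database value is what gets displayed
    dto.TheDate = dto.TheDate.Add(serverOffset).AddMinutes(clientOffsetMinutes);

    return dto;
}
That works, but it feels like a workaround rather than a real solution.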
My real question is: how does the serializer know about the 7 hours?
The HTTP request doesn't include any time information.
Is it too much to ask that I get the exact time as stored in the database?
Update:
Here's an example: the date in the database is 2009-Oct-15 22:00.
There is no time zone information attached to that.
When I call my WebMethod on my dev machine where client and server are obviously in the same time zone, the JSON from the server is:
{"d":{"TheDate":"\/Date(1255662000000)\/"}}
The JSON from the remote server in Germany is:
{"d":{"TheDate":"\/Date(1255636800000)\/"}}
There is a difference of 7 hours in the JSON as seen in Firebug. At this point there is no JavaScript involved yet.
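The 7 hours are right there in the numbers: both values are milliseconds since the Unix epoch, so a quick throwaway check (nothing to do with the web service itself) shows the gap:
DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
// dev machine: 2009-10-16 03:00 UTC, i.e. 2009-Oct-15 22:00 at GMT-5
DateTime devValue = epoch.AddMilliseconds(1255662000000);
// German server: 2009-10-15 20:00 UTC, i.e. 2009-Oct-15 22:00 at GMT+2
DateTime remoteValue = epoch.AddMilliseconds(1255636800000);
Console.WriteLine(devValue - remoteValue); // 07:00:00
So each server seems to treat 22:00 as local time in its own time zone and converts it to UTC.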
One idea I had was that ASP.NET attaches a time zone to the session, but that doesn't seem to be the case.