I am experimenting with a program that inserts data into a SQL Server 2005 database (on XP SP3) at a high rate of speed. (This is for collecting timing data so I can evaluate different aspects of my design.)
My basic setup involves inserting data into a table like the following (using an SP that just supplies the payload field):
create table data
(
Id int Identity PRIMARY KEY,
payload datetime not null,
inserted datetime default (getdate()) not null
)
Note that both datetime fields have UNIQUE constraints on them as well.
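For reference, the SP is little more than a wrapper around the insert. It looks roughly like this (the name dbo.InsertData is just illustrative):

create procedure dbo.InsertData
    @payload datetime
as
begin
    set nocount on;
    -- Id comes from the identity; inserted comes from the getdate() default
    insert into data (payload) values (@payload);
end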
On the client program I was calling the SP in such a tight loop that I ran into problems with the precision of the .NET DateTime.Now value (and possibly thread sleeping as well), and was therefore violating the payload field's unique constraint. I addressed this with a combination of a Stopwatch variable, a bit of Thread.Sleep(), and manually constructing the "payload" value so that it didn't violate the resolution of the SQL datetime field (about 3.3 ms).
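To illustrate, the client loop now looks roughly like the sketch below (simplified for this post; assume conn is an open SqlConnection, dbo.InsertData is the wrapper SP above, and the 10 ms step is just the interval I happen to be testing):

using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Threading;

class TimingHarness
{
    static void RunInserts(SqlConnection conn, int insertCount)
    {
        DateTime baseTime = DateTime.Now;
        Stopwatch watch = Stopwatch.StartNew();

        using (SqlCommand cmd = new SqlCommand("dbo.InsertData", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            SqlParameter payload = cmd.Parameters.Add("@payload", SqlDbType.DateTime);

            for (int i = 0; i < insertCount; i++)
            {
                // Build the payload from a fixed base time plus a whole-millisecond step,
                // so two values can never land on the same ~3.3 ms datetime tick.
                payload.Value = baseTime.AddMilliseconds(i * 10);
                cmd.ExecuteNonQuery();

                // Pace the loop off the Stopwatch rather than DateTime.Now,
                // sleeping until the next insert is due.
                long wait = ((i + 1) * 10) - watch.ElapsedMilliseconds;
                if (wait > 0)
                    Thread.Sleep((int)wait);
            }
        }
    }
}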
However, with the inserts being generated at intervals of between 5 ms and 10 ms, I have started to see issues with the "inserted" field on the SQL side, where a row is regularly being rejected for a unique key violation. It is only when I slow my insert rate to around 15 ms or more that this problem goes away. That interval is suspiciously close to the precision issue I had with .NET DateTime.Now (I read 16 ms in a post somewhere), so I am wondering what the actual precision of the SQL GetDate() function is.
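As a sanity check, I imagine I could eyeball the tick size of GetDate() directly with a crude server-side loop like this (illustrative only, not my actual test harness):

declare @samples table (ts datetime not null)
declare @i int
set @i = 0

while @i < 10000
begin
    insert into @samples (ts) values (getdate())
    set @i = @i + 1
end

-- the distinct values (and the gaps between them) show how often the clock actually ticks
select distinct ts from @samples order by ts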
So can someone tell me what is backing GetDate(), whether it is tied to the same source as the .NET DateTime.Now value, and what sort of precision I should expect from it?
As an aside, I know about the DATETIME2 type in SQL Server 2008, which raises the question of what the precision of GetDate() is on that platform as well.