I'm wondering what the precision of the Timer class in System.Timers is, since its Interval is a double (which would seem to indicate that you can have fractions of a millisecond). What is it?
Windows desktop OSes really aren't accurate below about 40ms. The OS simply isn't real time, so it exhibits significant non-deterministic jitter. That means that while it may report values down to the millisecond or even smaller, you can't count on those values to be meaningful. So even if you set the Timer interval to some sub-millisecond value, you can't rely on the time between setting the timer and it firing to actually be what you asked for.
Add to this the fact that the entire framework you're running under is non-deterministic (the GC could suspend you and run a collection during the time when the Timer should fire), and you end up with loads and loads of risk trying to do anything that is time critical.
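If you want to see this jitter for yourself, here's a rough sketch (the 15 ms interval and the sample count are arbitrary choices, not anything prescribed by the framework) that records the actual gap between Elapsed events with a Stopwatch:

    using System;
    using System.Diagnostics;
    using System.Threading;

    class TimerJitterDemo
    {
        static void Main()
        {
            const double requestedMs = 15;           // arbitrary interval for the test
            var samples = new double[50];
            int count = 0;
            double last = 0;
            var stopwatch = Stopwatch.StartNew();
            var done = new ManualResetEvent(false);

            var timer = new System.Timers.Timer(requestedMs);
            timer.Elapsed += (s, e) =>
            {
                double now = stopwatch.Elapsed.TotalMilliseconds;
                if (count < samples.Length)
                    samples[count++] = now - last;   // actual gap since the previous tick
                else
                    done.Set();
                last = now;
            };
            timer.Start();
            done.WaitOne();
            timer.Stop();

            foreach (double gap in samples)
                Console.WriteLine($"requested {requestedMs} ms, actual {gap:F3} ms");
        }
    }

On a typical desktop the gaps tend to cluster around the system clock tick (historically about 15.6 ms) rather than landing exactly on the value you asked for.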
System.Timers.Timer is weird. It uses a double as the interval, but in fact it calls Math.Ceiling on it and casts the result to an int to use with an underlying System.Threading.Timer. Its theoretical precision is therefore 1 ms, and you can't specify an interval that exceeds 2,147,483,647 ms. Given this, I really don't know why a double is used as the interval parameter.
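A minimal sketch of that conversion (just the same arithmetic described above, not the actual framework source) shows why fractional intervals buy you nothing and where the 2,147,483,647 ms ceiling comes from:

    using System;

    class IntervalRounding
    {
        static void Main()
        {
            // The double interval is effectively rounded up to a whole number
            // of milliseconds before being handed to the underlying timer.
            double[] requested = { 0.25, 1.0, 1.5, 100.7 };

            foreach (double ms in requested)
            {
                int effective = (int)Math.Ceiling(ms);   // same arithmetic as described above
                Console.WriteLine($"requested {ms} ms -> effective {effective} ms");
            }

            // Anything beyond int.MaxValue milliseconds (~24.8 days) can't be
            // represented once cast to int, hence the 2,147,483,647 ms limit.
            Console.WriteLine($"maximum representable interval: {int.MaxValue} ms");
        }
    }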
A couple of years ago I found it to be accurate to about 16ms... but I unfortunately don't remember the details.
You can find this out easily for yourself by running a loop that constantly samples the resulting time duration and checks the granularity at which it steps.
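As a rough sketch of that approach (sampling DateTime.UtcNow here purely as an example clock), spin in a tight loop and record how large each observed step is:

    using System;
    using System.Collections.Generic;

    class ClockGranularity
    {
        static void Main()
        {
            var steps = new List<double>();
            DateTime previous = DateTime.UtcNow;

            // Spin until we've observed a handful of distinct clock transitions.
            while (steps.Count < 20)
            {
                DateTime now = DateTime.UtcNow;
                if (now != previous)
                {
                    steps.Add((now - previous).TotalMilliseconds);  // size of the jump
                    previous = now;
                }
            }

            foreach (double step in steps)
                Console.WriteLine($"clock stepped by {step:F3} ms");
        }
    }

The size of the steps you see is the effective granularity of that clock on your machine.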