If i have a large number of timers (10 to possibly a couple hundred) all with < 100 ms interval, will that affect the firing of any one of them?
Logically, no, but in practice I imagine you may find their performance bounded by physical CPU constraints.
I suspect you may want to revisit your design.
It will, but I don't know if you will see it.
Say you have a 1 GHz machine; then one clock tick is 1 nanosecond (1/1,000,000,000 of a second). Let's say setting a timer takes 10 clock ticks (it probably takes more, but this is just a thought experiment). Then setting 100 timers will take 1 microsecond.
There are many other variables you would need to account for in a real-life example, but this should help you see what's going on.
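As a rough sanity check on that back-of-the-envelope number, here is a minimal C# sketch that simply times how long it takes to create and start 100 System.Threading.Timer instances. The 50 ms interval and the empty callback are arbitrary choices for illustration, not anything prescribed by the answer above.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class TimerSetupCost
{
    static void Main()
    {
        const int count = 100;
        var timers = new Timer[count];

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < count; i++)
        {
            // Empty callback: we only want to measure the cost of setting the timers up.
            timers[i] = new Timer(_ => { }, null, dueTime: 50, period: 50);
        }
        sw.Stop();

        double microseconds = sw.ElapsedTicks * 1_000_000.0 / Stopwatch.Frequency;
        Console.WriteLine($"Creating {count} timers took about {microseconds:F1} µs");

        foreach (var t in timers) t.Dispose();
    }
}
```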
What Timer class are you using? The resolution of the standard Timers is not great (typically around 15 ms on Windows), and they probably should not be used for intervals of less than 100 ms. See here for an example of a higher resolution timer.
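To see what resolution you are actually getting, you can ask a standard timer for a 1 ms interval and log the real gap between ticks. This sketch uses System.Timers.Timer (an assumption, since the question doesn't say which class is in use); on a typical Windows box the gaps usually come out closer to 15 ms than 1 ms.

```csharp
using System;
using System.Diagnostics;

class TimerResolutionCheck
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();
        double last = 0;
        int ticks = 0;

        var timer = new System.Timers.Timer(1); // request a 1 ms interval
        timer.Elapsed += (s, e) =>
        {
            double now = sw.Elapsed.TotalMilliseconds;
            Console.WriteLine($"Tick after {now - last:F1} ms");
            last = now;
            if (++ticks >= 20) timer.Stop();    // stop after a handful of samples
        };
        timer.Start();

        Console.ReadLine(); // keep the process alive while the timer runs
    }
}
```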
As already stated, you should reconsider your design; having tens or hundreds of timers is unnecessarily complicated and a waste of resources. A single high-resolution 1 ms timer can be used to time any interval of 1 ms or more: just count how many times the timer has fired. If you have multiple intervals to track, use the one 1 ms timer with multiple counts, one per interval. For a 100 ms interval, the interval has elapsed when its count reaches 100 (you then reset that count to 0).
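Here is a sketch of that counting scheme, assuming you have a timer that reliably fires every 1 ms (System.Timers.Timer is used below only as a stand-in; as noted above, its real resolution is much coarser). The intervals 20, 50 and 100 ms are hypothetical examples, each tracked by its own counter.

```csharp
using System;

class IntervalMultiplexer
{
    // Hypothetical intervals to track, in milliseconds.
    static readonly int[] intervalsMs = { 20, 50, 100 };
    static readonly int[] counts = new int[intervalsMs.Length];

    static void Main()
    {
        // One timer drives every interval; each tick bumps all the counters.
        var timer = new System.Timers.Timer(1); // stand-in for a true 1 ms timer
        timer.Elapsed += (s, e) =>
        {
            for (int i = 0; i < intervalsMs.Length; i++)
            {
                if (++counts[i] >= intervalsMs[i])
                {
                    counts[i] = 0; // interval elapsed: reset the count and act on it
                    Console.WriteLine($"{intervalsMs[i]} ms interval elapsed");
                }
            }
        };
        timer.Start();

        Console.ReadLine(); // keep the process alive
        timer.Stop();
    }
}
```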
First of all, it depends greatly on whether it's System.Timers.Timer or System.Windows.Forms.Timer that you are using.
The Windows Forms timers all run their handlers on the main (UI) thread, so one handler cannot start until the previous one has finished. A timer is never guaranteed to fire exactly at the desired time, but if you have a lot of them they will affect each other.
The threaded timers are less sensitive, as the handlers start on separate (thread pool) threads, but the processor still can't run more threads at once than it has cores, so they may still affect each other somewhat.
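A quick way to see the difference, assuming System.Timers.Timer, is to log which thread each handler runs on: the thread IDs vary because the Elapsed events are dispatched on thread pool threads rather than on a single UI thread.

```csharp
using System;
using System.Threading;

class ThreadedTimerDemo
{
    static void Main()
    {
        // Ten short-interval timers; each handler reports which pool thread ran it.
        var timers = new System.Timers.Timer[10];
        for (int i = 0; i < timers.Length; i++)
        {
            int id = i; // capture the loop index for the handler
            timers[i] = new System.Timers.Timer(50);
            timers[i].Elapsed += (s, e) =>
                Console.WriteLine($"Timer {id} fired on thread {Thread.CurrentThread.ManagedThreadId}");
            timers[i].Start();
        }

        Thread.Sleep(500);               // let them fire a few times
        foreach (var t in timers) t.Dispose();
    }
}
```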