Hi,

Is there a way to fire events in C# at a resolution of a few microseconds?

I am building a MIDI sequencer, and it requires an event to be fired on every MIDI tick, which then plays any note registered at that time. At 120 beats per minute and a resolution of 120 ppqn (pulses per quarter note), that event should fire every 4.16666 milliseconds. Modern sequencers have higher resolutions, such as 768 ppqn, which would require the event to fire every 651 microseconds.

The best resolution I have found for short-timed events is 1 millisecond. How can I go beyond that?

This problem must already have been solved by every C# MIDI sequencer or MIDI file player. Maybe I'm just not looking at the problem from the right angle.

Thank you for your help.

A: 

Instead of using a timer, use a Stopwatch and busy-wait on the elapsed ticks.

For example, here are ten runs of roughly one second each, built from 651 µs intervals:

    // Requires: using System.Diagnostics;
    static void Main(string[] args)
    {
        Stopwatch masterSW;            // measures each full one-second run
        Stopwatch sw;                  // measures one 651 µs interval
        int count;

        sw = Stopwatch.StartNew();
        sw.Reset();

        for (int i = 0; i < 10; i++)
        {
            count = 0;
            masterSW = Stopwatch.StartNew();
            while (count != 1536) // 1536 * 651 µs = 0.999936 s, about one second
            {
                if (!sw.IsRunning)
                    sw.Start();

                // TimeSpan ticks are 100 ns, so 6510 ticks = 651 µs
                if (sw.Elapsed.Ticks >= 6510)
                {
                    count++;
                    sw.Reset(); // discards any overshoot, so a little error accumulates
                }
            }

            Debug.WriteLine("Ticks: " + masterSW.Elapsed.Ticks);
        }
    }

will output:

Ticks: 10005392 (which is 1.0005392 seconds)
Ticks: 10004792
Ticks: 10004376
Ticks: 10005408
Ticks: 10004398
Ticks: 10004426
Ticks: 10004268
Ticks: 10004427
Ticks: 10005161
Ticks: 10004306

which seems reasonably close

Fredou
Thank you. I have tried using a Stopwatch, but from what I know it only lets you check the elapsed time between Stopwatch.Start and Stopwatch.Stop; it does not fire events at regular intervals.
Brice
He's executing `count++` at fairly regular intervals. As long as absolutely nothing else is running on the machine, this could work.
Ben Voigt
+2  A: 

I think you are unlikely to get exactly the correct resolution from a timer. A better approach would be to use the 1ms accurate timer, and when it fires, to check which MIDI events are pending and to fire them.

So, the MIDI events go in a sorted queue: peek at the first one and set the timer to fire as close as possible to that time. When the timer fires, consume all events from the queue that have elapsed, until you encounter a future event. Then calculate the time to that event and reschedule the timer.
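
A minimal sketch of that queue-plus-reschedule loop, assuming a MidiEvent type with an absolute due time and a Play callback (both illustrative, not from the answer); System.Threading.Timer stands in here for whichever 1 ms timer you use:

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;
    using System.Threading;

    class MidiEvent
    {
        public TimeSpan Due;   // absolute time at which the event should fire
        public Action Play;    // callback that actually emits the MIDI message
    }

    class MidiScheduler
    {
        readonly List<MidiEvent> queue = new List<MidiEvent>(); // kept sorted by Due
        readonly Stopwatch clock = Stopwatch.StartNew();
        readonly Timer timer;

        public MidiScheduler()
        {
            timer = new Timer(_ => Pump()); // created idle; armed in Pump
        }

        public void Add(MidiEvent e)
        {
            lock (queue)
            {
                int i = queue.FindIndex(x => x.Due > e.Due);
                queue.Insert(i < 0 ? queue.Count : i, e);
            }
            Pump();
        }

        void Pump()
        {
            lock (queue)
            {
                // Consume every event whose due time has already elapsed.
                while (queue.Count > 0 && queue[0].Due <= clock.Elapsed)
                {
                    MidiEvent e = queue[0];
                    queue.RemoveAt(0);
                    e.Play(); // a real sequencer would dispatch outside the lock
                }

                // Reschedule the timer for the next pending event, if any.
                if (queue.Count > 0)
                {
                    TimeSpan wait = queue[0].Due - clock.Elapsed;
                    if (wait < TimeSpan.Zero) wait = TimeSpan.Zero;
                    timer.Change(wait, TimeSpan.FromMilliseconds(-1)); // -1 = no repeat
                }
            }
        }
    }

System.Threading.Timer is only as accurate as the OS timer resolution; for the 1 ms accuracy assumed above you would swap in the winmm multimedia timer (timeBeginPeriod/timeSetEvent) via P/Invoke.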

Of course, if you are outputting to your soundcard, the approach is fundamentally different, and you should be counting samples for all your timings.

spender
+4  A: 

Most MIDI sequencers/MIDI players will either convert large blocks of time to waveform (for playing through computer speakers) or send large blocks of MIDI instructions (to an external device attached to a MIDI port). Either way, a block of data is copied to the sound card, and the sound card takes care of exact timing.
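
For the waveform case, the timing problem reduces to arithmetic on sample positions rather than timer callbacks. A hypothetical helper, not part of any real API, just to show the idea:

    // Hypothetical helper: maps a MIDI tick to an absolute sample offset
    // in the rendered waveform.
    static long TickToSample(long tick, double bpm, int ppqn, int sampleRate)
    {
        double secondsPerTick = 60.0 / (bpm * ppqn);
        return (long)(tick * secondsPerTick * sampleRate);
    }

At 120 BPM, 768 ppqn and 44.1 kHz, one tick is about 28.7 samples, so every event lands sample-accurately with no timer involved at all.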

You might want to look at the Multimedia Control APIs.

See this post over at the Microsoft discussion forum

Ben Voigt
+1  A: 

It is not possible to have events accurately fired on microsecond intervals in .NET.

In fact, because Windows itself is not a real-time OS, performing anything with guaranteed microsecond accuracy in user-mode software is pretty much impossible.

For more information on why this is so difficult, see the MSDN Magazine article Implement a Continuously Updating, High-Resolution Time Provider for Windows. While it talks about Windows NT, it still generally applies to later versions of Windows.

The conclusion of this article sums it up well:

If you now think that you can obtain the system time with an almost arbitrary precision here, just a slight warning: don't forget the preemptiveness of a multitasking system such as Windows NT. In the best case, the time stamp you'll get is off by only the time it takes to read the performance counter and transform this reading into an absolute time. In the worst cases, the time elapsed could easily be in the order of tens of milliseconds.

Although this might indicate you went through all of this for nothing, rest assured that you didn't. Even executing the call to the Win32 API GetSystemTimeAsFileTime (or gettimeofday under Unix) is subject to the same conditions, so you are actually doing no worse than that. In a majority of the cases, you will have good results. Just don't perform anything requiring real-time predictability on the basis of time stamps in Windows NT.
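
As a rough illustration of what you can query from C#: Stopwatch wraps the performance counter the article discusses, and the numbers printed below are machine-dependent:

    using System;
    using System.Diagnostics;

    class CounterInfo
    {
        static void Main()
        {
            Console.WriteLine("IsHighResolution: " + Stopwatch.IsHighResolution);
            Console.WriteLine("Counts/second: " + Stopwatch.Frequency);
            Console.WriteLine("Granularity: " + (1e9 / Stopwatch.Frequency) + " ns");

            // Cost of reading the counter itself, back to back.
            long t0 = Stopwatch.GetTimestamp();
            long t1 = Stopwatch.GetTimestamp();
            Console.WriteLine("Read cost: " + (t1 - t0) + " counts");
        }
    }

Even with nanosecond-scale granularity from the counter, the scheduling jitter the article describes is what limits you, not the clock itself.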

Ash
That should say "in user mode", not "in software", because kernel drivers are also software and certainly are capable of sub-microsecond timing/latency.
Ben Voigt