I have an asynchronous dataflow system written in C++. In a dataflow architecture, the application is a set of component instances which are initialized at startup and then communicate with each other via pre-defined messages. There is a component type called Pulsar, which provides a "clock signal message" to the other components connected to it (e.g. Delay). It fires a message (calls the dataflow dispatcher API) every X ms, where X is the value of the "frequency" parameter, given in ms.
In short, the task is simply to call a function (method) every X ms. The question is: what's the best/official way to do it? Is there a pattern for this?
Here are some methods I've found:
- Use SIGALRM. I think signalling is not suited to this purpose. Besides, the resolution is 1 second, which is too coarse.
- Use a HW interrupt. I don't need that kind of precision, and I'm wary of HW-related solutions anyway (the server is compiled for several platforms, e.g. ARM).
- Measure the elapsed time and usleep() until the next call (a rough sketch of what I mean is below this list). I'm not sure it's a good idea to issue time-related system calls from 5 threads, 10 times per second each, but maybe I'm wrong.
- Use real-time kernel features. I don't know anything about them. Also, I don't need crystal-precise calls (it's not a nuclear reactor), and I can't install an RT kernel on some platforms (only a 2.6.x kernel is available).
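For clarity, here is a rough sketch of what I mean by the third option. It is only an illustration, not the real DF code: `dispatch_clock_message` and `PulsarArgs` are placeholder names, not the actual dispatcher API.

```cpp
// Rough sketch: one thread per Pulsar instance that fires the clock message
// and then usleep()s until the next absolute tick, so per-tick jitter does
// not accumulate into drift.
#include <sys/time.h>   // gettimeofday()
#include <unistd.h>     // usleep()
#include <pthread.h>

// placeholder for the actual dataflow dispatcher call
void dispatch_clock_message(void* pulsar);

struct PulsarArgs {
    void*         pulsar;     // the Pulsar component instance
    long long     period_us;  // "frequency" parameter, converted to microseconds
    volatile bool running;    // cleared on shutdown (a real impl. needs proper sync)
};

static long long now_us()
{
    struct timeval tv;
    gettimeofday(&tv, 0);                       // plain POSIX, works on OS X/BSD too
    return (long long)tv.tv_sec * 1000000LL + tv.tv_usec;
}

static void* pulsar_thread(void* p)
{
    PulsarArgs* args = static_cast<PulsarArgs*>(p);
    long long next = now_us() + args->period_us;    // absolute time of the next tick

    while (args->running) {
        dispatch_clock_message(args->pulsar);

        long long delay = next - now_us();
        if (delay > 0)
            usleep((useconds_t)delay);              // sleep until the absolute target
        else
            next = now_us();                        // we overran, resynchronize
        next += args->period_us;
    }
    return 0;
}
```

The point of scheduling against an absolute target time is that the inaccuracy of a single usleep() does not add up over many ticks.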
Maybe the best answer would be a short, commented excerpt from an audio/video player's source code (which I can't find/understand on my own).
UPDATE (requested by @MSalters): The co-author of the DF project uses Mac OS X, so we need a solution that works on most POSIX-compliant operating systems, not only on Linux. Maybe in the future there will be a target device that runs BSD or some restricted Linux.
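In case it matters, what I have in mind is starting one such thread per Pulsar instance with plain pthreads, which should also build on Mac OS X and the BSDs. Here `pulsar_instance` is a placeholder, and a real implementation would need proper synchronization for the `running` flag:

```cpp
// Hypothetical startup/shutdown for one Pulsar clock thread.
PulsarArgs args = { pulsar_instance, 100 * 1000LL /* 100 ms */, true };
pthread_t tid;
pthread_create(&tid, 0, pulsar_thread, &args);   // args must outlive the thread
// ... later, at shutdown:
args.running = false;
pthread_join(tid, 0);
```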