I've been reading a paper on real-time systems using the Linux OS, and the term "scheduling jitter" is used repeatedly without definition.
What is scheduling jitter? What does it mean?
Jitter is the irregularity of a time-based signal. In networking, for example, jitter is the variability of packet latency across the network. In scheduling, I'm assuming it refers to the variability in the slices of time the scheduler allocates to processes.
Read more here: http://en.wikipedia.org/wiki/Jitter
So, given djc's answer, scheduling jitter in the context of my question would be:
Scheduling jitter: the variation in the slices of time the system scheduler allocates to processes, arising out of necessity when a process needs more time than it was scheduled for. For example, if a real-time system requires that no process use more than 100 ms of processor time per scheduling period, a process that requires and uses 150 ms would overrun its slot and introduce significant scheduling jitter into that system.
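To make that concrete, here is a minimal sketch (my own illustration, not taken from the paper) of the 100 ms / 150 ms scenario on Linux: a task is released every 100 ms using clock_nanosleep() with an absolute deadline, one activation deliberately burns about 150 ms of CPU, and the program prints how late each activation actually starts. That lateness is the scheduling jitter. The period, iteration count, and which activation overruns are arbitrary illustration values.

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

#define NSEC_PER_SEC 1000000000LL
#define PERIOD_NS    100000000LL   /* 100 ms scheduling period */

/* Current time on the monotonic clock, in nanoseconds. */
static int64_t now_ns(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (int64_t)ts.tv_sec * NSEC_PER_SEC + ts.tv_nsec;
}

/* Busy-loop for roughly ms milliseconds (stands in for real work). */
static void burn_ms(int ms)
{
    int64_t end = now_ns() + (int64_t)ms * 1000000LL;
    while (now_ns() < end)
        ; /* spin */
}

int main(void)
{
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (int i = 0; i < 10; i++) {
        /* Ideal release time: exactly one period after the previous one. */
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= NSEC_PER_SEC) {
            next.tv_nsec -= NSEC_PER_SEC;
            next.tv_sec++;
        }

        /* Sleep until the absolute release time; if we are already past
         * it, this returns immediately and the activation starts late. */
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);

        int64_t ideal = (int64_t)next.tv_sec * NSEC_PER_SEC + next.tv_nsec;
        printf("activation %d released %lld us after its ideal time\n",
               i, (long long)((now_ns() - ideal) / 1000));

        /* Most activations use ~20 ms of CPU; activation 4 overruns the
         * 100 ms budget by using ~150 ms, delaying the next release. */
        burn_ms(i == 4 ? 150 : 20);
    }
    return 0;
}
```

Compile with something like gcc -O2 jitter.c -o jitter (older glibc may need -lrt). Even the "well-behaved" activations will typically report a few tens or hundreds of microseconds of lateness on a stock desktop kernel; that background variability is the scheduling jitter the paper is presumably measuring. The deliberate overrun just makes it large enough to see clearly: the activation after the 150 ms burst starts roughly 50 ms late.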