views: 196, answers: 2
I've been reading a paper on real-time systems using the Linux OS, and the term "scheduling jitter" is used repeatedly without definition.

What is scheduling jitter? What does it mean?

+5  A: 

Jitter is the irregularity of a time-based signal. For example, in networks, jitter would be the variability of packet latency across a network. In scheduling, I'm assuming jitter refers to the variability in the slices of time allocated to processes.
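As a rough illustration (my own sketch, not part of the original answer), jitter for a series of timed events can be computed as the spread of the inter-arrival intervals around their mean. The timestamps below are hypothetical values made up for the example:

    /* Minimal sketch: jitter as the standard deviation of inter-arrival
     * intervals. Compile with: gcc -std=c99 jitter_calc.c -lm */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* Hypothetical packet arrival times in milliseconds */
        double arrivals_ms[] = { 0.0, 10.2, 19.8, 31.0, 40.1, 50.7 };
        int n = sizeof(arrivals_ms) / sizeof(arrivals_ms[0]);

        double intervals[n - 1];
        double mean = 0.0;
        for (int i = 1; i < n; i++) {
            intervals[i - 1] = arrivals_ms[i] - arrivals_ms[i - 1];
            mean += intervals[i - 1];
        }
        mean /= (n - 1);

        /* Jitter: how far each interval strays from the mean interval */
        double var = 0.0;
        for (int i = 0; i < n - 1; i++)
            var += (intervals[i] - mean) * (intervals[i] - mean);
        var /= (n - 1);

        printf("mean interval = %.2f ms, jitter (stddev) = %.2f ms\n",
               mean, sqrt(var));
        return 0;
    }

A perfectly regular signal would print a jitter of 0 ms; the larger the number, the more irregular the timing.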

Read more here: http://en.wikipedia.org/wiki/Jitter

djc
I'm going to give you the accepted answer, because you helped me figure this out and gave me the key piece. I think my own answer is a bit more pointed in this case. However, you can't read my mind or my paper, and I think it would be extremely self-serving to consider my answer the best in this case, so you get it. :)
sheepsimulator
+2  A: 

So, given djc's answer, scheduling jitter in the context of my question would be:

Scheduling jitter: variability in the slices of time allocated to processes by the system scheduler, arising out of necessity. An example of where this might occur: if a real-time system requires that every process use no more than 100 ms of processor time per scheduling period, a process that requires and uses 150 ms would introduce significant scheduling jitter into that system.
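For concreteness, here is a minimal sketch (my own illustration, not from the paper) of how one might observe scheduling jitter on Linux: the program asks to be woken every 10 ms with clock_nanosleep, and the gap between the requested and actual wakeup times is the per-period jitter. The period and iteration count are arbitrary choices for the example:

    /* Minimal sketch: measure wakeup jitter of a periodic task on Linux.
     * Assumes a POSIX system; compile with: gcc -O2 jitter.c -o jitter */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <time.h>

    #define PERIOD_NS 10000000L /* requested period: 10 ms */

    static long long ts_diff_ns(struct timespec a, struct timespec b)
    {
        return (a.tv_sec - b.tv_sec) * 1000000000LL + (a.tv_nsec - b.tv_nsec);
    }

    int main(void)
    {
        struct timespec next, now;
        clock_gettime(CLOCK_MONOTONIC, &next);

        for (int i = 0; i < 100; i++) {
            /* Advance the absolute deadline by one period */
            next.tv_nsec += PERIOD_NS;
            while (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec++;
            }

            /* Sleep until the deadline, then see how late we actually woke */
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
            clock_gettime(CLOCK_MONOTONIC, &now);

            printf("wakeup %3d: %lld ns late\n", i, ts_diff_ns(now, next));
        }
        return 0;
    }

On a stock desktop kernel the lateness will vary noticeably under load; a real-time scheduling policy (e.g. SCHED_FIFO) typically tightens it considerably.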

sheepsimulator
Thanks. The key idea is clear; I just didn't know exactly how to formulate it for your domain (since I'm not really familiar with it).
djc