Say we have a program that receives user input or other unpredictable events at arbitrary moments in time.

For each kind of event the program should perform some computation or access a resource, and that work is time-consuming enough to matter. The program should produce a result as fast as possible. When new events arrive, it may be acceptable to drop computations already in progress and take up new ones.

To complicate things further, some computations or resource accesses may be interdependent, i.e. produce data that can be used in other computations.

What's important is that we know the pattern in which these events usually occur: for example, their relative frequencies, or a common order and the time intervals at which they happen.

The task is to design an algorithm that handles the problem in the most statistically efficient way; approaches yielding sub-optimal solutions can be more than sufficient.

Is there a concept that covers the design of such algorithms?


Example:

A tabbed internet browser.

When told to load different web pages in several tabs, it should decide whether to load the page in the active tab with higher priority, whether to render just the visible part of the page or to pre-render the full page, and if so what to do first: pre-render the whole page for the active tab, or render the other tabs instead, etc.

(I know nothing about how browsers actually work, but assuming they behave this way won't hurt.)
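
To make the example concrete, here is a minimal sketch of the kind of decision logic I have in mind. The class, the priorities, and the "navigation generation" idea are all made up for illustration; real browsers certainly work differently.

    import heapq
    import itertools

    class TabScheduler:
        """Sketch of a prioritizing scheduler: lower priority numbers run
        first, and work queued for an old navigation of a tab is dropped
        once that tab navigates again."""

        ACTIVE_VISIBLE, ACTIVE_FULL, BACKGROUND = 0, 1, 2

        def __init__(self):
            self._heap = []
            self._seq = itertools.count()   # tie-breaker: FIFO among equal priorities
            self._generation = {}           # tab -> current navigation generation

        def navigate(self, tab):
            # A new page load for this tab makes all previously queued work stale.
            self._generation[tab] = self._generation.get(tab, 0) + 1
            return self._generation[tab]

        def submit(self, tab, generation, priority, work):
            heapq.heappush(self._heap, (priority, next(self._seq), tab, generation, work))

        def run(self):
            while self._heap:
                priority, _, tab, generation, work = heapq.heappop(self._heap)
                if generation != self._generation.get(tab):
                    continue                # superseded by a newer navigation: drop it
                work()

    sched = TabScheduler()
    g1 = sched.navigate("tab-1")
    sched.submit("tab-1", g1, TabScheduler.BACKGROUND, lambda: print("render tab-1 in background"))
    g0 = sched.navigate("tab-0")
    sched.submit("tab-0", g0, TabScheduler.ACTIVE_VISIBLE, lambda: print("render visible part of tab-0"))
    sched.submit("tab-0", g0, TabScheduler.ACTIVE_FULL, lambda: print("pre-render the rest of tab-0"))
    sched.run()

Here the visible part of the active tab is rendered first, then the rest of that page, then the background tab, and anything queued for a tab that has since navigated elsewhere is simply discarded.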

+7  A: 

I think scheduling algorithms deal with these kinds of scenarios.

Mehrdad Afshari
+4  A: 

What you're describing is a prioritizing application scheduler. You would need to be more specific to determine which algorithm would be best, but here's a list that you might find useful.

John Feminella
A: 

I'll toss out some keywords: scheduling with pre-emption? Also, prefetching and double-buffering.
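
To illustrate the double-buffering idea, here is a minimal sketch (the names are made up): readers always see the "front" buffer, new results are prepared in the "back" buffer, and a swap publishes the finished work all at once.

    class DoubleBuffer:
        def __init__(self, initial):
            self._buffers = [initial, None]
            self._front = 0                  # index of the buffer readers currently see

        def read(self):
            return self._buffers[self._front]

        def write(self, value):
            back = 1 - self._front
            self._buffers[back] = value      # prepare the next result off-screen
            self._front = back               # swap: readers now see the new value

    frame = DoubleBuffer("frame 0")
    print(frame.read())       # "frame 0" stays visible while frame 1 is prepared
    frame.write("frame 1")
    print(frame.read())       # "frame 1" only appears once it is complete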

dirkgently
A: 

I don't know a lot about it, but this sounds like something the reactor pattern may be used for.
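
A rough sketch of the reactor idea, using Python's standard selectors module (the handler and the socketpair demo are just illustrative): the reactor waits on several event sources at once and dispatches each ready source to the handler registered for it.

    import selectors
    import socket

    sel = selectors.DefaultSelector()

    def handle_readable(conn):
        data = conn.recv(1024)
        print("handler got:", data.decode())

    # Register an event source together with the callback that should react to it.
    a, b = socket.socketpair()
    sel.register(b, selectors.EVENT_READ, handle_readable)

    a.sendall(b"user input arrived")

    # The reactor loop: block until something is ready, then dispatch.
    for key, _ in sel.select(timeout=1):
        key.data(key.fileobj)

    sel.unregister(b)
    a.close()
    b.close()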

Jeremy French