views: 31
answers: 1
Hi,

I was wondering what the best approach would be to split up array processing, using multiple queued functions, into small time chunks.

So say I have a multidimensional array, and I want to run a function (or functions) over it, but only in small timed chunks, say 500 ms each time I trigger the processing.

What would be the best approach?

One way I can think of, in pseudocode:

#get time
#loop_marker:
#if current function_to_run is None
#   pop function_to_run off the queue
#
#run function_to_run once
#increment start_index (in array) for next time, or set function_to_run to None if the array is finished
#check time_diff
#if time_diff < limit - time_diff (i.e. we can run this function again before hitting the limit)
#   goto loop_marker
#else
#   yield
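For reference, this scheme can also be written as a plain Python generator. The following is a minimal sketch (timed_chunks and the trivial work function are invented names), using a single list of items rather than a queue of functions:

```python
import time

def timed_chunks(items, func, time_limit=0.5):
    # Run func over items; whenever time_limit seconds have
    # elapsed in the current slice, yield control to the caller.
    start = time.time()
    for item in items:
        func(item)
        if time.time() - start > time_limit:
            yield            # pause; caller resumes with next()
            start = time.time()

results = []
# Each pass through this loop is one time slice; with trivial
# work the first slice finishes everything, so the loop just ends.
for _ in timed_chunks(range(5), results.append):
    pass
```

Each next() call (here driven by the for loop) runs at most one time slice, so the caller can interleave other work between slices.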

Obviously this is not very Pythonic... So any ideas? Any cleaner ways?

I cannot install anything on the machine it's processing on, apart from Python 2.5.

Would love to hear your thoughts.

Mark

+1  A: 

You could set up a queue of functions to run as a generator that yields one callable per run, then wrap the dispatch loop in a generator of its own that yields control whenever the time budget is spent:

import time

def run_in_chunks(function_queue_generator, time_limit):
    start = time.time()
    for func in function_queue_generator:
        if time.time() - start > time_limit:
            yield               # hand control back to the caller
            start = time.time() # start a fresh time slice
        func()
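To make the control flow concrete, here is one self-contained way the caller might drive such a time-sliced generator (timed_process and the toy work queue are invented for this sketch):

```python
import time

def timed_process(funcs, time_limit):
    # Run queued callables, yielding whenever the slice
    # budget (time_limit seconds) has been spent.
    start = time.time()
    for func in funcs:
        if time.time() - start > time_limit:
            yield
            start = time.time()
        func()

calls = []
work = (lambda i=i: calls.append(i) for i in range(3))
stepper = timed_process(work, 0.5)
while True:
    try:
        next(stepper)   # one slice (Python 2.5 spells this stepper.next())
    except StopIteration:
        break
    # ...do other work between slices here...
```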

Such a generator could be implemented like so, perhaps:

from functools import partial

def run_func_on_args(input_arg_sets, func):
    for argset in input_arg_sets:
        # partial binds argset now; a bare lambda would close over
        # the loop variable and exhibit late binding if the calls
        # were collected first and run later.
        yield partial(func, argset)

There are many possible ways you could create such a generator; the above is just a simple example. You could create generators to run functions across multidimensional arrays, et cetera.
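For example, a generator that walks a 2-D nested list cell by cell might look like this (a sketch; run_func_on_grid is an invented name):

```python
from functools import partial

def run_func_on_grid(grid, func):
    # Yield one zero-argument callable per cell, row by row.
    # partial() binds the current cell value at yield time.
    for row in grid:
        for cell in row:
            yield partial(func, cell)

seen = []
for call in run_func_on_grid([[1, 2], [3, 4]], seen.append):
    call()
```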

Amber