First, let me show you the codez:

import functools
from numpy import array

a = array([...])
for n in range(10000):
    # fix func's second argument to the current n
    func_curry = functools.partial(func, y=n)
    # list() is needed on Python 3, where map returns an iterator
    result = array(list(map(func_curry, a)))
    do_something_else(result)
    ...

What I'm doing here is applying func to an array, changing the value of func's second parameter on every iteration. This is SLOOOOW (creating a new function every iteration surely does not help), and I also feel I've missed the pythonic way of doing it. Any suggestions?

Could a solution that gives me a 2D array be a good idea? I don't know, but maybe it is.
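For what it's worth, a 2D result can fall out of NumPy broadcasting in a single call, provided func can be expressed with elementwise operations. A minimal sketch, using a hypothetical stand-in for func (the real one isn't shown in the question; it apparently returns booleans):

```python
import numpy as np

# Hypothetical stand-in for the asker's func: elementwise, returns booleans.
def func(x, y):
    return x > y

a = np.array([0, 5, 10, 15, 20])   # the 1D input array
ns = np.arange(4)                  # every y value at once

# Broadcasting a shape-(1, 5) view against a shape-(4, 1) view yields a
# (4, 5) boolean array: row i holds func(a, y=ns[i]) for the whole array,
# with no Python-level loop and no per-iteration function creation.
result = func(a[np.newaxis, :], ns[:, np.newaxis])
print(result.shape)  # (4, 5)
```

This only works when func is built from ufuncs and arithmetic that broadcast; if it contains Python-level control flow over scalars, it would need to be rewritten first.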

Answers to possible questions:

  • Yes, this is (using a broad definition) an optimization problem (do_something_else() hides this).
  • No, scipy.optimize hasn't worked because I'm dealing with boolean values and it never seems to converge.
A: 

If `a` is of significant size, the bottleneck should not be the creation of the function but the duplication of the array.

Till Backhaus
`a` is a 1D numpy array of length 100
Agos
So what does `func` do? If you cannot reveal what `func` does for some reason, you will have to search for the bottleneck yourself. A profiler will help with that (http://docs.python.org/library/profile.html).
Till Backhaus
A: 

Did you try numpy.vectorize?

...
    vfunc_curry = vectorize(functools.partial(func, y=n))
    result = vfunc_curry(a)
...
bpowah
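A self-contained version of the vectorize idea above might look like this (the toy `func` is hypothetical, since the asker's real function is not shown):

```python
import functools
import numpy as np

# Toy stand-in for the asker's func; the real one is not shown.
def func(x, y):
    return x % (y + 1) == 0

a = np.array([0, 1, 2, 3, 4, 5])

results = []
for n in range(3):
    # np.vectorize wraps the curried scalar function so it maps over `a`.
    # Note it is essentially a convenience loop, not a compiled ufunc,
    # so it tidies the code more than it speeds it up.
    vfunc_curry = np.vectorize(functools.partial(func, y=n))
    results.append(vfunc_curry(a))
```

Each entry of `results` is a boolean array of the same length as `a`, one per value of `n`.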