views:

68

answers:

1

I want to interpolate a polynomial with the Lagrange method, but this code doesn't work:

import operator

def interpolate(x_values, y_values):
    def _basis(j):
        p = [(x - x_values[m])/(x_values[j] - x_values[m]) for m in xrange(k + 1) if m != j]
        return reduce(operator.mul, p)

    assert len(x_values) != 0 and (len(x_values) == len(y_values)), 'x and y cannot be empty and must have the same length'

    k = len(x_values)
    return sum(_basis(j) for j in xrange(k))

I followed Wikipedia, but when I run it I receive an IndexError at line 3!

Thanks

+1  A: 

Check the indices: Wikipedia says "k + 1 data points", but you're setting k = len(x_values), whereas it should be k = len(x_values) - 1 if you follow the formula exactly.
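A sketch with that index fix applied (this uses plain Python floats and returns a function to evaluate, rather than pypol's symbolic monomials; note also that the sum needs to be weighted by `y_values[j]`, a factor the posted code omits):

```python
from functools import reduce  # reduce is a builtin in Python 2
import operator

def interpolate(x_values, y_values):
    """Return a function that evaluates the Lagrange interpolant."""
    assert len(x_values) != 0 and len(x_values) == len(y_values), \
        'x and y cannot be empty and must have the same length'
    # Wikipedia's formula runs over "k + 1 data points", so:
    k = len(x_values) - 1

    def _basis(j, x):
        # Product over m != j of (x - x_m) / (x_j - x_m)
        terms = [(x - x_values[m]) / (x_values[j] - x_values[m])
                 for m in range(k + 1) if m != j]
        return reduce(operator.mul, terms)

    def evaluate(x):
        # Each basis polynomial is weighted by its y value
        return sum(y_values[j] * _basis(j, x) for j in range(k + 1))

    return evaluate

f = interpolate([1, 2, 3], [1, 4, 9])  # points on y = x**2
print(f(4))  # -> 16.0
```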

AndiDog
Thank you! I'll check this now and let you know the result.
rubik
OK, but why does `interpolate([1, 2, 3], [1, 4, 9])` return `-0.5x^2 + 1.5x`? Take a look at this: http://i.imgur.com/MkATz.gif
rubik
@rubik: Sorry, but I can't help you with such a specific problem without knowing the interpolation algorithm (and I won't read up on it). Check your logic again or search for an existing implementation. If you post more code showing how you apply the interpolation (e.g. the definition/initial value of `x` is missing from your question), somebody might be able to help you further.
AndiDog
I'm using pypol (http://pypol.altervista.org/), and `x` is `monomial(x=1)` (http://pypol.altervista.org/functions.html#pypol.monomial)
rubik