Hi,
I'm trying to calculate the lag between two signals in Python using cross-correlation. The two signals are almost identical except for a very small time lag. I've tried numpy.correlate and scipy.convolve (a lot faster), and both work relatively well but give a small error. I'm starting to suspect that the error is the result of Python/scipy/numpy truncating a float somewhere. Has anyone been able to get high-accuracy signal delay calculations working in Python?
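Stripped down, what I'm doing looks roughly like the sketch below (the function name and variables are just for illustration, not my actual code):

import numpy as np

def estimate_lag(a, b):
    # Full cross-correlation of the two 1-D signals; the position of the
    # peak relative to zero lag gives the delay of a with respect to b,
    # in whole samples.
    corr = np.correlate(a, b, mode="full")
    # In 'full' mode the zero-lag element sits at index len(b) - 1.
    return np.argmax(corr) - (len(b) - 1)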
Best regards,
Fredrik