I have a question about programming a Generalised Cross-Correlation (GCC) function in Matlab for sound-localisation experiments, with the aim of reducing the effects of reverberation.
I found some sample code that implements GCC: http://webscripts.softpedia.com/script/Scientific-Engineering-Ruby/Signal-Processing/Generalized-Cross-Correlation-Methods-34698.html. However, the results on sample data seem quite different from the expected values, and very different from the plain cross-correlation results. The code works when a time delay is added to a signal within the script and noise is superimposed, but it doesn't seem to work on actual recorded signals.
I also tried coding the GCC in Matlab myself by:
- computing the cross-correlation of the two signals with xcorr
- taking the FFT of the correlation to get the cross spectrum
- dividing the cross spectrum by its own magnitude, so that only the phase information is kept in the frequency domain (the PHAT weighting)
- taking the IFFT of the weighted spectrum to get the generalised cross-correlation back in the time domain
- finding the lag of maximum correlation, which should give the time delay; however, the code always returns a delay of zero. A minimal sketch of this approach is shown below.
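This is roughly what my code does, written here as a self-contained sketch on a synthetic delayed signal (the variable names, sampling rate and the 25-sample delay are just placeholders I chose for illustration):

    % Synthetic test: sig2 is sig1 delayed by true_delay samples, plus noise
    fs         = 16000;
    sig1       = randn(1, 4096);
    true_delay = 25;
    sig2       = [zeros(1, true_delay), sig1(1:end-true_delay)] + 0.05*randn(1, 4096);

    [r, lags] = xcorr(sig2, sig1);      % cross-correlation of the two signals
    R  = fft(r);                        % cross spectrum
    Rw = R ./ (abs(R) + eps);           % divide by magnitude: keep only the phase
    cc = real(ifft(Rw));                % generalised cross-correlation in time
    [~, idx] = max(cc);                 % lag of maximum correlation
    delay_samples = lags(idx);          % should be close to true_delay
    delay_seconds = delay_samples / fs; % estimated time delay in seconds

On this synthetic data the peak lands near the true delay, but on my real recordings the equivalent code always reports zero.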
I would like to know why both implementations give a delay of zero, and why plain cross-correlation gives better results here, even though generalised cross-correlation is supposed to perform better in reverberant environments.
Thanks