I'm trying to understand autocorrelation as it relates to some pitch detection 
algorithms I've been looking at.

In the example code I've seen, the autocorrelation is computed as the sum, over 
the whole buffer, of each sample multiplied by the sample at the same index 
plus an offset (the lag).

e.g.

/* for each offset i, sum the product of the signal with itself
   shifted by i samples; the inner loop stops at size - i so that
   x[j + i] stays inside a size-sample buffer */
for (i = 0; i < size; i++)
{
    sum = 0;
    for (j = 0; j < size - i; j++)
    {
        sum += x[j] * x[j + i];
    }
    R[i] = sum;
}

When I graph R[i] in a spreadsheet for various offsets, I do not get a minimum 
where I expect one, i.e. at the offset where the samples match (I have tried 
this with a couple of generated periodic waves); see the sketch below for 
roughly how I'm generating the data.
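
Roughly how I'm generating a test wave and dumping R[i] for the spreadsheet; 
the buffer size and the 100-sample sine period are just example values:

#include <math.h>
#include <stdio.h>

#define SIZE 1024

int main(void)
{
    float x[SIZE], R[SIZE], sum;
    int i, j;

    /* generated periodic wave: a sine with a period of 100 samples */
    for (i = 0; i < SIZE; i++)
        x[i] = (float)sin(2.0 * 3.14159265358979 * i / 100.0);

    /* autocorrelation, as in the loop above */
    for (i = 0; i < SIZE; i++)
    {
        sum = 0;
        for (j = 0; j < SIZE - i; j++)
            sum += x[j] * x[j + i];
        R[i] = sum;
    }

    /* one "offset <tab> R[i]" line per row, for pasting into a spreadsheet */
    for (i = 0; i < SIZE; i++)
        printf("%d\t%f\n", i, R[i]);

    return 0;
}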

However, if I instead use the summation of the differences between the samples, 
I do get minima where I expect them.
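
By "summation of the differences" I mean something like the sketch below, 
where D[] is another size-length output buffer like R[]; I'm assuming absolute 
differences here (squared differences behave similarly):

/* difference version: accumulate |x[j] - x[j+i]| for each offset i */
for (i = 0; i < size; i++)
{
    sum = 0;
    for (j = 0; j < size - i; j++)
    {
        sum += fabsf(x[j] - x[j + i]);   /* needs <math.h> */
    }
    D[i] = sum;
}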

What am I missing?

Veronica
