I'm looking for a way to show how two continuous signals are correlated
over time. In other words, if x(t) and y(t) are correlated signals (with
some phase lag between them), does that correlation change over time?
(and, if so, how does it vary?)
What I'd ideally like to get is something like a spectrogram, except that
instead of frequency vs. time, the axes would be correlation vs. lag vs.
time.
The most obvious solution I've thought of is to use a sliding window on
each signal to evaluate the cross-correlation (at different lags) over
small epochs. I'm wondering if there are other more elegant solutions
out there.
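
To make the idea concrete, here is a minimal Python/NumPy sketch of that
sliding-window approach (the function name, window length, hop, and maximum
lag below are just illustrative placeholders, and the signals are assumed
to be evenly sampled and of equal length):

import numpy as np

def sliding_xcorr(x, y, win=256, hop=64, max_lag=32):
    # Returns lags, window start indices, and a (lags x windows) matrix
    # of normalized cross-correlations, one column per window position.
    lags = np.arange(-max_lag, max_lag + 1)
    starts = np.arange(0, len(x) - win + 1, hop)
    out = np.empty((len(lags), len(starts)))
    for j, s in enumerate(starts):
        xw = x[s:s + win] - np.mean(x[s:s + win])   # de-mean each window
        yw = y[s:s + win] - np.mean(y[s:s + win])
        denom = np.std(xw) * np.std(yw) * win       # normalization factor
        full = np.correlate(xw, yw, mode="full")    # length 2*win - 1
        mid = win - 1                               # index of zero lag
        out[:, j] = full[mid - max_lag:mid + max_lag + 1] / denom
    return lags, starts, out

The resulting matrix could then be displayed like a spectrogram (e.g. as an
image plot), with window position on one axis, lag on the other, and
correlation as the color.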
I'd appreciate any advice on the subject.
Thanks.
-Tony Reina