Hi Dave, I suspected that this was the case. I did a simple simulation using normal random numbers, rescaled with a log-transform; indeed, the ACF shows significant peaks that go on 'forever'. I suspect that nonstationarity in the original time-series is the cause of the behavior I see. Yet if the process is nonlinear, I can't justify removing the nonstationarity, since doing so might destroy whatever nonlinear dynamics (if any) I'm actually interested in. I'm somewhat stuck on what to do here.
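For what it's worth, the check was along these lines (a minimal sketch only, assuming numpy, matplotlib and statsmodels are available; the exact rescaling I used is only approximated here, and the transform shown is purely illustrative):

import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(0)
n = 45000                        # same length as the recorded series
x = rng.standard_normal(n)       # surrogate: pure white noise

# Illustrative log-type rescaling (applied to the magnitude so the argument
# stays positive); substitute whatever transform is actually in question.
y = np.log(np.abs(x) + 1e-12)

fig, axes = plt.subplots(2, 1, figsize=(8, 6))
plot_acf(y, lags=200, ax=axes[0])    # sample ACF of the transformed surrogate
plot_pacf(y, lags=50, ax=axes[1])    # sample PACF for comparison
plt.tight_layout()
plt.show()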
What is considered a reasonable procedure for examining this time-series and determining whether it contains nonlinear structure? That is, what kinds of linear analysis should I undertake before trying to look at nonlinearity? I realize that one could follow the ARIMA approach (model trends and seasonality, take differences, and so on, then fit an ARMA model to the resulting stationary process), but if we are interested in the nonlinear structure, would this still be the correct approach? Any suggestions are appreciated; a recipe for initial analysis is even more appreciated. A rough sketch of the kind of workflow I have in mind is appended below, after the quoted thread.

regards,
P

_____________________________________
Pradyumna Sribharga Upadrashta, PhD Student
Scientific Computation, UofMN

>-----Original Message-----
>From: [EMAIL PROTECTED]
>[mailto:[EMAIL PROTECTED] On Behalf Of David Reilly
>Sent: Saturday, September 27, 2003 12:08 PM
>To: [EMAIL PROTECTED]
>Subject: Re: [edstat] stationarity, time-series analysis, power spectra, etc
>
>[EMAIL PROTECTED] (Pradyumna S Upadrashta) wrote in message news:<[EMAIL PROTECTED]>...
>>
>> If one applies a linear filter with a lowpass of 400Hz, is it possible for it to induce spurious autocorrelation (that persists essentially what seems like 'forever') in a time-series?
>>
>> I'm examining a time-series which was recorded from an empty shielded room (the sensors detect very minute magnetic fields, on the order of femtoTesla), so it should consist of sensor noise + environmental noise. However, upon inspection of the power spectrum, I notice a roll-off behavior ~400Hz (original sampling rate was 1017Hz), and the ACF and PACF display the very odd behavior mentioned. Our data acquisition person insists that a standard low-pass filter was used, with a DC offset. If this is the case, then I'm unable to understand why a time-series that should be pure white noise isn't... I don't want to jump on the idea that it is nonlinear without sufficient justification for doing so.
>>
>> I'm not sure whether these results are due to some inherent nonlinearities in the signal, or whether the filter could have induced these behaviors. If the filter was nonlinear, then I speculate that it could explain the signal structure I'm seeing. I have often heard of long autocorrelation times being associated with nonlinear structure (e.g., Kantz and Schreiber, 1997, "Nonlinear Time Series Analysis").
>>
>> Also, what is the reasonable thing to do, with regard to measuring autocorrelation functions, for a time-series that appears non-stationary locally (say, on plots of a few 1000 ms duration) but doesn't exhibit significant drift in the long run (on plots of 5000 ms duration)? That is, when can we declare a time-series to be 'stationary' in the wide sense? Can we declare it to be non-stationary if at large time-scales it appears to be relatively stable? Is it enough if a plot is visually stationary?
>>
>> The caveat of not being able to 'know for sure' a priori whether something is stationary is that one can't rely on the power spectrum, since its estimation assumes a stationary signal structure (or am I wrong about this?).
>>
>> I suppose one could look for stationary segments of the signal, and then attempt to compare spectra across multiple 'stationary looking' segments; if the signal really is stationary, I'd expect no changes, right? I have 45,000 data points in a single time-series.
>> What is the most correct test for non-stationarity which doesn't rely on any assumptions? Also, if what I've done is correct, then how do I interpret the ACF and PACF functions I've described?
>>
>> Any help would be greatly appreciated!
>>
>> thanks in advance...
>> p
>>
>> _____________________________________
>> Pradyumna Sribharga Upadrashta, PhD Student
>> Scientific Computation, UofMN
>
>P.
>
>It is always possible to inject structure by using an incorrect filter. For example, if you difference a white noise process you create a series that has an autocorrelation function that goes on forever. This is why you only use the filters that are required.
>
>You should also know that changes in level in a series can cause the ACF to go on forever, so to speak. This also applies to trend changes.
>
>I don't really know, but I suspect that unnecessary power transformations such as logs, reciprocals, arc-sine, square root, et al. may also have an effect on the ACF.
>
>Regards
>
>Dave Reilly
>AUTOMATIC FORECASTING SYSTEMS
>http://www.autobox.com
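P.S. To make the "recipe" question concrete, here is a rough sketch of the kind of linear-first workflow I have in mind (Python, assuming numpy, scipy, matplotlib and statsmodels; the file name, segment count and ARMA order below are placeholders rather than choices I'm committed to):

import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import welch
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.graphics.tsaplots import plot_acf

fs = 1017.0                       # original sampling rate (Hz)
x = np.loadtxt("empty_room.txt")  # hypothetical file holding the 45,000 samples

# 1. Crude stationarity check: compare mean/variance and Welch spectra
#    across non-overlapping segments of the record.
segments = np.array_split(x, 9)
for i, seg in enumerate(segments):
    f, Pxx = welch(seg, fs=fs, nperseg=1024)
    print(f"segment {i}: mean={seg.mean():.3g}, var={seg.var():.3g}")
    plt.semilogy(f, Pxx, alpha=0.6)
plt.xlabel("Frequency (Hz)")
plt.ylabel("PSD")
plt.show()

# 2. Remove obvious deterministic structure (here just the mean and a linear
#    trend); difference only if the ACF/unit-root diagnostics call for it.
t = np.arange(len(x))
trend = np.polyval(np.polyfit(t, x, 1), t)
z = x - trend

# 3. Fit a low-order ARMA and look at what is left over; structure remaining
#    in the residual ACF is what any nonlinear analysis would have to explain.
res = ARIMA(z, order=(2, 0, 1)).fit()
print(res.summary())
plot_acf(res.resid, lags=100)
plt.show()

My worry, as above, is whether steps 2 and 3 risk filtering away exactly the structure a nonlinear analysis would be looking for.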
