Hello,

I'm a novice R user. I'd like to estimate the linear trend (slope b) and its
statistical significance (p-value) for quite a few univariate time series.
Because several of the series show autocorrelation and heteroskedasticity,
ordinary least squares via lm() is, as I understand it, not the most
appropriate choice; generalized least squares via gls() in package nlme can
account for both autocorrelation and heteroskedasticity, and therefore seems
to be my best shot. However, some of my time series also contain outlier
observations, which gls() may not be robust against.
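
For concreteness, here is a minimal sketch of the kind of GLS fit I have in
mind, assuming a numeric vector y observed at times 1..n (the AR(1)
correlation and power-of-the-mean variance structures are just illustrative
choices on my part, not a recommendation):

    library(nlme)
    d <- data.frame(t = seq_along(y), y = y)
    fit <- gls(y ~ t, data = d,
               correlation = corAR1(form = ~ t),  # AR(1) autocorrelated errors
               weights = varPower())              # variance as a power of the fitted values
    summary(fit)$tTable                           # slope b and its p-value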

I'm now experimenting with the robust linear regression estimator lmrob()
in package robustbase. According to the CRAN Robust task view
(http://cran.r-project.org/web/views/Robust.html), lmrob() "uses the latest
of the fast-S algorithms and heteroscedasticity and autocorrelation
corrected (HAC) standard errors". However, that information is absent from
the lmrob() help page. I wonder whether anyone can confirm or refute that
lmrob() is appropriate for autocorrelated and/or heteroskedastic time
series. Again, my goal is simply linear trend estimation.
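
In case it helps, this is the sort of call I've been trying (using the same
hypothetical data frame d as in the sketch above):

    library(robustbase)
    rfit <- lmrob(y ~ t, data = d)  # MM-estimator, resistant to outliers
    summary(rfit)                   # slope b, its standard error and p-value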

Thanks,
David
