On 23 October 2019 at 13:34, Paul Thompson wrote:
| Hopefully, you may be able to shed some light on a problem that I have
| regarding ‘speeding up’ the R function nlme::gls, which I use for fitting
| models with temporal autocorrelation. In a vignette by Doug Bates and Dirk
| Eddelbuettel
| (https://cran.r-project.org/web/packages/RcppEigen/vignettes/RcppEigen-Introduction.pdf),
| I found that they used RcppEigen to solve some least-squares problems, but
| I wasn’t sure whether this could be extended to generalised least squares.
|
| The current nlme::gls fit takes hours to execute, and I was hoping to use
| the Rcpp framework to ameliorate this problem, but I have little experience
| in Rcpp or C++ programming and wasn’t sure whether this had already been
| tackled. After extensive searching on the web, I haven’t found any
| implementation. Does anyone have any ideas or advice, please?
|
| My model syntax in R is currently as follows:
|
| myfit <- gls(y ~ stim1 + stim2 + t + I(t^2) + I(t^3) + signal + stim1_signal,
|              data = mydata, correlation = corAR1(form = ~t))
|
| Some other information: each time series has approximately 20,000
| observations.
|
| Thank you in advance for any help.
I don't know those models well, so I can only suggest that you profile and
measure, and that you look very closely at e.g. examples/FastLM/ in Rcpp (and
the various fastLm() functions in the three Rcpp* packages). Resolving the
formula in y ~ .... dominates the computation in the fastLm case (so if in a
hurry, ALWAYS pass a matrix and a vector via the alternate interfaces). It may
be different here, but it is unlikely that there is a 'free lunch' anywhere:
R is a few decades old, and workhorse functions like that one have been
reviewed and reviewed and reviewed ...

Dirk

--
http://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
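[Editor's note] A minimal sketch of the "matrix and vector" route suggested
above, assuming RcppEigen's fastLmPure(X, y) interface and reusing the
variable names (mydata, stim1, ..., stim1_signal) from the original question.
The quasi-differencing step at the end is the standard textbook reduction of
GLS to OLS for a known, equally spaced AR(1) process; the value of rho below
is purely illustrative, and estimating it is exactly the work nlme::gls does
for you, so treat this as a starting point for profiling rather than a
drop-in replacement.

library(RcppEigen)

## Build the design matrix once; resolving the formula here, outside the
## fitting routine, avoids paying that cost inside fastLm().
X <- model.matrix(~ stim1 + stim2 + t + I(t^2) + I(t^3) + signal + stim1_signal,
                  data = mydata)
y <- mydata$y

## Ordinary least squares via the matrix/vector interface.
fit_ols <- fastLmPure(X, y)

## GLS with AR(1) errors and a known rho (assumed value, for illustration),
## via quasi-differencing followed by a second OLS fit -- valid for
## consecutive, equally spaced observations within a single series.
rho <- 0.8
n   <- length(y)
y_s <- y[-1] - rho * y[-n]        # y_t - rho * y_{t-1}
X_s <- X[-1, ] - rho * X[-n, ]    # same transform applied to each column of X
fit_gls <- fastLmPure(X_s, y_s)

Each single fit here is one matrix decomposition on roughly 20,000 rows, so
the cost that remains is the search over rho, which is worth profiling before
reaching for C++.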