Dear List: I am very much a Unix neophyte, but recently had an Ubuntu box installed in my office. I normally use Windows XP with 3 GB of RAM, and as far as I can tell the Ubuntu machine has the same hardware (processor and RAM) as my Windows box.
I recently had to run a very large lmer analysis on my Windows machine, but could not because of memory limitations, even after increasing all of R's memory limits (which I believe cap out at about 2 GB on Windows, according to the R for Windows FAQ). To make the analysis computationally feasible, I had to sample from my very large data set and then run the analysis on the sample. Even so, it took on the order of 45 minutes to an hour to get parameter estimates. (Incidentally, SAS PROC NLMIXED was even worse: it kept giving execution errors until the data set was quite small, and even then it ran for a long time.)

However, I just ran the same analysis on the Ubuntu machine with the full data set, and lmer returned parameter estimates in under 5 minutes. Having so little experience with Ubuntu, I am pleasantly surprised and would like to understand this better. Does this happen because R somehow works more efficiently on Unix? Or does Linux have more efficient methods for memory allocation? I wish I knew enough to ask the right questions, so I welcome any enlightenment list members can offer.
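In case it helps, here is roughly what I was doing on the Windows side (a minimal sketch only; the model formula and the names y, x, subject, and bigdata are placeholders, not my actual analysis):

library(lme4)

## Check and raise the Windows memory limit (reported in MB);
## per the R for Windows FAQ this caps out around 2 GB on my setup.
memory.limit()
memory.limit(size = 2048)

## The full data would not fit, so I fit the model on a random
## subsample (the subsample size here is just illustrative):
sub <- bigdata[sample(nrow(bigdata), 100000), ]
fit <- lmer(y ~ x + (1 | subject), data = sub)
summary(fit)

On Ubuntu, the same lmer() call ran against bigdata directly, with no memory.limit() step needed, and finished in under 5 minutes.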