Dear Sir / Madam,

Hi, I have written some code for R that uses for loops to do 2-dimensional grid searches for maximum likelihood, combined with iterated GLS estimation. As you might expect, estimation can take quite some time, depending on the size of the grid. However, I have noticed that the same code runs much faster on a Windows operating system than on a Mac (I basically paste the code into the console and then run things from there).

I was wondering whether anyone knows if this is typical, or whether there is some good reason for it? (I am not that familiar with the Mac operating system, so I might be missing something obvious.) I am writing a user manual for the code and would like to include an explanation (or possible improvement) for Mac users, so any information on this would be much appreciated.

Sincerely,
Jason Pienaar
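P.S. For reference, the loop structure is roughly of the following form. This is only a simplified sketch, not the actual code; the parameter names, grid values and the toy loglik() function are placeholders standing in for the real iterated GLS fit and likelihood calculation.

theta1.grid <- seq(0, 1, length.out = 50)   # hypothetical grid for parameter 1
theta2.grid <- seq(0, 1, length.out = 50)   # hypothetical grid for parameter 2
ll <- matrix(NA_real_, length(theta1.grid), length(theta2.grid))

loglik <- function(t1, t2) {
  # placeholder: the real code would run the iterated GLS estimation
  # for (t1, t2) and return the resulting log-likelihood
  -(t1 - 0.3)^2 - (t2 - 0.7)^2
}

for (i in seq_along(theta1.grid)) {
  for (j in seq_along(theta2.grid)) {
    ll[i, j] <- loglik(theta1.grid[i], theta2.grid[j])
  }
}

best <- which(ll == max(ll), arr.ind = TRUE)   # grid coordinates of the maximum
c(theta1.grid[best[1, "row"]], theta2.grid[best[1, "col"]])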