If you have code that takes 2 weeks to run, then it might be a case of inefficient algorithm design. I was able to go from overnight runs (SELDI data analysis) to 20-minute runs by identifying the single inefficient function that took most of the time and rewriting it in C.
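As a sketch of that workflow for the archives: `Rprof()` and `summaryRprof()` are R's standard sampling-profiler tools for finding the hot spot; the function names `slow_kernel` and `fast_kernel` below are hypothetical placeholders, not anything from my actual SELDI code.

```r
# Hypothetical bottleneck: an interpreted loop over a large vector.
slow_kernel <- function(x) {
  s <- 0
  for (i in seq_along(x)) s <- s + x[i]^2   # interpreted loop: slow
  s
}

# Profile a run to see which function dominates the total time.
Rprof("profile.out")                        # start the sampling profiler
y <- slow_kernel(runif(1e6))
Rprof(NULL)                                 # stop profiling
print(summaryRprof("profile.out")$by.self)  # self-time per function

# Once the hot function is known, replace it with vectorized code or a
# compiled routine called via .C()/.Call().  The vectorized equivalent:
fast_kernel <- function(x) sum(x^2)
```

In my case the fix was a C rewrite of the one dominant function; often, as above, simple vectorization of that same function buys most of the speedup without leaving R.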
Jarek
============================================================
Jarek Tuszynski, PhD.
Science Applications International Corporation
(703) 676-4192
[EMAIL PROTECTED]

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Zhilin Liu
Sent: Monday, August 01, 2005 8:28 PM
To: r-help@stat.math.ethz.ch
Subject: [R] can we manage memory usage to increase speed?

Hi,

Thanks for reading. I am running a process in R for microarray data analysis on Red Hat Enterprise Linux 4 with dual AMD CPUs and 6 GB of memory. However, the R process uses only a total of <200 MB of memory, and combined CPU usage is ~110% across the two CPUs. The program takes at least 2 weeks to run at the current speed. Is there some way we can increase the usage of the CPUs and memory and speed it up? Any suggestion is appreciated. Thanks again.

Zhilin

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html