Hey,
I'm trying to generate a heat map of 30,000 fragments from probably 5-10
samples. Windows complains about running out of memory. Should I switch to
a Unix system?

Also, if I plot only 1,000 of the fragments, the heat map finishes rather
quickly, but 5,000 already take more than 10 minutes. I don't know what to
expect for 30,000...
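For what it's worth, here is a minimal sketch of what I'm timing (the data here is random, standing in for the real fragment matrix, and I'm assuming base `heatmap()` with its default hierarchical clustering): the row clustering computes an n x n distance matrix, so time and memory grow roughly quadratically with the number of fragments, which would explain both the slowdown and the memory errors.

```r
## Hypothetical stand-in for the real data: n_fragments rows, one column
## per sample (the real set is ~30,000 fragments from 5-10 samples).
set.seed(1)
n_fragments <- 2000
n_samples   <- 8
mat <- matrix(rnorm(n_fragments * n_samples), nrow = n_fragments)

## dist() stores n*(n-1)/2 doubles; for 30,000 rows that is ~3.4 GB,
## far beyond what 32-bit Windows XP can give a single process.
d <- dist(mat)
print(object.size(d), units = "MB")

## Drawing the heat map itself is cheap; the clustering dominates.
heatmap(mat, labRow = NA)
```

At 30,000 rows the distance matrix alone would not fit in memory, regardless of plotting.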

And on a side note, it seems that R uses only up to 50% of the CPU while
number crunching. Is there any way to make R use 100% of the CPU? I'm using
R 2.6.2 on Windows XP SP2, on an average configuration.

Thanks.

UCLA Neurology Research Lab

-- 
Regards,
Anh Tran

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.