I've written a function that regularly throws the "cannot allocate vector of 
size X Kb" error, since it contains a loop that creates a large number of big 
distance matrices. I'd be very grateful for any simple advice on how to reduce 
the memory demands of my function. Besides increasing the memory limit to the 
maximum available, I've tried:

- rounding my "dist" objects to 3 significant figures (not sure whether that 
  made any difference) - sketched below;
- using the distance function daisy() from package "cluster" instead of 
  dist();
- avoiding the storage of unnecessary intermediate objects as far as possible, 
  by nesting function calls within a single command;
- writing each of my dist() objects to a text file, one per line, and reading 
  them back in one at a time as and when required, using scan() - also 
  sketched below. Although this seemed to avoid the memory problem, it ran so 
  slowly that it wasn't much use for someone with deadlines to meet...
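
For reference, here is roughly what the rounding step looked like, on made-up 
data (the matrix x below just stands in for my real data). As far as I can 
tell, signif() can't reduce memory use by itself, since R stores the values as 
8-byte doubles whether they're rounded or not:

    x <- matrix(rnorm(1000 * 10), nrow = 1000)  # stand-in for my real data
    d <- dist(x)             # a "dist" object: n*(n-1)/2 doubles
    print(object.size(d))    # size before rounding
    d <- signif(d, 3)        # round to 3 significant figures
    print(object.size(d))    # same size - doubles are 8 bytes regardless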
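
And here is a minimal sketch of the write-then-scan approach (the file name 
and the list "groups" are illustrative - "groups" stands for whatever my real 
loop iterates over):

    ## made-up stand-in for the data my loop works through
    groups <- replicate(5, matrix(rnorm(200 * 10), nrow = 200),
                        simplify = FALSE)

    ## write each dist object as one space-separated line of a text file
    con <- file("distances.txt", "w")
    for (g in groups) {
        cat(dist(g), "\n", file = con)
    }
    close(con)

    ## later: read the stored distances back one line (one object) at a time
    con <- file("distances.txt", "r")
    d1 <- scan(con, nlines = 1, quiet = TRUE)  # first dist, as a plain vector
    close(con)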

I don't have formal training in programming, so if there's something handy I 
should read, do let me know.

Thanks,

Richard Gunton.

Postdoctoral researcher in arable weed ecology, INRA Dijon.


      