I think this question has been discussed in other threads, but I couldn't find
exactly what I was looking for.
I'm working on Windows XP with 2 GB of memory and a Pentium 4 at 3.00 GHz.
I need to work with large datasets, generally from 300,000 to 800,000 records
(depending on the project) and about 300 variables (...although a dataset with
800,000 records may not be "large" in your opinion...). Because we are deciding
whether R will be the official software in our company, I'd like to know if the
possibility of using R with these datasets depends only on the characteristics
of the machine (memory and processor).
If so, we can upgrade the machine (for example, how much memory would you
recommend?).
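
As a rough sanity check (and assuming every column is stored as a numeric
double, which may well not be true for my data), I estimate the raw size of
the largest dataset like this:

    # rows x columns x 8 bytes per double value, expressed in GB
    800000 * 300 * 8 / 1024^3   # about 1.8 GB
    300000 * 300 * 8 / 1024^3   # about 0.67 GB

So even the raw data seems close to the total memory of the machine, before
counting any extra copies made while reading or modelling.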

For example, I have a dataset of 200,000 records and 211 variables, but I
can't load it because R stops working: I monitored the loading procedure
(read.table in R) with the Windows Task Manager, and R hangs when the paging
file reaches 1.10 GB.
After this I tried a sample of 100,000 records and could load the dataset
correctly, but when I then tried to fit a tree with the tree package (I call
tree(variable1~., myDataset)), after a few seconds I got the message "Reached
total allocation of 1014Mb".

I'd like your opinions and suggestions, considering that I could upgrade my
computer (in particular its memory).
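
If it is only a question of R's own allocation limit rather than the physical
RAM, I assume the Windows-specific memory.limit() (or starting R with the
--max-mem-size flag) is what I should be looking at; for example (the 2047 MB
value is just my guess at what a 32-bit Windows process can address):

    memory.limit()              # report the current limit, in MB
    memory.limit(size = 2047)   # ask R on Windows to raise the limit (MB)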

pestalozzi

