On Wed, Jul 1, 2009 at 3:02 AM, gug guygr...@netvigator.com wrote:
sapply(ls(), function(x) object.size(get(x)))
This lists all objects with the memory each is using (I should be honest and say that, never having used sapply before, I don't fully understand its syntax, but it
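A small sketch of how that one-liner can be used in practice: sorting the result makes the largest objects easy to spot. The objects `big` and `small` below are illustrative stand-ins, not anything from the thread.

```r
# Create two example objects of very different sizes.
big   <- rnorm(1e6)   # roughly 8 MB of doubles
small <- 1:10         # a few dozen bytes

# Size in bytes of every object in the workspace, largest first.
sizes <- sapply(ls(), function(nm) object.size(get(nm)))
sort(sizes, decreasing = TRUE)
```

The call `get(nm)` looks each name up in the workspace, and `object.size` reports its approximate memory footprint.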
Hello,
Is there a command for freeing up the memory used by R in holding data
tables?
The structure of the procedure I have is as follows:
1) Read multiple txt files in using read.table(...).
2) Combine the read tables using rbind(...).
3) Attach the data using attach(...) and then do a
You can use 'rm' to remove objects. Are you remembering to do a 'detach'
after the 'attach'? Why are you using 'attach' (I personally avoid it)?
Think about using 'with'. Take a look at the size of the objects you are
working with (object.size) to understand where you might have problems. Use
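To illustrate the suggestion above: 'with' gives the same convenient access to column names as 'attach', but scoped to a single call, so there is no 'detach' to forget. The data frame `df` here is a made-up example.

```r
df <- data.frame(height = c(1.6, 1.7, 1.8), weight = c(60, 70, 80))

# attach/detach style: columns go on the search path until detached.
attach(df)
m1 <- mean(weight)
detach(df)

# 'with' style: same result, nothing left on the search path.
m2 <- with(df, mean(weight))
```

Forgetting the 'detach' leaves the columns masking other objects, which is one reason many people avoid 'attach' entirely.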
Thanks - that's great. A combination of object.size, rm and gc seems
to be enough for me to work out what was causing the problem and then get
beyond it.
In particular, using rm on the result of the multiple regression seems to
make a big difference: it wasn't obvious to me before, but of
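The rm-then-gc workflow described above might look like this; the `lm` fit on the built-in `mtcars` data is just a stand-in for the large regression result in the thread.

```r
# Fit a model, check how much memory its result occupies,
# then remove it and trigger garbage collection.
fit <- lm(mpg ~ ., data = mtcars)
print(object.size(fit), units = "Kb")

rm(fit)          # drop the reference to the fitted model
invisible(gc())  # ask R to reclaim the memory now
```

Fitted model objects can be surprisingly large because they keep a copy of the data, residuals, and the QR decomposition, so removing them explicitly can free a lot of memory.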
On Tue, Jun 30, 2009 at 2:36 PM, gug guygr...@netvigator.com wrote:
I've been using attach because I was following one of the approaches
recommended in this Basic Statistics and R tutorial
(http://ehsan.karim.googlepages.com/lab251t3.pdf), in order to be able to
easily use the column
It is usually better (and easier) to use the data argument that comes with
many modelling functions
-- Yes. And for functions without a data argument, see ?with.
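Both pieces of advice in one short sketch, again using the built-in `mtcars` data as an illustrative example: modelling functions such as `lm` take a `data` argument, and `with` covers functions that don't.

```r
# Column names can be used directly in the formula thanks to 'data ='.
fit <- lm(mpg ~ wt + hp, data = mtcars)

# 'cor' has no data argument, so 'with' supplies the columns instead.
r <- with(mtcars, cor(wt, mpg))
```

In both cases no 'attach' is needed and the search path is left untouched.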
Bert Gunter
Genentech Nonclinical Biostatistics
R-help@r-project.org mailing list
Thanks to everyone who has posted. These posts have really helped me advance my understanding of R, as well as giving me a couple of new areas that I still need to work on.
These (below) won't be news to the people who have posted, but for anyone
who is in my position, here are a