I get it from the Windows Task Manager (under Windows 7). I guess the unit is KB
or something.
My point was not about "how big is my dataset" (anyway, it is a fake dataset, so it can be as big as
I want) but rather about "where on earth did the 52 760 - 39 668 K go?"
:-)
Christophe
On Nov 6, 2012, at 1:03 PM, Christophe Genolini wrote:
Hi list,
I have package foo0 with a big dataset 'myData'.
In DESCRIPTION, if I use 'LazyData: no', then I get:
- when I open an R session: memory used = 20 908
- when I attach 'library(foo0)': memory used = 24 364
- when I then load the dataset with 'data(myData)': memory used = 39 668
If I use 'LazyData: yes', then I get:
- when I open an R session: memory used = 20 908
- when I attach 'library(foo0)': memory used = 52 760.
In this second example, after 'library(foo0)', I was expecting the memory to
rise to 39 668, not to 52 760... Where does the difference come from?
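For reference, the DESCRIPTION field in question would look something like this (the package name 'foo0' comes from the example above; the other fields are placeholder values):

```
Package: foo0
Title: Example Package with a Large Dataset
Version: 0.1
LazyData: yes
```

With 'LazyData: yes', the datasets are stored in a lazy-load database at install time and attached as promises when the package is loaded.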
What do you mean by "memory used" - i.e. where do you get that from? After GC?
This certainly doesn't look like a "big dataset" by the numbers - I would
classify that as tiny :)
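For what it's worth, comparing memory from inside R after a garbage collection is more reliable than the Task Manager figure, which also counts the allocator's own overhead and garbage not yet reclaimed. A minimal sketch (the million-element vector is a stand-in for the hypothetical 'myData' dataset):

```r
## Memory attributable to a load, as reported by R's own GC.
## gc() returns a matrix with rows Ncells/Vcells; column 2 is the
## "used" size in Mb, so summing it gives total Mb in use after a GC.
mem_mb <- function() sum(gc()[, 2])

invisible(gc(reset = TRUE))  # collect first so 'before' is a clean baseline
before <- mem_mb()
x <- rnorm(1e6)              # stand-in for library(foo0); data(myData)
after <- mem_mb()
delta <- after - before      # roughly 8 Mb: 1e6 doubles at 8 bytes each
print(object.size(x), units = "Mb")  # size of the one object directly
```

object.size() is the quickest check for a single dataset, while the gc() delta captures everything a library() or data() call pulled in.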
Cheers,
Simon
Thanks
Christophe
--
Christophe Genolini
Maître de conférences (associate professor) in biostatistics
Vice-president for internal communication and campus life
Université Paris Ouest Nanterre La Défense
______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel