Re: [R] About Memory size

2007-06-24 Thread Patrick Burns
Uwe Ligges wrote:

 ...

 RAM is 
cheap and thinking hurts. 

...

Surely a fortune.

Patrick Burns
[EMAIL PROTECTED]
+44 (0)20 8525 0696
http://www.burns-stat.com
(home of S Poetry and A Guide for the Unwilling S User)

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] About Memory size

2007-06-23 Thread Uwe Ligges


Ferdouse Begum wrote:
 Hi,
 I am trying to analyse a cancer data set (Affymetrix)
 using Bioconductor packages. I have 14 data sets in
 total. The total size of the data sets is 432MB. 

Do you mean it consumes 432MB to have the data in R, or is that its size in 
some format on the hard disk?
Do you need to work on all of the datasets at once?
Have you read the manuals and FAQs (there are sections about memory!!!)?


 Now I am trying
 to analyse these data sets on my PC with 512MB of RAM. But


If you need 432MB just to have the data available in R, then you should 
have *at least* 1GB of RAM in your machine, and for certain functions 
you might need much more.

Hence the advice is to rethink how to reduce the problem, or to buy 2GB 
of RAM for your machine (which is advisable in any case, because RAM is 
cheap and thinking hurts). We have upgraded all of our computer labs to 
at least 1GB these days.
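A minimal sketch of the Windows-only helpers that the error message itself
points to (see help(memory.size); this assumes a Windows R session of that
era, and 1500 is just an example value):

  memory.size()              # MB currently used by this R session
  memory.size(max = TRUE)    # maximum MB obtained from the OS so far
  memory.limit()             # current ceiling in MB (the 503Mb in the error)
  memory.limit(size = 1500)  # raise the ceiling, if the machine has the RAM

Raising the ceiling only helps if the physical RAM (plus swap) is really
there; otherwise the advice about reducing the problem or adding RAM stands.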

Uwe Ligges


 if I want to get an MAplot of my data set, I am getting
 the message
 (
 MAplot(ptc.rawData)
 Error: cannot allocate vector of size 73.8 Mb
 In addition: Warning messages:
 1: Reached total allocation of 503Mb: see
 help(memory.size) 
 2: Reached total allocation of 503Mb: see
 help(memory.size) 
 3: Reached total allocation of 503Mb: see
 help(memory.size) 
 4: Reached total allocation of 503Mb: see
 help(memory.size))
 
 Now how can I get rid of this problem? 
 Please help.

 With thanks
 
 Ferdouse
 

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] about memory

2005-03-30 Thread jon butchar
How much memory is free when R fails (e.g., what does top show while trying 
to run your clustering)?  If there's still a sizeable amount of free memory you 
may have to look into the system limits, maximum data segment size in 
particular.  Many Linux distros have it set to unlimited but default Debian 
may not.  If this turns out to be the problem, please do not, _do not_ raise it 
to unlimited, but only to enough for R to work.
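From inside R, a rough counterpart to watching the process in top is gc(),
which reports what the session itself has allocated (a small sketch, nothing
specific to this analysis):

  gc()                      # Ncells/Vcells in use and the current gc trigger
  object.size(rnorm(1e6))   # size of one object: about 8 MB for 1e6 doubles

If gc() shows R using far less than the machine has free, an OS-level limit
is the more likely culprit.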

hth,

jon b



On Wed, 30 Mar 2005 18:36:37 +0800
ronggui [EMAIL PROTECTED] wrote:

 here is my system memory:
 [EMAIL PROTECTED] free
               total       used       free     shared    buffers     cached
  Mem:        256728      79440     177288          0       2296      36136
  -/+ buffers/cache:      41008     215720
  Swap:       481908      60524     421384
 
 and I want to cluster my data using hclust. My data has 3 variables and
 10,000 cases, but it fails, saying there is not enough memory for the
 vector size. I read the help doc and used "R --max-vsize=800M" to start R
 2.1.0beta under Debian Linux, but it still did not work. So is my PC's
 memory not enough for this analysis, or did I make a mistake in setting
 the memory?
 
 thank you.
 

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] about memory

2005-03-30 Thread ronggui
[EMAIL PROTECTED] ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) unlimited
virtual memory          (kbytes, -v) unlimited

So it seems the data segment size is not limited.
There is still some free memory (1,000k or so) and swap (100,000k or so), and
the error is (I translated it from Chinese into English, maybe not exactly,
but I think the meaning is right):

error: cannot allocate a vector of size 390585 Kb



On Wed, 30 Mar 2005 07:34:13 -0500
jon butchar [EMAIL PROTECTED] wrote:

 How much memory is free when R fails (e.g., what does top show while trying 
 to run your clustering)?  If there's still a sizeable amount of free memory 
 you may have to look into the system limits, maximum data segment size in 
 particular.  Many Linux distros have it set to unlimited but default Debian 
 may not.  If this turns out to be the problem, please do not, _do not_ raise 
 it to unlimited, but only to enough for R to work.
 
 hth,
 
 jon b
 
 
 
 On Wed, 30 Mar 2005 18:36:37 +0800
 ronggui [EMAIL PROTECTED] wrote:
 
  here is my system memory:
  [EMAIL PROTECTED] free
                total       used       free     shared    buffers     cached
   Mem:        256728      79440     177288          0       2296      36136
   -/+ buffers/cache:      41008     215720
   Swap:       481908      60524     421384
  
  and I want to cluster my data using hclust. My data has 3 variables and
  10,000 cases, but it fails, saying there is not enough memory for the
  vector size. I read the help doc and used "R --max-vsize=800M" to start R
  2.1.0beta under Debian Linux, but it still did not work. So is my PC's
  memory not enough for this analysis, or did I make a mistake in setting
  the memory?
  
  thank you.
  

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


RE: [R] about memory

2005-03-30 Thread Huntsinger, Reid
hclust creates a distance matrix. In your case it is 10,000 x 10,000. For
various reasons several copies are created, so you probably need at least 

100M x 8 bytes per entry x 3 copies = 2.4 GB

just for the distance matrix. If you don't have that much RAM the
computation will probably take longer than you're willing to wait.
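A quick back-of-the-envelope check of those figures in R (numbers taken from
the thread, nothing new measured):

  n <- 10000
  n * n * 8 / 1e6             # full 10,000 x 10,000 matrix: 800 MB per copy
  n * n * 8 * 3 / 1e9         # three copies: the 2.4 GB mentioned above
  n * (n - 1) / 2 * 8 / 1024  # lower triangle that dist() stores: ~390586 KB,
                              # i.e. the size in the reported allocation error

So even the single dist object is close to 400 MB before hclust makes any
working copies.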

Reid Huntsinger

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of ronggui
Sent: Wednesday, March 30, 2005 5:37 AM
To: r-help@stat.math.ethz.ch
Subject: [R] about memory


here is my system memory:
[EMAIL PROTECTED] free
             total       used       free     shared    buffers     cached
Mem:        256728      79440     177288          0       2296      36136
-/+ buffers/cache:      41008     215720
Swap:       481908      60524     421384

and I want to cluster my data using hclust. My data has 3 variables and 10,000
cases, but it fails, saying there is not enough memory for the vector size. I
read the help doc and used "R --max-vsize=800M" to start R 2.1.0beta under
Debian Linux, but it still did not work. So is my PC's memory not enough for
this analysis, or did I make a mistake in setting the memory?

thank you.


__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] about memory

2005-03-30 Thread jon butchar
Yes, you may need more memory unless you can somehow free a good amount of RAM 
or find a more memory-efficient method for clustering.  If I'm reading it 
correctly, R wanted to allocate about 382 MB of memory on top of what it had 
already taken, but your computer had only about 98 MB of swap plus about 1 MB 
of RAM left to give.
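One illustration of a memory-lighter route, purely as a sketch (clara() from
the recommended "cluster" package clusters via subsamples rather than the
full distance matrix; k, samples and sampsize below are arbitrary example
values, and x is simulated stand-in data):

  library(cluster)
  x  <- matrix(rnorm(10000 * 3), ncol = 3)      # 10,000 cases, 3 variables
  cl <- clara(x, k = 5, samples = 50, sampsize = 200)
  table(cl$clustering)                          # cluster sizes

Because clara() only ever computes dissimilarities within each subsample, the
~400 MB dist object for all 10,000 cases never has to exist.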


On Wed, 30 Mar 2005 22:02:04 +0800
ronggui [EMAIL PROTECTED] wrote:

 [EMAIL PROTECTED] ulimit -a
 core file size          (blocks, -c) 0
 data seg size           (kbytes, -d) unlimited
 file size               (blocks, -f) unlimited
 max locked memory       (kbytes, -l) unlimited
 max memory size         (kbytes, -m) unlimited
 open files                      (-n) 1024
 pipe size            (512 bytes, -p) 8
 stack size              (kbytes, -s) 8192
 cpu time               (seconds, -t) unlimited
 max user processes              (-u) unlimited
 virtual memory          (kbytes, -v) unlimited
 
 So it seems the data segment size is not limited.
 There is still some free memory (1,000k or so) and swap (100,000k or so), and
 the error is (I translated it from Chinese into English, maybe not exactly,
 but I think the meaning is right):

 error: cannot allocate a vector of size 390585 Kb

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html