Re: [R] memory allocation problem

2016-12-06 Thread Jeff Newmiller
Buy more memory? Do something different than you were doing before the error 
occurred? Use a search engine to find what other people have done when this 
message appeared? Follow the recommendations in the Posting Guide mentioned in 
the footer of this and every post on this mailing list? 
-- 
Sent from my phone. Please excuse my brevity.

On December 6, 2016 7:40:40 AM PST, Elham - via R-help  
wrote:
>hi everyone,
>I tried to run my code in RStudio, but I received this error
>message. What should I do?
>Error: cannot allocate vector of size 12.1 Gb
>In addition: Warning messages:
>1: In cor(coding.rpkm[grep("23.C", coding.rpkm$name), -1],
>ncoding.rpkm[grep("23.C",  :
>  Reached total allocation of 6027Mb: see help(memory.size)
>
>__
>R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide
>http://www.R-project.org/posting-guide.html
>and provide commented, minimal, self-contained, reproducible code.



[R] memory allocation problem

2016-12-06 Thread Elham - via R-help
hi everyone,
I tried to run my code in RStudio, but I received this error message. What
should I do?
Error: cannot allocate vector of size 12.1 Gb
In addition: Warning messages:
1: In cor(coding.rpkm[grep("23.C", coding.rpkm$name), -1], 
ncoding.rpkm[grep("23.C",  :
  Reached total allocation of 6027Mb: see help(memory.size)



Re: [R] Memory allocation using .C interface

2014-04-10 Thread Cassiano dos Santos
OK, that is what I suspected.

Thanks for the clear explanation.

[]s
Cassiano




2014-04-09 18:37 GMT-03:00 Peter Langfelder peter.langfel...@gmail.com:

 On Wed, Apr 9, 2014 at 11:27 AM, Cassiano dos Santos crn...@gmail.com
 wrote:
  I am testing a call to a C function from R, using .C interface. The test
  consists in passing a numeric vector to the C function with no entries,
  dynamically allocates n positions, makes attributions and return the
 vector
  to R.

 When execution enters your C function, the pointer x points to the
 content (numerical values) of the R object known as 'x' to R code.
 However, the content has length 0 and the value of the pointer may be
 undefined (not sure about how R handles empty vectors).

 You then change the C pointer x to point to the memory you allocated.
 This memory has no relation to the R object 'x', so any changes you
 make cannot be reflected in the R object x.

 Further, when execution exits your function, the pointer to your
 allocated memory is lost and your memory is not de-allocated (that is,
 returned to the system). You should call the Free function on exit
 from your function.

 So the answer is that you cannot use the .C interface for this. You
 could achieve your goal via the .Call interface but you have to read
 up about how to work with R objects in C code.

 HTH,

 Peter

 
  I'm using Calloc from R.h. The prototype of the function is
 
  type* Calloc(size_t n, type)
 
  as noted in Writing R Extensions.
 
  The problem is that I don't get the new vector with the allocated
 positions
  in R. The vector continues to have no entries.
 
  The code in R:
 
  fooR <- function(x) {
    if (!is.numeric(x))
      stop("argument x must be numeric")
    out <- .C("foo",
              x = as.double(x))
    return(out$x)
  }
 
  x <- numeric()
 
  result <- myfooR(x)
 
  The function in C:
 
  #include <R.h>
  void myfooRealloc(double *x) {
    int i, n;
 
    n = 4;
    x = Calloc(n, double);
 
    for (i = 0; i < n; i++) {
      x[i] = i;
      printf("%f\n", x[i]); /* just to check */
    }
  }
 
  The question is: can the .C interface handle such memory allocation?
 




[R] Memory allocation using .C interface

2014-04-09 Thread Cassiano dos Santos
I am testing a call to a C function from R, using the .C interface. The test
consists of passing a numeric vector with no entries to the C function, which
dynamically allocates n positions, makes assignments, and returns the vector
to R.

I'm using Calloc from R.h. The prototype of the function is

type* Calloc(size_t n, type)

as noted in Writing R Extensions.

The problem is that I don't get the new vector with the allocated positions
in R. The vector continues to have no entries.

The code in R:

fooR <- function(x) {
  if (!is.numeric(x))
    stop("argument x must be numeric")
  out <- .C("foo",
            x = as.double(x))
  return(out$x)
}

x <- numeric()

result <- myfooR(x)

The function in C:

#include <R.h>
void myfooRealloc(double *x) {
  int i, n;

  n = 4;
  x = Calloc(n, double);

  for (i = 0; i < n; i++) {
    x[i] = i;
    printf("%f\n", x[i]); /* just to check */
  }
}

The question is: can the .C interface handle such memory allocation?



Re: [R] Memory allocation using .C interface

2014-04-09 Thread Peter Langfelder
On Wed, Apr 9, 2014 at 11:27 AM, Cassiano dos Santos crn...@gmail.com wrote:
 I am testing a call to a C function from R, using .C interface. The test
 consists in passing a numeric vector to the C function with no entries,
 dynamically allocates n positions, makes attributions and return the vector
 to R.

When execution enters your C function, the pointer x points to the
content (numerical values) of the R object known as 'x' to R code.
However, the content has length 0 and the value of the pointer may be
undefined (not sure about how R handles empty vectors).

You then change the C pointer x to point to the memory you allocated.
This memory has no relation to the R object 'x', so any changes you
make cannot be reflected in the R object x.

Further, when execution exits your function, the pointer to your
allocated memory is lost and your memory is not de-allocated (that is,
returned to the system). You should call the Free function on exit
from your function.

So the answer is that you cannot use the .C interface for this. You
could achieve your goal via the .Call interface but you have to read
up about how to work with R objects in C code.

HTH,

Peter


 I'm using Calloc from R.h. The prototype of the function is

 type* Calloc(size_t n, type)

 as noted in Writing R Extensions.

 The problem is that I don't get the new vector with the allocated positions
 in R. The vector continues to have no entries.

 The code in R:

 fooR <- function(x) {
   if (!is.numeric(x))
     stop("argument x must be numeric")
   out <- .C("foo",
             x = as.double(x))
   return(out$x)
 }

 x <- numeric()

 result <- myfooR(x)

 The function in C:

 #include <R.h>
 void myfooRealloc(double *x) {
   int i, n;

   n = 4;
   x = Calloc(n, double);

   for (i = 0; i < n; i++) {
     x[i] = i;
     printf("%f\n", x[i]); /* just to check */
   }
 }

 The question is: can the .C interface handle such memory allocation?




Re: [R] Memory allocation using .C interface

2014-04-09 Thread Dirk Eddelbuettel
Cassiano dos Santos crns13 at gmail.com writes:
 I am testing a call to a C function from R, using the .C interface. The
 test consists of passing a numeric vector with no entries to the C
 function, which dynamically allocates n positions, makes assignments, and
 returns the vector to R.

Asking on StackOverflow *and* here is considered rude.  

I have tried to answer your question on StackOverflow.

Dirk



[R] memory allocation and management question

2013-07-15 Thread ivo welch
dear R experts:   I am curious again about R memory allocation strategies.
 Consider an intentionally inefficient program:

ranmatme <- function(lx, rx) {
    m <- matrix(NA, nrow = lx, ncol = rx)
    for (li in 1:rx) {
        cat("\tLag i=", li, "object size=", object.size(m), "\n")
        m[, li] <- rnorm(lx)
    }
    m
}

v <- ranmatme(1024*1024*128, 3)


[1] on the first cat, the object size is only 1.6GB, which is half the size
of the 3.2GB that it is on the 2nd and 3rd call.  why?

[2] I tried to monitor the linux memory allocation in another window.  I
could be completely wrong, but it seems that upon function exit, memory
usage spikes briefly.  it is almost as if there was an explicit copy of m
into v, and both had to exist simultaneously for a moment in time.  is this
the case?  (if so, is there a way to return and assign just the reference?
 I may be blanking here---maybe the answer is obvious.)

regards,

/iaw

Ivo Welch (ivo.we...@gmail.com)



Re: [R] memory allocation and management question

2013-07-15 Thread jim holtman
I can give you the answer to #1.  If you had put a print(str(m)) in the loop,
you would have seen that initially the matrix was set up as logical, which
requires 4 bytes per element.  On the first assignment of a numeric, the mode
of 'm' is changed to numeric, which requires 8 bytes per element; that is the
reason for the doubling.


On Mon, Jul 15, 2013 at 6:50 PM, ivo welch ivo.we...@anderson.ucla.edu wrote:

 dear R experts:   I am curious again about R memory allocation strategies.
  Consider an intentionally inefficient program:

 ranmatme <- function(lx, rx) {
     m <- matrix(NA, nrow = lx, ncol = rx)
     for (li in 1:rx) {
         cat("\tLag i=", li, "object size=", object.size(m), "\n")
         m[, li] <- rnorm(lx)
     }
     m
 }

 v <- ranmatme(1024*1024*128, 3)


 [1] on the first cat, the object size is only 1.6GB, which is half the size
 of the 3.2GB that it is on the 2nd and 3rd call.  why?

 [2] I tried to monitor the linux memory allocation in another window.  I
 could be completely wrong, but it seems that upon function exit, memory
 usage spikes briefly.  it is almost as if there was an explicit copy of m
 into v, and both had to exist simultaneously for a moment in time.  is this
 the case?  (if so, is there a way to return and assign just the reference?
  I may be blanking here---maybe the answer is obvious.)

 regards,

 /iaw
 
 Ivo Welch (ivo.we...@gmail.com)





-- 
Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.



Re: [R] memory allocation and management question

2013-07-15 Thread ivo welch
thx, jim.  makes perfect sense now.

I guess a logical in R has a few million possible values ;-).

(Joke.  I realize that 4 bytes is to keep the code convenient and faster.)

regards,

/iaw

Ivo Welch (ivo.we...@gmail.com)

On Mon, Jul 15, 2013 at 4:26 PM, jim holtman jholt...@gmail.com wrote:

 I can give you the answer to #1.  If you had put a print(str(m)) you
 would have seen that initially the matrix was setup as logical which
 requires 4 bytes per element.  On the first assignment of a numeric, the
 mode of 'm' is changed to numeric which requires 8 bytes per element; that
 is the reason for the doubling.


 On Mon, Jul 15, 2013 at 6:50 PM, ivo welch ivo.we...@anderson.ucla.edu wrote:

 dear R experts:   I am curious again about R memory allocation strategies.
  Consider an intentionally inefficient program:

 ranmatme <- function(lx, rx) {
     m <- matrix(NA, nrow = lx, ncol = rx)
     for (li in 1:rx) {
         cat("\tLag i=", li, "object size=", object.size(m), "\n")
         m[, li] <- rnorm(lx)
     }
     m
 }

 v <- ranmatme(1024*1024*128, 3)


 [1] on the first cat, the object size is only 1.6GB, which is half the
 size
 of the 3.2GB that it is on the 2nd and 3rd call.  why?

 [2] I tried to monitor the linux memory allocation in another window.  I
 could be completely wrong, but it seems that upon function exit, memory
 usage spikes briefly.  it is almost as if there was an explicit copy of m
 into v, and both had to exist simultaneously for a moment in time.  is
 this
 the case?  (if so, is there a way to return and assign just the reference?
  I may be blanking here---maybe the answer is obvious.)

 regards,

 /iaw
 
 Ivo Welch (ivo.we...@gmail.com)





 --
 Jim Holtman
 Data Munger Guru

 What is the problem that you are trying to solve?
 Tell me what you want to do, not how you want to do it.




[R] R memory allocation

2012-05-25 Thread swaraj basu
Dear All,

I am running R in a system with the following configuration

Processor: Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
OS: Ubuntu x86_64 10.10
RAM: 24 GB

The R session info is:

R version 2.14.1 (2011-12-22)
Platform: x86_64-pc-linux-gnu (64-bit)

locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8        LC_COLLATE=en_US.UTF-8
 [5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8
 [7] LC_PAPER=C                 LC_NAME=C
 [9] LC_ADDRESS=C               LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C


I have a matrix of dimensions 12 rows X 29318 columns. The matrix contains
numeric as well as NA values. I am using the rcorr function from the Hmisc
package to get correlation information from the matrix (rcorr(matrix)).
During the calculation I get the error "cannot allocate vector of size
6.7 GB". When I check the memory allocation of my R session I get the
following information:

> gc()
          used (Mb) gc trigger (Mb) limit (Mb) max used (Mb)
Ncells   249638 13.4     467875 25.0         NA   407500 21.8
Vcells  1499217 11.5    2335949 17.9       7000  1970005 15.1

Can someone please help me find a workaround for this problem?

-Regards


-- 
Swaraj Basu
PhD Student (Bioinformatics - Functional Genomics)
Animal Physiology and Evolution
Stazione Zoologica Anton Dohrn
Naples



Re: [R] R memory allocation

2012-05-25 Thread Martin Morgan

On 05/25/2012 06:29 AM, swaraj basu wrote:

Dear All,

I am running R in a system with the following configuration

Processor: Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
OS: Ubuntu x86_64 10.10
RAM: 24 GB

The R session info is:

R version 2.14.1 (2011-12-22)
Platform: x86_64-pc-linux-gnu (64-bit)

locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8        LC_COLLATE=en_US.UTF-8
 [5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8
 [7] LC_PAPER=C                 LC_NAME=C
 [9] LC_ADDRESS=C               LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C


I have a matrix of dimensions 12 rows X 29318 columns. The matrix contains
numeric as well as NA values. I am using the rcorr function from the Hmisc
package to get correlation information from the matrix (rcorr(matrix)).
During the calculation I get the error "cannot allocate vector of size
6.7 GB". When I check the memory allocation of my R session I get the
following information


Perhaps you are trying to calculate correlations between the 12 rows, so you
want to transpose the matrix? If not, and if this is a gene expression
study, then common practice is to reduce the number of probe sets to those
that are most variable across all samples, as these are the ones that will
provide statistical signal.


Martin



> gc()
          used (Mb) gc trigger (Mb) limit (Mb) max used (Mb)
Ncells   249638 13.4     467875 25.0         NA   407500 21.8
Vcells  1499217 11.5    2335949 17.9       7000  1970005 15.1

Can someone please help me find a workaround for this problem?

-Regards





--
Computational Biology
Fred Hutchinson Cancer Research Center
1100 Fairview Ave. N. PO Box 19024 Seattle, WA 98109

Location: M1-B861
Telephone: 206 667-2793



[R] Memory allocation error

2012-05-24 Thread swaraj basu
Dear All,

I am running R in a system with the following configuration

Processor: Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
OS: Ubuntu x86_64 10.10
RAM: 24 GB

The R session info is:

R version 2.14.1 (2011-12-22)
Platform: x86_64-pc-linux-gnu (64-bit)

locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8        LC_COLLATE=en_US.UTF-8
 [5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8
 [7] LC_PAPER=C                 LC_NAME=C
 [9] LC_ADDRESS=C               LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C


I have a matrix of dimensions 12 rows X 29318 columns. The matrix contains
numeric as well as NA values. I am using the rcorr function from the Hmisc
package to get correlation information from the matrix (rcorr(matrix)).
During the calculation I get the error "cannot allocate vector of size
6.7 GB". When I check the memory allocation of my R session I get the
following information:

> gc()
          used (Mb) gc trigger (Mb) limit (Mb) max used (Mb)
Ncells   249638 13.4     467875 25.0         NA   407500 21.8
Vcells  1499217 11.5    2335949 17.9       7000  1970005 15.1

Can someone please help me find a workaround for this problem?

-Regards
-- 
Swaraj Basu
PhD Student (Bioinformatics - Functional Genomics)
Animal Physiology and Evolution
Stazione Zoologica Anton Dohrn
Naples



[R] Memory allocation problem (again!)

2012-02-08 Thread Christofer Bogaso
Dear all, I know this problem was discussed many times in the forum, but
unfortunately I could not find any way out for my own problem. I am having
a memory allocation problem while generating a lot of random numbers. Here
is my description:

> rnorm(50000*6000)
Error: cannot allocate vector of size 2.2 Gb
In addition: Warning messages:
1: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
> memory.size(TRUE)
[1] 15.75
> rnorm(50000*6000)
Error: cannot allocate vector of size 2.2 Gb
In addition: Warning messages:
1: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)

And the Session info is here:

> sessionInfo()
R version 2.14.0 (2011-10-31)
Platform: i386-pc-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_United States.1252  LC_CTYPE=English_United
States.1252   
[3] LC_MONETARY=English_United States.1252 LC_NUMERIC=C

[5] LC_TIME=English_United States.1252

attached base packages:
[1] graphics  grDevices utils datasets  grid  stats methods
base 

other attached packages:
[1] ggplot2_0.8.9 proto_0.3-9.2 reshape_0.8.4 plyr_1.6  zoo_1.7-6

loaded via a namespace (and not attached):
[1] lattice_0.20-0

I am using Windows 7 (Home version) with 4 GB of RAM (2.16 GB is usable, as
my computer reports). So in my case, is it not possible to generate a random
vector of such length? Note that generating such a vector is my primary job.
Later I need to do something with that vector. Those jobs include:
1. Create a matrix with 50,000 rows.
2. Get the row sums.
3. Then report some metrics on those sum values (min. 50,000 elements must
be there).

Can somebody help me with a real solution/suggestion?

Thanks and regards,



Re: [R] Memory allocation problem (again!)

2012-02-08 Thread Justin Haynes
32-bit Windows has a memory limit of 2 GB.  Upgrading to a computer that's
less than 10 years old is the best path.

But short of that, if you're just generating random data, why not do it in
two or more pieces and combine them later?

mat.1 <- matrix(rnorm(50000*2000), nrow = 50000)
mat.2 <- matrix(rnorm(50000*2000), nrow = 50000)
mat.3 <- matrix(rnorm(50000*2000), nrow = 50000)

mat.1.sums <- rowSums(mat.1)
mat.2.sums <- rowSums(mat.2)
mat.3.sums <- rowSums(mat.3)

# add the block-wise row sums to get the row sums of the full matrix
mat.sums <- mat.1.sums + mat.2.sums + mat.3.sums



On Wed, Feb 8, 2012 at 8:37 AM, Christofer Bogaso 
bogaso.christo...@gmail.com wrote:

 Dear all, I know this problem was discussed many times in forum, however
 unfortunately I could not find any way out for my own problem. Here I am
 having Memory allocation problem while generating a lot of random number.
 Here is my description:

 > rnorm(50000*6000)
 Error: cannot allocate vector of size 2.2 Gb
 In addition: Warning messages:
 1: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
 2: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
 3: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
 4: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
 > memory.size(TRUE)
 [1] 15.75
 > rnorm(50000*6000)
 Error: cannot allocate vector of size 2.2 Gb
 In addition: Warning messages:
 1: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
 2: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
 3: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
 4: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)

 And the Session info is here:

  sessionInfo()
 R version 2.14.0 (2011-10-31)
 Platform: i386-pc-mingw32/i386 (32-bit)

 locale:
 [1] LC_COLLATE=English_United States.1252  LC_CTYPE=English_United
 States.1252
 [3] LC_MONETARY=English_United States.1252 LC_NUMERIC=C

 [5] LC_TIME=English_United States.1252

 attached base packages:
 [1] graphics  grDevices utils datasets  grid  stats methods
 base

 other attached packages:
 [1] ggplot2_0.8.9 proto_0.3-9.2 reshape_0.8.4 plyr_1.6  zoo_1.7-6

 loaded via a namespace (and not attached):
 [1] lattice_0.20-0

 I am using Windows 7 (home version) with 4 GB of RAM (2.16GB is usable as
 my
 computer reports). So in my case, is it not possible to generate a random
 vector with such length? Note that generating such vector is my primary
 job.
 Later I need to do something on that vector. Those Job includes:
 1. Create a matrix with 50,000 rows.
 2. Get the row sum
 3. then report some metrics on that sum values (min. 50,000 elements must
 be
 there).

 Can somebody help me with some real solution/suggesting?

 Thanks and regards,



Re: [R] Memory allocation problem (again!)

2012-02-08 Thread Ernest Adrogué
On 8-02-2012, 22:22 (+0545), Christofer Bogaso wrote:
 And the Session info is here:
 
  sessionInfo()
 R version 2.14.0 (2011-10-31)
 Platform: i386-pc-mingw32/i386 (32-bit)

Not an expert, but I think that 32-bit applications can only address
up to 2GB on Windows.

-- 
Bye,
Ernest



Re: [R] memory allocation in R

2011-11-27 Thread Simon Urbanek

On Nov 23, 2011, at 10:42 AM, Marc Jekel wrote:

 Dear R community,
 
 I was observing a memory issue in R (latest 64bit R version running on a win 
 7 64 bit system) that made me curious.
 
 I kept track of the memory of my PC allocated to R to calculate + keep several 
 objects in the workspace. If I then save the workspace, close R, and open the 
 workspace again, less memory is allocated to keep the same set of variables 
 into the workspace. For my case, the reduction in memory size was quite 
 significant (approx. 2 GB).
 
 Does anyone know why R behaves in this manner - put differently: What does R 
 keep in the workspace beyond the objects before I close R? Can I induce the 
 reduction in memory without the need to close R?
 

You can explicitly clean up using gc() [do not use gctorture() - that is 
nonsensical in this context]. After that, R keeps in memory only objects that 
are currently in use. What is in the workspace (global environment) is 
explicitly under your control. Note, however, that the system (as reported by 
tools like ps or top) may not be able to reclaim memory (in particular on 
Linux) even though R has released it - see R FAQ 7.42 for details.

Cheers,
Simon


 Thanks for an email!
 
 Marc
 



[R] memory allocation in R

2011-11-23 Thread Marc Jekel

Dear R community,

I was observing a memory issue in R (latest 64bit R version running on a 
win 7 64 bit system) that made me curious.


I kept track of the memory of my PC allocated to R to calculate + keep 
several objects in the workspace. If I then save the workspace, close R, 
and open the workspace again, less memory is allocated to keep the same 
set of variables into the workspace. For my case, the reduction in 
memory size was quite significant (approx. 2 GB).


Does anyone know why R behaves in this manner - put differently: What 
does R keep in the workspace beyond the objects before I close R? Can I 
induce the reduction in memory without the need to close R?


Thanks for an email!

Marc



Re: [R] memory allocation in R

2011-11-23 Thread Mehmet Suzen
You may want to turn garbage-collection torture on:

gctorture(on = TRUE)

see: ?gctorture
 ?gcinfo
 ?object.size

-Original Message-
From: r-help-boun...@r-project.org
[mailto:r-help-boun...@r-project.org]
On Behalf Of Marc Jekel
Sent: 23 November 2011 15:42
To: R-help@r-project.org
Subject: [R] memory allocation in R

Dear R community,

I was observing a memory issue in R (latest 64bit R version running on
a
win 7 64 bit system) that made me curious.

I kept track of the memory of my PC allocated to R to calculate + keep
several objects in the workspace. If I then save the workspace, close
R,
and open the workspace again, less memory is allocated to keep the same
set of variables into the workspace. For my case, the reduction in
memory size was quite significant (approx. 2 GB).

Does anyone know why R behaves in this manner - put differently: What
does R keep in the workspace beyond the objects before I close R? Can I
induce the reduction in memory without the need to close R?

Thanks for an email!

Marc




[R] Memory allocation problem

2011-04-08 Thread Luis Felipe Parra
Hello, I am running a program in R with a big number of simulations and I am
getting the following error:

Error: no se puede ubicar un vector de tamaño 443.3 Mb
(i.e., cannot allocate a vector of size 443.3 Mb)

I don't understand why because when I check the memory status in my pc I get
the following:

> memory.size()
[1] 676.3
> memory.size(T)
[1] 1124.69
> memory.limit()
[1] 4000

which should in theory allow a vector of size 443.3 Mb. I am running it on a
PC with Windows, 4 GB of RAM and an Intel Core i7 processor. Does anybody
know what might be going on?

Thank you

Felipe Parra
-- 

Este mensaje de correo electrónico es enviado por Quantil S.A.S y puede
contener información confidencial o privilegiada.

This e-mail is sent by Quantil S.A.S and may contain confidential or
privileged information



Re: [R] Memory allocation problem

2011-04-08 Thread Joshua Wiley
Hi Felipe,

On Fri, Apr 8, 2011 at 7:54 PM, Luis Felipe Parra
felipe.pa...@quantil.com.co wrote:
 Hello, I am runnning  a program on R with a big number of simulations and
 I am getting the following error:

 Error: no se puede ubicar un vector de tamaño  443.3 Mb

 I don't understand why because when I check the memory status in my pc I get
 the following:

 > memory.size()
 [1] 676.3
 > memory.size(T)
 [1] 1124.69
 > memory.limit()
 [1] 4000

 which should in theory allow allocating a vector of size 443.3 Mb. I am running
 it on a Windows PC with 4 GB of RAM and an Intel Core i7 processor. Does anybody
 know what might be going on?

It is not that *a* vector of size 443 MB could not be allocated given
your system.  However, during your simulation, multiple objects take
up memory, and that 443 MB vector was simply the final one that was too big
to allocate.  Depending on how your simulation is set up, you might be able
to remove objects that are no longer needed or rewrite it in a less
memory-intensive form.  I do not know enough about memory+R to offer
any specific advice as to solutions.  Hopefully someone else here can
chime in.

Good luck,

Josh
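To illustrate Josh's point with a toy sketch (all names here are invented), keeping only per-iteration summaries and releasing the bulky intermediates prevents allocations from accumulating across the simulation:

```r
# Store only the summary statistic from each replicate; remove the large
# intermediate object before the next iteration so memory does not grow.
n_reps  <- 50
results <- numeric(n_reps)
for (i in seq_len(n_reps)) {
  draws      <- rnorm(1e5)    # stand-in for a memory-hungry simulation step
  results[i] <- mean(draws)   # keep the small summary only
  rm(draws)                   # release the ~0.8 MB intermediate
}
gc()  # optionally hand the freed memory back
```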


 Thank you

 Felipe Parra
 --




-- 
Joshua Wiley
Ph.D. Student, Health Psychology
University of California, Los Angeles
http://www.joshuawiley.com/



Re: [R] memory allocation problem

2010-11-03 Thread Lorenzo Cattarino
Following on my memory allocation problem...

I tried to run my code on our university HPC facility, requesting 61 GB
of memory, and it still cannot allocate a vector of 5 MB.

> load('/home/uqlcatta/test_scripts/.RData')
>
> myfun <- function(Range, H1, H2, p, coeff)
+ {
+ -(coeff[1]+coeff[2]*H1+coeff[3]*H2+coeff[4]*p)*exp(-(coeff[5]+coeff[6]*H1+coeff[7]*H2+coeff[8]*p)*Range)+coeff[9]+coeff[10]*H1+coeff[11]*H2+coeff[12]*p
+ }
>
> SS <- function(coeff,steps,Range,H1,H2,p)
+ {
+ sum((steps - myfun(Range,H1,H2,p,coeff))^2)
+ }
>
> coeff <- c(1,1,1,1,1,1,1,1,1,1,1,1)
>
> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+ Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
+ p=org_results$p)
Error: cannot allocate vector of size 5.0 Mb
Execution halted

Might it be a problem with the function?

Any input is very much appreciated

Lorenzo

-Original Message-
From: Lorenzo Cattarino 
Sent: Wednesday, 3 November 2010 2:22 PM
To: 'David Winsemius'; 'Peter Langfelder'
Cc: r-help@r-project.org
Subject: RE: [R] memory allocation problem

Thanks for all your suggestions,

This is what I get after removing all the other (not useful) objects and
run my code:

> getsizes()
                [,1]
org_results 47240832
myfun          11672
getsizes        4176
SS              3248
coeff            168
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA

> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+ Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
+ p=org_results$p)
Error: cannot allocate vector of size 5.0 Mb
In addition: Warning messages:
1: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
2: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
3: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
4: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)


It seems that R is using all the default available memory (4 GB, which is
the RAM of my machine).

> memory.limit()
[1] 4055
> memory.size()
[1] 4049.07


My dataframe has a size of 47240832 bytes, or about 45 Mb. So it should
not be a problem in terms of memory usage?

I do not understand what is going on.

Thanks for your help anyway

Lorenzo

-Original Message-
From: David Winsemius [mailto:dwinsem...@comcast.net] 
Sent: Wednesday, 3 November 2010 12:48 PM
To: Lorenzo Cattarino
Cc: r-help@r-project.org
Subject: Re: [R] memory allocation problem

Restart your computer. (Yeah, I know that's what the help-desk always
says.)
Start R before doing anything else.

Then run your code in a clean session. Check ls() after start-up to
make sure you don't have a bunch of useless stuff in your .Rdata
file.   Don't load anything that is not germane to this problem.  Use
this function to see what sort of space issues you might have after
loading objects:

  getsizes <- function() {z <- sapply(ls(envir=globalenv()),
                 function(x) object.size(get(x)))
                (tmp <- as.matrix(rev(sort(z))[1:10]))}

Then run your code.
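For example, in a session with a couple of objects (names invented), the function lists the largest ones first; with fewer than ten objects the remaining rows are NA-padded:

```r
# getsizes(): the ten largest objects in globalenv(), by bytes.
getsizes <- function() {z <- sapply(ls(envir = globalenv()),
               function(x) object.size(get(x)))
              (tmp <- as.matrix(rev(sort(z))[1:10]))}

big_df <- data.frame(x = runif(1e5), y = runif(1e5))
small  <- 1:10
getsizes()   # big_df appears near the top of the listing
```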

-- 
David.

On Nov 2, 2010, at 10:13 PM, Lorenzo Cattarino wrote:

 I would also like to include details on my R version



 > version
                _
 platform       x86_64-pc-mingw32
 arch           x86_64
 os             mingw32
 system         x86_64, mingw32
 status
 major          2
 minor          11.1
 year           2010
 month          05
 day            31
 svn rev        52157
 language       R
 version.string R version 2.11.1 (2010-05-31)

 from FAQ 2.9
 (http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021)
 it says that:
 For a 64-bit build, the default is the amount of RAM

 So in my case the amount of RAM would be 4 GB. R should be able to
 allocate a vector of size 5 Mb without me typing any command (either  
 as
 memory.limit() or appended string in the target path), is that right?



 From: Lorenzo Cattarino
 Sent: Wednesday, 3 November 2010 10:55 AM
 To: 'r-help@r-project.org'
 Subject: memory allocation problem



 I forgot to mention that I am using windows 7 (64-bit) and the R  
 version
 2.11.1 (64-bit)



 From: Lorenzo Cattarino

 I am trying to run a non linear parameter optimization using the
 function optim() and I have problems regarding memory allocation.

 My data are in a dataframe with 9 columns. There are 656100 rows.

 head(org_results)

 comb.id   p H1 H2 Range Rep no.steps  dist aver.hab.amount

 1   1   0.1  0  0 11000
 0.2528321

Re: [R] memory allocation problem

2010-11-03 Thread Lorenzo Cattarino
Thanks for all your suggestions,

This is what I get after removing all the other (not useful) objects and
run my code:

> getsizes()
                [,1]
org_results 47240832
myfun          11672
getsizes        4176
SS              3248
coeff            168
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA

> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+ Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
+ p=org_results$p)
Error: cannot allocate vector of size 5.0 Mb
In addition: Warning messages:
1: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
2: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
3: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
4: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)


It seems that R is using all the default available memory (4 GB, which is
the RAM of my machine).

> memory.limit()
[1] 4055
> memory.size()
[1] 4049.07


My dataframe has a size of 47240832 bytes, or about 45 Mb. So it should
not be a problem in terms of memory usage?

I do not understand what is going on.

Thanks for your help anyway

Lorenzo

-Original Message-
From: David Winsemius [mailto:dwinsem...@comcast.net] 
Sent: Wednesday, 3 November 2010 12:48 PM
To: Lorenzo Cattarino
Cc: r-help@r-project.org
Subject: Re: [R] memory allocation problem

Restart your computer. (Yeah, I know that's what the help-desk always
says.)
Start R before doing anything else.

Then run your code in a clean session. Check ls() after start-up to
make sure you don't have a bunch of useless stuff in your .Rdata
file.   Don't load anything that is not germane to this problem.  Use
this function to see what sort of space issues you might have after
loading objects:

  getsizes <- function() {z <- sapply(ls(envir=globalenv()),
                 function(x) object.size(get(x)))
                (tmp <- as.matrix(rev(sort(z))[1:10]))}

Then run your code.

-- 
David.

On Nov 2, 2010, at 10:13 PM, Lorenzo Cattarino wrote:

 I would also like to include details on my R version



 > version
                _
 platform       x86_64-pc-mingw32
 arch           x86_64
 os             mingw32
 system         x86_64, mingw32
 status
 major          2
 minor          11.1
 year           2010
 month          05
 day            31
 svn rev        52157
 language       R
 version.string R version 2.11.1 (2010-05-31)

 from FAQ 2.9
 (http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021)
 it says that:
 For a 64-bit build, the default is the amount of RAM

 So in my case the amount of RAM would be 4 GB. R should be able to
 allocate a vector of size 5 Mb without me typing any command (either  
 as
 memory.limit() or appended string in the target path), is that right?



 From: Lorenzo Cattarino
 Sent: Wednesday, 3 November 2010 10:55 AM
 To: 'r-help@r-project.org'
 Subject: memory allocation problem



 I forgot to mention that I am using windows 7 (64-bit) and the R  
 version
 2.11.1 (64-bit)



 From: Lorenzo Cattarino

 I am trying to run a non linear parameter optimization using the
 function optim() and I have problems regarding memory allocation.

 My data are in a dataframe with 9 columns. There are 656100 rows.

 > head(org_results)

  comb.id   p H1 H2 Range Rep no.steps  dist aver.hab.amount
 1   1   0.1  0  0 11000 0.2528321  0.1393901
 2   1   0.1  0  0 11000 0.4605934  0.1011841
 3   1   0.1  0  0 11004 3.4273670  0.1052789
 4   1   0.1  0  0 11004 2.8766364  0.1022138
 5   1   0.1  0  0 11000 0.3496872  0.1041056
 6   1   0.1  0  0 11000 0.1050840  0.3572036

 > est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
 + Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
 + p=org_results$p)

 Error: cannot allocate vector of size 5.0 Mb

 In addition: Warning messages:

 1: In optim(coeff, SS, steps = org_results$no.steps, Range =
 org_results$Range,  : Reached total allocation of 10000Mb: see
 help(memory.size)

 2: In optim(coeff, SS, steps = org_results$no.steps, Range =
 org_results$Range,  : Reached total allocation of 10000Mb: see
 help(memory.size)

 3: In optim(coeff, SS, steps = org_results$no.steps, Range =
 org_results$Range

Re: [R] memory allocation problem

2010-11-03 Thread Jonathan P Daily
The optim function is very resource hungry. I have had similar problems in 
the past when dealing with extremely large datasets.

What is perhaps happening is that each 'step' of the optimization
algorithm stores some info so that it can compare to the next 'step', and
while the original vector may only be a few Mb of data, over many
iterations a huge amount of memory is allocated to the optimization steps.

Maybe look at the control options under ?optim, particularly stuff like 
trace, fnscale, ndeps, etc. that may cut down on the amount of data being 
stored each step as well as the number of steps needed.
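As a sketch of those control options on a toy objective (the function and values are illustrative only, not tuned for Lorenzo's problem):

```r
# Rosenbrock test function; the interesting part is the control list,
# which bounds the iteration count and sets the convergence tolerance.
rosen <- function(v) (1 - v[1])^2 + 100 * (v[2] - v[1]^2)^2
fit <- optim(c(-1.2, 1), rosen,
             method  = "Nelder-Mead",
             control = list(maxit  = 500,    # hard cap on steps
                            reltol = 1e-8,   # stop once improvements are tiny
                            trace  = 0))     # no per-step diagnostics
fit$par  # approaches c(1, 1)
```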

Good luck!
--
Jonathan P. Daily
Technician - USGS Leetown Science Center
11649 Leetown Road
Kearneysville WV, 25430
(304) 724-4480
Is the room still a room when its empty? Does the room,
 the thing itself have purpose? Or do we, what's the word... imbue it.
 - Jubal Early, Firefly



From:
Lorenzo Cattarino l.cattar...@uq.edu.au
To:
David Winsemius dwinsem...@comcast.net, Peter Langfelder 
peter.langfel...@gmail.com
Cc:
r-help@r-project.org
Date:
11/03/2010 03:26 AM
Subject:
Re: [R] memory allocation problem
Sent by:
r-help-boun...@r-project.org



Thanks for all your suggestions,

This is what I get after removing all the other (not useful) objects and
run my code:

> getsizes()
                [,1]
org_results 47240832
myfun          11672
getsizes        4176
SS              3248
coeff            168
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA

> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+ Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
+ p=org_results$p)
Error: cannot allocate vector of size 5.0 Mb
In addition: Warning messages:
1: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
2: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
3: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
4: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)


It seems that R is using all the default available memory (4 GB, which is
the RAM of my machine).

> memory.limit()
[1] 4055
> memory.size()
[1] 4049.07


My dataframe has a size of 47240832 bytes, or about 45 Mb. So it should
not be a problem in terms of memory usage?

I do not understand what is going on.

Thanks for your help anyway

Lorenzo

-Original Message-
From: David Winsemius [mailto:dwinsem...@comcast.net] 
Sent: Wednesday, 3 November 2010 12:48 PM
To: Lorenzo Cattarino
Cc: r-help@r-project.org
Subject: Re: [R] memory allocation problem

Restart your computer. (Yeah, I know that's what the help-desk always
says.)
Start R before doing anything else.

Then run your code in a clean session. Check ls() after start-up to
make sure you don't have a bunch of useless stuff in your .Rdata
file.   Don't load anything that is not germane to this problem.  Use
this function to see what sort of space issues you might have after
loading objects:

  getsizes <- function() {z <- sapply(ls(envir=globalenv()),
                 function(x) object.size(get(x)))
                (tmp <- as.matrix(rev(sort(z))[1:10]))}

Then run your code.

-- 
David.

On Nov 2, 2010, at 10:13 PM, Lorenzo Cattarino wrote:

 I would also like to include details on my R version



 > version
                _
 platform       x86_64-pc-mingw32
 arch           x86_64
 os             mingw32
 system         x86_64, mingw32
 status
 major          2
 minor          11.1
 year           2010
 month          05
 day            31
 svn rev        52157
 language       R
 version.string R version 2.11.1 (2010-05-31)

 from FAQ 2.9
 (http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021)
 it says that:
 For a 64-bit build, the default is the amount of RAM

 So in my case the amount of RAM would be 4 GB. R should be able to
 allocate a vector of size 5 Mb without me typing any command (either 
 as
 memory.limit() or appended string in the target path), is that right?



 From: Lorenzo Cattarino
 Sent: Wednesday, 3 November 2010 10:55 AM
 To: 'r-help@r-project.org'
 Subject: memory allocation problem



 I forgot to mention that I am using windows 7 (64-bit) and the R 
 version
 2.11.1 (64-bit)



 From: Lorenzo Cattarino

 I am trying to run a non linear parameter optimization using the
 function optim() and I have problems regarding memory allocation.

 My data are in a dataframe with 9 columns. There are 656100 rows.

 head(org_results)

 comb.id   p

[R] memory allocation problem

2010-11-02 Thread Lorenzo Cattarino
I forgot to mention that I am using windows 7 (64-bit) and the R version
2.11.1 (64-bit)

 

Thank you 

 

Lorenzo

 

From: Lorenzo Cattarino 
Sent: Wednesday, 3 November 2010 10:52 AM
To: r-help@r-project.org
Subject: memory allocation problem

 

Hi R users 

 

I am trying to run a non linear parameter optimization using the
function optim() and I have problems regarding memory allocation.

 

My data are in a dataframe with 9 columns. There are 656100 rows.

> head(org_results)

  comb.id   p H1 H2 Range Rep no.steps  dist aver.hab.amount
1   1   0.1  0  0 11000 0.2528321  0.1393901
2   1   0.1  0  0 11000 0.4605934  0.1011841
3   1   0.1  0  0 11004 3.4273670  0.1052789
4   1   0.1  0  0 11004 2.8766364  0.1022138
5   1   0.1  0  0 11000 0.3496872  0.1041056
6   1   0.1  0  0 11000 0.1050840  0.3572036

 

 

> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+ Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
+ p=org_results$p)

Error: cannot allocate vector of size 5.0 Mb

In addition: Warning messages:

1: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

2: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

3: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

4: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

> memory.size()
[1] 9978.19
> memory.limit()
[1] 10000

I know that I am not sending reproducible code, but I was hoping that
you could help me understand what is going on. I set a maximum limit of
10000 megabytes (by writing the string --max-mem-size=10000M after the
target path: right-click on the R icon, Shortcut tab). And R is telling me
that it cannot allocate a vector of size 5 Mb???

 

Thank you for your help

 

Lorenzo





[R] memory allocation problem

2010-11-02 Thread Lorenzo Cattarino
Hi R users 

 

I am trying to run a non linear parameter optimization using the
function optim() and I have problems regarding memory allocation.

 

My data are in a dataframe with 9 columns. There are 656100 rows.

> head(org_results)

  comb.id   p H1 H2 Range Rep no.steps  dist aver.hab.amount
1   1   0.1  0  0 11000 0.2528321  0.1393901
2   1   0.1  0  0 11000 0.4605934  0.1011841
3   1   0.1  0  0 11004 3.4273670  0.1052789
4   1   0.1  0  0 11004 2.8766364  0.1022138
5   1   0.1  0  0 11000 0.3496872  0.1041056
6   1   0.1  0  0 11000 0.1050840  0.3572036

 

 

> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+ Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
+ p=org_results$p)

Error: cannot allocate vector of size 5.0 Mb

In addition: Warning messages:

1: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

2: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

3: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

4: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

> memory.size()
[1] 9978.19
> memory.limit()
[1] 10000

I know that I am not sending reproducible code, but I was hoping that
you could help me understand what is going on. I set a maximum limit of
10000 megabytes (by writing the string --max-mem-size=10000M after the
target path: right-click on the R icon, Shortcut tab). And R is telling me
that it cannot allocate a vector of size 5 Mb???

 

Thank you for your help

 

Lorenzo





[R] memory allocation problem

2010-11-02 Thread Lorenzo Cattarino
I would also like to include details on my R version

 

> version
               _
platform       x86_64-pc-mingw32
arch           x86_64
os             mingw32
system         x86_64, mingw32
status
major          2
minor          11.1
year           2010
month          05
day            31
svn rev        52157
language       R
version.string R version 2.11.1 (2010-05-31)

From FAQ 2.9
(http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021)
it says that:

 

For a 64-bit build, the default is the amount of RAM

 

So in my case the amount of RAM would be 4 GB. R should be able to
allocate a vector of size 5 Mb without me typing any command (either as
memory.limit() or appended string in the target path), is that right?

 

Thank you a lot

 

Lorenzo

 

From: Lorenzo Cattarino 
Sent: Wednesday, 3 November 2010 10:55 AM
To: 'r-help@r-project.org'
Subject: memory allocation problem

 

I forgot to mention that I am using windows 7 (64-bit) and the R version
2.11.1 (64-bit)

 

Thank you 

 

Lorenzo

 

From: Lorenzo Cattarino 
Sent: Wednesday, 3 November 2010 10:52 AM
To: r-help@r-project.org
Subject: memory allocation problem

 

Hi R users 

 

I am trying to run a non linear parameter optimization using the
function optim() and I have problems regarding memory allocation.

 

My data are in a dataframe with 9 columns. There are 656100 rows.

> head(org_results)

  comb.id   p H1 H2 Range Rep no.steps  dist aver.hab.amount
1   1   0.1  0  0 11000 0.2528321  0.1393901
2   1   0.1  0  0 11000 0.4605934  0.1011841
3   1   0.1  0  0 11004 3.4273670  0.1052789
4   1   0.1  0  0 11004 2.8766364  0.1022138
5   1   0.1  0  0 11000 0.3496872  0.1041056
6   1   0.1  0  0 11000 0.1050840  0.3572036

 

 

> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+ Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
+ p=org_results$p)

Error: cannot allocate vector of size 5.0 Mb

In addition: Warning messages:

1: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

2: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

3: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

4: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

> memory.size()
[1] 9978.19
> memory.limit()
[1] 10000

I know that I am not sending reproducible code, but I was hoping that
you could help me understand what is going on. I set a maximum limit of
10000 megabytes (by writing the string --max-mem-size=10000M after the
target path: right-click on the R icon, Shortcut tab). And R is telling me
that it cannot allocate a vector of size 5 Mb???

 

Thank you for your help

 

Lorenzo





Re: [R] memory allocation problem

2010-11-02 Thread Peter Langfelder
You have (almost) exhausted the 10GB you limited R to (that's what the
memory.size() tells you). Increase memory.limit (if you have more RAM,
use memory.limit(15000) for 15GB etc.), or remove large data objects
from your session. Use rm(object), then issue garbage collection with gc().
Sometimes garbage collection may solve the problem on its own.

Peter
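A sketch of that sequence (memory.limit() is Windows-only and historical, removed in R 4.2, so it is commented out here; `big_object` is a placeholder name):

```r
# Free memory held by a large object, optionally after raising the cap.
# memory.limit(15000)              # Windows-only (pre-R 4.2): allow ~15 GB
big_object <- matrix(0, 2000, 2000)  # ~30 MB placeholder
rm(big_object)                       # drop the reference
gc()                                 # collect, so the space can be reused
```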


On Tue, Nov 2, 2010 at 5:55 PM, Lorenzo Cattarino l.cattar...@uq.edu.au wrote:
 I forgot to mention that I am using windows 7 (64-bit) and the R version
 2.11.1 (64-bit)





Re: [R] memory allocation problem

2010-11-02 Thread David Winsemius
Restart your computer. (Yeah, I know that's what the help-desk always
says.)

Start R before doing anything else.

Then run your code in a clean session. Check ls() after start-up to
make sure you don't have a bunch of useless stuff in your .Rdata
file.   Don't load anything that is not germane to this problem.  Use
this function to see what sort of space issues you might have after
loading objects:


 getsizes <- function() {z <- sapply(ls(envir=globalenv()),
function(x) object.size(get(x)))
   (tmp <- as.matrix(rev(sort(z))[1:10]))}

Then run your code.

--
David.

On Nov 2, 2010, at 10:13 PM, Lorenzo Cattarino wrote:


I would also like to include details on my R version




> version
               _
platform       x86_64-pc-mingw32
arch           x86_64
os             mingw32
system         x86_64, mingw32
status
major          2
minor          11.1
year           2010
month          05
day            31
svn rev        52157
language       R
version.string R version 2.11.1 (2010-05-31)

from FAQ 2.9
(http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021)
it says that:
For a 64-bit build, the default is the amount of RAM

So in my case the amount of RAM would be 4 GB. R should be able to
allocate a vector of size 5 Mb without me typing any command (either  
as

memory.limit() or appended string in the target path), is that right?



From: Lorenzo Cattarino
Sent: Wednesday, 3 November 2010 10:55 AM
To: 'r-help@r-project.org'
Subject: memory allocation problem



I forgot to mention that I am using windows 7 (64-bit) and the R  
version

2.11.1 (64-bit)



From: Lorenzo Cattarino

I am trying to run a non linear parameter optimization using the
function optim() and I have problems regarding memory allocation.

My data are in a dataframe with 9 columns. There are 656100 rows.


> head(org_results)

comb.id   p H1 H2 Range Rep no.steps  dist aver.hab.amount
1   1   0.1  0  0 11000 0.2528321  0.1393901
2   1   0.1  0  0 11000 0.4605934  0.1011841
3   1   0.1  0  0 11004 3.4273670  0.1052789
4   1   0.1  0  0 11004 2.8766364  0.1022138
5   1   0.1  0  0 11000 0.3496872  0.1041056
6   1   0.1  0  0 11000 0.1050840  0.3572036


> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+ Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
+ p=org_results$p)

Error: cannot allocate vector of size 5.0 Mb

In addition: Warning messages:

1: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

2: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

3: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

4: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range,  : Reached total allocation of 10000Mb: see
help(memory.size)

> memory.size()
[1] 9978.19

> memory.limit()
[1] 10000

I know that I am not sending reproducible code, but I was hoping that
you could help me understand what is going on. I set a maximum limit of
10000 megabytes (by writing the string --max-mem-size=10000M after the
target path: right-click on the R icon, Shortcut tab). And R is telling me
that it cannot allocate a vector of size 5 Mb???




David Winsemius, MD
West Hartford, CT



Re: [R] memory allocation problem

2010-11-02 Thread Peter Langfelder
Oops,  I missed that you only have 4GB of memory... but since R is
apparently capable of using almost 10GB, either you actually have more
RAM, or the system is swapping some data to disk. Increasing memory
use in R might still help, but also may lead to a situation where the
system waits forever for data to be swapped to and from the disk.

Peter

On Tue, Nov 2, 2010 at 7:36 PM, Peter Langfelder
peter.langfel...@gmail.com wrote:
 You have (almost) exhausted the 10GB you limited R to (that's what the
 memory.size() tells you). Increase memory.limit (if you have more RAM,
 use memory.limit(15000) for 15GB etc.), or remove large data objects
 from your session. Use rm(object), then issue garbage collection with gc().
 Sometimes garbage collection may solve the problem on its own.

 Peter


 On Tue, Nov 2, 2010 at 5:55 PM, Lorenzo Cattarino l.cattar...@uq.edu.au 
 wrote:
 I forgot to mention that I am using windows 7 (64-bit) and the R version
 2.11.1 (64-bit)






Re: [R] Memory allocation in 64 bit R

2010-10-02 Thread Uwe Ligges



On 02.10.2010 03:10, Peter Langfelder wrote:

Hi Mete,

I think you should look at the help for memory.limit. Try to set a
higher one, for example

memory.limit(16000)

(I think 16GB is what xenon will take).



But not too funny given you have only 8Gb in your machine.
So the answer probably is: Buy more RAM or try to reduce the problem.




Peter

On Fri, Oct 1, 2010 at 6:02 PM, Mete Civelekmcive...@mednet.ucla.edu  wrote:

Hi Everyone,

I am getting the following error message

Error: cannot allocate vector of size 2.6 Gb


So just the next step requires allocating 2.6 Gb! Note that you had only
about 2.6 Gb free at all, given the information you specified below. Hence it
won't work on any OS if you limit R to 8 Gb.


Uwe Ligges




In addition: Warning messages:
1: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)
2: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)
3: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)
4: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)

Here is the relevant info


sessionInfo()

R version 2.11.1 (2010-05-31)
x86_64-pc-mingw32

locale:
[1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] splines   tcltk stats graphics  grDevices utils datasets
[8] methods   base

other attached packages:
[1] cluster_1.12.3  WGCNA_0.93  Hmisc_3.8-2
[4] survival_2.35-8 qvalue_1.22.0   flashClust_1.00-2
[7] dynamicTreeCut_1.21 impute_1.22.0

loaded via a namespace (and not attached):
[1] grid_2.11.1 lattice_0.19-11 tools_2.11.1


> memory.size(NA)
[1] 8122.89

> memory.size()
[1] 5443.18

> memory.limit()
[1] 8122

> .Machine$sizeof.pointer
[1] 8

And this is what I am trying to do when I get this error message

> ls()
[1] "datExpr"

> print(object.size(datExpr), units = "auto")
23.5 Mb

> ADJ1 = ((1 + bicor(datExpr, use = "pairwise.complete.obs", maxPOutliers = 0.05,
+ quick = 0, pearsonFallback = "individual")) / 2)^8


If I understand the archives correctly my problem is with memory allocation of 
a large vector to the address space. Is there any way to get around this 
without having to use a Linux system?  Has anyone been able to solve this 
problem?

I appreciate any suggestions or help.

Mete Civelek


IMPORTANT WARNING: This email (and any attachments) is o...{{dropped:12}}

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.





[R] Memory allocation in 64 bit R

2010-10-01 Thread Mete Civelek
Hi Everyone,

I am getting the following error message

Error: cannot allocate vector of size 2.6 Gb
In addition: Warning messages:
1: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)
2: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)
3: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)
4: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)

Here is the relevant info

> sessionInfo()
R version 2.11.1 (2010-05-31)
x86_64-pc-mingw32

locale:
[1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] splines   tcltk stats graphics  grDevices utils datasets
[8] methods   base

other attached packages:
[1] cluster_1.12.3  WGCNA_0.93  Hmisc_3.8-2
[4] survival_2.35-8 qvalue_1.22.0   flashClust_1.00-2
[7] dynamicTreeCut_1.21 impute_1.22.0

loaded via a namespace (and not attached):
[1] grid_2.11.1 lattice_0.19-11 tools_2.11.1

> memory.size(NA)
[1] 8122.89
> memory.size()
[1] 5443.18
> memory.limit()
[1] 8122
> .Machine$sizeof.pointer
[1] 8

And this is what I am trying to do when I get this error message
> ls()
[1] "datExpr"
> print(object.size(datExpr), units = "auto")
23.5 Mb
> ADJ1 = ((1 + bicor(datExpr, use = "pairwise.complete.obs", maxPOutliers = 0.05,
    quick = 0, pearsonFallback = "individual"))/2)^8

If I understand the archives correctly my problem is with memory allocation of 
a large vector to the address space. Is there any way to get around this 
without having to use a Linux system?  Has anyone been able to solve this 
problem?

I appreciate any suggestions or help.

Mete Civelek


IMPORTANT WARNING: This email (and any attachments) is o...{{dropped:12}}



Re: [R] Memory allocation in 64 bit R

2010-10-01 Thread Peter Langfelder
Hi Mete,

I think you should look at the help for memory.limit. Try to set a
higher one, for example

memory.limit(16000)

(I think 16GB is what a Xeon will take).

Peter
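(Editorial aside: a minimal sketch of the check-then-raise pattern. memory.limit() only exists on Windows builds of that era, and was removed from later R versions; the 16000 value assumes the machine really has around 16GB of RAM.)

```r
# Windows-only sketch: inspect the current ceiling in Mb, then raise it.
if (.Platform$OS.type == "windows") {
  print(memory.limit())        # current limit, e.g. 8122
  memory.limit(size = 16000)   # request ~16 Gb; ignored if the OS refuses
}
```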

On Fri, Oct 1, 2010 at 6:02 PM, Mete Civelek <mcive...@mednet.ucla.edu> wrote:
 Hi Everyone,

 I am getting the following error message

 Error: cannot allocate vector of size 2.6 Gb
 In addition: Warning messages:
 1: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)
 2: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)
 3: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)
 4: In dim(res$res) = dim(bi) :
  Reached total allocation of 8122Mb: see help(memory.size)

 Here is the relevant info

 sessionInfo()
 R version 2.11.1 (2010-05-31)
 x86_64-pc-mingw32

 locale:
 [1] LC_COLLATE=English_United States.1252
 [2] LC_CTYPE=English_United States.1252
 [3] LC_MONETARY=English_United States.1252
 [4] LC_NUMERIC=C
 [5] LC_TIME=English_United States.1252

 attached base packages:
 [1] splines   tcltk     stats     graphics  grDevices utils     datasets
 [8] methods   base

 other attached packages:
 [1] cluster_1.12.3      WGCNA_0.93          Hmisc_3.8-2
 [4] survival_2.35-8     qvalue_1.22.0       flashClust_1.00-2
 [7] dynamicTreeCut_1.21 impute_1.22.0

 loaded via a namespace (and not attached):
 [1] grid_2.11.1     lattice_0.19-11 tools_2.11.1

 memory.size(NA)
 [1] 8122.89
 memory.size()
 [1] 5443.18
 memory.limit()
 [1] 8122
 .Machine$sizeof.pointer
 [1] 8

 And this is what I am trying to do when I get this error message
 ls()
 [1] "datExpr"
 print(object.size(datExpr), units = "auto")
 23.5 Mb
 ADJ1 = ((1 + bicor(datExpr, use = "pairwise.complete.obs", maxPOutliers = 0.05,
   quick = 0, pearsonFallback = "individual"))/2)^8

 If I understand the archives correctly my problem is with memory allocation 
 of a large vector to the address space. Is there any way to get around this 
 without having to use a Linux system?  Has anyone been able to solve this 
 problem?

 I appreciate any suggestions or help.

 Mete Civelek

 
 IMPORTANT WARNING: This email (and any attachments) is o...{{dropped:12}}



[R] Win Server x64/R: Memory Allocation Problem

2010-07-14 Thread will . eagle
Dear all,

how can I use R on a 64-bit Windows Server 2003 machine (24GB RAM) with more 
than 3GB of working memory and make full use of it?

I started R with --max-mem-size=3G since I got the warning that larger values 
are too large and ignored.

In R I got:

> memory.size(max=FALSE)
[1] 10.5
> memory.size(max=TRUE)
[1] 12.69
> memory.limit()
[1] 3072

but when I run the next command, I get an error:

climb.expset <- ReadAffy(celfile.path="./Data/Original/CLIMB/CEL/")
Error: cannot allocate vector of size 2.4 Gb

Here is the R version I am using:
platform   i386-pc-mingw32  
arch   i386 
os mingw32  
system i386, mingw32   
version.string R version 2.11.1 (2010-05-31)

What can I do?

Thanks a lot in advance,

Will



Re: [R] Win Server x64/R: Memory Allocation Problem

2010-07-14 Thread Dirk Eddelbuettel
On Wed, Jul 14, 2010 at 05:51:17PM +0200, will.ea...@gmx.net wrote:
 Dear all,
 
 how can I use R on a 64-bit Windows Server 2003 machine (24GB RAM) with more 
 than 3GB of working memory and make full use of it.
 
 I started R --max-mem-size=3G since I got the warning that larger values are 
 too large and ignored.
 
 In R I got: 
  memory.size(max=FALSE)
 [1] 10.5
  memory.size(max=TRUE)
 [1] 12.69
  memory.limit()
 [1] 3072
 
 but when I run the next command, I get an error:
 climb.expset <- ReadAffy(celfile.path="./Data/Original/CLIMB/CEL/")
 Error: cannot allocate vector of size 2.4 Gb
 
 Here is the R version I am using:
 platform   i386-pc-mingw32  
 arch   i386 
 os mingw32  
 system i386, mingw32   
 version.string R version 2.11.1 (2010-05-31)
 
 What can I do?

Maybe you want to consider switching to the 64-bit version of R.

-- 
  Regards, Dirk
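(Editorial aside: before chasing memory limits it is worth confirming which build is actually running; Will's version info shows i386, i.e. 32-bit. A quick check:)

```r
# Sketch: pointer size is 4 bytes on a 32-bit build, 8 on a 64-bit one.
arch_bits <- 8 * .Machine$sizeof.pointer
arch_bits        # 32 for i386 builds, 64 for x86_64
R.version$arch
```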



Re: [R] Memory allocation

2009-01-19 Thread Keith Ponting
Gabriel Margarido <gramarga at gmail.com> writes:

 ... I looked for a way to return the values
 without copying (even tried Rmemprof), but without success. Any ideas?
 ...

I solved similar problems using the R.oo package, which emulates
pass-by-reference semantics in 'R'.

HTH

Keith
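(Editorial aside, assuming the R.oo route is not wanted: base R environments also give reference semantics, so returning one does not copy the large array it contains. A minimal sketch with a stand-in array:)

```r
# Sketch: an environment is passed by reference, so wrapping the big
# array in one avoids the copy that building a plain list can trigger.
make_result <- function() {
  res <- new.env()
  res$bigarray <- numeric(1e6)  # stand-in for the >1 Gb array
  res$meta <- "fitted"
  res                           # returning the env copies a pointer only
}
out <- make_result()
length(out$bigarray)
```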



[R] Memory allocation

2009-01-16 Thread Gabriel Margarido
Hello everyone,

I have the following issue: one function generates a very big array (can be
more than 1 Gb) and returns a few variables, including this big one. Memory
allocation is OK while the function is running, but the final steps make
some copies that can be problematic. I looked for a way to return the values
without copying (even tried Rmemprof), but without success. Any ideas?
The code looks like this:

myfunc <- function() {
...
bigarray <- ...
...
final <- list(..., bigarray=bigarray, ...)
class(final) <- "myfunc"
final
}

Thank you in advance,
Gabriel.

[[alternative HTML version deleted]]



Re: [R] Memory allocation

2009-01-16 Thread Duncan Murdoch

On 1/16/2009 12:46 PM, Gabriel Margarido wrote:

Hello everyone,

I have the following issue: one function generates a very big array (can be
more than 1 Gb) and returns a few variables, including this big one. Memory
allocation is OK while the function is running, but the final steps make
some copies that can be problematic. I looked for a way to return the values
without copying (even tried Rmemprof), but without success. Any ideas?
The code looks like this:

myfunc <- function() {
...
bigarray <- ...
...
final <- list(..., bigarray=bigarray, ...)
class(final) <- "myfunc"
final
}

Thank you in advance,


I believe this will do less copying, but I haven't profiled it to be 
sure.  Replace the last three lines with this one statement:


structure(list(..., bigarray=bigarray, ...),
          class = "myfunc")

If that doesn't help, then you really need to determine where the 
copying is happening: you can use Rprofmem() to do that.



Duncan Murdoch



[R] Memory allocation problem (during kmeans)

2008-09-09 Thread rami batal
Dear all,

I am trying to apply kmeans clusterring on a data file (size is about 300
Mb)

I read this file using

x = read.table('file path', sep = "")

then I do kmeans(x, 25)

but the process stops after two minutes with an error :

Error: cannot allocate vector of size 907.3 Mb

When I read the archives I noticed that the best solution is to use a 64-bit
OS:

"Error messages beginning 'cannot allocate vector of size' indicate a failure
to obtain memory, either because the size exceeded the address-space limit
for a process or, more likely, because the system was unable to provide the
memory. Note that on a 32-bit OS there may well be enough free memory
available, but not a large enough contiguous block of address space into
which to map it."

The problem is that I have two machines with two OSs (32-bit and 64-bit), and
when I use the 64-bit OS the same error remains.

Thank you for any suggestions, and excuse me because I am a newbie.

Here the default information for the 64bit os:

> sessionInfo()
R version 2.7.1 (2008-06-23)
x86_64-redhat-linux-gnu

> gc()
 used (Mb) gc trigger (Mb) max used (Mb)
Ncells 137955  7.4 35 18.7   35 18.7
Vcells 141455  1.1 786432  6.0   601347  4.6

I also tried to start R using the options that control the available memory,
but the result is still the same; or maybe I don't assign the correct values.


Thank you in advance.

-- 
Rami BATAL

[[alternative HTML version deleted]]



Re: [R] Memory allocation problem (during kmeans)

2008-09-09 Thread Peter Dalgaard
rami batal wrote:
 Dear all,

 I am trying to apply kmeans clusterring on a data file (size is about 300
 Mb)

 I read this file using

 x = read.table('file path', sep = "")

 then i do kmeans(x,25)

 but the process stops after two minutes with an error :

 Error: cannot allocate vector of size 907.3 Mb

 when i read the archive i notice that the best solution is to use a 64bit
 OS.

 Error messages beginning cannot allocate vector of size indicate a failure
 to obtain memory, either because the size exceeded the address-space limit
 for a process or, more likely, because the system was unable to provide the
 memory. Note that on a 32-bit OS there may well be enough free memory
 available, but not a large enough contiguous block of address space into
 which to map it. 

 the problem that I have two machines with two OS (32bit and 64bit) and when
 i used the 64bit OS the same error remains.

 Thank you if you have any suggestions to me and excuse me because i am a
 newbie.

 Here the default information for the 64bit os:

 sessionInfo()
 R version 2.7.1 (2008-06-23)
 x86_64-redhat-linux-gnu

 gc()
  used (Mb) gc trigger (Mb) max used (Mb)
 Ncells 137955  7.4 35 18.7   35 18.7
 Vcells 141455  1.1 786432  6.0   601347  4.6

 I also tried to start R using the options that control the available memory,
 but the result is still the same; or maybe I don't assign the correct values.
It might be a good idea first to work out what the actual memory
requirements are. 64 bits does not help if you are running out of RAM
(+swap).
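(Editorial aside: a back-of-envelope helper for that calculation; the dimensions are hypothetical. kmeans works on a dense double matrix, so n rows by p columns cost n*p*8 bytes, and the algorithm needs working copies on top of that.)

```r
# Sketch: Gb needed for an n x p double matrix, times the number of
# simultaneous copies you expect to be live at once.
mat_gb <- function(n, p, copies = 1) n * p * 8 * copies / 2^30

# The reported 907.3 Mb vector corresponds to roughly 119 million doubles:
907.3 * 2^20 / 8
```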

-- 
   O__   Peter Dalgaard Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
 (*) \(*) -- University of Copenhagen   Denmark  Ph:  (+45) 35327918
~~ - ([EMAIL PROTECTED])  FAX: (+45) 35327907



[R] Memory allocation problem

2008-08-12 Thread Jamie Ledingham
Dear R users,
I am running a large loop over about 400 files. To outline generally,
the code reads in the initial data file, then uses lookup text files to
obtain more information before connecting to a SQL database using RODBC
and extracting more data.  Finally all this is polar plotted.
My problem is that when the loop gets through 170 odd files it gives the
error message:
Calloc could not allocate (263168 of 1) memory
I have increased the memory using memory.limit to the maximum amount.
I strongly suspect that R is holding data temporarily and that this
becomes too much to handle by the time the loop reaches 170.  Has anyone
had any experience of this problem before?  Is it possible to 'wipe' R's
memory at the end of each loop - all results are plotted and saved or
written to text file at the end of each loop so this may be the ideal
solution.
Thanks
Jamie Ledingham



Re: [R] Memory allocation problem

2008-08-12 Thread Kerpel, John
See ?gc - it may help.

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
On Behalf Of Jamie Ledingham
Sent: Tuesday, August 12, 2008 9:16 AM
To: r-help@r-project.org
Subject: [R] Memory allocation problem

Dear R users,
I am running a large loop over about 400 files. To outline generally,
the code reads in the initial data file, then uses lookup text files to
obtain more information before connecting to a SQL database using RODBC
and extracting more data.  Finally all this is polar plotted.
My problem is that when the loop gets through 170 odd files it gives the
error message:
Calloc could not allocate (263168 of 1) memory
I have increased the memory using memory.limit to the maximum amount.
I strongly suspect that R is holding data temporarily and that this
becomes too much to handle by the time the loop reaches 170.  Has anyone
had any experience of this problem before?  Is it possible to 'wipe' R's
memory at the end of each loop - all results are plotted and saved or
written to text file at the end of each loop so this may be the ideal
solution.
Thanks
Jamie Ledingham



Re: [R] Memory allocation problem

2008-08-12 Thread Roland Rau

Jamie Ledingham wrote:

becomes too much to handle by the time the loop reaches 170.  Has anyone
had any experience of this problem before?  Is it possible to 'wipe' R's
memory at the end of each loop - all results are plotted and saved or
written to text file at the end of each loop so this may be the ideal
solution.


Besides using gc() (see the email by John Kerpel), you might also consider 
removing all objects:

rm(list=ls())

I hope this helps,
Roland
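(Editorial aside: an alternative to wiping the global workspace is to give each iteration its own function scope, so temporaries become collectable as soon as the iteration returns. A self-contained sketch with throwaway temp files standing in for the 400 data files:)

```r
# Sketch: iteration-local objects are freed when process_one() returns,
# so memory cannot pile up across the loop.
process_one <- function(f) {
  dat <- readLines(f)   # iteration-local; eligible for collection on return
  length(dat)           # stand-in for the real plot/write-out step
}

files <- replicate(3, {
  tf <- tempfile()
  writeLines(c("a", "b"), tf)
  tf
})
results <- vapply(files, process_one, numeric(1))
gc()  # optionally force a collection between batches
results
```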



[R] Memory allocation failed: Copying Node

2008-06-25 Thread ppatel3026

The following code fails with a "Memory allocation failed: Copying Node" error
after parsing a few thousand files. I have included the main code (below) and
the functions (after the main code).

I am not sure which lines are causing the node copying that results in the
memory failure. Please advise.

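(Editorial aside, not part of the thread: one common cause of leaks with the XML package is that documents parsed into internal nodes hold C-level libxml2 memory that R's garbage collector does not reclaim; calling free() on each document before moving to the next file releases it. A minimal sketch, assuming the XML package is installed:)

```r
library(XML)

# Sketch: parse, extract, then explicitly release the C-level tree.
doc <- xmlTreeParse("<a><b>1</b></a>", asText = TRUE, useInternalNodes = TRUE)
val <- xmlValue(getNodeSet(doc, "//b")[[1]])
free(doc)   # releases libxml2 memory that rm(doc); gc() alone would not
val
```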
#Beginning of Code
for(i in 1:nrow(newFile)) {
  if(i %% 3000 == 0) gc()
  fname <- as.character(newFile$"File Name"[i])
  file = strsplit(fname, "/")[[1]][4]
  filein = "C:\\foldername\\" %+% file

  if((!file.exists(filein)) || (length(readLines(filein)) == 0)) {
    ftp <- paste("ftp://servername/", fname, sep="")
    fileout = filein
    try(download.file(url=ftp, destfile=fileout))
  }

  txt <- readLines(filein)
  if(length(txt) == 0){
    next
  }

  xmlInside <- grep("</*XML>", txt)
  xmlTxt <- txt[seq(xmlInside[1]+1, xmlInside[2]-1)]
  xml <- tryCatch(xmlMalformed2(filein), error = function(err)
    unProcessedFiles(filein))
  if(is.null(xml)) next

  if(is.null(xml)) {
    stop("File not processed: " %+% file)
  }

  processed=FALSE
  owner <- tryCatch(
    data.frame(datadate=xValHelper("periodOfReport"),
               CIK=xValHelper("issuerCik"),
               conm=xValHelper("issuerName"),
               tic=xValHelper("issuerTradingSymbol")),
    error = function(err) unProcessedFiles(filein)
  )
  if(is.null(owner)) next

  nodes <- getNodeSet(xml, "//nonDerivativeTransaction")
  if(xmlSize(nodes) > 0){
    processed <- tryCatch(processTransaction(owner, nodes, outputFile),
                          error = function(err) unProcessedFiles(filein))
    if(is.null(processed)) next
  }
}
#End of Code


#List of Functions
xmlMalformed2 <- function(filename) {
  quotes <- c("&\r\nquot;", "&q\r\nuot;", "&qu\r\not;", "&quo\r\nt;", "&quot\r\n;")
  amp <- c("&\r\namp;", "&a\r\nmp;", "&am\r\np;", "&amp\r\n;")

  xmlDoc <- NULL
  charStream <- readChar(filename, file.info(filename)$size)
  charStreamNew <- gsubfn("<[^>]*>", ~ gsub("[\r\n]", "", x), charStream)

  for(k in quotes) {
    if(length(grep(k, charStreamNew)) > 0) {
      charStreamNew <- sub(k, "&quot;", charStreamNew)
    }
  }

  for(v in amp) {
    if(length(grep(v, charStreamNew)) > 0) {
      charStreamNew <- sub(v, "&amp;", charStreamNew)
    }
  }
  charStreamNew <- gsub("&quot;", "\"", charStreamNew)
  charStreamNew <- gsub("&amp;", "and", charStreamNew)

  xmlVec <- readLines(textConnection(charStreamNew))
  xmlInDoc <- grep("</*XML>", xmlVec)
  xmlDoc <- xmlTreeParse(xmlVec[seq(xmlInDoc[1]+1, xmlInDoc[2]-1)],
                         useInternal=TRUE)
}

processTransaction <- function(rptOwner, nodes, outFile) {
  transaction <- data.frame(
    transdate=xValHelperSpecial(nodes, "transactionDate"),
    securityTitle=xValHelperSpecial(nodes, "securityTitle"),
    transactionShares=if(length(xValHelperSpecial(nodes, "transactionShares")) == 1)
      xValHelperSpecial(nodes, "transactionShares")[[1]] else
      xValHelperSpecial(nodes, "transactionShares"))

  out <- merge(rptOwner, transaction, all.x=TRUE)
  output <- cbind(out, file) # file - variable containing filename that data was read from
  write.table(output, file=outFile, append=TRUE, sep="\t", eol="\n",
              quote=FALSE, col.names=FALSE, row.names=FALSE)
  processed=TRUE
  return(processed)
}

unProcessedFiles <- function(filename) {
  write.table(filename, file="C:/errorFile.txt", append=TRUE, sep="\t",
              eol="\n", quote=FALSE, col.names=FALSE, row.names=FALSE)
  return(NULL)
}

#xValHelperSpecial and xValHelper are pretty similar, hence avoiding code for
xValHelper
xValHelperSpecial <- function(node, xtag) {
  nobs <- xmlSize(node)
  out <- NULL
  if(xtag == "tagName1") {
    for (n in seq(1:nobs)) {
      temp <- xpathApply(node[[n]], "//" %+% xtag, xmlValue)

      if(length(temp) > 0) {
        if (n==1) assign("out", gsub('(^ +)|( +$)', '', gsub('\n', '', temp[[1]]))) else
          assign("out", rbind(out, gsub('(^ +)|( +$)', '', gsub('\n', '', temp[[1]]))))
      } else {
        if (n==1) assign("out", NA) else