Re: [R] memory allocation problem

2016-12-06 Thread Jeff Newmiller
Buy more memory? Do something different than you were doing before the error 
occurred? Use a search engine to find what other people have done when this 
message appeared? Follow the recommendations in the Posting Guide mentioned in 
the footer of this and every post on this mailing list? 
-- 
Sent from my phone. Please excuse my brevity.

On December 6, 2016 7:40:40 AM PST, Elham via R-help wrote:
>Hi everyone,
>I tried to run my code in RStudio, but I received this error
>message. What should I do?
>Error: cannot allocate vector of size 12.1 Gb
>In addition: Warning messages:
>1: In cor(coding.rpkm[grep("23.C", coding.rpkm$name), -1],
>ncoding.rpkm[grep("23.C",  :
>  Reached total allocation of 6027Mb: see help(memory.size)



[R] memory allocation problem

2016-12-06 Thread Elham - via R-help
Hi everyone,
I tried to run my code in RStudio, but I received this error message. What
should I do?
Error: cannot allocate vector of size 12.1 Gb
In addition: Warning messages:
1: In cor(coding.rpkm[grep("23.C", coding.rpkm$name), -1], 
ncoding.rpkm[grep("23.C",  :
  Reached total allocation of 6027Mb: see help(memory.size)


[R] Memory allocation problem (again!)

2012-02-08 Thread Christofer Bogaso
Dear all, I know this problem has been discussed many times in this forum,
but unfortunately I could not find a way out of my own problem. I am having
a memory allocation problem while generating a large quantity of random
numbers. Here is my description:

> rnorm(50000*6000)
Error: cannot allocate vector of size 2.2 Gb
In addition: Warning messages:
1: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
> memory.size(TRUE)
[1] 15.75
> rnorm(50000*6000)
Error: cannot allocate vector of size 2.2 Gb
In addition: Warning messages:
1: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In rnorm(50000 * 6000) :
  Reached total allocation of 1535Mb: see help(memory.size)

And the session info is here:

> sessionInfo()
R version 2.14.0 (2011-10-31)
Platform: i386-pc-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_United States.1252  LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252 LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] graphics  grDevices utils     datasets  grid      stats     methods   base

other attached packages:
[1] ggplot2_0.8.9 proto_0.3-9.2 reshape_0.8.4 plyr_1.6      zoo_1.7-6

loaded via a namespace (and not attached):
[1] lattice_0.20-0

I am using Windows 7 (Home version) with 4 GB of RAM (2.16 GB usable, as my
computer reports). So in my case, is it not possible to generate a random
vector of such a length? Note that generating such a vector is my primary
job; later I need to do something with that vector. Those jobs include:
1. Create a matrix with 50,000 rows.
2. Get the row sums.
3. Then report some metrics on those sum values (at least 50,000 elements
must be there).

Can somebody help me with a real solution/suggestion?

Thanks and regards,



Re: [R] Memory allocation problem (again!)

2012-02-08 Thread Justin Haynes
32-bit Windows has a memory limit of 2 GB. Upgrading to a computer that's
less than 10 years old is the best path.

But short of that, if you're just generating random data, why not do it in
two or more pieces and combine them later?

mat.1 <- matrix(rnorm(50000*2000), nrow=50000)
mat.2 <- matrix(rnorm(50000*2000), nrow=50000)
mat.3 <- matrix(rnorm(50000*2000), nrow=50000)

mat.1.sums <- rowSums(mat.1)
mat.2.sums <- rowSums(mat.2)
mat.3.sums <- rowSums(mat.3)

mat.sums <- mat.1.sums + mat.2.sums + mat.3.sums  # add the per-chunk row sums
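
(The same idea as a loop, releasing each slice before making the next, so
only one chunk is ever in memory; a sketch only, with illustrative chunk
sizes, and row.totals is a name invented here:)

n.rows <- 50000
row.totals <- numeric(n.rows)              # running row sums
for (i in 1:6) {
  chunk <- matrix(rnorm(n.rows * 1000), nrow = n.rows)
  row.totals <- row.totals + rowSums(chunk)
  rm(chunk)                                # drop the slice...
  gc()                                     # ...and give its memory back
}
summary(row.totals)                        # metrics on the 50,000 sums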





Re: [R] Memory allocation problem (again!)

2012-02-08 Thread Ernest Adrogué
On 8-02-2012, 22:22 (+0545), Christofer Bogaso wrote:
> And the Session info is here:
>
> > sessionInfo()
> R version 2.14.0 (2011-10-31)
> Platform: i386-pc-mingw32/i386 (32-bit)

Not an expert, but I think that 32-bit applications can only address
up to 2GB on Windows.
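
(For what it's worth, a quick way to check which build of R you are
running; a small sketch, not from the original thread:)

> .Machine$sizeof.pointer   # 4 on a 32-bit build of R, 8 on a 64-bit build
[1] 4
> R.version$arch            # "i386" here, i.e. a 32-bit build
[1] "i386"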

-- 
Bye,
Ernest



[R] Memory allocation problem

2011-04-08 Thread Luis Felipe Parra
Hello, I am running a program in R with a big number of simulations and
I am getting the following error:

Error: no se puede ubicar un vector de tamaño 443.3 Mb
(i.e., cannot allocate a vector of size 443.3 Mb)

I don't understand why, because when I check the memory status on my PC I
get the following:

> memory.size()
[1] 676.3
> memory.size(T)
[1] 1124.69
> memory.limit()
[1] 4000

which should in theory allow a vector of size 443.3 Mb. I am running it on
a Windows PC with 4 GB of RAM and an Intel Core i7 processor. Does anybody
know what might be going on?

Thank you

Felipe Parra


Re: [R] Memory allocation problem

2011-04-08 Thread Joshua Wiley
Hi Felipe,

On Fri, Apr 8, 2011 at 7:54 PM, Luis Felipe Parra
felipe.pa...@quantil.com.co wrote:
It is not that *a* vector of size 443 MB could not be allocated given
your system. However, during your simulation, multiple objects take up
memory, and that 443 MB vector was just the one that finally proved too
big to allocate. Depending on how your simulation is set up, you might
be able to remove objects that are no longer needed, or rewrite it in a
less memory-intensive form. I do not know enough about memory and R to
offer specific advice on solutions; hopefully someone else here can
chime in.
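
For illustration, the kind of cleanup this points at might look like the
following (a sketch only; the object names are hypothetical, not from
your code):

results <- numeric(100)                       # keep only small summaries
for (i in 1:100) {
  sim.data <- matrix(rnorm(1e6), ncol = 100)  # large temporary object
  results[i] <- mean(rowSums(sim.data))       # reduce it right away
  rm(sim.data)                                # drop the big object...
  gc()                                        # ...so its memory can be reused
}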

Good luck,

Josh



-- 
Joshua Wiley
Ph.D. Student, Health Psychology
University of California, Los Angeles
http://www.joshuawiley.com/



Re: [R] memory allocation problem

2010-11-03 Thread Lorenzo Cattarino
Following on my memory allocation problem...

I tried to run my code on our university HPU facility, requesting 61 GB
of memory, and it still cannot allocate a vector of size 5 MB.

> load('/home/uqlcatta/test_scripts/.RData')
>
> myfun <- function(Range, H1, H2, p, coeff)
+ {
+   -(coeff[1]+coeff[2]*H1+coeff[3]*H2+coeff[4]*p) *
+     exp(-(coeff[5]+coeff[6]*H1+coeff[7]*H2+coeff[8]*p)*Range) +
+     coeff[9]+coeff[10]*H1+coeff[11]*H2+coeff[12]*p
+ }
>
> SS <- function(coeff, steps, Range, H1, H2, p)
+ {
+   sum((steps - myfun(Range, H1, H2, p, coeff))^2)
+ }
>
> coeff <- c(1,1,1,1,1,1,1,1,1,1,1,1)
>
> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+                    Range=org_results$Range, H1=org_results$H1,
+                    H2=org_results$H2, p=org_results$p)
Error: cannot allocate vector of size 5.0 Mb
Execution halted

Might it be a problem with the function?

Any input is very much appreciated

Lorenzo

Re: [R] memory allocation problem

2010-11-03 Thread Lorenzo Cattarino
Thanks for all your suggestions,

This is what I get after removing all the other (not useful) objects and
running my code:

> getsizes()
                [,1]
org_results 47240832
myfun          11672
getsizes        4176
SS              3248
coeff            168
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA

> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+                    Range=org_results$Range, H1=org_results$H1,
+                    H2=org_results$H2, p=org_results$p)
Error: cannot allocate vector of size 5.0 Mb
In addition: Warning messages:
1: In optim(coeff, SS, steps = org_results$no.steps, Range = org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
2: In optim(coeff, SS, steps = org_results$no.steps, Range = org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
3: In optim(coeff, SS, steps = org_results$no.steps, Range = org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)
4: In optim(coeff, SS, steps = org_results$no.steps, Range = org_results$Range,  :
  Reached total allocation of 4055Mb: see help(memory.size)


It seems that R is using all of the default available memory (4 GB, which
is the RAM of my machine):

> memory.limit()
[1] 4055
> memory.size()
[1] 4049.07


My dataframe has a size of 47240832 bytes, or about 45 Mb, so memory usage
should not be a problem?

I do not understand what is going on.

Thanks for your help anyway

Lorenzo


Re: [R] memory allocation problem

2010-11-03 Thread Jonathan P Daily
The optim function is very resource-hungry. I have had similar problems
in the past when dealing with extremely large datasets.

What is perhaps happening is that each 'step' of the optimization
algorithm stores some info so that it can compare it to the next 'step';
while the original vector may only be a few Mb of data, over many
iterations a huge amount of memory is allocated to the optimization
steps.

Maybe look at the control options under ?optim, particularly stuff like
trace, fnscale, ndeps, etc., that may cut down on the amount of data
being stored at each step as well as the number of steps needed.
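
For example (a sketch with a toy objective function, not Lorenzo's SS();
the control values are illustrative):

f <- function(par) sum((par - 1:4)^2)       # stand-in objective
fit <- optim(par = rep(0, 4), fn = f, method = "BFGS",
             control = list(trace  = 0,     # no progress output
                            maxit  = 50,    # cap the number of steps
                            reltol = 1e-6)) # stop earlier on small gains
fit$par                                     # roughly 1 2 3 4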

Good luck!
--
Jonathan P. Daily
Technician - USGS Leetown Science Center
11649 Leetown Road
Kearneysville WV, 25430
(304) 724-4480
"Is the room still a room when it's empty? Does the room,
 the thing itself have purpose? Or do we, what's the word... imbue it."
 - Jubal Early, Firefly




[R] memory allocation problem

2010-11-02 Thread Lorenzo Cattarino
I forgot to mention that I am using Windows 7 (64-bit) and R version
2.11.1 (64-bit).

 

Thank you 

 

Lorenzo

 



[R] memory allocation problem

2010-11-02 Thread Lorenzo Cattarino
Hi R users 

 

I am trying to run a non-linear parameter optimization using the
function optim(), and I have problems regarding memory allocation.

 

My data are in a dataframe with 9 columns. There are 656100 rows.

> head(org_results)

  comb.id   p H1 H2 Range Rep no.steps      dist aver.hab.amount
1       1 0.1  0  0 11000        0.2528321       0.1393901
2       1 0.1  0  0 11000        0.4605934       0.1011841
3       1 0.1  0  0 11004        3.4273670       0.1052789
4       1 0.1  0  0 11004        2.8766364       0.1022138
5       1 0.1  0  0 11000        0.3496872       0.1041056
6       1 0.1  0  0 11000        0.1050840       0.3572036

 

 

> est_coeff <- optim(coeff, SS, steps=org_results$no.steps,
+                    Range=org_results$Range, H1=org_results$H1,
+                    H2=org_results$H2, p=org_results$p)

 

Error: cannot allocate vector of size 5.0 Mb

In addition: Warning messages:

1: In optim(coeff, SS, steps = org_results$no.steps, Range = org_results$Range,  :
  Reached total allocation of 10000Mb: see help(memory.size)

2: In optim(coeff, SS, steps = org_results$no.steps, Range = org_results$Range,  :
  Reached total allocation of 10000Mb: see help(memory.size)

3: In optim(coeff, SS, steps = org_results$no.steps, Range = org_results$Range,  :
  Reached total allocation of 10000Mb: see help(memory.size)

4: In optim(coeff, SS, steps = org_results$no.steps, Range = org_results$Range,  :
  Reached total allocation of 10000Mb: see help(memory.size)

 

 

> memory.size()
[1] 9978.19
> memory.limit()
[1] 10000

 

 

 

I know that I am not sending reproducible code, but I was hoping that you
could help me understand what is going on. I set a maximum limit of 10000
megabytes (by appending the string --max-mem-size=10000M to the target
path: right-click on the R icon, Shortcut tab). And R is telling me that
it cannot allocate a vector of size 5 Mb???

 

Thank you for your help

 

Lorenzo




[R] memory allocation problem

2010-11-02 Thread Lorenzo Cattarino
I would also like to include details on my R version

 

> version
               _
platform       x86_64-pc-mingw32
arch           x86_64
os             mingw32
system         x86_64, mingw32
status
major          2
minor          11.1
year           2010
month          05
day            31
svn rev        52157
language       R
version.string R version 2.11.1 (2010-05-31)

From FAQ 2.9
(http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021)
it says:

"For a 64-bit build, the default is the amount of RAM"

So in my case the amount of RAM would be 4 GB. R should be able to
allocate a vector of size 5 Mb without me typing any command (either
memory.limit() or an appended string in the target path), is that right?

 

Thank you a lot

 

Lorenzo

 



Re: [R] memory allocation problem

2010-11-02 Thread Peter Langfelder
You have (almost) exhausted the 10GB you limited R to (that's what
memory.size() tells you). Increase memory.limit (if you have more RAM,
use memory.limit(15000) for 15GB, etc.), or remove large data objects
from your session: use rm(object), then issue a garbage collection with
gc(). Sometimes garbage collection may solve the problem on its own.
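
(In other words, a sketch; memory.limit() is Windows-only, and the object
name is just an example from this thread:)

> memory.limit(15000)   # raise the cap to 15 GB, if the RAM is there
> rm(org_results)       # drop a large object you no longer need
> gc()                  # ask R to return the freed memory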

Peter


On Tue, Nov 2, 2010 at 5:55 PM, Lorenzo Cattarino l.cattar...@uq.edu.au wrote:
> I forgot to mention that I am using windows 7 (64-bit) and the R version
> 2.11.1 (64-bit)





Re: [R] memory allocation problem

2010-11-02 Thread David Winsemius
Restart your computer. (Yeah, I know that's what the help-desk always
says.)
Start R before doing anything else.

Then run your code in a clean session. Check ls() after start-up to
make sure you don't have a bunch of useless stuff in your .Rdata
file. Don't load anything that is not germane to this problem. Use
this function to see what sort of space issues you might have after
loading objects:


getsizes <- function() {
    z <- sapply(ls(envir = globalenv()),
                function(x) object.size(get(x)))
    (tmp <- as.matrix(rev(sort(z))[1:10]))
}

Then run your code.

--
David.


David Winsemius, MD
West Hartford, CT



Re: [R] memory allocation problem

2010-11-02 Thread Peter Langfelder
Oops, I missed that you only have 4GB of memory... but since R is
apparently capable of using almost 10GB, either you actually have more
RAM, or the system is swapping some data to disk. Increasing the memory
limit in R might still help, but it may also lead to a situation where
the system waits forever for data to be swapped to and from the disk.

Peter



[R] Win Server x64/R: Memory Allocation Problem

2010-07-14 Thread will . eagle
Dear all,

how can I use R on a 64-bit Windows Server 2003 machine (24 GB RAM) with
more than 3 GB of working memory, and make full use of it?

I started R with --max-mem-size=3G, since for larger values I got the
warning that they are too large and would be ignored.

In R I got:
> memory.size(max=FALSE)
[1] 10.5
> memory.size(max=TRUE)
[1] 12.69
> memory.limit()
[1] 3072

but when I run the next command, I get an error:
> climb.expset <- ReadAffy(celfile.path="./Data/Original/CLIMB/CEL/")
Error: cannot allocate vector of size 2.4 Gb

Here is the R version I am using:
platform   i386-pc-mingw32  
arch   i386 
os mingw32  
system i386, mingw32   
version.string R version 2.11.1 (2010-05-31)

What can I do?

Thanks a lot in advance,

Will



Re: [R] Win Server x64/R: Memory Allocation Problem

2010-07-14 Thread Dirk Eddelbuettel
On Wed, Jul 14, 2010 at 05:51:17PM +0200, will.ea...@gmx.net wrote:
Maybe you want to consider switching to the 64-bit version of R.

-- 
  Regards, Dirk



[R] Memory allocation problem (during kmeans)

2008-09-09 Thread rami batal
Dear all,

I am trying to apply kmeans clustering to a data file (about 300 MB in
size).

I read this file using

x <- read.table('file path', sep=" ")

then I do kmeans(x, 25),

but the process stops after two minutes with an error:

Error: cannot allocate vector of size 907.3 Mb

When I read the archive I noticed that the best solution is to use a
64-bit OS:

"Error messages beginning cannot allocate vector of size indicate a
failure to obtain memory, either because the size exceeded the
address-space limit for a process or, more likely, because the system
was unable to provide the memory. Note that on a 32-bit OS there may
well be enough free memory available, but not a large enough contiguous
block of address space into which to map it."

The problem is that I have two machines with two OSes (32-bit and
64-bit), and when I used the 64-bit OS the same error remained.

Thank you if you have any suggestions for me, and excuse me because I am
a newbie.

Here is the default information for the 64-bit OS:

> sessionInfo()
R version 2.7.1 (2008-06-23)
x86_64-redhat-linux-gnu

> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 137955  7.4     350000 18.7   350000 18.7
Vcells 141455  1.1     786432  6.0   601347  4.6

I also tried to start R using the options that control the available
memory, and the result is still the same; or maybe I am not assigning
the correct values.


Thank you in advance.

-- 
Rami BATAL



Re: [R] Memory allocation problem (during kmeans)

2008-09-09 Thread Peter Dalgaard
rami batal wrote:
It might be a good idea first to work out what the actual memory
requirements are. 64 bits does not help if you are running out of RAM
(+swap).
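
As a rough sketch (the dimensions here are made up for illustration): a
numeric matrix costs about nrow * ncol * 8 bytes, and kmeans() needs
several working copies on top of the data itself.

> n <- 4e6; p <- 30          # hypothetical data dimensions
> n * p * 8 / 2^20           # one numeric copy, in Mb
[1] 915.5273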

-- 
   O__   Peter Dalgaard Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
 (*) \(*) -- University of Copenhagen   Denmark  Ph:  (+45) 35327918
~~ - ([EMAIL PROTECTED])  FAX: (+45) 35327907



[R] Memory allocation problem

2008-08-12 Thread Jamie Ledingham
Dear R users,
I am running a large loop over about 400 files. To outline generally,
the code reads in the initial data file, then uses lookup text files to
obtain more information before connecting to a SQL database using RODBC
and extracting more data. Finally, all of this is polar plotted.
My problem is that when the loop gets through 170-odd files it gives the
error message:
Calloc could not allocate (263168 of 1) memory
I have increased the memory using memory.limit to the maximum amount.
I strongly suspect that R is holding data temporarily and that this
becomes too much to handle by the time the loop reaches 170. Has anyone
had any experience of this problem before? Is it possible to 'wipe' R's
memory at the end of each iteration? All results are plotted and saved
or written to a text file at the end of each iteration, so this may be
the ideal solution.
Thanks
Jamie Ledingham



Re: [R] Memory allocation problem

2008-08-12 Thread Kerpel, John
See ?gc - it may help.



Re: [R] Memory allocation problem

2008-08-12 Thread Roland Rau

Jamie Ledingham wrote:
> becomes too much to handle by the time the loop reaches 170. Has anyone
> had any experience of this problem before? Is it possible to 'wipe' R's
> memory at the end of each loop - all results are plotted and saved or
> written to text file at the end of each loop so this may be the ideal
> solution.


Besides using gc() (see the email by John Kerpel), you might also
consider removing all objects:

rm(list=ls())
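
Since Jamie's results are saved at the end of each iteration, a selective
version inside the loop may be safer than wiping everything (a sketch;
the object names here are hypothetical):

file.list <- list.files()              # stand-in for the 400 input files
keep <- c("file.list", "keep", "i")    # loop machinery to preserve
for (i in seq_along(file.list)) {
    ## ... read the file, query the database via RODBC, plot ...
    rm(list = setdiff(ls(), keep))     # drop everything not in 'keep'
    gc()                               # and hand the memory back
}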

I hope this helps,
Roland
