[R] memory allocation glitches

2007-08-14 Thread Ben Bolker
(not sure whether this is better for R-devel or R-help ...)

 I am currently trying to debug someone else's package (they're
not available at the moment, and I would like it to work *now*),
which among other things allocates memory for a persistent
buffer that gets used by various functions.

 The first symptoms of a problem were that some things just
didn't work under Windows but were (apparently) fine on Linux.
I don't have all the development tools installed for Windows, so
I started messing around under Linux, adding Rprintf() statements
to the main code.

 Once I did that, strange pointer-error-like inconsistencies started
appearing -- e.g., the properties of some of the persistent variables
would change if I did debug(function).  I'm wondering if anyone
has any tips on how to tackle this -- figure out how to use valgrind?
Do straight source-level debugging (R -d gdb etc.) and look for
obvious problems?  The package uses malloc/realloc rather than
Calloc/Realloc -- does it make sense to go through the code
replacing these all and see if that fixes the problem?

 cheers
   Ben Bolker



Re: [R] memory allocation glitches

2007-08-14 Thread Peter Dalgaard
Ben Bolker wrote:
 (not sure whether this is better for R-devel or R-help ...)

   
Hardcore debugging is usually better off in R-devel. I'm leaving it in 
R-help though.

  I am currently trying to debug someone else's package (they're
 not available at the moment, and I would like it to work *now*),
 which among other things allocates memory for a persistent
 buffer that gets used by various functions.

  The first symptoms of a problem were that some things just
 didn't work under Windows but were (apparently) fine on Linux.
 I don't have all the development tools installed for Windows, so
 I started messing around under Linux, adding Rprintf() statements
 to the main code.

  Once I did that, strange pointer-error-like inconsistencies started
 appearing -- e.g., the properties of some of the persistent variables
 would change if I did debug(function).  I'm wondering if anyone
 has any tips on how to tackle this -- figure out how to use valgrind?
 Do straight source-level debugging (R -d gdb etc.) and look for
 obvious problems?  The package uses malloc/realloc rather than
 Calloc/Realloc -- does it make sense to go through the code
 replacing these all and see if that fixes the problem?
   
Valgrind is a good idea to try and as I recall it, the basic 
incantations are not too hard to work out (now exactly where is it that 
we wrote them down?). It only catches certain error types though, mostly 
use of uninitialized data and read/write off the ends of allocated 
blocks of memory.
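
(They are written down in 'Writing R Extensions', as it happens; the basic form is
something like

  R -d valgrind --vanilla < bugscript.R
  R -d "valgrind --tool=memcheck --leak-check=full" --vanilla < bugscript.R

where bugscript.R stands for whatever minimal script reproduces the problem.)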

If that doesn't catch it, you get to play with R -d  gdb. However, my 
experience is that line-by-line tracing is usually a dead end, unless 
you have the trouble spot pretty well narrowed down.

Apart from that, my usual procedure would be

1) find a minimal script reproducing the issue and hang onto it. Or at 
least as small as you can get it without losing the bug. Notice that any 
change to either the script or R itself may allow the bug to run away 
and hide somewhere else.

2) if memory corruption is involved, run under gdb, set a hardware 
watchpoint on the relevant location (this gets a little tricky sometimes 
because it might be outside the initial address space, in which case you 
need to somehow run the code for a while, break to gdb, and then set the 
watchpoint).

3) It is not unlikely that the watchpoint triggers several thousand 
times before the relevant one. You can conditionalize it; a nice trick 
is to use the gc_count.
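
A rough sketch of such a session (the address is of course made up, and gc_count 
here is R's internal collection counter, assuming that symbol is visible to gdb):

  $ R -d gdb
  (gdb) run
  ... run the R code far enough that the buffer exists, then interrupt with Ctrl-C ...
  (gdb) watch *(double *) 0x89abc10       # hardware watchpoint on the suspect location
  Hardware watchpoint 2: *(double *) 0x89abc10
  (gdb) condition 2 gc_count > 150        # 2 is the watchpoint's number
  (gdb) continue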



[R] Memory allocation

2006-09-07 Thread alex lam \(RI\)
Dear list,

I have been trying to run the function qvalue under the package qvalue
on a vector with about 20 million values.

> asso_p.qvalue <- qvalue(asso_p.vector)
Error: cannot allocate vector of size 156513 Kb
> sessionInfo()
Version 2.3.1 (2006-06-01)
i686-pc-linux-gnu

attached base packages:
[1] methods   stats graphics  grDevices utils
datasets
[7] base

other attached packages:
qvalue
 1.1
> gc()
             used  (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells     320188   8.6   23540643  628.7  20464901  546.5
Vcells  101232265 772.4  294421000 2246.3 291161136 2221.4

I have been told that the linux box has 4Gb of RAM, so it should be able
to do better than this.
I searched the FAQ and found some tips on increasing memory size, but
they seem to be Windows-specific, such as memory.size() and the
--max-mem-size flag. On my Linux box R didn't recognise them.

I don't understand the meaning of max-vsize, max-nsize and max-ppsize.
Any help on how to increase the memory allocation on linux is much
appreciated.

Many thanks,
Alex


Alex Lam
PhD student
Department of Genetics and Genomics
Roslin Institute (Edinburgh)
Roslin
Midlothian EH25 9PS

Phone +44 131 5274471
Web   http://www.roslin.ac.uk



Re: [R] Memory allocation

2006-09-07 Thread Prof Brian Ripley
On Thu, 7 Sep 2006, alex lam (RI) wrote:

 Dear list,
 
 I have been trying to run the function qvalue under the package qvalue
 on a vector with about 20 million values.
 
 > asso_p.qvalue <- qvalue(asso_p.vector)
 Error: cannot allocate vector of size 156513 Kb
 > sessionInfo()
 Version 2.3.1 (2006-06-01)
 i686-pc-linux-gnu
 
 attached base packages:
 [1] methods   stats graphics  grDevices utils
 datasets
 [7] base
 
 other attached packages:
 qvalue
  1.1
 > gc()
              used  (Mb) gc trigger   (Mb)  max used   (Mb)
 Ncells     320188   8.6   23540643  628.7  20464901  546.5
 Vcells  101232265 772.4  294421000 2246.3 291161136 2221.4
 
 I have been told that the linux box has 4Gb of RAM, so it should be able
 to do better than this.

But it also has a 4Gb/process address space, and of that some (1Gb?) is 
reserved for the system.  So it is quite possible that with 2.2Gb used you 
are unable to find any large blocks.

 I searched the FAQ and found some tips on increasing memory size, but
 they seem to be Windows-specific, such as memory.size() and the
 --max-mem-size flag. On my Linux box R didn't recognise them.

?Memory-limits is the key

 Error messages beginning 'cannot allocate vector of size' indicate
 a failure to obtain memory, either because the size exceeded the
 address-space limit for a process or, more likely, because the
 system was unable to provide the memory.  Note that on a 32-bit OS
 there may well be enough free memory available, but not a large
 enough contiguous block of address space into which to map it.

 I don't understand the meaning of max-vsize, max-nsize and max-ppsize.
 Any help on how to increase the memory allocation on linux is much
 appreciated.

Get a 64-bit OS.
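
For scale (assuming asso_p.vector is stored as doubles, which p-values are): the 
request that fails is essentially one more full-length copy of your vector,

  > 156513 * 1024 / 8     # Kb reported in the error, converted to a count of doubles
  [1] 20033664

i.e. a ~153 Mb block that cannot be found once ~2.2 Gb of a 32-bit address space is 
already committed, as your gc() output shows.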

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



[R] memory allocation

2005-12-15 Thread Bill Hunsicker
R-Help,
 
I am running R 2.1 on the Windows (XP) platform.  I am reading in a 50 MB CSV
file and R is running out of memory.  I have tried to invoke R with
--max-mem-size and this also generates an error.  Since I think that I
have about 200 MB of free memory and the file is on the order of 50
MB, is there a way to invoke R and allocate sufficient memory?
 
Regards,
Bill
 



Bill Hunsicker
RF Micro Devices
7625 Thorndike Road
Greensboro, NC 27409-9421
[EMAIL PROTECTED]
336-678-5260(w)
610-597-9985(m)
336-678-5088(lab)


 




[R] Memory allocation

2005-10-12 Thread Rod Staggs
I am trying to work with 75 affymetrix U133plus2 chips and am running into
memory allocation errors when trying to merge or convert probe level data to
expression values.

I keep getting - Error: cannot allocate vector of size 561011 Kb and that is
simply with a data subset.

Is there a way around this limitation?

-- 
Rodney A. Staggs
Cancer Center Informatics Shared Resource
425 Delaware St S.E. MMC 806
University of Minnesota
Minneapolis, MN  55455
 
Office: Room B532 Mayo
Telephone: (612)624-2445
Email: [EMAIL PROTECTED]



Re: [R] Memory allocation

2005-10-12 Thread James W. MacDonald
Rod Staggs wrote:
 I am trying to work with 75 affymetrix U133plus2 chips and am running into
 memory allocation errors when trying to merge or convert probe level data to
 expression values.

You don't say what package(s) you are using to do this, but assuming you 
are using the affy package, this is not the correct list - you should be 
using the BioC list ([EMAIL PROTECTED]).

To answer your question, there are several possibilities.

1.) Get more RAM (if on Windows, start R with --max-mem-size=amount of 
RAM)

2.) Use justRMA() (see the sketch after this list)

3.) If on Windows, switch to *nix. The memory allocation is better, so 
you can analyze more chips with the same amount of RAM. Probably the 
easiest way to do this is with Quantian, which runs off a CD.
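
For 2.), a minimal sketch, assuming the affy package and that your CEL files are 
in the working directory:

  library(affy)
  eset <- justRMA()        # goes straight from the CEL files to RMA expression
                           # values, without holding the probe-level AffyBatch
                           # in memory
  exprs(eset)[1:5, 1:5]    # peek at the result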


Best,

Jim



 
 I keep getting - Error: cannot allocate vector of size 561011 Kb and that is
 simply with a data subset.
 
 Is there a way around this limitation?
 


-- 
James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive
7410 CCGC
Ann Arbor MI 48109
734-647-5623



Re: [R] memory allocation failures

2005-06-21 Thread Prof Brian Ripley
Most likely your linear model fitting is too large for the amount of 
memory you have in your Windows box.  However, you should check ?Memory 
and in particular how your maximum RAM usage is set.  (I don't know how 
the `JGR JRI tools' give you access to R's startup settings.)

It may be that you are creating lots of large objects and so filling 
memory that way: try running gc() before the call to check your memory 
usage.  If so, you can remove (or save()) previous work.
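
Schematically, something along those lines between fits might look like (the object 
name is just the one from your transcript):

  gc()                                            # how much is in use before the next lm()?
  save(.model_resp10_1, file = "resp10_1.RData")  # keep a finished model on disk ...
  rm(.model_resp10_1)                             # ... drop it from the workspace ...
  gc()                                            # ... and give the space back to the allocator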

On Mon, 20 Jun 2005, Matthew Padilla wrote:

 Hi,

 I am running R version rw2010 on a Windows 2000 desktop.  I am invoking R
 from Java via the JGR JRI tools.  My process consists of repeated calls to R
 in order to create linear models and process the resulting statistics.  I
 find, however, that the process often dies due to memory allocation errors:

 lm command:
 .model_resp10_1=lm(resp10_1~rannor10+rannor11+rannor13+rannor14+rannor15+rannor18+rannor23+rannor26+rannor28+rannor29+rannor32+rannor33+rannor35+rannor36+rannor39+rannor40+rannor43+rannor44+rannor46+rannor47+rannor48+rannor50+rannor51+rannor53+rannor55+rannor56+rannor57+rannor59).
 Garbage collection 172 = 54+41+77 (level 2) ...
 626936 cons cells free (75%)
 12.5 Mbytes of heap free (15%)
 Error: cannot allocate vector of size 2265 Kb

 I have tried to remedy this situation by corresponding calls to gc(), but
 this does not seem to fix the problem.  My files are not that large - about
 1000 records.  Additionally, general communication to R from Java from JRI
 does seem to work - I only run into problems when repeatedly creating models
 as demonstrated above.

 Any help would be greatly appreciated - I am very much an R newbie.

 Thank you,
 Matthew



-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



[R] memory allocation failures

2005-06-20 Thread Matthew Padilla
Hi,

I am running R version rw2010 on a Windows 2000 desktop.  I am invoking R 
from Java via the JGR JRI tools.  My process consists of repeated calls to R 
in order to create linear models and process the resulting statistics.  I 
find, however, that the process often dies due to memory allocation errors:

lm command: 
.model_resp10_1=lm(resp10_1~rannor10+rannor11+rannor13+rannor14+rannor15+rannor18+rannor23+rannor26+rannor28+rannor29+rannor32+rannor33+rannor35+rannor36+rannor39+rannor40+rannor43+rannor44+rannor46+rannor47+rannor48+rannor50+rannor51+rannor53+rannor55+rannor56+rannor57+rannor59).
Garbage collection 172 = 54+41+77 (level 2) ...
626936 cons cells free (75%)
12.5 Mbytes of heap free (15%)
Error: cannot allocate vector of size 2265 Kb

I have tried to remedy this situation by corresponding calls to gc(), but 
this does not seem to fix the problem.  My files are not that large - about 
1000 records.  Additionally, general communication to R from Java from JRI 
does seem to work - I only run into problems when repeatedly creating models 
as demonstrated above.

Any help would be greatly appreciated - I am very much an R newbie.

Thank you,
Matthew



[R] memory allocation problem under linux

2005-06-12 Thread [EMAIL PROTECTED]
I have some compiled code that works under winXp but not under linux (kernel
2.6.10-5). I'm also using R 2.1.0
After debugging, I've discovered that this code:
  #define NMAX 256
  long **box;
  ...
  box   = (long **)R_alloc(NMAX,   sizeof(long *));

gives a null pointer, so subsequent line:
  for (i=0; i<NMAX; i++) box[i] = (long *) R_alloc(NMAX, sizeof(long));
gives a SIGSEGV signal.
In the same shared library, I have a function with this code:
  partitions=16;
  ...
  h2=(long **)R_alloc(partitions,sizeof(long *));
  for (i=0; i<partitions; i++) 
  h2[i]=(long *)R_alloc(partitions,sizeof(long));
that works! Naturally, I've tried to change NMAX from 256 to 16, without any
success.

Any idea on where the problem can reside? (Note that this does not happen under 
WinXP.)
And just another question: when R_alloc fails, should it terminate the function
with an error, without returning control to the function?



[R] memory allocation problem under linux

2005-06-12 Thread [EMAIL PROTECTED]
I've written:

  #define NMAX 256
  long **box;
  ...
  box   = (long **)R_alloc(NMAX,   sizeof(long *));
gives a null pointer, so subsequent line:
  for (i=0; i<NMAX; i++) box[i] = (long *) R_alloc(NMAX, sizeof(long));
gives a SIGSEGV signal.

Sorry, that's not exact: I have a segmentation fault just *inside* R_alloc!
Substituting R_alloc with malloc and Calloc gives the same error.



[R] memory allocation

2004-11-16 Thread Massimiliano Copetti

Dear sirs,

I'm using the Splancs package to compute standard errors of the estimates of a 
spatio-temporal k function.
When I use vectors of spatial and temporal distances that are too long (60 and 80 
entries respectively) for a dataset of 1000 observations, R gives me the message

Error: cannot allocate vector of size 18 Kb

Reached total allocation of 512 Mb.

I ran the function memory.size() and obtained 39200 more or less.

Can anyone help me? 

Thanks in advance.

Massimiliano Copetti



-- 
Massimiliano Copetti, PhD Student
Institute of Quantitative Methods
L.Bocconi University
Viale Isonzo 25
20135 Milano (Italy)
http://www.unibocconi.it
http://spazioinwind.libero.it/maxcop78



Re: [R] memory allocation

2004-11-16 Thread Prof Brian Ripley
There is more to that message you have not reproduced.
I am pretty sure you are on Windows, and you need to read the rw-FAQ (as 
the posting guide asks you to).  See Q2.7.

On Tue, 16 Nov 2004, Massimiliano Copetti wrote:
I'm using the Splancs package to compute standard errors of the 
estimates of a spatio-temporal k function. When I use as spatial and 
temporal distances too long vectors (respectively 60 and 80 entries) for 
a dataset of 1000 observations, R gives me the message

Error: cannot allocate vector of size 18 Kb
Reached total allocation of 512 Mb.
I ran the function memory.size() and obtained 39200 more or less.
Yes, but see its help page for the relevant usage.
--
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595


Re: [R] memory allocation

2004-11-16 Thread Roger Bivand
On Tue, 16 Nov 2004, Massimiliano Copetti wrote:

 
 Dear sirs,
 
 I'm using the Splancs package to compute standard errors of the
 estimates of a spatio-temporal k function. When I use as spatial and
 temporal distances too long vectors (respectively 60 and 80 entries) for
 a dataset of 1000 observations, R gives me the message

The memory demands arise in stvmat() called inside stsecal(), in which an 
(s*tm)*(s*tm) numeric matrix is allocated, described in the code as a 
full spacetime variance/covariance matrix, and which is necessarily 
large. In your case (60*80)^2 = 23040000 elements are used of 8 bytes each, so to do 
this in the way you describe you need more RAM - the matrix may be copied 
as well. The authors of the package and the underlying method did not 
anticipate needing this high degree of spatial and temporal resolution.
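
In R terms, the demand for a single copy alone is roughly

  s <- 60; tm <- 80       # the numbers of spatial and temporal distances you use
  (s * tm)^2 * 8 / 2^20   # elements in the (s*tm) x (s*tm) matrix, at 8 bytes each
  ## [1] 175.78125        # about 176 Mb per copy, against your 512 Mb limit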

Roger Bivand

 
 Error: cannot allocate vector of size 18 Kb
 
 Reached total allocation of 512 Mb.
 
 I ran the function memory.size() and obtained 39200 more or less.
 
 Can anyone help me? 
 
 Thanks in advance.
 
 Massimiliano Copetti
 
 
 
 

-- 
Roger Bivand
Economic Geography Section, Department of Economics, Norwegian School of
Economics and Business Administration, Breiviksveien 40, N-5045 Bergen,
Norway. voice: +47 55 95 93 55; fax +47 55 95 93 93
e-mail: [EMAIL PROTECTED]



[R] memory allocation error message

2004-09-14 Thread Prodromos Zanis
Dear all

I use the library(netCDF) to read in NCEP data. The file I want to read has size 113 
Mb.
When i try to read it I get the following message:

Error: cannot allocate vector of size 221080 Kb
In addition: Warning message: 
Reached total allocation of 255Mb: see help(memory.size) 

I get a similar message when I try to read a file with 256 Mb in a PC with 2 GigaByte 
RAM.

Is there something that I can do to handle this problem of reading big netCDF files 
with R-project?

I look forward for your help.

Prodromos Zanis



Dr. Prodromos Zanis
Research Centre for Atmospheric Physics and Climatology 
Academy of Athens
3rd September 131, Athens 11251, Greece
Tel. +30 210 8832048
Fax: +30 210 8832048
e-mail: [EMAIL PROTECTED]
Web address: http://users.auth.gr/~zanis/
*




Re: [R] memory allocation error message

2004-09-14 Thread Peter Dalgaard
Prodromos Zanis [EMAIL PROTECTED] writes:

 Dear all
 
 I use the library(netCDF) to read in NCEP data. The file I want to
 read has size 113 Mb. When i try to read it I get the following
 message:
 
 Error: cannot allocate vector of size 221080 Kb
 In addition: Warning message: 
 Reached total allocation of 255Mb: see help(memory.size) 
 
 I get a similar message when I try to read a file with 256 Mb in a
 PC with 2 GigaByte RAM.
 
 Is there something that I can do to handle this problem of reading
 big netCDF files with R-project.

Did you read help(memory.size)? and follow instructions therein?

-- 
   O__   Peter Dalgaard Blegdamsvej 3  
  c/ /'_ --- Dept. of Biostatistics 2200 Cph. N   
 (*) \(*) -- University of Copenhagen   Denmark  Ph: (+45) 35327918
~~ - ([EMAIL PROTECTED]) FAX: (+45) 35327907



Re: [R] memory allocation error message

2004-09-14 Thread Uwe Ligges
Prodromos Zanis wrote:
Dear all
I use the library(netCDF) to read in NCEP data. The file I want to read has size 113 
Mb.
When i try to read it I get the following message:
Error: cannot allocate vector of size 221080 Kb
In addition: Warning message: 
Reached total allocation of 255Mb: see help(memory.size) 
So this is an R version < 1.9.0!
1. Please upgrade.
2. Please read ?Memory and learn how to increase the maximum amount of 
memory consumed by R under Windows.
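
As a sketch, from within a running Windows session (see ?memory.size for the 
details, and for the corresponding --max-mem-size command-line flag):

  memory.limit()              # report the current limit
  memory.limit(size = 1024)   # request a 1024 Mb limit (it can only be increased)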

Uwe Ligges

I get a similar message when I try to read a file with 256 Mb in a PC with 2 GigaByte 
RAM.
Is there something that I can do to handle this problem of reading big netCDF files 
with R-project.
I look forward for your help.
Prodromos Zanis

Dr. Prodromos Zanis
Research Centre for Atmospheric Physics and Climatology 
Academy of Athens
3rd September 131, Athens 11251, Greece
Tel. +30 210 8832048
Fax: +30 210 8832048
e-mail: [EMAIL PROTECTED]
Web address: http://users.auth.gr/~zanis/
*



Re: [R] memory allocation error message

2004-09-14 Thread Prof Brian Ripley
On 14 Sep 2004, Peter Dalgaard wrote:

 Prodromos Zanis [EMAIL PROTECTED] writes:
 
  Dear all
  
  I use the library(netCDF) to read in NCEP data. The file I want to
  read has size 113 Mb. When i try to read it I get the following
  message:
  
  Error: cannot allocate vector of size 221080 Kb
  In addition: Warning message: 
  Reached total allocation of 255Mb: see help(memory.size) 
  
  I get a similar message when I try to read a file with 256 Mb in a
  PC with 2 GigaByte RAM.
  
  Is there something that I can do to handle this problem of reading
  big netCDF files with R-project.
 
 Did you read help(memory.size)? and follow instructions therein?

Also, netCDF has been withdrawn from CRAN, and you might want to use ncdf 
or RNetCDF instead.  (Windows ports of both are available now: see the 
ReadMe on the CRAN windows contrib area.)
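
With ncdf, for instance, you can also read just a hyperslab rather than the whole 
variable, which keeps the vectors R has to allocate small.  A sketch only -- the 
file and variable names here are invented, so adjust them to your data:

  library(ncdf)
  nc  <- open.ncdf("ncep.nc")
  slp <- get.var.ncdf(nc, "slp",
                      start = c(1, 1, 1),       # read one slab of the array,
                      count = c(144, 73, 12))   # not all of it at once
  close.ncdf(nc)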

If the message really was similar you are using R < 1.6.0 and need to 
upgrade.

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



Re: [R] memory allocation and interrupts

2004-06-12 Thread Prof Brian Ripley
On Fri, 11 Jun 2004, Vadim Ogranovich wrote:

 A recent discussion on the list about tryCatch and signals made me think
 about memory allocation and signals in C extension modules. What happens
 to the memory allocated by R_alloc and Calloc if the user pressed Ctr-C
 during the call? R-ext doesn't seem to discuss this. I'd guess that
 R_alloc is interrupt-safe while Calloc is not, but I am not sure. In any
 case a paragraph in R-ext on signals would be helpful.

Easy: such code is not interruptible by Ctrl-C (sic).  And that *is* in
R_exts.* (sic), even with an entry `interrupts' in its index!


-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



Re: [R] memory allocation and interrupts

2004-06-12 Thread Peter Dalgaard
Prof Brian Ripley [EMAIL PROTECTED] writes:

 On Fri, 11 Jun 2004, Vadim Ogranovich wrote:
 
  A recent discussion on the list about tryCatch and signals made me think
  about memory allocation and signals in C extension modules. What happens
  to the memory allocated by R_alloc and Calloc if the user pressed Ctr-C
  during the call? R-ext doesn't seem to discuss this. I'd guess that
  R_alloc is interrupt-safe while Calloc is not, but I am not sure. In any
  case a paragraph in R-ext on signals would be helpful.
 
 Easy: such code is not interruptible by Ctrl-C (sic).  And that *is* in
 R_exts.* (sic), even with an entry `interrupts' in its index!

The programmer might reenable interrupts though. This is not something
that we have a precedence (nor a policy) for, but it has crossed my
mind a couple of times. 

In some cases it should be possible to wrap time-consuming C code in a
setjmp/longjmp construct. It's not possible to do it generally because
all hell breaks loose if the C code calls back into R, but not all C
code does that. 

Of course it is critically important that the code resets the
interrupt handling to a sane state when it is done, so it would be
nice if we could abstract a reasonably safe construction into a
RUN_INTERRUPTIBLE() macro.

-- 
   O__   Peter Dalgaard Blegdamsvej 3  
  c/ /'_ --- Dept. of Biostatistics 2200 Cph. N   
 (*) \(*) -- University of Copenhagen   Denmark  Ph: (+45) 35327918
~~ - ([EMAIL PROTECTED]) FAX: (+45) 35327907



[R] memory allocation and interrupts

2004-06-11 Thread Vadim Ogranovich
Hi,
 
A recent discussion on the list about tryCatch and signals made me think
about memory allocation and signals in C extension modules. What happens
to the memory allocated by R_alloc and Calloc if the user pressed Ctr-C
during the call? R-ext doesn't seem to discuss this. I'd guess that
R_alloc is interrupt-safe while Calloc is not, but I am not sure. In any
case a paragraph in R-ext on signals would be helpful.
 
While looking around for interrupts handling in the code I came across
BEGIN_SUSPEND_INTERRUPTS macro in Defn.h file. Unfortunately, it is not
available via R.h or Rinternals.h. Am I missing something? If not, could
future releases of R make it available via, say, Rinternals.h?
 
Thanks,
Vadim



RE: [R] Memory allocation

2003-12-23 Thread JFRI (Jesper Frickmann)
Go download R version 1.8.1 (or later if available). They fixed
something with memory management from 1.8.0 to 1.8.1 which helped me out
of the exact same problem. I think it has to do with memory
fragmentation; R cannot find any chunk big enough for even a small
vector after some time.

Kind regards, 
Jesper Frickmann 
Statistician, Quality Control 
Novozymes North America Inc. 
Tel. +1 919 494 3266
Fax +1 919 494 3460

-Original Message-
From: Richards, Thomas [mailto:[EMAIL PROTECTED] 
Sent: Monday, December 22, 2003 3:11 PM
To: '[EMAIL PROTECTED]'
Subject: [R] Memory allocation


Hello:

I am trying to work with a couple of microarray data sets, using

platform i386-pc-mingw32
arch i386   
os   mingw32
system   i386, mingw32  
status  
major1  
minor8.1
year 2003   
month11 
day  21 
language R  


In the shortcut for invoking R I have set --max-mem-size=1024M, so that
I get

> memory.limit()
[1] 1073741824

Below is an example of what keeps happening as I am working. Any
suggestions as to how I can stop running out of memory?

> memory.size()
[1] 502904736
> log2.harvAD <- log2.harvAD[log2.harvAD$Probesets %in%
harvard.genes$probeset,]
Error: cannot allocate vector of size 49 Kb
> log2.harvAD <- log2.harvAD[,c(1,1+order(names(log2.harvAD)[-1]))]
Error: cannot allocate vector of size 49 Kb
> log2.harvAD.annot <- 
 unlist(lapply(strsplit(names(log2.harvAD),split=":"),
+   function(L) L[1]))[-1]
> log2.harvAD$probeset <- as.character(log2.harvAD$probeset)
Error: cannot allocate vector of size 49 Kb
> memory.size()
[1] 502912536
> gc()
            used  (Mb) gc trigger  (Mb)
Ncells  2586025  69.1    6812252 182.0
Vcells 20108076 153.5   41205530 314.4
> memory.size()
[1] 330645720
> memory.limit()/memory.size()
[1] 3.247408
> ##  Try again:
> log2.harvAD <- log2.harvAD[log2.harvAD$Probesets %in%
harvard.genes$probeset,]
Error: cannot allocate vector of size 49 Kb



[R] Memory allocation

2003-12-22 Thread Richards, Thomas
Hello:

I am trying to work with a couple of microarray data sets, using

platform i386-pc-mingw32
arch i386   
os   mingw32
system   i386, mingw32  
status  
major1  
minor8.1
year 2003   
month11 
day  21 
language R  


In the shortcut for invoking R I have set --max-mem-size=1024M, so that I
get

> memory.limit()
[1] 1073741824

Below is an example of what keeps happening as I am working. Any suggestions
as to how I can stop running out of memory?

> memory.size()
[1] 502904736
> log2.harvAD <- log2.harvAD[log2.harvAD$Probesets %in%
harvard.genes$probeset,]
Error: cannot allocate vector of size 49 Kb
> log2.harvAD <- log2.harvAD[,c(1,1+order(names(log2.harvAD)[-1]))]
Error: cannot allocate vector of size 49 Kb
> log2.harvAD.annot <- unlist(lapply(strsplit(names(log2.harvAD),split=":"),
+   function(L) L[1]))[-1]
> log2.harvAD$probeset <- as.character(log2.harvAD$probeset)
Error: cannot allocate vector of size 49 Kb
> memory.size()
[1] 502912536
> gc()
            used  (Mb) gc trigger  (Mb)
Ncells  2586025  69.1    6812252 182.0
Vcells 20108076 153.5   41205530 314.4
> memory.size()
[1] 330645720
> memory.limit()/memory.size()
[1] 3.247408
> ##  Try again:
> log2.harvAD <- log2.harvAD[log2.harvAD$Probesets %in%
harvard.genes$probeset,]
Error: cannot allocate vector of size 49 Kb



Re: [R] Memory allocation

2003-12-22 Thread Prof Brian Ripley
Not really much hope here, but

1) If you have fast discs, try increasing --max-mem-size to more than your 
RAM (e.g. as sketched below), and

2) Try compiling up R-devel (see the FAQ for where to get it) as it has
a potentially better memory allocator.
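
For 1), that just means editing the Target field of the shortcut you start R from, 
e.g. (the install path is only an example for 1.8.1; adjust the size to taste):

  "C:\Program Files\R\rw1081\bin\Rgui.exe" --max-mem-size=1500M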

I suspect though that your problem is too big for R on 32-bit Windows.

BDR

On Mon, 22 Dec 2003, Richards, Thomas wrote:

 Hello:
 
   I am trying to work with a couple of microarray data sets, using
 
 platform i386-pc-mingw32
 arch i386   
 os   mingw32
 system   i386, mingw32  
 status  
 major1  
 minor8.1
 year 2003   
 month11 
 day  21 
 language R  
 
 
 In the shortcut for invoking R I have set --max-mem-size=1024M, so that I
 get
 
 > memory.limit()
 [1] 1073741824
 
 Below is an example of what keeps happening as I am working. Any suggestions
 as to how I can stop running out of memory?
 
 > memory.size()
 [1] 502904736
 > log2.harvAD <- log2.harvAD[log2.harvAD$Probesets %in%
 harvard.genes$probeset,]
 Error: cannot allocate vector of size 49 Kb
 > log2.harvAD <- log2.harvAD[,c(1,1+order(names(log2.harvAD)[-1]))]
 Error: cannot allocate vector of size 49 Kb
 > log2.harvAD.annot <- unlist(lapply(strsplit(names(log2.harvAD),split=":"),
 +   function(L) L[1]))[-1]
 > log2.harvAD$probeset <- as.character(log2.harvAD$probeset)
 Error: cannot allocate vector of size 49 Kb
 > memory.size()
 [1] 502912536
 > gc()
             used  (Mb) gc trigger  (Mb)
 Ncells  2586025  69.1    6812252 182.0
 Vcells 20108076 153.5   41205530 314.4
 > memory.size()
 [1] 330645720
 > memory.limit()/memory.size()
 [1] 3.247408
 > ##  Try again:
 > log2.harvAD <- log2.harvAD[log2.harvAD$Probesets %in%
 harvard.genes$probeset,]
 Error: cannot allocate vector of size 49 Kb
 
 
 

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



[R] R memory allocation error - Unix

2003-11-21 Thread Susan Shortreed


I am using ESS on a unix system for my analysis.  My R environment
contains a 90118 by 94 dataframe.  I am trying to calculate the mean of a
column in this data frame and I am getting the following error:

Error: can not allocate a vector of size 704 Kb

I have tried
options(memory=100)
and this does not help.

when I call gc() this is what is returned
> gc()
            used (Mb) gc trigger  (Mb)
Ncells  1178845 31.5    2564037  68.5
Vcells 1183 89.1   38686231 295.2

I tried calling mem.limits(nsize=1).  Any value for vsize
gives an NA error, and when I recall gc() the limit for Vcells is NA.
There is more than enough memory available on the Unix machine, when I
call top I am using 0.0% of the memory and the other handful of users are
using about 10% all together.  I have increased my  user memory limit
and that still did not help (I found an email in R-help archives
suggesting this).  It seems to me that 704Kb is a rather small size to
give an error and it appears to be available on the system.

Any suggestions?

Thank you,
Susan



RE: [R] R memory allocation error - Unix

2003-11-21 Thread Liaw, Andy
 From: Susan Shortreed [mailto:[EMAIL PROTECTED] 
 
 I am using ESS on a unix system for my analysis.  My R 
 environment contains a 90118 by 94 dataframe.  I am trying to 
 calculate the mean of a column in this data frame and I am 
 getting the following error:
 
 Error: can not allocate a vector of size 704 Kb
 
 I have tried
 options(memory=100)
 and this does not help.
 
 when I call gc() this is what is returned
  > gc()
             used (Mb) gc trigger  (Mb)
 Ncells  1178845 31.5    2564037  68.5
 Vcells 1183 89.1   38686231 295.2
 
 I tried calling mem.limits(nsize=1).  Any value 
 for vsize gives an NA error, and when I recall gc() the limit 
 for Vcells is NA. There is more than enough memory available 
 on the Unix machine, when I call top I am using 0.0% of the 
 memory and the other handful of users are using about 10% all 
 together.  I have increased my  user memory limit and that 
 still did not help (I found an email in R-help archives 
 suggesting this).  It seems to me that 704Kb is a rather 
 small size to give an error and it appears to be available on 
 the system.
 
 Any suggestions?

What exactly is this unix system?  This could be important, because I was
recently made aware of a problem on AIX 5, where R was compiled as 32-bit.
By default the R process will only have access to 256 MB of RAM, even though
ulimit -a reports unlimited and the machine has GBs of RAM.  I had to set
an environment variable before starting R to get it to use up to 4GB of
memory.
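
(If memory serves, the variable in question on AIX is LDR_CNTRL -- that name is my 
recollection rather than something checked, so verify it locally -- set in the shell 
before starting R, e.g.:

  export LDR_CNTRL=MAXDATA=0x80000000   # raise the 32-bit data segment limit, here to 2 GB
  R
)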

This may not be your problem, but you do need to provide more details about
your system.

Andy

 
 Thank you,
 Susan
 




Re: [R] Memory allocation, IBM-AIX and R-1.6.2 - addendum

2003-02-11 Thread Luke Tierney
On 10 Feb 2003, Peter Dalgaard BSA wrote:

 Laurent Gautier [EMAIL PROTECTED] writes:
 
  ...sorry for the spam, but answers I get to my previous
  question suggest that I should specify that the
  machine has a *lot* of memory and should be able
  to the instanciation...
  
  Anybody with an IBM mainframe, R-1.6.2 and large
  matrices ?
 
 For whatever it's worth, that size does not appear to be a problem in
 Linux. Takes a while, and a lot of memory, to get a summary of the
 matrix though (I'm slightly puzzled by that: The memory footprint
 appears to shoot up to 1.7G for something that is just working
 on the columns of a 319M matrix?).

If the AIX you are using is 32 bit (I forget if they support 64) then
you have to remember that, no matter how much memory you have, you are
starting to run up against address space limits.  It is only fairly
recent Linux malloc's that are able to give a process more than 1GB of
address space, and the way things are done when the 1G threshold is
crossed is not necessarily very efficient.  (1G is usually the point
in the address space where the traditional heap has to stop because
the shared library loader is mapped there).  AIX will be different in
detail but the basic issue is the same: 2^32 = 4G only leaves you so
much room to do the things the OS needs to do.

luke

-- 
Luke Tierney
University of Iowa  Phone: 319-335-3386
Department of Statistics andFax:   319-335-3017
   Actuarial Science
241 Schaeffer Hall  email:  [EMAIL PROTECTED]
Iowa City, IA 52242 WWW:  http://www.stat.uiowa.edu
