On Wed, 4 Feb 2004, Cere M. Davis wrote:

>
> Hi folks,
>
> I've got this funny problem with R's foreign library when reading stata
> files.  One file consistently produces vector out of memory errors after
> gobbling up 2.7G of memory.  I parsed through the read.dta function and
> figured out where the error occurs and the description is below.  I am
> running R-1.8.1 on a Debian stable system (glibc 2.2, kernel 2.4.24).  R
> is compiled from source as a shared library.  The file that I am reading
> is only 172M in size.  The system I am using has 4G of free memory and
> 8G of swap, so this doesn't seem to be a problem of lack of free memory.
> See below.


I thought this bug had already been fixed (Stefano Iacus reported it to me
a while back).  The problem occurs when a variable has a set of factor
names (value labels) assigned, but that set of names is not present in the
file -- it was not clear from the otherwise excellent Stata documentation
that this is possible in a valid .dta file.

Obviously the fix is not completely effective.  I'll look into it.
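In the meantime, a possible workaround is to bypass the factor conversion
entirely, since that is the step where the missing label set is looked up.
This is only a sketch (the file name is assumed), using the documented
convert.factors argument of read.dta:

```r
## Workaround sketch: read the raw integer codes instead of converting
## labelled variables to factors, which sidesteps the label-set lookup.
library(foreign)

dat <- read.dta("yourfile.dta", convert.factors = FALSE)

## The label-set names recorded in the file header are still attached to
## the result, so variables whose label set really is present can be
## converted by hand afterwards:
attr(dat, "val.labels")
```

Whether this avoids the memory blow-up in your case I can't say without
the file, but it should at least tell us if the factor conversion is the
culprit.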

        -thomas

Thomas Lumley                   Assoc. Professor, Biostatistics
[EMAIL PROTECTED]       University of Washington, Seattle

______________________________________________
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
