[R] Memory allocation problem (during kmeans)

2008-09-09 Thread rami batal
Dear all,

I am trying to apply k-means clustering to a data file (about 300 MB in
size).

I read this file using

x <- read.table('file path', sep = "")

and then I run kmeans(x, 25),

but the process stops after two minutes with an error:

Error: cannot allocate vector of size 907.3 Mb

When I searched the archives I noticed that the recommended solution is to
use a 64-bit OS:

"Error messages beginning 'cannot allocate vector of size' indicate a
failure to obtain memory, either because the size exceeded the
address-space limit for a process or, more likely, because the system was
unable to provide the memory. Note that on a 32-bit OS there may well be
enough free memory available, but not a large enough contiguous block of
address space into which to map it."
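As a rough sanity check (an illustrative calculation, not from the original report), the size of a failed allocation maps directly to an element count: a numeric (double) element takes 8 bytes, so a 907.3 Mb vector is roughly 119 million doubles.

```r
# A numeric (double) element takes 8 bytes, so the failed 907.3 Mb
# allocation corresponds to roughly 1.2e8 doubles in a single vector.
n_doubles <- 907.3 * 1024^2 / 8
n_doubles
```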

The problem is that I have two machines with two OSes (32-bit and 64-bit),
and when I use the 64-bit OS the same error occurs.

Thank you for any suggestions, and please excuse me, as I am a newbie.

Here is the default information for the 64-bit OS:

> sessionInfo()
R version 2.7.1 (2008-06-23)
x86_64-redhat-linux-gnu

> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 137955  7.4      35 18.7      35 18.7
Vcells 141455  1.1  786432  6.0  601347  4.6

I also tried starting R with the options that control the available
memory, but the result is still the same. Or maybe I am not assigning the
correct values.
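One way to trim peak memory at the read step (a sketch with synthetic data standing in for the real file; the whitespace-separated, all-numeric layout is an assumption): pass colClasses to read.table so it skips its type-guessing pass, and convert to a matrix once up front, since kmeans() works on a matrix internally anyway.

```r
# Sketch: synthetic data stands in for the real 300 MB file (assumed
# to be whitespace-separated numeric columns).
set.seed(1)
tmp <- tempfile()
write.table(matrix(rnorm(1000 * 5), ncol = 5), tmp,
            row.names = FALSE, col.names = FALSE)

# colClasses avoids read.table's type-guessing pass; converting to a
# matrix immediately lets the intermediate data frame be collected
# before clustering starts.
x  <- as.matrix(read.table(tmp, colClasses = "numeric"))
km <- kmeans(x, centers = 25)
```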


Thank you in advance.

-- 
Rami BATAL


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory allocation problem (during kmeans)

2008-09-09 Thread Peter Dalgaard
rami batal wrote:
 Dear all,

 I am trying to apply k-means clustering to a data file (about 300 MB in
 size).

 I read this file using

 x <- read.table('file path', sep = "")

 and then I run kmeans(x, 25),

 but the process stops after two minutes with an error:

 Error: cannot allocate vector of size 907.3 Mb

 When I searched the archives I noticed that the recommended solution is
 to use a 64-bit OS:

 "Error messages beginning 'cannot allocate vector of size' indicate a
 failure to obtain memory, either because the size exceeded the
 address-space limit for a process or, more likely, because the system was
 unable to provide the memory. Note that on a 32-bit OS there may well be
 enough free memory available, but not a large enough contiguous block of
 address space into which to map it."

 The problem is that I have two machines with two OSes (32-bit and
 64-bit), and when I use the 64-bit OS the same error occurs.

 Thank you for any suggestions, and please excuse me, as I am a newbie.

 Here is the default information for the 64-bit OS:

 > sessionInfo()
 R version 2.7.1 (2008-06-23)
 x86_64-redhat-linux-gnu

 > gc()
          used (Mb) gc trigger (Mb) max used (Mb)
 Ncells 137955  7.4      35 18.7      35 18.7
 Vcells 141455  1.1  786432  6.0  601347  4.6

 I also tried starting R with the options that control the available
 memory, but the result is still the same. Or maybe I am not assigning
 the correct values.

It might be a good idea first to work out what the actual memory
requirements are. 64 bits does not help if you are running out of RAM
(+swap).
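For example (with hypothetical dimensions, since the post gives only the file size), the footprint of the data alone is rows x columns x 8 bytes for doubles, and kmeans() needs working copies on top of that:

```r
# Hypothetical shape for a ~300 MB text file of numbers; the real
# dimensions come from dim(x) after reading the data.
rows <- 1e6
cols <- 50
data_mb <- rows * cols * 8 / 1024^2   # doubles are 8 bytes each
data_mb                               # Mb needed for the matrix alone
# Peak use during kmeans() is a small multiple of this, so RAM + swap
# must comfortably exceed it.
```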

-- 
   O__   Peter Dalgaard Ă˜ster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
 (*) \(*) -- University of Copenhagen   Denmark  Ph:  (+45) 35327918
~~ - ([EMAIL PROTECTED])  FAX: (+45) 35327907
