Dear Tyler,
Thanks a lot for your attention! I tested my system using your suggestion (with
'top') and my Linux is using all the memory (4 GB). I am using kernel
2.6.32-25-generic-pae #45-Ubuntu.
The main problem is that I do not know how to make R use all this memory.
When I run the commands gc(), system("free") or system("cat /proc/meminfo")
inside R, it seems that I have a lot of memory not being used by R. My R is
only using 2 GB. Please see below, and maybe test your system:
Start R and then type:
> gc()
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells  131067  3.5     350000  9.4   350000  9.4
Vcells   81430  0.7     786432  6.0   434604  3.4
> system("free")
total used free shared buffers cached
Mem: 3911252 1601280 2309972 0 165884 622388
-/+ buffers/cache: 813008 3098244
Swap: 1950712 342876 1607836
> system("cat /proc/meminfo")
total: used: free: shared: buffers: cached:
Mem: 1589321728 1497116672 92205056 0 181149696 945790976
Swap: 1998733312 12656640 1986076672
MemTotal: 1552072 kB
MemFree: 90044 kB
MemShared: 0 kB
Buffers: 176904 kB
In contrast, my friend's Mac, which is 5 years old and also has 4 GB, showed a
completely different memory allocation result:
> gc()
          used (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells  481875 25.8     984024   52.6    984024   52.6
Vcells  481512  3.7  140511341 1072.1 641928461 4897.6
I tried to start R with some memory pre-allocated (please see below), but it
did not work.
R --min-vsize=10M --max-vsize=3G --min-nsize=10M --max-nsize=3G
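One thing worth checking, independent of the R startup flags above, is whether the shell imposes a per-process memory limit that would cap R no matter what flags are passed. A minimal check (my own suggestion, assuming a standard Linux shell):

```shell
# Show per-process limits that any R session will inherit; 'unlimited'
# means the shell imposes no cap. Note that on a 32-bit kernel a single
# process still cannot address much more than ~3 GB, whatever these say.
ulimit -v   # virtual memory (address space), in kB
ulimit -m   # resident set size, in kB
```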
Best,
Fernando
From: Tyler Smith <[email protected]>
Subject: Re: [R-sig-Debian] R memory allocation in Linux
To: [email protected]
Date: Wednesday, 10 November 2010, 14:14
ricardo souza <[email protected]> writes:
> The only thing that is very strange for me is how the performance of
> the Mac is much better at allocating memory than Linux and Windows,
> both 32-bit.
As far as I understand it, this is not true. Windows places more
restrictions on memory use than Linux, but I don't know about Macs.
> I am using a 32-bit Linux OS with a Windows partition. I believe that
> is the problem.
I disagree. The Windows partition has no effect on how much memory is
available, and I can use more than 2GB of RAM in R on Debian here. If
what you mean to say is you can complete an analysis in Windows but not
in Linux on the same machine, then you may have the wrong kernel
installed for Linux.
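If you are unsure which kernel you actually booted, a quick check (my addition, not part of Tyler's original mail) is:

```shell
# Print the running kernel release; on 32-bit you want a -pae (Ubuntu)
# or -bigmem (Debian) variant to address a full 4 GB of RAM.
uname -r
```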
> R just understands that I have 2 GB of RAM, although I have 4 GB
> available. Is that true? Has somebody else had this problem before?
>
It is possible that you are using a kernel that doesn't use all of your
memory. You can check how much memory the system can use by running the
command 'top' from the command line. Towards the top of the screen
you'll see an entry Mem: NNNNNNNN total. If NNNNNNNN isn't close to the
number you expect (i.e., 4 GB, or approx. 4000000 kB), then you need a
different kernel.
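The same figure that top reports can also be read directly from /proc/meminfo, for example:

```shell
# MemTotal is the RAM the kernel can see, in kB; with 4 GB installed
# and a suitable kernel this should print a number near 4000000.
awk '/^MemTotal:/ {print $2, $3}' /proc/meminfo
```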
This was an issue for me with Debian testing and the 486 kernels a year
or two back. I think the 686 kernels all recognize 4GB of RAM now. I'm
using 2.6.32-5-686 here and I have access to 3GB of RAM. For 4GB you
might need the 686-bigmem kernel. All are available through your package
manager.
HTH,
Tyler
ps, just noticed that this was cross-posted to r.geo. I don't follow
that list, but OS-specific questions like this are almost certainly
off-topic there.
_______________________________________________
R-SIG-Debian mailing list
[email protected]
https://stat.ethz.ch/mailman/listinfo/r-sig-debian