Thanks for the tips, Roger.
fyi: When I added /3GB to boot.ini, the resulting desktop was incomplete
and locked up - no chance to even try starting R. Searching the web led me
to believe that this was possibly a dead end, so I abandoned the effort.
Any hints on getting this to work, anyone?
On 7 Mar 2007 at 8:26, Bos, Roger wrote:
David,
I wouldn't give up on Windows so fast. Many people have gotten the 3Gb
switch to work. One used to have to modify the header of the Rgui.exe
program to use the switch, but now the binary comes ready for that, so
it's really quite easy. I would like to hear more about why it's not
working for you.
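(For anyone following along: the /3GB switch goes at the end of the OS entry in C:\boot.ini. A typical entry might look like the line below - the multi/disk/partition path and the OS description will differ on your machine, so edit your existing line rather than copying this one.)

```
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB
```

After rebooting, you can try raising R's limit with memory.limit(size=3072) and check the current value with memory.limit() - assuming a Windows build of R, where those functions are available.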
As for Linux, I use FC5 for which there is a 64-bit binary. But there
are also 64-bit binaries for other distros. The 32-bit and 64-bit
binaries are in different directories, so you should have no trouble
telling them apart.
I have heard good things about Ubuntu--mainly that it's very easy to
use--but FC5 has been pretty easy to learn too, and I use the KDE desktop,
which gives me Kate as a text editor. You can open a terminal window in
Kate to run R and set up a key like F10 to send code from the editor
to R. It's not quite as good as my Windows setup with Tinn-R, but almost
as good.
Thanks,
Roger
-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]
Sent: Tuesday, March 06, 2007 5:37 PM
To: Bos, Roger
Subject: RE: [R] Memory Limits in Ubuntu Linux
Thanks for your prompt reply!
The Windows 3GB switch is quite problematic - it was not usable on my
machine, and there are comments about these problems around the net.
Thus, on to Linux. My machine has 4 Gig, and some megabytes are grabbed
by my Asus motherboard, leaving some 3.56 Gig.
So if I understand your suggestion, try the 64-bit version of Ubuntu
(based on Debian but I had better luck with the video part of the
install) and then use the corresponding image from CRAN. My fear is that
the CRAN Ubuntu version might be 32-bit - any idea how to find out
before I embark on another install? Which Linux do you have - you
described some significant success with getting large jobs to run.
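(One way to check - admittedly only after R is installed, not before - is to ask R itself about its word width. A quick sketch, using only base R:)

```r
# 8 bytes on a 64-bit build of R, 4 bytes on a 32-bit build
.Machine$sizeof.pointer

# The build architecture is also reported directly, e.g. "x86_64"
R.version$arch
```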
And yes, I've worked hard to save memory by tweaking the code.
Thanks again.
On 6 Mar 2007 at 16:51, Bos, Roger wrote:
David,
First of all, under Windows you can get about 3GB available to R by
using the /3GB switch in your boot.ini file, assuming you have 4GB of
memory installed on your Windows machine. Using that method, I have
seen the memory usage of my R process get as big as 2.7GB in Task
Manager. What's important, of course, is contiguous space, as you
mentioned. There, you may want to check your code closely and make
sure that its memory usage is as efficient as possible and that you are
storing the minimal amount you need for each run. If you don't need
an object for a while, consider writing it to disk and reading it back
in later.
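(A minimal sketch of that write-to-disk idea in base R - the object and file names here are just placeholders:)

```r
# Stand-in for a large intermediate result
big <- matrix(rnorm(1e6), ncol = 100)

save(big, file = "big.RData")  # park the object on disk
rm(big)                        # drop it from the workspace
gc()                           # let R return the memory to the pool

# ... run the memory-hungry part of the job here ...

load("big.RData")              # restore 'big' when it is needed again
```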
Second, AFAIK, to get any benefit from more memory in Linux you have to
go to the 64-bit version. I am a Linux newbie too, so I chose to use
one of the pre-compiled binaries available on CRAN. In other words,
you shouldn't have to compile anything yourself. How much memory do
you have on your Linux box? I have 16GB, and I know I have run stuff
that wouldn't run on my 4GB Windows box.
HTH,
Roger
-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of
[EMAIL PROTECTED]
Sent: Tuesday, March 06, 2007 3:44 PM
To: r-help@stat.math.ethz.ch
Subject: [R] Memory Limits in Ubuntu Linux
I am an R user trying to get around the 2GB memory limit in Windows,
so here I am days later with a working Ubuntu, and R under Ubuntu. But
- the memory problems seem worse than ever. R code that worked under
Windows fails, unable to allocate memory.
Searching around the web, it appears that the problem may be finding
contiguous memory for my big vectors, but a fresh boot of Ubuntu does
not help either.
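(In case it helps anyone diagnose this, here is a sketch of how to watch memory use from inside R, using only base functions - the vector here is just an illustration, not my actual data:)

```r
# A million doubles is about 8 MB
x <- numeric(1e6)
print(object.size(x), units = "Mb")  # report the size of one object

gc()  # run the garbage collector and print current/max memory used
```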
Which way to go?
1) Try to install the 64-bit version for a bigger address space. Would
this help? Is it workable on my Athlon 64 dual-core? (The live CD seems
to work, but I never got it to boot after a disk install; then again,
the 386 version was no better until I learned more about Grub... I
could try again if this might solve the problem.)
2) Recompile R for bigger memory capability? (I'll have to cross-post
to some R forums too.) This will be a challenge for a Linux
newbie... like me.
3) Any other suggestions? My goal is to create a bigger neural network
than fits in my Windows R version.
--
David Katz
www.davidkatzconsulting.com
541 482-1137
__
R-help@stat.math.ethz.ch mailing list