Re: [R] Memory Limits in Ubuntu Linux

2007-03-07 Thread Bos, Roger
David,

I wouldn't give up on Windows so fast.  Many people have gotten the /3GB
switch to work. One used to have to modify the header of the Rgui.exe
program to use the switch, but now the binary comes ready for that, so
it's really quite easy.  I would like to hear more about why it's not
working for you.
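
A quick way to see what an R session is actually being allowed, assuming a
standard R 2.x for Windows build (these functions are Windows-only and report
sizes in MB):

   memory.size()            # MB currently in use by this R session
   memory.size(max = TRUE)  # maximum MB obtained from Windows so far
   memory.limit()           # the current cap on the R process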

As for Linux, I use FC5, for which there is a 64-bit binary, but there
are also 64-bit binaries for other distros.  The 32-bit and 64-bit
binaries are in different directories, so you should have no trouble
telling them apart.
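
For what it's worth, once R is installed you can also confirm which flavour
you ended up with from inside a running session:

   .Machine$sizeof.pointer   # 4 on a 32-bit build, 8 on a 64-bit build
   R.version$arch            # e.g. "i686" vs "x86_64"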

I have heard good things about Ubuntu--mainly that it's very easy to
use--but FC5 has been pretty easy to learn too, and I use the KDE desktop,
which gives me Kate as a text editor.  You can open a terminal window in
Kate to run R and set up a key like F10 to send the code from the editor
to R.  It's not quite as good as my Windows setup with Tinn-R, but almost
as good.

Thanks,

Roger


-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] 
Sent: Tuesday, March 06, 2007 5:37 PM
To: Bos, Roger
Subject: RE: [R] Memory Limits in Ubuntu Linux

Thanks for your prompt reply!

The Windows /3GB switch is quite problematic - it was not usable on my
machine, and there are comments about these problems around the net.
Thus, on to Linux. My machine has 4 GB, and some megabytes are grabbed
by my Asus motherboard, leaving some 3.56 GB.

So if I understand your suggestion, try the 64-bit version of Ubuntu
(based on Debian but I had better luck with the video part of the
install) and then use the corresponding image from CRAN. My fear is that
the CRAN Ubuntu version might be 32-bit - any idea how to find out
before I embark on another install? Which Linux do you have - you
described some significant success with getting large jobs to run.

And yes, I've worked hard to save memory by tweaking the code.


Thanks again.


On 6 Mar 2007 at 16:51, Bos, Roger wrote:

 David,
 
 First of all, under Windows you can get about 3 GB available to R by
 using the /3GB switch in your boot.ini file, assuming you have 4 GB of
 memory installed on your Windows machine.  Using that method, I have
 seen the memory usage of my R process get as big as 2.7 GB in Task
 Manager.  What's important, of course, is contiguous space, as you
 mentioned.  So you may want to check your code closely and make
 sure that its memory usage is as efficient as possible and that you are
 storing the minimal amount you need for each run.  If you don't need
 an object for a while, consider writing it to disk and reading it back
 in later.
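
A minimal sketch of that park-it-on-disk trick, using only base R (the object
name big.obj and the file name are just placeholders):

   save(big.obj, file = "big_obj.RData")  # park the object on disk
   rm(big.obj)                            # drop it from the workspace
   gc()                                   # let R return the memory
   ## ... later, when the object is needed again ...
   load("big_obj.RData")                  # restores big.obj into the workspace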
 
 Second, AFAIK to get any benefit from more memory in Linux you have to
 go to the 64-bit version.  I am a Linux newbie too, so I chose to use
 one of the pre-compiled binaries available on CRAN.  In other words,
 you shouldn't have to compile anything yourself.  How much memory do
 you have on your Linux box?  I have 16 GB and I know I have run stuff
 that wouldn't run on my 4 GB Windows box.
 
 HTH,
 
 Roger
 
 
 
 
 
  
 

Re: [R] Memory Limits in Ubuntu Linux

2007-03-07 Thread davidkat
Thanks for the tips, Roger.

FYI: When I added /3GB to the boot.ini, the resulting desktop was incomplete 
and locked - no chance to even try starting R. Searching the web led me to 
believe that this was possibly a dead end, so I abandoned the effort. Any 
hints on getting this to work, anyone?



Re: [R] Memory Limits in Ubuntu Linux

2007-03-07 Thread Bos, Roger
David,

Here is what my boot.ini file looks like:

[boot loader]
timeout=5
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect /3gb

The easiest way to edit the boot.ini file is My
Computer/Properties/Advanced/Startup & Recovery/Edit; add the /3gb switch
and reboot.  I know that a messed-up boot.ini file can be a real pain.  I
posted what mine looks like so you can compare yours, but I wouldn't
suggest making any changes to your boot.ini except at the very end of
the last line.
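
Once the machine is back up with the /3gb entry in place, R may still need to
be told to use the extra headroom; a minimal sketch, assuming a stock R 2.x
for Windows (sizes in MB):

   memory.limit(size = 3072)  # request a cap of roughly 3 GB
   memory.limit()             # confirm the new cap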

HTH,

Roger


 

[R] Memory Limits in Ubuntu Linux

2007-03-06 Thread davidkat
I am an R user trying to get around the 2 GB memory limit in Windows, so 
here I am days later with a working Ubuntu, and R under Ubuntu. But the 
memory problems seem worse than ever: R code that worked under 
Windows fails, unable to allocate memory.

Searching around the web, it appears that the problem may be the ability to 
find contiguous memory for my big vectors, but a fresh boot of Ubuntu does 
not help either.

Which way to go?

1) Try to install the 64-bit version for a bigger address space. Would this help? 
Is this workable for my Athlon 64 dual-core? (The live CD seems to work, but I 
never got it to boot after a disk install; then again, the 386 version was no better 
until I learned more about Grub... I could try again if this might solve the 
problem.)

2) Recompile R to get bigger memory capability? (I'll have to cross-post to 
some R forums too)
This will be a challenge for a Linux newbie...like me.

3) Any other suggestions? My goal is to create a bigger neural network than 
fits in my Windows R version.
-- 
David Katz
 www.davidkatzconsulting.com
   541 482-1137



Re: [R] Memory Limits in Ubuntu Linux

2007-03-06 Thread Uwe Ligges


[EMAIL PROTECTED] wrote:
 Which way to go?
 
 1) Try to install 64-bit version for bigger address space. Would this help? Is
 this workable for my Athlon 64 Dual-core? (the live cd seems to work but I
 never got it to boot after a disk install, but then the 386 version was no better
 until I learned more about Grub...I could try again if this might solve the
 problem)

If you really have got such amounts of RAM in that machine, it should be 
worth trying.

Uwe Ligges


 2) Recompile R to get bigger memory capability? (I'll have to cross-post to 
 some R forums too)
 This will be a challenge for a Linux newbie...like me.
 
 3) Any other suggestions? My goal is to create a bigger neural network than 
 fits in my Windows R version.

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Limits in Ubuntu Linux

2007-03-06 Thread Christos Hatzis
Take a look at R for Windows FAQ 2.9.  Following the instructions there, I was
able to get Rgui.exe under WinXP to use at least 3GB of RAM (that much
physical RAM is installed).

-Christos

Christos Hatzis, Ph.D.
Nuvera Biosciences, Inc.
400 West Cummings Park
Suite 5350
Woburn, MA 01801
Tel: 781-938-3830
www.nuverabio.com
 



Re: [R] Memory Limits in Ubuntu Linux

2007-03-06 Thread Dirk Eddelbuettel

On 6 March 2007 at 12:43, [EMAIL PROTECTED] wrote:
| I am an R user trying to get around the 2Gig memory limit in Windows, so 

The real limit on 32-bit systems is the 3 GB address space. R under Windows can
get there; see the R for Windows FAQ.

| here I am days later with a working Ubuntu, and R under Ubuntu. But - the 
| memory problems seem worse than ever. R code that worked under 
| windows fails, unable to allocate memory.

Well, maybe you had virtual memory enabled under Windows but not under
Ubuntu. Or maybe you had other memory-hungry applications running under Ubuntu.

There is only so much magic the OS can do.  Your easiest remedy will be to
upgrade to 4 GB.  Even 8 GB can be useful on a 32-bit system, despite the fact
that each individual address space can only max out at 3 GB, because a
multi-core / multi-CPU system lets you multitask better.
 
| Which way to go?
| 
| 1) Try to install 64-bit version for bigger address space. Would this help?

Yes, but you'd probably have to buy more RAM too. The main advantage is that
your limit is now way above 3 GB -- and instead set by your hardware or budget;
maybe it is as high as 16 GB.

But again, on the _same_ box with the _same_ amount of RAM that is already
constrained under 32-bit, you will not see any improvement.  Rather the
opposite: since the basic building block is now 8 bytes instead of 4, you will
need more memory for the same tasks.  No free lunch, as they say.

| 2) Recompile R to get bigger memory capability?

Nope. It's what you give your OS in terms of RAM that's binding here.

| 3) Any other suggestions? 

Different algorithms or approaches, or tricks like keeping the data.frame in
SQLite or using biglm, ...
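
For the biglm route, a minimal sketch (chunk1 and chunk2 stand for pieces of
the data set read in a chunk at a time, and y, x1, x2 for its columns):

   library(biglm)
   fit <- biglm(y ~ x1 + x2, data = chunk1)  # fit on the first chunk only
   fit <- update(fit, moredata = chunk2)     # fold in further chunks
   summary(fit)                              # coefficients without all rows in RAM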

Dirk

-- 
Hell, there are no rules here - we're trying to accomplish something. 
  -- Thomas A. Edison

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.