Re: [R] CPU or memory

2006-11-08 Thread Stefan Grosse
64-bit does not make anything faster. It is only of use if you want to
use more than 4 GB of RAM or if you need higher precision in your
variables.

The dual-core question: dual core is faster if programs are able to use
it. What is sure is that R cannot (so far) make use of the two cores if
you are stuck on Windows. It works excellently if you use Linux. So if
you want dual core you should work with Linux (and then it's faster, of
course).

The Core 2 Duo is currently the fastest processor, however.

(The E6600 has a good price/performance ratio.)

What I already told Taka is that it is probably always a good idea to
improve your code, for which purpose you could ask on this mailing
list... (And I am very sure that there is a lot of potential there.)
Another possibility for speeding things up is using the ATLAS library...
(though I am not sure whether you already use it.)
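A quick way to see whether a tuned BLAS such as ATLAS is paying off is to time a large matrix product under each BLAS build; a minimal sketch (the matrix size is arbitrary):

```r
## Rough benchmark of the linear-algebra path: a tuned BLAS such as ATLAS
## mainly speeds up dense operations like this large matrix multiply.
a <- matrix(rnorm(1e6), 1000, 1000)   # 1000 x 1000 random matrix
system.time(a %*% a)                  # compare elapsed time across BLAS builds
```

Run the same two lines in an R linked against the reference BLAS and in one linked against ATLAS; the elapsed-time difference shows what the tuned library buys you for linear-algebra-heavy code.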

Stefan

John C Frain schrieb:
 Can I extend Taka's question?

 Many of my programs (mainly simulations in R, which are CPU bound) are
 taking hours and perhaps days to complete on a one-year-old PC (an
 Intel(R) Pentium(R) M processor at 1.73GHz, or a Dell GX380 with a
 2.8GHz Pentium).  I am looking at an upgrade, but the variety of CPUs
 available is confusing to say the least.  Does anyone know of comparisons
 of the Pentium 9x0, Pentium(r) Extreme/Core 2 Duo, AMD(r) Athlon(r) 64,
 AMD(r) Athlon(r) 64 FX/Dual Core AM2 and similar chips when used for this
 kind of work?  Does anyone have any advice on (1) the use of a single-core
 or dual-core CPU, or (2) the use of a 32-bit versus a 64-bit CPU?  This
 question is now much more difficult, as the numbers on the various chips
 do not necessarily refer to the relative speed of the chips.

 John

 On 06/11/06, Taka Matzmoto [EMAIL PROTECTED] wrote:

  Hi R users

  Having both a faster CPU and more memory will boost computing power. I was
  wondering if only adding more memory (1GB - 2GB) will significantly reduce
  R computation time?

  Taka


__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] CPU or memory

2006-11-08 Thread Prof Brian Ripley
On Wed, 8 Nov 2006, Stefan Grosse wrote:

 64-bit does not make anything faster. It is only of use if you want to
 use more than 4 GB of RAM or if you need higher precision in your
 variables.

 The dual-core question: dual core is faster if programs are able to use
 it. What is sure is that R cannot (so far) make use of the two cores if
 you are stuck on Windows. It works excellently if you use Linux. So if
 you want dual core you should work with Linux (and then it's faster, of
 course).

Not necessarily.  We have seen several examples in which using a 
multithreaded BLAS (the only easy way to make use of multiple CPUs under 
Linux for a single R process) makes things many times slower.  For tasks 
that do not make heavy use of linear algebra, the advantage of a 
multithreaded BLAS is small, and even for those which do, the speed-up is 
rarely close to double on a dual-CPU system.

John mentioned simulations.  Often by far the most effective way to use a 
multi-CPU platform (and I have had one as my desktop for over a decade) is 
to use coarse-grained parallelism: run two or more processes each doing 
some of the simulation runs.


-- 
Brian D. Ripley,                  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



Re: [R] CPU or memory

2006-11-08 Thread Christos Hatzis
Prof. Ripley,

Do you mind providing some pointers on how coarse-grained parallelism
could be implemented in a Windows environment?  Would it be as simple as
running two R-console sessions and then (manually) combining the results of
these simulations?  Or would it be better to run them as batch processes?
RSiteSearch('coarse grained') did not produce any hits, so this topic might
not have been discussed on this list.

I am not really familiar with running R in any mode other than the default
(the R console on Windows), so I might be missing something really obvious.
I am interested in running Monte Carlo cross-validation in some sort of
parallel mode on a dual-core (Pentium D) Windows XP machine.

Thank you.
-Christos

Christos Hatzis, Ph.D.
Nuvera Biosciences, Inc.
400 West Cummings Park
Suite 5350
Woburn, MA 01801
Tel: 781-938-3830
www.nuverabio.com
 


-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Prof Brian Ripley
Sent: Wednesday, November 08, 2006 5:29 AM
To: Stefan Grosse
Cc: r-help@stat.math.ethz.ch; Taka Matzmoto
Subject: Re: [R] CPU or memory


Re: [R] CPU or memory

2006-11-08 Thread Prof Brian Ripley
On Wed, 8 Nov 2006, Christos Hatzis wrote:

 Prof. Ripley,

 Do you mind providing some pointers on how coarse-grained parallelism
 could be implemented in a Windows environment?  Would it be as simple as
 running two R-console sessions and then (manually) combining the results of
 these simulations?  Or would it be better to run them as batch processes?

That is what I would do in any environment (I don't do such things under 
Windows since all my fast machines run Linux/Unix).

Suppose you want to do 10,000 simulations.  Set up two batch scripts
that each run 5,000 and save() the results as a list or matrix under
different names, with a different seed set at the top of each.  Then run
each via R CMD BATCH simultaneously.  When both have finished, use an
interactive session to load() both sets of results and merge them.
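That recipe can be sketched in a few lines; the file names, seeds, and the toy simulation below are illustrative, not from the thread:

```r
## sim1.R -- first of two batch scripts; sim2.R is identical except for
## set.seed(2) and file = "res2.RData" (all names here are hypothetical).
set.seed(1)                                  # a different seed in each script
res1 <- replicate(5000, mean(rnorm(100)))    # stand-in for the real simulation
save(res1, file = "res1.RData")

## Launch both at once from a shell (or two Windows command prompts):
##   R CMD BATCH sim1.R &
##   R CMD BATCH sim2.R &

## When both finish, merge in an interactive session:
##   load("res1.RData"); load("res2.RData")
##   res <- c(res1, res2)                    # all 10,000 results
```

Each OS process then occupies one core, which is the coarse-grained parallelism being described: no threading inside R is needed.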


Re: [R] CPU or memory

2006-11-08 Thread Christos Hatzis
Great.  I will try it.
Thank you.

-Christos 

-Original Message-
From: Prof Brian Ripley [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 08, 2006 1:21 PM
To: Christos Hatzis
Cc: 'Stefan Grosse'; r-help@stat.math.ethz.ch; 'Taka Matzmoto'
Subject: RE: [R] CPU or memory


Re: [R] CPU or memory

2006-11-08 Thread John C Frain
I would like to thank all who replied to my question about the efficiency
of various CPUs in R.

Following the advice of Bogdan Romocea I have put a sample simulation and
the latest version of R on a USB drive and will go to a few suppliers to try
it out.  I will report back if I find anything of interest.

With regard to 64-bit versus 32-bit, I thought that a 64-bit chip might
require fewer clock cycles for a specific machine instruction than a 32-bit
one.  This was one of the advantages of moving from 8 to 16 or from 16 to
32 bits.  Thus a slower (in terms of clock speed) 64-bit chip might run
faster than a somewhat similar 32-bit chip.  I fully realize that the full
advantage of a 64-bit chip is available only with a 64-bit operating system,
and I am preparing to switch some work to Linux in case I acquire a 64-bit
PC.  If I do, I will time the simulations on that also.

I already do some coarse-grained parallelism as described by Brian Ripley,
but on two separate PCs.  This is not ideal, but it allows the processing
time to be halved without the overheads.

FORTRAN 2 was my first programming language, and I agree that I should try
C or FORTRAN to speed things up.  Finally, Rprof could be a great help.
There are lots of utilities in the utils package with which I was not
familiar.

Again Many Thanks to all who made various suggestions.


 bogdan romocea [EMAIL PROTECTED] wrote:
  Does any one know of comparisons of the Pentium 9x0, Pentium(r)
  Extreme/Core 2 Duo, AMD(r) Athlon(r) 64, AMD(r) Athlon(r) 64
  FX/Dual Core AM2 and similar chips when used for this kind of work.



Re: [R] CPU or memory

2006-11-08 Thread Liaw, Andy
My understanding is that it doesn't have much to do with 32- vs. 64-bit,
but with the instruction sets of the CPUs.  If I'm not mistaken, at the
same clock speed a P4 would run slower than a PIII, simply because the P4
does less per clock cycle.  Also, I believe that for the same architecture,
single-core chips are available at higher clock speeds than their
multi-core counterparts.  That's why we recently went for a box with four
single-core Opterons instead of two dual-core ones.

64-bit PCs should be really affordable: I've seen HP laptops based on
the Turion chip selling below US$500.

Andy 

From: John C Frain
 
Re: [R] CPU or memory

2006-11-07 Thread John C Frain
Can I extend Taka's question?

Many of my programs (mainly simulations in R, which are CPU bound) are
taking hours and perhaps days to complete on a one-year-old PC (an
Intel(R) Pentium(R) M processor at 1.73GHz, or a Dell GX380 with a
2.8GHz Pentium).  I am looking at an upgrade, but the variety of CPUs
available is confusing to say the least.  Does anyone know of comparisons
of the Pentium 9x0, Pentium(r) Extreme/Core 2 Duo, AMD(r) Athlon(r) 64,
AMD(r) Athlon(r) 64 FX/Dual Core AM2 and similar chips when used for this
kind of work?  Does anyone have any advice on (1) the use of a single-core
or dual-core CPU, or (2) the use of a 32-bit versus a 64-bit CPU?  This
question is now much more difficult, as the numbers on the various chips
do not necessarily refer to the relative speed of the chips.

John

On 06/11/06, Taka Matzmoto [EMAIL PROTECTED] wrote:

 Hi R users

 Having both a faster CPU and more memory will boost computing power. I was
 wondering if only adding more memory (1GB - 2GB) will significantly reduce
 R computation time?

 Taka


 __
 R-help@stat.math.ethz.ch mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




-- 
John C Frain
Trinity College Dublin
Dublin 2
Ireland
www.tcd.ie/Economics/staff/frainj/home.html
mailto:[EMAIL PROTECTED]
mailto:[EMAIL PROTECTED]




Re: [R] CPU or memory

2006-11-07 Thread bogdan romocea
 Does any one know of comparisons of the Pentium 9x0, Pentium(r)
 Extreme/Core 2 Duo, AMD(r) Athlon(r) 64 , AMD(r) Athlon(r) 64
 FX/Dual Core AM2 and similar chips when used for this kind of work.

I think your best option, by far, is to answer the question on your
own. Put R and your programs on a USB drive, go to a computer shop,
and ask the salesperson to let you try a few configurations. Run your
simulations, time the results and compare. Keep in mind that the
numbers may be affected by the bus and memory frequency (and perhaps
memory size/fragmentation). Regarding single/dual core and 32/64 bit,
search the archives and the documentation; this has been asked before.
(I would guess dual core and 64 bit would be a sound choice.) One
other thing to consider (if you haven't already) is ?Rprof: profiling
may show that you can significantly improve the efficiency of your
code, or rewrite parts of it in C or Fortran.
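
To follow up on the ?Rprof suggestion, here is a minimal profiling sketch. The simulation body is a stand-in for your own code, not anything from the original post:

```r
# Profile a CPU-bound simulation to find hot spots before buying hardware.
Rprof("sim.out")             # start profiling, writing samples to sim.out
x <- replicate(200, {        # stand-in for your own simulation loop
  m <- matrix(rnorm(100 * 100), 100)
  sum(solve(m %*% t(m) + diag(100)))
})
Rprof(NULL)                  # stop profiling
summaryRprof("sim.out")$by.self   # time per function, self time first
```

If one or two functions account for most of the self time, rewriting just those in C or Fortran (or vectorising them) may buy more than any CPU upgrade.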


 -Original Message-
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] On Behalf Of John C Frain
 Sent: Tuesday, November 07, 2006 2:24 PM
 To: Taka Matzmoto
 Cc: r-help@stat.math.ethz.ch
 Subject: Re: [R] CPU or memory





[R] CPU or memory

2006-11-06 Thread Taka Matzmoto
Hi R users

Having both a faster CPU and more memory will boost computing power. I was
wondering if only adding more memory (going from 1GB to 2GB) will
significantly reduce R computation time?

Taka,




Re: [R] CPU or memory

2006-11-06 Thread Uwe Ligges


Taka Matzmoto wrote:
 Hi R users
 
 Having both a faster CPU and more memory will boost computing power. I was 
 wondering if only adding more memory (1GB - 2GB)  will significantly reduce 
 R computation time?


If your computations consume just a few MB, more memory won't make
them faster; if they consume a lot of memory, the extra RAM will
prevent swapping and make them much faster.
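
A quick way to judge whether memory (rather than CPU) is the constraint is to look at R's own memory accounting; this is a sketch, and what counts as "a lot" depends on your installed RAM:

```r
# Inspect R's memory footprint; if it approaches physical RAM,
# the machine will swap and more memory would help.
gc()                        # used/allocated memory (Ncells/Vcells, in MB)

x <- rnorm(1e6)             # one million doubles, 8 bytes each
print(object.size(x), units = "MB")   # roughly 7.6 MB
```

Compare the totals reported by gc() against your physical RAM: well below it, a memory upgrade buys nothing; near or above it, swapping is likely the real bottleneck.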

Uwe Ligges


 Taka,
 
