My inclination would be to, whenever possible, replace the core scalar libraries with compatible parallel versions (lapack -> scalapack) rather than make it an add-on package. If the R client code is general enough, and the makefile can automatically find the parallel version, then it's a simple matter of compiling with the parallel libs. (I don't know if this is possible at run-time.) No rewriting of (high-level) R code at all. I tried to contact the plapack folks here at UT about integrating with R, but it appears the project is no longer active.
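Just as a rough illustration (a minimal sketch, not tied to any particular backend): if R were built against a link-compatible parallel lapack, ordinary high-level code like this would run unchanged:

    ## Plain R; nothing here knows which lapack R was linked against at
    ## build time, so a drop-in parallel library requires no code changes.
    A <- matrix(rnorm(1000 * 1000), nrow = 1000)
    b <- rnorm(1000)
    x <- solve(A, b)   # dense solve, dispatched to whichever lapack is linked
    d <- svd(A)$d      # singular values, likewise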
Tim

On Tue, 2004-03-23 at 13:32, A.J. Rossini wrote:
> [EMAIL PROTECTED] writes:
>
> >> does anyone know if there exists an effort to bring some kind of
> >> distributed computing to R? The most simple functionality I'm after is
> >> to be able to explicitly perform a task on a computing server. Sorry if
> >> this is a non-informed newbie question...
> >
> > As an alternative to the PVM/MPI interfaces mentioned by other people, I am
> > working on a (very soon to be released) project for using the ScaLAPACK library
> > [1] through a simple R interface. If the tasks that you want run on a computing
> > server are simple (LAPACK) functions (solve, svd, etc.) and not whole R scripts,
> > then this may be useful.
>
> A number of folks have commented on having this in progress (esp. a
> group at Vanderbilt). It's intriguing, but how did you plan on
> replacing the standard system-level library calls? (or did you just
> provide new interfaces at the user (R command) level?)
>
> best,
> -tony

--
Timothy H. Keitt
Section of Integrative Biology
University of Texas at Austin
http://www.keittlab.org/

______________________________________________
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-devel