RE: [R] Persistent state of R

2003-11-25 Thread Warnes, Gregory R

Starting up R and loading libraries can be very time consuming.  For my
RSOAP system (http://www.analytics.washington.edu/Zope/projects/RSOAP/) I
took the step of pre-starting the R process, including the loading of some
libraries, and then handing work off to the pre-started process.  You should
be able to use RSOAP from Perl, and it would be a simple change to have it
add the Bioconductor packages to the pre-loaded set.
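
A minimal sketch of the same idea (not RSOAP itself; the pipe path and
the one-expression-per-line protocol are just for illustration) is an R
process that loads the expensive packages once and then evaluates
whatever gets written to a named pipe:

## Untested sketch: pay the package-loading cost once, then serve
## commands written to a named pipe (create it with: mkfifo /tmp/R-fifo)
library(marrayPlots)              # the slow part, done once at startup

repeat {
    con <- fifo("/tmp/R-fifo", open = "r", blocking = TRUE)  # waits for a client
    cmds <- readLines(con)        # one R expression per line
    close(con)
    for (cmd in cmds) {
        if (cmd == "quit") quit(save = "no")
        ## try() keeps a malformed request from killing the server
        print(try(eval(parse(text = cmd)), silent = TRUE))
    }
}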

Alternatively, I suppose that one could force R to dump core and then start
it from the core image...

-G

-----Original Message-----
From: michael watson (IAH-C)
To: '[EMAIL PROTECTED]'
Sent: 11/25/03 8:54 AM
Subject: [R] Persistent state of R

Hi

I am using R as a back-end to some CGI scripts, written in Perl.  My
platform is SuSE Linux 8.2 with Apache 1.3.7.  The CGI script takes some
form parameters, opens a pipe to an R process, loads up some
Bioconductor libraries, executes some R commands, takes the output and
creates a web page.  It is all very neat and works well.
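
Concretely, each request does something like this (sketched here driving
a child R from R via pipe(); the Perl script writes the same text down
its pipe, and the file name is made up):

con <- pipe("R --slave --no-save", open = "w")
writeLines(c("library(marrayPlots)",  # the slow part, repeated every request
             "png(\"/tmp/out.png\"); plot(1:10); dev.off()"),
           con)
close(con)                            # EOF makes the child R run and exit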

I am trying to make my CGI scripts quicker, and it turns out that the
bottleneck is the loading of the libraries into R - for example, loading
up marrayPlots into R takes 10-20 seconds, which, although not long, is
long enough for users to imagine it is not working and start clicking
reload.

So I just wondered if anyone had a neat solution whereby I could somehow
have the required libraries permanently loaded into R - perhaps I need a
persistent R process with the libraries in memory that I can pipe
commands to?  Is this possible?

Thanks
Mick


Re: [R] Persistent state of R

2003-11-25 Thread Luke Tierney
On Tue, 25 Nov 2003, michael watson (IAH-C) wrote:

> Hi
>
> I am using R as a back-end to some CGI scripts, written in Perl.  My platform is
> SuSE Linux 8.2 with Apache 1.3.7.  The CGI script takes some form parameters, opens a
> pipe to an R process, loads up some Bioconductor libraries, executes some R commands,
> takes the output and creates a web page.  It is all very neat and works well.
>
> I am trying to make my CGI scripts quicker, and it turns out that the bottleneck is
> the loading of the libraries into R - for example, loading up marrayPlots into R
> takes 10-20 seconds, which, although not long, is long enough for users to imagine it
> is not working and start clicking reload.
>
> So I just wondered if anyone had a neat solution whereby I could somehow have the
> required libraries permanently loaded into R - perhaps I need a persistent R process
> with the libraries in memory that I can pipe commands to?  Is this possible?
>
> Thanks
> Mick

One option we have been experimenting with is to process the contents
of packages into a simple data base and then use lazy loading.  This
means that loading a package will load only a small amount of
information, basically the names of the variables defined.  The actual
values are only loaded from the data base on demand.  An experimental
package that implements this is available at

http://www.stat.uiowa.edu/~luke/R/serialize/lazyload.tar.gz
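
The mechanism is essentially what delayedAssign() does for a single
variable; a toy illustration (using saveRDS()/readRDS() as a stand-in
for the package data base, with a made-up file name):

saveRDS(lm, "lm.rds")                      # stand-in: one serialized object
delayedAssign("lazyLm", readRDS("lm.rds"))
## "lazyLm" is now bound, but nothing has been read from disk yet
lazyLm                                     # first use triggers the readRDS()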

I believe `make check-all' passes with all base and recommended
packages set up for lazy loading.  Lazy loading has not been tested
much with packages using S4 methods or with anything in Bioconductor
as far as I know.  So this may or may not do anything useful for you.

[WARNING: Since this messes with the installed packages in your R
system you should only experiment with it in an R installation you can
afford to mess up.]

The README file from the package is attached below.

Best,

luke

(README)---
This package provides tools to set up packages for lazy loading from a
data base.  If you want to try this out, here are the steps:

1) Install the current version of package lazyload from
http://www.stat.uiowa.edu/~luke/R/serialize/.

2) To make base use lazy loading, start R with something like

env R_DEFAULT_PACKAGES=NULL R

to make sure no packages are loaded.  Then do

source(file.path(.find.package("lazyload"), "makebasedb.R"))
library(lazyload)
makeLazyLoading("base")

Make sure to do the source first, then the library call.

3) To make package foo use lazy loading, use makeLazyLoading("foo").
You can make all base packages use lazy loading with

for (p in rev(installed.packages(priority = "base")[, "Package"])) {
    cat(paste("converting", p, "... "))
    makeLazyLoading(p)
    cat("done\n")
}

The rev() is a quick and dirty way to get stepfun done before modreg,
since modreg imports from stepfun.


-- 
Luke Tierney
University of Iowa                    Phone: 319-335-3386
Department of Statistics and          Fax:   319-335-3017
   Actuarial Science
241 Schaeffer Hall                    email: [EMAIL PROTECTED]
Iowa City, IA 52242                   WWW:   http://www.stat.uiowa.edu



Re: [R] Persistent state of R

2003-11-25 Thread Joe Conway
michael watson (IAH-C) wrote:
> I am trying to make my CGI scripts quicker, and it turns out that the
> bottleneck is the loading of the libraries into R - for example,
> loading up marrayPlots into R takes 10-20 seconds, which, although not
> long, is long enough for users to imagine it is not working and start
> clicking reload.
>
> So I just wondered if anyone had a neat solution whereby I could
> somehow have the required libraries permanently loaded into R -
> perhaps I need a persistent R process with the libraries in memory
> that I can pipe commands to?  Is this possible?
If you are processing data already stored in a database, you could use 
Postgres and PL/R. See:
  http://www.joeconway.com/

Use Postgres 7.4 and preload PL/R for the best performance -- i.e., put
the following line in $PGDATA/postgresql.conf:
preload_libraries = '$libdir/plr:plr_init'
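
With plr preloaded, the R interpreter lives inside the persistent
backend, so query-time functions pay no R startup cost.  A hypothetical
sketch (the module entry and function are illustrative only):

-- PL/R runs the rows of its plr_modules table once at interpreter
-- startup, so expensive library() calls can be paid there, not per query
INSERT INTO plr_modules VALUES (0, 'library(marrayPlots)');

-- a trivial PL/R function; the body is R, and arguments arrive as arg1, ...
CREATE OR REPLACE FUNCTION r_median(float8[]) RETURNS float8 AS '
    median(arg1)
' LANGUAGE plr;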

HTH,

Joe
