Hi,

I am working on some code to run a background R process that I can submit 
data to, check computation progress on, and retrieve results from later. I am 
aware that "parallel" does a lot of this; however, "parallel" shuts down the 
nodes when I quit the master process. Instead, I would like these nodes to 
keep running, so that I can start R again later, reconnect to them, and 
retrieve the results.
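
To make the workflow I am after concrete, here is a rough sketch that fakes 
it with a detached Rscript process instead of parallel/snow; "job1.R" and the 
result path are made-up placeholders, not anything a package provides:

result_file <- "~/bgjobs/job1_result.rds"
dir.create("~/bgjobs", showWarnings = FALSE)

## job1.R would hold the actual computation and end with something like
##   saveRDS(res, "~/bgjobs/job1_result.rds")
system2("Rscript", "job1.R", wait = FALSE)   # returns immediately

## ... quit R, come back later, and in a fresh session:
if (file.exists(result_file)) result <- readRDS(result_file)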

The use case is Shiny apps: I want a thin frontend that acts as GUI, workflow 
launcher and result viewer, and that launches background computations which 
are not dependent on the Shiny script staying alive.
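
On the Shiny side, the "result viewer" part could then be as simple as 
polling for that result file (a sketch, assuming the same made-up file path 
as above):

library(shiny)

ui <- fluidPage(verbatimTextOutput("summary"))

server <- function(input, output, session) {
  output$summary <- renderPrint({
    invalidateLater(2000, session)            # re-check every 2 seconds
    f <- "~/bgjobs/job1_result.rds"
    if (file.exists(f)) summary(readRDS(f)) else "still running..."
  })
}

# shinyApp(ui, server)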

Has this been done already, and/or are there simple modifications of 
parallel/snow/etc. that allow this? My current work-in-progress approach uses 
Rserve.
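
A minimal sketch of the kind of thing I mean with Rserve (assuming Rserve and 
RSclient are installed; the result path is again a placeholder):

library(Rserve)
Rserve(args = "--vanilla")     # server keeps running after this session quits

## from the Shiny process, or any later R session:
library(RSclient)
con <- RS.connect(port = 6311)
RS.eval(con, {
  res <- replicate(1000, mean(rnorm(1e4)))   # stand-in computation
  saveRDS(res, "~/bgjobs/job1_result.rds")
}, wait = FALSE)               # don't block the Shiny session
## later: RS.collect(con) on the same connection, or poll the RDS file
## from a new session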

(shiny-discuss cc'd).

Michael Stravs
Eawag
Umweltchemie
BU E 23
Überlandstrasse 133
8600 Dübendorf
+41 58 765 6742


