My Rdsm package will do what you want: https://cran.r-project.org/web/packages/Rdsm/index.html
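
A minimal sketch of the usual Rdsm pattern (shared-memory variables on a "parallel" cluster), along the lines of the package's own examples; the two-worker cluster, the 4x4 matrix "m", and the toy fill step are illustrative only:

    library(parallel)
    library(Rdsm)

    cls <- makeCluster(2)          # two local worker processes
    mgrinit(cls)                   # initialize Rdsm on the cluster
    mgrmakevar(cls, "m", 4, 4)     # 4x4 shared matrix "m", visible to manager and workers

    # each worker writes its share of rows directly into shared memory
    clusterEvalQ(cls, {
      myrows <- getidxs(4)         # this worker's portion of the row indices
      m[myrows, ] <- myinfo$id     # mark those rows with the worker's ID
      0                            # avoid shipping the matrix back over the socket
    })

    print(m[, ])                   # the manager sees the workers' writes
    stopCluster(cls)

Because the shared variables live in bigmemory segments rather than in any one worker's address space, the manager reads results without collecting them through the cluster connections.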
Norm Matloff

> Message: 4
> Date: Mon, 10 Jul 2017 17:12:57 +0000
> From: "Stravs, Michael" <michael.str...@eawag.ch>
> To: "r-devel@r-project.org" <r-devel@r-project.org>
> Cc: "shiny-disc...@googlegroups.com" <shiny-disc...@googlegroups.com>
> Subject: [Rd] Background session with R
> Message-ID: <9dd73f68ac266d4aa329e07b678177b191e37...@ee-mbx3.ee.emp-eaw.ch>
> Content-Type: text/plain; charset="UTF-8"
>
> Hi,
>
> I am working on some code to have a background R process running that I can
> submit data to, check computation progress, and retrieve results later. I am
> aware that "parallel" does a lot of that - however, "parallel" shuts down the
> nodes when I quit the master process. On the contrary, I would want these
> nodes to continue running, so I can fire up R again later and reconnect to
> the nodes to retrieve the results.
>
> The use case is Shiny apps, where I want a thin frontend as a GUI, workflow
> launcher and result viewer, and launch background computation that isn't
> dependent on the Shiny script staying alive.
>
> Has this been done already, and/or are there simple modifications of
> parallel/snow/etc that allow this? My current WIP thing uses Rserve.
>
> (shiny-discuss cc'd).
>
> Michael Stravs
> Eawag
> Umweltchemie
> BU E 23
> Überlandstrasse 133
> 8600 Dübendorf
> +41 58 765 6742

______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
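
For reference, the Rserve route Michael mentions can cover part of this already, since the Rserve daemon is a separate OS process that outlives whichever R or Shiny session launched it. A rough, untested sketch; the result path "/tmp/job1.rds" and the toy computation are placeholders, and the job persists its own result because each Rserve connection runs in its own child session:

    library(Rserve)
    # start a standalone Rserve daemon; it keeps running after the
    # launching R session (or Shiny app) exits
    Rserve(args = "--vanilla")

    library(RSclient)
    con <- RS.connect()            # default localhost:6311
    # submit a long computation without blocking the front end
    RS.eval(con, saveRDS(sum(rnorm(1e7)), "/tmp/job1.rds"), wait = FALSE)

    # later, possibly from a fresh R/Shiny session, poll for the result
    if (file.exists("/tmp/job1.rds")) readRDS("/tmp/job1.rds")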