On 4/22/15 6:25 PM, Gilles wrote:
> On Wed, 22 Apr 2015 11:33:30 -0500, Ole Ersoy wrote:
>> On Mon, Apr 20, 2015 at 6:05 PM Phil Steitz <phil.ste...@gmail.com> wrote:
>>>>
>>>> There are lots of ways to allow distributed processes to share
>>>> common data.  Spark has a very nice construct called a Resilient
>>>> Distributed Dataset (RDD) designed for exactly this purpose.
>>
>> Are there any examples of a class in commons math where threads have
>> to share the data structure?
>
> It is the case for the only example mentioned in this thread with
> enough detail to permit concrete statements: the SOFM implementation
> (in package "o.a.c.m.ml.neuralnet.sofm").
> The shared structure is the "Network" instance.
>
> I've only read the "Spark" examples.  I assume that an alternate
> version of "KohonenTrainingTask" would follow the "logistic
> regression" example (IIUC).
> But isn't that going to create a dependency on Spark?
The challenge - possibly hopeless, but I am not giving up yet - is to
find a way to make it easy for someone who wants to use Spark (or
Hadoop, or ...) to do this.  We don't want to create dependencies on
these frameworks - just make it easy for users to distribute
computation tasks to them.  It may be that there is nothing to add to
what we already have in sofm.  Let's see what it takes to actually get
it working; a rough user-side sketch is appended below.

Phil

> Regards,
> Gilles
>
>> Cheers,
>> - Ole
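
To make that concrete, here is a rough sketch of the kind of user-side
Spark driver being discussed: each partition trains its own map with the
existing, unmodified KohonenTrainingTask, and only the neuron features are
shipped back to the driver.  It assumes the Spark 1.x Java API and the
Commons Math 3.x classes in o.a.c.m.ml.neuralnet; the class name
SparkSofmSketch, the 10x10 map, the random 3-dimensional input vectors and
the naive feature-averaging merge are illustrative choices only, not
anything that exists in (or is proposed for) either project.  The only
point is that the Spark wiring lives entirely on the user side, so no
framework dependency is created in commons-math.

import java.util.ArrayList;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.Random;

import org.apache.commons.math3.ml.distance.EuclideanDistance;
import org.apache.commons.math3.ml.neuralnet.FeatureInitializer;
import org.apache.commons.math3.ml.neuralnet.FeatureInitializerFactory;
import org.apache.commons.math3.ml.neuralnet.Network;
import org.apache.commons.math3.ml.neuralnet.Neuron;
import org.apache.commons.math3.ml.neuralnet.SquareNeighbourhood;
import org.apache.commons.math3.ml.neuralnet.sofm.KohonenTrainingTask;
import org.apache.commons.math3.ml.neuralnet.sofm.KohonenUpdateAction;
import org.apache.commons.math3.ml.neuralnet.sofm.LearningFactorFunctionFactory;
import org.apache.commons.math3.ml.neuralnet.sofm.NeighbourhoodSizeFunctionFactory;
import org.apache.commons.math3.ml.neuralnet.twod.NeuronSquareMesh2D;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;

/** Hypothetical user-side driver; not part of Commons Math. */
public class SparkSofmSketch {

    private static final int ROWS = 10;
    private static final int COLS = 10;
    private static final int DIM = 3;

    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(
            new SparkConf().setAppName("sofm-sketch").setMaster("local[4]"));

        // Stand-in data: random 3-d feature vectors.
        List<double[]> samples = new ArrayList<double[]>();
        Random rng = new Random(12345);
        for (int i = 0; i < 10000; i++) {
            samples.add(new double[] { rng.nextDouble(), rng.nextDouble(), rng.nextDouble() });
        }
        JavaRDD<double[]> data = sc.parallelize(samples, 4);

        // Each partition trains its own map with the existing KohonenTrainingTask;
        // only the neuron features travel back to the driver.
        List<double[][]> perPartition = data.mapPartitions(
            new FlatMapFunction<Iterator<double[]>, double[][]>() {
                public Iterable<double[][]> call(Iterator<double[]> features) {
                    Network net = new NeuronSquareMesh2D(
                        ROWS, false, COLS, false,
                        SquareNeighbourhood.MOORE,
                        new FeatureInitializer[] {
                            FeatureInitializerFactory.uniform(0, 1),
                            FeatureInitializerFactory.uniform(0, 1),
                            FeatureInitializerFactory.uniform(0, 1)
                        }).getNetwork();
                    KohonenUpdateAction update = new KohonenUpdateAction(
                        new EuclideanDistance(),
                        LearningFactorFunctionFactory.exponentialDecay(0.9, 0.05, 1000),
                        NeighbourhoodSizeFunctionFactory.exponentialDecay(5, 1, 1000));
                    new KohonenTrainingTask(net, features, update).run();

                    // Assumes NeuronSquareMesh2D assigns identifiers 0..ROWS*COLS-1.
                    double[][] result = new double[ROWS * COLS][];
                    for (Neuron n : net.getNeurons()) {
                        result[(int) n.getIdentifier()] = n.getFeatures();
                    }
                    return Collections.singletonList(result);
                }
            }).collect();

        // Crude merge: average the per-partition neuron features.  Whether this
        // is a statistically sensible way to combine SOFMs is a separate question;
        // the point is only that the Spark wiring lives entirely on the user side.
        double[][] merged = new double[ROWS * COLS][DIM];
        for (double[][] p : perPartition) {
            for (int i = 0; i < merged.length; i++) {
                for (int j = 0; j < DIM; j++) {
                    merged[i][j] += p[i][j] / perPartition.size();
                }
            }
        }
        System.out.println("Merged features of neuron 0: "
            + java.util.Arrays.toString(merged[0]));

        sc.stop();
    }
}

Note that each Spark task works on its own deserialized copy of the map
(commons-math3 just needs to be on the executors' classpath), so the
mutable Network is never shared across JVMs.  How, or whether, the
per-partition maps should be merged into a single SOFM is exactly the kind
of question that would need to be answered to "actually get it working".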