During computation, RRDD launches one R process for each partition, so there won't
be a thread-safety issue.

Could you give more details on your new environment?

-----Original Message-----
From: Simon Hafner [mailto:reactorm...@gmail.com] 
Sent: Monday, February 15, 2016 7:31 PM
To: Sun, Rui <rui....@intel.com>
Cc: user <user@spark.apache.org>
Subject: Re: Running synchronized JRI code

2016-02-15 4:35 GMT+01:00 Sun, Rui <rui....@intel.com>:
> Yes, JRI loads an R dynamic library into the executor JVM, which faces 
> thread-safety issues when there are multiple task threads within the executor.
>
> I am thinking that if demand like yours (calling R code in RDD 
> transformations) is widespread, we may consider refactoring RRDD for this 
> purpose, although it is currently intended for internal use by SparkR and is 
> not a public API.
So the RRDDs don't have that thread-safety issue? I'm currently creating a new 
environment for each call, but it still crashes.
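(A new R environment per call does not help here, because JRI's single embedded R engine is shared by all task threads in the executor JVM. One common workaround, independent of RRDD, is to funnel every engine call through one dedicated thread. The sketch below is not a SparkR or JRI API; the class and method names are hypothetical, and the `6 * 7` stands in for where a real `Rengine.eval(...)` call would go.)

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical wrapper: serialize all access to a non-thread-safe engine
// (e.g. a JRI Rengine) by routing every call through a single-threaded
// executor. Task threads block on the Future, but never touch the engine
// concurrently, so its thread-affinity requirement is respected.
public class SingleThreadGate {
    private final ExecutorService engineThread = Executors.newSingleThreadExecutor();

    public <T> T call(Callable<T> work) throws Exception {
        Future<T> f = engineThread.submit(work); // queued onto the one engine thread
        return f.get();                          // caller waits for the result
    }

    public void shutdown() {
        engineThread.shutdown();
    }

    public static void main(String[] args) throws Exception {
        SingleThreadGate gate = new SingleThreadGate();
        // Stand-in for an R evaluation; real code would call rengine.eval(...) here.
        int result = gate.call(() -> 6 * 7);
        System.out.println(result);
        gate.shutdown();
    }
}
```

This serializes R calls across all tasks in the executor, so it trades away parallelism for safety; RRDD's one-R-process-per-partition model avoids that bottleneck entirely.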

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
