2016-02-15 14:02 GMT+01:00 Sun, Rui :
> On computation, RRDD launches one R process for each partition, so there
> won't be a thread-safety issue.
>
> Could you give more details on your new environment?
Running on EC2, I start the executors via
/usr/bin/R CMD javareconf -e "/usr/lib/spark/sbin/
To: user
Subject: Re: Running synchronized JRI code
2016-02-15 4:35 GMT+01:00 Sun, Rui :
> Yes, JRI loads an R dynamic library into the executor JVM, which faces a
> thread-safety issue when there are multiple task threads within the executor.
>
> I am thinking that, if demand like yours (calling R code in RDD
> transformations) is widespread, we may
For YARN mode, you can set --executor-cores 1
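For example, a submit command limiting each YARN executor to a single task slot (the class name and jar below are placeholders, not from this thread):

```shell
# --executor-cores 1 gives each executor a single task slot, so at most
# one task thread per JVM ever calls into the JRI-loaded R library.
# Scale out with more executors instead of more cores per executor.
spark-submit \
  --master yarn \
  --executor-cores 1 \
  --num-executors 8 \
  --class com.example.MyApp \
  my-app.jar
```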
-----Original Message-----
From: Sun, Rui [mailto:rui@intel.com]
Sent: Monday, February 15, 2016 11:35 AM
To: Simon Hafner ; user
Subject: RE: Running synchronized JRI code
Yes, JRI loads an R dynamic library into the executor JVM, which faces a
thread-safety issue when there are multiple task threads within the
executor. Note that RRDD is currently intended for internal use by SparkR
and not a public API.
-----Original Message-----
From: Simon Hafner [mailto:reactorm...@gmail.com]
Sent: Monday, February 15, 2016 5:09 AM
To: user
Subject: Running synchronized JRI code
Hello
I'm currently running R code in an executor via JRI. Because R is
single-threaded, any call to R needs to be wrapped in a
`synchronized` block. Now I can only use a bit more than one core per
executor, which is undesirable. Is there a way to tell Spark that this
specific application (or even specific U