2016-02-15 14:02 GMT+01:00 Sun, Rui :
> On computation, RRDD launches one R process for each partition, so there
> won't be a thread-safety issue.
>
> Could you give more details on your new environment?
Running on EC2, I start the executors via
/usr/bin/R CMD javareconf -e "/usr/lib/spark/sbin/
Subject: Re: Running synchronized JRI code
2016-02-15 4:35 GMT+01:00 Sun, Rui :
> Yes, JRI loads an R dynamic library into the executor JVM, which faces a
> thread-safety issue when there are multiple task threads within the executor.
>
> I am thinking that if demand like yours (calling R code in RDD
> transformations) is much desired, we may
For YARN mode, you can set --executor-cores 1
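A minimal sketch of such a submission, assuming a YARN cluster; the application jar name and executor count are illustrative placeholders, not from the thread:

```shell
# Restrict each executor JVM to a single task thread so the
# non-thread-safe R library loaded by JRI is never entered concurrently.
spark-submit \
  --master yarn \
  --executor-cores 1 \
  --num-executors 8 \
  my-spark-r-app.jar   # illustrative placeholder
```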
-----Original Message-----
From: Sun, Rui [mailto:rui@intel.com]
Sent: Monday, February 15, 2016 11:35 AM
To: Simon Hafner ; user
Subject: RE: Running synchronized JRI code
Yes, JRI loads an R dynamic library into the executor JVM, which faces a
thread-safety issue when there are multiple task threads within the executor.
If you are running Spark in Standalone mode, it is possible to run multiple
workers per node and, at the same time, limit the cores per worker to 1.
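In Standalone mode this can be sketched in `conf/spark-env.sh` on each worker node; `SPARK_WORKER_INSTANCES` and `SPARK_WORKER_CORES` are the standard knobs, and the instance count here is an illustrative assumption:

```shell
# conf/spark-env.sh on each worker node:
# run several single-core workers instead of one multi-core worker,
# so each executor JVM sees at most one concurrent task thread.
export SPARK_WORKER_INSTANCES=4   # workers per node (illustrative)
export SPARK_WORKER_CORES=1       # one core per worker -> one task at a time
```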