Hi Ophir,

You can set them by adding the following line to your zeppelin-env.sh:

export ZEPPELIN_INTP_JAVA_OPTS="-Dspark.executor.instances=10"
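
For context, a minimal conf/zeppelin-env.sh sketch could look like the one below; the master URL, the paths, and the executor count are illustrative and should be adjusted to your own YARN setup:

  # Spark interpreter settings for a YARN client-mode deployment (illustrative values)
  export MASTER=yarn-client
  export SPARK_HOME=/path/to/spark            # illustrative path to your Spark installation
  export HADOOP_CONF_DIR=/etc/hadoop/conf     # illustrative path to your YARN/HDFS configuration files
  # Pass Spark properties to the interpreter process as -D options
  export ZEPPELIN_INTP_JAVA_OPTS="-Dspark.executor.instances=10"

After changing these, restart Zeppelin (bin/zeppelin-daemon.sh restart) so the interpreter process picks up the new options.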

JL.

On Mon, Apr 6, 2015 at 3:05 AM, Ophir Cohen <[email protected]> wrote:

>
> Hi All,
> I'm using Zeppelin on top of Spark running on YARN.
> For some reason I can't set the number of executors.
> Regardless of the configuration, it only gets 3 cores.
> I tried to change in two places:
> 1. conf/zeppelin-env.sh
> 2. interpreter.json
> In both of them I tried to set "num-executors": "10" but I still get 3
> (it's a 3-node cluster - I have 12 available cores).
>
> Any idea?
> Thanks!
> Ophir
>
> BTW
> I also tried to pass compression variables to be used in the underlying
> Hive metastore, with no luck either - it seems they are not passed through.
>



-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net