Hi,

I upgraded to Spark 1.3.1, and after running a job through the interpreter I see that only 3 virtual cores are used, even after specifying the number of cores in zeppelin-env.sh via "export ZEPPELIN_JAVA_OPTS ".
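For context, the setting I'm referring to takes roughly this form (the values below are placeholders, and I'm assuming the standard Spark-on-YARN properties spark.executor.instances and spark.executor.cores passed through as -D system properties):

    # zeppelin-env.sh -- illustrative values only
    # spark.executor.instances: number of YARN executors to request
    # spark.executor.cores: virtual cores per executor
    export ZEPPELIN_JAVA_OPTS="-Dspark.executor.instances=4 -Dspark.executor.cores=4"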

Is there a place where I can set the number of cores correctly? I am using Spark on YARN.


Thanks in advance for your pointers.


Regards,
Sambit.
