You need to set --total-executor-cores to limit the total number of cores the 
application grabs on the cluster. --executor-cores only sets the cores for each 
individual executor; Spark will still try to launch as many executors as it can.
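For example, in Spark standalone mode the two flags together would look something like this (the core counts here are just illustrative — pick values that divide your cluster fairly between users):

```shell
# Cap this shell at 8 cores total across the whole cluster,
# with at most 4 cores on any single executor.
# Without --total-executor-cores, the first spark-shell grabs
# every available core and later applications sit in WAITING.
spark-shell \
  --master spark://your-master-host:7077 \
  --total-executor-cores 8 \
  --executor-cores 4
```

The same cap can be set cluster-wide via the spark.cores.max property (e.g. in spark-defaults.conf), so individual users don't have to remember the flag.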

Matei

On Oct 1, 2014, at 4:29 PM, Sanjay Subramanian 
<sanjaysubraman...@yahoo.com.INVALID> wrote:

> hey guys
> 
> I am using  spark 1.0.0+cdh5.1.0+41
> When two users try to run "spark-shell", the first user's spark-shell shows
> active in the 18080 web UI but the second user's shows WAITING; the second
> shell prints a bunch of errors but does get to the spark-shell prompt, and
> "sc.master" seems to point to the correct master.
> 
> I tried controlling the number of cores in the "spark-shell" command with
> --executor-cores 8, but that does not work.
> 
> thanks
> 
> sanjay 
> 
> 
