Did you `export` the environment variables? Also, are you running in client
mode or cluster mode? If it still doesn't work, you can try setting these
through the spark-submit command-line options --num-executors,
--executor-cores, and --executor-memory.
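
For example, in conf/spark-env.sh the variables need to be exported so the
launcher scripts actually pass them along:

    # conf/spark-env.sh
    export SPARK_EXECUTOR_INSTANCES=5
    export SPARK_EXECUTOR_CORES=1
    export SPARK_EXECUTOR_MEMORY=3G

Or, roughly equivalently, on the spark-submit command line (the application
class and jar below are just placeholders):

    ./bin/spark-submit \
      --master yarn-cluster \
      --num-executors 5 \
      --executor-cores 1 \
      --executor-memory 3G \
      --class com.example.MyApp \
      my-app.jar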

2014-10-23 19:25 GMT-07:00 firemonk9 <dhiraj.peech...@gmail.com>:

> Hi,
>
>    I am facing the same problem. My spark-env.sh has the entries below, yet
> I see the YARN containers with only 1G, and YARN only spawns two workers.
>
> SPARK_EXECUTOR_CORES=1
> SPARK_EXECUTOR_MEMORY=3G
> SPARK_EXECUTOR_INSTANCES=5
>
> Please let me know if you are able to resolve this issue.
>
> Thank you
