Hi,

This is for Spark 1.6.0-SNAPSHOT (SHA1
a96ba40f7ee1352288ea676d8844e1c8174202eb).

I've been toying with a Spark Standalone cluster and have the following
file in conf/spark-env.sh:

➜  spark git:(master) ✗ cat conf/spark-env.sh
SPARK_WORKER_CORES=2
SPARK_WORKER_MEMORY=2g

# multiple Spark worker processes on a machine
SPARK_WORKER_INSTANCES=2

The cluster works fine with this setup, and the setting itself is documented at
https://spark.apache.org/docs/latest/spark-standalone.html.

So far so good.

Just today I saw the following when I executed `spark-submit`:

=============
15/09/23 00:48:26 WARN SparkConf:
SPARK_WORKER_INSTANCES was detected (set to '2').
This is deprecated in Spark 1.0+.

Please instead use:
 - ./spark-submit with --num-executors to specify the number of executors
 - Or set SPARK_EXECUTOR_INSTANCES
 - spark.executor.instances to configure the number of instances in
the spark config.
=============
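
For reference, the alternatives the warning points to would look roughly
like this (the master URL, class name, and jar below are just
placeholders, not my actual setup):

  # per submit, on the command line
  ./bin/spark-submit \
    --master spark://localhost:7077 \
    --num-executors 2 \
    --class com.example.MyApp my-app.jar

  # or in conf/spark-defaults.conf
  spark.executor.instances  2

  # or in conf/spark-env.sh
  SPARK_EXECUTOR_INSTANCES=2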

Why the deprecation? Is it no longer supported (or at least not
recommended, given the message) to run a Spark Standalone cluster and
execute spark-submit on the same machine?

Regards,
Jacek

--
Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
