You should distribute your configuration files to all workers and set the
appropriate environment variables on each machine, such as HADOOP_HOME,
SPARK_HOME, HADOOP_CONF_DIR, and SPARK_CONF_DIR.

On Mon, Apr 27, 2015 at 12:56 PM James King <jakwebin...@gmail.com> wrote:

> I renamed spark-defaults.conf.template to spark-defaults.conf
> and invoked
>
> spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh
>
> But I still get
>
> failed to launch org.apache.spark.deploy.worker.Worker:
>     --properties-file FILE   Path to a custom Spark properties file.
>                              Default is conf/spark-defaults.conf.
>
> But I'm thinking it should pick up the default spark-defaults.conf from the
> conf dir.
>
> Am I expecting the wrong behavior, or doing something wrong?
>
> Regards
> jk
>
