Re: Spark authenticate enablement

2014-09-18 Thread Andrew Or
Andrew Or and...@databricks.com, 2014/09/17 02:06, To: Tom Graves tgraves...@yahoo.com, Cc: Jun Feng Liu/China/IBM@IBMCN, dev@spark.apache.org, Subject: Re: Spark authenticate enablement. Hi Jun, You can still set the authentication variables through `spark-env.sh`…

Re: Spark authenticate enablement

2014-09-16 Thread Andrew Or
Hi Jun, You can still set the authentication variables through `spark-env.sh`, by exporting SPARK_MASTER_OPTS, SPARK_WORKER_OPTS, SPARK_HISTORY_OPTS, etc. to include -Dspark.auth.{...}. There is an open pull request that allows these processes to also read from spark-defaults.conf, but this is not…
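
For illustration, a minimal sketch of what such a `spark-env.sh` could look like, assuming `-Dspark.auth.{...}` refers to the standard `spark.authenticate` / `spark.authenticate.secret` properties; the secret value below is a placeholder:

    # spark-env.sh (sketch): pass the auth settings to each standalone
    # daemon as JVM system properties, since these daemons do not read
    # spark-defaults.conf
    export SPARK_MASTER_OPTS="-Dspark.authenticate=true -Dspark.authenticate.secret=mysecret"
    export SPARK_WORKER_OPTS="-Dspark.authenticate=true -Dspark.authenticate.secret=mysecret"
    export SPARK_HISTORY_OPTS="-Dspark.authenticate=true -Dspark.authenticate.secret=mysecret"

The same secret value has to appear in every daemon's options for the processes to authenticate to one another.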

Re: Spark authenticate enablement

2014-09-16 Thread Jun Feng Liu
…Dist. Haidian, Beijing 100193, China. Andrew Or and...@databricks.com, 2014/09/17 02:06, To: Tom Graves tgraves...@yahoo.com, Cc: Jun Feng Liu/China/IBM@IBMCN, dev@spark.apache.org, Subject: Re: Spark authenticate enablement. Hi Jun, You can still set the authentication…

Re: Spark authenticate enablement

2014-09-15 Thread Tom Graves
Spark authentication does work in standalone mode (at least it did; I haven't tested it in a while). The same shared secret has to be set on all the daemons (master and workers) and then also in the configs of any applications submitted. Since everyone shares the same secret, it's by no means…
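
As a concrete sketch of "also in the configs of any applications submitted", assuming the standard `spark.authenticate` properties and a placeholder secret, the application-side `spark-defaults.conf` would carry the same values the daemons were started with:

    # spark-defaults.conf (sketch, application side): the secret must
    # match the one configured on the master and workers
    spark.authenticate         true
    spark.authenticate.secret  mysecret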

Re: Spark authenticate enablement

2014-09-12 Thread Sandy Ryza
Hi Jun, I believe that's correct that Spark authentication only works against YARN. -Sandy On Thu, Sep 11, 2014 at 2:14 AM, Jun Feng Liu liuj...@cn.ibm.com wrote: Hi there, I am trying to enable authentication on Spark in standalone mode. It seems that only SparkSubmit loads the…

Spark authenticate enablement

2014-09-11 Thread Jun Feng Liu
Hi there, I am trying to enable authentication on Spark in standalone mode. It seems that only SparkSubmit loads the properties from spark-defaults.conf; org.apache.spark.deploy.master.Master does not really load the default settings from spark-defaults.conf. Does it mean that Spark…
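
To illustrate the asymmetry being described, here is a sketch of the application-side path, which does go through SparkSubmit and therefore does pick up the authentication properties (the master URL, application jar, and secret are placeholders; `--conf` is the standard spark-submit flag for setting Spark properties):

    # Sketch: SparkSubmit reads spark-defaults.conf and --conf flags,
    # so the application side can be configured like this...
    spark-submit \
      --master spark://master-host:7077 \
      --conf spark.authenticate=true \
      --conf spark.authenticate.secret=mysecret \
      my-app.jar
    # ...whereas the standalone Master/Worker daemons, per the replies
    # above, only see these settings if they are passed through
    # SPARK_MASTER_OPTS / SPARK_WORKER_OPTS in spark-env.sh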