Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/560#issuecomment-46594901
So the idea is that you can set Spark configs via SPARK_JAVA_OPTS in order
to be backwards compatible with 0.9. Before SparkConf
(conf/spark-defaults.conf) and spark-submit existed, the only way to specify
Spark configs was through SPARK_JAVA_OPTS on the command line or through
spark-env.sh. That is what I've been looking at.
My example, which makes it pretty easy to see whether this works, is to set:
SPARK_JAVA_OPTS="-Dspark.authenticate=true -Dspark.ui.acls.enable=true"
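For comparison, the post-0.9 equivalent would be to put the same settings in
conf/spark-defaults.conf, something like:
spark.authenticate true
spark.ui.acls.enable true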
Run one of the Spark examples (SparkPi):
./bin/spark-submit --master yarn --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  examples/target/scala-2.10/spark-examples-1.1.0-SNAPSHOT-hadoop0.23.9.jar
Hopefully people using spark-submit will have converted to configs, so
testing with spark-class is more important for those who haven't converted:
./bin/spark-class org.apache.spark.deploy.yarn.Client \
  --jar examples/target/scala-2.10/spark-examples-1.1.0-SNAPSHOT-hadoop0.23.9.jar \
  --class org.apache.spark.examples.SparkPi
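For a quick one-off test you can also set the variable inline for just that
command, e.g.:
SPARK_JAVA_OPTS="-Dspark.authenticate=true -Dspark.ui.acls.enable=true" \
./bin/spark-class org.apache.spark.deploy.yarn.Client \
  --jar examples/target/scala-2.10/spark-examples-1.1.0-SNAPSHOT-hadoop0.23.9.jar \
  --class org.apache.spark.examples.SparkPi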
With this change it fails with the error I listed; without this change it
works, and you can view the logs and see that the configs take effect:
14/06/19 17:54:34 INFO SecurityManager: SecurityManager: authentication
enabled; ui acls enabled; users with view permissions: Set(yarn, tgraves)
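To pull the logs after the run, assuming YARN log aggregation is enabled and
<appId> is the application ID the client prints, something like:
yarn logs -applicationId <appId> | grep SecurityManager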