[ https://issues.apache.org/jira/browse/SPARK-25078?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16581781#comment-16581781 ]

Bo Meng commented on SPARK-25078:
---------------------------------

What is your suggestion to improve this?

> Standalone does not work with spark.authenticate.secret and 
> deploy-mode=cluster
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-25078
>                 URL: https://issues.apache.org/jira/browse/SPARK-25078
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 2.4.0
>            Reporter: Imran Rashid
>            Priority: Major
>
> When running a Spark standalone cluster with spark.authenticate.secret set 
> up, you cannot submit a program in cluster mode, even with the right secret.  
> The driver fails with:
> {noformat}
> 18/08/09 08:17:21 INFO SecurityManager: SecurityManager: authentication 
> enabled; ui acls disabled; users  with view permissions: Set(systest); groups 
> with view permissions: Set(); users  with modify permissions: Set(systest); 
> groups with modify permissions: Set()
> 18/08/09 08:17:21 ERROR SparkContext: Error initializing SparkContext.
> java.lang.IllegalArgumentException: requirement failed: A secret key must be 
> specified via the spark.authenticate.secret config.
>         at scala.Predef$.require(Predef.scala:224)
>         at 
> org.apache.spark.SecurityManager.initializeAuth(SecurityManager.scala:361)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:238)
>         at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
>         at 
> org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
> ...
> {noformat}
> but it's actually doing the wrong check in 
> {{SecurityManager.initializeAuth()}}.  The secret is there, it's just in an 
> environment variable {{_SPARK_AUTH_SECRET}} (so it's not visible to another 
> process).
> *Workaround*: In your program, you can pass in a dummy secret to your Spark 
> conf.  Its value doesn't matter at all; it is ignored later, and the secret 
> from the env variable is used when establishing connections.  E.g.
> {noformat}
> val conf = new SparkConf()
> conf.setIfMissing("spark.authenticate.secret", "doesn't matter")
> val sc = new SparkContext(conf)
> {noformat}
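One possible direction for a fix (my sketch, not an actual Spark patch): make the check in {{SecurityManager.initializeAuth()}} fall back to the {{_SPARK_AUTH_SECRET}} environment variable before failing. The helper name and signature below are made up for illustration; only the config name and env variable come from the report above.

```scala
// Hypothetical fallback logic: prefer an explicitly configured secret,
// otherwise read the _SPARK_AUTH_SECRET environment variable, and only
// fail if neither is present.
def resolveSecret(confSecret: Option[String],
                  env: Map[String, String]): String = {
  confSecret
    .orElse(env.get("_SPARK_AUTH_SECRET"))
    .getOrElse(throw new IllegalArgumentException(
      "A secret key must be specified via the spark.authenticate.secret config."))
}
```

With this shape, a driver launched in cluster mode (secret in the env variable only) would pass the check, while a setup with neither source still fails loudly.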



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
