Michael Gummelt created SPARK-17002:
---------------------------------------

             Summary: Document that spark.ssl.protocol is required for SSL
                 Key: SPARK-17002
                 URL: https://issues.apache.org/jira/browse/SPARK-17002
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 2.0.0, 1.6.2
            Reporter: Michael Gummelt


cc [~jlewandowski]

I was trying to start the Spark master.  When I set 
{{spark.ssl.enabled=true}} but fail to set {{spark.ssl.protocol}}, I get 
this none-too-helpful error message:

{code}
16/08/10 15:17:50 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mgummelt); users with modify permissions: Set(mgummelt)
16/08/10 15:17:50 WARN SecurityManager: Using 'accept-all' trust manager for SSL connections.
Exception in thread "main" java.security.KeyManagementException: Default SSLContext is initialized automatically
        at sun.security.ssl.SSLContextImpl$DefaultSSLContext.engineInit(SSLContextImpl.java:749)
        at javax.net.ssl.SSLContext.init(SSLContext.java:282)
        at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:284)
        at org.apache.spark.deploy.master.Master$.startRpcEnvAndEndpoint(Master.scala:1121)
        at org.apache.spark.deploy.master.Master$.main(Master.scala:1106)
        at org.apache.spark.deploy.master.Master.main(Master.scala)
{code}

We should document that {{spark.ssl.protocol}} is required, and throw a more 
helpful error message when it isn't set.  In fact, we should remove the 
{{getOrElse}} here: 
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SecurityManager.scala#L285,
 since the following line ({{sslContext.init}}) fails when the protocol falls back to "Default".
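
As a rough sketch of the fix (hypothetical helper; the real change would live in {{SecurityManager}}, and the {{Option[String]}} parameter stands in for {{SSLOptions.protocol}}): fail fast with a clear message instead of falling back to "Default", since {{SSLContext.getInstance("Default")}} returns the auto-initialized default context whose {{init()}} throws {{KeyManagementException}}.

{code}
import javax.net.ssl.SSLContext

// Hypothetical sketch: require an explicit protocol rather than
// defaulting to "Default", which cannot be re-initialized.
def createSslContext(protocol: Option[String]): SSLContext = {
  val proto = protocol.getOrElse(
    throw new IllegalArgumentException(
      "spark.ssl.protocol is required when spark.ssl.enabled is true"))
  SSLContext.getInstance(proto)
}
{code}

With this, a missing {{spark.ssl.protocol}} surfaces as an explicit configuration error at startup instead of an opaque {{KeyManagementException}} from the JDK.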



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
