hasnain-db commented on code in PR #43220:
URL: https://github.com/apache/spark/pull/43220#discussion_r1346818789


##########
common/network-common/src/main/java/org/apache/spark/network/util/TransportConf.java:
##########
@@ -257,6 +258,159 @@ public int sslShuffleChunkSize() {
       conf.get("spark.network.ssl.maxEncryptedBlockSize", "64k")));
   }
 
+  /**
+   * Whether Secure (SSL/TLS) RPC (including Block Transfer Service) is enabled
+   */
+  public boolean sslRpcEnabled() {
+    return conf.getBoolean("spark.ssl.rpc.enabled", false);
+  }

Review Comment:
   We support both of these.
   
   If you'd like to use the JDK SSL provider, we require a `keyStore` argument, 
which takes the key used by the server, while the `trustStore` is used to 
verify the peers we talk to.
   
   If the OpenSSL provider is used, we require the `privateKey` and `certChain` 
arguments, which serve similar purposes.
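   To make the two setups concrete, here is a minimal sketch of what each 
provider's configuration might look like. This is an illustration only: the 
`spark.ssl.rpc.*` key names follow Spark's SSL namespace convention, and the 
paths/passwords are placeholders, so the exact names should be checked against 
the reference PR.
   
   ```properties
   # Common: enable SSL for the RPC namespace
   spark.ssl.rpc.enabled             true
   
   # Option A - JDK SSL provider: JKS key store (server key) + trust store (peer verification)
   spark.ssl.rpc.keyStore            /path/to/server-keystore.jks
   spark.ssl.rpc.keyStorePassword    changeit
   spark.ssl.rpc.trustStore          /path/to/truststore.jks
   spark.ssl.rpc.trustStorePassword  changeit
   
   # Option B - OpenSSL provider: PEM private key + certificate chain instead
   # spark.ssl.rpc.privateKey        /path/to/server-key.pem
   # spark.ssl.rpc.certChain         /path/to/cert-chain.pem
   ```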
   
   You can look at the `SSLFactory` in the reference PR 
https://github.com/apache/spark/pull/42685 (I was going to put it up after this 
PR and one more) to see how it's used.
   
   The tests in that reference PR use self-signed certificates and work fine. 
   
   Does that answer your question? Happy to clarify as needed.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

