Hi all,

I'm trying to lock down ALL Spark ports, and have tried doing so both via spark-defaults.conf and via the SparkContext. (The example below was run in local[*] mode, but running in local mode or submitting a jar to the cluster with spark-submit.sh all produce the same results.)

My goal is for all communication between the driver and workers to use ports 50000-50006 and not use any random ports (the other well-known ports, 7077, 4040, etc., are fine). But I am still seeing random ports being assigned by Akka, and I need to pin down every port the application uses because of strict security requirements. In my SparkContext I've defined (snippet):

    set("spark.driver.port", "50001")
    set("spark.fileserver.port", "50002")
    set("spark.broadcast.port", "50003")
    set("spark.replClassServer.port", "50004")
    set("spark.blockManager.port", "50005")
    set("spark.executor.port", "50006")

On execution I can see these values being read correctly in the UI, but random port assignments still show up in the logs:

    . . .
    <This is valid and what I'm expecting>
    Remoting started; listening on addresses :[akka.tcp://spark@10.x.x.x:50001]
    Remoting: Remoting now listens on addresses: [akka.tcp://spark@10.x.x.x:50001]
    . . .
    <This isn't being set>
    ConnectionManager: Bound socket to port 54061 with id = ConnectionManagerId(10.x.x.x,54061)
    BlockManagerMaster: Trying to register BlockManager
    BlockManagerInfo: Registering block manager 10.x.x.x:54061 with 2.1 GB RAM
    BlockManagerMaster: Registered BlockManager
    HttpServer: Starting HTTP Server
    HttpBroadcast: Broadcast server started at http://10.x.x.x:54062

I defined a block manager port, but it simply isn't taking effect. What else can I try to resolve this? I can also see in the UI that the following are not being set properly:

    spark.fileserver.uri
    spark.httpBroadcast.uri

Thanks in advance for your time reviewing/answering my post.

Regards,
Dan

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Port-Configuration-tp20839.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
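For what it's worth, the same pinning can also be expressed in conf/spark-defaults.conf rather than programmatically on the SparkContext. A minimal sketch, using the same property names and port numbers as the snippet above (these are the 1.x-era property names; spark.port.maxRetries is an additional, optional setting that limits how many successive ports Spark will try if the configured one is busy):

```
# conf/spark-defaults.conf -- sketch only; same properties/ports as the
# SparkContext snippet above, applied at submit time instead of in code.
spark.driver.port          50001
spark.fileserver.port      50002
spark.broadcast.port       50003
spark.replClassServer.port 50004
spark.blockManager.port    50005
spark.executor.port        50006

# Optional: cap how far Spark will walk up from a configured port
# when it is already in use (0 would fail fast instead of retrying).
spark.port.maxRetries      0
```

Whether set here or in code, the values should be visible on the Environment tab of the UI, which matches what I'm already seeing for the settings above.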
My goal is to define all communication between the driver and worker to use 50000-50006 and to not use any random ports. (all other ports, 7077, 4040, etc are ok). But, I am still seeing random ports being generated from akka and would like to define all ports for the spark application due to strict security that is needed. As part of my SparkContext, I've defined: (snippet) set("spark.driver.port", "50001"). set("spark.fileserver.port", "50002"). set("spark.broadcast.port", "50003"). set("spark.replClassServer.port", "50004"). set("spark.blockManager.port", "50005"). set("spark.executor.port", "50006"). And upon execution, I see the following being read into the UI correctly, but am still seeing random port assignments: . . . <This is valid and what I'm expecting> Remoting started; listening on addresses :[akka.tcp://spark@10.x.x.x:50001] Remoting: Remoting now listens on addresses: [akka.tcp://spark@10.x.x.x:50001] . . . <This isn't being set> ConnectionManager: Bound socket to port 54061 with id = ConnectionManagerId(10.x.x.x,54061) BlockManagerMaster: Trying to register BlockManager BlockManagerInfo: Registering block manager 10.x.x.x:54061 with 2.1 GB RAM BlockManagerMaster: Registered BlockManager HttpServer: Starting HTTP Server HttpBroadcast: Broadcast server started at http://10.x.x.x:54062 I defined a block manager port, but it simply isn't getting set, what else can I try to get this resolved? I can also see in the UI that are not being set properly. spark.fileserver.uri spark.httpBroadcast.uri Thanks for your time in reviewing/answering my post in advance. Regards, Dan -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Port-Configuration-tp20839.html Sent from the Apache Spark User List mailing list archive at Nabble.com. --------------------------------------------------------------------- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org