HyukjinKwon commented on code in PR #50006:
URL: https://github.com/apache/spark/pull/50006#discussion_r1962488891


##########
python/pyspark/sql/connect/session.py:
##########
@@ -1041,11 +1045,17 @@ def _start_connect_server(master: str, opts: Dict[str, Any]) -> None:
             init_opts.update(opts)
             opts = init_opts
 
+            token = str(uuid.uuid4())
+
             # Configurations to be overwritten
             overwrite_conf = opts
             overwrite_conf["spark.master"] = master
             overwrite_conf["spark.local.connect"] = "1"
+            # When running a local server, always use an ephemeral port
+            overwrite_conf["spark.connect.grpc.binding.port"] = "0"
+            overwrite_conf["spark.connect.authenticate.token"] = token
             os.environ["SPARK_LOCAL_CONNECT"] = "1"
+            os.environ["SPARK_CONNECT_AUTHENTICATE_TOKEN"] = token

Review Comment:
   You're making the same mistake as I did :-). Individual Spark tests do not actually terminate the JVM; they only stop the SparkContext via `SparkSession.stop`. After that, when you create a SparkContext (via SparkSession) against the Spark Connect server here, a new token is generated. Meanwhile, the previously set `SPARK_CONNECT_AUTHENTICATE_TOKEN` is still applied to the already-running Spark Connect server, so you end up with mismatched tokens.
   
   To work around this problem, you should either create the token once (as in https://github.com/apache/spark/pull/49880), or keep it in a variable that is reset on `SparkSession.stop`.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

