hvanhovell commented on code in PR #47434:
URL: https://github.com/apache/spark/pull/47434#discussion_r1689030629


##########
core/src/main/scala/org/apache/spark/deploy/PythonRunner.scala:
##########
@@ -74,6 +77,16 @@ object PythonRunner {
     // Launch Python process
     val builder = new ProcessBuilder((Seq(pythonExec, formattedPythonFile) ++ otherArgs).asJava)
     val env = builder.environment()
+    if (!Utils.isLocalRemote(sparkConf)) {
+      // For non-local remotes, pass configurations through environment variables so
+      // the Spark Connect client can set them. For local remotes, they will be set
+      // via Py4J.
+      // For PySpark specifically, we can't send other configurations through
+      // properties, so here we send them together.
+      val confs = sparkConf.getAll.filter(p =>

Review Comment:
   Where are these confs being used, on the client or on the server side? You don't have to pass them if they are only needed on the server side.
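   
   For reference, a minimal sketch of the pattern the (truncated) hunk appears to follow: filter the confs and hand them to the launched Python process through a single environment variable that the Spark Connect client could read at startup. The "spark." prefix predicate and the PYSPARK_REMOTE_INIT_CONF variable name below are illustrative assumptions, not the PR's actual choices.
   
   ```scala
   import scala.jdk.CollectionConverters._
   import org.apache.spark.SparkConf
   
   // Sketch only: the filter predicate and the env var name are assumptions.
   def launchWithConfs(sparkConf: SparkConf, pythonExec: String, args: Seq[String]): Process = {
     val builder = new ProcessBuilder((pythonExec +: args).asJava)
     val env = builder.environment()
     // Keep only keys the client side may act on; server-only confs would not need to be passed.
     val confs = sparkConf.getAll.filter { case (k, _) => k.startsWith("spark.") }
     // Serialize into one env var; the Python side would parse and apply these at client init.
     env.put("PYSPARK_REMOTE_INIT_CONF", confs.map { case (k, v) => s"$k=$v" }.mkString("\u0000"))
     builder.start()
   }
   ```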



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

