hvanhovell commented on code in PR #47434:
URL: https://github.com/apache/spark/pull/47434#discussion_r1733450435


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -859,6 +865,49 @@ object SparkSession extends Logging {
     }
   }
 
+  /**
+   * Create a new Spark Connect server to connect locally.
+   */
+  private[sql] def withLocalConnectServer[T](f: => T): T = {
+    synchronized {
+      val remoteString = initOptions

Review Comment:
   Can you give an example of what `spark.remote` is supposed to contain? 
Instead of a Spark Connect connection string, it seems to contain the Spark 
master URL. I don't really like overloading configuration like this. This is 
also illustrated a little further down, where we end up reconfiguring this 
value.
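
   For reference, here is a minimal sketch of the difference I mean; the 
values are illustrative, not taken from this PR:

```scala
// What I would expect spark.remote to hold: a Spark Connect connection string,
// i.e. the sc:// scheme (15002 is the default Connect port).
val remote = "sc://localhost:15002"

// What it appears to hold here: a master URL, which belongs in spark.master.
val master = "local[*]"   // or "spark://host:7077", "yarn", ...
```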
   
   In a perfect world I think we should not start the driver process here; 
IMO that belongs in the launcher. The more pressing downside is that submit 
now only works for the languages we directly support, which conflicts with 
the language-agnostic nature of Connect. If we move this to the launcher, 
this code can disappear :)...
   
   For now, I would just make the launcher set the normal configurations and 
(optionally) set `spark.remote` to what it is actually supposed to be. If the 
master conf is present, we start a driver process.
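
   Roughly what I have in mind, as a sketch under the assumption that the 
launcher passes these as system properties (the names and the helper logic 
below are illustrative, not the actual PR code):

```scala
// The launcher sets the normal configs; the client only starts a local
// driver / Connect server when a master conf is present.
private[sql] def withLocalConnectServer[T](f: => T): T = synchronized {
  val master = sys.props.get("spark.master")   // e.g. "local[*]", set by the launcher
  if (master.isDefined) {
    // ... start the local driver / Connect server here, and point spark.remote
    // (e.g. "sc://localhost:15002") at it if the launcher did not already set it ...
  }
  f
}
```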



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

