HyukjinKwon commented on code in PR #49053:
URL: https://github.com/apache/spark/pull/49053#discussion_r1868833587


##########
sql/connect/common/src/main/scala/org/apache/spark/sql/connect/client/SparkConnectClient.scala:
##########
@@ -620,8 +620,18 @@ object SparkConnectClient {
      * Configure the builder using the env SPARK_REMOTE environment variable.
      */
     def loadFromEnvironment(): Builder = {
+      lazy val isAPIModeConnect = Option(System.getProperty("spark.api.mode"))
+        .getOrElse("connect")
+        .toLowerCase(Locale.ROOT) == "connect"
       Option(System.getProperty("spark.remote")) // Set from Spark Submit
         .orElse(sys.env.get(SparkConnectClient.SPARK_REMOTE))
+        .orElse {
+          if (isAPIModeConnect) {
+            Option(System.getProperty("spark.master")).orElse(sys.env.get("MASTER"))

Review Comment:
   This is needed when we run it with shells. How it's done is consistent with the PySpark shell and the SparkR shell for now.
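   
   For reference, a minimal standalone sketch of the resolution order this hunk implements, outside the builder: `spark.remote` set by Spark Submit first, then the `SPARK_REMOTE` environment variable, and only in connect API mode a fallback to `spark.master`/`MASTER`, which is what the shell case relies on. `resolveRemote`, the parameter maps, and the example values below are hypothetical illustration, not code from this PR.
   
   ```scala
   import java.util.Locale
   
   // Standalone sketch of the remote-resolution order, not the actual
   // SparkConnectClient code. System properties and env are passed in as maps
   // so the example is runnable without a Spark installation.
   object RemoteResolutionSketch {
     def resolveRemote(
         sysProps: Map[String, String],
         env: Map[String, String]): Option[String] = {
       // "connect" is assumed to be the default API mode, matching the diff.
       lazy val isAPIModeConnect =
         sysProps.getOrElse("spark.api.mode", "connect")
           .toLowerCase(Locale.ROOT) == "connect"
   
       sysProps.get("spark.remote")           // set from Spark Submit
         .orElse(env.get("SPARK_REMOTE"))     // explicit Connect remote
         .orElse {
           // Shell case: fall back to the regular master settings, mirroring
           // how the PySpark and SparkR shells pick up the master today.
           if (isAPIModeConnect) {
             sysProps.get("spark.master").orElse(env.get("MASTER"))
           } else {
             None
           }
         }
     }
   
     def main(args: Array[String]): Unit = {
       // No spark.remote is set, so the master value is used in connect mode.
       val resolved = resolveRemote(
         sysProps = Map("spark.master" -> "local[*]"),
         env = Map.empty)
       println(resolved) // Some(local[*])
     }
   }
   ```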



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

