panbingkun commented on PR #46822:
URL: https://github.com/apache/spark/pull/46822#issuecomment-2141968024

   For comparison, I checked the corresponding behavior in Spark 3.5, as follows:
   <img width="333" alt="image" src="https://github.com/apache/spark/assets/15246973/d6160c92-bb0e-4af2-a7ab-1a87ca925b67">
   
   ```
   (base) ➜  spark-3.5.1-bin-hadoop3 sh bin/spark-shell --verbose
   Using properties file: 
/Users/panbingkun/Developer/spark/spark-client/spark-3.5.1-bin-hadoop3/conf/spark-defaults.conf
   24/05/31 20:22:21 WARN Utils: Your hostname, panbingkun.local resolves to a 
loopback address: 127.0.0.1; using 172.26.2.12 instead (on interface en0)
   24/05/31 20:22:21 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to 
another address
   Adding default property: spark.connect.grpc.binding.port=8888
   Adding default property: spark.connect.grpc.binding.address=127.0.0.1
   Adding default property: spark.master=local-cluster[2, 1, 1024]
   Parsed arguments:
     master                  local-cluster[2, 1, 1024]
     remote                  null
     deployMode              null
     executorMemory          null
     executorCores           null
     totalExecutorCores      null
     propertiesFile          
/Users/panbingkun/Developer/spark/spark-client/spark-3.5.1-bin-hadoop3/conf/spark-defaults.conf
     driverMemory            null
     driverCores             null
     driverExtraClassPath    null
     driverExtraLibraryPath  null
     driverExtraJavaOptions  null
     supervise               false
     queue                   null
     numExecutors            null
     files                   null
     pyFiles                 null
     archives                null
     mainClass               org.apache.spark.repl.Main
     primaryResource         spark-shell
     name                    Spark shell
     childArgs               []
     jars                    null
     packages                null
     packagesExclusions      null
     repositories            null
     verbose                 true
   
   Spark properties used, including those specified through
    --conf and those from the properties file 
/Users/panbingkun/Developer/spark/spark-client/spark-3.5.1-bin-hadoop3/conf/spark-defaults.conf:
     (spark.connect.grpc.binding.address,127.0.0.1)
     (spark.connect.grpc.binding.port,8888)
     (spark.master,local-cluster[2, 1, 1024])
   
   
   Main class:
   org.apache.spark.repl.Main
   Arguments:
   
   Spark config:
   (spark.app.name,Spark shell)
   (spark.app.submitTime,1717158141370)
   (spark.connect.grpc.binding.address,127.0.0.1)
   (spark.connect.grpc.binding.port,8888)
   (spark.jars,)
   (spark.master,local-cluster[2, 1, 1024])
   (spark.submit.deployMode,client)
   (spark.submit.pyFiles,)
   (spark.ui.showConsoleProgress,true)
   Classpath elements:
   
   
   
   Setting default log level to "WARN".
   To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).
   24/05/31 20:22:23 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
   24/05/31 20:22:23 WARN Utils: Service 'SparkUI' could not bind on port 4040. 
Attempting port 4041.
   Spark context Web UI available at http://172.26.2.12:4041
   Spark context available as 'sc' (master = local-cluster[2, 1, 1024], app id 
= app-20240531202223-0000).
   Spark session available as 'spark'.
   Welcome to
         ____              __
        / __/__  ___ _____/ /__
       _\ \/ _ \/ _ `/ __/  '_/
      /___/ .__/\_,_/_/ /_/\_\   version 3.5.1
         /_/
   
   Using Scala version 2.12.18 (OpenJDK 64-Bit Server VM, Java 17.0.10)
   Type in expressions to have them evaluated.
   Type :help for more information.
   
   scala>
   ```
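
   For reference, the three `Adding default property` lines in the output above imply a `conf/spark-defaults.conf` roughly like the following (reconstructed from the log; column alignment is illustrative only):

   ```
   spark.connect.grpc.binding.port      8888
   spark.connect.grpc.binding.address   127.0.0.1
   spark.master                         local-cluster[2, 1, 1024]
   ```

   All three then show up both under "Spark properties used" and in the final "Spark config" section, which is the behavior being compared here.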


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

