dongjoon-hyun commented on code in PR #48755:
URL: https://github.com/apache/spark/pull/48755#discussion_r1828839479


##########
sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -889,6 +889,9 @@ object SparkSession extends api.BaseSparkSessionCompanion 
with Logging {
 
         // No active nor global default session. Create a new one.
         val sparkContext = userSuppliedContext.getOrElse {
+          // Override appName with the submitted appName
+          sparkConf.getOption("spark.submit.appName")

Review Comment:
   Thank you for review, @viirya .
   
   1. The `SparkContext`-generated random app name below is applied only when `spark.app.name` is not set by the user through any means (`-c`, `-D`, or `SparkSession.builder.appName`).
     - The random name is not used when `spark.app.name` is given, as in the following example. Moreover, in that case SREs or system admins cannot override the statically compiled appName, `Spark Pi`, via `-c` or `-D`, because that setting is applied on the last code path.
   
   
https://github.com/apache/spark/blob/0d2d031c2d907393ad6933677ea90ec95a652d50/examples/src/main/scala/org/apache/spark/examples/SparkPi.scala#L28-L31
   
   2. `spark.submit.appName` is a newly proposed configuration which allows SREs and admins to override the compile-time `appName`, as in the case above.
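   To illustrate the intended precedence, here is a minimal, self-contained Scala sketch (not the actual Spark code; the object and method names are hypothetical): the proposed `spark.submit.appName` wins over the application-supplied `spark.app.name`, and a random name is used only when neither is set.

   ```scala
   // Hypothetical sketch of the proposed app-name precedence.
   object AppNameOverride {
     def resolveAppName(conf: Map[String, String]): String =
       conf.get("spark.submit.appName")      // proposed admin/SRE override
         .orElse(conf.get("spark.app.name")) // name set by the application (e.g. "Spark Pi")
         .getOrElse(s"app-${java.util.UUID.randomUUID()}") // random fallback, as SparkContext does

     def main(args: Array[String]): Unit = {
       // Only the compiled-in name is set: it is used as-is.
       println(resolveAppName(Map("spark.app.name" -> "Spark Pi")))
       // Admin supplies spark.submit.appName (e.g. via -c/-D): it overrides "Spark Pi".
       println(resolveAppName(Map(
         "spark.app.name" -> "Spark Pi",
         "spark.submit.appName" -> "Admin Override")))
     }
   }
   ```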



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
