AngersZhuuuu commented on a change in pull request #31680:
URL: https://github.com/apache/spark/pull/31680#discussion_r584497705



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
##########
@@ -946,7 +946,11 @@ object SparkSession extends Logging {
           SparkContext.getOrCreate(sparkConf)
          // Do not update `SparkConf` for existing `SparkContext`, as it's shared by all sessions.
         }
-
+        // We should reset `spark.sql.catalogImplementation` according to the current requirement.
+        if (sparkContext.conf.get(CATALOG_IMPLEMENTATION) == "in-memory" &&
+          sparkConf.get(CATALOG_IMPLEMENTATION) == "hive") {

Review comment:
       > > I think if `sparkContext.conf.get(CATALOG_IMPLEMENTATION) == "hive"`, the user has set this before starting the SparkSession.
   > 
   > `CATALOG_IMPLEMENTATION` is a static configuration, so can we change a value that the user set before starting the SparkContext?
   
   My wording may have been inaccurate. What I mean is: when we call `getOrCreate()` and the existing `SparkContext.conf`'s `CATALOG_IMPLEMENTATION` is already `hive`, the user has decided to use Hive support.
   For example, what if we start a SparkContext, call `HiveUtils.withHiveExternalCatalog()` on it, and then use that SparkContext to start a SparkSession?
   Our test framework seems to be built this way, and you can see my test results before the last finished run.
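   The scenario above can be sketched roughly as follows (a minimal sketch, assuming a local master and that the `spark-hive` module is on the classpath; the app name and master are illustrative only):
   
   ```scala
   import org.apache.spark.{SparkConf, SparkContext}
   import org.apache.spark.sql.SparkSession
   import org.apache.spark.sql.hive.HiveUtils
   
   // Start a SparkContext first, without setting any catalog implementation.
   val conf = new SparkConf().setMaster("local[1]").setAppName("catalog-demo")
   val sc = SparkContext.getOrCreate(conf)
   
   // Flip the existing context's conf to the Hive external catalog,
   // the way the test framework does before building a session.
   HiveUtils.withHiveExternalCatalog(sc)
   
   // A SparkSession built on this context now sees
   // spark.sql.catalogImplementation == "hive", even though the
   // session builder itself never set it.
   val spark = SparkSession.builder().getOrCreate()
   println(spark.conf.get("spark.sql.catalogImplementation"))
   ```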




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


