AngersZhuuuu opened a new pull request #31680: URL: https://github.com/apache/spark/pull/31680
### What changes were proposed in this pull request?
When a SparkContext has already been initialized and we then call `SparkSession.builder.enableHiveSupport().getOrCreate()`, the SparkSession we get back does not have Hive support, because the existing SparkContext's `spark.sql.catalogImplementation` conf is never reset. Likewise, when an existing SparkSession or the default SparkSession is reused, we need to check that its catalog implementation matches the requested one.

This PR does two things:
1. When a SparkContext already exists, reset `spark.sql.catalogImplementation` according to the current builder configuration.
2. When reusing an existing SparkSession or the default session, check that the catalog implementation matches.

### Why are the changes needed?
We should respect `enableHiveSupport`.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
WIP
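
A minimal sketch of the scenario this PR addresses, assuming a local master and an app name chosen only for illustration: a SparkContext is created first without the Hive catalog conf, and a later `enableHiveSupport()` request is silently ignored because the existing conf is reused as-is.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

// A SparkContext is created first, without spark.sql.catalogImplementation set to "hive".
val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("repro"))

// Later, a SparkSession is requested with Hive support. Before this change the
// existing SparkContext's conf is reused as-is, so the session's catalog stays
// "in-memory" and enableHiveSupport() has no effect.
val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

// Before the fix this prints "in-memory"; with the fix it should be "hive".
println(spark.conf.get("spark.sql.catalogImplementation"))
```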
