HyukjinKwon commented on a change in pull request #35410:
URL: https://github.com/apache/spark/pull/35410#discussion_r806358814
##########
File path: python/pyspark/sql/context.py
##########
@@ -714,14 +711,13 @@ def __init__(self, sparkContext: SparkContext, jhiveContext: Optional[JavaObject
             + "SparkSession.builder.enableHiveSupport().getOrCreate() instead.",
             FutureWarning,
         )
+        static_conf = {}
         if jhiveContext is None:
-            sparkContext._conf.set(  # type: ignore[attr-defined]
-                "spark.sql.catalogImplementation", "hive"
-            )
-            sparkSession = SparkSession.builder._sparkContext(sparkContext).getOrCreate()
-        else:
-            sparkSession = SparkSession(sparkContext, jhiveContext.sparkSession())
-        SQLContext.__init__(self, sparkContext, sparkSession, jhiveContext)
+            static_conf = {"spark.sql.catalogImplementation": "in-memory"}
Review comment:
Shoot. I pushed a wrong change at the last minute - I was manually
testing whether this value is set correctly. It has to be `hive`. The tests use an
existing session that has `hive` set by default, so they aren't affected, but this
would impact the shell if users use `HiveContext` directly without an
existing Spark session.
Sorry, this was my mistake. I will make a quick follow-up.
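For reference, a minimal sketch of the corrected logic, with `hive` in place of the mistaken `in-memory`. The helper function here is hypothetical, just to isolate the branch from the diff above; in the actual patch this is inline in `HiveContext.__init__`:

```python
def hive_static_conf(jhiveContext=None):
    # Hypothetical helper mirroring the diff's branch: when no existing
    # JVM HiveContext is passed in, the new session must be created with
    # the Hive catalog enabled -- "hive", not "in-memory".
    static_conf = {}
    if jhiveContext is None:
        static_conf = {"spark.sql.catalogImplementation": "hive"}
    return static_conf
```

With an existing `jhiveContext`, the wrapped session already carries its own catalog setting, so `static_conf` stays empty.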
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]