Github user RussellSpitzer commented on a diff in the pull request:
https://github.com/apache/spark/pull/21990#discussion_r210428804
--- Diff: python/pyspark/sql/session.py ---
@@ -218,7 +218,9 @@ def __init__(self, sparkContext, jsparkSession=None):
                     .sparkContext().isStopped():
             jsparkSession = self._jvm.SparkSession.getDefaultSession().get()
         else:
-            jsparkSession = self._jvm.SparkSession(self._jsc.sc())
+            jsparkSession = self._jvm.SparkSession.builder() \
+                .sparkContext(self._jsc.sc()) \
+                .getOrCreate()
--- End diff ---
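For context on the diff above: calling the JVM SparkSession constructor directly skips getOrCreate(), while the builder path reuses a live default session and registers the new one as the default. A minimal sketch of the same getOrCreate() reuse semantics on the public Python API (assuming a local pyspark install; the master and app name below are illustrative, not from this PR):

    # Assumes a local pyspark installation; master/appName are illustrative.
    from pyspark.sql import SparkSession

    first = SparkSession.builder \
        .master("local[1]") \
        .appName("getOrCreate-sketch") \
        .getOrCreate()

    # A second getOrCreate() returns the already-registered default session
    # instead of constructing a new one.
    second = SparkSession.builder.getOrCreate()
    print(second is first)  # True: the default session is reused

    first.stop()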
Yeah, let me add in the test, and then I'll clear out all the Python duplication of the Scala code. I can make it more of a wrapper and less of a reimplementer, as sketched below.
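As a rough sketch of that direction (hypothetical code, not from this PR; the helper name is made up), the Python side could hand the whole reuse-or-create decision to the Scala builder instead of duplicating the default-session checks:

    # Hypothetical helper, not from this PR: delegate default-session
    # bookkeeping to the JVM builder rather than duplicating it in Python.
    def _get_or_create_jsparksession(jvm, jsc):
        # The Scala builder decides whether to reuse the default session
        # or build a fresh one from the given SparkContext.
        return jvm.SparkSession.builder() \
            .sparkContext(jsc.sc()) \
            .getOrCreate()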
---