GitHub user RussellSpitzer opened a pull request: https://github.com/apache/spark/pull/21990
[SPARK-25003][PYSPARK] Use SessionExtensions in Pyspark Master

## What changes were proposed in this pull request?

Previously, PySpark used the private SparkSession constructor when building that object. The resulting SparkSession never checked the spark.sql.extensions parameter for additional session extensions. To fix this, we instead use the SparkSession.builder() path, as SparkR does; this loads the extensions and makes them usable in PySpark.

## How was this patch tested?

This was manually tested by passing a class to spark.sql.extensions and verifying that its injected strategies appeared in the spark._jsparkSession.sessionState.planner.strategies list. We could add an automated test, but I'm not very familiar with the PySpark testing framework; I would be glad to implement one if requested.

Please review http://spark.apache.org/contributing.html before opening a pull request.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/RussellSpitzer/spark SPARK-25003-master

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/21990.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #21990

----

commit f790ae000ae1c3f4030162028186448d345e2984
Author: Russell Spitzer <russell.spitzer@...>
Date: 2018-08-03T16:04:00Z

    [SPARK-25003][PYSPARK] Use SessionExtensions in Pyspark

    Previously Pyspark used the private constructor for SparkSession when
    building that object. This resulted in a SparkSession without checking
    the sql.extensions parameter for additional session extensions. To fix
    this we instead use the Session.builder() path as SparkR uses, this
    loads the extensions and allows their use in PySpark.
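The core of the bug is the difference between calling a constructor directly (which never consults configuration) and going through a builder (which reads spark.sql.extensions and applies the named extensions before the session exists). Below is a minimal self-contained sketch of that pattern; every class, function, and registry name here is a hypothetical stand-in for illustration, not the real PySpark or Spark JVM API:

```python
class SessionExtensions:
    """Collects planner strategies injected by extension classes (hypothetical stand-in)."""
    def __init__(self):
        self.strategies = []

    def inject_planner_strategy(self, strategy):
        self.strategies.append(strategy)


class Session:
    def __init__(self, extensions=None):
        # Direct-constructor path (the old PySpark behavior): nothing
        # consults the config, so extensions default to empty.
        self.extensions = extensions if extensions is not None else SessionExtensions()


class Builder:
    """Builder path (the fix): applies extensions named in the config."""
    # Hypothetical registry standing in for JVM class loading.
    registry = {}

    def __init__(self):
        self._conf = {}

    def config(self, key, value):
        self._conf[key] = value
        return self

    def get_or_create(self):
        extensions = SessionExtensions()
        # Unlike the bare constructor, the builder reads the config key
        # and lets each named extension register its strategies.
        for name in filter(None, self._conf.get("spark.sql.extensions", "").split(",")):
            self.registry[name](extensions)
        return Session(extensions)


# Example extension, analogous to a class passed via spark.sql.extensions.
def my_extension(extensions):
    extensions.inject_planner_strategy("MyStrategy")

Builder.registry["com.example.MyExtension"] = my_extension

direct = Session()  # old path: config ignored, no strategies injected
built = (Builder()
         .config("spark.sql.extensions", "com.example.MyExtension")
         .get_or_create())
print(direct.extensions.strategies)  # []
print(built.extensions.strategies)   # ['MyStrategy']
```

The manual verification described above follows the same shape: build the session through the builder with spark.sql.extensions set, then check that the injected strategies show up in the session's planner strategy list.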