Repository: spark
Updated Branches:
  refs/heads/master a3c29fcbb -> 1b50e0e0d


[SPARK-20256][SQL] SessionState should be created more lazily

## What changes were proposed in this pull request?

`SessionState` is designed to be created lazily. However, in reality, it is created immediately in `SparkSession.Builder.getOrCreate` 
([here](https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala#L943)).

This PR restores the lazy behavior by storing the builder options in 
`initialSessionOptions` and applying them only when `SessionState` is actually 
created. As a result, users can start `spark-shell` and use RDD operations 
without any problems.
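
At its core this is a standard deferred-initialization pattern: buffer the options in a plain map and apply them only when the expensive state object is first materialized. A minimal, standalone sketch (class and method names below are illustrative stand-ins, not Spark APIs):
```scala
import scala.collection.mutable

class ExpensiveState {
  // Stand-in for SessionState: constructing it may touch external resources.
  private val conf = mutable.HashMap.empty[String, String]
  def setConfString(k: String, v: String): Unit = conf.put(k, v)
}

class LazyHolder {
  // Recorded eagerly at build time; nothing heavyweight happens here.
  private val initialOptions = mutable.HashMap.empty[String, String]

  def option(k: String, v: String): LazyHolder = { initialOptions.put(k, v); this }

  // The expensive state is constructed on first access only, and the
  // buffered options are applied at that point.
  lazy val state: ExpensiveState = {
    val s = new ExpensiveState
    initialOptions.foreach { case (k, v) => s.setConfString(k, v) }
    s
  }
}
```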

**BEFORE**
```scala
$ bin/spark-shell
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder'
...
Caused by: org.apache.spark.sql.AnalysisException:
    org.apache.hadoop.hive.ql.metadata.HiveException:
       MetaException(message:java.security.AccessControlException:
          Permission denied: user=spark, access=READ,
             inode="/apps/hive/warehouse":hive:hdfs:drwx------
```
As reported in SPARK-20256, this happens when the user does not have read 
permission on the Hive warehouse directory.

**AFTER**
```scala
$ bin/spark-shell
...
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0-SNAPSHOT
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sc.range(0, 10, 1).count()
res0: Long = 10
```
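
For illustration only, a sketch of the user-facing effect (assuming a working `spark-shell` or an otherwise valid `SparkContext`): builder options are now buffered in `initialSessionOptions` and take effect when `sessionState` is first needed, e.g. on the first Dataset/SQL operation, instead of inside `getOrCreate`.
```scala
// Sketch of the user-facing effect; assumes a working Spark environment.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .config("spark.sql.shuffle.partitions", "4")  // buffered, not applied yet
  .getOrCreate()                                // no longer instantiates SessionState

// Pure RDD work never touches SessionState, so it succeeds even if the
// Hive warehouse directory is unreadable:
spark.sparkContext.range(0, 10, 1).count()

// SessionState (and the buffered options) is materialized on first use:
spark.range(5).count()
```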

## How was this patch tested?

Manual.

This closes #18512.

Author: Dongjoon Hyun <[email protected]>

Closes #18501 from dongjoon-hyun/SPARK-20256.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/1b50e0e0
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/1b50e0e0
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/1b50e0e0

Branch: refs/heads/master
Commit: 1b50e0e0d6fd9d1b815a3bb37647ea659222e3f1
Parents: a3c29fc
Author: Dongjoon Hyun <[email protected]>
Authored: Tue Jul 4 09:48:40 2017 -0700
Committer: gatorsmile <[email protected]>
Committed: Tue Jul 4 09:48:40 2017 -0700

----------------------------------------------------------------------
 .../main/scala/org/apache/spark/sql/SparkSession.scala  | 12 ++++++++++--
 1 file changed, 10 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/1b50e0e0/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala b/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
index 2c38f7d..0ddcd21 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
@@ -118,6 +118,12 @@ class SparkSession private(
   }
 
   /**
+   * Initial options for session. This options are applied once when sessionState is created.
+   */
+  @transient
+  private[sql] val initialSessionOptions = new scala.collection.mutable.HashMap[String, String]
+
+  /**
    * State isolated across sessions, including SQL configurations, temporary tables, registered
    * functions, and everything else that accepts a [[org.apache.spark.sql.internal.SQLConf]].
    * If `parentSessionState` is not null, the `SessionState` will be a copy of the parent.
@@ -132,9 +138,11 @@ class SparkSession private(
     parentSessionState
       .map(_.clone(this))
       .getOrElse {
-        SparkSession.instantiateSessionState(
+        val state = SparkSession.instantiateSessionState(
           SparkSession.sessionStateClassName(sparkContext.conf),
           self)
+        initialSessionOptions.foreach { case (k, v) => state.conf.setConfString(k, v) }
+        state
       }
   }
 
@@ -940,7 +948,7 @@ object SparkSession {
         }
 
         session = new SparkSession(sparkContext, None, None, extensions)
-        options.foreach { case (k, v) => session.sessionState.conf.setConfString(k, v) }
+        options.foreach { case (k, v) => session.initialSessionOptions.put(k, v) }
         defaultSession.set(session)
 
        // Register a successfully instantiated context to the singleton. This should be at the

