Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/6314#discussion_r30904612
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveContext.scala ---
    @@ -149,22 +149,6 @@ class HiveContext(sc: SparkContext) extends SQLContext(sc) {
       protected[sql] lazy val substitutor = new VariableSubstitution()
     
       /**
    -   * The copy of the Hive client that is used for execution. Currently this must always be
    -   * Hive 13, as this is the version of Hive that is packaged with Spark SQL. This copy of the
    -   * client is used for execution-related tasks like registering temporary functions or ensuring
    -   * that the ThreadLocal SessionState is correctly populated. This copy of Hive is *not* used
    -   * for storing persistent metadata, and only points to a dummy metastore in a temporary directory.
    -   */
    -  @transient
    -  protected[hive] lazy val executionHive: ClientWrapper = {
    -    logInfo(s"Initializing execution hive, version $hiveExecutionVersion")
    -    new ClientWrapper(
    -      version = IsolatedClientLoader.hiveVersion(hiveExecutionVersion),
    -      config = newTemporaryConfiguration())
    --- End diff --
    
    @scwf This is a Spark SQL-specific design that lets us connect to multiple versions of the Hive metastore from a single code base. The `executionHive` client is used for internal jobs within the Spark SQL framework, while the `metadataHive` client talks to the external Hive metastore.
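
    For readers unfamiliar with the split, here is a minimal, self-contained sketch of the idea. The names below (`HiveClient`, `ExecutionClient`, `MetastoreClient`) are illustrative placeholders only, not the actual Spark SQL internals such as `ClientWrapper` or `IsolatedClientLoader`:

    ```scala
    // Sketch of the two-client design: one client for internal execution-side
    // bookkeeping, one for talking to the external metastore. Names and
    // behavior here are hypothetical, for illustration only.
    object TwoClientSketch {

      // Common interface both clients expose.
      trait HiveClient {
        def runSql(sql: String): Unit
      }

      // Built against the Hive version bundled with Spark SQL; backed by a
      // throwaway metastore in a temporary directory and used only for
      // execution-side tasks (temporary functions, thread-local session state).
      class ExecutionClient(tempMetastoreDir: java.nio.file.Path) extends HiveClient {
        def runSql(sql: String): Unit =
          println(s"[execution, metastore=$tempMetastoreDir] $sql")
      }

      // Loaded separately so it can match whatever Hive version the external
      // metastore runs; this is the only client that persists metadata.
      class MetastoreClient(version: String, uri: String) extends HiveClient {
        def runSql(sql: String): Unit =
          println(s"[metastore $version @ $uri] $sql")
      }

      def main(args: Array[String]): Unit = {
        val executionHive: HiveClient =
          new ExecutionClient(java.nio.file.Files.createTempDirectory("exec-metastore"))
        val metadataHive: HiveClient =
          new MetastoreClient(version = "0.12.0", uri = "thrift://metastore:9083")

        executionHive.runSql("CREATE TEMPORARY FUNCTION f AS 'com.example.F'") // internal only
        metadataHive.runSql("CREATE TABLE t (i INT)")                          // persisted externally
      }
    }
    ```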

