GitHub user marmbrus commented on a diff in the pull request:

    https://github.com/apache/spark/pull/8441#discussion_r38126088
  
    --- Diff: docs/sql-programming-guide.md ---
    @@ -1696,12 +1711,16 @@ version specified by users. An isolated classloader is used here to avoid depend
          property can be one of three options:
          <ol>
            <li><code>builtin</code></li>
    -        Use Hive 0.13.1, which is bundled with the Spark assembly jar when <code>-Phive</code> is
    +        Use Hive 1.2.1, which is bundled with the Spark assembly jar when <code>-Phive</code> is
            enabled. When this option is chosen, <code>spark.sql.hive.metastore.version</code> must be
    -        either <code>0.13.1</code> or not defined.
    +        either <code>1.2.1</code> or not defined.
            <li><code>maven</code></li>
    -        Use Hive jars of specified version downloaded from Maven repositories.
    -        <li>A classpath in the standard format for both Hive and Hadoop.</li>
    +        Use Hive jars of specified version downloaded from Maven repositories.  This configuration
    +        is not generally recommended for production deployments.
    +        <li>A classpath in the standard format for the JVM.  This classpath must include all of Hive
    +        and its dependencies, including the correct version of Hadoop.  These jars only need to be
    +        present on the driver, but if you are running in yarn client mode then you must ensure
    --- End diff --
    
    Correct, they are only used by the driver to get metadata.  Thanks for the 
clarification on cluster vs client.
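
For context, the three options in the doc change above are values of the `spark.sql.hive.metastore.jars` property. A minimal sketch of how each might be set at submit time — the application jar name, metastore version, and classpath locations below are illustrative, not taken from the patch:

```shell
# Option 1 (default): use the Hive 1.2.1 client bundled with the Spark assembly
spark-submit \
  --conf spark.sql.hive.metastore.jars=builtin \
  app.jar

# Option 2: download Hive client jars of the configured version from Maven
# (per the doc change, not generally recommended for production)
spark-submit \
  --conf spark.sql.hive.metastore.version=0.12.0 \
  --conf spark.sql.hive.metastore.jars=maven \
  app.jar

# Option 3: a standard JVM classpath containing Hive and its dependencies,
# including the matching Hadoop jars (paths are hypothetical)
spark-submit \
  --conf spark.sql.hive.metastore.version=0.12.0 \
  --conf "spark.sql.hive.metastore.jars=/path/to/hive/lib/*:/path/to/hadoop/lib/*" \
  app.jar
```

As the comment notes, these jars are used only by the driver to fetch metastore metadata, so in yarn client mode they must be present on the machine that launches the application.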

