[ 
https://issues.apache.org/jira/browse/SPARK-7950?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yin Huai resolved SPARK-7950.
-----------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.0

Issue resolved by pull request 6500
[https://github.com/apache/spark/pull/6500]

> HiveThriftServer2.startWithContext() doesn't set "spark.sql.hive.version"
> -------------------------------------------------------------------------
>
>                 Key: SPARK-7950
>                 URL: https://issues.apache.org/jira/browse/SPARK-7950
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0, 1.2.1, 1.2.2, 1.3.0, 1.3.1, 1.4.0
>         Environment: Simba Spark SQL ODBC driver 1.0.8.1006
>            Reporter: Cheng Lian
>            Assignee: Cheng Lian
>            Priority: Critical
>             Fix For: 1.4.0
>
>
> While testing the newly released Simba Spark SQL ODBC driver 1.0.8.1006 
> against 1.4.0-SNAPSHOT, we found that if {{HiveThriftServer2}} is started 
> with {{HiveThriftServer2.startWithContext()}}, then simple queries like
> {code:sql}
> SELECT * FROM src
> {code}
> fail with the following error message (need to turn on ODBC trace log):
> {noformat}
> DIAG [S0002] [Simba][SQLEngine] (31740) Table or view not found: SPARK..src
> {noformat}
> However, JDBC clients like Beeline work fine. Also, if the server is started via 
> {{sbin/start-thriftserver.sh}}, both ODBC and JDBC work fine.
> The reason for this failure is that {{HiveThriftServer2.startWithContext()}} 
> doesn't properly set the "spark.sql.hive.version" property. It seems that the 
> Simba ODBC driver 1.0.8.1006 behaves differently when this property is 
> missing. What I observed is that, in this case, the ODBC driver issues a 
> {{GetColumns}} command, which isn't overridden in Spark {{HiveThriftServer2}}, 
> and this falls back to the original Hive code path, which results in unexpected 
> behavior.
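> A possible workaround (a sketch only, not the actual fix from the pull request; the constant {{HiveContext.hiveExecutionVersion}} used below is an assumption about the 1.4.x API) is to set the property explicitly on the {{HiveContext}} before starting the server:
> {code:scala}
> import org.apache.spark.sql.hive.HiveContext
> import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
>
> val hiveContext = new HiveContext(sc)
> // Expose the Hive version explicitly so ODBC clients such as the Simba
> // driver can detect it; startWithContext() alone leaves it unset.
> hiveContext.setConf("spark.sql.hive.version", HiveContext.hiveExecutionVersion)
> HiveThriftServer2.startWithContext(hiveContext)
> {code}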



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
