[ https://issues.apache.org/jira/browse/SPARK-13446?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16931751#comment-16931751 ]

JP Bordenave edited comment on SPARK-13446 at 9/17/19 7:00 PM:
---------------------------------------------------------------

OK, thanks a lot for your help.

First solution: I restored all the 1.2.1 jars under the spark/jars folder and HDFS 
/sparkjars, and got:

{noformat}
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: Hive Schema version 1.2.0 does not match metastore's schema version 2.3.0 Metastore is not upgraded or corrupt
	at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:6679)
{noformat}

I got a conflict with the Hive 2.3.6 schema initialized on the MySQL database. When 
I used schema 1.2.0 to reinitialize the metastore, spark-shell showed no error, but 
Hive and Hadoop no longer worked together and kept failing with exceptions until I 
restored the 2.3.6 metastore schema and got them back to work.
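To see which schema version the metastore database actually holds before switching jars, Hive's schematool can report and upgrade it in place. A minimal sketch, assuming Hive 2.3.6 is installed with HIVE_HOME set and the metastore backend is the MySQL database described above:

```shell
# Report the schema version recorded in the metastore database
# (read from the VERSION table) versus the version the Hive
# binaries expect:
$HIVE_HOME/bin/schematool -dbType mysql -info

# If the recorded version is older than the binaries, upgrade the
# schema in place rather than re-initializing it from scratch:
$HIVE_HOME/bin/schematool -dbType mysql -upgradeSchema
```

Upgrading in place avoids the "Metastore is not upgraded or corrupt" error without losing the existing table metadata.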

Now I understand why we must copy the hive-*-2.3.6 jars into spark/jars and remove 
the hive-*-1.2.1 ones; it seems to be an issue with the schema.

I will try the second solution with the patch next weekend.

I am using Spark 2.4.4 and Hadoop 2.7.7, and the documentation says Hive from 
0.12.0 to 2.3.3 is supported, but that version is no longer available on the Hive 
website; the only stable version available is Hive 2.3.6:

[https://www-eu.apache.org/dist/hive/]

 

Spark SQL is designed to be compatible with the Hive Metastore, SerDes and 
UDFs. Currently, Hive SerDes and UDFs are based on Hive 1.2.1, and Spark SQL 
can be connected to different versions of Hive Metastore (from 0.12.0 to 2.3.3). 
Also see:

[https://spark.apache.org/docs/latest/sql-data-sources-hive-tables.html#interacting-with-different-versions-of-hive-metastore]
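Per that documentation page, Spark does not need its built-in Hive 1.2.1 jars replaced by hand to talk to a newer metastore; it can be pointed at one through configuration. A sketch, assuming Spark 2.4.4, for which 2.3.3 is the newest officially supported metastore version (the 2.3.x schema line is compatible, even if the schema was created by Hive 2.3.6):

```shell
# Tell Spark which metastore version to speak, and let it fetch the
# matching Hive client jars from Maven instead of using the bundled
# 1.2.1 jars:
spark-shell \
  --conf spark.sql.hive.metastore.version=2.3.3 \
  --conf spark.sql.hive.metastore.jars=maven
```

For production, spark.sql.hive.metastore.jars can instead be set to a classpath pointing at locally installed Hive 2.3.x jars, which avoids the Maven download on every launch.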



> Spark need to support reading data from Hive 2.0.0 metastore
> ------------------------------------------------------------
>
>                 Key: SPARK-13446
>                 URL: https://issues.apache.org/jira/browse/SPARK-13446
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Lifeng Wang
>            Assignee: Xiao Li
>            Priority: Major
>             Fix For: 2.2.0
>
>
> Spark provides the HiveContext class to read data from the Hive metastore 
> directly, but it only supports Hive 1.2.1 and older. Since Hive 2.0.0 has been 
> released, it would be better to upgrade to support Hive 2.0.0.
> {noformat}
> 16/02/23 02:35:02 INFO metastore: Trying to connect to metastore with URI 
> thrift://hsw-node13:9083
> 16/02/23 02:35:02 INFO metastore: Opened a connection to metastore, current 
> connections: 1
> 16/02/23 02:35:02 INFO metastore: Connected to metastore.
> Exception in thread "main" java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
>         at 
> org.apache.spark.sql.hive.HiveContext.configure(HiveContext.scala:473)
>         at 
> org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:192)
>         at 
> org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:185)
>         at 
> org.apache.spark.sql.hive.HiveContext$$anon$1.<init>(HiveContext.scala:422)
>         at 
> org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:422)
>         at 
> org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:421)
>         at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:72)
>         at org.apache.spark.sql.SQLContext.table(SQLContext.scala:739)
>         at org.apache.spark.sql.SQLContext.table(SQLContext.scala:735)
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)
