zhedoubushishi commented on pull request #1760:
URL: https://github.com/apache/hudi/pull/1760#issuecomment-713049606


   > RetryingMetaStoreClient
   
   I suspect this is because Spark 3.0.0 builds against Hive 
[2.3.7](https://github.com/apache/spark/blob/v3.0.0/pom.xml#L130) while Spark 2.x 
builds against Hive 
[1.2.1.spark2](https://github.com/apache/spark/blob/v2.4.0/pom.xml#L129), and 
the two Hive versions expose conflicting APIs.
   
   Looks like this `getProxy` overload:
   ```
   org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(Lorg/apache/hadoop/hive/conf/HiveConf;Lorg/apache/hadoop/hive/metastore/HiveMetaHookLoader;Ljava/util/concurrent/ConcurrentHashMap;Ljava/lang/String;Z)
   ```
   exists in Hive `1.2.1.spark2` 
(https://github.com/JoshRosen/hive/blob/release-1.2.1-spark2/metastore/src/java/org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java#L101) 
   but no longer exists in Hive `2.3.7` 
(https://github.com/apache/hive/blob/rel/release-2.3.7/metastore/src/java/org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java).
   
   If we compile against Spark 2 and then run with Spark 3, the JVM cannot 
resolve that signature at runtime, so we hit this kind of `NoSuchMethodError`.
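   One common way to cope with this kind of binary incompatibility is to look 
the overload up reflectively at runtime instead of linking it at compile time. 
The sketch below is illustrative only (it is not Hudi's actual fix): it probes 
`ConcurrentHashMap` so the example is self-contained, but the same pattern 
would apply to checking which `RetryingMetaStoreClient.getProxy` variant is on 
the classpath.

   ```java
   import java.util.concurrent.ConcurrentHashMap;

   // Sketch: detect whether a given method overload exists in the version of a
   // class loaded at runtime. A shim can use this to pick the Hive 1.2.1.spark2
   // vs. Hive 2.3.x code path instead of failing with NoSuchMethodError.
   public class MethodProbe {
       static boolean hasMethod(Class<?> cls, String name, Class<?>... paramTypes) {
           try {
               cls.getMethod(name, paramTypes);
               return true;  // this overload is present in the loaded version
           } catch (NoSuchMethodException e) {
               return false; // this overload is absent in the loaded version
           }
       }

       public static void main(String[] args) {
           // ConcurrentHashMap really has putIfAbsent(K, V) (erasure: Object, Object)...
           System.out.println(hasMethod(ConcurrentHashMap.class, "putIfAbsent",
                   Object.class, Object.class)); // prints "true"
           // ...but no method with this made-up name.
           System.out.println(hasMethod(ConcurrentHashMap.class, "putIfMissing",
                   Object.class, Object.class)); // prints "false"
       }
   }
   ```

   In practice the probe runs once, and the result selects one of two adapter 
implementations, so the rest of the code never references the version-specific 
signature directly.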
