[ https://issues.apache.org/jira/browse/KYLIN-4493?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

xuekaiqi reassigned KYLIN-4493:
-------------------------------

    Assignee: Zhichao Zhang

> On HDP3, building the dimension dictionary with Spark throws 
> NoSuchMethodError.
> -------------------------------------------------------------------------------
>
>                 Key: KYLIN-4493
>                 URL: https://issues.apache.org/jira/browse/KYLIN-4493
>             Project: Kylin
>          Issue Type: Bug
>          Components: Spark Engine
>    Affects Versions: v3.0.1, v3.0.2
>            Reporter: Zhichao Zhang
>            Assignee: Zhichao Zhang
>            Priority: Minor
>
> Test env:
> {code:java}
>  HDP 3.0.1.0-187
>  Hadoop 3.1.1
>  Hive 3.1.0
>  Kylin 3.0.2
>  Spark 2.3.2{code}
>  
> Problem:
>  When using the Spark engine with 
> *'kylin.engine.spark-dimension-dictionary'* enabled, the build throws the 
> following error:
> {code:java}
> java.lang.NoSuchMethodError: 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(Lorg/apache/hadoop/conf/Configuration;)V
>     at 
> org.apache.kylin.source.hive.CLIHiveClient.getMetaStoreClient(CLIHiveClient.java:164)
>     at 
> org.apache.kylin.source.hive.CLIHiveClient.getHiveTableMeta(CLIHiveClient.java:78)
>     at org.apache.kylin.source.hive.HiveTable.<init>(HiveTable.java:48)
>     at 
> org.apache.kylin.source.hive.HiveSource.createReadableTable(HiveSource.java:68)
>     at 
> org.apache.kylin.source.SourceManager.createReadableTable(SourceManager.java:145)
>     at 
> org.apache.kylin.engine.spark.SparkBuildDictionary$SnapshotBuildFunction.buildSnapshotTable(SparkBuildDictionary.java:386)
>     at 
> org.apache.kylin.engine.spark.SparkBuildDictionary$SnapshotBuildFunction.call(SparkBuildDictionary.java:367)
>     at 
> org.apache.kylin.engine.spark.SparkBuildDictionary$SnapshotBuildFunction.call(SparkBuildDictionary.java:325)
>     at 
> org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1043)
>     at 
> org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1043)
>     at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
>     at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:462)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
>     at 
> scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>     at 
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>     at 
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>     at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
>     at scala.collection.AbstractIterator.to(Iterator.scala:1336)
>     at 
> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
>     at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
>     at 
> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
>     at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
>     at 
> org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
>     at 
> org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
>     at 
> org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
>     at 
> org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
> {code}
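The NoSuchMethodError indicates a binary incompatibility: CLIHiveClient was compiled against a HiveMetaStoreClient whose constructor takes org.apache.hadoop.conf.Configuration, while the Hive jars on the Spark executor classpath expose a different constructor signature, so the expected method is missing at runtime. As a rough illustration of the failure mode (not Kylin's actual code), the following sketch resolves whichever single-argument constructor the runtime classpath actually provides via reflection; ConstructorProbe, construct, and the candidate type names are hypothetical:

{code:java}
import java.lang.reflect.Constructor;

public class ConstructorProbe {

    // Try each candidate parameter type in order and instantiate with the
    // first constructor that exists on the current classpath. This is the
    // kind of probing that avoids a hard NoSuchMethodError when the class
    // was compiled against a different library version than is deployed.
    static Object construct(String className, Object arg, String... paramTypeNames)
            throws ReflectiveOperationException {
        Class<?> cls = Class.forName(className);
        for (String typeName : paramTypeNames) {
            try {
                Constructor<?> ctor = cls.getConstructor(Class.forName(typeName));
                return ctor.newInstance(arg);
            } catch (ClassNotFoundException | NoSuchMethodException e) {
                // this signature is absent on the current classpath; try the next
            }
        }
        throw new NoSuchMethodException("no known constructor on " + className);
    }

    public static void main(String[] args) throws Exception {
        // With Hive jars present, one could probe for both known signatures, e.g.:
        //   construct("org.apache.hadoop.hive.metastore.HiveMetaStoreClient", conf,
        //             "org.apache.hadoop.conf.Configuration",   // newer Hive
        //             "org.apache.hadoop.hive.conf.HiveConf");  // older Hive
        // Demonstrated here with a stdlib class so the sketch runs standalone:
        Object sb = construct("java.lang.StringBuilder", "demo",
                "org.example.NoSuchType", "java.lang.String");
        System.out.println(sb);
    }
}
{code}

The usual fix, however, is aligning the Hive client libraries that Kylin and the Spark executors see, rather than reflection.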



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
