Yes, we finally had to add other jars to make it work.
I was waiting for the official release of v2.6.0 to see whether the problem was
in our packaging process, but the issue remains and we had to patch it the same way.
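For reference, the patch we applied is roughly of the following form; the jar versions and the destination directory are illustrative rather than a verbatim copy of our setup:

  # illustrative only: copy the hbase-server and hbase-common jars
  # from the EMR HBase install into the Spark jars directory used by Kylin
  cp /usr/lib/hbase/lib/hbase-server-1.4.2.jar \
     /usr/lib/hbase/lib/hbase-common-1.4.2.jar \
     $KYLIN_HOME/spark/jars/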
Are we doing something wrong in configuring our Kylin/AWS setup?
Hubert
On Friday, January 11, 2019 at 11:38:06 UTC+1, ShaoFeng Shi <[email protected]> wrote:
Hi Hubert,
In the original log file, I see that hbase-server and hbase-common are already on
the Spark command line:
--jars
/usr/lib/hbase/lib/hbase-common-1.4.2.jar,/usr/lib/hbase/lib/hbase-server-1.4.2.jar,/usr/lib/hbase/lib/hbase-client-1.4.2.jar,/usr/lib/hbase/lib/hbase-protocol-1.4.2.jar,/usr/lib/hbase/lib/hbase-hadoop-compat-1.4.2.jar,/usr/lib/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/lib/hbase/lib/metrics-core-2.2.0.jar,/usr/lib/hbase/lib/hbase-hadoop-compat-1.4.2.jar,/usr/lib/hbase/lib/hbase-hadoop2-compat-1.4.2.jar,
Did you add other jars?
Best regards,
Shaofeng Shi 史少锋
Apache Kylin PMC
Work email: [email protected]
Kyligence Inc: https://kyligence.io/
Apache Kylin FAQ: https://kylin.apache.org/docs/gettingstarted/faq.html
Join Kylin user mail group: [email protected]
Join Kylin dev mail group: [email protected]
hubert stefani <[email protected]> wrote on Friday, January 11, 2019 at 6:20 PM:
> Indeed. By adding the hbase-server and hbase-common jars, the problem seems to
> be fixed.
>
>
> On Friday, January 11, 2019 at 11:11:40 UTC+1, ShaoFeng Shi <[email protected]> wrote:
>
> It seems some HBase class is missing; the HFile class should be in
> hbase-server-<version>.jar, though I am not sure whether it is an EMR packaging
> issue. You can unzip the HBase jar files to see which jar has the class, and
> then add it to the spark/lib folder, as in the sketch below.
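> For example, a quick way to see which jar ships the class (the /usr/lib/hbase/lib
> path is only an assumption for a typical EMR install; adjust it to your layout):
>
>   # illustrative: scan the HBase jars for the HFile class
>   for f in /usr/lib/hbase/lib/*.jar; do
>     unzip -l "$f" 2>/dev/null | grep -q 'org/apache/hadoop/hbase/io/hfile/HFile.class' && echo "$f"
>   done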
>
> java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.io.hfile.HFile
>     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.getNewWriter(HFileOutputFormat2.java:305)
>     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:229)
>     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:167)
>     at org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil.write(SparkHadoopWriter.scala:356)
>     at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:130)
>     at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:127)
>     at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1415)
>
>
>
> Best regards,
>
> Shaofeng Shi 史少锋
> Apache Kylin PMC
> Work email: [email protected]
> Kyligence Inc: https://kyligence.io/
>
> Apache Kylin FAQ: https://kylin.apache.org/docs/gettingstarted/faq.html
> Join Kylin user mail group: [email protected]
> Join Kylin dev mail group: [email protected]
>
>
>
>
> hubert stefani <[email protected]> wrote on Friday, January 11, 2019 at 4:52 PM:
>
> > Hello,
> >
> > We are testing the 2.6 RC and we are facing a systematic issue when
> > building cubes with the Spark engine (even with the sample cube), whereas the
> > MapReduce engine succeeds.
> >
> > The job fails at step #8 (Step Name: Convert Cuboid Data to HFile)
> > with the following error (the full log is attached):
> >
> > ClassNotFoundException: org.apache.hadoop.hbase.metrics.MetricRegistry
> >
> > We run Kylin on AWS EMR 5.13 (it also failed with 5.17).
> >
> > Do you have any idea why this happens?
> > Hubert
> >