Hi ShaoFeng, I'm sure both the hbase classpath and the hive classpath contain
commons-lang and commons-lang3, yet this error still occurs in the "Build
Cube" step. When I cut the fact table down (from 2 billion to 1 million
rows), it builds correctly... It confuses us: why would this jar problem
depend on the number of records? Maybe it is something peculiar to my cluster.
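[Editor's note] One quick way to spot conflicting jars is to split the classpath and grep for commons-lang entries. The paths below are made-up examples for illustration; on a real cluster, pipe the output of `hbase classpath` (or `hadoop classpath`) in instead of the sample string:

```shell
# Sample classpath string; replace with: CP=$(hbase classpath)
CP="/opt/hbase/lib/commons-lang-2.6.jar:/opt/hadoop/lib/commons-lang3-3.1.jar"

# Split on ':' and list every commons-lang jar found.
echo "$CP" | tr ':' '\n' | grep 'commons-lang'
# → /opt/hbase/lib/commons-lang-2.6.jar
# → /opt/hadoop/lib/commons-lang3-3.1.jar
```

If this shows a commons-lang3 older than 3.2 anywhere on the MapReduce task classpath, `FastDateFormat.parse(String)` will be missing at runtime, which matches the NoSuchMethodError below.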

2016-10-13 18:15 GMT+08:00 ShaoFeng Shi <shaofeng...@apache.org>:

> hi Mars, "commons-lang" and "commons-lang3" are different artifacts; Kylin
> is using commons-lang 2.6 and commons-lang3 3.4, please check the versions
> in your HBase (using 'hbase classpath').
>
> 2016-10-13 15:12 GMT+08:00 Mars J <xujiao.myc...@gmail.com>:
>
> > Even when I run the example cube kylin_learn, it gives the same error.
> > Is this a bug in this version?
> >
> > 2016-10-13 14:54 GMT+08:00 Mars J <xujiao.myc...@gmail.com>:
> >
> > >
> > > ---------- Forwarded message ----------
> > > From: Mars J <xujiao.myc...@gmail.com>
> > > Date: 2016-10-13 10:26 GMT+08:00
> > > Subject:
> > > To: u...@kylin.apache.org
> > >
> > >
> > > Hello,
> > >
> > >      I came across a problem with commons-lang when using Apache Kylin
> > > 1.5.4.1 to build a cube that includes a date-type column as a dimension.
> > > The error log prints:
> > >
> > > org.apache.hadoop.mapred.YarnChild: Error running child : java.lan
> > > g.NoSuchMethodError: org.apache.commons.lang3.time.FastDateFormat.p
> > > arse(Ljava/lang/String;)Ljava/util/Date;
> > >
> > > Hadoop 2.7.2 ships commons-lang-2.6.jar, so I thought it might be a
> > > version problem, but even after I cp or mv
> > > commons-lang3-3.4.jar from $KYLIN_HOME/tomcat/..../WEB-INF/lib/ to
> > > $HADOOP_HOME/share/hadoop/common/lib/, the problem is still there.
> > >
> > > Has anyone had this problem, and can you tell me how to solve it? Thanks
> > >
> > >
> >
>
>
>
> --
> Best regards,
>
> Shaofeng Shi 史少锋
>