Hi Santosh,
I think you are the first one :) Could you please help write a wiki to
summarize your journey and the issues you faced during your pilot? It will be
a great reference for others to try:
https://github.com/KylinOLAP/Kylin/wiki
Thank you very much, and looking forward to your contribution.
Luke
Best Regards!
---------------------
Luke Han
2015-03-02 12:48 GMT+08:00 Santoshakhilesh <[email protected]>:
> Hi Shaofeng,
>
> Replacing the HBase-bundled Hadoop libs with the 2.6.0 Hadoop libs did the trick.
>
> Now my cube build is successful and I am also able to query the
> pre-aggregated data.
>
> Thanks a lot for helping me install Kylin successfully in
> pseudo-distributed mode.
>
> Now I will start my PoC with our use cases.
>
> I am not sure if this is the first time someone has successfully configured
> Kylin on a single node in pseudo-distributed mode, but if it is, then let
> me know where I can write the wiki to compile the steps, the common issues
> faced, and their resolutions.
> Thanks once again.
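For readers trying to reproduce the fix, the jar swap described above can be sketched roughly as below. The paths are illustrative only (the `/tmp` directories just simulate the layout); point `HBASE_HOME` and `HADOOP_HOME` at your real installs.

```shell
# Illustrative sketch of swapping the HBase-bundled Hadoop 2.2.0 jars for the
# cluster's 2.6.0 jars. The /tmp paths below only simulate the directory
# layout; substitute your real HBASE_HOME and HADOOP_HOME.
HBASE_HOME=/tmp/demo-hbase
HADOOP_HOME=/tmp/demo-hadoop
rm -rf "$HBASE_HOME" "$HADOOP_HOME"
mkdir -p "$HBASE_HOME/lib" "$HADOOP_HOME/share/hadoop/common"
touch "$HBASE_HOME/lib/hadoop-common-2.2.0.jar" \
      "$HBASE_HOME/lib/hadoop-hdfs-2.2.0.jar"
touch "$HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0.jar"

# 1. Quarantine the bundled 2.2.0 jars instead of deleting them outright,
#    so the change is easy to roll back.
mkdir -p "$HBASE_HOME/lib/removed-hadoop-2.2.0"
mv "$HBASE_HOME"/lib/hadoop-*-2.2.0.jar "$HBASE_HOME/lib/removed-hadoop-2.2.0/"

# 2. Copy in the matching jars from the Hadoop actually running on the node.
find "$HADOOP_HOME/share/hadoop" -name 'hadoop-*.jar' \
  -exec cp {} "$HBASE_HOME/lib/" \;

ls "$HBASE_HOME/lib"
```

Remember to restart HBase (and Kylin) afterwards so the new jars are picked up.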
>
> Regards,
> Santosh Akhilesh
> Bangalore R&D
> HUAWEI TECHNOLOGIES CO.,LTD.
>
> www.huawei.com
>
> -------------------------------------------------------------------------------------------------------------------------------------
> This e-mail and its attachments contain confidential information from
> HUAWEI, which is intended only for the person or entity whose address is
> listed above. Any use of the information contained herein in any way
> (including, but not limited to, total or partial disclosure, reproduction,
> or dissemination) by persons other than the intended recipient(s) is
> prohibited. If you receive this e-mail in error, please notify the sender
> by phone or email immediately and delete it!
>
> ________________________________________
> From: Shi, Shaofeng [[email protected]]
> Sent: Monday, March 02, 2015 8:42 AM
> To: [email protected]
> Subject: Re: Cube Build Failed at Last Step//RE: Error while making cube &
> Measure option is not responding on GUI
>
> The HDP sandbox doesn't have that problem. I also checked a cluster that was
> manually installed by Ambari; it doesn't have this problem either: there is
> no hadoop-mapreduce-client jar cached in the hbase/lib folder at all. I
> believe Ambari has followed the HBase guidance.
>
> Installing a Hadoop cluster by hand is not recommended and is not Kylin's
> focus; but it would be welcome if you could summarize the issues/solutions
> you encountered and contribute them to Kylin's wiki. That may help a lot of
> people. Thank you!
>
> On 3/2/15, 10:48 AM, "Santosh Akhilesh" <[email protected]> wrote:
>
> >Hi Shaofeng,
> >Thanks.
> >
> >I will try replacing the Hadoop libs in my HBase install. I am running Hadoop
> >on a single node in pseudo-distributed mode, so I guess this issue will
> >always come up when an MR job is launched by a client from outside.
> >Does the sandbox deployment ensure that HBase, Hive, etc. refer to a common
> >Hadoop lib? I wonder why this issue didn't come up in your test.
> >Do I need to do this for Hive too?
> >As I see it, the MR counter comes up in each step of the cube build, not
> >only in the last step.
> >If replacing works, I guess this should be mentioned in the install docs.
> >Regards
> >Santosh
> >On Mon, 2 Mar 2015 at 7:43 am, Shi, Shaofeng <[email protected]> wrote:
> >
> >> Please refer to this segment in http://hbase.apache.org/book.html:
> >>
> >> Replace the Hadoop Bundled With HBase!
> >> Because HBase depends on Hadoop, it bundles an instance of the Hadoop jar
> >> under its lib directory. The bundled jar is ONLY for use in standalone
> >> mode. In distributed mode, it is critical that the version of Hadoop that
> >> is out on your cluster match what is under HBase. Replace the hadoop jar
> >> found in the HBase lib directory with the hadoop jar you are running on
> >> your cluster to avoid version mismatch issues. Make sure you replace the
> >> jar in HBase everywhere on your cluster. Hadoop version mismatch issues
> >> have various manifestations but often all looks like its hung up.
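As a quick sanity check after such a swap, you can list the distinct Hadoop jar versions on a classpath. This is only a sketch; `CLASSPATH` below is a shortened sample string, and on a real node you would feed in the output of `bin/hbase classpath` instead.

```shell
# List the distinct Hadoop jar versions a classpath carries.
# CLASSPATH is a shortened sample; on a real node use: bin/hbase classpath
CLASSPATH='/opt/hbase/lib/hadoop-common-2.2.0.jar:/opt/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar'
echo "$CLASSPATH" | tr ':' '\n' \
  | grep -o 'hadoop-[a-z-]*-[0-9][0-9.]*\.jar' | sort -u
```

Seeing two different versions of the same artifact (as in this sample) is exactly the mismatch the HBase book warns about.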
> >>
> >>
> >>
> >> On 3/1/15, 8:34 PM, "Santosh Akhilesh" <[email protected]>
> >>wrote:
> >>
> >> >Hi Shaofeng,
> >> >
> >> > I have raised the bug; please suggest a resolution or an
> >> >alternative ASAP:
> >> > https://issues.apache.org/jira/browse/KYLIN-617
> >> >
> >> >Regards,
> >> >Santosh Akhilesh
> >> >
> >> >
> >> >On Sun, Mar 1, 2015 at 5:02 PM, Shi, Shaofeng <[email protected]>
> wrote:
> >> >
> >> >> Hi Santosh, this is very likely the problem; we will verify this on
> >> >> Monday. In the meantime, could you please report a new JIRA with this
> >> >> problem and your findings? I appreciate your input!
> >> >>
> >> >> On 3/1/15, 3:03 PM, "Santosh Akhilesh" <[email protected]>
> >> >>wrote:
> >> >>
> >> >> >Hi Shaofeng,
> >> >> > My MapReduce application classpath doesn't contain the HBase
> >> >> >libs. But I find that the kylin.sh start/stop scripts initialize the
> >> >> >HBase env first, before anything else. So when I look at kylin.log, the
> >> >> >client env loads the HBase client libs before Hadoop, and the HBase
> >> >> >client lib is 2.2.0. Is this issue related to the kylin.sh startup
> >> >> >script? I am attaching my classpath setting from mapred-site.xml and
> >> >> >the kylin.log print of the classpath.
> >> >> >
> >> >> ><name>mapreduce.application.classpath</name>
> >> >> ><value>/tmp/kylin/*,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/*,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/*,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/*,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/*,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/*,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/*,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/*,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/*,
> >> >> >/contrib/capacity-scheduler/*.jar,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/*,
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/*,
> >> >> >/home/santosh/work/frameworks/apache-hive-1.0.0/conf,
> >> >> >/home/santosh/work/frameworks/apache-hive-1.0.0/hcatalog/share/hcatalog/*,
> >> >> >/home/santosh/work/frameworks/apache-hive-1.0.0/lib/hive-exec-1.0.0.jar</value>
> >> >> >
> >> >> >
> >> >> >Kylin.log
> >> >> >Client environment:java.class.path=
> >> >> >/etc/kylin:
> >> >> >/home/santosh/work/software/tomcat/bin/bootstrap.jar (plus the rest of
> >> >> >the Tomcat bin and lib jars):
> >> >> >/home/santosh/work/frameworks/hbase-0.98.10/bin/../conf:
> >> >> >/home/santosh/work/java/jdk1.7.0_75/lib/tools.jar:
> >> >> >/home/santosh/work/frameworks/hbase-0.98.10/bin/..:
> >> >> >/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/... (the jars
> >> >> >bundled with HBase, including hadoop-annotations-2.2.0.jar,
> >> >> >hadoop-auth-2.2.0.jar, hadoop-client-2.2.0.jar, hadoop-common-2.2.0.jar,
> >> >> >hadoop-hdfs-2.2.0.jar, hadoop-mapreduce-client-app-2.2.0.jar,
> >> >> >hadoop-mapreduce-client-common-2.2.0.jar,
> >> >> >hadoop-mapreduce-client-core-2.2.0.jar,
> >> >> >hadoop-mapreduce-client-jobclient-2.2.0.jar,
> >> >> >hadoop-mapreduce-client-shuffle-2.2.0.jar, hadoop-yarn-api-2.2.0.jar,
> >> >> >hadoop-yarn-client-2.2.0.jar, hadoop-yarn-common-2.2.0.jar,
> >> >> >hadoop-yarn-server-common-2.2.0.jar,
> >> >> >hadoop-yarn-server-nodemanager-2.2.0.jar, the hbase-*-0.98.10-hadoop2
> >> >> >jars, and their other dependencies):
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop:
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/...
> >> >> >(the Hadoop 2.6.0 common jars, e.g. hadoop-annotations-2.6.0.jar,
> >> >> >hadoop-auth-2.6.0.jar, hadoop-common-2.6.0.jar,
> >> >> >hadoop-common-2.6.0-tests.jar, hadoop-nfs-2.6.0.jar):
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs:
> >> >> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/... :
> >> >> >/home/santosh/work/frameworks/hadoop-2
> >> >>>.6
> >> >>
> >> >>>.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/
> >> santosh/work/frameworks/
> >> >>>ha
> >> >>
> >> >>>doop-2.6.0/share/hadoop/hdfs/lib/asm-3.2.jar:/home/
> >> santosh/work/framewor
> >> >>>ks
> >> >>
> >> >>>/hadoop-2.6.0/share/hadoop/hdfs/lib/htrace-core-3.0.4.
> >> jar:/home/santosh/
> >> >>>wo
> >> >>
> >> >>>rk/framewors/hadoop-2.6.0/share/hadoop/hdfs/lib/jetty-6.
> >> 1.26.jar:/home/s
> >> >>>a
> >> >>
> >> >>>ntosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >> hdfs/lib/commons-codec-1
> >> >>>.4
> >> >>
> >> >>>.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/hdfs/lib/ja
> >> >>>sp
> >> >>
> >> >>>er-runtime-5.5.23.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/share/h
> >> >>>ad
> >> >>
> >> >>>oop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/home/santosh/work/
> >> frameworks/hadoop-
> >> >>>2.
> >> >>
> >> >>>6.0/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/
> >> home/santosh/work/fra
> >> >>>me
> >> >>
> >> >>>works/hadoop-2.6.0/share/hadoop/hdfs/hadoop-hdfs-2.6.0.
> >> jar:/home/santosh
> >> >>>/w
> >> >>
> >> >>>ork/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jaxb-
> >> api-2.2.2.jar:/ho
> >> >>>me
> >> >>
> >> >>>/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >> yarn/lib/commons-io-2
> >> >>>.4
> >> >>
> >> >>>.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/yarn/lib/ja
> >> >>>ck
> >> >>
> >> >>>son-jaxrs-1.9.13.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/share/ha
> >> >>>do
> >> >>
> >> >>>op/yarn/lib/jline-0.9.94.jar:/home/santosh/work/
> >> frameworks/hadoop-2.6.0/
> >> >>>sh
> >> >>
> >> >>>are/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/
> >> santosh/work/framework
> >> >>>s/
> >> >>
> >> >>>hadoop-2.6.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/
> >> home/santosh/work/f
> >> >>>ra
> >> >>
> >> >>>meworks/hadoop-2.6.0/share/hadoop/yarn/lib/jsr305-1.3.9.
> >> jar:/home/santos
> >> >>>h/
> >> >>
> >> >>>work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/
> >> servlet-api-2.5.jar:/
> >> >>>ho
> >> >>
> >> >>>me/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >> yarn/lib/javax.inje
> >> >>>ct
> >> >>
> >> >>>-1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/yarn/lib/
> >> >>>ja
> >> >>
> >> >>>ckson-core-asl-1.9.13.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/sha
> >> >>>re
> >> >>
> >> >>>/hadoop/yarn/lib/activation-1.1.jar:/home/
> >> santosh/work/frameworks/hadoop
> >> >>>-2
> >> >>
> >> >>>.6.0/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/home/
> >> santosh/work/fram
> >> >>>ew
> >> >>
> >> >>>orks/hadoop-2.6.0/share/hadoop/yarn/lib/xz-1.0.jar:/
> >> home/santosh/work/fr
> >> >>>am
> >> >>
> >> >>>eworks/hadoop-2.6.0/share/hadoop/yarn/lib/commons-cli-1.
> >> 2.jar:/home/sant
> >> >>>os
> >> >>
> >> >>>h/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/
> >> jettison-1.1.jar:/h
> >> >>>om
> >> >>
> >> >>>e/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >> yarn/lib/jersey-clie
> >> >>>nt
> >> >>
> >> >>>-1.9.jar:/home/santosh/work/frameworks/hadoop-2.6.0/
> >> hare/hadoop/yarn/lib
> >> >>>/
> >> >>
> >>
> >>>>>commons-compress-14.1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/s
> >>>>>h
> >> >>>ar
> >> >>
> >>
> >>>>>e/hdoop/yarn/lib/guice-servlet-3.0.jar:/home/santosh/work/frameworks/h
> >>>>>a
> >> >>>o
> >> >>
> >> >>>op-2.6.0/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/
> >> home/santosh/work/
> >> >>>fr
> >> >>
> >> >>>ameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jackson-
> >> xc-1.9.13.jar:/home/
> >> >>>sa
> >> >>
> >> >>>ntosh/work/frameorks/hadoop-2.6.0/share/hadoop/
> >> yarn/lib/jersey-core-1.9.>>>j
> >> >>
> >> >>>ar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/yarn/lib/jack
> >> >>>so
> >> >>
> >> >>>n-mapper-asl-1.9.13.jar:/home/santos/work/frameworks/
> >> hadoop-2.6.0/share/
> >> >>>h
> >> >>
> >> >>>adoop/yarn/lib/leveldbjni-all-1.8.jar:/home/santosh/
> >> work/frameworks/hado
> >> >>>op
> >> >>
> >> >>>-2.6.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/
> >> homesantosh/work/fra
> >> >>>m
> >> >>
> >> >>>eworks/hadoop-2.6.0/share/hadoop/yarn/lib/aopalliance-1.
> >> 0.jar:/home/sant
> >> >>>os
> >> >>
> >> >>>h/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/
> >> zookeeper-3.4.6.jar
> >> >>>:/
> >> >>
> >> >>>ome/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/yarn/lib/netty-3.6
> >> >>>.
> >> >>
> >> >>>2.Final.jar:/home/santosh/work/frameworks/hadoop-2.6.0/
> >> share/hadoop/yarn
> >> >>>/l
> >> >>
> >> >>>ib/commons-collections-3.2.1.jar:/home/santosh/work/
> >> frameworks/hadoop-2.
> >> >>>6.
> >> >>
> >> >>>0/share/hadoop/yarn/lib/commons-lang-2.6.jar:/home/
> >> santosh/work/framewor
> >> >>>ks
> >> >>
> >> >>>/hadoop-2.6.0/share/hadoop/yarn/lib/commons-logging-1.1.
> >> 3.jar:/home/sant
> >> >>>os
> >> >>
> >> >>>h/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/
> >> guava-11.0.2.jar:/h
> >> >>>om
> >> >>
> >> >>>e/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >> yarn/lib/asm-3.2.jar
> >> >>>:/
> >> >>
> >> >>>home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/yarn/lib/jetty-6.
> >> >>>1.
> >> >>
> >> >>>26.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/yarn/lib/
> >> >>>je
> >> >>
> >> >>>rsey-json-1.9.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/share/hadoo
> >> >>>p/
> >> >>
> >> >>>yarn/lib/guice-3.0.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/share/
> >> >>>ha
> >> >>
> >> >>>doop/yarn/lib/commons-httpclient-3.1.jar:/home/
> >> santosh/work/frameworks/h
> >> >>>ad
> >> >>
> >> >>>oop-2.6.0/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/
> >> home/santosh/work/fr
> >> >>>am
> >> >>
> >> >>>eworks/hadoop-2.6.0/share/hadoop/yarn/lib/jersey-guice-
> >> 1.9.jar:/home/san
> >> >>>to
> >> >>
> >> >>>sh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/
> >> ommons-codec-1.4.j
> >> >>>a
> >> >>
> >> >>>r:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/yarn/hadoop-ya
> >> >>>rn
> >> >>
> >>
> >>>>>-server-nodemanager-2.6.0.jar:/home/santosh/work/frameorks/hadoop-2.6.
> >>>>>0
> >> >>>/s
> >> >>
> >> >>>hare/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.
> >> 0.jar:
> >> >>>/h
> >> >>
> >> >>>ome/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/yarn/hadoop-yarn-c
> >> >>>li
> >> >>
> >> >>>ent-2.6.0.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/share/hadoop/ya
> >> >>>rn
> >> >>
> >> >>>/hadoop-yarn-server-web-proxy-2.6.0.jar:/home/santosh/
> >> work/frameworks/ha
> >> >>>do
> >> >>
> >> >>>op-2.6.0/share/hadoop/yarn/hadoop-yarn-applications-
> >> unmanaged-am-launche
> >> >>>r-
> >> >>
> >> >>>2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/
> >> share/hadoop/yarn/h
> >> >>>ad
> >> >>
> >> >>>oop-yarn-server-tets-2.6.0.jar:/home/santosh/work/
> >> frameworks/hadoop-2.6.
> >> >>>0
> >> >>
> >> >>>/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/
> >> home/santosh/wor
> >> >>>k/
> >> >>
> >> >>>frameworks/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-
> >> api-2.6.0.jar:/hom
> >> >>>e/
> >> >>
> >> >>>santosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >> yarn/hadoop-yarn-serve
> >> >>>r-
> >> >>
> >>
> >>>>>resourcemanager-2.6.0.jar:/home/santos/work/frameworks/hadoop-2.6.0/sh
> >>>>>a
> >> >>>re
> >> >>
> >> >>>/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/home/
> >> santosh/work/frameworks/
> >> >>>ha
> >> >>
> >> >>>doop-2.6.0/share/hadoop/yarn/hadoop-yarn-applications-
> >> distributedshell-2
> >> >>>.6
> >> >>
> >>
> >>>>>.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/arn/had
> >>>>>o
> >> >>>op
> >> >>
> >> >>>-yarn-registry-2.6.0.jar:/home/santosh/work/frameworks/
> >> hadoop-2..0/share
> >> >>>/
> >> >>
> >> >>>hadoop/mapreduce/lib/commons-io-2.4.jar:/home/
> >> santosh/work/frameworks/ha
> >> >>>do
> >> >>
> >>
> >>>>>op-2.6.0/shae/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/santo
> >>>>>s
> >> >>>h/
> >> >>
> >> >>>work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/
> >> snappy-java-1.0.
> >> >>>4.
> >> >>
> >> >>>1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/mapreduce/
> >> >>>li
> >> >>
> >> >>>b/paranamer-2.3.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/share/had
> >> >>>oo
> >> >>
> >> >>>p/mapreduce/lib/log4j-1.2.17.jar:/home/santosh/work/
> >> frameworks/hadoop-2.
> >> >>>6.
> >> >>
> >> >>>0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/
> >> santosh/work/frame
> >> >>>wo
> >> >>
> >> >>>rks/hadoop-2.6.0/share/hadoop/mapreduce/lib/jackson-
> >> core-asl-1.9.13.jar:
> >> >>>/h
> >> >>
> >>
> >>>>>ome/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/li/xz-
> >>>>>1
> >> >>>.0
> >> >>
> >> >>>.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/mapreduce/l
> >> >>>ib
> >> >>
> >> >>>/avro-1.7.4.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/share/hadoop/
> >> >>>ma
> >> >>
> >> >>>preduce/lib/hadoop-annotations-2.6.0.jar:/home/
> >> santosh/work/frameworks/h
> >> >>>ad
> >> >>
> >>
> >>>>>oop-2.6.0/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/hoe/santos
> >>>>>h
> >> >>>/w
> >> >>
> >> >>>ork/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/
> >> commons-compress-
> >> >>>1.
> >> >>
> >> >>>4.1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/mapreduc
> >> >>>e/
> >> >>
> >> >>>lib/guice-servlet-3.0.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/sha
> >> >>>re
> >> >>
> >> >>>/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/
> >> santosh/work/frameworks/
> >> >>>ha
> >> >>
> >> >>>doop-2.6.0/share/hadoop/mapreduce/lib/jackson-mapper-
> >> asl-1.9.13.jar:/hom
> >> >>>e/
> >> >>
> >>
> >>>>>santosh/work/fameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/leveldb
> >>>>>j
> >> >>>ni
> >> >>
> >> >>>-all-1.8.jar:/home/santosh/work/frameworks/hadoop-2.6.0/
> >> share/hadoop/map
> >> >>>re
> >> >>
> >> >>>duce/lib/jersey-server-1.9.jar:/home/santosh/work/
> >> frameworks/hadoop-2.6.
> >> >>>0/
> >> >>
> >> >>>share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/
> >> santosh/work/framew
> >> >>>or
> >> >>
> >> >>>ks/hadoop-2.6.0/share/hadoop/mapreduce/lib/netty-3.
> >> 6.2.Final.jar:/home/s
> >> >>>an
> >> >>
> >> >>>tosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >> mapreduce/lib/asm-3.2.jar
> >> >>>:/
> >> >>
> >> >>>home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/mapreduce/lib/jun
> >> >>>it
> >> >>
> >> >>>-4.11.jar:/home/santosh/work/frameworks/hadoop-2.6.0/
> >> share/hadoop/mapred
> >> >>>uc
> >> >>
> >> >>>e/lib/guice-3.0.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/share/had
> >> >>>oo
> >> >>
> >> >>>p/mapreduce/lib/jersey-guice-1.9.jar:/home/santosh/
> >> work/frameworks/hadoo
> >> >>>p-
> >> >>
> >> >>>2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-
> >> examples-2.6.0.jar:/home/s
> >> >>>an
> >> >>
> >> >>>tosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >> mapreduce/adoop-mapreduce
> >> >>>-
> >> >>
> >> >>>client-jobclient-2.6.0.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/sh
> >> >>>ar
> >> >>
> >> >>>e/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.
> >> jar:/home/santos
> >> >>>h/
> >> >>
> >> >>>work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/
> >> hadoop-mapreduce-cli
> >> >>>en
> >> >>
> >> >>>t-core-2.6.0.jar:/home/santosh/work/frameworks/
> >> hadoop-2.6.0/share/hadoop
> >> >>>/m
> >> >>
> >> >>>apreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/
> >> home/santosh/work/fr
> >> >>>am
> >> >>
> >> >>>eworks/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-
> >> mapreduce-client-app-2
> >> >>>.6
> >> >>
> >> >>>.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >> hadoop/mapreduce
> >> >>>/h
> >> >>
> >> >>>adoop-mapreduce-client-jobclient-2.6.0-tests.jar:/
> >> home/santosh/work/fram
> >> >>>ew
> >> >>
> >> >>>orks/hadoop2.6.0/share/hadoop/mapreduce/hadoop-
> >> mapreduce-client-hs-plugi
> >> >>>n
> >> >>
> >> >>>s-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/
> >> share/hadoop/mapr
> >> >>>ed
> >> >>
> >> >>>uce/hadoop-mapreduce-client-hs-2.6.0.jar:/contrib/
> >> apacity-scheduler/*.ja
> >> >>>r
> >> >> >
> >> >> >
> >> >> >On Sat, Feb 28, 2015 at 9:06 AM, Shi, Shaofeng <[email protected]>
> >> >>wrote:
> >> >> >
> >> >> >> I don't think downgrading HBase can fix that; the jar version in
> >> >> >> HBase is lower. I suggest checking mapred-site.xml for the property
> >> >> >> mapreduce.application.classpath, which is the classpath loaded in
> >> >> >> MR, and verifying whether the HBase folder was put ahead of the
> >> >> >> Hadoop folders.
> >> >> >>
> >> >> >> On 2/28/15, 11:25 AM, "Santosh Akhilesh"
> >><[email protected]>
> >> >> >> wrote:
> >> >> >>
> >> >> >> >The only difference I find in my setup is that my HBase is
> >> >> >> >0.98.10 while Kylin's is 0.98.4. I will try downgrading my HBase,
> >> >> >> >but I really doubt that this will solve the problem. Since there
> >> >> >> >is no alternative in sight, though, I will give it a try anyhow.
> >> >> >> >
> >> >> >> >Sent from Outlook on iPhone
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >On Fri, Feb 27, 2015 at 7:00 PM -0800 "Shi, Shaofeng"
> >> >> >><[email protected]>
> >> >> >> >wrote:
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >> >Hmm, please use the same-level client jars; in Kylin's pom.xml,
> >> >> >> >it compiles with 2.6.0 jars:
> >> >> >> >https://github.com/KylinOLAP/Kylin/blob/master/pom.xml#L19
> >> >> >> >
> >> >> >> >
> >> >> >> >On 2/27/15, 8:53 PM, "Santoshakhilesh" wrote:
> >> >> >> >
> >> >> >> >>Hi Shaofeng ,
> >> >> >> >> I checked the HBase libs. I am using HBase 0.98.10-hadoop2,
> >> >> >> >>which uses hadoop-mapreduce-client-app-2.2.0.jar,
> >> >> >> >>but my Hadoop is 2.6.0.
> >> >> >> >>
> >> >> >> >>Is this the issue ?
> >> >> >> >>
> >> >> >> >>I checked the Kylin POM; it is using 0.98.4-hadoop2.
> >> >> >> >>
> >> >> >> >>Is the problem due to this mismatch ? Do you suggest I try
> >> >> >> >>changing my HBase version ?
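
As the start of this thread notes, the fix that eventually worked was replacing HBase's bundled Hadoop 2.2.0 client jars with the Hadoop 2.6.0 ones. A sketch of that swap, run here against throwaway directories created with mktemp because the real install paths (e.g. $HBASE_HOME/lib) vary per setup:

```shell
# Simulate an HBase lib dir bundling old 2.2.0 client jars and a Hadoop 2.6.0
# install; the directory layout below is an assumption for illustration.
HBASE_LIB=$(mktemp -d)/hbase/lib
HADOOP_MR=$(mktemp -d)/hadoop-2.6.0/share/hadoop/mapreduce
mkdir -p "$HBASE_LIB" "$HADOOP_MR"
touch "$HBASE_LIB/hadoop-mapreduce-client-app-2.2.0.jar"
touch "$HADOOP_MR/hadoop-mapreduce-client-app-2.6.0.jar"

# The swap itself: drop the 2.2.0 client jars, copy in the 2.6.0 ones.
rm "$HBASE_LIB"/hadoop-*-2.2.0.jar
cp "$HADOOP_MR"/hadoop-*-2.6.0.jar "$HBASE_LIB"/
ls "$HBASE_LIB"   # prints hadoop-mapreduce-client-app-2.6.0.jar
```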
> >> >> >> >>
> >> >> >> >>Regards,
> >> >> >> >>Santosh Akhilesh
> >> >> >> >>Bangalore R&D
> >> >> >> >>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >> >> >>
> >> >> >> >>www.huawei.com
> >> >> >> >>---------------------------------------------------------------
> >> >> >> >>This e-mail and its attachments contain confidential information
> >> >> >> >>from HUAWEI, which is intended only for the person or entity whose
> >> >> >> >>address is listed above. Any use of the information contained
> >> >> >> >>herein in any way (including, but not limited to, total or partial
> >> >> >> >>disclosure, reproduction, or dissemination) by persons other than
> >> >> >> >>the intended recipient(s) is prohibited. If you receive this
> >> >> >> >>e-mail in error, please notify the sender by phone or email
> >> >> >> >>immediately and delete it!
> >> >> >> >>
> >> >> >> >>________________________________________
> >> >> >> >>From: Santoshakhilesh [[email protected]]
> >> >> >> >>Sent: Friday, February 27, 2015 4:49 PM
> >> >> >> >>To: [email protected]
> >> >> >> >>Cc: Kulbhushan Rana
> >> >> >> >>Subject: RE: Cube Build Failed at Last Step//RE: Error while
> >>making
> >> >> >>cube
> >> >> >> >>& Measure option is not responding on GUI
> >> >> >> >>
> >> >> >> >>Hi Shaofeng ,
> >> >> >> >> I configured the job history server and there is no more
> >> >> >> >>connection exception; now I get the MR counter exception which we
> >> >> >> >>were suspecting.
> >> >> >> >> My Hadoop version is indeed 2.6.0, so any idea what can be
> >> >> >> >>done for this ?
> >> >> >> >>
> >> >> >> >>[QuartzScheduler_Worker-8]:[2015-02-28 00:36:26,507][DEBUG][com.kylinolap.job.tools.HadoopStatusChecker.checkStatus(HadoopStatusChecker.java:74)] - State of Hadoop job: job_1424957178195_0031:FINISHED-SUCCEEDED
> >> >> >> >>[QuartzScheduler_Worker-8]:[2015-02-28 00:36:27,204][ERROR][com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:176)] - No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_REDUCES
> >> >> >> >>java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_REDUCES
> >> >> >> >>    at java.lang.Enum.valueOf(Enum.java:236)
> >> >> >> >>    at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.valueOf(FrameworkCounterGroup.java:148)
> >> >> >> >>    at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.findCounter(FrameworkCounterGroup.java:182)
> >> >> >> >>    at org.apache.hadoop.mapreduce.counters.AbstractCounters.findCounter(AbstractCounters.java:154)
> >> >> >> >>    at org.apache.hadoop.mapreduce.TypeConverter.fromYarn(TypeConverter.java:240)
> >> >> >> >>    at org.apache.hadoop.mapred.ClientServiceDelegate.getJobCounters(ClientServiceDelegate.java:370)
> >> >> >> >>    at org.apache.hadoop.mapred.YARNRunner.getJobCounters(YARNRunner.java:511)
> >> >> >> >>    at org.apache.hadoop.mapreduce.Job$7.run(Job.java:756)
> >> >> >> >>    at org.apache.hadoop.mapreduce.Job$7.run(Job.java:753)
> >> >> >> >>    at java.security.AccessController.doPrivileged(Native Method)
> >> >> >> >>    at javax.security.auth.Subject.doAs(Subject.java:415)
> >> >> >> >>    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> >> >> >> >>    at org.apache.hadoop.mapreduce.Job.getCounters(Job.java:753)
> >> >> >> >>    at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:287)
> >> >> >> >>    at com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:162)
> >> >> >> >>    at com.kylinolap.job.cmd.JavaHadoopCmdOutput.getStatus(JavaHadoopCmdOutput.java:85)
> >> >> >> >>    at com.kylinolap.job.flow.AsyncJobFlowNode.execute(AsyncJobFlowNode.java:86)
> >> >> >> >>    at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> >> >> >> >>    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> >> >> >> >>
> >> >> >> >>Regards,
> >> >> >> >>Santosh Akhilesh
> >> >> >> >>Bangalore R&D
> >> >> >> >>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >> >> >>
> >> >> >> >>www.huawei.com
> >> >> >> >>
> >> >> >> >>________________________________________
> >> >> >> >>From: Shi, Shaofeng [[email protected]]
> >> >> >> >>Sent: Friday, February 27, 2015 3:10 PM
> >> >> >> >>To: [email protected]
> >> >> >> >>Subject: Re: Cube Build Failed at Last Step//RE: Error while
> >>making
> >> >> >>cube
> >> >> >> >>& Measure option is not responding on GUI
> >> >> >> >>
> >> >> >> >>0.0.0.0:10020 isn't a valid network address, I think; please
> >> >> >> >>check the "mapreduce.jobhistory.address" in your mapred-site.xml;
> >> >> >> >>it should be something like:
> >> >> >> >>
> >> >> >> >><property>
> >> >> >> >>  <name>mapreduce.jobhistory.address</name>
> >> >> >> >>  <value>sandbox.hortonworks.com:10020</value>
> >> >> >> >></property>
> >> >> >> >>
> >> >> >> >>
> >> >> >> >>On 2/27/15, 5:29 PM, "Santoshakhilesh"
> >> >> >> >>wrote:
> >> >> >> >>
> >> >> >> >>>Hi Shaofeng ,
> >> >> >> >>> No, I have not found the MR counter exception. I get the
> >> >> >> >>>following exception frequently; I think this is related to the
> >> >> >> >>>job history server of Hadoop.
> >> >> >> >>>
> >> >> >> >>>[QuartzScheduler_Worker-23]:[2015-02-27 22:18:37,299][ERROR][com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:176)] - java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >> >> >>>com.kylinolap.job.exception.JobException: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >> >> >>>    at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:289)
> >> >> >> >>>    at com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:162)
> >> >> >> >>>    at com.kylinolap.job.cmd.JavaHadoopCmdOutput.getStatus(JavaHadoopCmdOutput.java:85)
> >> >> >> >>>    at com.kylinolap.job.flow.AsyncJobFlowNode.execute(AsyncJobFlowNode.java:86)
> >> >> >> >>>    at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> >> >> >> >>>    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> >> >> >> >>>Caused by: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >> >> >>>
> >> >> >> >>>Regards,
> >> >> >> >>>Santosh Akhilesh
> >> >> >> >>>Bangalore R&D
> >> >> >> >>>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >> >> >>>
> >> >> >> >>>www.huawei.com
> >> >> >> >>>
> >> >> >> >>>________________________________________
> >> >> >> >>>From: Shi, Shaofeng [[email protected]]
> >> >> >> >>>Sent: Friday, February 27, 2015 2:47 PM
> >> >> >> >>>To: [email protected]
> >> >> >> >>>Cc: Kulbhushan Rana
> >> >> >> >>>Subject: Re: Cube Build Failed at Last Step//RE: Error while
> >> >>making
> >> >> >>cube
> >> >> >> >>>& Measure option is not responding on GUI
> >> >> >> >>>
> >> >> >> >>>Did you figure out the exception "No enum constant
> >> >> >> >>>org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_REDUCES" ? Is
> >> >> >> >>>it still being thrown in the logs? In the last step, Kylin needs
> >> >> >> >>>to parse the MR counters to update the cube size; please refer to
> >> >> >> >>>https://issues.apache.org/jira/browse/MAPREDUCE-5831 for that
> >> >> >> >>>error.
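
The mechanism behind that error is worth spelling out: a client whose JobCounter enum predates MB_MILLIS_REDUCES has no constant to map the reported counter name onto, so the lookup throws, exactly as in the stack trace above. A self-contained shell sketch of that lookup; the counter list here is abridged and purely illustrative:

```shell
# Mimic an old client's fixed counter list failing to resolve a counter
# name that newer Hadoop servers report.
known_counters="TOTAL_LAUNCHED_MAPS TOTAL_LAUNCHED_REDUCES"

lookup_counter() {
  for c in $known_counters; do
    if [ "$c" = "$1" ]; then
      echo "resolved $1"
      return 0
    fi
  done
  echo "No enum constant JobCounter.$1"
  return 1
}

lookup_counter TOTAL_LAUNCHED_MAPS        # known to the old client
lookup_counter MB_MILLIS_REDUCES || true  # added in newer Hadoop: lookup fails
```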
> >> >> >> >>>
> >> >> >> >>>On 2/27/15 5:04 PM, "Santoshakhilesh"
> >> >> >> >>>wrote:
> >> >> >> >>>
> >> >> >> >>>>Hi Shaofeng ,
> >> >> >> >>>> Cube building failed at the last step, while loading the
> >> >> >> >>>>HFile into HBase, with the exception "Can't get cube segment
> >> >> >> >>>>size." What could be the reason ?
> >> >> >> >>>>
> >> >> >> >>>>parameter : -input /tmp/kylin-17a4606f-905b-4ea1-922a-27c2bfb5c68b/RetailCube/hfile/ -htablename KYLIN_K27LDMX63W -cubename RetailCube
> >> >> >> >>>>
> >> >> >> >>>>Log:
> >> >> >> >>>>
> >> >> >> >>>>Start to execute command:
> >> >> >> >>>> -input /tmp/kylin-17a4606f-905b-4ea1-922a-27c2bfb5c68b/RetailCube/hfile/ -htablename KYLIN_K27LDMX63W -cubename RetailCube
> >> >> >> >>>>Command execute return code 0
> >> >> >> >>>>Failed with Exception:java.lang.RuntimeException: Can't get cube segment size.
> >> >> >> >>>>    at com.kylinolap.job.flow.JobFlowListener.updateCubeSegmentInfoOnSucceed(JobFlowListener.java:247)
> >> >> >> >>>>    at com.kylinolap.job.flow.JobFlowListener.jobWasExecuted(JobFlowListener.java:101)
> >> >> >> >>>>    at org.quartz.core.QuartzScheduler.notifyJobListenersWasExecuted(QuartzScheduler.java:1985)
> >> >> >> >>>>    at org.quartz.core.JobRunShell.notifyJobListenersComplete(JobRunShell.java:340)
> >> >> >> >>>>    at org.quartz.core.JobRunShell.run(JobRunShell.java:224)
> >> >> >> >>>>    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> >> >> >> >>>>
> >> >> >> >>>>I have checked in hbase shell and following are the tables in
> >> >>hbase;
> >> >> >> >>>>hbase(main):001:0> list
> >> >> >> >>>>TABLE
> >> >> >> >>>>
> >> >> >> >>>>KYLIN_K27LDMX63W
> >> >> >> >>>>kylin_metadata_qa
> >> >> >> >>>>kylin_metadata_qa_acl
> >> >> >> >>>>kylin_metadata_qa_cube
> >> >> >> >>>>kylin_metadata_qa_dict
> >> >> >> >>>>kylin_metadata_qa_invertedindex
> >> >> >> >>>>kylin_metadata_qa_job
> >> >> >> >>>>kylin_metadata_qa_job_output
> >> >> >> >>>>kylin_metadata_qa_proj
> >> >> >> >>>>kylin_metadata_qa_table_snapshot
> >> >> >> >>>>kylin_metadata_qa_user
> >> >> >> >>>>11 row(s) in 0.8990 seconds
> >> >> >> >>>>
> >> >> >> >>>>
> >> >> >> >>>>Regards,
> >> >> >> >>>>Santosh Akhilesh
> >> >> >> >>>>Bangalore R&D
> >> >> >> >>>>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >> >> >>>>
> >> >> >> >>>>www.huawei.com
> >> >> >> >>>>
> >> >> >> >>>>________________________________________
> >> >> >> >>>>From: Santoshakhilesh
> >> >> >> >>>>Sent: Friday, February 27, 2015 2:15 PM
> >> >> >> >>>>To: [email protected]
> >> >> >> >>>>Subject: RE: Error while making cube & Measure option is not responding on GUI
> >> >> >> >>>>
> >> >> >> >>>>I have manually copied the jar to /tmp/kylin; now stage 2 is done,
> >> >> >> >>>>thanks.
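For anyone following along, the workaround Santosh describes can be sketched as a couple of shell commands. The destination path comes from the kylin.properties fragment quoted further down this thread; the jar's source location and KYLIN_HOME layout are assumptions, not taken from the thread:

```shell
# Stage the Kylin job jar at the path kylin.job.jar points to
# (/tmp/kylin/kylin-job-latest.jar per the quoted kylin.properties).
KYLIN_HOME=${KYLIN_HOME:-/usr/local/kylin}   # assumed install location
mkdir -p /tmp/kylin
cp "$KYLIN_HOME"/job/target/kylin-job-*.jar /tmp/kylin/kylin-job-latest.jar
```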
> >> >> >> >>>>
> >> >> >> >>>>Regards,
> >> >> >> >>>>Santosh Akhilesh
> >> >> >> >>>>Bangalore R&D
> >> >> >> >>>>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >> >> >>>>
> >> >> >> >>>>www.huawei.com
> >> >> >> >>>>
> >> >> >> >>>>________________________________________
> >> >> >> >>>>From: Shi, Shaofeng [[email protected]]
> >> >> >> >>>>Sent: Friday, February 27, 2015 1:00 PM
> >> >> >> >>>>To: [email protected]
> >> >> >> >>>>Cc: Kulbhushan Rana
> >> >> >> >>>>Subject: Re: Error while making cube & Measure option is not responding on GUI
> >> >> >> >>>>
> >> >> >> >>>>In 0.6.x the packages are named "com.kylinolap.xxx"; from 0.7
> >> >> >> >>>>we renamed the package to "org.apache.kylin.xxx". When you
> >> >> >> >>>>downgraded to 0.6, did you also replace the jar location with
> >> >> >> >>>>the 0.6 one in kylin.properties?
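One quick way to answer Shaofeng's question is to inspect which package tree the deployed jar actually contains; a sketch (the jar path is taken from the kylin.properties fragment quoted later in the thread):

```shell
# 0.6.x jars ship com/kylinolap/* classes; 0.7.x jars ship org/apache/kylin/*.
# If kylin.job.jar still points at a 0.7 jar after downgrading, the MR task
# fails with NoClassDefFoundError for com/kylinolap/... classes.
unzip -l /tmp/kylin/kylin-job-latest.jar | grep -m1 -E 'com/kylinolap/|org/apache/kylin/'
```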
> >> >> >> >>>>
> >> >> >> >>>>On 2/27/15, 3:13 PM, "Santoshakhilesh" wrote:
> >> >> >> >>>>
> >> >> >> >>>>>Hi Shaofeng,
> >> >> >> >>>>> I have added my fact and dimension tables under the default
> >> >> >> >>>>>database of Hive.
> >> >> >> >>>>> Now stage 1 of the cube build is OK, but it fails at step 2:
> >> >> >> >>>>>the MapReduce job for finding the distinct columns of the fact
> >> >> >> >>>>>table errors out. The YARN log is below.
> >> >> >> >>>>> Strangely, this is a class-not-found error, although I have
> >> >> >> >>>>>checked kylin.properties and the jar is already set as shown below.
> >> >> >> >>>>>kylin.log also has one exception connecting from linux/10.19.93.68 to
> >> >> >> >>>>>0.0.0.0:10020.
> >> >> >> >>>>> Please help me with a clue; I am also trying to investigate
> >> >> >> >>>>>meanwhile.
> >> >> >> >>>>>
> >> >> >> >>>>>Thanks.
> >> >> >> >>>>>kylin property
> >> >> >> >>>>># Temp folder in hdfs
> >> >> >> >>>>>kylin.hdfs.working.dir=/tmp
> >> >> >> >>>>># Path to the local(relative to job engine) job jar, job
> >>engine
> >> >> >>will
> >> >> >> >>>>>use
> >> >> >> >>>>>this jar
> >> >> >> >>>>>kylin.job.jar=/tmp/kylin/kylin-job-latest.jar
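A minimal sanity check for the setting above (the path is copied from the fragment; nothing else is assumed):

```shell
# Verify the configured job jar actually exists on the job-engine host
test -r /tmp/kylin/kylin-job-latest.jar \
  && echo "job jar present" \
  || echo "job jar missing"
```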
> >> >> >> >>>>>
> >> >> >> >>>>>Map Reduce error
> >> >> >> >>>>>----------------------------
> >> >> >> >>>>>2015-02-27 20:24:25,262 FATAL [main] org.apache.hadoop.mapred.YarnChild:
> >> >> >> >>>>>Error running child : java.lang.NoClassDefFoundError:
> >> >> >> >>>>>com/kylinolap/common/mr/KylinMapper
> >> >> >> >>>>> at java.lang.ClassLoader.defineClass1(Native Method)
> >> >> >> >>>>> at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> >> >> >> >>>>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> >> >> >> >>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> >> >> >> >>>>> at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> >> >> >> >>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> >> >> >> >>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >> >> >> >>>>> at java.security.AccessController.doPrivileged(Native Method)
> >> >> >> >>>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >> >> >> >>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >> >> >> >>>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >> >> >> >>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >> >> >> >>>>> at java.lang.Class.forName0(Native Method)
> >> >> >> >>>>> at java.lang.Class.forName(Class.java:274)
> >> >> >> >>>>> at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2013)
> >> >> >> >>>>>
> >> >> >> >>>>>Kylin.log
> >> >> >> >>>>>[QuartzScheduler_Worker-20]:[2015-02-27 20:25:00,663][DEBUG][com.kylinolap.job.engine.JobFetcher.execute(JobFetcher.java:60)] - 0 pending jobs
> >> >> >> >>>>>[QuartzScheduler_Worker-19]:[2015-02-27 20:25:01,730][ERROR][com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:176)] - java.io.IOException:
> >> >> >> >>>>>java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020
> >> >> >> >>>>>failed on connection exception: java.net.ConnectException: Connection
> >> >> >> >>>>>refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >> >> >>>>>com.kylinolap.job.exception.JobException: java.io.IOException:
> >> >> >> >>>>>java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020
> >> >> >> >>>>>failed on connection exception: java.net.ConnectException: Connection
> >> >> >> >>>>>refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >> >> >>>>> at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:289)
> >> >> >> >>>>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:162)
> >> >> >> >>>>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.getStatus(JavaHadoopCmdOutput.java:85)
> >> >> >> >>>>> at com.kylinolap.job.flow.AsyncJobFlowNode.execute(AsyncJobFlowNode.java:86)
> >> >> >> >>>>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> >> >> >> >>>>> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> >> >> >> >>>>>Caused by: java.io.IOException: java.net.ConnectException: Call From
> >> >> >> >>>>>linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception:
> >> >> >> >>>>>java.net.ConnectException: Connection refused; For more details see:
> >> >> >> >>>>>http://wiki.apache.org/hadoop/ConnectionRefused
> >> >> >> >>>>> at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceD
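The "Call From ... to 0.0.0.0:10020 ... Connection refused" error above usually means the MapReduce JobHistory Server is not running or not configured. A hedged sketch of the usual remedy on a pseudo-distributed Hadoop 2.x node (property names are standard Hadoop 2.x; the hostnames are examples):

```shell
# Port 10020 is the default mapreduce.jobhistory.address. If nothing listens
# there, Kylin's job-counter polling fails as in the log above.
# 1) Set explicit addresses in mapred-site.xml:
#      mapreduce.jobhistory.address        = localhost:10020
#      mapreduce.jobhistory.webapp.address = localhost:19888
# 2) Start the daemon:
"$HADOOP_HOME"/sbin/mr-jobhistory-daemon.sh start historyserver
# 3) Confirm the port is listening:
netstat -tlnp | grep 10020
```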